Abolish the job application
Job applications are wasteful and demeaning. Maybe it's time we got rid of them.
Earlier this month, I lost out on my one-hundredth job application. When you look for work for any prolonged period, you start to notice different shades of no: The most common (incurred by 52 of my applications) is no response at all. I received 38 automated rejection emails; eight applications were declined after one or more interviews — two via phone call, one (to a public institution in Berlin) via letter. Three resulted in offers, of which I accepted two. Beyond these baseline numbers, the outcomes feel completely random. I was equally unsuccessful applying to entry-level and more qualified roles, local and national, fixed-term and permanent, in the public and private sector, and my rate of failure remained constant over time.
This is not a unique experience. Most of my friends are steadily applying for work, and their numbers aren’t much better. Across the economy, the median number of job applications people write every month nearly tripled over the last 40 years, and the share of people who submit ten or more applications per month doubled, from 15% to 30%1, a trend that is only accelerating. Despite this precipitous increase in applications, the number of people getting hired in a given month has remained the same.
All the cover letters, work histories, personal statements, homework assignments and other materials I’ve produced for my hundred applications could fill a book at this point, but instead they went into the shredder along with everyone else’s bundles of paperwork. It’s hard to imagine a more stupid, demeaning, wasteful method to match people with employment. How did we end up here?
The immediate cause, I submit, is that organisations figured out sometime in the 2010s that they could use technology to outsource the most labour-intensive parts of recruiting to job applicants themselves. Why pay an HR person to read applications and think about how they match a given position when you can simply buy an applicant tracking system (those clunky online recruitment portals) and have the applicants do it for free?
Freed from those labour costs, companies can engage in the kinds of frustrating, unproductive behaviour we complain about:
- They get to inflate requirements for positions at all levels — it's not unusual to see a laundry list of twenty-five ‘essential criteria’ (including advanced degrees and years of relevant work experience) for entry-level positions — because you're doing the interpretive work of parsing and explaining how you fulfil each one in upbeat, easy-to-read prose.
- They get to deploy crude, pseudo-scientific algorithms to reject applications because you're doing the data entry required to make your personality, skills and experience legible to statistical modelling.
- They get to advertise ghost jobs (job openings they have no intention to fill) to scam investors, gather information on competitors, placate existing employees, or out of simple incompetence, because you’re either responding to them for free anyway, or doing the cognitive work of separating them from genuine openings.
None of these practices would be viable if companies had to bear their full cost. They do, of course, keep any incidental products: detailed information about applicants (including, increasingly, biometric data), work samples, test results and, should they decide to hire someone, a pre-qualified, thoroughly legible employee. It’s socialised costs and privatised gains in a trenchcoat again.
But outside the particular mechanics of the hiring process, there is a longer development at play. For much of the 20th century, applying for jobs was a young person’s game: Once you had landed an entry-level position at a firm, you would stay there for decades and gradually move up the ranks along well-defined trajectories, supported by strong labour unions — at least that was the expectation.
What this system lacked was freedom — when you’re going along a pre-defined career path, you have little room for initiative, creativity or personal expression. In The New Spirit of Capitalism, Luc Boltanski and Eve Chiapello describe how this artistic critique, first loudly articulated in the protests of 1968, was quickly incorporated by capitalism and eventually led to the development of the projective city2 — the organising principle of capitalism we still inhabit today.
Here, workers are no longer engaged in a single career, but in a series of short-term projects to which they attach themselves only temporarily, and people’s success is no longer determined by their efficiency, but by their flexibility, ability to network, and level of personal engagement. Individual freedom was undoubtedly gained, albeit at the cost of collective security.
One of the consequences of this shift, Boltanski and Chiapello show, is the disruption of long-established tests of status we use to mediate people’s access to jobs, promotions, education, benefits, and other social provisions.
We generally accept these tests because we’re fairly clear about the specific qualities being assessed, measures are taken to prevent cheating, and preliminary filtering ensures that those admitted to the test have a reasonable chance of success. For example, we’re okay with essay assignments at school because everyone understands the skills they’re meant to assess (reasoning and style, not typing speed or access to quality stationery). They’re also supervised well enough to prevent blatant cheating (students are mostly prevented from bringing ghostwriters to exams), and an orderly system of instruction, homework and mock tests gives students a reasonable chance at a passing grade.
The job application is just another test of status. In the 20th century, it met the criteria of legitimacy: Applicant pools were smaller, and industry-wide qualification grids, salary scales, and established career paths ensured at least a perception, if not always a reality, of transparency and fairness. But in the 21st, these control mechanisms have been removed, and the job application’s claim to legitimacy begins to crack.
It’s increasingly difficult to discern the set of attributes being assessed in any given application process: Education? Technical skill? Reputation? Malleability? Immigration status? Employers, emboldened by self-service technology, tend to take a maximalist approach and demand evidence for every attribute they can think of, leaving applicants guessing which ones are going to be decisive. Hundreds of people apply for each position, driving any individual applicant’s chance of success toward zero. Outside supervision is largely absent from the process. And if you do succeed, your reward is no longer a permanent work contract with a built-in progression system, but a fixed-term, often fractional engagement, at the end of which you’ll be expelled back onto the labour market to start the process all over again.
A hiring process like this, where people are tested with incessant frequency, ‘identification of the most important tests is non-existent’ and ‘the criteria of judgement are multiple, variable and sometimes not formalised’3, is bound to lose its legitimacy and result in the kind of exasperation felt by me and everyone else. Soon after, it begins to degrade from a test of status into a test of strength, where any remaining notion of structure and control is abandoned and people simply do whatever it takes to win. The decay is gradual, and in the case of hiring it is only beginning. But the increasingly common suggestion, offered by mainstream business publications, to improve your chances by obtaining inside recommendations, keyword-stuffing your cover letter, social engineering the hiring manager and similar tactics, must be its signature.
The result is a world of winners whose strengths are unspecified and mostly invisible, and losers who don’t understand what hit them.
Boltanski and Chiapello propose several mechanisms to reconstruct a sense of justice in existing employment tests. New regulation could require companies to provide training, networking and public engagement to improve their workers’ future employability, and hold them responsible when they fail to do so. A new public intermediary between job seekers and firms could replace the current system of inside recommendations and elite alumni networks that prevents wider access to desirable positions. A universal basic income (funded by corporate taxes) could soften the threat of unemployment and give applicants more agency in the process.
These are sensible ideas, but in the interim I'm drawn to a much simpler one: Just fill jobs at random.
Sure, companies might end up with a few more bad recruits than they do now, but they’ll also hire some great candidates they would otherwise have missed, and the costs saved by reducing a months-long recruitment process to a simple coin toss would surely make up the difference. Random hiring would certainly be more equitable4 than any deliberative method, and it would force companies to train, relocate, and otherwise invest in new employees — undeniably a more productive activity than perpetuating an inflated hiring process, and wholly in line with the policy ideas sketched out above.
Even if some individual firms ended up with a less productive workforce, the amount of labour power and emotional energy freed for everyone involved — employed and unemployed — would be transformative.
Birinci, S., See, K., Wee, S.L. (2023): Job Applications and Labor Market Flows, Federal Reserve Bank of St. Louis Working Paper 2020-023. ↩︎
Luc Boltanski and Eve Chiapello (2018), The New Spirit of Capitalism. Verso Books. ↩︎
Ibid., p. 319. ↩︎
In this piece, the author refers to ‘a Danish mathematician’, who ‘explained one clever rationale for randomisation’. I looked it up, and I think the theory is this: If the impact high performers make on their firm is bigger in absolute terms than the impact made by low performers, then Jensen's Inequality holds that the overall impact on the firm will be positive. ↩︎
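For what it’s worth, here is a minimal sketch of that reasoning in symbols, under the assumption (my framing, not the piece’s) that a hire’s impact f(x) is a convex function of their ability x, i.e. that star hires add more in absolute terms than weak hires subtract:

```latex
% Hypothetical framing: a hire's impact f is a convex function of
% ability X. Jensen's inequality then gives
\[
  \mathbb{E}\!\left[ f(X) \right] \;\geq\; f\!\left( \mathbb{E}[X] \right)
\]
% Read: the expected impact of a candidate drawn at random is at least
% the impact of the 'average' candidate, because the occasional
% outstanding hire more than offsets the weak ones.
```

Whether impact really is convex is an open question, but it is at least a coherent rationale for the coin toss.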