Martin Burch had been working for the Wall Street Journal and its parent company Dow Jones for a few years and was looking for new opportunities. One Sunday in May 2021, he applied for a data analyst position at Bloomberg in London that looked like the perfect fit. He received an immediate response, asking him to take a digital assessment.
It was strange. The assessment showed him different shapes and asked him to figure out the pattern. He grew incredulous. “Shouldn’t we be testing my abilities on the job?” he asked himself.
The next day, a Monday, which happened to be a public holiday in the UK, he got a rejection email. He decided to email a recruiter at Bloomberg. Maybe the company made a mistake?
What Burch discovered offers insight into a larger phenomenon that is baffling experts: with job openings at record levels in both the UK and the US, why do many people still have to apply to sometimes hundreds of jobs, even in sought-after fields like software development, while many companies complain they can’t find the right talent?
Some experts argue that algorithms and artificial intelligence now used extensively in hiring are playing a role. This is a huge shift, because until relatively recently, most hiring managers would handle applications and resumes themselves. Yet recent findings have shown that some of these new tools discriminate against women and use criteria unrelated to work to “predict” job success.
While companies and vendors are not required to disclose if they use artificial intelligence or algorithms to select and hire job applicants, in my reporting I have learned that this is widespread. All the leading job platforms – including LinkedIn, ZipRecruiter, Indeed, CareerBuilder, and Monster – have told me they deploy some of these technologies.
Ian Siegel, the CEO of ZipRecruiter, said that artificial intelligence and algorithms have already conquered the field. He estimates that at least three-quarters of all resumes submitted for jobs in the US are read by algorithms. “The dawn of robot recruiting has come and went and people just haven’t caught up to the realization yet,” he said.
A 2021 survey of recruiting executives by the research and consulting firm Gartner found that almost all reported using AI for at least one part of the recruiting and hiring process.
Yet the technology is not foolproof. One of the most consequential findings comes from Harvard Business School professor Joe Fuller, whose team surveyed more than 2,250 business leaders in the US, UK and Germany. Their main motives for using algorithmic tools were efficiency and cost savings. Yet 88% of executives said they know their tools reject qualified candidates.
Despite the prevalence of the technology, only a few cases of misfires have become widely known. A few years back, Amazon discovered that its resume screening tool was biased against women. The algorithm was trained on the resumes of current employees, who skewed male, reflecting a gender disparity in many tech fields. Over time, the tool picked up on the patterns associated with male candidates and systematically downgraded resumes containing the word “women’s,” as in “women’s chess club” or “women’s soccer team.” Amazon’s engineers tried to fix the problem, but they couldn’t, and the company discontinued the tool in 2018.
“This project was only ever explored on a trial basis, and was always used with human supervision,” said Amazon spokesperson Brad Glasser.
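To see how this kind of bias can creep in, consider a toy sketch (not Amazon’s actual system; all resumes, labels and tokens here are invented) of a text classifier trained on past hiring decisions that skewed against candidates whose resumes mention women’s activities. The model learns a negative weight for the word itself, and then penalizes any new resume that contains it:

```python
# Toy illustration only: a classifier trained on historically biased hiring
# outcomes can learn to penalize a term like "women's". All data is invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical "resumes" and past hiring decisions (1 = hired, 0 = rejected).
# The outcomes mirror a male-skewed workforce, not candidate quality.
resumes = [
    "software engineer python chess club",
    "data analyst sql captain women's chess club",
    "software engineer java soccer team",
    "data analyst python women's soccer team",
    "software engineer python machine learning",
    "data analyst sql women's coding society",
]
hired = [1, 0, 1, 0, 1, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight for the token "women" comes out negative, so any new
# resume containing it is scored lower, regardless of the candidate's skills.
idx = vectorizer.vocabulary_["women"]
print("weight for 'women':", model.coef_[0][idx])
```

The point of the sketch is that no one has to program the bias explicitly; it is absorbed from the historical data the system is trained on.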
AI vendors that build these kinds of technologies say that algorithm-based tools democratize the hiring process by giving everyone a fair chance. If a company is drowning in applications, human recruiters read only a fraction of them. An AI analyzes all of the applications and assessments and judges every candidate the same way.