AI hiring tools may be filtering out the best job applicants
She adds that job candidates rarely know whether these tools are the sole reason companies reject them – by and large, the software doesn't tell users how they've been evaluated. Yet she says there are many glaring examples of systemic flaws.
In one case, a user who'd been screened out submitted the same application but tweaked their birthdate to make themselves younger. With this change alone, they landed an interview. At another company, an AI resume screener had been trained on the CVs of employees already at the firm, so it gave applicants extra marks if they listed "baseball" or "basketball" – hobbies linked to more successful staff, who were often men. Those who mentioned "softball" – typically women – were downgraded.
Marginalised groups often “fall through the cracks, because they have different hobbies, they went to different schools”, says Schellmann.
In some cases, biased selection criteria are easy to spot – like ageism or sexism – but in others, they are opaque. In her research, Schellmann applied for a call centre job that used AI screening, then logged in from the employer's side. She'd received a high rating in the interview, despite speaking nonsense German when she was supposed to be speaking English, yet the tool gave her a poor rating for the genuinely relevant credentials on her LinkedIn profile.
She worries the negative effects will spread as the technology does. “One biased human hiring manager can harm a lot of people in a year, and that’s not great,” she says. “But an algorithm that is maybe used in all incoming applications at a large company… that could harm hundreds of thousands of applicants.”
‘No-one knows exactly where the harm is’
“The problem [is] no-one knows exactly where the harm is,” she explains. And, given that companies have saved money by replacing human HR staff with AI – which can process piles of resumes in a fraction of the time – she believes firms may have little motivation to interrogate kinks in the machine.