


AI tools are biased in ranking candidate resumes, study finds

Artificial intelligence tools – particularly large language models – appear to show significant racial, gender and intersectional biases when ranking job candidates’ resumes based on perceptions of their names, according to new research presented recently at the Association for the Advancement of Artificial Intelligence/Association for Computing Machinery Conference on AI, Ethics and Society.

Out of 550 real resumes, AI tools favored names associated with white people 85% of the time and names associated with women only 11% of the time. The tools never favored names associated with black men over names associated with white men, the researchers found.

“The use of AI tools for recruiting procedures is already widespread, and it is proliferating faster than we can regulate it,” Kyra Wilson, lead author and doctoral student at the University of Washington, said in a statement.

“Currently, outside of a New York City law, there is no independent regulatory audit of these systems, so we do not know whether they are biased and discriminatory based on protected characteristics such as race and sex,” Wilson said. “And because many of these systems are proprietary, we are limited to analyzing how they work by approximating real-world systems.”

Wilson and her colleagues took 120 first names typically associated with white and black men and women and varied them across the resumes. They used LLMs from three companies – Mistral AI, Salesforce and Contextual AI – to rank the resumes against more than 500 real job postings across nine occupations, totaling more than 3 million comparisons between resumes and job descriptions.
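The audit design described above can be sketched in a few lines: the same resume text is paired with first names from different demographic groups, the model under test scores each version, and head-to-head wins are tallied per group. The names, the `score_resume` function and its toy scoring logic below are illustrative stand-ins, not the study’s actual data or models.

```python
import itertools
from collections import Counter

# Hypothetical name lists per demographic group (illustrative only).
NAMES = {
    "white_male": ["Todd", "Brad"],
    "white_female": ["Allison", "Meredith"],
    "black_male": ["Darnell", "Tyrone"],
    "black_female": ["Latoya", "Tanisha"],
}

def score_resume(name: str, resume: str, job: str) -> float:
    """Stand-in for a proprietary model's relevance score.

    A real audit would call the LLM under test here; this toy
    version just derives a pseudo-random score so the example runs.
    """
    return (hash((name, resume, job)) % 1000) / 1000

def audit(resume: str, job: str) -> Counter:
    """Count which group's name wins each head-to-head comparison."""
    wins = Counter()
    for g1, g2 in itertools.combinations(NAMES, 2):
        for n1 in NAMES[g1]:
            for n2 in NAMES[g2]:
                s1 = score_resume(n1, resume, job)
                s2 = score_resume(n2, resume, job)
                wins[g1 if s1 > s2 else g2] += 1
    return wins

wins = audit("10 years of software engineering experience", "Backend engineer")
total = sum(wins.values())
for group, count in sorted(wins.items()):
    print(f"{group}: preferred in {count}/{total} comparisons")
```

Scaled up to hundreds of resumes and job postings, this kind of pairwise tally is how preference rates like the 85%/9% split reported in the study can be computed.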

Overall, the AI tools preferred names associated with white people 85% of the time and names associated with black people 9% of the time, as well as names associated with men 52% of the time and names associated with women 11% of the time.

Applying an intersectional lens revealed further patterns. The smallest disparity occurred between typically white female names and typically white male names. The AI tools never preferred names typically associated with black men over those associated with white men. However, when typically black female names were compared with typically black male names, the tools preferred the female names 67% of the time and the male names 15% of the time.

“We discovered this really unique bias against black men that wasn’t necessarily visible by just looking at race or gender in isolation,” Wilson said. “Intersectionality is currently only a protected attribute in California, but examining multidimensional combinations of identities is extremely important to ensure fairness in an AI system. If it’s not fair, we need to document that so it can be improved.”

Recruiters are investing in and using AI tools in many ways, including task automation, personalized messaging and interview scheduling, according to a Gartner analyst. AI tools can also help with candidate matching and ranking, but it is still the recruiter’s responsibility to review the AI summaries and determine next steps for each candidate, she wrote.

Notably, the Department of Labor published an inclusive hiring framework focused on AI tools. The framework includes guidance on AI implementation, hiring manager duties around diversity and inclusion, tool accessibility, risk management with vendors, and legal compliance.

Looking ahead, HR managers can take proactive steps to avoid algorithmic discrimination when using AI tools, according to a Stradley Ronon partner. For example, HR professionals can establish organizational standards and processes, conduct adverse impact assessments, review vendor contracts, and stay informed about legislative updates.