More and more companies are using AI to recruit and hire new employees, and AI can play a part at almost every stage of the recruitment process. Covid-19 has fueled new demand for these technologies. Curious Thing and HireVue, companies specializing in AI-based interviews, have reported an increase in activity during the pandemic.
Most job hunts, however, start with a simple search. Job seekers turn to platforms like LinkedIn, Monster, or ZipRecruiter, where they can upload their resumes, browse job postings, and apply for openings.
The goal of these websites is to match qualified candidates with available positions. To organize all these openings and candidates, many platforms employ AI-based recommendation algorithms. The algorithms, sometimes called matching engines, process information from both the job seeker and the employer to compile a list of recommendations for each.
“You typically hear the anecdote that a recruiter spends six seconds looking at your resume, right?” says Derek Kan, vice president of product management at Monster. “When we look at the recommendation engine we’ve built, you can reduce that time to milliseconds.”
Most matching engines are optimized to generate applications, says John Jersin, the former vice president of product management at LinkedIn. These systems base their recommendations on three categories of data: information that the user provides directly to the platform; data assigned to the user based on others with a similar set of skills, experiences, and interests; and behavioral data, such as how often a user responds to messages or interacts with job postings.
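The three data categories can be pictured as inputs to a single relevance score. The sketch below is purely illustrative, not LinkedIn's actual system; every field name and weight is a hypothetical stand-in for signals the article only describes in general terms.

```python
# Illustrative sketch of a matching-engine score combining three signals.
# All field names and weights are hypothetical, chosen for the example.

def relevance_score(profile, inferred, behavior, job):
    """Toy relevance score for one (user, job) pair."""
    # 1. Information the user provides directly (e.g. listed skills)
    skill_overlap = len(set(profile["skills"]) & set(job["required_skills"]))

    # 2. Data assigned to the user based on similar users
    #    (e.g. interests attributed from people with similar profiles)
    inferred_overlap = len(set(inferred["interests"]) & set(job["tags"]))

    # 3. Behavioral data (e.g. how often the user responds to messages)
    engagement = behavior["response_rate"]  # assumed to be in [0, 1]

    # Hypothetical weighting of the three signals
    return 2.0 * skill_overlap + 1.0 * inferred_overlap + 3.0 * engagement


profile = {"skills": ["python", "sql"]}
inferred = {"interests": ["data", "ml"]}
behavior = {"response_rate": 0.5}
job = {"required_skills": ["python"], "tags": ["data"]}

print(relevance_score(profile, inferred, behavior, job))  # 2.0 + 1.0 + 1.5 = 4.5
```

The key point the article makes is that the third signal, behavior, is where group differences can leak in even when protected attributes are excluded.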
In the case of LinkedIn, these algorithms exclude a person's name, age, gender, and race, because those characteristics can contribute to bias in automated processes. But Jersin's team found that even so, the service's algorithms could still detect patterns of behavior exhibited by groups with particular gender identities.
For example, while men are more likely to apply for jobs that require work experience beyond their qualifications, women tend to go only for jobs whose requirements match their qualifications. The algorithm interprets this behavioral variation and adjusts its recommendations in a way that inadvertently disadvantages women.
“You can recommend, for example, more senior jobs to one group of people than another, even if they are qualified at the same level,” says Jersin. “Those people may not be exposed to the same opportunities. And that’s really the impact we’re talking about here.”
Men also tend to include more skills on their resumes, at a lower degree of proficiency, than women, and they often engage more aggressively with recruiters on the platform.
To address such issues, Jersin and his team at LinkedIn built a new AI in 2018 designed to produce more representative results. It was essentially a separate algorithm designed to counteract recommendations skewed toward a particular group. Before returning the matches curated by the original engine, the new AI ensures that the recommendation system includes a representative distribution of users across genders.
Kan says Monster, which lists 5 to 6 million jobs at any given time, also incorporates behavioral data into its recommendations but does not correct for bias in the same way LinkedIn does. Instead, the marketing team focuses on getting users from diverse backgrounds signed up for the service, and the company then depends on employers to report back and tell Monster whether or not it has sent them a representative group of candidates.
Irina Novoselsky, CEO of CareerBuilder, says the company is focused on using the data the service collects to teach employers how to eliminate bias from their job postings. For example, “When a candidate reads a job description with the word ‘rockstar,’ there is a materially lower percentage of women who apply,” she says.
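A tool that helps employers spot such language could be as simple as scanning a posting against a list of terms known to depress applications from some groups. The sketch below is hypothetical; the word list is invented for illustration (only “rockstar” comes from the article), and a real system like CareerBuilder's would presumably derive its list from measured application data.

```python
# Illustrative sketch: flag terms in a job description that have been
# associated with lower application rates. The word list is hypothetical;
# only "rockstar" is taken from the article.
FLAGGED_TERMS = {"rockstar", "ninja", "guru"}

def flag_terms(description: str) -> list[str]:
    """Return the flagged terms found in a job description, sorted."""
    words = {w.strip(".,!?").lower() for w in description.split()}
    return sorted(words & FLAGGED_TERMS)


print(flag_terms("We need a Python rockstar and data ninja!"))
# ['ninja', 'rockstar']
```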