By Kim Elsesser
Many organizations are struggling to find strategies for recruiting a more diverse workforce, and some are turning to artificial intelligence (AI). But artificial intelligence got a bad rap last year when Amazon discovered its AI recruiting tool had “learned” gender bias. So, is AI beneficial to those seeking diversity, or will it just exacerbate the problem? One recruiting firm has found that AI is an effective strategy for increasing the diversity of candidate pools, as long as it's implemented correctly.
The Problems With Using AI In Recruiting
Last year, Amazon scrapped its AI recruiting efforts after realizing that its recruiting program was biased against women. The computer models were trained with résumés submitted to Amazon over the previous ten years. Not surprisingly, most of these résumés came from men. Therefore, the computer models “learned” that men were superior job candidates.
As a result, the models became biased against women, assigning lower ratings to graduates of two all-women’s colleges and penalizing résumés that contained the word “women’s,” as in “women’s chess club captain.” Although the programs could be altered to become neutral to these particular terms, Amazon decided to scrap the project out of fear that the models might develop other discriminatory selection criteria.
The Benefits Of Using AI In Recruiting
Despite Amazon’s setbacks, Genevieve Jurvetson, cofounder and CMO of the artificial intelligence-based recruiting firm Fetcher, believes AI is an essential tool for those seeking a more diverse candidate pool. “If you get too caught up in the fears surrounding AI, you miss a huge opportunity,” Jurvetson says. She explains that there are innumerable criteria you can use to find good candidates that are not remotely gender-related. Jurvetson claims that Fetcher’s use of AI techniques allows it to bring minorities and women into its clients’ hiring pipelines in a way other recruiting methods just can’t.
Why are AI techniques so effective? We all know humans have biases. Take gender bias, for example. After a lifetime of exposure to mostly men in leadership roles, in boardrooms and in tech jobs, we develop a bias to prefer men in these positions. The big advantage of artificial intelligence is that it removes the humans and their biases from a large chunk of the recruiting process. The more you can remove human intervention, particularly at the stages of the recruiting process that are most prone to bias, the less that bias will be able to influence decision-making.
Although Fetcher allows clients to “thumbs up or thumbs down” potential candidates, Jurvetson thinks the most diverse candidate pools come from what the firm calls its fully automated mode. In this mode, Fetcher homes in on client preferences by asking the client to review a handful of candidates. Then, Fetcher’s systems take control, assembling a candidate pool and contacting potential candidates directly. This way, a client’s potential biases are removed from the process of choosing who will be contacted. As Jurvetson describes it, “When you pull yourself out of the process, that’s a really important step because when you’re not hand-selecting each candidate, you’re not bringing in these inherent biases that we all have.”
One reason that organizations have trouble increasing diversity is that the traditional methods of searching for candidates are often biased. The problem, Jurvetson explains, is that searches often use bias-ridden proxies for candidate potential. “Did they go to a top 20 university? Did they come from a top-tier company? Those tend to be pools of talent that might not be that diverse to start with. Using AI to be able to identify patterns, career progression is a great one, that correlate with success better than those old proxies is exciting and effective,” she explains.
According to Jurvetson, searches using these less-traditional proxies for success, identified by artificial intelligence programs, allow Fetcher to find candidates who might otherwise go overlooked. This lets organizations cast a wider net than traditional searches allow.
Research confirms there are huge advantages to casting a wide net when assembling a candidate pool if you want to increase diversity. One study found that if you have only one female or underrepresented minority candidate in your pool, that candidate has almost no chance of making it to the offer stage. That’s because the lone woman or minority seems too different from the norm. However, if you add a second female or minority candidate, their odds of making it to the final round increase dramatically, and they have the same chance of receiving an offer as the other candidates.
What keeps the Fetcher models from having the same issue as Amazon’s failed model? Jurvetson suggests that precautions need to be taken to minimize any unintended effects. At Fetcher, she says they keep their models simple, and they keep a trained human eye on the models and the outputs to be sure the recruiting criteria are gender-neutral and that the models are returning diverse candidate pools.
The Future Of AI In Recruiting
Recently, Tomas Chamorro-Premuzic and Reece Akhtar made an argument in Harvard Business Review for taking the use of AI techniques one step further and applying them to the interview process. The researchers report, “One of the major problems with the way we currently interview job candidates is that the process is largely unstructured, leaving the questioning to the whims and fancies of the interviewer. It shouldn’t take much convincing to see how this is not only inefficient, but how it also leads to biased decision-making due to interviewers expressing and seeking to confirm their own preferences. This is where video or digital interviews are likely to help. Digital interviews can remove these limitations almost entirely.”
While optimistic about their potential, Jurvetson doesn’t think AI job interviews will take over any time soon. “I think if used thoughtfully, they can be really powerful and helpful for a lot of candidates, especially at early stages of the recruiting process, but I don’t have to tell you all the ways that can go wrong as well. You have to be smart about how and when you utilize these types of interviews, or racial or gender issues could come into play in a really sad way, and you could also negatively impact the candidate experience,” she reports. She added one more limitation: “I’ve heard from candidates interviewing for more experienced roles that they sometimes find [automated interviews] insulting, because they can give the impression that the potential employer isn’t willing to give you their time.”
For those looking to create a more diverse workforce, removing human decision-making from at least part of the recruiting process seems like a no-brainer for reducing unconscious bias in hiring. Implemented correctly, AI tools can be a great way to search in an unbiased manner. And for the job seekers of the future: keep an open mind about a robot interviewer—hopefully it will be programmed to be less biased than a human manager.