Amazon discards ‘sexist’ AI recruiting tool

October 12, 2018 | Expert Insights

Amazon reportedly scrapped an internal recruiting tool that used artificial intelligence to sort through applications after the company’s machine-learning specialists realized the engine favoured male candidates over female ones.

Background

Machine learning is a core technique in the field of artificial intelligence (AI) that involves feeding automated systems troves of data about a particular subject and training them to learn, improve and become more accurate with experience, without being explicitly programmed.

As machine learning and automation gain traction across the globe due to low-cost computing power and improved efficiency, there are still concerns about the dangers of inherent bias in the use of algorithms. Although these algorithms are not trained to favour one outcome over another, they are susceptible to unintended biases since they do not operate on context, but learn from the data they ingest.

Analysis

Amazon decided to discard an AI-powered recruiting tool after its team discovered the program was biased against women. The AI was developed by a team at Amazon’s Edinburgh office in 2014 as a way to automatically sort through job applicants’ resumes and select promising candidates, Reuters reported, citing five unnamed members of the team.

In 2015, the company discovered that the tool did not rate candidates for software developer jobs and other technical positions in a gender-neutral manner.

The automated tool was trained to identify promising applicants by observing patterns in resumes submitted to Amazon over a 10-year period. It would then assign job candidates scores ranging from one to five stars. However, most of the resumes came from men, given that the tech industry is notoriously male-dominated.

Based on the data it used, the system taught itself to prefer male candidates over female ones, and penalized resumes that included words like “women’s”, such as “women’s chess club captain”. It also downgraded graduates of two unnamed all-women’s colleges and gave preference to what Reuters referred to as “masculine language” and the use of stronger verbs such as “executed” or “captured.”
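The mechanism described above can be sketched with a toy example. The snippet below (entirely hypothetical data, and a deliberately naive word-scoring model rather than Amazon’s actual system) shows how a model trained on skewed historical hiring outcomes ends up penalizing words like “women’s” simply because they co-occurred with past rejections:

```python
# Toy illustration of data-driven bias, using made-up resumes and outcomes.
from collections import Counter

# Hypothetical historical resumes with past outcomes (1 = hired, 0 = rejected).
# The data is male-dominated, so "women's" co-occurs mostly with rejections.
history = [
    ("executed projects chess club", 1),
    ("captured market led team", 1),
    ("women's chess club captain", 0),
    ("women's coding society member", 0),
    ("led projects executed launch", 1),
]

hired, rejected = Counter(), Counter()
for text, label in history:
    (hired if label else rejected).update(text.split())

def word_weight(word: str) -> int:
    # Positive if the word appeared more often in hired resumes, negative otherwise.
    return hired[word] - rejected[word]

def score(resume: str) -> int:
    # Sum of per-word weights -- the model has no notion of context or fairness,
    # only of which words correlated with past outcomes.
    return sum(word_weight(w) for w in resume.split())

print(score("executed chess club"))   # positive: "executed" appeared in hires
print(score("women's chess club"))    # negative: "women's" appeared only in rejections
```

The point of the sketch is that no rule against women was ever written; the negative weight on “women’s” emerges purely from the imbalance in the training data, which is why tweaking such a model offers no guarantee against new discriminatory patterns.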

Although Amazon attempted to tweak the program to address the bias and make it more neutral, there was no guarantee that it would not devise another manner of sifting through resumes that could be discriminatory.

In early 2017, Amazon executives lost faith in the tool’s ability to be neutral and the project was abandoned. Amazon said the tool was “never used by Amazon recruiters to evaluate candidates.” Anonymous sources familiar with the matter told Reuters that the company’s recruiters looked at recommendations generated by the tool but did not rely on its rankings when filtering candidates. The company now uses a “much-watered down version” of the recruiting tool, while a new team has been formed in Edinburgh to try out automated employment screening again, this time with a focus on diversity. Female employees currently make up 40% of the company’s workforce.

A number of major companies, including Goldman Sachs, LinkedIn, and Microsoft, have integrated tools based on machine learning and predictive analytics into their recruiting and talent-development processes to streamline hiring.

The report highlights the increasing adoption of AI and automation across various sectors, as well as their limitations and the unexpected challenges they present. According to a 2017 survey by talent software firm CareerBuilder, about 55% of US human resources managers said AI would become a part of their regular work within the next five years.

Counterpoint

Many startups specializing in AI-powered recruiting solutions are promoting the idea that such tools can eliminate human bias in hiring decisions. However, hiring algorithms can only reduce bias in the hiring process if the input data is accurate and unbiased. Given that these algorithms are trained on historical data, any earlier biases embedded in that data must be addressed and eliminated.

Assessment

Our assessment is that the problems that plagued Amazon’s recruiting tool showcase the limitations of AI and automation. As AI becomes increasingly commonplace in daily life, with companies looking to incorporate smart technology in the pursuit of optimization, we believe these algorithms and methods must be thoroughly evaluated and regularly audited for bias. We also believe these tasks – although complex and challenging – are necessary as an ethical and societal responsibility to ensure there are no discriminatory results or negative outcomes.