Incident 37: Female Applicants Down-Ranked by Amazon Recruiting Tool

Description: Amazon shuts down internal AI recruiting tool that would down-rank female applicants.


Alleged: Amazon developed and deployed an AI system, which harmed female applicants.

Incident Stats

Incident ID: 37
Report Count
Incident Date
Editors: Sean McGregor

CSETv0 Taxonomy Classifications

Taxonomy Details

Full Description

In 2015, Amazon scrapped an internal recruiting algorithm, developed by its Edinburgh office, that down-ranked resumes containing the word "women's" or the names of two all-women's colleges. The algorithm rated applicants on a scale of one to five stars and gave preference to resumes containing what Reuters called "masculine language," such as strong verbs like "executed" or "captured." These patterns occurred because the engineers who built the algorithm trained it on resumes submitted to the company over the previous ten years, and those past candidates, reflecting the industry at large, were predominantly male.
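The Full Description above describes the mechanism: a ranker trained on a decade of male-skewed hiring outcomes learns negative weights for words like "women's". The sketch below illustrates that dynamic with entirely made-up toy data and a simple log-odds word score; it is not Amazon's actual system, whose details were never published.

```python
# Minimal sketch (hypothetical toy data) of how a resume ranker trained on
# historically male-dominated hiring outcomes learns to penalize gendered terms.
import math
from collections import Counter

# Toy historical data: resume text, labeled 1 if the candidate was hired.
# Because past hires skewed male, "women's" appears only in rejected resumes.
history = [
    ("executed project captured market", 1),
    ("executed systems led team", 1),
    ("captured requirements executed plan", 1),
    ("women's chess club captain led team", 0),
    ("women's college graduate executed plan", 0),
]

def word_scores(data, smoothing=1.0):
    """Log-odds of each word appearing in hired vs. rejected resumes."""
    hired, rejected = Counter(), Counter()
    for text, label in data:
        (hired if label else rejected).update(text.split())
    vocab = set(hired) | set(rejected)
    return {
        w: math.log((hired[w] + smoothing) / (rejected[w] + smoothing))
        for w in vocab
    }

def rank(resume, scores):
    """Score a resume as the sum of its learned word weights."""
    return sum(scores.get(w, 0.0) for w in resume.split())

scores = word_scores(history)
# "women's" receives a negative weight purely from the skewed history,
# so any resume mentioning it is down-ranked relative to "masculine language".
assert scores["women's"] < 0
assert rank("executed captured", scores) > rank("women's club", scores)
```

Nothing in the toy model mentions gender explicitly; the bias is inherited entirely from the labels in the historical data, which is the failure mode the reports describe.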

Short Description

Amazon shuts down internal AI recruiting tool that would down-rank female applicants.



Harm Distribution Basis


Harm Type

Psychological harm, Financial harm

AI System Description

Resume-screening tool developed by Amazon to scan resumes and surface strong job applicants for consideration

System Developer


Sector of Deployment

Professional, scientific and technical activities

Relevant AI functions

Perception, Cognition

AI Techniques

Natural language processing

AI Applications

Natural language processing


Location

Edinburgh, Scotland

Named Entities

Amazon, Edinburgh

Technology Purveyor


Beginning Date


Ending Date


Near Miss

Near miss



Lives Lost


Data Inputs


Amazon ditches sexist AI

Amazon abandoned sexist AI recruitment tool

Is AI Sexist?

Amazon scraps 'sexist AI' recruitment tool · 2018

Amazon has scrapped a “sexist” tool that used artificial intelligence to decide the best candidates to hire for jobs.

Members of the team working on the system said it effectively taught itself that male candidates were preferable.

The arti…

Amazon scraps 'sexist AI' recruiting tool that showed bias against women · 2018

Amazon has scrapped a "sexist" internal tool that used artificial intelligence to sort through job applications.

The AI was created by a team at Amazon's Edinburgh office in 2014 as a way to automatically sort through CVs and pick out the m…

How Amazon Accidentally Invented a Sexist Hiring Algorithm · 2018

Amazon discovered a problem with using artificial intelligence to hire: their AI was biased against women.

The Seattle-based company developed computer programs designed to filter through hundreds of resumes and surface the best candidates,…

Amazon Killed Its AI Recruitment System For Bias Against Women-Report · 2018

Machine learning, one of the core techniques in the field of artificial intelligence, involves teaching automated systems to devise new ways of doing things, by feeding them reams of data about the subject at hand. One of the big fears here…

Amazon scrapped 'sexist AI' tool · 2018

Image caption: The algorithm repeated bias towards men, reflected in the technology industry

An algorithm that was being tested as a recruitment tool by online giant Amazon was sexist and had to be scrapped, acco…

Amazon Accidentally Created A 'Sexist' Recruitment Tool, Then Shut It Down · 2018

Machine learning technology is becoming increasingly common across various industries, from policing to recruiting. But reports have shown that many of these systems have long-standing problems regarding discrimination. To avoid amplifying …

Amazon scraps secret AI recruiting tool that showed bias against women · 2018

SAN FRANCISCO (Reuters) - Amazon.com Inc’s (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women.

The team had been building computer programs since 2014 to review job applicants’ resu…

Amazon's AI hiring tool discriminated against women. · 2018

Thanks to Amazon, the world has a nifty new cautionary tale about the perils of teaching computers to make human decisions.

According to a Reuters report published Wednesday, the tech giant d…

Amazon Shuts Down AI Hiring Tool for Being Sexist · 2018

Why Global Citizens Should Care

Gender discrimination in the workplace prevents women from achieving their full potential. Eliminating gender inequality in the workforce would greatly increase economic activity. When half of the populati…

Amazon's AI recruitment tool scrapped for being sexist · 2018

Amazon has been forced to scrap its AI recruitment system after it was discovered to be biased against female applicants.

The AI was developed in 2014 by Amazon as a way of filtering out most candidates to provide the firm with the top five…

Amazon Fired Its Resume-Reading AI for Sexism · 2018

Algorithms are often pitched as being superior to human judgement, taking the guesswork out of decisions ranging from driving to writing an email. But they're still programmed by humans and trained on the data that humans create, which mean…

It turns out Amazon’s AI hiring tool discriminated against women · 2018

Amazon had to scrap its AI hiring tool because it was ‘sexist’ and discriminated against female applicants, a report from Reuters has found.

Amazon’s hopes for creating the perfect AI hiring tool were dashed when it realised that the algori…

Amazon built an AI tool to hire people but had to shut it down because it was discriminating against women · 2018

Amazon tried building an artificial-intelligence tool to help with recruiting, but it showed a bias against women, Reuters reports.

Engineers reportedly found the AI was unfavorable toward fema…

Amazon scraps ‘sexist’ AI hiring tool · 2018

What is artificial intelligence (AI)? We look at the progress of AI and automation in Australia compared to the rest of the world and how the Australian workforce may be affected by this movement.


Amazon ditches sexist AI · 2018

It’s not news to learn that AI can be something of a bigot.

Amazon scrapped an algorithm designed to become a recruitment tool because it was too sexist.

Did you hear the one about my wife — well, she… is a really n…

Amazon's sexist recruiting algorithm reflects a larger gender bias · 2018

AI may have sexist tendencies. But, sorry, the problem is still us humans.

Amazon recently scrapped an employee recruiting algorithm plagued with problems, according to a report from Reuters. Ultimately, the applicant screening algorithm di…

Amazon ditches AI recruitment tool that 'learnt to be sexist' · 2018

London | Amazon has scrapped a "sexist" internal tool that used artificial intelligence to sort through job applications.

The program was created by a team at Amazon's Edinburgh office in 2014 as a way to sort through CVs and pick out the m…

Amazon decided to scrap a machine learning (ML) algorithm it was creating to help automate the recruitment process because the model kept favoring male candidates, Reuters revealed. The discrimination against female candidates has been put …

Amazon ditched AI recruiting tool that favored men for technical jobs · 2018

Specialists had been building computer programs since 2014 to review résumés in an effort to automate the search process

This article is more than 5 months old

Amazon’s machine-learning specialists unc…

Amazon Shuts Down Secret AI Recruiting Tool That Taught Itself to be Sexist · 2018

Artificial intelligence (AI) human resourcing tools are all the rage at the moment and becoming increasingly popular. The systems can speed up, simplify and even decrease the cost of the hiring process becoming every recruiter's dream come …

Amazon trained a sexism-fighting, resume-screening AI with sexist hiring data, so the bot became sexist · 2018

Some parts of machine learning are incredibly esoteric and hard to grasp, surprising even seasoned computer science pros; other parts of…

Amazon scraps sexist AI recruiting tool · 2018

Amazon has scrapped its artificial intelligence hiring tool after it was found to be sexist.

A team of specialists familiar with the project told Reuters that they had been building computer programmes since 2014…

Amazon AI sexist tool scrapped · 2018

So AI may be the future in hiring and recruitment, but it certainly isn't there yet, it seems.

If you're basing its learning on history, which quite possibly may have been biased towards men, then it is likely that it will discriminate agains…

Is Tech Doomed To Reflect The Worst In All Of Us? · 2018

Amazon’s AI gurus scrapped a new machine-learning recruiting engine earlier this month. Why? It transpired that the AI behind it was sexist. What does this mean as we race to produce ever-better artificial intelligence, and how can we under…

The tech giant canned their experimental recruitment system riddled with problems, according to Reuters.

Amazon, back in 2014, set up the recruiting system in place, hoping to mechanize the entire hiring process. It used artificial intellig…

Is AI Sexist? · 2018

Amazon recently scrapped an experimental artificial intelligence (AI) recruiting tool that was found to be biased against women. At this point, I hope you might have a few questions, such as: What is an AI recruiting tool and how does it wo…

Amazon has scrapped a "sexist" internal tool that used artificial intelligence to sort through job applications.

The AI was created by a team at Amazon's Edinburgh office in 2014 as a way to automatically sort through CVs and pick out the m…

Why Amazon's sexist AI recruiting tool is better than a human. · 2018

However, bias also appears for other unrelated reasons. A recent study into how an algorithm delivered ads promoting STEM jobs showed that men were more likely to be shown the ad, not because men were more likely to click on it, but because…

2018 in Review: 10 AI Failures · 2018

Last December Synced compiled its first “Artificial Intelligence Failures” recap of AI gaffes from the previous year. AI has achieved remarkable progress, and many scientists dream of creating the Master Algorithm proposed by Pedro Domingos…

Sexist AI: Amazon ditches recruitment tool that turned out to be anti-women · 2019

It was supposed to make finding the right person for the job easier. However, an AI tool developed by Amazon to sift through potential hires has been dropped by the firm after developers found it was biased against picking women.

From prici…

New York City proposes regulating algorithms used in hiring · 2021

In 1964, the Civil Rights Act barred the humans who made hiring decisions from discriminating on the basis of sex or race. Now, software often contributes to those hiring decisions, helping managers screen résumés or interpret video intervi…

Auditors are testing hiring algorithms for bias, but there’s no easy fix · 2021

I’m at home playing a video game on my computer. My job is to pump up one balloon at a time and earn as much money as possible. Every time I click “Pump,” the balloon expands and I receive five virtual cents. But if the balloon pops before …

AI tools fail to reduce recruitment bias - study · 2022

Artificially intelligent hiring tools do not reduce bias or improve diversity, researchers say in a study.

"There is growing interest in new ways of solving problems such as interview bias," the Cambridge University researchers say, in the …


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity
