Incident 37: Female Applicants Down-Ranked by Amazon Recruiting Tool

Description: Amazon shuts down internal AI recruiting tool that would down-rank female applicants.

Alleged: Amazon developed and deployed an AI system, which harmed female applicants.

Incident Stats

Incident ID
37
Report Count
33
Incident Date
2016-08-10
Editors
Sean McGregor

CSET Taxonomy Classifications

Taxonomy Details

Full Description

In 2015, Amazon scrapped an internal recruiting algorithm developed by its Edinburgh office that down-ranked resumes containing the word "women's" or the names of two women's colleges. The algorithm rated applicants on a scale of one to five stars and gave preference to resumes containing what Reuters called "masculine language," such as strong verbs like "executed" and "captured". These patterns arose because the engineers who built the algorithm trained it on resumes submitted to the company over the previous ten years, and those past applicants were predominantly male, reflecting the male-dominated makeup of the technology industry.
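
The reporting describes the mechanism only at a high level; the following toy Python sketch (not Amazon's actual system, and using entirely invented resume data) illustrates how a text-based scorer trained on historically skewed hiring outcomes can end up assigning a negative weight to a term like "women's".

```python
# Hypothetical illustration only: a toy resume scorer trained on invented,
# historically skewed hiring data. It is NOT a reconstruction of Amazon's tool.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy historical data: past resumes and whether the candidate was hired.
# The skew (no hires among resumes mentioning "women's") mirrors the
# male-dominated applicant history described above.
resumes = [
    "executed migration of backend services",
    "captured requirements and executed delivery plan",
    "led backend team, executed platform rewrite",
    "captain of the women's chess club, built compilers",
    "president of the women's coding society, shipped features",
    "built distributed systems and executed load tests",
]
hired = [1, 1, 1, 0, 0, 1]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect the learned weight for the token "women" (the vectorizer drops the
# possessive). A negative coefficient means the term lowers the candidate's score.
idx = vectorizer.vocabulary_["women"]
print("learned weight for 'women':", model.coef_[0][idx])
```

The point of the sketch is that no one explicitly programs the penalty: it emerges from correlations in the historical labels, which is the failure mode the incident describes.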

Short Description

Amazon shuts down internal AI recruiting tool that would down-rank female applicants.

Severity

Negligible

Harm Distribution Basis

Sex

Harm Type

Psychological harm, Financial harm

AI System Description

Resume screening tool developed by Amazon to scan resumes and surface strong job applicants for consideration

System Developer

Amazon

Sector of Deployment

Professional, scientific and technical activities

Relevant AI functions

Perception, Cognition

AI Techniques

Natural language processing

AI Applications

Natural language processing

Location

Edinburgh, Scotland

Named Entities

Amazon, Edinburgh

Technology Purveyor

Amazon

Beginning Date

2014-01-01

Ending Date

2015-01-01

Near Miss

Near miss

Intent

Accident

Lives Lost

No

Data Inputs

Resumes

Amazon ditches sexist AI
information-age.com

Amazon abandoned sexist AI recruitment tool
channels.theinnovationenterprise.com

Is AI Sexist?
wellesley.edu

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
