AI Incident Database

Women applying to Amazon

Incidents Harmed By

Incident 37 · 34 Reports
Amazon’s Experimental Hiring Tool Allegedly Displayed Gender Bias in Candidate Rankings

2016-08-10

Between 2014 and 2017, Amazon reportedly developed an AI-powered recruiting tool to score job applicants, trained on a decade of resumes that purportedly came largely from men. Media reports say the system learned to favor male candidates, penalizing terms like "women's" and downgrading graduates of certain all-women's colleges. Efforts to remove these biases reportedly could not guarantee fairness, and the project was ultimately abandoned. Amazon reportedly stated that recruiters never relied solely on the tool.


Related Entities
Other entities related to the same incident. For example, if this entity is the developer of an incident and another entity is the deployer, both are marked as related entities.

Entity

Amazon

Incidents involved as both Developer and Deployer
  • Incident 37 · 34 Reports

    Amazon’s Experimental Hiring Tool Allegedly Displayed Gender Bias in Candidate Rankings

Entity

Amazon applicants

Incidents Harmed By
  • Incident 37 · 34 Reports

    Amazon’s Experimental Hiring Tool Allegedly Displayed Gender Bias in Candidate Rankings

Entity

Amazon experimental AI resume scoring engine

Incidents implicated systems
  • Incident 37 · 34 Reports

    Amazon’s Experimental Hiring Tool Allegedly Displayed Gender Bias in Candidate Rankings

Entity

Associated machine learning models trained on historical Amazon resume data

Incidents implicated systems
  • Incident 37 · 34 Reports

    Amazon’s Experimental Hiring Tool Allegedly Displayed Gender Bias in Candidate Rankings


2024 - AI Incident Database
