AI Incident Database

Report 624

Associated Incidents

Incident 37 · 33 Reports
Female Applicants Down-Ranked by Amazon Recruiting Tool

Amazon scraps 'sexist AI' recruiting tool that showed bias against women
telegraph.co.uk · 2018

Amazon has scrapped a "sexist" internal tool that used artificial intelligence to sort through job applications.

The AI was created by a team at Amazon's Edinburgh office in 2014 as a way to automatically sort through CVs and pick out the most promising candidates.

However, it quickly taught itself to prefer male candidates over female ones, according to members of the team who spoke to Reuters.

They noticed that it was penalising CVs that included the word "women's," such as "women's chess club captain." It also reportedly downgraded graduates of two all-women's colleges.

The problem stemmed from the fact that the system was trained on data submitted by people over a 10-year period, most of which came from men.
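
A minimal, purely illustrative sketch of the mechanism described above (synthetic data and a toy perceptron, not Amazon's actual system): when the historical hiring labels skew against CVs containing a token such as "women's", a learner that scores CVs by summing per-token weights drives that token's weight negative.

```python
# Toy illustration with made-up data: a resume screener trained on skewed
# historical hiring outcomes learns to penalise the token "women's" itself.

# Synthetic historical outcomes (1 = hired, 0 = rejected). CVs containing
# "women's" were rarely marked as hires, mirroring a male-dominated pool.
training_data = [
    (["python", "leadership"], 1),
    (["java", "chess", "captain"], 1),
    (["python", "women's", "chess", "captain"], 0),
    (["java", "women's", "college"], 0),
    (["python", "college"], 1),
    (["java", "leadership"], 1),
]

weights = {}  # token -> learned contribution to a CV's score


def score(tokens):
    """Score a CV as the sum of its tokens' learned weights."""
    return sum(weights.get(t, 0.0) for t in tokens)


# Simple perceptron updates: tokens appearing on rejected CVs are pushed
# negative, tokens on hired CVs are pushed positive.
for _ in range(10):
    for tokens, label in training_data:
        predicted = 1 if score(tokens) > 0 else 0
        for t in tokens:
            weights[t] = weights.get(t, 0.0) + 0.1 * (label - predicted)

print(weights["women's"])  # ends up negative: the word itself is penalised
```

The point of the sketch is that nothing in the model "knows" gender; the penalty emerges purely from replicating historical outcomes, which is the failure mode Amazon's team reportedly observed.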

The AI was tweaked in an attempt to fix the bias. However, last year Amazon lost faith in the system's ability to be neutral and abandoned the project altogether.

Amazon recruiters are believed to have consulted the system's recommendations when hiring, but did not rely on its rankings. Currently, women make up 40pc of Amazon's workforce.

Stevie Buckley, the co-founder of UK job website Honest Work, which is used by companies such as Snapchat to recruit for technology roles, said that “the basic premise of expecting a machine to identify strong job applicants based on historic hiring practices at your company is a surefire method to rapidly scale inherent bias and discriminatory recruitment practices.”
