AI Incident Database

Report 617

Associated Incidents

Incident 37 · 34 Reports
Amazon’s Experimental Hiring Tool Allegedly Displayed Gender Bias in Candidate Rankings

Amazon Shuts Down Its Problematic Sexist AI Recruitment System
mansworldindia.com · 2018

The tech giant canned its experimental recruitment system, which was riddled with problems, according to Reuters.

Amazon set up the recruiting system back in 2014, hoping to automate the entire hiring process. It used artificial intelligence to give candidates scores ranging from one to five stars, then returned the top five candidates with the highest ratings and qualifications.

But machine-learning specialists found a huge problem with this seemingly perfect system: it was sexist. The models had been trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period, the majority of which came from men.

Thus, Amazon’s AI learned to prefer male candidates and started penalizing resumes that included the word “women’s”. For example, a resume containing “captain of the women’s basketball team” would be pushed further down the list of suitable candidates.
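Reuters did not publish the model itself; the following is a purely illustrative sketch of the kind of keyword-correlated penalty the article describes. Every function name, score, and penalty value here is invented for illustration, not taken from Amazon's system.

```python
# Hypothetical sketch: a scorer that docks points whenever a resume
# contains a term the model has (spuriously) learned to penalize.
# All names and numbers are invented for illustration.

def score_resume(text, base_score=4.0, penalized_terms=("women's",), penalty=1.5):
    """Return a 1-5 star score, subtracting a penalty per flagged term."""
    score = base_score
    for term in penalized_terms:
        if term in text.lower():
            score -= penalty
    return max(1.0, min(5.0, score))

resumes = {
    "A": "captain of the chess club",
    "B": "captain of the women's basketball team",
}
# Rank candidates by score, highest first.
ranked = sorted(resumes, key=lambda k: score_resume(resumes[k]), reverse=True)
# Resume B is pushed down solely for containing the word "women's".
```

The point of the sketch is that the bias needs no explicit gender field: a single correlated token in the training data is enough to skew the ranking.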

Amazon made changes to the program and claimed to have fixed the issue, but people were not convinced, believing the AI might find other discriminatory ways to rank women lower.

Amazon defended itself by saying that recruiters looked at the resumes the system put forward but never relied on it entirely. The company also emphasized its commitment to workplace diversity and equality.

Reports of the flawed system lent further support to women's accounts of the persistent gender disparity in the tech industry.


2024 - AI Incident Database