AI Incident Database

Report 601

Associated Incidents

Incident 37 · 34 Reports
Amazon’s Experimental Hiring Tool Allegedly Displayed Gender Bias in Candidate Rankings

Amazon scraps ‘sexist’ AI hiring tool
news.com.au · 2018


Amazon was forced to abandon a secret artificial intelligence recruiting tool after discovering it was discriminating against women.

According to a report in Reuters, Amazon engineers had been building a computer program since 2014 to review resumes, with the goal of automating the talent search process.

The tool would give job candidates a score from one to five stars.

“Everyone wanted this holy grail,” one source told the news agency. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”

After a year, however, Amazon realised its system was favouring male candidates for software developer and other technical roles, because it was observing patterns in resumes submitted over a 10-year period — most of which came from men.

It also penalised resumes that included the word “women’s”, according to Reuters, as in the phrase “women’s chess club captain”, and downgraded graduates of some all-women’s colleges.

Even though the program was edited to make it neutral to those terms, the programmers couldn’t guarantee the AI would not teach itself to sort candidates in other discriminatory ways, the report said.
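The failure mode Reuters describes can be illustrated with a toy, entirely hypothetical scorer (this is not Amazon’s system, and the data below is invented): a model that learns per-word weights from historical hiring outcomes will assign a negative weight to a token like “women’s” whenever that token happens to correlate with rejection in the biased labels, even though the word says nothing about ability.

```python
import math
from collections import Counter

# Synthetic "historical" resumes and outcomes (1 = hired, 0 = rejected).
# The labels, not the words themselves, carry the bias.
history = [
    ("software engineer java leadership", 1),
    ("software developer python leadership", 1),
    ("java developer chess club captain", 1),
    ("python engineer women's chess club captain", 0),
    ("software developer women's college leadership", 0),
]

def train_word_weights(data, smoothing=1.0):
    """Weight each word by the smoothed log-odds of appearing in a
    hired vs. rejected resume -- a crude stand-in for what a resume
    ranker fit on biased historical labels can learn."""
    hired, rejected = Counter(), Counter()
    for text, label in data:
        (hired if label else rejected).update(text.split())
    vocab = set(hired) | set(rejected)
    return {
        w: math.log((hired[w] + smoothing) / (rejected[w] + smoothing))
        for w in vocab
    }

def score(resume, weights):
    """Sum the learned weights of the resume's words."""
    return sum(weights.get(w, 0.0) for w in resume.split())

weights = train_word_weights(history)

# The token "women's" gets a negative weight purely from the labels.
print(weights["women's"] < 0)  # True

# Two otherwise comparable resumes diverge because of that one token.
print(score("java leadership", weights) >
      score("women's chess club leadership", weights))  # True
```

This also suggests why, as the report notes, deleting the offending terms is not a guarantee of fairness: any other token that happens to correlate with gender in the training data (a college name, a sport, a society) can pick up the same signal.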

The project was eventually scrapped altogether in early 2017.

It’s understood the project was only ever used in a developmental phase, never independently, and never rolled out to a larger group.

It was reportedly abandoned for several reasons, chiefly that it never returned strong candidates for the roles, rather than because of the bias issue.

An Amazon spokeswoman said, “This was never used by Amazon recruiters to evaluate candidates.”

frank.chung@news.com.au


2024 - AI Incident Database