AI Incident Database

Report 613

Related Incidents

Incident 37 · 34 Reports
Amazon’s Experimental Hiring Tool Allegedly Displayed Gender Bias in Candidate Rankings

Amazon built an AI tool to hire people but had to shut it down because it was discriminating against women
businessinsider.com.au · 2018

David Ryder/Getty Images Amazon CEO Jeff Bezos.

Amazon tried building an artificial-intelligence tool to help with recruiting, but it showed a bias against women, Reuters reports.

Engineers reportedly found the AI was unfavorable toward female candidates because it had combed through male-dominated résumés to accrue its data.

Amazon reportedly abandoned the project at the beginning of 2017.

Amazon worked on building an artificial-intelligence tool to help with hiring, but the plans backfired when the company discovered the system discriminated against women, Reuters reports.

Citing five sources, Reuters said Amazon set up an engineering team in Edinburgh, Scotland, in 2014 to find a way to automate its recruitment.

The company created 500 computer models to trawl through past candidates’ résumés and pick up on about 50,000 key terms. The system would crawl the web to recommend candidates.

“They literally wanted it to be an engine where I’m going to give you 100 résumés, it will spit out the top five, and we’ll hire those,” one source told Reuters.

A year later, however, the engineers reportedly noticed something troubling about their engine – it didn’t like women. This was apparently because the AI combed through predominantly male résumés submitted to Amazon over a 10-year period to accrue data about whom to hire.

Consequently, the AI concluded that men were preferable. It reportedly downgraded résumés containing the word “women’s” and filtered out candidates who had attended two women-only colleges.
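The mechanism Reuters describes, a model inheriting bias from historically skewed hiring outcomes, can be sketched in a few lines. All résumé data below is invented for illustration and is not Amazon's; the point is only that a term like “women’s” becomes a proxy for gender even when gender is never an input feature:

```python
# Toy sketch of the failure mode described above (hypothetical data):
# a scorer trained on skewed past hiring outcomes learns to penalize
# terms correlated with female candidates.

HISTORY = [
    # (résumé tokens, hired?) -- hires skew male, mirroring the
    # male-dominated 10-year résumé pool Reuters describes.
    ({"software", "chess", "captain"}, True),
    ({"software", "rowing", "captain"}, True),
    ({"software", "chess"}, True),
    ({"software", "women's", "chess", "captain"}, False),
    ({"software", "women's", "rowing"}, False),
]

def term_score(term: str) -> float:
    """How much more often a term appears in hired vs. rejected résumés."""
    hired = [tokens for tokens, ok in HISTORY if ok]
    rejected = [tokens for tokens, ok in HISTORY if not ok]
    p_hired = sum(term in tokens for tokens in hired) / len(hired)
    p_rejected = sum(term in tokens for tokens in rejected) / len(rejected)
    return p_hired - p_rejected

print(term_score("software"))  # 0.0: equally common, carries no hiring signal
print(term_score("women's"))   # -1.0: acts as a gender proxy, gets downgraded
```

Removing the single offending term does not fix this, because any other token correlated with the rejected group (a women-only college, a sport, a club) yields the same negative score, which is why the engineers reportedly could not guarantee the system would not find new ways to discriminate.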

Amazon’s engineers apparently tweaked the system to remedy these particular forms of bias but couldn’t be sure the AI wouldn’t find new ways to unfairly discriminate against candidates.

Gender bias was not the only problem, Reuters’ sources said. The computer programs also spat out candidates who were unqualified for the position.

Remedying algorithmic bias is a thorny issue, as algorithms can pick up on subconscious human bias. In 2016, ProPublica found that risk-assessment software used to forecast which criminals were most likely to reoffend exhibited racial bias against black people. Overreliance on AI for things like recruitment, credit-scoring, and parole judgments has also created issues in the past.

Amazon reportedly abandoned the AI recruitment project by the beginning of last year after executives lost faith in it. Reuters’ sources said Amazon recruiters looked at recommendations generated by the AI but never relied solely on its judgment.

An Amazon spokesperson told Business Insider, “This was never used by Amazon recruiters to evaluate candidates,” said the company was committed to workplace diversity and equality, and declined to comment further.


2024 - AI Incident Database