AI Incident Database

Report 601

Related Incidents

Incident 37 · 34 Reports
Amazon's Experimental Hiring Tool Allegedly Displayed Gender Bias in Candidate Rankings

Amazon scraps ‘sexist’ AI hiring tool
news.com.au · 2018


AMAZON was forced to abandon a secret artificial intelligence recruiting tool after discovering it was discriminating against women.

According to a report by Reuters, Amazon engineers had been building a computer program since 2014 to review resumes, with the goal of automating the talent search process.

The tool would give job candidates a score from one to five stars.

“Everyone wanted this holy grail,” one source told the news agency. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”

After a year, however, Amazon realised its system was favouring male candidates for software developer and other technical roles, because it was observing patterns in resumes submitted over a 10-year period — most of which came from men.

It also penalised resumes containing the word "women's", as in "women's chess club captain", and downgraded graduates of all-women's colleges, according to Reuters.

Even though the program was edited to make it neutral to those terms, the programmers couldn’t guarantee the AI would not teach itself to sort candidates in other discriminatory ways, the report said.
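The failure mode described above can be illustrated with a toy model. The sketch below is not Amazon's system; it is a minimal logistic-regression resume scorer, trained on synthetic historical hiring decisions in which candidates whose resumes contained "women's" were hired less often. Gender never appears as a feature, yet the model learns a negative weight for that token, because the bias lives in the training labels:

```python
# Toy illustration only (not Amazon's actual system): a resume scorer
# trained on historically biased hiring outcomes learns to penalize
# the token "women's" even though gender is never an explicit feature.
import math
import random

VOCAB = ["python", "java", "leadership", "women's", "chess"]

def featurize(tokens):
    """Bag-of-words feature vector over a small fixed vocabulary."""
    return [1.0 if w in tokens else 0.0 for w in VOCAB]

def make_history(n=2000, seed=0):
    """Synthetic past decisions: resumes mentioning "women's" were
    hired far less often. The bias is in the labels, not the code."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        tokens = {w for w in VOCAB if rng.random() < 0.5}
        skill = sum(w in tokens for w in ("python", "java", "leadership"))
        p_hire = 0.2 + 0.2 * skill
        if "women's" in tokens:
            p_hire *= 0.3  # historical bias in hiring outcomes
        data.append((featurize(tokens), 1 if rng.random() < p_hire else 0))
    return data

def train_logreg(data, epochs=50, lr=0.1):
    """Plain SGD logistic regression, standard library only."""
    w = [0.0] * len(VOCAB)
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            z = b + sum(wi * xi for wi, xi in zip(w, x))
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y  # gradient of log-loss w.r.t. z
            b -= lr * g
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
    return w, b

weights, bias = train_logreg(make_history())
for word, wt in zip(VOCAB, weights):
    print(f"{word:10s} {wt:+.2f}")
# The weight on "women's" comes out strongly negative: the model has
# reproduced the bias encoded in its training data.
```

This also shows why merely editing out one term, as the article describes, is no guarantee of fairness: any other token correlated with the biased labels would pick up a similar negative weight.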

The project was eventually scrapped altogether in early 2017.

It’s understood the project was only ever used in a developmental phase, never independently, and never rolled out to a larger group.

It was abandoned for a number of reasons, chiefly that it never returned strong candidates for the roles, and not because of the bias issue.

An Amazon spokeswoman said, “This was never used by Amazon recruiters to evaluate candidates.”

frank.chung@news.com.au

