AI Incident Database

Report 614

Related Incidents

Incident 37 · 33 Reports
Female Applicants Down-Ranked by Amazon Recruiting Tool

Amazon ditches sexist AI
information-age.com · 2018

Amazon scrapped an algorithm designed to become a recruitment tool because it was too sexist.

Did you hear the one about my wife — well, she… is a really nice person, actually.

We know that people suffer from bias. Alas, a growing pile of evidence suggests AI does too.

Now it seems that Amazon has found this out the hard way — after investing in an AI recruitment tool.

See also: Ethical AI – the answer is clear. Being transparent with ethical AI is vital to engaging with the public in a responsible manner.

The idea was for the AI engine to scan job applications and give hopeful recruits a score between one and five. Reuters quoted one engineer saying: “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”

Alas, it started weeding out CVs that included a certain five-letter word. The ‘W’ word — women. There, I said it.
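To make that concrete, here is a minimal, purely illustrative sketch of how a resume scorer can absorb bias from its training data. It is not Amazon's actual system (whose internals were never published); the library choice (scikit-learn), the resumes, and the hiring labels are all invented for the example.

```python
# Illustrative sketch only: a toy resume scorer trained on historical hiring
# outcomes. NOT Amazon's system; all data below is hypothetical.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical past resumes and whether the candidate was hired. The labels
# reflect a male-dominated engineering workforce.
past_resumes = [
    "software engineer, men's chess club captain",
    "backend developer, hackathon winner",
    "software engineer, women's chess club captain",
    "data analyst, women's coding society lead",
]
hired = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(past_resumes)
model = LogisticRegression().fit(X, hired)

# Score two new, otherwise similar applicants on a rough one-to-five scale,
# mirroring the star rating described in the article.
new_resumes = [
    "software engineer, robotics team lead",
    "software engineer, women's robotics team lead",
]
probs = model.predict_proba(vectorizer.transform(new_resumes))[:, 1]
for resume, p in sorted(zip(new_resumes, probs), key=lambda x: -x[1]):
    print(f"score {1 + 4 * p:.1f}  |  {resume}")
# The resume containing "women's" ranks lower purely because that token
# co-occurred with rejections in the training data.
```

Nothing in the toy model mentions gender explicitly; the penalty emerges from the correlation between one token and past rejections, which is the same dynamic the Reuters reporting describes.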

This was back in 2015, and let’s face it: as far as AI is concerned, 2015 is ancient history.

See also: Regulating robots: keeping an eye on AI

It’s not news to learn that AI can be something of a bigot.

In 2016, it emerged that US risk assessment algorithms — used by courtrooms throughout the country to decide the fates and freedoms of those on trial — are racially biased, frequently sentencing Caucasians more leniently than African Americans despite no difference in the type of crime committed. How could this happen within a system that’s supposed to be neutral?

AI researcher Professor Joanna Bryson said at the time: “If the underlying data reflects stereotypes, or if you train AI from human culture, you will find bias.”

See also: Augmented intelligence: why the human element can’t be forgotten

This brings us to the issue of diversity. Scott E. Page is an expert on diversity and complex systems, best known for his work on ‘collective wisdom’.

He is famous for saying “progress depends as much on our collective differences as it does on our individual IQ scores.”

And: “If we can understand how to leverage diversity to achieve better performance and greater robustness, we might anticipate and prevent collapses.”

AI, however, because of the way it learns from data, can reflect the biases in society.

“The fact that Amazon’s system taught itself that male candidates were preferable, penalising resumes that included the word ‘women’s’, is hardly surprising when you consider 89% of the engineering workforce is male,” observed Charlotte Morrison, General Manager of the global branding and design agency Landor.

She added: “Brands need to be careful that when creating and using technology it does not backfire by highlighting society’s own imperfections and prejudices. The long-term solution is of course getting more diverse candidates into STEM education and careers – until then, brands need to be alert to the dangers of brand and reputational damage from biased, sexist, and even racist technology.”

See also: Augmented intelligence: predicting the best customer moments
