AI Incident Database

Report 1369

Related Incidents

Incident 16 · 23 Reports
Images of Black People Labeled as Gorillas
Google Photos identified two black people as 'gorillas'
mashable.com · 2015

Google Photos uses sophisticated facial-recognition software to identify not only individuals, but also specific categories of objects and photo types, like food, cats and skylines.
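The automatic categorization the article describes can be pictured as a classifier that emits (label, confidence) pairs per photo, with the app grouping photos under any label whose confidence clears a threshold. A minimal sketch of that grouping step, with all function names, thresholds, and data purely illustrative (not Google's actual pipeline or API):

```python
def group_by_label(predictions, threshold=0.8):
    """Group photo IDs under each predicted label whose confidence
    meets the threshold.

    predictions: {photo_id: [(label, confidence), ...]}
    returns:     {label: [photo_id, ...]}
    """
    groups = {}
    for photo_id, labels in predictions.items():
        for label, conf in labels:
            if conf >= threshold:
                groups.setdefault(label, []).append(photo_id)
    return groups

# Illustrative predictions for three photos.
preds = {
    "img1": [("cat", 0.95), ("food", 0.10)],
    "img2": [("skyline", 0.85)],
    "img3": [("cat", 0.60)],  # below threshold, left ungrouped
}
print(group_by_label(preds))
```

With the 0.8 threshold, img1 is grouped under "cat" and img2 under "skyline", while img3's low-confidence guess is dropped. The incident shows the failure mode of exactly this kind of pipeline: a confidently wrong label propagates straight into a user-visible album name.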

Image recognition programs are far from perfect, however; they sometimes get things comically wrong, and sometimes offensively so — as one Twitter user recently found out.


Browsing his Google Photos app, Brooklyn resident Jacky Alciné noticed that photos of him and a friend, both of whom are black, were tagged under the label "Gorillas." He shared a screencap of the racist label on Twitter, which was spotted by Yahoo Tech.

Google Photos, y'all fucked up. My friend's not a gorilla. pic.twitter.com/SMkMCsNVX4

— diri noir avec banan (@jackyalcine) June 29, 2015

Yonatan Zunger, Google's chief social architect, responded quickly.

@jackyalcine Holy fuck. G+ CA here. No, this is not how you determine someone's target market. This is 100% Not OK.

— Yonatan Zunger (@yonatanzunger) June 29, 2015

In a subsequent tweetstorm, Zunger said Google was scrambling a team together to address the issue, and the label was removed from his app within 15 hours, Alciné confirmed to Mashable. Zunger said Google was looking at longer-term fixes, too. A Google spokesperson also sent an official statement:

“We’re appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”
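The article says the offending label was removed from the app within about 15 hours, well before any retraining of the underlying model could land. One plausible shape for that kind of stopgap (an assumption on my part, not a description of Google's actual fix) is a post-processing blocklist that suppresses sensitive labels before they reach users:

```python
# Illustrative suppression list; the entries and names are assumptions,
# not Google's actual implementation.
SUPPRESSED_LABELS = {"gorilla", "ape", "monkey"}

def filter_labels(labels):
    """Drop any predicted label that appears on the suppression list.

    labels: [(label, confidence), ...]
    """
    return [(label, conf) for label, conf in labels
            if label.lower() not in SUPPRESSED_LABELS]

print(filter_labels([("Gorilla", 0.91), ("person", 0.88)]))
# prints [('person', 0.88)]
```

A blocklist like this trades recall for safety: the classifier can no longer surface the suppressed labels at all, which matches the "immediate action" framing in Google's statement while the longer-term labeling work the spokesperson mentions proceeds separately.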

This isn't the first time software has inadvertently maligned dark-skinned people, unfortunately. In May, Flickr's auto-tagging feature tagged a black person as an "ape," although it put the same tag on a white woman as well. And years ago, some webcams on laptops made by HP didn't track the faces of black people even though they did so for white users.

At least in the case of Google Photos, the incident appears to be isolated, as it doesn't appear that other users have come forward with similar complaints of offensive tags. But it's a reminder that, although computers are beginning to do a really good job of simulating human vision, they're a long way off from simulating human sensitivity.
