AI Incident Database

Incident 16 · 23 Reports
Images of Black People Labeled as Gorillas

Google Photos Tags Two African-Americans As Gorillas Through Facial Recognition Software
forbes.com · 2015

When Brooklyn native Jacky Alcine logged onto Google Photos on Sunday evening, he was shocked to find an album titled “Gorillas,” in which the facial recognition software had categorized him and his friend as primates. Alcine immediately posted on Twitter: “Google Photos, y'all f***ed up. My friend's not a gorilla.” The comment prompted over 1,000 retweets and an online discussion about the incident. One user replied, “That is completely unacceptable and very low. I'm so sorry you had to come across such hurtful ignorance.”

Alcine added a series of follow-up tweets, including one that stated, “Like I understand HOW this happens; the problem is moreso on the WHY. This is how you determine someone's target market.”

Yonatan Zunger, Google's chief architect of social, was quick to address the problem. Within hours of Alcine's original post, Zunger tweeted, “Holy f***. G+ CA here. No, this is not how you determine someone's target market. This is 100% Not OK.” The team immediately went to work to examine the data and fix the problem. Zunger followed up with Alcine the next morning to make sure everything was okay.

“We’re appalled and genuinely sorry that this happened," said a Google spokesperson. "We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”

It is important to note that African-Americans are not the only group mislabeled by Google Photos. As Zunger notes in a tweet, “Until recently, [Google Photos] was confusing white faces with dogs and seals. Machine learning is hard."

Brian Brackeen, CEO of facial recognition company Kairos, says that machines can make culturally inappropriate assumptions when not properly trained. “It’s scarily similar to how a child learns,” he said.

This is not the first time that facial recognition software, which relies on machine learning and computer vision, has misidentified people.

This past May, Flickr's facial recognition software labeled both black and white people as “animals” and “apes” (the tags were promptly removed). Photos of Native American dancers were also tagged with the word “costume,” which the community found deeply insulting.

Back in 2009, Nikon's face-detection cameras were accused of being “racist.” Many times, when an Asian face was photographed, a message flashed across the screen asking, "Did someone blink?” — even when their eyes were wide open. As a Japanese company, Nikon apparently neglected to design its camera with Asian eyes in mind.

A few months after the Nikon controversy, a YouTube video about an HP MediaSmart computer went viral. Although the webcam was designed to follow the faces of all users, it could not recognize the African-American man moving in front of it, yet it quickly began tracking a white woman's face as soon as she stepped in front of the camera.

These points are not to shame Google, Nikon, or HP, which are companies that have no malicious intent behind their facial recognition software. The software will continue to be far from perfect for the foreseeable future.

