AI Incident Database

Report 93

Associated Incidents

Incident 16: 23 Reports
Images of Black People Labeled as Gorillas

Google Apologizes For Tagging Photos Of Black People As 'Gorillas'
huffingtonpost.com · 2015

When Jacky Alciné checked his Google Photos app earlier this week, he noticed it labeled photos of himself and a friend, both black, as “gorillas.”

The Brooklyn programmer posted his screenshots to Twitter to call out the app’s faulty photo recognition software:

Google Photos, y'all fucked up. My friend's not a gorilla. pic.twitter.com/SMkMCsNVX4 — Jacky lives on @jalcine@playvicious.social now. (@jackyalcine) June 29, 2015

And it's only photos I have with her it's doing this with (results truncated b/c personal): pic.twitter.com/h7MTXd3wgo — Jacky lives on @jalcine@playvicious.social now. (@jackyalcine) June 29, 2015

Yonatan Zunger, Google’s chief architect of social, responded on Twitter with a promise to fix the tag. The next day, USA Today reports, Google removed the “gorilla” tag completely.

"We're appalled and genuinely sorry that this happened," Google spokeswoman Katie Watson said in a statement to the BBC. "We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we're looking at how we can prevent these types of mistakes from happening in the future."

@jackyalcine and image recognition itself. (e.g., better recognition of dark-skinned faces) — Yonatan Zunger 🔥 (@yonatanzunger) June 29, 2015

This isn’t the first time this year Google has had to apologize for something offensive in its software. Google Maps made headlines last month when users discovered that entering a racial slur into the search field in some areas yielded the address of the White House. The company quickly corrected the lapse.

Google’s diversity numbers have remained largely static over the years. About 70 percent of its employees are men. Sixty percent of the company’s employees are white and 31 percent are Asian. Combined, African Americans and Latinos make up only 5 percent of the workforce.

As Alciné told The Huffington Post via Twitter direct message, “A diverse QA team could have caught this if they tested it on themselves, or a diverse focus group for testing."

