AI Incident Database

Report 1369

Associated Incidents

Incident 1623 Report
Images of Black People Labeled as Gorillas

Google Photos identified two black people as 'gorillas'
mashable.com · 2015

Google Photos uses sophisticated facial-recognition software to identify not only individuals, but also specific categories of objects and photo types, like food, cats and skylines.

Image recognition programs are far from perfect, however; they sometimes get things comically wrong, and sometimes offensively so, as one Twitter user recently found out.


Browsing his Google Photos app, Brooklyn resident Jacky Alciné noticed that photos of him and a friend, both of whom are black, were tagged under the label "Gorillas." He shared a screencap of the racist label on Twitter, which was spotted by Yahoo Tech.

Google Photos, y'all fucked up. My friend's not a gorilla. pic.twitter.com/SMkMCsNVX4

— diri noir avec banan (@jackyalcine) June 29, 2015

Yonatan Zunger, Google's chief social architect, responded quickly.

@jackyalcine Holy fuck. G+ CA here. No, this is not how you determine someone's target market. This is 100% Not OK.

— Yonatan Zunger (@yonatanzunger) June 29, 2015

In a subsequent tweetstorm, Zunger said Google was scrambling a team together to address the issue, and Alciné confirmed to Mashable that the label was removed from his app within 15 hours. Zunger said Google was looking at longer-term fixes, too. A Google spokesperson also sent an official statement:

“We’re appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”

This isn't the first time software has inadvertently maligned dark-skinned people, unfortunately. In May, Flickr's auto-tagging feature tagged a black person as an "ape," although it put the same tag on a white woman as well. And years ago, some webcams on laptops made by HP didn't track the faces of black people even though they did so for white users.

At least in the case of Google Photos, the incident appears to be isolated, as it doesn't appear that other users have come forward with similar complaints of offensive tags. But it's a reminder that, although computers are beginning to do a really good job of simulating human vision, they're a long way off from simulating human sensitivity.

