AI Incident Database

Report 95

Associated Incidents

Incident 16 · 24 Reports
Images of Black People Labeled as Gorillas

Google Removed Gorillas From Search to Fix Racist Algorithm
nymag.com · 2018

It’s been over two years since engineer Jacky Alciné called out Google Photos for auto-tagging black people in his photos as “gorillas.” After being called out, Google promptly and profusely apologized, promising it’d fix the problems in the algorithm. “Lots of work being done, and lots still to be done,” tweeted Yonatan Zunger, chief architect of social at Google, according to CNET. “We’re very much on it.” It’s 2018, and it appears “on it” just meant a shoddy work-around that involved blocking all things the algorithm identified as “gorilla” from being tagged, just in case the algorithm opted to tag a black person again.

Google Photos, y'all fucked up. My friend's not a gorilla. pic.twitter.com/SMkMCsNVX4 — jackyalciné is about 40% into the IndieWeb. (@jackyalcine) June 29, 2015
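Judging by the behavior described here and confirmed by Wired below, the work-around amounts to a post-hoc blocklist on the classifier's output rather than any change to the model itself. A minimal Python sketch of what such a suppression layer could look like; the names and structure are illustrative assumptions, not Google's actual code:

# Illustrative sketch only: a post-hoc label blocklist of the kind the
# article describes. Nothing here reflects Google's real implementation.

# Terms Wired reported as returning "no results" in Google Photos.
BLOCKED_LABELS = {"gorilla", "chimp", "chimpanzee", "monkey"}

def filter_labels(predicted_labels):
    """Drop blocklisted labels before they reach tagging or search."""
    return [label for label in predicted_labels
            if label.lower() not in BLOCKED_LABELS]

# The model may still predict the label internally; the product simply
# never surfaces it.
print(filter_labels(["poodle", "gorilla", "panda"]))  # ['poodle', 'panda']

The point of the sketch is that the suppression happens after inference: the underlying model is untouched, which is why this reads as a Band-Aid rather than a fix.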

Wired uncovered the “fix” in a series of tests using 40,000 images containing animals and running them through Google Photos. “It [Google Photos] performed impressively at finding many creatures, including pandas and poodles,” the magazine reports. “But the service reported ‘no results’ for the search terms ‘gorilla,’ ‘chimp,’ ‘chimpanzee,’ and ‘monkey.’” The program was able to find some primates, including baboons, gibbons, and marmosets. Capuchin and colobus monkeys were also identified correctly, so long as the word monkey wasn’t included in the search. Searches for “black man” and “black woman” turned up photos of people of the chosen gender in black and white, rather than of a given race.
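Wired's procedure restates naturally as a black-box audit: query the service for each animal term and record which terms come back empty. A hedged sketch of that loop, with mock_search as a hypothetical stand-in for the Google Photos search API, wired to reproduce the behavior the magazine reports:

# Black-box audit in the spirit of Wired's test: which search terms
# return no results even though matching photos exist?

def audit_search_terms(search_fn, terms):
    """Return the terms for which the service reports no results."""
    return [term for term in terms if not search_fn(term)]

# Hypothetical stand-in for the real search API, mocked to match the
# behavior described in the article.
def mock_search(term):
    blocked = {"gorilla", "chimp", "chimpanzee", "monkey"}
    return [] if term in blocked else [term + "_photo.jpg"]

terms = ["panda", "poodle", "baboon", "gibbon", "marmoset",
         "gorilla", "chimp", "chimpanzee", "monkey"]
print(audit_search_terms(mock_search, terms))
# -> ['gorilla', 'chimp', 'chimpanzee', 'monkey']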

A Google spokesperson confirmed to Wired that several primate terms, including “gorilla,” are still blocked following the 2015 incident. “Image labeling technology is still early and unfortunately it’s nowhere near perfect,” the spokesperson said. It’s unclear if image-labeling tech is just “still early” — it’s been several years, you’d think Google could have figured some things out in that time — or if Google is just being careful to avoid being called racist, again. Alternatively, properly fixing Google Photos simply isn’t worth the money and the time to Google. It’s easier to slap a Band-Aid on it and pretend that gorillas — and much worse, black people — don’t exist in its photos.
