AI Incident Database

Report 99

Associated Incidents

Incident 16 · 24 Reports
Images of Black People Labeled as Gorillas

Google Photos Still Has a Problem with Gorillas
technologyreview.com · 2018

In 2015, Google drew criticism when its Photos image recognition system mislabeled photos of Black people as gorillas. More than two years on, the problem still isn't properly fixed: instead, Google has simply censored image tags relating to many primates.

What’s new: Wired tested Google Photos again with a bunch of animal photos. The software could identify creatures from pandas to poodles with ease. But images of gorillas, chimps, and chimpanzees? They were never labeled. Wired confirmed with Google that those tags are censored.

But: Some of Google’s other computer vision systems, such as the Cloud Vision API, can still correctly tag photos of gorillas and return those labels to users. That suggests the tag removal is a platform-specific, shame-faced PR move rather than a genuine fix.

Bigger than censorship: Human bias exists in data sets everywhere, reflecting the facets of humanity we’d rather not have machines learn. But reducing and removing that bias will take a lot more work than simply blacklisting labels.
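To make the distinction concrete, here is a minimal sketch of what suppressing labels after the fact might look like, assuming a generic image classifier and a hypothetical blocklist. This is an illustration only, not Google's actual code: the model keeps making the same predictions, and the offending tags are merely hidden from the user-facing output.

# Hypothetical illustration -- not Google's implementation.
# The classifier still predicts the sensitive labels internally;
# a post-processing blocklist just hides them from users.

BLOCKED_LABELS = {"gorilla", "chimp", "chimpanzee", "monkey"}  # assumed blocklist

def visible_tags(predictions: dict, threshold: float = 0.5) -> list:
    """Return user-facing tags, silently dropping blocklisted labels.

    `predictions` maps label -> confidence, e.g. the raw output of an
    image classifier. The underlying model is unchanged: its biases
    and failure modes remain, only the offending tags are suppressed.
    """
    return [
        label
        for label, score in predictions.items()
        if score >= threshold and label.lower() not in BLOCKED_LABELS
    ]

# Example: the model's raw output may still contain a blocked label,
# but the user never sees it.
raw = {"poodle": 0.91, "gorilla": 0.88, "panda": 0.12}
print(visible_tags(raw))  # ['poodle']

Fixing the dataset and the model so the blocked labels are no longer needed is a far harder problem than filtering the output this way.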

