Incident 16: Images of Black People Labeled as Gorillas

Description: Google Photos image processing software mistakenly labeled a Black couple as "gorillas."

Alleged: Google developed and deployed an AI system, which harmed Black people.

Incident Stats

Incident ID: 16
Report Count: 23
Incident Date: 2015-06-03
Editors: Sean McGregor

CSET Taxonomy Classifications

Taxonomy Details

Full Description

Google's Google Photos image processing software "mistakenly labelled a black couple as being 'gorillas.'" The error occurred in the software's theme-assignment feature, which attempts to label groups of similar photos with a shared theme. In this example, the suggested themes were "Graduation, Bikes, Planes, Skyscrapers, Cars, and Gorillas."
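
As a rough illustration of the mechanism described above, the sketch below shows how a generic pretrained image classifier might be used to propose album themes. This is a hypothetical reconstruction, not Google's pipeline: the model choice, the `suggest_themes` helper, and the `min_count` parameter are all assumptions made for illustration.

```python
# Hypothetical sketch of automatic photo-theme tagging. This is NOT
# Google's implementation; it only illustrates the general technique of
# grouping photos under a classifier's highest-confidence labels.
from collections import Counter

import torch
from PIL import Image
from torchvision import models

WEIGHTS = models.ResNet50_Weights.IMAGENET1K_V2  # stand-in pretrained model
model = models.resnet50(weights=WEIGHTS).eval()
preprocess = WEIGHTS.transforms()

def top_label(path: str) -> str:
    """Return the single highest-confidence ImageNet label for one photo."""
    image = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        logits = model(image)
    return WEIGHTS.meta["categories"][int(logits.argmax(dim=1))]

def suggest_themes(photo_paths: list[str], min_count: int = 3) -> list[str]:
    """Propose album themes: any label assigned to at least `min_count` photos."""
    counts = Counter(top_label(p) for p in photo_paths)
    return [label for label, n in counts.items() if n >= min_count]
```

Because one high-confidence label is applied per photo and then aggregated, a single systematic misclassification can surface as an album-level theme, which is how the error became visible to users.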

Short Description

Google Photos image processing software mistakenly labeled a Black couple as "gorillas."

Severity

Minor

Harm Distribution Basis

Race

Harm Type

Psychological harm, Harm to social or political systems

AI System Description

Google's Google Photos image processing

System Developer

Google

Sector of Deployment

Arts, entertainment and recreation

Relevant AI functions

Perception, Cognition

AI Techniques

image classification

AI Applications

image processing, facial recognition, image classification

Location

Global

Named Entities

Google, Google Photos

Technology Purveyor

Google

Beginning Date

2015-06-29

Ending Date

2015-06-29

Near Miss

Harm caused

Intent

Accident

Lives Lost

No

Data Inputs

photographs, images, multimedia content

GMF Taxonomy Classifications

Taxonomy Details

Known AI Goal

Image Tagging

Known AI Technology

Face Detection, Convolutional Neural Network, Keyword Filtering

Known AI Technical Failure

Underfitting

Potential AI Technical Failure

Dataset Imbalance, Context Misidentification
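
The "Dataset Imbalance" failure mode listed above lends itself to a simple diagnostic. The sketch below is a hypothetical illustration, not an audit of any production training set: the `labels` input and the `threshold` fraction are assumed, and the toy data is invented for demonstration.

```python
# Hypothetical check for dataset imbalance; illustrative only, not an
# audit of any production training set.
from collections import Counter

def underrepresented_classes(labels: list[str], threshold: float = 0.25) -> list[str]:
    """Return classes whose example count falls below `threshold` times the mean.

    A classifier trained on such data may underfit rare classes and
    misassign their instances to visually similar, better-represented ones.
    """
    counts = Counter(labels)
    mean = sum(counts.values()) / len(counts)
    return [cls for cls, n in counts.items() if n < threshold * mean]

# Toy example: 'gorilla' is far below the mean class size (480), so the
# check flags it as a candidate for targeted data collection.
toy_labels = ["person"] * 900 + ["gorilla"] * 40 + ["car"] * 500
print(underrepresented_classes(toy_labels))  # ['gorilla']
```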

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity


Biased Google Image Results · 18 reports

FaceApp Racial Filters · 23 reports