Incident 18: Gender Biases of Google Image Search

Description: Google Image Search returns results that under-represent women in leadership roles, notably with the first photo of a female "CEO" being a Barbie doll, appearing only after 11 rows of male CEOs.

Alleged: Google developed and deployed an AI system, which harmed Women.

Incident Stats

Incident ID
18
Report Count
11
Incident Date
2015-04-04
Editors
Sean McGregor

CSET Taxonomy Classifications

Taxonomy Details

Full Description

Reports show Google Image Search produces results that under-represent women in leadership roles. When searching "CEO" in Google Images, approximately 11% of results featured women, while around 28% of CEOs in the United States were women at the time the complaint was raised. Other examples include the search "cop," for which the first woman featured in the results is wearing a "sexy Halloween costume." Another report showed that when searching "CEO," the first woman to appear was a Barbie doll, which did not surface until the 12th row of results.
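
The disparity above is a simple comparison between the share of women in the top search results and an external occupational baseline. A minimal sketch of that measurement, assuming manually gender-labeled results; the counts and the representation_gap helper are illustrative assumptions, not the original study's code or data:

```python
# Sketch of the representation audit described above: given a manually
# gender-labeled list of top image results for a query, compare the share
# of women against an external baseline (e.g., labor statistics).
# All numbers below are illustrative, not the study's actual data.

def representation_gap(labels: list[str], baseline_share: float) -> float:
    """Return (observed share of 'woman' labels) minus the baseline share."""
    observed = sum(1 for g in labels if g == "woman") / len(labels)
    return observed - baseline_share

# Hypothetical top-100 results for the query "CEO": 11 labeled "woman",
# against a 28% baseline share of women among U.S. CEOs at the time.
results = ["woman"] * 11 + ["man"] * 89
gap = representation_gap(results, baseline_share=0.28)
print(f"Observed share: {11/100:.0%}, gap vs. baseline: {gap:+.0%}")
# -> Observed share: 11%, gap vs. baseline: -17%
```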

Short Description

Google Image Search returns results that under-represent women in leadership roles, notably with the first photo of a female "CEO" being a Barbie doll, appearing only after 11 rows of male CEOs.

Severity

Minor

Harm Distribution Basis

Sex

Harm Type

Harm to social or political systems

AI System Description

Google Image Search, a system that takes a word or phrase as a query and returns photos deemed relevant to that query

System Developer

Google

Sector of Deployment

Information and communication

Relevant AI functions

Perception, Cognition

AI Techniques

Google Image, image processing

AI Applications

image suggestion, image processing, image content processing

Location

Global

Named Entities

Google

Technology Purveyor

Google

Beginning Date

2018-01-01

Ending Date

2018-01-01

Near Miss

Harm caused

Intent

Accident

Lives Lost

No

Data Inputs

open source internet, user requests, user searches

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
