Incident 49: AI Beauty Judge Did Not Like Dark Skin

Description: In 2016, after artificial intelligence software Beauty.AI judged an international beauty contest and declared a majority of winners to be white, researchers found that Beauty.AI was racially biased in determining beauty.

Alleged: Youth Laboratories developed and deployed an AI system, which harmed People with Dark Skin.

Incident Stats

Incident ID
49
Report Count
10
Incident Date
2016-09-05
Editors
Sean McGregor

CSET Taxonomy Classifications

Taxonomy Details

Full Description

In 2016, Beauty.AI, artificial intelligence software designed by Youth Laboratories and supported by Microsoft, was used to judge the first international beauty contest. Of the 600,000 contestants who submitted selfies to be judged by Beauty.AI, the software chose 44 winners, of whom a majority were white, a handful were Asian, and only one had dark skin. While a majority of contestants were white, approximately 40,000 submissions came from India and another 9,000 from Africa. Controversy ensued over claims that Beauty.AI was racially biased in determining beauty because it had not been sufficiently trained on images of people of color.

Short Description

In 2016, after artificial intelligence software Beauty.AI judged an international beauty contest and declared a majority of winners to be white, researchers found that Beauty.AI was racially biased in determining beauty.

Severity

Negligible

Harm Distribution Basis

Race

Harm Type

Harm to intangible property

AI System Description

artificial intelligence software that uses deep learning algorithms to evaluate beauty based on factors such as symmetry, facial blemishes, wrinkles, estimated age and age appearance, and comparisons to actors and models

System Developer

Youth Laboratories

Sector of Deployment

Arts, entertainment and recreation

Relevant AI functions

Perception, Cognition, Action

AI Techniques

Deep learning, open-source

AI Applications

biometrics, image classification

Location

Global

Named Entities

Youth Laboratories, Microsoft

Technology Purveyor

Youth Laboratories, Microsoft, Insilico Medicine

Beginning Date

1/2016

Ending Date

6/2016

Near Miss

Unclear/unknown

Intent

Accident

Lives Lost

No

Data Inputs

images of people's faces

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity


Gender Biases in Google Translate

· 10 reports

TayBot

· 28 reports