Incident 60: FaceApp Racial Filters

Description: FaceApp is criticized for offering racist filters.

Alleged: FaceApp developed and deployed an AI system, which harmed Minority Groups.

Incident Stats

Incident ID
60
Report Count
23
Incident Date
2017-04-25
Editors
Sean McGregor

CSET Taxonomy Classifications

Taxonomy Details

Full Description

FaceApp, which uses facial recognition to alter users' expressions and appearance, drew a storm of criticism after releasing its new "black", "white", "Asian" and "Indian" filters, which social media users described as "racist" and "offensive". The photo-editing app, which uses neural networks to modify pictures of people while keeping them realistic, was also criticized because its "hot" filter often lightens the skin of people with darker complexions.

Short Description

FaceApp is criticized for offering racist filters.

Severity

Negligible

Harm Distribution Basis

Race

Harm Type

Psychological harm

AI System Description

The facial recognition algorithm used by FaceApp is based on deep generative convolutional neural networks and can edit selfies using filters and other tools.

System Developer

FaceApp

Sector of Deployment

Arts, entertainment and recreation

Relevant AI functions

Unclear

AI Techniques

Facial recognition, convolutional neural networks

AI Applications

Facial recognition, image generation

Location

Global

Named Entities

FaceApp, Russia

Technology Purveyor

FaceApp

Beginning Date

08/2017

Ending Date

08/2017

Near Miss

Unclear/unknown

Intent

Unclear

Lives Lost

No

Data Inputs

Photos of faces

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity


Biased Google Image Results

· 18 reports

TayBot

· 28 reports