Incident 593: AI Photo Filter Lightens Skin, Changes Eye Color in Student's 'Professional' Image

Description: An AI application modified an MIT student's photo to make it look more 'professional' by lightening her skin and changing her eye color to blue, highlighting racial bias in the program's training data.


Alleged: Playground AI developed and deployed an AI system, which harmed Rona Wang and racial minorities who may have experienced the same result.

Incident Stats

Incident ID: 593
Report Count
Incident Date
Editor: Daniel Atherton
An MIT student asked AI to make her headshot more 'professional.' It gave her lighter skin and blue eyes. (2023)

Rona Wang is no stranger to using artificial intelligence.

A recent MIT graduate, Wang, 24, has been experimenting with the variety of new AI language and image tools that have emerged in the past few years, and is intrigued by the ways the…


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
