Incident 165: Image Upscaling Algorithm PULSE Allegedly Produced Facial Images with Caucasian Features More Often

Description: The image upscaling tool PULSE, powered by NVIDIA's StyleGAN, reportedly generated faces with Caucasian features more often, although AI academics, engineers, and researchers did not agree on the source of the bias.
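To illustrate why the source of the bias was contested, the core PULSE idea can be sketched in a few lines. PULSE does not sharpen a low-resolution photo directly; it searches a generative model's latent space for a face whose downscaled version matches the input, so all fine detail is hallucinated from the generator's prior. The sketch below is a toy under stated assumptions: a random linear stand-in replaces StyleGAN, so the latent "search" reduces to least squares (real PULSE runs gradient descent over StyleGAN's latent sphere).

```python
import numpy as np

rng = np.random.default_rng(0)

def downscale(img, factor=4):
    # Average-pool a square image by `factor`: the downscaling
    # consistency operator PULSE matches against.
    n = img.shape[0] // factor
    return img.reshape(n, factor, n, factor).mean(axis=(1, 3))

latent_dim, hi_res = 8, 16
# Stand-in "generator" (assumption: linear, unlike StyleGAN).
G = rng.normal(size=(hi_res * hi_res, latent_dim))

def generate(z):
    # Map a latent vector to a 16x16 "face".
    return (G @ z).reshape(hi_res, hi_res)

# Low-resolution observation produced from an unknown latent.
z_true = rng.normal(size=latent_dim)
lr = downscale(generate(z_true))

# PULSE-style objective: find z whose generated image downscales to lr.
# With a linear generator this is an ordinary least-squares problem.
A = np.column_stack([downscale(generate(e)).ravel()
                     for e in np.eye(latent_dim)])
z_hat, *_ = np.linalg.lstsq(A, lr.ravel(), rcond=None)

residual = float(np.sum((downscale(generate(z_hat)) - lr) ** 2))
```

Any latent with near-zero residual is equally "valid" as an upscaling, so the details in `generate(z_hat)` come entirely from the generator's learned prior. This is why critics disagreed over whether the bias lay in the PULSE search procedure, in StyleGAN, or in the data StyleGAN was trained on.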

Alleged: Duke researchers developed and deployed an AI system, which harmed people with non-Caucasian facial features.

Incident Stats

Incident ID: 165
Editors: Sean McGregor, Khoa Lam
What a machine learning tool that turns Obama white can (and can’t) tell us about AI bias · 2020

It’s a startling image that illustrates the deep-rooted biases of AI research. Input a low-resolution picture of Barack Obama, the first black president of the United States, into an algorithm designed to generate depixelated faces, and the…

Once again, racial biases show up in AI image databases, this time turning Barack Obama white · 2020

A new computer vision technique that converts blurry photos of people into fake but realistic images has come under fire for being racially biased in favor of white people.

The tool known as PULSE was introduced by a group of researchers from…


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity
