Incident 439: Detroit Police Wrongfully Arrested Black Man Due To Faulty Facial Recognition

Description: A Black man was wrongfully arrested by the Detroit Police Department as a result of a false match produced by facial recognition technology (FRT).


Alleged: DataWorks Plus developed an AI system deployed by Detroit Police Department, which harmed Michael Oliver and Black people in Detroit.

Kate Perkins, "Controversial Detroit facial recognition got him arrested for a crime he didn't commit" · 2020

The high-profile case of a Black man wrongly arrested earlier this year wasn’t the first misidentification linked to controversial facial recognition technology used by Detroit Police, the Free Press has learned. 

Last year, a 25-year-old D…

Faulty Facial Recognition Led to His Arrest—Now He’s Suing · 2020

Detroit police wrongfully arrested another Black man based on flawed facial recognition technology that often yields errors in identifying people of color, according to a new lawsuit obtained by Motherboard.

Michael Oliver, 26, was arrested…

Wrongful arrest exposes racial bias in facial recognition technology · 2020

In July of 2019, Michael Oliver, 26, was on his way to work in Ferndale, Michigan, when a cop car pulled him over. The officer informed him that there was a felony warrant out for his arrest. 

"I thought he was joking because he was laughin…


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of an incident under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.