Incident 439: Detroit Police Wrongfully Arrested Black Man Due To Faulty Facial Recognition
Description: A Black man was wrongfully detained by the Detroit Police Department as a result of a false facial recognition technology (FRT) match.
Entities
Alleged: DataWorks Plus developed an AI system deployed by Detroit Police Department, which harmed Michael Oliver and Black people in Detroit.
The high-profile case of a Black man wrongly arrested earlier this year wasn’t the first misidentification linked to controversial facial recognition technology used by Detroit Police, the Free Press has learned.
Last year, a 25-year-old D…
Detroit police wrongfully arrested another Black man based on flawed facial recognition technology, which often yields errors when identifying people of color, according to a new lawsuit obtained by Motherboard.
Michael Oliver, 26, was arrested…
In July of 2019, Michael Oliver, 26, was on his way to work in Ferndale, Michigan, when a cop car pulled him over. The officer informed him that there was a felony warrant out for his arrest.
"I thought he was joking because he was laughin…
A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.