Incident 74: Detroit Police Wrongfully Arrested Black Man Due To Faulty FRT

Description: A Black man was wrongfully detained by the Detroit Police Department as a result of a false facial recognition technology (FRT) match.

Alleged: DataWorks Plus developed an AI system deployed by Detroit Police Department, which harmed Robert Julian-Borchak Williams and Black people in Detroit.

Incident Stats

Incident ID
74
Report Count
10
Incident Date
2020-01-30
Editors
Sean McGregor, Khoa Lam

CSET Taxonomy Classifications

Taxonomy Details

Full Description

In January 2020, the Detroit Police Department wrongfully arrested Robert Julian-Borchak Williams after facial recognition technology provided by DataWorks Plus mistakenly matched him to a Black man recorded by a store's surveillance camera during a theft. The case, reported in June 2020, is cited as an instance of facial recognition exhibiting racial bias, particularly against Black and Asian people.

Short Description

The Detroit Police Department wrongfully arrested a Black man due to a faulty facial recognition match produced by software provided by DataWorks Plus.

Severity

Moderate

Harm Distribution Basis

Race

Harm Type

Harm to civil liberties

AI System Description

DataWorks Plus provided the Detroit Police Department with biometric software that stores and matches fingerprints, palm prints, irises, tattoos, and mugshots, and includes a facial recognition component.
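
For orientation, the sketch below shows one common way a probe-to-gallery face match of this general kind is implemented: an embedding of the probe image is compared against embeddings of gallery photos (for example, mugshots) by cosine similarity, and candidates above a threshold are returned for human review. This is an illustrative assumption, not DataWorks Plus's actual pipeline; the embedding model, threshold, and identifiers are hypothetical.

```python
# Hypothetical probe-to-gallery face matching sketch. The embedding model,
# threshold value, and gallery identifiers are assumptions for illustration;
# this is not the vendor's implementation.
import numpy as np

def rank_matches(probe_embedding: np.ndarray,
                 gallery_embeddings: np.ndarray,
                 gallery_ids: list[str],
                 threshold: float = 0.6) -> list[tuple[str, float]]:
    """Return gallery identities whose cosine similarity to the probe exceeds a threshold."""
    probe = probe_embedding / np.linalg.norm(probe_embedding)
    gallery = gallery_embeddings / np.linalg.norm(gallery_embeddings, axis=1, keepdims=True)
    similarities = gallery @ probe                      # cosine similarity per gallery entry
    order = np.argsort(similarities)[::-1]              # best candidates first
    return [(gallery_ids[i], float(similarities[i]))
            for i in order if similarities[i] >= threshold]
```

Systems of this kind typically return a ranked candidate list intended for human review rather than a definitive identification; the threshold choice trades false matches against missed matches, a trade-off central to this incident.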

System Developer

DataWorks Plus

Sector of Deployment

Public administration and defence

Relevant AI functions

Perception, Cognition, Action

AI Techniques

facial recognition, machine learning, environmental sensing

AI Applications

Facial recognition, environmental sensing, biometrics, image recognition, speech recognition

Location

United States (Detroit, Michigan)

Named Entities

Detroit Police Department, DataWorks Plus

Technology Purveyor

DataWorks Plus

Beginning Date

06/2020

Ending Date

06/2020

Near Miss

Harm caused

Intent

Accident

Lives Lost

No

Data Inputs

biometrics, images, camera footage

GMF Taxonomy Classifications

Taxonomy Details

Known AI Goal

Face Recognition

Known AI Technology

Face Detection

Potential AI Technology

Convolutional Neural Network, Distributional Learning

Potential AI Technical Failure

Dataset Imbalance, Generalization Failure, Underfitting, Covariate Shift
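
The sketch below illustrates, under assumed data, how failures of this kind are often surfaced in evaluation: false match rates are computed separately per demographic group and then compared. The column names and toy data are hypothetical and are not drawn from this incident.

```python
# Illustrative-only sketch: per-group false match rates, one common way that
# dataset imbalance or generalization failure shows up in face recognition
# evaluations. Column names and example values are assumptions.
import pandas as pd

def false_match_rate_by_group(results: pd.DataFrame) -> pd.Series:
    """results needs columns: 'group', 'predicted_match' (bool), 'true_match' (bool)."""
    impostor_trials = results[~results["true_match"]]          # pairs that are NOT the same person
    return impostor_trials.groupby("group")["predicted_match"].mean()

# Example usage with toy data: a markedly higher rate for one group
# indicates a disparate false match rate.
trials = pd.DataFrame({
    "group": ["A", "A", "B", "B", "B"],
    "predicted_match": [False, True, True, True, False],
    "true_match": [False, False, False, False, False],
})
print(false_match_rate_by_group(trials))
```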

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.