Citation record for Incident 74

Description: A Black man was wrongfully detained by the Detroit Police Department as a result of a false facial recognition technology (FRT) match.
Alleged: An AI system developed by DataWorks Plus and deployed by the Detroit Police Department harmed Robert Julian-Borchak Williams and Black people in Detroit.

Incident Stats

Incident ID
74
Report Count
10
Incident Date
2020-01-30
Editors
Sean McGregor, Khoa Lam

CSETv0 Taxonomy Classifications

Taxonomy Details

Full Description

In January 2020, the Detroit Police Department wrongfully arrested Robert Julian-Borchak Williams after facial recognition technology provided by DataWorks Plus mistook Williams for a Black man recorded by a CCTV camera while shoplifting. The incident is cited as evidence that facial recognition technology continues to exhibit racial bias, particularly against Black and Asian people.

Short Description

The Detroit Police Department wrongfully arrested a Black man due to a faulty facial recognition match produced by software provided by DataWorks Plus.

Severity

Moderate

Harm Distribution Basis

Race

Harm Type

Harm to civil liberties

AI System Description

DataWorks Plus facial recognition software was provided to the Detroit Police Department and focuses on biometrics storage and matching, including fingerprints, palm prints, irises, tattoos, and mugshots.

System Developer

DataWorks Plus

Sector of Deployment

Public administration and defence

Relevant AI functions

Perception, Cognition, Action

AI Techniques

facial recognition, machine learning, environmental sensing

AI Applications

Facial recognition, environmental sensing, biometrics, image recognition, speech recognition

Location

United States (Detroit, Michigan)

Named Entities

Detroit Police Department, DataWorks Plus

Technology Purveyor

DataWorks Plus

Beginning Date

06/2020

Ending Date

06/2020

Near Miss

Harm caused

Intent

Accident

Lives Lost

No

Data Inputs

biometrics, images, camera footage

GMF Taxonomy Classifications

Taxonomy Details

Known AI Goal

Face Recognition

Known AI Technology

Face Detection

Potential AI Technology

Convolutional Neural Network, Distributional Learning

Potential AI Technical Failure

Dataset Imbalance, Generalization Failure, Underfitting, Covariate Shift
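The failure modes listed above can be made concrete: dataset imbalance in a face recognition system's training data typically surfaces as unequal error rates across demographic groups at evaluation time. Below is a minimal sketch of how per-group false match rates might be audited; the groups, counts, and results are entirely hypothetical and are not drawn from the DataWorks Plus system.

```python
from collections import defaultdict

def false_match_rate_by_group(results):
    """Compute the false match rate (FMR) per demographic group.

    `results` is a list of (group, is_genuine_pair, system_said_match)
    tuples from an evaluation set of face-pair comparisons. FMR is the
    fraction of impostor (non-genuine) pairs the system declared a match.
    """
    impostor_total = defaultdict(int)
    impostor_matched = defaultdict(int)
    for group, is_genuine, said_match in results:
        if not is_genuine:
            impostor_total[group] += 1
            if said_match:
                impostor_matched[group] += 1
    return {g: impostor_matched[g] / impostor_total[g] for g in impostor_total}

# Hypothetical evaluation in which the under-represented group
# shows a markedly higher false match rate.
results = (
    [("group_a", False, True)] * 1 + [("group_a", False, False)] * 99
    + [("group_b", False, True)] * 8 + [("group_b", False, False)] * 92
)
rates = false_match_rate_by_group(results)
print(rates)  # group_a: 0.01, group_b: 0.08
```

An audit of this kind, run before deployment, is one way a dataset-imbalance failure could be detected; a large gap between group FMRs signals that matches for the disadvantaged group deserve far less evidentiary weight.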

AI technologies — like police facial recognition — discriminate against people of colour
theconversation.com · 2020

Detroit police wrongfully arrested Robert Julian-Borchak Williams in January 2020 for a shoplifting incident that had taken place two years earlier. Even though Williams had nothing to do with the incident, facial recognition technology use…

nytimes.com · 2020

"Note: In response to this article, the Wayne County prosecutor’s office said that Robert Julian-Borchak Williams could have the case and his fingerprint data expunged. “We apologize,” the prosecutor, Kym L. Worthy, said in a statement, add…

'The Computer Got It Wrong': How Facial Recognition Led To False Arrest Of Black Man
npr.org · 2020

Police in Detroit were trying to figure out who stole five watches from a Shinola retail store. Authorities say the thief took off with an estimated $3,800 worth of merchandise.

Investigators pulled a security…

Detroit police admit to first facial recognition mistake after false arrest
techrepublic.com · 2020

On Wednesday morning, the ACLU announced that it was filing a complaint against the Detroit Police Department on behalf of Robert Williams, a Black Michigan resident whom the group said is one of the first people falsely arrested due to fac…

Detroit Police Chief: Facial Recognition Software Misidentifies 96% of the Time
vice.com · 2020

Detroit police have used highly unreliable facial recognition technology almost exclusively against Black people so far in 2020, according to the Detroit Police Department’s own statistics. The department’s use of the technology gained nati…

Facial Recognition Blamed For False Arrest And Jail Time
silicon.co.uk · 2020

Racial bias and facial recognition. Black man in New Jersey arrested by police and spends ten days in jail after false face recognition match

Accuracy and racial bias concerns about facial recognition technology continue with the news of a …

Teaneck NJ bans facial recognition usage for police, citing bias
northjersey.com · 2021

Teaneck just banned facial recognition technology for police. Here's why

Wrongfully arrested man sues Detroit police over false facial recognition match
washingtonpost.com · 2021

A Michigan man has sued Detroit police after he was wrongfully arrested and falsely identified as a shoplifting suspect by the department’s facial recognition software in one of the first lawsuits of its kind to call into question the contr…

It's time to address facial recognition, the most troubling law enforcement AI tool
thebulletin.org · 2021

Since a Minneapolis police officer killed George Floyd in March 2020 and re-ignited massive Black Lives Matter protests, communities across the country have been re-thinking law enforcement, from granular scrutiny of the ways that police us…

How Wrongful Arrests Based on AI Derailed 3 Men's Lives
wired.com · 2022

ROBERT WILLIAMS WAS doing yard work with his family one afternoon last August when his daughter Julia said they needed a family meeting immediately. Once everyone was inside the house, the 7-year-old girl closed all the blinds and curtains …

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, they are listed as variations under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity

Did our AI mess up? Flag the unrelated incidents