AI Incident Database

Incident 1191: NYPD Facial Recognition System Allegedly Produced Erroneous Match That Reportedly Resulted in Wrongful Detention of Trevis Williams

Description: The NYPD's facial recognition system allegedly misidentified Trevis Williams as a suspect in a Union Square indecent exposure case. Despite reportedly notable physical differences and exculpatory phone data, Williams was arrested, jailed for more than two days, and charged. The case was later dismissed.
Editor Notes: Timeline note: This incident takes 04/21/2025 as its incident date because that is when Trevis Williams was reportedly arrested and jailed following an alleged facial recognition misidentification. The original crime was reported on 02/10/2025, and Williams was allegedly taken into custody on 04/21/2025. The charges were purportedly dismissed in July 2025, and The New York Times subsequently reported on the case on 08/26/2025 (also the date of ingestion into the database).

Tools

New Report · New Response · Discover · View History

Entities

View all entities
Alleged: Unknown facial recognition system developer developed an AI system deployed by New York Police Department and NYPD, which harmed Trevis Williams, Judicial integrity, Judicial system, Law enforcement, General public, and General public of New York.
Alleged implicated AI system: Unknown facial recognition system

Incident Stats

Incident ID
1191
Report Count
1
Incident Date
2025-04-21
Editors
Daniel Atherton

Incident Reports

Reports Timeline

Incident Occurrence: How the N.Y.P.D.’s Facial Recognition Tool Landed the Wrong Man in Jail

nytimes.com · 2025

In February, a woman told the police that a delivery man had exposed himself to her in a Manhattan building. He was about 5 feet 6 inches tall.

Two months later, evidence shows, the police arrested the wrong man. He was 6-foot-2.

The man, T…

Variants

A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.
Seen something similar?

Similar Incidents

By textual similarity

Did our AI mess up? Flag the unrelated incidents

Detroit Police Wrongfully Arrested Black Man Due To Faulty FRT
Jan 2020 · 11 reports

New Jersey Police Wrongful Arrested Innocent Black Man via FRT
Jan 2019 · 4 reports

Predictive Policing Biases of PredPol
Nov 2015 · 17 reports
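The "by textual similarity" ranking above can be sketched, hypothetically, as cosine similarity over TF-IDF vectors of incident descriptions. This is only an illustrative toy (the AIID's actual similarity model may differ); the sample incident strings below are paraphrases, not database text:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build simple TF-IDF vectors (term -> weight dicts) for a list of documents."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(t for toks in tokenized for t in set(toks))  # document frequency
    n = len(docs)
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({t: (c / len(toks)) * math.log(n / df[t]) for t, c in tf.items()})
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical, paraphrased incident descriptions.
incidents = [
    "NYPD facial recognition misidentification led to wrongful arrest",
    "Detroit police wrongful arrest after faulty facial recognition match",
    "Predictive policing software showed geographic bias",
]
vecs = tfidf_vectors(incidents)
# Rank the other incidents by similarity to the first (query) incident.
ranked = sorted(range(1, len(vecs)), key=lambda i: cosine(vecs[0], vecs[i]), reverse=True)
print(ranked)  # the facial-recognition incident ranks above the PredPol one
```

Under this sketch, incidents sharing distinctive terms ("facial", "recognition", "wrongful") score higher, which is why the FRT wrongful-arrest cases surface ahead of unrelated incidents.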

