AI Incident Database

Incident 896: Alleged Misuse of Facial Recognition Technology by Law Enforcement Reportedly Leading to Wrongful Arrests and Violations of Investigative Standards

Description: Law enforcement agencies across the U.S. have allegedly been misusing AI-powered facial recognition technology, leading to wrongful arrests and significant harm to at least eight individuals. Officers have reportedly been bypassing investigative standards, relying on uncorroborated AI matches to build cases, allegedly resulting in prolonged detentions, reputational damage, and personal trauma.
Editor Notes: This collective incident ID, based on a Washington Post investigation, tracks alleged misuse of facial recognition technology by law enforcement across the U.S., similar to Incident 815: Police Use of Facial Recognition Software Causes Wrongful Arrests Without Defendant Knowledge. While that incident focuses on allegations of withholding information regarding arrests, this incident focuses on reports of law enforcement allegedly relying primarily on facial recognition technology without sufficient corroborative investigative procedures. Some reported incidents include:
  • December 2020: Facial recognition technology reportedly misidentified Christopher Gatlin in Missouri, resulting in his arrest and over 16 months in jail before charges were dropped in March 2024.
  • 2022: Maryland police allegedly misidentified Alonzo Sawyer for assault using facial recognition; his wife later provided evidence that reportedly cleared his name.
  • 2022: Detroit police arrested Robert Williams based on a reported facial recognition error; the city later settled a lawsuit in 2023 for $300,000 without admitting liability.
  • July 2024: Miami police reportedly relied on facial recognition to identify Jason Vernau for check fraud; he was jailed for three days before charges were dropped.
  • January 13, 2025: The Washington Post published its investigation, detailing at least eight wrongful arrests reportedly linked to the use of facial recognition technology and alleged failures to corroborate AI-generated matches.
See the full report at The Washington Post for more details on specific cases, timelines, and deployers of this technology.


Entities

Alleged: Developers of mugshot recognition software, Developers of law enforcement facial recognition software, and Clearview AI developed an AI system deployed by Florence Kentucky Police Department, Evansville Indiana Police Department, Detroit Police Department, Coral Springs Florida Police Department, Bradenton Florida Police Department, and Austin Police Department, which harmed Wrongfully arrested individuals, Vulnerable communities, Robert Williams, Quran Reid, Porcha Woodruff, People of color, Nijeer Parks, Jason Vernau, Christopher Gatlin, Black people, and Alonzo Sawyer.
Alleged implicated AI systems: Clearview AI, Statewide facial recognition systems, St. Louis mugshot recognition technology, Michigan state facial recognition system, and Florida state facial recognition system

Incident Stats

Incident ID: 896
Report Count: 1
Incident Date: 2025-01-13
Editors: Daniel Atherton

Incident Reports

Reports Timeline

Arrested by AI: Police ignore standards after facial recognition matches
washingtonpost.com · 2025

See the full Washington Post report for additional information, including an explanation of its methodology.

PAGEDALE, Missouri — After two men brutally assaulted a security guard on a desolate train platform on the outskirts …

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity

Detroit Police Wrongfully Arrested Black Man Due To Faulty FRT
Jan 2020 · 11 reports

Uber AV Killed Pedestrian in Arizona
Mar 2018 · 25 reports

Unreliable ShotSpotter Audio Previously Used to Convict Chicago Man in Murder Case
May 2020 · 9 reports

