Description: Law enforcement agencies across the U.S. have allegedly been misusing AI-powered facial recognition technology, leading to wrongful arrests and significant harm to at least eight individuals. Officers have reportedly bypassed investigative standards, relying on uncorroborated AI matches to build cases, allegedly resulting in prolonged detentions, reputational damage, and personal trauma.
Editor Notes: This collective incident ID, based on a Washington Post investigation, tracks alleged misuse of facial recognition technology by law enforcement across the U.S., similar to Incident 815: Police Use of Facial Recognition Software Causes Wrongful Arrests Without Defendant Knowledge. While that incident focuses on allegations of withholding information regarding arrests, this incident focuses on reports of law enforcement allegedly relying primarily on facial recognition technology without sufficient corroborative investigative procedures. Some reported incidents include: (1) December 2020: Facial recognition technology reportedly misidentified Christopher Gatlin in Missouri, resulting in his arrest and over 16 months in jail before charges were dropped in March 2024. (2) 2022: Maryland police allegedly misidentified Alonzo Sawyer for assault using facial recognition; his wife later provided evidence that reportedly cleared his name. (3) 2022: Detroit police arrested Robert Williams based on a reported facial recognition error; the city later settled a lawsuit in 2023 for $300,000 without admitting liability. (4) July 2024: Miami police reportedly relied on facial recognition to identify Jason Vernau for check fraud; he was jailed for three days before charges were dropped. (5) January 13, 2025: The Washington Post published its investigation, detailing at least eight wrongful arrests reportedly linked to the use of facial recognition technology and alleged failures to corroborate AI-generated matches. See the full report at The Washington Post for more details on specific cases, timelines, and deployers of this technology.
Entities
Alleged: Developers of mugshot recognition software, Developers of law enforcement facial recognition software, and Clearview AI developed an AI system deployed by Florence Kentucky Police Department, Evansville Indiana Police Department, Detroit Police Department, Coral Springs Florida Police Department, Bradenton Florida Police Department, and Austin Police Department, which harmed Wrongfully arrested individuals, Vulnerable communities, Robert Williams, Quran Reid, Porcha Woodruff, People of color, Nijeer Parks, Jason Vernau, Christopher Gatlin, Black people, and Alonzo Sawyer.
Alleged implicated AI systems: Clearview AI, Statewide facial recognition systems, St. Louis mugshot recognition technology, Michigan state facial recognition system, and Florida state facial recognition system
Incident Statistics
ID
896
Report Count
1
Incident Date
2025-01-13
Editors
Daniel Atherton
Incident Reports
Reports Timeline
washingtonpost.com · 2025
- View the original report at its source
- View the report at the Internet Archive
*See the full Washington Post report for additional information, including an explanation of the methodology they employed.* PAGEDALE, Missouri — After two men brutally attacked a security guard in a…
Variants
A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reports as evidence external to the incident database. Learn more from the research paper.
Similar Incidents
Did our AI mess up? Flag the unrelated incidents
Wrongfully Accused by an Algorithm
· 11 reports
Tempe police release report, audio, photo
· 25 reports