Description: Police departments across the U.S. have used facial recognition software to identify suspects in criminal investigations, leading to multiple false arrests and wrongful detentions. The software's unreliability, especially in identifying people of color, has resulted in misidentifications that were not disclosed to defendants. In some cases, individuals were never told that facial recognition played a role in their arrest, a nondisclosure that violated their legal rights and contributed to unjust detentions.
Editor Notes: This collective incident ID, based on a Washington Post investigation, groups many harm events whose common theme is police departments across the United States relying on facial recognition technology to assist in arrests while failing to disclose the technology's use. Documented incidents in the Washington Post's investigation include: (1) 2019: Facial recognition technology misidentifies Francisco Arteaga in New Jersey, leading to his wrongful detention for four years (see Incident 816). (2) 2020-2024: The Miami Police Department conducts 2,500 facial recognition searches, leading to at least 186 arrests and 50 convictions; fewer than 7% of defendants are informed of the technology's use. (3) 2022: Quran Reid is wrongfully arrested in Louisiana due to a facial recognition match, despite never having visited the state (see Incident 515). (4) June 2023: A New Jersey appeals court rules that a defendant has the right to information about the use of facial recognition technology in their case. (5) July 2023: The Miami Police Department acknowledges that it may not have informed prosecutors about the use of facial recognition in many cases. (6) October 6, 2024: The Washington Post publishes its investigation of these incidents and practices.
Entities
Alleged: Clearview AI developed an AI system deployed by Police departments, Evansville PD, Pflugerville PD, Jefferson Parish Sheriff’s Office, Miami PD, West New York PD, NYPD, Coral Springs PD, and Arvada PD, which harmed Quran Reid, Francisco Arteaga, and Defendants wrongfully accused by facial recognition.
Incident Stats
ID
815
Report Count
1
Incident Date
2024-10-06
Editors
Daniel Atherton
Incident Reports
Report Timeline
washingtonpost.com · 2024
Hundreds of Americans have been arrested after being linked to a crime by facial recognition software, a Washington Post investigation has found, but many never know it because police rarely disclose their use of…
Variants
Une "Variante" est un incident qui partage les mêmes facteurs de causalité, produit des dommages similaires et implique les mêmes systèmes intelligents qu'un incident d'IA connu. Plutôt que d'indexer les variantes comme des incidents entièrement distincts, nous listons les variations d'incidents sous le premier incident similaire soumis à la base de données. Contrairement aux autres types de soumission à la base de données des incidents, les variantes ne sont pas tenues d'avoir des rapports en preuve externes à la base de données des incidents. En savoir plus sur le document de recherche.