Description: Police departments across the U.S. have used facial recognition software to identify suspects in criminal investigations, leading to multiple false arrests and wrongful detentions. The software's unreliability, especially in identifying people of color, has resulted in misidentifications that were not disclosed to defendants. In some cases, individuals were not told that facial recognition played a role in their arrest, a nondisclosure that violated their legal rights and led to unjust detentions.
Editor Notes: This collective incident ID, based on a Washington Post investigation, covers many harm events whose overarching theme is the widespread use of facial recognition technology to assist arrests by police departments across the United States, combined with a lack of transparency about the technology's role in those arrests. Some of the documented incidents in the Washington Post's investigation are as follows:
(1) 2019: Facial recognition technology misidentifies Francisco Arteaga in New Jersey, leading to his wrongful detention for four years (see Incident 816).
(2) 2020-2024: The Miami Police Department conducts 2,500 facial recognition searches, leading to at least 186 arrests and 50 convictions; fewer than 7% of defendants were informed of the technology's use.
(3) 2022: Quran Reid is wrongfully arrested in Louisiana due to a facial recognition match, despite never having visited the state (see Incident 515).
(4) June 2023: A New Jersey appeals court rules that a defendant has the right to information regarding the use of facial recognition technology in their case.
(5) July 2023: The Miami Police Department acknowledges that it may not have informed prosecutors about the use of facial recognition in many cases.
(6) October 6, 2024: The Washington Post publishes its investigation of these incidents and practices.
Alleged: AI system developed by Clearview AI and deployed by Police departments, Evansville PD, Pflugerville PD, Jefferson Parish Sheriff's Office, Miami PD, West New York PD, NYPD, Coral Springs PD, and Arvada PD harmed Quran Reid, Francisco Arteaga, and defendants wrongfully accused by facial recognition.
Incident Stats
Incident ID
815
Report Count
1
Incident Date
2024-10-06
Editors
Daniel Atherton
Incident Reports
Report Timeline
washingtonpost.com · 2024
Hundreds of Americans have been arrested after being connected to a crime by facial recognition software, a Washington Post investigation has found, but many never know it because police seldom disclose their use of the contro…
Variants
A "variant" is an incident that shares the same causative factors as an existing AI incident, produces similar harms, and involves the same intelligent system. Rather than indexing variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting as evidence external to the incident database. Learn more from this research paper.