AI Incident Database
Discover
Submit
  • Welcome to the AIID
  • Discover Incidents
  • Spatial View
  • Table View
  • List View
  • Entities
  • Taxonomies
  • Submit Incident Reports
  • Submission Leaderboard
  • Blog
  • AI News Digest
  • Risk Checklists
  • Random Incident
  • Sign Up

Incident 310: High False Positive Rate by SWP's Facial Recognition Use at Champions League Final

Summary: South Wales Police (SWP)'s automated facial recognition (AFR) at the Champions League Final football game in Cardiff wrongly identified innocent people as potential matches at an extremely high false positive rate of more than 90%.
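As a back-of-the-envelope check on the "92%" figure discussed in the reports below, here is a minimal sketch of the calculation. The figures used (roughly 2,470 total alerts, of which roughly 2,297 were false) are those cited in press coverage of the deployment, and the helper function is illustrative, not part of any AIID tooling.

```python
# Illustrative calculation of the false positive rate cited in this incident.
# Figures are assumptions taken from press coverage of the June 2017
# Champions League final deployment, not from an authoritative dataset.

def false_alert_rate(false_alerts: int, total_alerts: int) -> float:
    """Share of the system's alerts that flagged the wrong person."""
    if total_alerts <= 0:
        raise ValueError("total_alerts must be positive")
    return false_alerts / total_alerts

total_alerts = 2470   # matches flagged by the AFR system (reported figure)
false_alerts = 2297   # flagged people not actually on the watchlist (reported figure)
rate = false_alert_rate(false_alerts, total_alerts)
print(f"{rate:.1%}")  # about 93%, commonly rounded to "92%" in coverage
```

Strictly speaking, this quantity is the share of alerts that were wrong (the false discovery rate), not the classical false positive rate measured over every face scanned in the crowd; press coverage of the incident uses the two terms interchangeably.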


Entities

Alleged: NEC developed an AI system deployed by South Wales Police, which harmed Finals attendees and falsely accused Finals attendees.

Incident Stats

Incident ID
310
Report Count
8
Incident Date
2017-06-03
Editors
Khoa Lam
Applied Taxonomies
MIT

MIT Taxonomy Classifications

Machine-Classified
Taxonomy Details

Risk Subdomain

A further 23 subdomains create an accessible and understandable classification of hazards and harms associated with AI
 

7.3. Lack of capability or robustness

Risk Domain

The Domain Taxonomy of AI Risks classifies risks into seven AI risk domains: (1) Discrimination & toxicity, (2) Privacy & security, (3) Misinformation, (4) Malicious actors & misuse, (5) Human-computer interaction, (6) Socioeconomic & environmental harms, and (7) AI system safety, failures & limitations.
 
  1. AI system safety, failures, and limitations

Entity

Which, if any, entity is presented as the main cause of the risk
 

AI

Timing

The stage in the AI lifecycle at which the risk is presented as occurring
 

Post-deployment

Intent

Whether the risk is presented as occurring as an expected or unexpected outcome from pursuing a goal
 

Unintentional

Incident Reports


NEC provides facial recognition system to South Wales Police in the UK
nec.com · 2017

Tokyo & London, July 11, 2017 - NEC Corporation (NEC; TSE: 6701) today announced that it has provided a facial recognition system for South Wales Police in the UK through NEC Europe Ltd. The system utilizes NeoFace® Watch, NEC's flagship fa…

2,000 wrongly matched with possible criminals at Champions League
bbc.com · 2018

More than 2,000 people were wrongly identified as possible criminals by facial scanning technology at the 2017 Champions League final in Cardiff.

South Wales Police used the technology as about 170,000 people were in Cardiff for the Real Ma…

Facial recognition wrongly identified 2,000 people as possible criminals when Champions League final came to Cardiff
walesonline.co.uk · 2018

Facial recognition software wrongly identified more than 2,000 people as potential criminals as police patrolled the Champions League final in Cardiff.

The technology provided hundreds of “false positives” wrongly marking out innocent peopl…

UK police say 92% false positive facial recognition is no big deal
arstechnica.com · 2018

A British police agency is defending (this link is inoperable for the moment) its use of facial recognition technology at the June 2017 Champions League soccer final in Cardiff, Wales—among several other instances—saying that despite the sy…

British police defend their new criminal facial recognition technology – even though it's failing at a rate of 92%
businessinsider.com · 2018
  • Police in South Wales have been relying on facial recognition technology for 12 months.

  • An FOI request has revealed that the technology provides a "false positive" ID in more than 90% of cases.

  • The police have admitted that "of course…

EWCA Civ 1058 – R (Bridges) v. CC South Wales
judiciary.uk · 2020

Sir Terence Etherton MR, Dame Victoria Sharp PQBD and Lord Justice Singh:

This appeal concerns the lawfulness of the use of live automated facial recognition technology (“AFR”) by the South Wales Police Force (“SWP”) in an ongoing trial usi…

UK Court of Appeal Finds Automated Facial Recognition Technology Unlawful in Bridges v South Wales Police
huntonprivacyblog.com · 2020

On August 11, 2020, the Court of Appeal of England and Wales overturned the High Court’s dismissal of a challenge to South Wales Police’s use of Automated Facial Recognition technology (“AFR”), finding that its use was unlawful and violated…

UK Police Use of Facial Recognition Fails to Meet 'Legal And Ethical Standards'
pcmag.com · 2022

Use of live facial recognition technology by UK police fails to meet “minimum ethical and legal standards” and should be banned from application in public spaces, say researchers from the University of Cambridge.

A team of researchers at th…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from this research paper.

Similar Incidents

By textual similarity

Did our AI mess up? Flag the unrelated incidents

ETS Used Allegedly Flawed Voice Recognition Evidence to Accuse and Assess Scale of Cheating, Causing Thousands to be Deported from the UK

The English test that ruined thousands of lives

Jan 2014 · 1 report
UK passport photo checker shows bias against dark-skinned women

Oct 2020 · 1 report
Passport checker Detects Asian man's Eyes as Closed

Robot passport checker rejects Asian man's photo for having his eyes closed

Dec 2016 · 22 reports


Research

  • Defining an "AI Incident"
  • Defining an "AI Incident Response"
  • Database Roadmap
  • Related Work
  • Download Complete Database

Project and Community

  • About
  • Contact and Follow
  • Apps and Summaries
  • Editor's Guide

Incidents

  • All Incidents
  • Flagged Incidents
  • Submission Queue
  • Classifications View
  • Taxonomies
2024 - AI Incident Database

  • Terms of Use
  • Privacy Policy