Citation record for Incident 310

Description: South Wales Police (SWP)'s automated facial recognition (AFR) at the Champions League Final football game in Cardiff wrongly identified innocent people as potential matches at an extremely high false positive rate of more than 90%.
Alleged: an AI system developed by NEC and deployed by South Wales Police harmed falsely accused Finals attendees.

Incident Stats

Incident ID
310
Report Count
8
Incident Date
2017-06-03
Editors
Khoa Lam
NEC provides facial recognition system to South Wales Police in the UK
nec.com · 2017

Tokyo & London, July 11, 2017 – NEC Corporation (NEC; TSE: 6701) today announced that it has provided a facial recognition system for South Wales Police in the UK through NEC Europe Ltd. The system utilizes NeoFace® Watch, NEC's flagship fa…

2,000 wrongly matched with possible criminals at Champions League
bbc.com · 2018

More than 2,000 people were wrongly identified as possible criminals by facial scanning technology at the 2017 Champions League final in Cardiff.

South Wales Police used the technology as about 170,000 people were in Cardiff for the Real Ma…

Facial recognition wrongly identified 2,000 people as possible criminals when Champions League final came to Cardiff
walesonline.co.uk · 2018

Facial recognition software wrongly identified more than 2,000 people as potential criminals as police patrolled the Champions League final in Cardiff.

The technology provided hundreds of “false positives” wrongly marking out innocent peopl…

UK police say 92% false positive facial recognition is no big deal
arstechnica.com · 2018

A British police agency is defending its use of facial recognition technology at the June 2017 Champions League soccer final in Cardiff, Wales—among several other instances—saying that despite the sy…

British police defend their new criminal facial recognition technology – even though it's failing at a rate of 92%
businessinsider.com · 2018
  • Police in South Wales have been relying on facial recognition technology for 12 months.

  • An FOI request has revealed that the technology provides a "false positive" ID in more than 90% of cases.

  • The police have admitted that "of course…

EWCA Civ 1058 – R (Bridges) v. CC South Wales
judiciary.uk · 2020

Sir Terence Etherton MR, Dame Victoria Sharp PQBD and Lord Justice Singh:

This appeal concerns the lawfulness of the use of live automated facial recognition technology (“AFR”) by the South Wales Police Force (“SWP”) in an ongoing trial usi…

UK Court of Appeal Finds Automated Facial Recognition Technology Unlawful in Bridges v South Wales Police
huntonprivacyblog.com · 2020

On August 11, 2020, the Court of Appeal of England and Wales overturned the High Court’s dismissal of a challenge to South Wales Police’s use of Automated Facial Recognition technology (“AFR”), finding that its use was unlawful and violated…

UK Police Use of Facial Recognition Fails to Meet 'Legal And Ethical Standards'
pcmag.com · 2022

Use of live facial recognition technology by UK police fails to meet “minimum ethical and legal standards” and should be banned from application in public spaces, say researchers from the University of Cambridge.

A team of researchers at th…

Variants

A "variant" is an incident that shares the same causative factors as an existing AI incident, produces similar harms, and involves the same intelligent systems. Rather than indexing variants as entirely separate incidents, we list them as variations of the similar incident first submitted to the database. Unlike other submission types in the incident database, variants are not required to have reporting as evidence from outside the incident database. Learn more in this research paper.

Similar Incidents

By textual similarity
