Incident 310: High False Positive Rate by SWP's Facial Recognition Use at Champions League Final

Description: South Wales Police (SWP)'s automated facial recognition (AFR) system, deployed at the 2017 Champions League Final football match in Cardiff, wrongly flagged innocent people as potential matches at an extremely high false positive rate of more than 90%.
Alleged: NEC developed an AI system deployed by South Wales Police, which harmed and falsely accused Finals attendees.

Suggested citation format

Lam, Khoa. (2017-06-03) Incident Number 310. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
310
Report Count
8
Incident Date
2017-06-03
Editors
Khoa Lam


Incident Reports

NEC provides facial recognition system to South Wales Police in the UK

Tokyo & London, July 11, 2017: NEC Corporation (NEC; TSE: 6701) today announced that it has provided a facial recognition system for South Wales Police in the UK through NEC Europe Ltd. The system utilizes NeoFace® Watch, NEC's flagship fa…

2,000 wrongly matched with possible criminals at Champions League

More than 2,000 people were wrongly identified as possible criminals by facial scanning technology at the 2017 Champions League final in Cardiff.

South Wales Police used the technology as about 170,000 people were in Cardiff for the Real Ma…

Facial recognition wrongly identified 2,000 people as possible criminals when Champions League final came to Cardiff

Facial recognition software wrongly identified more than 2,000 people as potential criminals as police patrolled the Champions League final in Cardiff.

The technology provided hundreds of “false positives” wrongly marking out innocent peopl…

UK police say 92% false positive facial recognition is no big deal

A British police agency is defending its use of facial recognition technology at the June 2017 Champions League soccer final in Cardiff, Wales—among several other instances—saying that despite the sy…

British police defend their new criminal facial recognition technology – even though it's failing at a rate of 92%
  • Police in South Wales have been relying on facial recognition technology for 12 months.

  • An FOI request has revealed that the technology provides a "false positive" ID in more than 90% of cases.

  • The police have admitted that "of course…
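The "false positive rate" cited in these reports is the share of the system's positive alerts that turned out to be wrong. As a minimal sketch, the arithmetic looks like this (the figures below are hypothetical, chosen only to illustrate how a rate above 90% can arise; they are not SWP's actual deployment numbers):

```python
def false_positive_rate(false_positives: int, true_positives: int) -> float:
    """Share of positive alerts that were wrong: FP / (FP + TP)."""
    total_alerts = false_positives + true_positives
    return false_positives / total_alerts

# Hypothetical illustration: 2,300 false alerts out of 2,500 total
# matches would give a 92% false positive rate.
rate = false_positive_rate(false_positives=2300, true_positives=200)
print(f"{rate:.0%}")  # → 92%
```

Note that this metric says nothing about the people the system scanned but never flagged; with roughly 170,000 attendees, even a small per-face error rate can produce thousands of false alerts.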

EWCA Civ 1058 – R (Bridges) v. CC South Wales

Sir Terence Etherton MR, Dame Victoria Sharp PQBD and Lord Justice Singh:

This appeal concerns the lawfulness of the use of live automated facial recognition technology (“AFR”) by the South Wales Police Force (“SWP”) in an ongoing trial usi…

UK Court of Appeal Finds Automated Facial Recognition Technology Unlawful in Bridges v South Wales Police

On August 11, 2020, the Court of Appeal of England and Wales overturned the High Court’s dismissal of a challenge to South Wales Police’s use of Automated Facial Recognition technology (“AFR”), finding that its use was unlawful and violated…

UK Police Use of Facial Recognition Fails to Meet 'Legal And Ethical Standards'

Use of live facial recognition technology by UK police fails to meet “minimum ethical and legal standards” and should be banned from application in public spaces, say researchers from the University of Cambridge.

A team of researchers at th…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.