AI Incident Database

Report 5100

Associated Incidents

Incident 692 · 3 Reports
London Metropolitan Police's Facial Recognition Technology Reportedly Misidentified Shaun Thompson as Suspect Leading to Arrest

Black Activist Brings Legal Challenge To Police After False Facial Recognition Arrest
peopleofcolorintech.com · 2024

Black British anti-knife crime activist Shaun Thompson, 38, has launched a legal challenge against the Metropolitan Police.

Police detained him after live facial recognition technology wrongly identified him as a suspect.

'Stop and search on steroids'

Thompson, who volunteers with the Street Fathers youth outreach group, described the system as 'stop and search on steroids' following his 20-minute detention at London Bridge station earlier this year.

Returning from a volunteer shift in south London, Thompson was wrongly flagged as a suspect on the Met's facial recognition database, leading him to be held for almost 30 minutes by the police.

Thompson told the BBC he was only let go after handing over a copy of his passport, having first been asked to give fingerprints.

"It felt intrusive," he told the BBC. "I was treated guilty until proven innocent."

Legal review of the facial recognition technology

According to MyLondon, Thompson reflected on the incident, calling the technology 'flawed' and saying he felt he was being treated as 'guilty until proven innocent.'

He has filed an application for a judicial review of the use of the technology.

BBC Newsnight reported that the mistake may have been caused by a family resemblance, but the Metropolitan Police declined to comment.

