Incident 1305: UK Facial Recognition System Reportedly Exhibits Higher False Positive Rates for Black and Asian Subjects

Description: UK government testing of police facial recognition technology reportedly found significantly higher false positive identification rates for Black and Asian individuals compared with white subjects, with particularly elevated error rates for Black women. The findings reportedly emerged from analysis of retrospective searches of the police national database and were disclosed by the Home Office amid plans for expanded national deployment.
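The metric at issue can be read as a per-group false positive identification rate: the share of searches involving people who are not actually in the database that nonetheless return a candidate match, computed separately for each demographic group. The sketch below is purely illustrative; the group labels follow the description above, but the counts are invented placeholders, not the Home Office's figures.

    # Illustrative only: comparing false positive identification rates (FPIR)
    # across demographic groups. The counts below are invented placeholders,
    # not the Home Office test results.

    def false_positive_rate(false_matches, non_matching_searches):
        # FPIR = false matches / searches of subjects not actually in the database
        return false_matches / non_matching_searches

    # hypothetical (false matches, searches of non-matching subjects) per group
    groups = {
        "White subjects": (5, 10_000),
        "Black subjects": (25, 10_000),
        "Asian subjects": (20, 10_000),
    }

    baseline = false_positive_rate(*groups["White subjects"])
    for name, counts in groups.items():
        rate = false_positive_rate(*counts)
        print(f"{name}: FPIR = {rate:.4%}, {rate / baseline:.1f}x the white-subject rate")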

Entities

Alleged: Unknown facial recognition technology developers developed an AI system deployed by Home Office, Metropolitan Police, Government of the United Kingdom, Law enforcement and British law enforcement, which harmed General public, General public of the United Kingdom, Minorities in the United Kingdom, Black people in the United Kingdom, Asian people in the United Kingdom, Epistemic integrity and National security and intelligence stakeholders.
Alleged implicated AI system: Unknown facial recognition technology

Incident Stats

Incident ID: 1305
Report Count: 2
Incident Date: 2025-12-05
Editors: Daniel Atherton

Incident Reports

Facial recognition cameras ‘more likely to wrongly flag black and Asian people’
telegraph.co.uk · 2025

Facial recognition technology has been more likely to incorrectly flag black and Asian people as possible suspects, according to tests.

An investigation into how the technology works when used to search the police national database found it…

Home Office admits facial recognition tech issue with black and Asian subjects
theguardian.com · 2025

Ministers are facing calls for stronger safeguards on the use of facial recognition technology after the Home Office admitted it is more likely to incorrectly identify black and Asian people than their white counterparts on some settings.

F…

Variants

A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.