Description: UK government testing of police facial recognition technology reportedly found significantly higher false positive identification rates for Black and Asian individuals compared with white subjects, with particularly elevated error rates for Black women. The findings reportedly emerged from analysis of retrospective searches of the Police National Database and were disclosed by the Home Office amid plans for expanded national deployment.
Entities
Alleged: Unknown facial recognition technology developers developed an AI system deployed by Home Office, Metropolitan Police, Government of the United Kingdom, Law enforcement, and British law enforcement, which harmed General public, General public of the United Kingdom, Minorities in the United Kingdom, Black people in the United Kingdom, Asian people in the United Kingdom, Epistemic integrity, and National security and intelligence stakeholders.
Alleged implicated AI system: Unknown facial recognition technology
Incident Stats
Incident ID: 1305
Report Count: 2
Incident Date: 2025-12-05
Editors: Daniel Atherton
Incident Reports
Facial recognition technology has been more likely to incorrectly flag black and Asian people as possible suspects, according to tests.
An investigation into how the technology works when used to search the police national database found it…

Ministers are facing calls for stronger safeguards on the use of facial recognition technology after the Home Office admitted it is more likely to incorrectly identify black and Asian people than their white counterparts on some settings.
F…
Variants
A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.