Facial recognition technology is more likely to incorrectly flag black and Asian people as possible suspects, according to tests.
An investigation into how the technology works when used to search the Police National Database found it…

Ministers are facing calls for stronger safeguards on the use of facial recognition technology after the Home Office admitted it is more likely to incorrectly identify black and Asian people than their white counterparts on some settings.
F…
Facial recognition technology that has been used hundreds of times a day by British police forces is biased against minority groups and women, the Home Office has acknowledged.
Official research showed that the technology identified the wro…
Police forces successfully lobbied to use a facial recognition system known to be biased against women, young people, and members of ethnic minority groups, after complaining that another version produced fewer potential suspects.
UK forces…
