AI Incident Database

Incident 1400: West Midlands Police Reportedly Relied on Erroneous Copilot-Generated Intelligence in Maccabi Tel Aviv Away-Fan Ban Decision

Responded
Description: West Midlands Police reportedly included inaccurate intelligence purportedly generated using Microsoft Copilot in materials used to justify banning Maccabi Tel Aviv supporters from attending a November 2025 Europa League match against Aston Villa. The reported Copilot-linked error, which referred to a match that had not taken place, was later acknowledged by Chief Constable Craig Guildford.


Entities

Alleged: Microsoft developed an AI system deployed by West Midlands Police, which harmed Maccabi Tel Aviv supporters, the Jewish community in Birmingham and the West Midlands, Epistemic integrity, and Maccabi Tel Aviv F.C.
Alleged implicated AI system: Copilot

Incident Stats

Incident ID: 1400
Report Count: 2
Incident Date: 2025-10-24
Editors: Daniel Atherton

Incident Reports

West Midlands police chief apologises after AI error used to justify Maccabi Tel Aviv ban
theguardian.com · 2026

The chief of West Midlands police has apologised to MPs for giving them incorrect evidence about the decision to ban Maccabi Tel Aviv football fans, saying it had been produced by artificial intelligence (AI).

Craig Guildford told the home …

Inspection of police forces’ contributions to safety advisory groups: West Midlands Police (accessible)
gov.uk · 2026
Andy Cooke post-incident response

23 Stephenson Street
Birmingham
B2 4BH

Sir Andy Cooke QPM DL
HM Chief Inspector of Constabulary
HM Chief Inspector of Fire & Rescue Services

Sent by email:

The Rt Hon Shabana Mahmood MP
Secretary of State for the Home Department

14 January …

Variants

A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.

Similar Incidents

By textual similarity

  • Facial Recognition Trial Performed Poorly at Notting Hill Carnival (Aug 2017 · 4 reports)
  • Opaque Fraud Detection Algorithm by the UK’s Department of Work and Pensions Allegedly Discriminated against People with Disabilities (Oct 2019 · 6 reports)
  • ETS Used Allegedly Flawed Voice Recognition Evidence to Accuse and Assess Scale of Cheating, Causing Thousands to be Deported from the UK (Jan 2014 · 1 report)


2024 - AI Incident Database
