Description: An evidence photo from a Westbrook, Maine, drug bust was reported to have been unintentionally altered by an AI‑powered editing tool before being posted to Facebook, changing packaging details and creating false visual elements. The altered image reportedly led to questions and concerns over the authenticity of police evidence. The Westbrook Police Department later confirmed the use of AI, explained the error occurred while adding their patch to the photograph, and issued a public apology.
Editor Notes: See the Westbrook Police Department's public apology here: https://www.facebook.com/WestbrookPD/posts/pfbid02aDWUJAtszA81VRJJZvzPaTMA98ir18fJpLgRHKFu6ZGkZesZ3f3veWotqKq9dgvQl.
Entities
Alleged: Unknown AI image-editing software developer developed an AI system deployed by Westbrook Police Department, which harmed Westbrook Police Department, General public of Maine, General public of Westbrook, Maine, and Epistemic integrity.
Alleged implicated AI systems: Unknown AI image-editing software and Facebook
Incident Stats
Risk Subdomain
The taxonomy's 23 subdomains provide an accessible and understandable classification of the hazards and harms associated with AI
3.1. False or misleading information
Risk Domain
The Domain Taxonomy of AI Risks classifies risks into seven AI risk domains: (1) Discrimination & toxicity, (2) Privacy & security, (3) Misinformation, (4) Malicious actors & misuse, (5) Human-computer interaction, (6) Socioeconomic & environmental harms, and (7) AI system safety, failures & limitations.
- Misinformation
Entity
Which, if any, entity is presented as the main cause of the risk
AI
Timing
The stage in the AI lifecycle at which the risk is presented as occurring
Post-deployment
Intent
Whether the risk is presented as occurring as an expected or unexpected outcome from pursuing a goal
Unintentional
Incident Reports
A Maine police department has offered a mea culpa after the agency said it inadvertently shared an AI-altered photo of drug evidence on social media.
The image from the Westbrook Police Department showed a collection of drug paraphernalia …
Variants
A "variant" is an AI incident similar to a known case: it shares the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.
Similar Incidents

