AI Incident Database

Incident 1354: Purportedly AI-Altered Fake Nude Images of High School Girls and Women Reportedly Created and Disseminated in Pensacola, Florida

Description: In Pensacola, Florida, an 18-year-old man allegedly used an online AI image-alteration application to digitally "undress" photos of dozens of girls and young women without their consent, creating realistic fake nude images. Some source photos were reportedly taken when the victims were minors. The images were reportedly discovered on the man's phone and subsequently copied and shared with other students by a third party, prompting a police investigation and arrest.
Editor Notes: Timeline note: Reporting indicates that the AI-altered images were discovered and brought to law-enforcement attention in October 2024, though the precise date of creation and initial dissemination is unclear. A related criminal filing for dissemination of the images was reported on November 14, 2024. The incident ID was created on January 31, 2026.


Entities

Alleged: Unknown image generator developers and Unknown deepfake technology developers developed an AI system deployed by Unnamed 18-year-old male student from Pensacola, which harmed Unnamed students from Pensacola, students, minors, and Epistemic integrity.
Alleged implicated AI systems: Unknown nudify app, Unknown image generator technology, and Unknown deepfake technology

Incident Stats

Incident ID: 1354
Report Count: 2
Incident Date: 2024-10-10
Editors: Daniel Atherton

Incident Reports

Reports Timeline

Student used AI to 'undress' dozens of high school girls. Parents want him arrested.
Police arrest girl for sending fake, nude photos of females. Why parents are outraged
Student used AI to 'undress' dozens of high school girls. Parents want him arrested.
pnj.com · 2024

A few weeks ago, 18-year-old Bryre Thomson was settling into her freshman year of college when she started getting text messages from her friends back home in Pensacola that fake, nude photos of girls and young women from several local high…

Police arrest girl for sending fake, nude photos of females. Why parents are outraged
pnj.com · 2024

The girlfriend of a young Pensacola man who allegedly used Artificial Intelligence to digitally "undress" pictures of more than a dozen girls and women has been arrested for sending the images without consent. The 18-year-old former Washing…

Variants

A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.

Similar Incidents

By textual similarity

Skating Rink’s Facial Recognition Cameras Misidentified Black Teenager as Banned Troublemaker
Jul 2021 · 3 reports

Tesla Driver on Autopilot Ran a Red Light, Crashing into a Car and Killing Two People in Los Angeles
Dec 2019 · 4 reports

Detroit Police Wrongfully Arrested Black Man Due To Faulty FRT
Jan 2020 · 11 reports


2024 - AI Incident Database
