Description: Purported deepfake nude images of female students were circulated without consent at an educational institute in Valencia, Spain. At least 16 minors reported that original photographs had been digitally manipulated so they appeared completely naked, with some images shared via social media accounts created in victims' names. The alleged creation and distribution of the images is being examined under Spanish laws related to the protection of minors.
Editor Notes: The reported incident occurred sometime in December 2024; outlets began widely reporting on it on July 27, 2025.
Entities
Alleged: Unknown deepfake technology developers and Unknown image generator developers developed an AI system deployed by Unnamed male students in Valencia, which harmed Unnamed female students in Valencia.
Alleged implicated AI systems: Unknown deepfake technology, Unknown image generators, and Social media platforms
Incident Stats
Incident ID
1342
Report Count
1
Incident Date
2024-12-01
Editors
Daniel Atherton
Incident Reports
Reports Timeline

Spanish police said on Sunday they were investigating a 17-year-old on suspicion of using artificial intelligence to create deepfake nude images of female classmates for sale.
Sixteen young women at an educational institute in Valencia, in southea…
Variants
A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.
Similar Incidents
Did our AI mess up? Flag the unrelated incidents
Defamation via AutoComplete
· 28 reports