Description: The NudeNet dataset, an image dataset used to train systems for detecting nudity, was reportedly found to contain child sexual abuse material (CSAM), including images of identified or known victims. According to the Canadian Centre for Child Protection, the dataset had been widely downloaded and cited in academic research prior to the discovery. The images were allegedly included without vetting, exposing researchers to legal risk and perpetuating harm to victims. The dataset was reportedly removed following notification.
Entities
Alleged: NudeNet dataset maintainers and NudeNet model developers developed an AI system deployed by Academic researchers, Research institutions, AI developers, Dataset users, Independent researchers, and AI researchers, which harmed Academic researchers, minors, Identified CSAM victims, and Individuals subjected to sexual exploitation imagery.
Alleged implicated AI systems: NudeNet, AI image classification systems, AI content detection models, and Dataset scraping and aggregation pipelines
Incident Stats
Incident ID
1349
Report Count
1
Incident Date
2025-10-24
Editors
Daniel Atherton
Incident Reports
Reports Timeline
A large image dataset used to develop AI tools for detecting nudity contains a number of images of child sexual abuse material (CSAM), according to the Canadian Centre for Child Protection (C3P).
The NudeNet dataset, which contains more th…
Variants
A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.