Entities
Dataset scraping and aggregation pipelines
Incidents in which this entity was implicated as a system
Incident 1349: 1 Report
AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
2025-10-24
An image dataset, NudeNet, used to train nudity-detection systems was reportedly found to contain CSAM images, including material involving identified or known victims. According to the Canadian Centre for Child Protection, the dataset had been widely downloaded and cited in academic research prior to the discovery. The images were allegedly included without vetting, exposing researchers who downloaded the dataset to legal risk and perpetuating harm to the victims. The dataset was removed following notification.
Related Entities
Other entities related to the same incident. For example, if this entity is the developer implicated in an incident and another entity is the deployer, the deployer is marked as a related entity.