AI Incident Database

Entity: AI content detection models

Incidents in which this entity was an implicated system:

Incident 1349 (1 report)
AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

2025-10-24

NudeNet, an image dataset used to train nudity-detection systems, was reportedly found to contain CSAM, including material involving identified or known victims. According to the Canadian Centre for Child Protection, the dataset had been widely downloaded and cited in academic research before the discovery. The images were allegedly included without vetting, exposing researchers to legal risk and perpetuating harm to victims. The dataset was removed following notification.

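The vetting the report describes as missing typically means screening candidate images against hash lists of known abusive material maintained by organizations such as the Canadian Centre for Child Protection or NCMEC. As a rough illustration only (the hash-list file, paths, and function names below are hypothetical, and real hash lists are access-restricted), a minimal sketch of that screening step might look like this:

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 so large images are not loaded whole."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def screen_dataset(image_dir: Path, hash_list_path: Path) -> list[Path]:
    """Return files in image_dir whose SHA-256 digest appears in the hash list.

    hash_list_path stands in for a vetted list of known-CSAM hashes obtained
    from a child-protection hotline; acquiring such a list is a controlled
    process and is not shown here.
    """
    known_bad = {
        line.strip().lower()
        for line in hash_list_path.read_text().splitlines()
        if line.strip()
    }
    return [
        p for p in sorted(image_dir.rglob("*"))
        if p.is_file() and sha256_of(p) in known_bad
    ]


if __name__ == "__main__":
    # Hypothetical paths; flagged files should be quarantined and reported,
    # never opened for manual inspection.
    for match in screen_dataset(Path("dataset_images"), Path("known_bad_sha256.txt")):
        print(f"QUARANTINE: {match}")
```

Note that exact cryptographic hashes only catch byte-identical copies; production screening pipelines generally rely on perceptual hashing (e.g., PhotoDNA) to catch re-encoded or resized variants, but those systems are likewise access-restricted.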

Related Entities

Other entities involved in the same incident. For example, if this entity developed a system involved in an incident and another entity deployed it, both are listed as related entities.

All of the following entities relate to Incident 1349, "AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims" (1 report):

  • Academic researchers: harmed by; involved as deployer
  • Research institutions: involved as deployer
  • AI developers: involved as deployer
  • Dataset users: involved as deployer
  • Independent researchers: involved as deployer
  • AI researchers: involved as deployer
  • NudeNet dataset maintainers: involved as developer
  • NudeNet model developers: involved as developer
  • Minors: harmed by
  • Identified CSAM victims: harmed by
  • Individuals subjected to sexual exploitation imagery: harmed by
  • NudeNet: implicated system
  • AI image classification systems: implicated system
  • Dataset scraping and aggregation pipelines: implicated system
