AI Incident Database
Entities

AI content detection models

Incidents implicated systems

Incident 1349 · 1 Report
AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

2025-10-24

An image dataset, NudeNet, used to train systems for detecting nudity was reportedly found to contain CSAM images, including material involving identified or known victims. According to the Canadian Centre for Child Protection, the dataset had been widely downloaded and cited in academic research prior to discovery. The images were allegedly included without vetting, exposing researchers to legal risk and perpetuating harm to victims. The dataset was subsequently removed following notification.


Related Entities
Other entities that are related to the same incident. For example, if an incident's developer is this entity but the deployer is another entity, they are marked as related entities.

Entity

Academic researchers

Affected by incidents
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Incidents involved as Deployer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Entity

Research institutions

Incidents involved as Deployer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Entity

AI developers

Incidents involved as Deployer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Entity

Dataset users

Incidents involved as Deployer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Entity

Independent researchers

Incidents involved as Deployer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Entity

AI researchers

Incidents involved as Deployer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Entity

NudeNet dataset maintainers

Incidents involved as Developer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Entity

NudeNet model developers

Incidents involved as Developer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Entity

minors

Affected by incidents
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Entity

Identified CSAM victims

Affected by incidents
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Entity

Individuals subjected to sexual exploitation imagery

Affected by incidents
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Entity

NudeNet

Incidents implicated systems
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Entity

AI image classification systems

Incidents implicated systems
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Entity

Dataset scraping and aggregation pipelines

Incidents implicated systems
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims


2024 - AI Incident Database
