AI Incident Database
Entities

NudeNet model developers

Incidents involved as Developer

Incident 1349 · 1 Report
AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

2025-10-24

An image dataset, NudeNet, used to train systems for detecting nudity was reportedly found to contain CSAM images, including material involving identified or known victims. According to the Canadian Centre for Child Protection, the dataset had been widely downloaded and cited in academic research prior to discovery. The images were allegedly included without vetting, exposing researchers to legal risk and perpetuating harm to victims. The dataset was subsequently removed following notification.


Related Entities
Other entities related to the same incident. For example, if the developer of an incident is this entity but the deployer is another entity, they are marked as related entities.

Entity

Academic researchers

Affected by Incidents
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

Incidents involved as Deployer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
Entity

Research institutions

Incidents involved as Deployer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
Entity

AI developers

Incidents involved as Deployer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
Entity

Dataset users

Incidents involved as Deployer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
Entity

Independent researchers

Incidents involved as Deployer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
Entity

AI researchers

Incidents involved as Deployer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
Entity

NudeNet dataset maintainers

Incidents involved as Developer
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
Entity

minors

Affected by Incidents
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
Entity

Identified CSAM victims

Affected by Incidents
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
Entity

Individuals subjected to sexual exploitation imagery

Affected by Incidents
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
Entity

NudeNet

Incidents implicated systems
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
Entity

AI image classification systems

Incidents implicated systems
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
Entity

AI content detection models

Incidents implicated systems
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims
Entity

Dataset scraping and aggregation pipelines

Incidents implicated systems
  • Incident 1349
    1 Report

    AI Training Dataset for Detecting Nudity Allegedly Found to Contain CSAM Images of Identified Victims

2024 - AI Incident Database