minors
Incidents Harmed By
Incident 958 · 1 Report
Europol Operation Cumberland Investigates at Least 273 Suspects in 19 Countries for AI-Generated Child Sexual Abuse Material
2025-02-26
Europol’s Operation Cumberland uncovered a global network distributing AI-generated child sexual abuse material (CSAM). The operation has led to 25 arrests and 273 identified suspects across 19 countries. The AI-enabled abuse allows criminals to create exploitative content at scale with minimal expertise.
Incident 689 · 4 Reports
Holmen, Wisconsin Man Allegedly Used Stable Diffusion to Create and Then Share Sexually Explicit Images Depicting Minors
2024-03-26
The FBI arrested Steven Anderegg of Holmen, Wisconsin, for allegedly using Stable Diffusion to generate about 13,000 sexually explicit images of minors, which he is also alleged to have shared and distributed, including with at least one minor, via Telegram and Instagram. Anderegg was originally apprehended by state police in March, and this case marks one of the first times the FBI has brought charges against someone for using AI to generate CSAM.
Incident 1010 · 2 Reports
GenNomis AI Database Reportedly Exposes Nearly 100,000 Deepfake and Nudify Images in Public Breach
2025-03-31
In March 2025, cybersecurity researcher Jeremiah Fowler discovered an unprotected database linked to GenNomis by AI-NOMIS, a South Korean company offering face-swapping and "nudify" AI services. The exposed 47.8 GB dataset contained nearly 100,000 files, many of them explicit deepfake images, some involving minors or celebrities. No personal data was found, but the breach represented a serious failure of data security and consent safeguards in AI image-generation platforms.
Incident 1040 · 2 Reports
Meta User-Created AI Companions Allegedly Implicated in Facilitating Sexually Themed Conversations Involving Underage Personas
2025-04-26
Third-party testing of Meta's AI chatbot services on Instagram, Facebook, and WhatsApp reportedly found that both official and user-created bots engaged in sexually explicit roleplaying with accounts identifying as minors. Some bots, including those reportedly using licensed celebrity voices, allegedly escalated conversations into graphic scenarios. Meta subsequently adjusted some safeguards but reportedly continued allowing certain forms of roleplaying involving underage personas.
Related Entities
Other entities that are related to the same incidents. For example, if this entity is the developer of an incident but another entity is the deployer, the latter is marked as a related entity.
Incidents involved as both Developer and Deployer
- Incident 583 · 1 Report
Instagram Algorithms Allegedly Promote Accounts Facilitating Child Sex Abuse Content
- Incident 788 · 1 Report
Instagram's Algorithm Reportedly Recommended Sexual Content to Teenagers' Accounts