Unknown image generator developers
Incidents involved as Developer
Incident 610 · 11 Reports
Purported Deepfake Technology Was Reportedly Used to Generate Naked Pictures of Underage Girls in Spanish Town
2023-09-17
In Spain, an AI app was used to digitally alter photos of young girls, making them appear naked. This manipulation sparked an investigation after these images were circulated in Almendralejo, a town in the Extremadura region, raising serious concerns about digital privacy violations and the potential spread of these images on pornographic sites.
Incident 754 · 3 Reports
Female Politicians in the United Kingdom Reportedly Victimized by Purported Deepfake Pornography
2024-07-01
Female politicians in the United Kingdom, including Angela Rayner, Gillian Keegan, Penny Mordaunt, Priti Patel, Stella Creasy, and Dehenna Davison, have reportedly been targeted by nonconsensual, purportedly AI-generated deepfake pornography. The images, which some accounts suggest may have circulated online for an extended period, were reported to have caused distress and, in at least some cases, to have prompted reports to law enforcement.
Incident 1406 · 3 Reports
Purported AI-Generated War Footage Reportedly Circulated Widely Online During the Opening Phase of the War in Iran
2026-02-28
Reports said that, in the early days of the war in Iran, purported AI-generated fakes showing nonexistent wartime scenes reached millions of viewers online. Their spread across social media and messaging apps allegedly distorted public perception of the conflict and contributed to information disorder.
Incident 1315 · 2 Reports
Purportedly AI-Generated Nude Images of Middle School Students Reportedly Circulated at Louisiana School
2025-08-26
Purportedly AI-generated sexually explicit images depicting a 13-year-old student and other middle school students were reportedly created and circulated at a Louisiana school. The images were shared via social media and shown to students on campus, resulting in ongoing harassment and psychological harm. After the images were displayed on a school bus, the affected student became involved in a physical altercation and was expelled.
Related Entities
Other entities that are related to the same incident. For example, if the developer of an incident is this entity but the deployer is another entity, they are marked as related entities.
students
Incidents Harmed By
- Incident 1354 · 2 Reports
Purportedly AI-Altered Fake Nude Images of High School Girls and Women Reportedly Created and Disseminated in Pensacola, Florida
- Incident 1348 · 1 Report
Purported Deepfake Explicit Images of Middle School Students Allegedly Created and Circulated Using Mobile App in Goffstown, New Hampshire