Description: AI 'nudify' apps are being used to generate hyperrealistic non-consensual nude photos of individuals, which are then exploited for extortion and harassment. These apps use generative AI to remove clothing from images and create convincing fakes, often distributed on platforms like Telegram. Victims are threatened or shamed using these AI-generated images, with little recourse for removal or legal action.
Alleged: Unknown deepfake creators developed an AI system deployed by unknown deepfake creators and extortionists, which harmed women in India, women generally, and the general public.
Incident Status
Incident ID
782
Report Count
1
Incident Date
2024-09-09
Editors
Daniel Atherton