Description: Kate Isaacs, a London-based activist and founder of the #NotYourPorn campaign, was targeted in a deepfake incident. Her face was allegedly digitally superimposed onto a pornographic video using AI, and the result was shared online. The video, tagged with her name, reportedly led to streams of abuse, doxing, and threats of violence. The attack reportedly followed her efforts to pressure PornHub to remove unverified content.
Editor Notes: The specific timing of this incident remains unclear. The #NotYourPorn campaign is reported to have succeeded in pressuring PornHub to remove 10 million unverified videos in 2020, and the deepfaked video itself is reported to have appeared sometime that year. The initial reporting on the deepfake incident is dated 10/21/2022.
Alleged: Unknown deepfake technology developer developed an AI system deployed by Unknown Twitter user, which harmed Kate Isaacs.
Alleged AI Systems Involved: Unknown deepfake app
Incident Status
Incident ID
904
Report Count
8
Incident Date
2022-10-21
Editor
Daniel Atherton