AI Incident Database

Report 4534

Associated Incidents

Incident 9048 Report
Kate Isaacs, Advocate Against Image-Based Abuse, Reports Being Deepfaked

'Sicko imposed my face on sex scene – it took me ages to clock it wasn't me'
dailystar.co.uk · 2023

A woman was stunned to discover her face had been plastered onto deepfake porn that was posted to a public Twitter account.

Kate Isaacs said she was scrolling through Twitter and was horrified to see that a porn film posted to the site showing a man and woman having sex appeared to be starring her.

She revealed that she did not initially realise it was a deepfake video, her mind instead scrambling to work out how the footage could even exist.

“I remember feeling hot, having this wave come over me. My stomach dropped. I couldn't even think straight,” Kate, from west London, told MailOnline.

“I was going, 'Where was this? Has someone filmed this without me knowing? Why can't I remember this? Who is this man?'”

She admitted that it took her several minutes to realise that the video was not of her, and that it was in fact a deepfake.

Deepfakes are videos that have been altered so convincingly, using machine-learning software, that they appear to be the real deal.

While deepfaking software can be used for a wide range of purposes, it is most sinisterly used to alter porn films by digitally swapping in the faces of the performers involved.

In 2017, several famous actresses, including Emma Watson, Natalie Portman, and Scarlett Johansson, were the subject of non-consensual deepfaked videos, with many online viewers initially believing the videos were real.

The same thing appears to have happened to Kate, who said:

“It was so convincing, it even took me a few minutes to realise that it wasn't me.”

“Anyone who knew me would think the same. It was devastating. I felt violated, and it was out there for everyone to see.”

She said that while she never managed to figure out who made the video of her, she believes she was targeted because of her campaigning work against non-consensual porn.

As the founder of the NotYourPorn campaign, she has spoken out against the use of women's pictures, often nudes, for pornographic purposes without consent; she believes the video of her was made using innocent footage of her taken from the internet.

“This is all it takes now. It makes every woman who has an image of herself online a potential victim; that's pretty much all of us, these days,” she said.

