AI Incident Database

Report 4020

Associated Incidents

Incident 772: 2 Reports
Kristen Bell Deepfaked in Non-Consensual AI-Generated Pornography

Kristen Bell ‘shocked’ to learn her face was used in ‘deepfake’ porn video
independent.co.uk · 2020

Actor Kristen Bell has revealed that she felt "exploited" after learning that her face was used in "deepfake" pornography.

The Good Place star told *Vox* that she only found out she'd fallen victim to the practice, which sees the faces of people (usually women) non-consensually imposed onto the bodies of pornographic actors, after actor Ashton Kutcher alerted her husband Dax Shepard.

"[Ashton] actually told him, 'Oh, by the way, there are these things called deepfakes and your wife is one of them,'" Bell explained.

"I was just shocked, because this is my face. [It] belongs to me."

According to Vox's report, 96 per cent of all deepfakes online are pornographic, and nearly 100 per cent of those depict women. While they are often labelled as "fake", they can still be extremely damaging to the people involved.

"You know, we're having this gigantic conversation about consent and I don't consent, so that's why it's not OK," Bell said. "Even if it's labelled as, 'Oh, this is not actually her', it's hard to think about that.

"I wish that the internet were a little bit more responsible and a little bit kinder."

In 2018, Pornhub and Twitter banned the uploading of celebrity deepfakes to their platforms, while Reddit changed its rules to include "depictions that have been faked" in its banning of nudity and sexually explicit content.

