AI Incident Database

Report 4085

Associated Incidents

Incident 772 · 2 Reports
Kristen Bell Deepfaked in Non-Consensual AI-Generated Pornography

The most urgent threat of deepfakes isn’t politics. It’s porn.
vox.com · 2020

Kristen Bell first found out there were deepfake porn videos of her online from her husband, the actor Dax Shepard. In the videos, her face has been manipulated onto porn performers' bodies.

"I was just shocked," Bell told me. "It's hard to think about that I'm being exploited."

And this isn't only happening to celebrities. Noelle Martin, a recent law graduate in Perth, Australia, discovered that someone took photos she'd shared on social media and used them first to photoshop her into nude images, and then to create deepfake videos.

Deepfakes are often portrayed as a political threat: fake videos of politicians making comments they never made. But in a recent report, the research group Deeptrace found that 96 percent of deepfakes found online are pornographic. Of those videos, virtually all are of women. And virtually all are made without their consent.

"There's a lot of talk about the challenges that come with the advancements in deepfake technology," Martin said. "But I think what is often missed from the discussion is the impact to individuals right now. Not in a few years, not in a couple of months. Right now."

What's happening in these videos is a specific kind of digital manipulation. It's not the same as the older face-swapping filters you might have used on social media. Those tools let you put your face onto a friend's head, but because they transfer both your facial features and your expressions, you remain in control of the result.

Deepfakes are different. They can take your facial features alone and animate your face with someone else's expressions. That's what makes them so invasive. The creator takes away a victim's control of her face, using it for something she never wanted. In doing so, they contribute to a long history of sexual humiliation of women.

You can find this video and all of Vox's videos on YouTube. And join the Open Sourced Reporting Network to help us report on the real consequences of data, privacy, algorithms, and AI.

