AI Incident Database

Report 4562

Associated Incidents

Incident 79927 Report
Aledo High School Student Allegedly Generates and Distributes Deepfake Nudes of Seven Female Classmates

Advocates call for passage of bill to make AI-generated deepfake porn illegal
wwnytv.com · 2024

WASHINGTON (Gray DC) - On Wednesday, congressional lawmakers and victims of sexual exploitation pushed for the passage of a bill that would criminalize the publication of non-consensual deepfake pornography.

“The legislation, the Take It Down Act, does two things. Number one, it makes it a crime, it makes it a felony to publish non-consensual intimate images. This is wrong. It is criminal,” said Sen. Ted Cruz (R-TX), who drafted the bill with Sen. Amy Klobuchar (D-MN).

After the bill unanimously passed in the Senate earlier this month, Cruz organized the press conference on Wednesday to try to push the House to vote on it before the end of the year.

“Our challenge really is just the hours in the day and the calendar. We’re near the end of the year. There are lots of competing priorities,” he said.

AI-generated deepfake pornography has become more common in recent years and has had an increasing impact on teenagers like 15-year-old Texas student Elliston Berry, who was at the press conference Wednesday.

“I remember feeling that hopelessness and feeling, as if, honestly the world was going to end,” she said. “I mean, I’m a teenage girl in high school, and these fake photos of me were going around my entire school, and it was just truly so, so terrifying.”

A classmate of Berry’s used AI to turn innocent photos of her into deepfake nude pictures, then sent them out using Snapchat.

When she and her mom were unable to get Snapchat to delete the images, she reached out to Sen. Cruz’s office, which prompted the creation of the legislation.

The bill would require social media platforms and other websites to take down non-consensual deepfake porn within 48 hours of a victim reporting it.

“Fewer Americans will have their lives turned upside down if we can get this done,” said Sen. Klobuchar. “Fewer children will have their innocence snatched away from them, and more victims can seek justice that they deserve.”

Cruz said the issue now is trying to get a House vote on the bill before the end of the year and stressed the importance of not delaying the vote.

“This will pass eventually, but if it passes a year from now, how many other victims will be victimized?” said Cruz. “How many girls will be victims? How many parents will lose a child because Congress delayed and we couldn’t act?”

