AI Incident Database

Report 4563

Associated Incidents

Incident 79927 Report
Aledo High School Student Allegedly Generates and Distributes Deepfake Nudes of Seven Female Classmates

Sen. Ted Cruz, Texas high schooler urge US House to pass Senate bill banning deepfake porn
statesman.com · 2024

In October 2023, when Texas high school student Elliston Berry was 14 years old, a male classmate used artificial intelligence to turn innocent photos of her and her friends into "deepfake" pornography. Then, he shared the realistic-looking nude photos on the social media platform Snapchat.

"That morning, when I woke up, it was one of the worst feelings I've ever felt, feeling hopeless and feeling as if my entire innocence was stripped away," Berry, who hails from the Fort Worth suburb of Aledo, said at a Wednesday news conference on Capitol Hill. "The unknown was so terrifying, that this is my reality."

It took eight months, and a call to the company from U.S. Sen. Ted Cruz, R-Texas, to get Snapchat to remove the photos, Berry said.

Cruz said Wednesday that Berry's experience, which he learned about because her mother contacted his office, led him to draft legislation to help others facing similar situations.

The "TAKE IT DOWN" Act, which Cruz introduced with Democratic Sen. Amy Klobuchar of Minnesota, makes it a crime to post nude or sexually explicit imagery without a person's consent, including computer-generated photos and videos that depict real people. It also requires social media and other websites to remove such images within 48 hours after notification from a victim.

"It should not take a sitting U.S. senator or a sitting member of Congress to make a phone call to get this garbage down," Cruz said.

The Senate unanimously passed the bipartisan bill on Dec. 4. In Wednesday's news conference, Cruz, Klobuchar and leading House sponsor Rep. Maria Elvira Salazar, R-Fla., urged the lower chamber to schedule the bill for a vote before the end of the year.

Also urging swift passage was South Carolina state Rep. Brandon Guffey, whose son Gavin died by suicide minutes after an online scammer threatened the 17-year-old with "sextortion."

"That is the hurt, that is the shame, because there is no out," Guffey said of his son's death. "The threat of these images going viral -- you can't hide the amount of shame."

Klobuchar and Cruz's bill would make it a federal felony to publish nonconsensual intimate images, including deepfakes.

The bill is among numerous solutions lawmakers have pushed in recent years to address the proliferation of nonconsensual online graphic imagery, including deepfakes and "revenge porn." The latter refers to intimate images posted by romantic or sexual partners, especially ex-partners, without an individual's consent.

There has been a "huge increase" in the number of children and teens, particularly young boys, who have been threatened with sextortion in recent years, according to the FBI. More than 20 young victims died by suicide between October 2021 and March 2023, Klobuchar emphasized.

Nearly all U.S. states have laws protecting against nonconsensual intimate imagery, including the 29 states that ban "deepfake," or computer-generated, nonconsensual pornography. But penalties vary, and even with the states' laws, victims struggle to have content removed from websites that might be based in other states.

Congress in a 2022 law also created a civil cause of action for victims to sue perpetrators. However, filing lawsuits can be difficult, costly and time-consuming.

According to Cruz's office, nearly 90 organizations have backed the TAKE IT DOWN Act, including victim advocacy groups, law enforcement and tech industry leaders.

"If this bill is on the floor of the House, this bill will pass," Cruz said. "We need to get this done now. We can't deal with this later because that means we're turning a blind eye on the victims."

