AI Incident Database

Report 4390

Associated Incidents

Incident 874 · 7 Reports
1 in 6 Congresswomen Have Reportedly Been Targeted by AI-Generated Nonconsensual Intimate Imagery

Dozens of lawmakers victims of sexually explicit deepfakes: Report
thehill.com · 2024

More than two dozen lawmakers have been the victims of deepfake pornography, with female lawmakers significantly more likely to be targeted, according to a new report released Wednesday. 

The report from the American Sunlight Project, an advocacy group focused on combating disinformation, found more than 35,000 mentions of 26 lawmakers on prominent deepfake websites. The impacted lawmakers included 25 women and one man.

"This report reveals a stark and disturbing reality," Nina Jankowicz, CEO of the American Sunlight Project, said in a statement.  

"Female lawmakers are being targeted by AI-generated deepfake pornography at an alarming rate," she continued, adding, "This isn't just a tech problem; it's a direct assault on women in leadership and democracy itself." 

After the group notified the affected members of Congress, the content depicting 14 of these lawmakers was removed in less than 48 hours, according to the report. 

The content depicting another nine lawmakers was entirely or mostly removed but remained on landing or search result pages. 

"The vast majority of targets of deepfake sexual abuse are private citizens, and even minors, who frequently lack the resources to rectify the harm done to them," Jankowicz said.

"I myself have been targeted with this vile content," she added. "As both a survivor and a researcher, I strongly feel that all women like me deserve to be protected by their government and have a path to justice for the sexual abuse they have endured. It is long past time for Congress to act." 

The Senate passed legislation last week that would criminalize non-consensual, sexually explicit imagery, including deepfakes generated using artificial intelligence (AI), and require platforms to remove the content once notified. 

The bill, known as the TAKE IT DOWN Act, is one of several pieces of legislation moving through Congress that have sought to address the growing problem of sexually explicit deepfakes.

The Disrupt Explicit Forged Images and Non-Consensual Edits Act, also known as the DEFIANCE Act, similarly passed the Senate in July. The bill would create a federal civil remedy for victims of deepfake pornography. 

In the House, the DEFIANCE Act was introduced by Rep. Alexandria Ocasio-Cortez (N.Y.), who has talked about her own personal experience as the target of sexually explicit AI-generated content.
