AI Incident Database

Report 3740

Associated Incidents

Incident 641 · 12 Reports
Nonconsensual Deepfake Porn of Bobbi Althoff Spreads Rapidly on X

Podcaster Bobbi Althoff insists graphic viral video is fake, AI-generated: ‘Sorry to disappoint’
pagesix.com · 2024

Bobbi Althoff is the latest victim of X-rated, AI-generated content being shared online.

The TikToker-turned-podcast host took to her Instagram Story on Wednesday to clear the air after someone edited her face onto a now-viral video of a woman pleasuring herself in bed.

“Hate to disappoint you all, but the reason I’m trending is 100% not me & is definitely AI generated,” she wrote atop a screenshot of her name trending on X.

The deepfakes started going viral on X, landing the podcaster on the trending page. Getty Images

Althoff, 26, doubled down on her denial in a follow-up video, voicing her disgust about the “graphic” video being so widely shared.

“Yesterday I went on X and I saw that I was trending and I was like, ‘Oh my god, that’s a first. I’m trending on Twitter! You guys must really love my podcast,'” she recalled.

However, the mother of two was left stunned after she “clicked” her name to see what everyone was talking about.

“I was like, ‘What the f–k is this?’ I felt like it was a mistake or something,” she continued. “I thought it was bots or something. I didn’t realize that it was actually people believing that that was me.”

Althoff, who has been in the news recently amid her split from husband Cory Althoff, said her entire PR team called her to see if the edited clip was “real” due to how convincing it was.

“[It’s] not me. Sorry to disappoint, but what the f–k?” the “Really Good Podcast” host concluded. “That was so graphic, too. … I had to cover my eyes.”

Just last month, explicit, edited images of Taylor Swift started circulating on X, reportedly leading the “furious” pop star, 34, to consider legal action against those involved.

At the time, a source told the Daily Mail that Swift was appalled that the social media platform even allowed the vile images — which depicted her in a variety of provocative poses at Kansas City Chiefs games — to be posted in the first place.

“Whether or not legal action will be taken is being decided, but there is one thing that is clear: These fake, AI-generated images are abusive, offensive, exploitative and done without Taylor’s consent and/or knowledge,” the insider told the newspaper.

The social media star is the latest victim of nonconsensual content being spread online. WireImage

“Legislation needs to be passed to prevent this, and laws must be enacted.”

Although Swift never publicly spoke out about the images, the backlash even reached the White House, with Press Secretary Karine Jean-Pierre calling the issue “alarming.”

A few legislators even proposed a new bill to combat the spread of nonconsensual deepfakes in the wake of the fallout.

