AI Incident Database

Report 3740

Related Incidents

Incident 641 · 12 Reports
Nonconsensual Deepfake Porn of Bobbi Althoff Spreads Rapidly on X

Podcaster Bobbi Althoff insists graphic viral video is fake, AI-generated: ‘Sorry to disappoint’
pagesix.com · 2024

Bobbi Althoff is the latest victim of X-rated, AI-generated content being shared online.

The TikToker-turned-podcast host took to her Instagram Story on Wednesday to clear the air after someone edited her face onto a now-viral video of a woman pleasuring herself in bed.

“Hate to disappoint you all, but the reason I’m trending is 100% not me & is definitely AI generated,” she wrote atop a screenshot of her name trending on X.

The deepfakes started going viral on X, landing the podcaster on the trending page. Getty Images

Althoff, 26, doubled down on her denial in a follow-up video, voicing her disgust about the “graphic” video being so widely shared.

“Yesterday I went on X and I saw that I was trending and I was like, ‘Oh my god, that’s a first. I’m trending on Twitter! You guys must really love my podcast,'” she recalled.

However, the mother of two was left stunned after she “clicked” her name to see what everyone was talking about.

“I was like, ‘What the f–k is this?’ I felt like it was a mistake or something,” she continued. “I thought it was bots or something. I didn’t realize that it was actually people believing that that was me.”

Althoff, who has been in the news recently amid her split from husband Cory Althoff, said her entire PR team called her to see if the edited clip was “real” due to how convincing it was.

“[It’s] not me. Sorry to disappoint, but what the f–k?” the “Really Good Podcast” host concluded. “That was so graphic, too. … I had to cover my eyes.”

Just last month, explicit, edited images of Taylor Swift started circulating on X, reportedly leading the “furious” pop star, 34, to consider legal action against those involved.

At the time, a source told the Daily Mail that Swift was appalled that the social media platform even allowed the vile images — which depicted her in a variety of provocative poses at Kansas City Chiefs games — to be posted in the first place.

“Whether or not legal action will be taken is being decided, but there is one thing that is clear: These fake, AI-generated images are abusive, offensive, exploitative and done without Taylor’s consent and/or knowledge,” the insider told the newspaper.

The social media star is the latest victim of nonconsensual content being spread online. WireImage

“Legislation needs to be passed to prevent this, and laws must be enacted.”

Although Swift never publicly spoke out about the images, the backlash even reached the White House, with Press Secretary Karine Jean-Pierre calling the issue “alarming.”

A few legislators even proposed a new bill to combat the spread of nonconsensual deepfakes in the wake of the fallout.


2024 - AI Incident Database