AI Incident Database

Report 3701

Associated Incidents

Incident 632 · 31 Reports
Significant Increase in Deepfake Nudes of Taylor Swift Circulating on Social Media

After deepfake porn images target Taylor Swift, is it safe to post photos of kids online?
kslnewsradio.com · 2024

SALT LAKE CITY — AI-generated deepfake porn victimized Taylor Swift, the biggest pop star in the world, last week as sexually explicit images of her swept across the internet and X (formerly Twitter). An expert advises parents not to post images of their kids in online public forums.

One of the most prominent examples of Swift on X attracted more than 45 million views, 24,000 reposts, and hundreds of thousands of likes and bookmarks before the verified user who shared the images had their account suspended for violating platform policy, as reported by The Verge.

Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content. Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them. We're closely…

— Safety (@Safety) January 26, 2024

‘I should be mad,’ says victim, 14

Earl Foote, CEO of Nexus IT, joins the discussion about deepfake images online.

“Recent studies show that about 95% of all deepfake videos and images that are created are created of celebrities and not of the random populace, but that doesn’t mean it doesn’t happen,” Foote said, referencing the case last year of a 14-year-old New Jersey high school girl who was one of the victims of fake AI-generated nude images circulating among students.

“I realized I should not be sad, but I should be mad. So, I came home, and I told my mom, and I told her that we have to do something about this because it is unfair to girls, and it’s just not right,” Francesca Mani told “Good Morning America.”


Don’t share images of kids in online public forums

Minors should not be sharing content with public audiences, Foote advised. Furthermore, parents should not be sharing photos, images, and videos of their children with public audiences, either, he said.

Don’t take the risk of content falling into the hands of predators, Foote warned.

“In today’s world, there’s just too many risks involved with that. So, my recommendation is that parents and minors use the functions within social-media applications to narrow down the group they’re sharing content with to close friends and families that they know and trust,” he said.

