AI Incident Database

Report 6090

Associated Incidents

Incident 1165: 3 Reports
Grok Imagine Reportedly Produces Non-Consensual Taylor Swift Deepfake Nudes Without Explicit Prompting

Grok generates fake Taylor Swift nudes without being asked
arstechnica.com · 2025

Backlash over offensive Grok outputs continues, just a couple weeks after the social platform X scrambled to stop its AI tool from dubbing itself "MechaHitler" during an antisemitic meltdown.

Now, The Verge has found that the newest video feature of Elon Musk's AI model will generate nude images of Taylor Swift without being prompted.

Shortly after "Grok Imagine" was released Tuesday, The Verge's Jess Weatherbed was shocked to discover that the video generator spat out topless images of Swift "the very first time" she used it.

According to Weatherbed, Grok produced more than 30 images of Swift in revealing clothing when asked to depict "Taylor Swift celebrating Coachella with the boys." Using the Grok Imagine feature, users can choose from four presets ("custom," "normal," "fun," and "spicy") to convert such images into video clips in 15 seconds.

At that point, all Weatherbed did was select "spicy" and confirm her birth date for Grok to generate a clip of Swift tearing "off her clothes" and "dancing in a thong" in front of "a largely indifferent AI-generated crowd."

The outputs that Weatherbed managed to generate without jailbreaking or any intentional prompting are particularly concerning, given the major controversy after sexualized deepfakes of Swift flooded X last year. Back then, X reminded users that "posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content."

"Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them," the X Safety account posted. "We're closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed. We're committed to maintaining a safe and respectful environment for all users."

But X Safety may need to ramp up monitoring to clean up Grok's outputs. Grok itself cited The Verge's reporting while confirming that its own seemingly flawed design can trigger partially nude outputs of celebrities.

xAI can likely fix the issue through more fine-tuning. Weatherbed noted that asking Grok directly to generate non-consensual nude images of Swift produced blank boxes rather than offensive outputs. Grok also seemingly won't accept prompts to alter Swift's appearance in other ways, such as making her appear overweight. And when Weatherbed tested "spicy" mode on images of children, Grok refused to depict kids inappropriately.

However, it may not be easy to get Grok to distinguish between adult user requests for "spicy" content versus illegal content. The "spicy" mode didn't always generate Swift deepfakes, Weatherbed confirmed, but in "several" instances it "defaulted" to "ripping off" Swift's clothes.

With enforcement of the Take It Down Act starting next year, which requires platforms to promptly remove non-consensual sex images, including AI-generated nudes, xAI could face legal consequences if Grok's outputs aren't corrected.

So far, X has not commented on The Verge's report. Instead, Musk has spent the day hyping Grok Imagine and encouraging users to share their "creations."
