Associated Incidents

Sexually explicit photographs of Taylor Swift are all over the internet. Except they are not actually pictures of Swift: they are deepfakes created with AI technology. The fact that they are computer-generated doesn’t make them look any less horrifyingly realistic – and it certainly hasn’t stopped people from gawking at them. One of the images shared by a user on X (Twitter) was viewed more than 45m times in the 17 hours before it was taken down.
If this story sounds familiar it’s because you’ve probably heard a variation of it before. Deepfake pornography is everywhere – according to Danielle Citron, a professor at the University of Virginia School of Law, there are more than 9,500 sites “devoted to non-consensual intimate imagery”. It used to take hundreds of images of someone and an immense amount of computer processing power to create a convincing deepfake. Now you just need a couple of photos of someone’s face and a phone app. While Swift may be one of the most recent, and most high-profile, victims of the technology, it has upended an enormous number of lives.
More specifically, it’s upended the lives of girls and women. While men aren’t immune to being victims of other forms of AI-generated imagery, deepfake porn overwhelmingly targets women. A 2019 report by the cybersecurity company Deeptrace Labs, for example, found that non-consensual deepfake pornography accounted for 96% of the total deepfake videos online and concluded that the phenomenon “exclusively targets and harms women”. In the years since the Deeptrace report came out that harm has only increased. There has also been an uptick in cases involving deepfakes of minors. Last year, for example, boys at a New Jersey high school created and shared sexually explicit AI-generated imagery of more than 30 teenage girls.
Deepfake porn, it can’t be stressed enough, isn’t meant to titillate – it’s often meant to humiliate. Being a woman with any sort of public platform means dealing with harassment online: deepfake porn is yet another tool that troll armies and misogynists are weaponizing to punish women who behave in ways they don’t like. It’s meant to shut women up, to drive them offline. When high-profile figures like Swift are targeted with deepfake porn, it sends a message to young girls: put your head above the parapet and you will be punished for it. It doesn’t matter how successful you are, how many billions you have in the bank, the world will still find a way to objectify and humiliate you.
Girls, by the way, are hearing this message loud and clear. A 2022 study in the UK found that girls aged nine to 18 ranked “being a leader” the lowest priority in a list of 17 attributes for future work. Why would they want to be leaders, why would they want to be in the public eye, when they see what sort of abuse those women receive?
Swift, for her part, hasn’t released an official statement about the fake nude images that have been circulating, but, according to the Daily Mail, she is furious about the pictures and is considering taking legal action. Swift’s devoted fans are also furious and they have been taking more immediate action. Swift fans mass-reported accounts sharing the deepfakes and also posted thousands of images tagged with phrases like “Taylor Swift AI” to make the deepfakes hard to find.
If there is a silver lining to this sordid situation it is this: angry Swifties are a force to be reckoned with. If anyone can get the government to take deepfake porn seriously, if anyone can get social media platforms to pay attention, it is Swift’s enormous and extremely motivated fan base. Let’s not forget the Great Ticketmaster Fiasco of 2022, shall we? After Ticketmaster made a mess of presale tickets for Swift’s Eras Tour, lawmakers in the US started making a lot of noise about anti-competitive practices in the music industry. Swift’s fans managed to make antitrust law a mainstream talking point. Now their collective anger may help spur changes to federal law when it comes to deepfake porn.
And that change can’t come soon enough. While some individual US states have passed laws aimed at curtailing the sharing of deepfake porn without consent, there is no federal law that criminalizes the creation or sharing of fake pornographic images. Victims have very little recourse. And as lawmakers drag their feet, use of this technology is growing rapidly. While the imagery may be fake, the harm it is causing is very, very real.