This is an extract from a Politico newsletter. For the full report, please visit the website itself.
DEEPFAKES: WHEN THE PERSONAL IS POLITICAL
--- CARA HUNTER, A NORTHERN IRISH POLITICIAN, WAS ONLY WEEKS away from the country's 2022 legislative elections when she received a WhatsApp message from someone she didn't know. The man quickly asked her if she was the woman in an explicit video --- a 40-second clip that he shared with the then-24-year-old. Opening the video, Hunter was confronted with an AI-generated deepfake of herself performing graphic sexual acts. Within days, the false clip had gone viral, and the Northern Irishwoman was bombarded with increasingly sexual and violent direct messages from men around the world.
"It was a campaign to undermine me politically," Hunter, who won her 2022 Northern Ireland Assembly seat by just a few votes, told me. "They had felt that, because they saw an explicit of someone who looked like me, it was OK to send me nasty messages. It has left a tarnished perception of me that I can't control. I'll have to pay for the repercussions of this for the rest of my life."
Before we get into this, let's be very clear: Deepfake pornography, unfortunately, is not new. It's been around for almost a decade, and it almost entirely targets women. It regained public attention after Taylor Swift became the latest victim, when AI-generated graphic images of her were created, mostly via a 4chan message board, and then shared widely on X, formerly known as Twitter. I also don't want to mansplain what every woman reading this already knows. This is all about power. Power to demean women; power to control how women can participate in public life; power to silence voices that men (and it's almost entirely men) believe are not worthy. Don't take my word for it. There's been some great reporting on this, for years. (Here, here and here.) I could find no examples of male politicians targeted with such sexual abuse.
"These deepfake videos are telling women to get out of the public eye," Claire McGlynn, a leading expert on legal responses to online sexual abuse, told me. "This is men telling women, 'This is what we can do to you.' Many younger women are very aware that this is [a] new threat hovering over them." For as much as what happened to Swift caught global attention, deepfake porn --- fueled by the latest generation of AI technology that makes it too easy to create and share such non-consensual explicit content --- is everywhere. It's happening to teenage girls in Spain; it's happening to South Korean movie actors; it's happening, increasingly, to female politicians like Hunter to silence their participation in the democratic process.
For something so widespread, few, if any, jurisdictions have outlawed the creation and sharing of such image-based sexual abuse. U.S. states like California and New York have deepfake porn laws, but, so far, they haven't been used. In the wake of the Swift scandal, U.S. senators fell over themselves to propose legislation --- though it's unlikely to go anywhere, and it focuses on victims suing their attackers. The EU's Digital Services Act says almost nothing about deepfake porn, and while European lawmakers this week backed a new law to criminalize the sharing of such nonconsensual images, it will only apply from 2027. The United Kingdom's Online Safety Act has provisions targeting deepfake porn, but how they will be implemented is still very much up in the air.
So far, Australia has gone the furthest via its Online Safety Act, which gives the country's eSafety commissioner powers to compel people who share deepfake porn to remove the image-based sexual abuse. It doesn't, however, include criminal sanctions. Late last year, the agency used that authority to force an Australian living in the Philippines to take down deepfake porn --- but only after that individual returned to Australia and was arrested by local police on contempt of court charges.

"We're seeing a lot more deepfakes," Australian eSafety Commissioner Julie Inman Grant, who has also been targeted with such graphic explicit content, told me. "It's very easy for a young person to use Midjourney or Stable Diffusion --- very powerful consumer-facing apps --- to create deepfakes of their classmates and use them to bully them."
For Hunter, the Northern Irish politician, this is where the personal becomes political. After she was targeted with the deepfake porn attack, she had to explain to her parents what was happening --- and that the AI-generated video was not of her. At public events, and even while having a drink at the pub, people made humiliating insinuations. She had to weigh --- given that the attack came just weeks before an election --- whether to issue a public rebuttal or wait until all the votes had been counted. She felt she was attacked not because of her political views, but because she was a young woman. "It's grotesque to see an emerging technology used to abuse women," she said. "Years of building trust in my community and in one fell swoop, it meant sweet FA."