An SDLP MLA who has experienced online harassment is advocating for a comprehensive image-based abuse law to protect women and girls.
Nearly three years ago, Cara Hunter was the victim of sexual harassment which she believes was used as a "chauvinistic digital weapon" aimed at damaging her reputation and derailing her electoral ambitions.
A pornographic video clip was shared thousands of times on WhatsApp, along with false claims that she appeared in it. As a result, she received numerous unsolicited messages from men on social media.
Because of a legislative gap and the encrypted nature of WhatsApp, Ms Hunter (29) has never found out who was responsible for what she described as a "traceless crime".
The video spread uncontrollably, Ms Hunter said, and every time she went to the shop or pub, people would be "looking and laughing".
The abuse took a mental toll and still affects her to this day.
"Sadly, these things live forever and the rumours that come with them live forever as well," she explained.
"Thankfully, most people know that it isn't me, which is really comforting.
"But of course there are those awkward moments still to this day where somebody will mention it and think it's a leak.
"It's not empowering at all to feel that these falsehoods and fake videos linger on, even though this happened almost three years ago now."
She added: "This was a political attack out of left field, I could never have foreseen this coming.
"It was extremely difficult, it still lingers on with people thinking it's you and that embodies how damaging deepfakes are. It is taking an online lie and it's actually impacting your real life and can cause you harm.
"I was stopped on the street and asked for a sexual favour by a man. That alludes to digital harms impacting you in real life and can impact your safety."
Deepfake porn, in which someone's likeness is superimposed onto sexually explicit images using artificial intelligence, emerged around 2017 and is a growing problem targeted almost exclusively at women.
While UK laws criminalise sharing deepfake porn without consent, they do not cover its creation. The mere possibility that such material can be created instils fear and a sense of threat into women's lives.
Ms Hunter has also been harassed by an individual who sent deepfake images of her in lingerie via Instagram, which she described as "terrifying".
Manipulated images are becoming a widespread issue, with very little protection for victims.
Professor Clare McGlynn, an expert on violence against women and girls at Durham University, is working alongside Ms Hunter and other victims to drive policy reform.
On the severity of this abuse, she said: "There are examples of young women who have taken their lives after deepfake abuse. Many others experience suicidal thoughts.
"It can be life-ending and life-threatening. It's a real violation of women and girls' autonomy and sense of self."
In Ms Hunter's case, she said the PSNI were "earnest", but they did not have the cyber-crime technology to trace those responsible.
Because she did not actually appear in the video, the case was not classed as "revenge porn", and there is no helpline for victims. She said studies in the US have examined how this abuse is pervading American politics.
In one instance, a doctored video of former House Speaker Nancy Pelosi, which made it look like she was slurring her words, was viewed millions of times.
That low-tech example demonstrated how easy it is for anyone, not just hostile state actors, to interfere with elections for nefarious reasons.
Images of high-profile figures in British politics have appeared on one prominent fake pornography website, including Labour deputy leader Angela Rayner and former leader of the House of Commons Penny Mordaunt.
Ms Hunter would like to see the Electoral Commission convey this message to female candidates in NI.
She said: "We want to see women in politics. Are we adequately preparing them? Are we letting them know the kind of falsehoods that can be made and who they can turn to?
"The Electoral Commission has been receptive, which is really positive, but time is of the essence."
Along with Professor McGlynn and fellow campaigners, she hopes to meet Science, Innovation and Technology Secretary Peter Kyle this month to demand action on image-based abuse through deepfakes.
Ms Hunter said legislation needs exact language to ensure there are no loopholes, and if people producing deepfakes of women know they could face two years in prison, that would be a "strong deterrent".
Professor McGlynn said: "We need a comprehensive law covering all forms of image-based sexual abuse.
"Campaigners are calling for an image-based abuse law which provides comprehensive civil and criminal laws, effective regulation of platforms, support for survivors and education."
On the social media platforms themselves, X owner Elon Musk has been spreading disinformation of late.
Meta boss Mark Zuckerberg has also vowed to remove fact-checkers, creating what Ms Hunter described as an "unregulated wild west" online.
Professor McGlynn added: "Regulation and content moderation is vital and Meta's capitulation to Donald Trump is deeply concerning. It is women's freedom of speech that is being restricted by online abuse and sexually explicit deepfakes.
"This abuse is trying to silence women and curb their free speech."
Next Tuesday, Ms Hunter will urge the Justice Minister to act with haste, as any changes at Westminster will have to be implemented "slightly differently" here. She would also like to see preventative action that tackles the problem at its root, through education focused on respect for women.
Social media companies, she believes, need enhanced support centres for victims and a commitment to watermark deepfakes so it is clear they are AI-generated.
Ms Hunter concluded: "People need to feel safe online. When I reached out to Meta, they were non-responsive.
"Social media giants have a moral and professional duty to ensure they're as safe as possible. Yes, there's an importance there in terms of freedom of speech and from the American lens, they'll say, 'I don't agree with what you're saying, but I'll fight to the death to protect your right to say it'.
"But people shouldn't be allowed to create intimidation, threat or harm online. It's important to have fact-checkers because misinformation and disinformation is dangerous. It can be a danger to democracy."
The Department for Science, Innovation and Technology said: "Sharing intimate images online without consent is a vile trend which can inflict profound and lasting harm on victims, particularly women and girls. Under the Online Safety Act, it is a criminal offence to share or threaten to share intimate images, including deepfakes, without consent.
"Last year we strengthened laws to provide greater protections for adults, particularly women. These measures require platforms to proactively tackle deepfake intimate image abuse and prevent it from appearing altogether."