AI Incident Database

Report 3989

Associated Incidents

Incident 754 · 3 Reports
Female Politicians in the United Kingdom Reportedly Victimized by Purported Deepfake Pornography

British female politicians targeted by fake pornography
theguardian.com · 2024

British female politicians have become the victims of fake pornography, with some of their faces used in nude images created using artificial intelligence.

Political candidates targeted on one prominent fake pornography website include: the Labour deputy leader, Angela Rayner; the education secretary, Gillian Keegan; the Commons leader, Penny Mordaunt; the former home secretary, Priti Patel; and the Labour backbencher Stella Creasy, according to Channel 4 News.

Many of the images have been online for several years and attracted hundreds of thousands of views.

While some are crude Photoshops featuring the politician’s head imposed on to another person’s naked body, other images appear to be more complicated deepfakes that have been created using AI technology. Some of the politicians targeted have now contacted police.

Dehenna Davison, a Conservative MP until the recent dissolution of parliament, is one of those featured on the site. She told Channel 4 News it was “really strange” that people would target women like her and she found it “quite violating”.

She said that unless governments around the world put in place a proper regulatory framework for AI, there would be “major problems”.

Creasy told the broadcaster that she felt “sick” to learn about the images and that “none of this is about sexual pleasure, it’s all about power and control”.

Nonconsensual deepfake technology, which takes a photograph of an individual and uses artificial intelligence to remove their clothing or generate a fake nude image, has become a growing problem amid the wider AI boom.

Earlier this year the Guardian investigated ClothOff, an AI app that invites users to “undress anyone using AI”, which channelled its transactions through a company registered in London, and has caused chaos in some schools.

Thousands of female celebrities are already victims of fake pornography. The site featuring the female British politicians, which the Guardian has not named, features user-created content and claims to only host lawful content featuring adults.

Since the Online Safety Act was introduced in January, sharing such imagery without consent has been illegal in the UK. Yet sites hosting this material are easily accessible through mainstream search engines such as Google.

The creation of such material also remains legal in the UK. The government in April announced plans to close this loophole and ban the creation of deepfake pornography in England and Wales but the proposed law was dropped when Rishi Sunak decided to call an early election.

The Conservatives, Labour and the Liberal Democrats have pledged to bring it back if they win the next election, meaning it is likely that creation of the images will also be banned.

The UK’s stance on deepfake pornography is tougher than that of many other countries. This has already had an impact: some of the biggest sites have chosen to pre-emptively block British users rather than risk the potential legal ramifications.

The US representative Alexandria Ocasio-Cortez is pushing for similar laws in the US. She said encountering a deepfake of herself performing a sex act resurfaced past trauma and predicted that “people are going to kill themselves over this”.

