AI Incident Database

Report 5463

Associated Incidents

Incident 112222 Report
Reportedly Sustained Multi-Celebrity Deepfake Persona Scam Targeting Vulnerable Southampton Resident

Scammed by a Jennifer Aniston deepfake: AI videos and fake messages to steal money and identity
decripto.org · 2025

A British man has lost hundreds of pounds after being duped by a sophisticated online scam that used AI-generated images and videos to impersonate the American actress Jennifer Aniston. The case has reignited alarm over the growing use of deepfakes in digital romance scams.

A virtual relationship lasting five months

The victim, a 43-year-old man living in Southampton, England, reported that he had been in an online relationship for about five months with a figure he believed to be Jennifer Aniston. The contact began via social media with seemingly harmless but well-constructed messages, which over time turned into an emotionally engaging exchange.

The scammer provided numerous pieces of 'evidence' to reinforce the illusion: photos of the alleged celebrity, voice notes with a voice identical to Aniston's, a forged copy of her driver's licence and, above all, a video in which the fake Aniston greeted him by name. The video, generated with deepfake technology, showed credible movements and expressiveness consistent with real footage of the actress available online.

The victim said the messages were full of affection, in keeping with the conversations they had had: 'She told me that she loved me. That I was special. I wanted to believe it.'

The call for help: Apple gift cards

The turning point of the scam came with a request for money. The scammer, in the guise of Jennifer Aniston, explained that she was having difficulty paying for Apple service subscriptions because of problems with her US bank account. She asked the man to buy Apple gift cards to help her, promising a quick refund and citing temporary problems with her accounting team. The victim, by now emotionally involved, gave in. He purchased gift cards worth about £200, sending the codes via direct message. Only when the requests became more insistent did he begin to doubt. By then it was too late.

The moment of realisation

Only after contacting some friends and doing more in-depth research did the man realise that he had been the victim of a deception. The videos had been created with artificial intelligence technology capable of replicating a celebrity's expressions, voice and behaviour in a strikingly realistic way. 'I feel humiliated. I'm not stupid, but I was alone. That voice, those messages... they sounded real. I wanted to believe it, if only for a moment.'

An expanding phenomenon

The case is not isolated. Digital fraudsters are increasingly exploiting advances in voice and image generation software to create convincing synthetic identities. Deepfakes are no longer limited to simple parody videos; they are being systematically used in romance scams, financial fraud and manipulation on a global scale. According to a 2024 report by the cybersecurity company SlowMist, the use of generative AI in so-called 'romance scams' increased by 215% in the past year. Victims are often selected from among people who are lonely, vulnerable or looking for genuine connection, exploiting the human desire for love and belonging.

Celebrities as digital weapons

Jennifer Aniston is not the only celebrity whose likeness has been exploited in these scams. The images and voices of Brad Pitt, Owen Wilson, Scarlett Johansson and Elon Musk have also been used in fraudulent operations. In some cases, fraudsters have even faked live TV broadcasts or interviews, digitally fabricated to boost credibility. The goal is always the same: to build trust, generate engagement, and induce the victim to hand over money, sensitive data or access to bank accounts.

No real protection

At present, social platforms are often slow or ineffective at blocking such content, and legislation struggles to keep pace with the technology. User reports are handled piecemeal, and there are no effective preventive tools, especially for the average user, to distinguish a real video from an AI-generated one. Tracing fraudsters is further complicated by the use of VPNs, fake accounts and foreign servers. In the case of the British man, it was not possible to trace the real identity of the person behind the fake Aniston.

The voice of the victims

Many victims, like the man in Southampton, avoid speaking out publicly out of shame or fear of ridicule. But it is precisely their silence that makes it easier for these schemes to be repeated. In an interview with The Sun, the victim explained that he wanted to tell his story to prevent others from falling into the same trap: 'It's not just about the money. They take something away from you inside. They deceive you, they use you, and then they disappear. It's devastating.'

A new frontier of online crime

The story of the fake love affair with Jennifer Aniston is only the latest example of how the misuse of artificial intelligence is profoundly changing the cybercrime landscape. Law enforcement, governments and digital platforms will need to adapt quickly to confront this threat. In the meantime, the advice is simple: never blindly trust what you see or hear online, even if it has the voice, face and smile of a Hollywood star.
