Unknown fraudsters
Incidents involved as Deployer
Incident 1122 · 19 Reports
Reportedly Sustained Multi-Celebrity Deepfake Persona Scam Targeting Vulnerable Southampton Resident
2025-06-28
Over roughly five months in 2025, Paul Davis, a Southampton, UK man, reports that he was repeatedly targeted by scammers using purported deepfake videos and images of celebrities including Jennifer Aniston, Mark Zuckerberg, Elon Musk, and Ellie Goulding. The perpetrators allegedly rotated personas to sustain a romance and prize scam, extracting £200 in gift cards. The case suggests a shift from one-off celebrity deepfakes to persistent, multi-persona targeting of a single vulnerable victim.
Incident 147 · 6 Reports
Reported AI-Cloned Voice Used to Deceive Hong Kong Bank Manager in Purported $35 Million Fraud Scheme
2020-01-15
In January 2020, a Hong Kong-based bank manager for a Japanese company reportedly authorized $35 million in transfers after receiving a call from someone whose voice matched the company director's. According to Emirati investigators, scammers used AI-based voice cloning to impersonate the executive. The fraud allegedly involved at least 17 individuals and reportedly led to global fund transfers that triggered a UAE investigation. U.S. authorities were reportedly later asked to help trace part of the funds sent to U.S. banks.
Incident 1126 · 4 Reports
Reported Use of Deepfake Video Impersonating Owen Wilson in Romance Scam with Fake Job Payments
2025-05-16
A Reddit user reported in May 2025 that her mother was being groomed by scammers using an AI-generated deepfake video impersonating actor Owen Wilson. The scam reportedly began on a game app (Yahtzee with Friends) and shifted to WhatsApp voice calls, with the victim promised a fake Warner Bros job paying small sums via Cash App. The scammer allegedly claimed he would buy a house for the victim's family as part of the deception. The video is cited as the main "proof" of authenticity.
Incident 1053 · 2 Reports
Mumbai Businessman Reportedly Defrauded via Purported AI-Cloned Voice Impersonating Son
2024-03-30
A Mumbai businessman identified as KT Vinod reportedly lost Rs 80,000 after receiving a call from someone claiming to represent the Indian Embassy in Dubai, who said his son had been arrested. The caller allegedly used AI-generated voice cloning to simulate the son's voice pleading for help. Convinced of the urgency, Vinod reportedly instructed a money transfer via Google Pay. The scam was discovered only after he contacted his son directly.
Related Entities
Entities related to the same incident. For example, if an incident's developer is this entity but its deployer is another entity, the other entity is marked as a related entity.
Unknown voice cloning technology developer
Incidents involved as Developer
- Incident 1122 · 19 Reports
Reportedly Sustained Multi-Celebrity Deepfake Persona Scam Targeting Vulnerable Southampton Resident
- Incident 147 · 6 Reports
Reported AI-Cloned Voice Used to Deceive Hong Kong Bank Manager in Purported $35 Million Fraud Scheme
Unknown deepfake technology developer
Incidents involved as Developer
- Incident 1122 · 19 Reports
Reportedly Sustained Multi-Celebrity Deepfake Persona Scam Targeting Vulnerable Southampton Resident
- Incident 147 · 6 Reports
Reported AI-Cloned Voice Used to Deceive Hong Kong Bank Manager in Purported $35 Million Fraud Scheme
X (Twitter)
Incidents implicated systems
- Incident 1056 · 1 Report
Purported AI-Generated Videos Impersonating President of Malta Myriam Spiteri Debono Circulate on Social Media in Alleged Crypto Scam Campaigns
- Incident 1111 · 1 Report
Reported AI-Generated Video Call Impersonation of Cryptocurrency Analyst Leads to Alleged Malware Installation and Account Theft