Unknown deepfake technology
Incidents in which this system was implicated
Incident 1048 (18 Reports)
Tennessee Meteorologist's Likeness Reportedly Used in Sextortion Campaign Involving Purported AI-Generated Content
2025-01-10
Bree Smith, a meteorologist in Nashville, Tennessee, was reportedly targeted in a sextortion campaign involving purported AI-generated deepfakes that manipulated her likeness into explicit content. According to reporting, Smith's face was digitally placed onto semi-nude and nude bodies, with the resulting media circulated online by impersonators seeking money. Smith documented the spread of these accounts and has advocated for legislative responses, including a new Tennessee bill addressing deepfake-related harms.
Incident 147 (6 Reports)
Reported AI-Cloned Voice Used to Deceive Hong Kong Bank Manager in Purported $35 Million Fraud Scheme
2020-01-15
In January 2020, a Hong Kong-based bank manager for a Japanese company reportedly authorized $35 million in transfers after receiving a call from someone whose voice matched the company director's. According to Emirati investigators, scammers used AI-based voice cloning to impersonate the executive. The fraud allegedly involved at least 17 individuals and reportedly led to global fund transfers that triggered a UAE investigation. U.S. authorities were reportedly later asked to help trace part of the funds sent to U.S. banks.
Incident 1045 (2 Reports)
Mother in Louisville, Kentucky Describes Phone Scam Involving Purported AI-Generated Voice of Her Daughter
2025-04-29
Louisville, Kentucky mother Kim Alvey reportedly received a phone call in which an unknown individual used purported AI-generated voice cloning to impersonate her 10-year-old daughter, claiming the girl had been in an accident. The call escalated when a male voice threatened to kidnap the child. Alvey confirmed her daughter was safe at school and identified the call as a scam.
Incident 1053 (2 Reports)
Mumbai Businessman Reportedly Defrauded via Purported AI-Cloned Voice Impersonating Son
2024-03-30
A Mumbai businessman identified as KT Vinod reportedly lost Rs 80,000 after receiving a call from someone claiming to represent the Indian Embassy in Dubai, who said his son had been arrested. The caller allegedly used AI-generated voice cloning to simulate the voice of Vinod's son pleading for help. Reportedly convinced of the urgency, Vinod instructed a money transfer via Google Pay. The scam was discovered only after he contacted his son directly.