AI Incident Database

Report 4458

Associated Incidents

Incident 891 · 7 Reports
AI Voice Scam Targets Westchester Parents with Fake Kidnapping Ransom Calls

AI being used to mimic children’s voices in ‘virtual kidnapping extortion’ scams
yahoo.com · 2025

The Peekskill Central School District in New York is warning families of a disturbing new scam in which criminals use generative artificial intelligence to mimic children's voices in an attempt to extort money from unsuspecting parents.

In a message sent to the community this week, Superintendent David Mauricio revealed that two families had recently received calls from strangers claiming to have kidnapped a loved one and demanding a ransom. The realistic nature of these calls made the threats particularly alarming.

While fake kidnapping calls aimed at exploiting worried parents are a familiar tactic for some criminals, advancements in technology are making these scams far more convincing and harder to detect.

In this new, high-tech version of the scam, referred to as a "Virtual Kidnapping Extortion Call," criminals use AI-powered tools to replicate a child's voice, lending a chilling sense of credibility to their demands, Mauricio wrote in the message, according to Patch Peekskill-Cortlandt.

Just last month, the Federal Bureau of Investigation issued a warning that criminals are increasingly leveraging generative AI to enhance "the believability of their schemes."

Federal investigators describe this method, known as "vocal cloning" or "AI-generated audio," as a sophisticated tactic in which realistic-sounding audio clips of a loved one in distress are created and used to coerce victims into paying ransoms.

To avoid becoming a victim of this extortion scheme, Mauricio urged parents to check their privacy settings on social media accounts, review any information published online and refer to tips offered by the National Institutes of Health.

"Also," the superintendent urged, "check what platforms your child is using and what information they are providing."

