Incident 537: Mother in Arizona Received Fake Ransom Call Featuring AI Voice of Her Daughter

Description: A mother in Arizona received a ransom call from an anonymous scammer who allegedly used AI voice synthesis to clone her daughter's voice; the call was confirmed to be a hoax once her daughter's safety was verified.


Alleged: unknown developed an AI system deployed by scammers, which harmed Jennifer DeStefano and the DeStefano family.

‘I’ve got your daughter’: Mom warns of terrifying AI voice cloning scam that faked kidnapping · 2023

SCOTTSDALE, Ariz. (KPHO/Gray News) – A mother in Arizona is warning others about a terrifying phone scam involving artificial intelligence that can clone a loved one’s voice.

Jennifer DeStefano said she got a call from an unfamiliar phone n…

'Mom, these bad men have me': She believes scammers cloned her daughter's voice in a fake kidnapping · 2023

(CNN) — Jennifer DeStefano’s phone rang one afternoon as she climbed out of her car outside the dance studio where her younger daughter Aubrey had a rehearsal. The caller showed up as unknown, and she briefly contemplated not picking up.



A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
