Incident 492: Canadian Parents Tricked out of Thousands Using Their Son's AI Voice

Description: Two Canadian residents were scammed by an anonymous caller who used AI voice synthesis to replicate their son's voice, posing as his lawyer and claiming the son needed money for legal fees.

Alleged: unknown developed and deployed an AI system, which harmed Benjamin Perkin's parents and the Perkin family.

Incident Stats

Incident ID: 492
Report Count: 7
Incident Date: 2023-01-11
Editors: Khoa Lam
TikTok: @benno56
tiktok.com · 2023
They thought loved ones were calling for help. It was an AI scam.
washingtonpost.com · 2023

The man calling Ruth Card sounded just like her grandson Brandon. So when he said he was in jail, with no wallet or cellphone, and needed cash for bail, Card scrambled to do whatever she could to help.

"It was definitely this feeling of ...…

Scammers are using voice-cloning A.I. tools to sound like victims’ relatives in desperate need of financial help. It’s working.
fortune.com · 2023

You may very well get a call in the near future from a relative in dire need of help, asking you to send them money quickly. And you might be convinced it’s them because, well, you know their voice. 

Artificial intelligence changes that. Ne…

A couple in Canada were reportedly scammed out of $21,000 after getting a call from an AI-generated voice pretending to be their son
businessinsider.com · 2023

A couple in Canada were reportedly scammed out of $21,000 after they received a call from someone claiming to be a lawyer who said their son was in jail for killing a diplomat in a car accident.

Benjamin Perkin told The Washington Post the …

Scammers are using AI voices to steal millions by impersonating loved ones
androidauthority.com · 2023
  • AI voice-generating software is allowing scammers to mimic the voice of loved ones.
  • These impersonations have led to people being scammed out of $11 million over the phone in 2022.
  • The elderly make up a majority of those who are targeted.
Losing thousands of dollars because AI fakes the voice of a loved one
vnexpress.net · 2023

Benjamin Perkin's parents, in Canada, received a call from what sounded like their son, actually an AI-generated fake, saying he was being held in custody and urgently needed $15,000.

The nightmare for Perkin, 39, and his family began when a man claiming to be a lawyer ca…

Scammers Use Voice Cloning AI to Trick Grandma Into Thinking Grandkid Is in Jail
futurism.com · 2023

Bail Out

Ruthless scammers are always looking for the next big con, and they might've found it: using AI to imitate your loved ones over the phone.

When 73-year-old Ruth Card heard what she thought was the voice of her grandson Brandon on…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.