
Report 2847

Related Incidents

Incident 492 · 7 Reports
Canadian Parents Tricked out of Thousands Using Their Son's AI Voice

Scammers Use Voice Cloning AI to Trick Grandma Into Thinking Grandkid Is in Jail
futurism.com · 2023

Bail Out

Ruthless scammers are always looking for the next big con, and they might've found it: using AI to imitate your loved ones over the phone.

When 73-year-old Ruth Card heard what she thought was the voice of her grandson Brandon on the other end of the line saying he needed money for bail, she and her husband rushed to the bank.

"It was definitely this feeling of... fear," Card told The Washington Post. "That we've got to help him right now."

The couple withdrew the maximum of 3,000 Canadian dollars at one bank and went to another for more. Fortunately, a vigilant bank manager flagged them down and warned them that another customer had gotten a similar phone call that sounded like it was from a loved one — but it turned out the voice had been faked.

"We were sucked in," Card said. "We were convinced that we were talking to Brandon."

Legal Trouble

Not everyone was as lucky. Benjamin Perkin, 39, told WaPo how his elderly parents were swindled out of thousands of dollars with the help of an AI impersonator.

Perkin's parents had received a phone call from a supposed lawyer, who claimed that their son had killed a US diplomat in a car crash and needed money for legal fees. The apparent lawyer then put "Perkin" on the line, and the voice sounded just like him.

That convinced them. When the lawyer later called back asking for 21,000 Canadian dollars, Perkin's parents went to the bank and sent the money via Bitcoin.

"The money's gone," Perkin told the paper. "There's no insurance. There's no getting it back. It's gone."

Easy Pickings

Voice cloning scams have been a threat for several years now. But as powerful, easy-to-use AI tools become ubiquitous, the technology's potential for abuse is outpacing the public's ability to keep up with bad actors' tricks; many people don't realize they could already be targets.

"Two years ago, even a year ago, you needed a lot of audio to clone a person's voice," Hany Farid, a professor of forensics at UC Berkeley, told WaPo. "Now... if you have a Facebook page... or if you've recorded a TikTok and your voice is in there for 30 seconds, people can clone your voice."

Take ElevenLabs, whose AI voice synthesis service costs as little as $5 per month and can produce results so convincing that a journalist used it to break into his own bank account. It has even spawned an entire genre of memes impersonating President Joe Biden. ElevenLabs' voice cloning has only been around since 2022. Imagine the damage it, and competitors looking to ride the coattails of its success, could do in just a few more years.
