AI Incident Database

Report 2787

Related Incidents

Incident 492 · 7 Reports
Canadian Parents Tricked out of Thousands Using Their Son's AI Voice

Scammers are using AI voices to steal millions by impersonating loved ones
androidauthority.com · 2023
  • AI voice-generating software is allowing scammers to mimic the voice of loved ones.
  • These impersonations have led to people being scammed out of $11 million over the phone in 2022.
  • The elderly make up a majority of those who are targeted.

AI has been a central topic in the tech world for a while now, as Microsoft continues to infuse its products with ChatGPT and Google attempts to keep up by pushing out its own AI products. While AI has the potential to do some genuinely impressive stuff — like generating images based on a single line of text — we’re starting to see more of the downside of the barely regulated technology. The latest example of this is AI voice generators being used to scam people out of their money.

AI voice generation software has been making a lot of headlines as of late, mostly for stealing the voices of voice actors. Initially, all that was required was a few sentences for the software to convincingly reproduce the sound and tone of the speaker. The technology has since evolved to the point where just a few seconds of dialogue is enough to accurately mimic someone.

In a new report from The Washington Post, thousands of victims claim they have been duped by imposters pretending to be loved ones. Imposter scams have reportedly become the second most common type of fraud in America, with over 36,000 cases submitted in 2022. Of those, more than 5,000 victims were conned out of their money over the phone, totaling $11 million in losses, according to FTC officials.

One story that stood out involved an elderly couple who sent over $15,000 through a bitcoin terminal to a scammer after believing they had spoken to their son. The AI-generated voice had convinced the couple that their son was in legal trouble after killing a U.S. diplomat in a car accident.

As with the victims in that story, these attacks appear to mostly target the elderly, which comes as no surprise: the elderly are among the most vulnerable to financial scams. Unfortunately, the courts have not yet decided whether companies can be held liable for harm caused by AI voice generators or other forms of AI technology.

Read Source

2024 - AI Incident Database

  • 利用規約
  • プライバシーポリシー
  • Open twitterOpen githubOpen rssOpen facebookOpen linkedin
  • e1b50cd