AI Incident Database

Report 4315

Associated Incidents

Incident 8514 Report
Salt Lake City Police Chief Mike Brown's Voice and Image Misused in AI-Generated Scam

Don’t fall for AI scams cloning cops’ voices, police warn
arstechnica.com · 2024

AI is giving scammers a more convincing way to impersonate police, reports show.

Just last week, the Salt Lake City Police Department (SLCPD) warned of an email scam using AI to convincingly clone the voice of Police Chief Mike Brown.

A citizen tipped off cops after receiving a suspicious email that included a video in which the police chief appeared to claim that the recipient "owed the federal government nearly $100,000."

To dupe their targets, the scammers cut together real footage from one of Brown's prior TV interviews with AI-generated audio that SLCPD said "is clear and closely impersonates the voice of Chief Brown, which could lead community members to believe the message was legitimate."

The FBI has warned for years about scammers attempting extortion by impersonating cops or government officials. But as AI voice-cloning technology advances, these scams could become much harder to detect; even forward-thinking companies like OpenAI have been hesitant to release their latest voice tech, citing obvious concerns about potential abuse.

SLCPD noted that there were clues in the email impersonating their police chief that a tech-savvy citizen could have picked up on. A more careful listen reveals "the message had unnatural speech patterns, odd emphasis on certain words, and an inconsistent tone," as well as "detectable acoustic edits from one sentence to the next." And perhaps most glaringly, the scam email came from "a Google account and had the Salt Lake City Police Department's name in it followed by a numeric number," instead of from the police department's official email domain of "slc.gov."
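One of the clues SLCPD cited was mechanical enough to automate: the scam email came from a generic Google account rather than the department's official "slc.gov" domain. As a minimal sketch (not anything from the article or from SLCPD — the function name and domain list are illustrative assumptions), a sender-domain check like the following would flag that mismatch:

```python
def sender_domain_is_official(sender: str, official_domains: set) -> bool:
    """Return True only if the sender address ends in an official domain.

    This is a simple illustrative heuristic, not a complete anti-spoofing
    defense: display names and 'From' headers can also be forged, so real
    mail systems pair checks like this with SPF/DKIM/DMARC validation.
    """
    _, _, domain = sender.rpartition("@")
    return domain.lower().strip() in official_domains


# "slc.gov" is the official Salt Lake City domain named in the article;
# the example addresses below are made up for illustration.
OFFICIAL = {"slc.gov"}

print(sender_domain_is_official("chief.brown@slc.gov", OFFICIAL))   # True
print(sender_domain_is_official("slcpd12345@gmail.com", OFFICIAL))  # False
```

The point of the heuristic matches SLCPD's advice: a message claiming to come from a government agency but sent from a consumer email provider is an immediate red flag, regardless of how convincing the audio or video attached to it is.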

SLCPD isn't the only police department dealing with AI cop impersonators. Tulsa had a similar problem this summer when scammers started calling residents using a convincing fake voice designed to sound like Tulsa police officer Eric Spradlin, Public Radio Tulsa reported. A software developer who received the call, Myles David, said he understood the AI risks today but that even he was "caught off guard" and had to call police to verify the call wasn't real.

"It sounded like a very, very casual conversation, as if this person was actually expecting to have spoken with me," David said. "He used my first name, my last name. They spoke very casually. That's why it sounded so real and ultimately why it was so scary."

Unlike the obviously fake email address in the Salt Lake City case, the Tulsa calls offered fewer giveaways: Tyler Moore, a cybersecurity professor targeted by the scam, told Public Radio Tulsa that it's relatively easy to make a scam phone call appear to come from a Tulsa police line.

Cops advise calling the cops on fake cops

This emerging AI abuse isn't limited to the US. A post on X in March from a user named Kaveri attracted more than 800,000 views, warning of a scammer posing as a cop in India and threatening to take away her young daughter if she didn't do as she was told.

That scammer cloned her daughter's voice, part of another concerning trend flagged last year, where bad actors scammed people out of thousands of dollars by using AI to impersonate loved ones in distress.

Kaveri ultimately caught the scam by asking specific questions, which made it clear that her daughter was not actually with the supposed cop. But according to The Indian Express, others targeted in similar scams were fooled and ended up losing money.

Cops globally are hoping to raise awareness of the increasing risk of AI voice scams to deter harms, advising citizens to verify calls from police and ask specific questions that a scammer may not be able to answer.

