AI Incident Database

Incident 1030: Aspiring Artist Cherelle Kozak Reportedly Targeted by AI-Powered Impersonation of Rapper Fat Joe

Description: Cherelle Kozak, an aspiring artist in Austin, Texas, was reportedly targeted by a scammer who used AI-generated video and voice to impersonate rapper Fat Joe. The impersonator appeared on a video call, encouraged her to upload music for supposed radio play, and then demanded payment. Kozak did not comply. The scam closely resembled one that Fat Joe publicly warned about on January 5, 2025.
Editor Notes: Timeline notes: On January 5, 2025, Fat Joe posted a warning on social media alerting fans to AI-powered scams impersonating him. The report on the attempt targeting Cherelle Kozak was published on April 18, 2025.


Entities

Alleged: Unknown deepfake technology developer and Unknown voice cloning technology developer developed an AI system deployed by Unknown scammers impersonating Fat Joe, which harmed Cherelle Kozak, Fat Joe, Fans of Fat Joe, and General public.
Alleged implicated AI systems: Unknown deepfake technology apps, Unknown voice cloning technology, and FaceTime

Incident Stats

Incident ID: 1030
Report Count: 1
Incident Date: 2025-01-05
Editors: Daniel Atherton

Incident Reports

Reports Timeline

"It's Fat Joe," Austin-based aspiring artist targeted in AI-powered impersonation scam
cbsaustin.com · 2025

AUSTIN, Texas — In a world where connections are just a video call away, it’s easy to trust the face on your screen. Whether dating, networking, or just catching up, FaceTime and social media video calls help bridge the distance-- but what …

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity


Amazon Alexa Responding to Environmental Inputs
Dec 2015 · 35 reports

Fake LinkedIn Profiles Created Using GAN Photos
Feb 2022 · 4 reports

Alexa Recommended Dangerous TikTok Challenge to Ten-Year-Old Girl
Dec 2021 · 3 reports

