AI Incident Database

Report 5420

Associated Incidents

Incident 111841 Report
Ongoing Purported AI-Assisted Identity Fraud Enables Unauthorized Access to Western Companies by North Korean IT Workers

AI chatbots, deepfake software, and fake ID generators help automate crypto scams: Report
thehindu.com · 2025

AI tools such as chatbots, deepfake software, and fake ID generators have helped criminals automate and expand the process of carrying out crypto scams and other related crimes, according to 'The state of crypto scams 2025' report by blockchain analytics platform Elliptic.

Existing data shows that a significant portion of fraud losses involve crypto: $9.3 billion of the $16.6 billion in U.S. fraud losses last year were crypto-based, according to FBI figures.
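To put the FBI figures in perspective, a quick back-of-the-envelope calculation (a sketch based on the numbers above, not a computation from the report itself) shows crypto accounting for roughly 56% of those losses:

```python
# Share of U.S. fraud losses attributable to crypto, using the FBI
# figures cited above ($9.3B crypto-based out of $16.6B total).
crypto_losses_usd_bn = 9.3   # crypto-based fraud losses (FBI, per the report)
total_losses_usd_bn = 16.6   # total U.S. fraud losses

crypto_share = crypto_losses_usd_bn / total_losses_usd_bn
print(f"Crypto share of U.S. fraud losses: {crypto_share:.1%}")  # ~56.0%
```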

"The last year saw a rise in sextortion, pig butchering, memecoin-based rug-pulls and deepfake incentive scams. Artificial intelligence (AI) tools, including chatbots, deepfake software and fake ID generators, are enabling the automation and scaling up of many of these scam types," noted the report by Elliptic.

Tools such as AI chatbots and deepfake generators make it easier to fool victims and lure them into scams, even across language barriers. Deepfake videos featuring celebrity endorsements, meanwhile, can trick victims into handing over their funds or credentials. For example, Elliptic's report noted that North Korean threat actors have used deepfakes to impersonate crypto executives, and have used video calls to distribute malware.

"Indicators throughout 2024 have suggested that scams are becoming the most lucrative form of illicit activity in the crypto space. Dedicated illicit online marketplaces that sell goods and services to organized fraud rings, exposed by Elliptic, have processed over $30 billion in crypto -- well above the volumes flowing through traditional drugs-focused dark web markets," noted the report.

