AI Incident Database

Report 5351

Associated Incidents

Incident 1111 · 1 Report
Reported AI-Generated Video Call Impersonation of Cryptocurrency Analyst Leads to Alleged Malware Installation and Account Theft

CZ Warns Video Verification ‘Out the Window’ After Deepfake Scam Hits Analyst
coinedition.com · 2025

Binance founder Changpeng Zhao has cautioned that AI-powered deepfake technology has made video call verification unreliable for security purposes. He also warned users to avoid installing software from unofficial sources, even when the request comes from a friend, since that friend's account may itself have been compromised.

CZ's warning came in response to a sophisticated hacking incident involving cryptocurrency analyst Mai Fujimoto. The analyst lost control of her X account after falling victim to a deepfake attack during a video call.

Zhao emphasized that friends requesting software installation are "most likely hacked," noting how cybercriminals exploit trusted relationships to distribute malware. The former Binance CEO's warning underscores the evolution of social engineering attacks, which now use advanced AI to create increasingly convincing impersonations.

Deepfake Attack Exploits Trusted Relationships

Mai Fujimoto explained that her main X account, @missbitcoin_mai, was hacked on June 14 through a carefully planned deepfake attack. The attack began when a friend's Telegram account was compromised, allowing the attackers to use it to invite her to a video meeting. Fujimoto accepted the Zoom call invitation in good faith, since the communication appeared to come from a known contact.

During the 10-minute video call, Fujimoto could see what appeared to be her acquaintance's face but could not hear any audio. The impersonator sent a link that purportedly would resolve the audio issue, along with step-by-step instructions for adjusting her settings. Fujimoto believes this was the point at which malware was installed on her computer, which subsequently led to the theft of her social media account.

Fujimoto Incident Shows the Advancement of AI Deepfakes

The technology was so convincing that Fujimoto stayed on the call for its entire length, believing she was talking to her real acquaintance. Only after losing access to her accounts did she understand the sophistication of the attack and how thoroughly the attackers had gained her trust.

Fujimoto acknowledged that continuing to use Zoom despite persistent audio issues should have raised red flags. However, she attributed the platform choice to her friend's engineering background, assuming a technical preference rather than recognizing it as a potential manipulation tactic.

The attack's success extended beyond the initial X compromise, with hackers gaining access to Fujimoto's Telegram and MetaMask accounts. Fujimoto has expressed concern that her own likeness could be used in future deepfake attacks, warning contacts to remain suspicious of any video calls featuring her face.

