AI Incident Database

Report 4949

Associated Incidents

Incident 982 · 1 Report
Scammers Reportedly Using Deepfake Video Calls to Impersonate Executives in Singapore and Orchestrate Corporate Bank Transfers

Singapore authorities warn of rise in deepfake corporate video calls
finextra.com · 2025

The Singapore Police Force (SPF), the Monetary Authority of Singapore (MAS), and the Cyber Security Agency of Singapore (CSA) say victims receive unsolicited WhatsApp messages from scammers claiming to be executives of the companies the victims work for, inviting the employees to join a live-streamed Zoom video call.

"It is believed that digital manipulation had been used to alter the appearances of the scammers to impersonate these high-ranking executives," state the authorities. "In some cases, the video calls would also involve scammers impersonating MAS officials and/or potential 'investors'."

During the calls, victims would be instructed to transfer substantial amounts of funds from their company's corporate bank accounts to designated bank accounts under the pretext of business payments, such as project financing or investments. Some victims were also asked to disclose personal information such as NRIC and passport details.

Victims would subsequently realise that they had been scammed when the scammers became uncontactable, or upon verifying with the actual company's executives and legal counsel, who would confirm that they had neither participated in any video calls nor authorised any fund transfers.

Businesses are advised to establish protocols for employees to verify the authenticity of any video calls or messages, particularly those purportedly from senior executives or key stakeholders, and to check for tell-tale signs of AI-based manipulation of the audio or video.


2024 - AI Incident Database
