AI Incident Database

Report 6209

Associated Incidents

Incident 1206 · 2 Reports
Purported AI-Generated Deepfake of Spiritual Leader Sadhguru Used in Investment Scam Allegedly Defrauding Bengaluru Woman of ₹3.75 Crore (~$425,000)

Bengaluru woman defrauded out of Rs 3.75 crore with Sadhguru’s deepfake video: police
indianexpress.com · 2025

A 57-year-old retired woman in Bengaluru has lost Rs 3.75 crore to scammers who used an AI-generated deepfake video of spiritual leader Sadhguru Jaggi Vasudev to promote fake investment opportunities, the police said on Thursday.

The woman, a resident of CV Raman Nagar, was entirely unfamiliar with deepfake technology when she encountered what appeared to be a genuine video of Sadhguru on social media between February 25 and April 23.

In her complaint filed at the East CEN police station, the woman said: "I watched a video of Sadhguru stating that he had been trading with the firm, for which a link is provided below, and if you click it and input your name, email, phone number for an amount of $250, your finances will improve greatly."

After she followed the instructions, the woman was contacted by a person calling himself Waleed B, who claimed to represent a company called Mirrox. The fraudster operated through multiple UK-based phone numbers and added the woman to a WhatsApp group with approximately 100 members. She was then directed to various websites and instructed to download the Mirrox stock trading app.

Manipulative strategies

Waleed conducted trading tutorials via Zoom, later introducing another accomplice, Michael C, as a substitute instructor. The scammers employed psychological tactics, with group members regularly sharing fabricated profits and screenshots of supposed account credits to build trust and legitimacy, according to the FIR.

Convinced by these manipulative strategies, the woman began transferring money to bank accounts provided by the fraudsters. By April 23, she had transferred the entire Rs 3.75 crore across multiple transactions, with the fake platform displaying impressive returns on her investments.

The woman realised that she had been cheated only when she attempted to withdraw her profits. The scammers demanded additional payments for processing fees and taxes, raising her suspicions. When she refused to pay these extra charges, the fraudsters ceased all communication.

The woman filed her complaint on Tuesday, nearly five months after the fraud concluded; a police officer said the recovery of the lost money would therefore be challenging. The officer added that the authorities were coordinating with banks to freeze the fraudsters' accounts.

In June this year, Sadhguru Jaggi Vasudev and his Isha Foundation approached the Delhi High Court over the misuse of his identity through AI-generated deepfakes.

What is a deepfake?

'Deepfake' is a portmanteau of 'deep learning' and 'fake'. It refers to artificial-intelligence software that overlays a digital composite onto an existing video or audio file. Deepfakes are generated with machine learning models that use neural networks to manipulate images and video.

In January 2024, a deepfake video of actor Rashmika Mandanna went viral, and police arrested Eemani Naveen, an engineer from Andhra Pradesh who had created the video to increase his Instagram following. Journalist Rajdeep Sardesai and Infosys Foundation chairperson Sudha Murty, wife of Narayana Murthy, are among the other prominent personalities whose deepfake videos have been used by cybercriminals.

