AI Incident Database
Entities

Government of Russia-aligned actors

Incidents involved as Deployer

Incident 544 (22 Reports)
Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election

2023-05-11

During Turkey's 2023 presidential election, reportedly manipulated and allegedly AI-generated videos, audio, and images were used to smear candidates, purportedly link opposition figures to terrorist groups, and circulate a purported sex tape that reportedly contributed to presidential candidate Muharrem İnce's withdrawal. These incidents reportedly misled voters, disrupted campaigning, and distorted the electoral playing field.


Incident 1134 (2 Reports)
Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

2025-06-30

In late June 2025, Russian Telegram channels reportedly circulated deepfake videos claiming that Deputy Prime Minister Olha Stefanishyna backed mandatory mobilization of up to one million Ukrainian women starting September 1. Officials reportedly debunked the claim, confirming no such plans or laws exist. The disinformation operation reportedly aimed to incite panic and destabilize Ukraine's domestic situation.


Incident 1133 (1 Report)
Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign

2025-06-30

In late June 2025, Russian Telegram channels reportedly circulated a video containing a purportedly AI-generated audio track impersonating Ukrainian commander Andriy Biletsky. The clip reportedly claimed that Ukrainian authorities deliberately avoid identifying fallen soldiers in order to withhold compensation. Verification reportedly showed the voice was synthetic and mismatched with the original May 16 footage of Biletsky, and Hive Moderation reportedly assessed the audio as overwhelmingly likely to be AI-generated.


Related Entities
Other entities involved in the same incidents. For example, if this entity is the developer in an incident and another entity is the deployer, the two are marked as related entities.


Supporters of Recep Tayyip Erdoğan

Incidents involved as Deployer
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election


Unknown deepfake technology developers

Incidents involved as Developer
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election

  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

Incidents implicated systems
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election


Unknown voice cloning technology developers

Incidents involved as Developer
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election

  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

Incidents implicated systems
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election


Unknown generative AI developers

Incidents involved as Developer
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election


Muharrem İnce

Incidents Harmed By
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election


Kemal Kilicdaroglu

Incidents Harmed By
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election


General public

Incidents Harmed By
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election


General public of Turkey

Incidents Harmed By
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election


Epistemic integrity

Incidents Harmed By
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election

  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women


Electoral integrity

Incidents Harmed By
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election


Democracy

Incidents Harmed By
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election


Truth

Incidents Harmed By
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election

  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women


National security and intelligence stakeholders

Incidents Harmed By
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election

  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women


Social media platforms

Incidents implicated systems
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election


Russian Telegram channels

Incidents involved as Deployer
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign


Russian disinformation channels

Incidents involved as Deployer
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign


Military of Ukraine

Incidents Harmed By
  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign


Government of Ukraine

Incidents Harmed By
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign


General public of Ukraine

Incidents Harmed By
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign


Families of military personnel in Ukraine

Incidents Harmed By
  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign


Andriy Biletsky

Incidents Harmed By
  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign


Unknown voice cloning technology

Incidents implicated systems
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign


Unknown deepfake technology

Incidents implicated systems
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign


Telegram

Incidents implicated systems
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign


Women of Ukraine

Incidents Harmed By
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women


Olha Stefanishyna

Incidents Harmed By
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women


Research

  • Defining an “AI Incident”
  • Defining an “AI Incident Response”
  • Database Roadmap
  • Related Work
  • Download Complete Database

Project and Community

  • About
  • Contact and Follow
  • Apps and Summaries
  • Editor’s Guide

Incidents

  • All Incidents in List Form
  • Flagged Incidents
  • Submission Queue
  • Classifications View
  • Taxonomies

2024 - AI Incident Database

  • Terms of use
  • Privacy Policy
  • Open twitterOpen githubOpen rssOpen facebookOpen linkedin
  • f5f2449