AI Incident Database
Entities

Government of Russia-aligned actors

Incidents involved as Deployer

Incident 544 · 22 Reports
Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election

2023-05-11

During Turkey's 2023 presidential election, reportedly manipulated and allegedly AI-generated videos, audio, and images were used to smear candidates, purportedly link opposition figures to terrorist groups, and circulate a purported sex tape that reportedly contributed to presidential candidate Muharrem İnce’s withdrawal. These incidents reportedly misled voters, disrupted campaigning, and altered the electoral field.


Incident 1134 · 2 Reports
Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

2025-06-30

In late June 2025, Russian Telegram channels reportedly circulated deepfake videos claiming that Deputy Prime Minister Olha Stefanishyna backed mandatory mobilization of up to one million Ukrainian women starting September 1. Officials reportedly debunked the claim, confirming no such plans or laws exist. The disinformation operation reportedly aimed to incite panic and destabilize Ukraine's domestic situation.


Incident 1133 · 1 Report
Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign

2025-06-30

In late June 2025, Russian Telegram channels reportedly circulated a video containing a purportedly AI-generated audio track impersonating Ukrainian commander Andriy Biletsky. The audio clip reportedly claimed that Ukrainian authorities deliberately avoid identifying fallen soldiers in order to withhold compensation. Verification reportedly showed the voice was synthetic and mismatched with the original May 16 footage of Biletsky. Hive Moderation reportedly assessed the audio as overwhelmingly likely to be AI-generated.


Related entities
Other entities that are related to the same incident. For example, if the developer of an incident is this entity but the deployer is another entity, they are marked as related entities.

Entity

Supporters of Recep Tayyip Erdoğan

Incidents involved as Deployer
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
Entity

Unknown deepfake technology developers

Incidents involved as Developer
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election

  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

Incidents implicated systems
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
Entity

Unknown voice cloning technology developers

Incidents involved as Developer
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election

  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

Incidents implicated systems
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
Entity

Unknown generative AI developers

Incidents involved as Developer
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
Entity

Muharrem İnce

Affected by incidents
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
Entity

Kemal Kilicdaroglu

Affected by incidents
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
Entity

General public

Affected by incidents
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
Entity

General public of Turkey

Affected by incidents
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
Entity

Epistemic integrity

Affected by incidents
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election

  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women
Entity

Electoral integrity

Affected by incidents
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
Entity

Democracy

Affected by incidents
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
Entity

Truth

Affected by incidents
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election

  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women
Entity

National security and intelligence stakeholders

Affected by incidents
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election

  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women
Entity

Social media platforms

Incidents implicated systems
  • Incident 544
    22 Reports

    Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
Entity

Russian Telegram channels

Incidents involved as Deployer
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign
Entity

Russian disinformation channels

Incidents involved as Deployer
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign
Entity

Military of Ukraine

Affected by incidents
  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign
Entity

Government of Ukraine

Affected by incidents
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign
Entity

General public of Ukraine

Affected by incidents
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign
Entity

Families of military personnel in Ukraine

Affected by incidents
  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign
Entity

Andriy Biletsky

Affected by incidents
  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign
Entity

Unknown voice cloning technology

Incidents implicated systems
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign
Entity

Unknown deepfake technology

Incidents implicated systems
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign
Entity

Telegram

Incidents implicated systems
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

  • Incident 1133
    1 Report

    Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign
Entity

Women of Ukraine

Affected by incidents
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women
Entity

Olha Stefanishyna

Affected by incidents
  • Incident 1134
    2 Reports

    Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women

2024 - AI Incident Database
