Government of Russia-aligned actors
Incidents involved as Deployer
Incident 544 · 2 Reports
Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
2023-05-11
During Turkey's 2023 presidential election, reportedly manipulated and allegedly AI-generated videos, audio, and images were used to smear candidates, purportedly link opposition figures to terrorist groups, and circulate a purported sex tape that reportedly contributed to presidential candidate Muharrem İnce's withdrawal. These incidents reportedly misled voters, disrupted campaigning, and reshaped the electoral landscape.
Incident 1134 · 2 Reports
Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women
2025-06-30
In late June 2025, Russian Telegram channels reportedly circulated deepfake videos claiming that Deputy Prime Minister Olha Stefanishyna backed mandatory mobilization of up to one million Ukrainian women starting September 1. Officials reportedly debunked the claim, confirming no such plans or laws exist. The disinformation operation reportedly aimed to incite panic and destabilize Ukraine's domestic situation.
Incident 1133 · 1 Report
Reported AI-Generated Audio of Ukrainian Commander Andriy Biletsky Used in Russian Disinformation Campaign
2025-06-30
In late June 2025, Russian Telegram channels reportedly circulated a video containing a purportedly AI-generated audio track impersonating Ukrainian commander Andriy Biletsky. The audio clip reportedly claimed that Ukrainian authorities deliberately avoid identifying fallen soldiers in order to withhold compensation. Verification reportedly showed the voice was synthetic and did not match the original May 16 footage of Biletsky. Hive Moderation reportedly confirmed the audio was overwhelmingly AI-generated.
Related Entities
Other entities that are related to the same incident. For example, if the developer of an incident is this entity but the deployer is another entity, they are marked as related entities.
Related Entities
Unknown deepfake technology developers
Incidents involved as Developer
- Incident 544 · 2 Reports
Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
- Incident 1134 · 2 Reports
Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women
Incidents implicated systems
Unknown voice cloning technology developers
Incidents involved as Developer
- Incident 544 · 2 Reports
Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election
- Incident 1134 · 2 Reports
Reported Deepfakes of Ukrainian Deputy PM Olha Stefanishyna Allegedly Supporting Fictional Mobilization Plan for Women