AI Incident Database

Report 1644

Associated Incidents

Incident 198 · 9 Reports
Deepfake Video of Ukrainian President Yielding to Russia Posted on Ukrainian Websites and Social Media

Bad Deepfake of Zelenskyy Shared on Ukraine News Site in Reported Hack
snopes.com · 2022

A poorly made deepfake video of Ukrainian President Volodymyr Zelenskyy asking civilians to lay down their arms to the Russian military circulated on social media in March 2022. Beyond gathering views on social media, a summary of the video was also broadcast on a Ukrainian news station after the station was reportedly hacked. The running text at the bottom of the broadcast repeated the message from Zelenskyy’s deepfake video.

Ukraine 24 posted a statement on Facebook saying the text was added to the broadcast after the network had been hacked:

The message reads in English (translated via Google):

The running line of the “Ukraine 24” TV channel and the “Today” website were hacked by enemy hackers and broadcast Zelensky’s message about alleged “capitulation.”

!!! THIS IS FAKE! FAKE !

Friends, we have repeatedly warned about this. No one is going to give up. Especially, in the circumstances when the Russian army suffers losses in battles with the Ukrainian army!

Zelenskyy, too, appeared to address this rumor in a video posted to his Facebook page. That video included the caption “Ми вдома і захищаємо Україну” or “We are at home and defending Ukraine” and, according to Ukraine 24, a message to Russian soldiers to lay down their arms.

How To Spot a Deepfake Video

This likely isn’t the last deepfake video we will encounter during Russia’s invasion of Ukraine. Propagandists have been working overtime to change the narrative of the war, and deepfakes, videos digitally edited to make it appear that a real person is saying or doing something they never said or did, are just another tool in their disinformation toolbox.

In the example above, most viewers can likely tell that the Zelenskyy footage is fake simply by looking at it. His head, for example, doesn’t seem to quite fit on his neck. The best strategy to identify deepfakes, however, is to look for their source. Zelenskyy has recorded several videos using the same background on his social media profiles and on the official social media pages of the Ukrainian government. This deepfake, needless to say, was never posted to these pages.

If you see a video that you think might be fake, try taking a screenshot from the video and then running a reverse-image search on Google Images, TinEye, or another reverse-image search engine. You can also send the video to Snopes and we’ll do our best to authenticate it.
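The reverse-image-search tip above relies on a general idea: reduce each image to a compact fingerprint and compare fingerprints rather than raw pixels, so near-duplicate frames match even after recompression or resizing. As an illustration only (not how Google Images or TinEye actually work internally), here is a minimal average-hash ("aHash") sketch in plain Python; frames are modeled as small nested lists of grayscale values, whereas a real pipeline would first decode and downscale actual video frames:

```python
# Toy average-hash comparison, illustrating the fingerprint-matching idea
# behind reverse-image search. Frames here are plain nested lists of
# grayscale values (0-255); real tools decode and downscale actual images.

def average_hash(frame):
    """Return a bit string: '1' where a pixel is above the frame's mean."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming_distance(h1, h2):
    """Count differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# Two nearly identical 4x4 "frames" (e.g. the same scene, recompressed)
# and one unrelated frame, all hypothetical sample data.
frame_a = [[10, 200, 30, 220], [15, 210, 25, 230],
           [12, 205, 28, 225], [11, 202, 29, 221]]
frame_b = [[12, 198, 32, 218], [14, 212, 24, 229],
           [13, 203, 27, 226], [10, 204, 31, 223]]
frame_c = [[200, 10, 220, 30], [210, 15, 230, 25],
           [205, 12, 225, 28], [202, 11, 221, 29]]

d_similar = hamming_distance(average_hash(frame_a), average_hash(frame_b))
d_different = hamming_distance(average_hash(frame_a), average_hash(frame_c))
print(d_similar, d_different)  # near-duplicates yield a small distance
```

A small Hamming distance between hashes suggests the screenshot matches known footage; a large one suggests it does not appear among the indexed originals, which is one signal (among others) that the clip may be fabricated.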
