AI Incident Database

Report 3076

Related Incidents

Incident 544 · 22 Reports
Alleged Use of Purportedly AI-Generated and Manipulated Media to Misrepresent Candidates and Disrupt Turkey's 2023 Presidential Election

Turkey’s deepfake-influenced election spells trouble
fortune.com · 2023

Yesterday’s election in Turkey was momentous for a variety of reasons, with major implications for regional geopolitics and the domestic economy. But historians will remember these polls for one thing in particular: the role of tech-powered disinformation.

Just days before the Sunday election, one of the best-known opposition candidates, Muharrem İnce, pulled out. An alleged sex tape had been circulating on social media and he claimed his appearance in it was the result of deepfake technology; nonetheless, he withdrew from the contest.

Kemal Kılıçdaroğlu, the main opposition candidate, blamed Russia for this and other disinformation doing the rounds in the run-up to the election. “Get your hands off the Turkish state,” he tweeted in Moscow’s direction while telling Reuters he had evidence of Russia’s involvement. (Russian President Vladimir Putin and incumbent Turkish President Recep Tayyip Erdoğan have a complex but generally friendly relationship, and Erdoğan’s tendency to play a divisive role within NATO suits Putin just fine in the context of his war on Ukraine.)

What difference did İnce's departure make to the election result? It's hard to tell. He was polling at less than 2% and, as a longstanding rival of Erdoğan (İnce was the main opposition candidate in the 2018 elections), those votes probably went to Kılıçdaroğlu.

But given that Erdoğan only barely failed to clear the 50% threshold that would have avoided the run-off that will now take place in a couple of weeks (he got 49.5% to Kılıçdaroğlu's 44.9%), this was an election where every percentage point mattered.

And the alleged İnce deepfake was not the only one out there. The fact-checking site Teyit (think Turkish Snopes) has debunked many such pieces of disinformation and one of them even got brandished by Erdoğan himself—at a rally, the president showed his supporters a Kılıçdaroğlu campaign video that had been modified to depict members of the banned PKK terrorist organization singing the opposition party’s song. Somewhat less damningly, a Twitter user claimed to have trained an A.I. on Kılıçdaroğlu’s voice and used it to generate a version of his campaign speech delivered in flawless English.

Faked photos and even videos have been around for a while, but thanks to A.I. they are so much easier to make now, in a form that convinces many of those who aren’t paying close attention—which is most people. Soon, they will be incredibly simple to generate at a quality level that will fool even more voters. From now on, as Fortune reported recently, this will be a feature of most elections, including next year’s U.S. presidential election, and I think it’s fair to say we aren’t ready.

Read the Source


2024 - AI Incident Database
