AI Incident Database

Report 3321

Related Incidents

Incident 601 · 10 Reports
AI-Generated Fake Audio of Verbal Abuse Incident Circulates of British Labour Leader Keir Starmer

UK opposition leader targeted by AI-generated fake audio smear
therecord.media · 2023

An audio clip posted to social media on Sunday, purporting to show Britain’s opposition leader Keir Starmer verbally abusing his staff, has been debunked as being AI-generated by private-sector and British government analysis.

The audio of Keir Starmer was posted on X (formerly Twitter) by a pseudonymous account on Sunday morning, the opening day of the Labour Party conference in Liverpool. The account asserted that the clip, which has now been viewed more than 1.4 million times, was genuine, and that its authenticity had been corroborated by a sound engineer.

Ben Colman, the co-founder and CEO of Reality Defender — a deepfake detection business — disputed this assessment when contacted by Recorded Future News: “We found the audio to be 75% likely manipulated based on a copy of a copy that's been going around (a transcoding).

“As we don't have the ground truth, we give a probability score (in this case 75%) and never a definitive score (‘this is fake’ or ‘this is real’), leaning much more towards ‘this is likely manipulated’ than not,” said Colman.

“It is also our opinion that the creator of this file added background noise to attempt evasion of detection, but our system accounts for this as well,” he said.

The audio was criticized on a bipartisan basis, despite the highly contested political environment in the United Kingdom — with polls generally showing the Labour Party 17 points ahead of the incumbent Conservatives.

Simon Clarke, a Conservative Party MP, warned on social media: “There is a deep fake audio circulating this morning of Keir Starmer - ignore it.” The security minister Tom Tugendhat, also a Conservative MP, likewise warned of the “fake audio recording” and implored Twitter users not to “forward to amplify it.”

“Deepfakes threaten our freedom. That’s why the Defending Democracy Taskforce and the work the PM is doing on AI are critical for protecting us all,” added Tugendhat. The word “deepfake” is used colloquially to refer to any kind of synthetic media generated by AI technologies.

The Defending Democracy Taskforce was established in November 2022 with the mission of reducing “the risk of foreign interference to the U.K.’s democratic processes, institutions, and society, and ensure that these are secure and resilient to threats of foreign interference,” according to a parliamentary question previously answered by Tugendhat.

Recorded Future News understands an analysis of the audio file by the British government confirmed it to be fake.

Screenshot of the social media post featuring the audio file.

Authorities in the U.K. are bracing for this kind of interference ahead of the country’s general election next year, in the wake of similar attempts to influence the recent elections in Slovakia.

Two days before the polls opened there on September 30, faked audio clips were published on social media attempting to implicate an opposition party leader and a journalist in rigging the election by plotting to purchase votes.

Publicly debunking the audio was a challenge because of the country's election laws, which strictly ban both the media and politicians from making campaign announcements in the two days before the polls open.

As reported by Wired, as an audio post the fake also “exploited a loophole in Meta’s manipulated-media policy, which dictates only faked videos — where a person has been edited to say words they never said — go against its rules.”

It is not clear who produced the fake audio in either the Slovakian or British cases.

The account which posted the Keir Starmer smear had previously tweeted: “Let me be clear. I am unequivocally PRO smear tactics against those who engage in smear tactics themselves. People lie about Keir Starmer? Good. And I'm one of them.”

That tweet has now been deleted, although the fake audio remains available.

Read the Source
