AI Incident Database

Report 4674

Associated Incidents

Incident 929 · 16 Reports
Sustained AI-Driven Russian Disinformation Campaigns Doppelgänger, Storm-1516, and Matryoshka Reportedly Disrupting German Federal Elections

AI-driven Russian disinformation campaign targets German elections
euractiv.com · 2025

This article is part of our special report The dark side of AI innovation is supercharging disinformation and damaging democracy.

A Russian influence operation known as "Storm-1516" has set up over 100 websites using artificial intelligence (AI) to spread disinformation and meddle in Germany's upcoming elections.

The network was identified by a joint investigation of Correctiv and NewsGuard, which found that the campaign has fabricated false claims about German politicians, including deepfake videos and misleading reports, to manipulate public perception.

The network mimics real German news outlets and promotes nationalist and Eurosceptic narratives, particularly supporting the Alternative für Deutschland (AfD) party while targeting mainstream politicians.

Examples include allegations of sexual misconduct against Green Party candidate Robert Habeck and Foreign Minister Annalena Baerbock, as well as fabricated reports about German military mobilisation and plans to import 1.9 million Kenyan workers.

Ties to American Kremlin propagandist

The operation is linked to a former US police officer, John Mark Dougan, who allegedly uses AI to generate fake news and is reportedly funded by Russia's military intelligence agency, the GRU.

According to previous NewsGuard research, Dougan, who fled his home in Florida in 2016 to evade criminal charges and now lives in Russia, is linked to scores of fake news sites pushing the Kremlin's propaganda to US audiences.

Articles on these sites amplify narratives of Ukrainian corruption, a weakening Ukrainian military, and a strong Russian economy while also praising figures like AfD leader Alice Weidel and Elon Musk.

The recent campaign has spread falsehoods through a network of pro-Russian influencers, making it more effective than previous disinformation efforts. Many of the websites appear to have been set up in advance, waiting to be activated for maximum impact on the elections.

The tactics used in Germany mirror those seen in the US presidential elections, where "Storm-1516" aimed to influence the outcome in favour of Donald Trump by producing deepfake videos and fabricated stories about figures like Kamala Harris and Tim Walz.

The campaign's reach has been amplified through social media influencers, some of whom may be financially incentivised to spread disinformation.

Germany on alert

Germany's domestic intelligence agency has acknowledged the likelihood of foreign influence in the 2025 elections, and investigations suggest that the same network responsible for US-focused disinformation is now operating in Germany.

Despite denials from Dougan, evidence points to Russian state involvement, particularly through the Foundation to Battle Injustice, a group linked to the late Yevgeny Prigozhin's Internet Research Agency.

The campaign's primary goal is to discredit politicians and manipulate public discourse, regardless of whether the false claims are ultimately believed. Even after being debunked, disinformation continues to spread, highlighting the challenge of combating Russian election interference.

[Edited by Brian Maguire | Euractiv's Advocacy Lab]
