AI Incident Database

Report 6096

Associated Incidents

Incident 1168 · 1 Report
Purportedly AI-Generated Image of British Army Colonels Captured in Ukraine Reportedly Circulates in Russian Media

False claim of UK colonels captured in Ukraine spreads
ukdefencejournal.org.uk · 2025

A claim that two British Army colonels were captured by Russian special forces in Ukraine has circulated online this week, drawing attention on social media and fringe platforms.

The story, which lacks any independent verification, appears to have originated from Russian-aligned sources and includes fabricated images and fictional identities.

There is no evidence that any such incident took place.

The initial report was published on 4 August by EADaily, a Russian outlet that regularly echoes official Kremlin narratives. It claimed that "Colonels Edward Blake and Richard Carroll" were detained by Russian special forces during a covert mission inside Ukraine. The piece alleged that both officers were classified as "illegal combatants" and suggested the UK had attempted to cover up their presence by first stating they were in London, then claiming they were visiting Second World War battlefields.

The story cited the Norwegian website Steigan.no, which has a long history of publishing anti-Western and conspiratorial content. According to the EADaily article, Russian authorities provided "forensic evidence" proving the identity and presence of the officers, although no such evidence has been made public. The article also included a single photograph that purported to show the two officers in uniform, which can be seen below.

That image is demonstrably fake. It is unmistakably AI-generated, and there are multiple clear indicators that confirm its inauthenticity. The most obvious is the text on the passports, which appears convincing from a distance but falls apart under scrutiny. Although the covers mimic British passport styling in colour and layout, the writing is gibberish and lacks any legible or coherent wording. This is a common failure point in AI-generated images. Systems can reproduce the visual impression of documents but consistently struggle to generate real-world text. The clipboard in front of the two men shows a similar issue, with meaningless marks arranged to look like data columns but containing no readable or structured content.

The uniforms worn by the kneeling men also reveal the image as a fake. While the camouflage superficially resembles British Army patterns, the details are wrong. There are no rank insignia or unit identifiers, and the collar shapes and button placements are inconsistent with actual issue kit. The belts and webbing appear more decorative than functional, and the radio wires are oddly placed in ways that would be impractical in a real field setting. Military clothing follows strict patterns and standards, especially in operational environments, and these deviations suggest that the uniforms were generated based on visual approximations rather than real references.

The lighting is overly uniform, creating a flat appearance that lacks the depth and variability found in real photographs. The masked soldiers in the background are too similar in posture and facial structure, suggesting they were generated from the same base model with only slight alterations. Their weapons face the wrong way and appear generic and distorted on close inspection, lacking the specific characteristics of real small arms. While the image might appear convincing at a glance, a closer look at text, clothing detail, facial variation, and logical consistency quickly exposes it as an artificial creation.

There are visual inconsistencies in the hands, faces, and background. These anomalies are consistent with artefacts produced by AI image-generation tools. Also, their names do not appear in any Ministry of Defence public records, honours lists, or military directories.

No reputable Western, Ukrainian, or international media outlets have reported anything resembling this story. There has been no announcement from the UK Ministry of Defence, no indication from NATO, and no alert from international bodies such as the Red Cross. These are the kinds of signals that typically accompany the detention of senior military personnel. None are present here.

The UK government has consistently stated that it does not have combat troops operating in Ukraine. Its support has focused on military aid, logistics, and training, largely conducted outside Ukrainian territory.

The names "Edward Blake" and "Richard Carroll" also raise questions. Neither name appears in available British military service records. There is no trace of them in recent Armed Forces appointments, public records, or military press releases. In short, there is no proof these individuals exist, let alone that they were captured.

Amplification and social media

Despite the lack of evidence, the story spread quickly on Telegram channels, conspiracy forums, and smaller fringe websites. On 4 August, George Galloway, former MP and now leader of the Workers Party of Britain, posted on X (formerly Twitter):

"Russia nets two British colonels and MI6 spy in Ukraine. They were just battlefield trainspotters says UK. No Vienna!"

The message closely mirrored the tone and structure of the Russian reports. Galloway did not present any additional sources or claim to have independent knowledge of the events. His comment, shared with over a million followers, further pushed the story into the British online discourse.

It is not the first time a fringe claim has entered public view through carefully worded commentary rather than outright endorsement. This method often allows a narrative to spread without full responsibility for its accuracy. The phrasing leaves room for ambiguity while still reinforcing the central implication of deception by Western governments.

A pattern seen before

This story fits a familiar pattern seen across Russian disinformation campaigns. A sensational claim is seeded in Russian state media, echoed by ideologically aligned or conspiratorial outlets, and then repeated in Western political or social circles. The key elements (unnamed sources, synthetic images, unverifiable identities, and unverifiable "forensics") appear designed not to withstand scrutiny but to inject doubt and provoke reaction.

"We've seen this tactic before: create a false narrative, back it with a synthetic image, and wait for someone with a platform to repeat it. It doesn't matter that it's fake. The goal is to inject doubt and get people asking the wrong questions." -- Analyst at a private OSINT company, speaking on condition of anonymity

The same methods were used during the siege of Mariupol in 2022, when false reports claimed that NATO generals had been captured inside the Azovstal plant. Those reports were never substantiated, and the individuals named were never shown or verified.

What makes the current episode notable is how quickly it gained traction and how easily a fabricated narrative was repeated in Western political discourse without corroboration. Even as public awareness of AI-generated images and fake news grows, the tools used to craft these stories are improving, and their emotional appeal remains potent.

At the time of writing, there is no evidence that any British Army colonels were captured in Ukraine. There is no record of Colonels Blake or Carroll existing. There is no credible photograph, no formal complaint, no press release, no ICRC prisoner-of-war notification, and no allied confirmation. The UK government has not issued contradictory statements, as the Russian report claimed.

Everything that exists traces back to a single article in a Russian-aligned publication, picked up by a handful of ideologically aligned outlets, and circulated with the aid of a fabricated image. It was then referenced by a former British MP on social media without substantiation.

The facts, checked independently, do not support the story in any respect.

Why it matters

These claims are about trust, narrative control, and the ways in which foreign actors test the resilience of open information environments. When disinformation can move freely from Kremlin sources into Western political commentary, the result is not always belief, but confusion, suspicion, and fatigue. Over time, that erosion of clarity serves strategic goals: to weaken resolve, muddy alliances, and destabilise democratic debate.

This story was built for that purpose. It was never confirmed, because it was never meant to be. Its success lies in how far it travelled before being questioned, and in how many will keep repeating it even after it has been shown to be false.

This story shows how disinformation and engagement bait often overlap online. Disinformation is deliberately false or misleading content, usually spread to advance political, strategic, or ideological aims. Engagement bait refers to material designed to provoke strong emotional reactions such as anger or shock, in order to drive clicks, shares, or comments. The two often reinforce each other. A false story does not need to be credible to spread widely if it is provocative enough to encourage discussion or outrage. When such content is repeated or referenced by public figures, it can gain momentum even without evidence. For this reason, we are not linking to the original article.

Doing so would only increase its visibility and spread, despite the lack of any credible basis for the claims.

