Description: Purported AI-generated deepfakes have circulated widely on Sudanese social media during the ongoing conflict, allegedly spreading false claims, impersonating public figures, and distorting political discourse. Reported incidents include fabricated videos and audio used to mislead or satirize, some of which gained traction among journalists and officials. Observers warn that weak detection tools and limited moderation may be enabling this manipulation and eroding public trust.
Editor Notes: This incident ID serves as a hub for multiple reported uses of AI-generated disinformation in Sudan's civil conflict:

(1) In April 2023, an alleged AI-generated voice recording falsely attributed to U.S. Ambassador John Godfrey circulated online, reportedly suggesting a plot to impose secularism (see Incident 1088 for the dedicated archive).

(2) In August 2023, a TikTok campaign under "The Voice of Sudan" purportedly used voice-cloning tools to simulate former President Omar al-Bashir, with recordings allegedly repurposed from a Sudanese commentator's livestreams (see Incident 1087 for the dedicated archive).

(3) In September 2023, tech-savvy Sudanese individuals reportedly used deepfake tools to create satirical videos, including one showing RSF leader Hemedti singing; some of this content was later framed as serious disinformation.

(4) In October 2023, BBC investigators documented the Bashir impersonation campaign as a prominent example of synthetic media use; that investigation forms the backbone of Incident 1087.

(5) In March 2024, AI-generated audio was allegedly used to simulate the Sudanese Armed Forces commander ordering attacks on civilians, and a separate alleged deepfake recording depicted a fabricated meeting between the RSF and the Freedom and Change coalition discussing a coup.

(6) Also in March 2024, Sudanese Armed Forces supporters reportedly cast doubt on genuine recordings of Hemedti by claiming they were AI-generated, an example of the liar's dividend.

(7) In April 2024, an AI-generated image of a building purportedly bombed by the SAF went viral, reportedly misleading many political figures.

Each of these reported events may eventually form its own discrete incident ID. The initial report that forms the backbone of this archive was published 10/23/2024; it was added to the database on 05/31/2025.
Entities
Alleged: Unknown voice cloning technology developers and Unknown deepfake technology developers developed an AI system deployed by Various actors aligned with Sudanese factions, which harmed Truth, Political leaders of Sudan, Journalists of Sudan, and General public of Sudan.
Alleged implicated AI systems: Unknown voice cloning technology, Unknown deepfake technology, and Social media
Incident Stats
Incident ID: 1089
Report Count: 1
Incident Date: 2024-10-23
Editors: Daniel Atherton
Incident Reports
Reports Timeline
In April 2024, an image of a building engulfed in flames went viral on Facebook in Sudan. The image was widely shared with captions that claimed the building was part of Al-Jazeera University in Wad Madani city, and that the Sudanese army had bom…
Variants
A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.