Incident 1089: AI-Generated Content Circulates Widely in Sudan Amid Civil Conflict and Information Vacuum
Description: Purported AI-generated deepfakes have circulated widely on Sudanese social media during the ongoing conflict, reportedly spreading false claims, impersonating public figures, and distorting political discourse. Reported incidents include fabricated videos and audio recordings used to deceive or satirize, some of which drew attention from journalists and officials. Observers warn that poor detection tools and weak moderation may be enabling the manipulation of public trust.
Editor Notes: This incident ID covers multiple reported uses of AI-generated disinformation in Sudan's civil conflict. (1) In April 2023, an alleged AI-generated voice recording falsely attributed to U.S. Ambassador John Godfrey circulated online, reportedly suggesting a plot to impose secularism. See Incident 1088 for the specific archive on this incident. (2) In August 2023, a TikTok campaign under "The Voice of Sudan" purportedly used voice-cloning tools to simulate former President Omar al-Bashir, with recordings allegedly repurposed from a Sudanese commentator's livestreams. See Incident 1087 for the specific archive on this campaign. (3) In September 2023, tech-savvy Sudanese individuals reportedly used deepfake tools to create satirical videos, including one showing RSF leader Hemedti singing; some of this content was later framed as serious disinformation. (4) In October 2023, BBC investigators documented the Bashir impersonation campaign as a prominent example of synthetic media use; their reporting forms the backbone of Incident 1087. (5) In March 2024, AI-generated audio was allegedly used to simulate the Sudanese Armed Forces commander ordering attacks on civilians; a separate alleged deepfake recording depicted a fabricated meeting between the RSF and the Freedom and Change coalition discussing a coup. (6) Also in March 2024, Sudanese Armed Forces supporters reportedly cast doubt on genuine recordings of Hemedti by claiming they were AI-generated, an example of the liar's dividend. (7) In April 2024, an AI-generated image of a building purportedly bombed by the SAF went viral, reportedly misleading many political figures. Each of these events may eventually form its own discrete incident ID; this incident ID serves as a hub. The initial report that forms the backbone of this archive was published 10/23/2024; it was added to the database on 05/31/2025.
Entities
Alleged: Unknown voice cloning technology developers and Unknown deepfake technology developers developed an AI system deployed by Various actors aligned with Sudanese factions, which harmed Truth, Political leaders of Sudan, Journalists of Sudan, and General public of Sudan.
Alleged implicated AI systems: Unknown voice cloning technology, Unknown deepfake technology, and Social media
Incident Stats
ID
1089
Report Count
1
Incident Date
2024-10-23
Editors
Daniel Atherton
Incident Reports
Reports Timeline
In April 2024, an image of a building engulfed in flames went viral on Facebook in Sudan. The image was widely shared with captions claiming that the building was part of Al-Jazeera University in the city of Wad…
Variants
Una "Variante" es un incidente de IA similar a un caso conocido—tiene los mismos causantes, daños y sistema de IA. En lugar de enumerarlo por separado, lo agrupamos bajo el primer incidente informado. A diferencia de otros incidentes, las variantes no necesitan haber sido informadas fuera de la AIID. Obtenga más información del trabajo de investigación.