Description: A reportedly AI-manipulated video circulated on social media that falsely portrayed U.S. President-elect Donald Trump calling for the release of Nigerian separatist leader Nnamdi Kanu and threatening sanctions against Nigeria. The video allegedly used old footage paired with AI-generated audio mimicking Trump's voice. Fact-checkers identified the video as inauthentic, noting mismatched lip-sync, fabricated quotes, and the fictitious date of "November 31st."
Editor Notes: Timeline notes: The reported posting of the video to social media was November 20, 2024. FactCheckAfrica published its report on the video on November 22, 2024. The report was included in the database on April 21, 2025.
Entities
Alleged: Unknown deepfake technology developer and Unknown voice cloning technology developer developed an AI system deployed by Unknown actors, which harmed Donald Trump, Nnamdi Kanu, General public of Nigeria, Media integrity, and Electoral integrity.
Alleged implicated AI systems: Unknown deepfake app, Unknown voice cloning technology, TikTok, and Social media platforms
Incident Stats
Incident ID
1035
Report Count
1
Incident Date
2024-11-20
Editors
Daniel Atherton
Incident Reports
Claim
A viral video on social media shows the President-elect of the United States, Donald Trump, urging the Nigerian government to release Nnamdi Kanu, the leader of the Indigenous People of Biafra (IPOB).
Verdict
False. The video is d…
Variants
A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.