Description: A purported AI-manipulated video falsely showing Citizen TV anchor Swaleh Mdoe reporting on the bombing of a Kenyan doctor's home circulated widely on Facebook. The video reportedly used AI-generated audio and visuals to fabricate a conspiracy in which pharmaceutical companies targeted the doctor for promoting a "miracle cure." In reality, the explosion footage was from Ohio, the doctor was fictitious, and the content aimed to manipulate viewers into purchasing an unproven health product.
Editor Notes: Timeline notes: The video reportedly emerged sometime in January 2025. By the time Africa Check published its report on January 20, 2025, it reported that the video had garnered over 497,000 views. The report was added to the database on April 21, 2025.
Entities
Alleged: Unknown deepfake technology developer and Unknown voice cloning technology developer developed an AI system deployed by scammers and Fraudsters, which harmed Swaleh Mdoe, Citizen TV, General public of Kenya, and Media integrity.
Alleged implicated AI systems: Unknown deepfake app and Unknown voice cloning technology
Incident Stats
Incident ID
1036
Report Count
1
Incident Date
2025-01-20
Editors
Daniel Atherton
Incident Reports
IN SHORT: A viral Facebook video claims that a Kenyan doctor's house has been destroyed in an explosion linked to his criticism of pharmaceutical companies. It also shows him promoting a "miracle cure" for unnamed chronic diseases. But the …
Variants
A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.
Similar Incidents
Deepfake Obama Introduction of Deepfakes · 29 reports
Defamation via AutoComplete · 28 reports
Alexa Plays Pornography Instead of Kids Song · 16 reports