Description: A purported AI-manipulated video falsely showing Citizen TV anchor Swaleh Mdoe reporting on the bombing of a Kenyan doctor's home circulated widely on Facebook. The video reportedly used AI-generated audio and visuals to fabricate a conspiracy in which pharmaceutical companies targeted the doctor for promoting a "miracle cure." In reality, the explosion footage was from Ohio, the doctor was fictitious, and the content aimed to manipulate viewers into purchasing an unproven health product.
Editor Notes: Timeline notes: The reported video emerged sometime in January 2025. By the time Africa Check published its report on January 20, 2025, the video had reportedly garnered over 497,000 views. The report was added to the database on April 21, 2025.
Entities
Alleged: Unknown deepfake technology developer and Unknown voice cloning technology developer developed an AI system deployed by scammers and fraudsters, which harmed Swaleh Mdoe, Citizen TV, the general public of Kenya, and media integrity.
Alleged implicated AI systems: Unknown deepfake app and Unknown voice cloning technology
Incident Stats
Incident ID
1036
Report Count
1
Incident Date
2025-01-20
Editors
Daniel Atherton
Incident Reports
Reports Timeline

IN SHORT: A viral Facebook video claims that a Kenyan doctor's house has been destroyed in an explosion linked to his criticism of pharmaceutical companies. It also shows him promoting a "miracle cure" for unnamed chronic diseases. But the …
Variants
A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
Similar Incidents
Deepfake Obama Introduction of Deepfakes
· 29 reports
Defamation via AutoComplete
· 28 reports

Alexa Plays Pornography Instead of Kids Song
· 16 reports