AI Incident Database

Incident 1036: Purported AI-Manipulated News Clip Fabricates Explosion and Doctor's Murder Plot for Scam

Description: A purported AI-manipulated video falsely showing Citizen TV anchor Swaleh Mdoe reporting on the bombing of a Kenyan doctor's home circulated widely on Facebook. The video reportedly used AI-generated audio and visuals to fabricate a conspiracy in which pharmaceutical companies targeted the doctor for promoting a "miracle cure." In reality, the explosion footage was from Ohio, the doctor was fictitious, and the content aimed to manipulate viewers into purchasing an unproven health product.
Editor Notes: Timeline notes: The reported video emerged sometime in January 2025. Africa Check published its report on January 20, 2025, by which time the video had reportedly garnered over 497,000 views. The incident was added to the database on April 21, 2025.


Entities

Alleged: Unknown deepfake technology developer and Unknown voice cloning technology developer developed an AI system deployed by scammers and fraudsters, which harmed Swaleh Mdoe, Citizen TV, General public of Kenya, and Media integrity.
Alleged implicated AI systems: Unknown deepfake app and Unknown voice cloning technology

Incident Stats

Incident ID: 1036
Report Count: 1
Incident Date: 2025-01-20
Editors: Daniel Atherton

Incident Reports

Reports Timeline

Ignore deepfake video claiming Kenyan TV station reported on doctor's house being blown up for criticising pharmaceutical firms
africacheck.org · 2025

IN SHORT: A viral Facebook video claims that a Kenyan doctor's house has been destroyed in an explosion linked to his criticism of pharmaceutical companies. It also shows him promoting a "miracle cure" for unnamed chronic diseases. But the …

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity

Deepfake Obama Introduction of Deepfakes
Jul 2017 · 29 reports

Defamation via AutoComplete
Apr 2011 · 28 reports

Alexa Plays Pornography Instead of Kids Song
Dec 2016 · 16 reports


2024 - AI Incident Database
