AI Incident Database

Report 5460

Associated Incidents

Incident 676 · 6 Reports
Alleged Deepfake Audio Depicts Philippines President Ferdinand Marcos Jr. Ordering Military Action

Deepfake Audio Of Philippine President Urging Military Action Against China Sparks Concerns
ndtv.com · 2024

A fabricated audio clip featuring Philippine President Ferdinand Marcos Jr. instructing his military to respond to China has raised significant alarm among Manila's government officials, who caution that it could have implications for the nation's foreign policy.

The manipulated audio features a deepfake of Marcos Jr.'s voice, in which he purportedly directs his military to intervene if China poses a threat to the Philippines, adding that he cannot tolerate further harm to Filipinos by Beijing.

Deepfake technology involves the use of artificial intelligence to replace aspects of a person's appearance or voice with those of another individual in synthetic media.

"We cannot compromise even a single individual just to protect what rightfully belongs to us," says the voice in the faked audio, which was reportedly released via a YouTube channel with thousands of subscribers. The audio was accompanied by a slideshow of photos showing Chinese vessels in the South China Sea, the *South China Morning Post* reported.

On Tuesday night, the Presidential Communications Office (PCO) issued a public warning about the manipulated media and confirmed that it was entirely fake.

"It has come to the attention of the Presidential Communications Office that there is video content posted on a popular video streaming platform circulating online that has manipulated audio designed to sound like President Ferdinand R. Marcos Jnr," the PCO said in a statement.

"The audio deepfake attempts to make it appear as if the President has directed our Armed Forces of the Philippines to act against a particular foreign country. No such directive exists nor has been made," it added.

The PCO said that it is actively working on measures to combat fake news, misinformation, and disinformation through its Media and Information Literacy Campaign.

"We are also closely coordinating and working with government agencies and relevant private sector stakeholders to actively address the proliferation and malicious use of video and audio deepfakes and other generative AI content," it said.

