AI Incident Database

Issue 3753

Associated Incidents

Incident 648 · 2 Reports
Alleged Deepfake Audio of Imran Khan Calls for Election Boycott, Misleading Pakistan Voters

Imran Khan's PTI to boycott polls? Deepfake audio attempts to mislead voters in Pakistan
logicallyfacts.com · 2024

On February 7, 2024, just a day before Pakistan's highly anticipated general elections, a voice recording alleged to be of former Prime Minister Imran Khan, the imprisoned and popular leader who had been barred from the polls over a graft conviction, circulated on social media.

This audio clip, purported to feature Khan calling for an election boycott by the Pakistan Tehreek-e-Insaf (PTI), was disseminated by several social media accounts late in the evening on X (formerly known as Twitter).

Soon after the audio emerged on social media platforms, PTI took to X to clarify from the party's official account that the audio was fake, alleging that "the controlled media is being used to run fake news about PTI boycotting elections, along with running a fake audio!"
In a conversation with Logically Facts, PTI leader Zulfi Bukhari asserted that the audio was entirely fabricated and dismissed any notion of a boycott. Bukhari emphasized, "Imran Khan and his party only have had one demand for the past two years. That is for free and fair elections. Although these elections are nowhere close to free and fair, they would have boycotted had it been any other country or party. We have maintained to contest as we know the people of Pakistan are overwhelmingly with us."

'AI voice, unnatural noise': Experts analyze the audio

Two independent experts consulted by Logically Facts confirmed with a high degree of certainty that the audio was artificially generated and did not originate from Khan.

Tanmay Srivastava, an expert in audio forensics, identified several anomalies within the viral audio clip. He noted the presence of an unnatural white noise throughout the recording, likely added post-production to simulate authenticity.

"It seems like the noise was introduced after the AI voice was created because it is not properly mixed with the actual audio recording," Srivastava added.

Srivastava also observed that the intonation of Khan's voice throughout the clip was monotone, with no natural variation in pitch or tone that would typically be present in a genuine recording.
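The monotone-pitch observation can be framed as a simple measurable signal: natural speech shows wide variation in fundamental frequency, while flat pitch tracks are one crude indicator of synthesis. The sketch below is illustrative only and is not the experts' actual method; the function name and the flatness threshold are assumptions.

```python
import numpy as np

def pitch_variation_flag(f0_hz, flat_threshold_hz=10.0):
    """Flag a voice track as suspiciously monotone.

    f0_hz: per-frame fundamental-frequency estimates in Hz
    (unvoiced frames marked as NaN).
    Returns (std_dev, is_flat): a low standard deviation of pitch
    across voiced frames is one crude signal of synthetic speech.
    The 10 Hz threshold is an illustrative assumption.
    """
    voiced = f0_hz[~np.isnan(f0_hz)]
    std = float(np.std(voiced))
    return std, std < flat_threshold_hz

# Synthetic illustration: natural speech drifts in pitch; a flat track does not.
natural = 120 + 25 * np.sin(np.linspace(0, 6, 200))              # wide pitch movement
flat = np.full(200, 120.0) + np.random.default_rng(0).normal(0, 2, 200)

print(pitch_variation_flag(natural))  # large std, not flagged
print(pitch_variation_flag(flat))     # small std, flagged
```

In practice the per-frame pitch track would come from a pitch estimator run on the audio; the point here is only that "no natural variation in pitch" is a quantifiable property, not a subjective impression.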

"All the sibilant consonants that produce high frequency sounds like s's and t's, or "esses" are not sounding natural, the way they should sound in a normal recording... they sound mechanical or highly processed," he added

Srivastava highlighted specific timestamps where the audio appeared heavily edited or manipulated to correct pronunciation or to add authenticity, indicating a sophisticated level of digital alteration.

"At the 0:38-0:39 mark, the voice cracks due to multiple layering to fix the pronunciation of the word (sibilant consonants). Unnatural noise was added at 0:02-0:03, 0:28-0:29, and 1:15-1:17 to make the audio look more genuine. Also, at 1:03-1:05, unnatural-sounding sibilant consonants can be heard," Srivastava said.

Further analysis by experts from IIT Jodhpur, led by Professor Mayank Vatsa, utilized deep-learning models to evaluate the audio's authenticity. Their findings, which they shared with Logically Facts, were based on four distinct models and indicated with high confidence that the audio was a fabrication created using AI technology, with two models assigning confidence scores above 0.9 and one reaching the maximum score of 1.
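When several independent detectors each emit a confidence score, a common way to report the ensemble's agreement is a mean score plus a majority vote. The sketch below is a minimal illustration of that idea, not the IIT Jodhpur team's actual pipeline; the score values and threshold are hypothetical, merely shaped like the reported result.

```python
def aggregate_verdict(scores, threshold=0.5):
    """Combine per-model 'synthetic' confidence scores (0..1) into a verdict.

    scores: one confidence per detector that the clip is AI-generated.
    Returns (mean_score, majority_synthetic): simple averaging plus a
    majority vote over a decision threshold (0.5 here, an assumption).
    """
    votes = sum(s > threshold for s in scores)
    mean = sum(scores) / len(scores)
    return mean, votes > len(scores) / 2

# Hypothetical scores echoing the reported shape: two models above 0.9,
# one at the maximum of 1.0, plus a fourth, more cautious model.
scores = [0.92, 0.95, 1.0, 0.7]
print(aggregate_verdict(scores))  # high mean, majority says synthetic
```

Averaging and voting are deliberately simple; real ensembles often weight models by validation accuracy, but agreement across independent detectors is the core signal either way.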

Surge of deepfakes in Pakistan elections

The proliferation of deepfake technology has become a significant concern globally, with numerous instances of its application in disrupting election campaigns.

Pakistan's election has been no exception, witnessing several deepfake videos and audio clips designed to mislead the public, including false claims of an election boycott by the PTI party. In addition to the Khan audio clip, another example is a potentially deepfake audio, purportedly from a leaked group conversation, describing the strategy behind PTI's supposed boycott of the general elections.

PTI has actively countered these narratives, with officials disputing the authenticity of videos and audio clips attributed to them. Recently, PTI leader Muhammad Basharat Raja took to X to state that a video of him announcing an election boycott was an AI-generated deepfake and that he would stand in the current election.

Meanwhile, the Information Secretary of PTI London flagged another digitally altered video of a PTI leader purportedly announcing a similar boycott.

Interestingly, PTI had itself acknowledged using AI technology in December 2023 to create an audio message from Khan, generated from text he had written in prison, underscoring the complex role of digital innovations in modern political discourse. The party said, however, that the audio was produced with the approval of his lawyers.

A repeat of the Bangladesh elections?

The issue of deepfakes is not confined to Pakistan. The recent Bangladesh elections also encountered similar challenges, with deepfake videos of independent candidates falsely declaring their withdrawal from the race, underscoring the global penetration of AI technology in electoral processes.

As the world braces for elections in over 50 countries in 2024, the escalation of deepfake technology underscores an urgent need for vigilance and verification to preserve the integrity of democratic institutions.

Read the source
