AI Incident Database

Report 4163

Associated Incidents

Incident 8178 Report
Purportedly AI-Generated Images Reportedly Spread Misinformation During Hurricane Helene Response

Heartbreaking images or something else? AI-generated deepfakes of Hurricane Helene victims go viral
wionews.com · 2024

Just days after Hurricane Helene hit the United States, a wave of misinformation flooded the internet. Among the false content are two digitally manipulated images that supposedly show a distraught child in a boat amidst floodwaters.

At first glance, the images appear to depict a child in a life jacket clutching a dog as heavy rain from the devastating storm pours down. However, closer inspection reveals flaws that expose the images as fakes.

As reported by Forbes, the two nearly identical photos show significant alterations. One glaring issue is that the child has an extra, misplaced finger in one image. Furthermore, she is wearing a different shirt and is seated in a different type of boat in each picture. The dog in the images also varies slightly, with its coat appearing darker in the more pixelated version.

It was not only ordinary users who were taken in: Utah Senator Mike Lee also fell victim to the doctored images, sharing one on X (formerly Twitter) with the caption, "Caption this photo." He later deleted the post after users pointed out its inauthenticity. Similarly, a Facebook user shared the manipulated image, calling for help for the "babies and their families."

Experts warn that digitally altered images depicting disasters can have serious long-term repercussions. They complicate relief efforts, distort narratives, erode public trust in crisis situations, and can even lead to scams that exploit people's goodwill, though it remains unclear if this particular image has been used for fraudulent fundraising.

With misinformation spiralling around Hurricane Helene, the Federal Emergency Management Agency (FEMA) has now established a Rumor Response page. This page aims to debunk false claims, including allegations that FEMA is confiscating survivors' properties, distributing aid based on demographic criteria, or confiscating donations and supplies.
