AI Incident Database

Report 4186

Associated Incidents

Incident 820 · 5 Reports
Alleged AI-Generated Photo Alteration Leads to Inappropriate Modifications in Speaker's Conference Picture

AI edits ex-Google techie's image to add a 'bra'; netizens shocked
news9live.com · 2024

Washington DC: A woman took to X to share a bizarre experience: an image she had supplied for a conference was altered by AI to add a bra. The incident left netizens confused and unsettled, and it sparked a conversation about the risks of using AI image tools.

Elizabeth Laraki, a former Google employee, shared that she was speaking at a conference on UX and AI design. On receiving the promotional poster for the conference, Laraki felt something wasn't right about her picture: it looked as if her bra was showing. To confirm, Laraki compared the published image with the original. To her shock, the original image showed no bra at all. "Someone edited my photo to unbutton my blouse and reveal a made-up hint of a bra or something else underneath," she wrote on X.

The techie immediately emailed the event's host. He apologized and, after raising the concern with the woman who ran the event's social media, got back to her. It was then that Laraki learned the true cause of the alteration: the social media manager said she had used an AI image-expansion tool to extend the picture vertically.

“AI invented the bottom part of the image (in which it believed that women’s shirts should be unbuttoned further, with some tension around the buttons, and revealing a little hint of something underneath),” said Laraki in the post. She also stated that the organisers took down all the content that used the AI-altered image.

The incident shed light on a concerning aspect of AI. “This is definitely an appalling result from seemingly ‘innocuous’ AI tools,” said one user. Another thought the incident would make an excellent case study.

