AI Incident Database

Report 4185

Associated Incidents

Incident 820 · 5 Reports
Alleged AI-Generated Photo Alteration Leads to Inappropriate Modifications in Speaker's Conference Picture

'Unbuttoned Blouse, Added Bra': Ex-Google, Facebook Employee Claims Photo Edited for AI Conference
news18.com · 2024

A product designer who has worked for Google, YouTube and Facebook has accused the organisers of an AI conference of editing her photo for a poster. The US-based techie alleged that not only were the pockets of her shirt removed, but her blouse was also unbuttoned and a bra added. The organisers have since taken down all the posters and apologised to her.

Elizabeth Laraki is set to speak at a conference on UX and AI later this year, and the organisers put up an advertisement for it. When she came across the ad featuring her photo, she was shocked: her photo "didn't look right".

“Is my bra showing in my profile pic and I’ve never noticed…? That’s weird. I open my original photo. No bra showing. I put the two photos side by side and I’m like WTF,” Laraki wrote on X (formerly known as Twitter).

She further revealed, “Someone edited my photo to unbutton my blouse and reveal a made-up hint of a bra or something else underneath.”

Upon discovering the alterations, Laraki reached out to the conference host, whom she described as a “respectable guy with five kids at home”.

Laraki said that he was apologetic regarding the edited picture and explained that the “woman running their social media used a cropped square image from their website”. Laraki added, “She needed it to be more vertical, so she used an AI expand image tool to make the photo taller.”

“AI invented the bottom part of the image (in which it believed that women’s shirts should be unbuttoned further, with some tension around the buttons, and revealing a little hint of something underneath),” she concluded.
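The workflow Laraki describes — a cropped square headshot extended to a taller frame with an "AI expand image" tool, which then hallucinates the new region — can be sketched as a standard outpainting setup. The article does not name the tool used, so the Pillow-based preparation step below is an illustrative assumption: the original pixels are kept, and a mask marks the strip the generative model is asked to invent.

```python
from PIL import Image

def prepare_outpaint(img, target_ratio=4 / 5):
    """Pad an image to a taller aspect ratio and build the mask that
    marks the region a generative model would have to invent."""
    w, h = img.size
    new_h = int(w / target_ratio)      # e.g. square -> 4:5 portrait
    canvas = Image.new("RGB", (w, new_h), (0, 0, 0))
    canvas.paste(img, (0, 0))          # keep the original pixels at the top
    mask = Image.new("L", (w, new_h), 0)
    mask.paste(255, (0, h, w, new_h))  # white = area to be generated
    return canvas, mask

# A hypothetical 800x800 profile photo expanded to 800x1000:
square = Image.new("RGB", (800, 800), (128, 128, 128))
canvas, mask = prepare_outpaint(square)
print(canvas.size)  # (800, 1000): 200 rows at the bottom left to the model
```

Everything below the white band of the mask is pure model output — which is exactly where the fabricated shirt buttons and bra appeared in the altered poster.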

In her post on X, Laraki shared the original and edited pictures side by side, writing: "I'm talking at a conference later this year (on UX+AI). I just saw an ad for the conference with my photo and was like, wait, that doesn't look right. Is my bra showing in my profile pic and I've never noticed…? That's weird. I open my original photo. No bra showing. I put…"

— Elizabeth Laraki (@elizlaraki), October 15, 2024

Reacting to the viral post, one woman wrote, "Insane. Also, it thinks women's shirts shouldn't have pockets."

"It also hid your necklace for some reason," another pointed out.

A third commented, "I could see this becoming a case study. Thanks for posting about it, and glad it wasn't a person's active choice."

A fourth shared, "Something is off. As a Photoshop expert, I'd say they went ham with generative fill. Even the medallion on your necklace is gone along with the pockets on your shirt."

