AI Incident Database

Report 6821

Associated Incidents

Incident 1354 · 2 Reports
Purportedly AI-Altered Fake Nude Images of High School Girls and Women Reportedly Created and Disseminated in Pensacola, Florida

Police arrest girl for sending fake, nude photos of females. Why parents are outraged
pnj.com · 2024

The girlfriend of a young Pensacola man who allegedly used artificial intelligence to digitally "undress" pictures of more than a dozen girls and women has been arrested for sending the images without consent. The 18-year-old former Washington High School student who parents and victims say made the pictures has not been charged.

Pensacola police issued a press release Thursday afternoon saying investigators have charged 17-year-old Jaylyn Lee with promoting an altered sexual depiction of an identifiable person without consent, a third-degree felony.

Investigators say Lee obtained the cell phone of a "male acquaintance" that contained AI-generated nude photos of 14 females. They say she then made a copy of the photos with her cell phone and kept them for a month before disseminating them to 17 high school students, some of whom were not depicted in the AI-generated photos.

Investigators obtained a warrant for Lee's arrest, and she turned herself in today, police say.

The News Journal first reported on the AI-altered images in October after young women and girls, as well as their parents, reached out about their concerns that the photos were created in the first place and said they wanted the 18-year-old man who made them held accountable.

According to the victims and their families, the young man made dozens of fake nude pictures, but they weren't discovered until a girl he had dated found the images on his phone, took a video of them, and allegedly sent them to others, including people depicted in the pictures.

The 18-year-old daughter of Pensacola attorney Autumn Beck Blackledge is one of the victims. She says a picture of her and a friend, which she had posted to social media when she was a minor, was downloaded and "undressed" using an online app that allows users to create homemade nude pictures from existing photos.

Julie Harmon's daughter is also among the girls and women depicted in the fake nude photos. Harmon is outraged that he has not been charged, along with Lee, who took the photos off her "ex-boyfriend's" phone, held onto them like collateral and sent out the video of the pictures after he "dumped" her, Harmon said.

"He stole those girls' photos without their permission and edited them, so technically it started with him," Harmon said. "How is it that sending it is a felony but creating it for your own pleasure is not? How are you able to hold onto it and it's not against the law, but the minute you send it, it is. The only thing he's going to get is a possible misdemeanor for cyber stalking and harassment."

Some parents and victims who want the teen who created the fake nudes charged say police told them investigating the case is challenging because state laws haven't kept up with technology.

In 2022, Florida enacted a new "deep fake" law addressing the use of AI in creating altered sexual depictions and political misinformation, but criminal charges related to sexual depictions require that the images be "promoted," shared and transferred.

"It's sexism at its best," Blackledge said about the 17-year-old girl's arrest. "She may have promoted this to people who weren't the victims. I don't know anything about that and if she did, I don't like that. However, how can she be charged with promoting an altered sexual depiction of an identifiable person without consent if the person hadn't created the image of an altered sexual depiction of an identifiable person without consent. If he didn't make the pictures, she wouldn't have been able to send them."

Other parents and victims who have spoken to the News Journal echoed similar concerns. Pensacola High School student Lucy Adams Stevenson is also depicted in a fake nude picture found on the teen's phone. Stevenson grew up with him as her "godbrother" and says she was shocked and disgusted to learn he had altered a photo of her. She would also like to see him face consequences for his actions and said she will be disappointed if that doesn't happen.  

"I guess right now, because it wasn't actual (nude) pictures, they can't really charge him with anything and they can only charge him if he sent them to people, which they don't know if he did, so I've heard," Stevenson said. "It's like, OK, he did all this, but nothing can really happen because there's no law against this because AI is so new. It would be nice if there was some kind of law change or addition where this is illegal too, because right now, anyone could do this and get away with it."

The News Journal has not released the 18-year-old man's name because he has not been charged. However, victims and their families say he is from a prominent and politically connected family in Pensacola. They worried that could have an impact on the case, although police told them it did not.

Some parents say Jaylyn Lee's arrest alone is not justice since she is not the one responsible for making the AI-generated nude photos.

"If she got arrested after he did, it wouldn't strike me as sexism," said Blackledge. "Right now, it just reeks."

Julie Harmon agrees.

"That's what upsets me is the fact that he's not getting charged with stealing a minor's photo and altering her to where she is identified as sexual nature," Harmon said. "(The police) even said to me that it's not sexual because nudity is not sexual. It just makes zero sense to me whatsoever."

