AI Incident Database

Report 3495

Associated Incidents

Incident 617 · 2 Reports
Male student allegedly used AI to generate nude photos of female classmates at a high school in Issaquah, Washington

No charges as AI-generated nude pictures of female students circulate around Issaquah school
kiro7.com · 2023

ISSAQUAH, Wash. — KIRO 7 has learned from a parent of an Issaquah High School student that AI-generated pornographic images have been circulating around the school recently. We also learned that a teenage boy took photos of several of his female classmates, used AI to alter them, created nude photos, and then sent them around the school.

The parent who gave us the tip wishes to remain anonymous but told us that the school didn’t inform her right away that her child was a victim.

We asked the school district why they failed to let parents know about this.

The district responded with, “We notified all families of students who were confirmed to have been involved in the incident. We empathize with all students and families connected to this incident.”

We have confirmed with Issaquah Police that they are investigating.

“I’m appalled by it,” parent and grandparent Sherri Burgess said when she learned what happened. “I think that there should be ginormous consequences for that.”

Most parents would likely agree with Burgess. However, attorney Debbie Silberman told us that, unfortunately, it’s not that simple.

“At this moment no one has yet been prosecuted for creating a deep fake with the intention of harming both an adult, a teenager, or a child and the law needs to catch up to that,” Silberman said.

She explained that there are currently no laws, state or federal, that address the creation of deep fake images.

“What makes this story so heartbreaking is that this is someone’s likeness, this is someone’s identity, this is someone’s reputation and someone’s future and the law needs to catch up and address this very soon,” Silberman said.

She said it would be a similar situation if victims were to file a civil suit.

“I have not yet seen a civil case which has gone after someone who abuses the images of someone to create a deep fake naked photo,” she said. “This is really another form of violence, of technology violence that’s being used the majority of the time against women.”

