AI Incident Database

Report 3495

Related Incidents

Incident 617 · 2 Reports
Male student allegedly used AI to generate nude photos of female classmates at a high school in Issaquah, Washington

No charges as AI-generated nude pictures of female students circulate around Issaquah school
kiro7.com · 2023

ISSAQUAH, Wash. — KIRO 7 has learned from a parent of an Issaquah High School student that AI-generated pornographic images have been circulating around the school recently. We also learned that a teenage boy took photos of several of his female classmates, used AI to alter them, created nude photos, and then sent them around the school.

The parent who gave us the tip wishes to remain anonymous but told us that the school didn’t inform her right away that her child was a victim.

We asked the school district why they failed to let parents know about this.

The district responded with, “We notified all families of students who were confirmed to have been involved in the incident. We empathize with all students and families connected to this incident.”

We have confirmed with Issaquah Police that they are investigating.

“I’m appalled by it,” parent and grandparent Sherri Burgess said when she learned what happened. “I think that there should be ginormous consequences for that.”

Most parents would likely agree with Burgess. However, attorney Debbie Silberman told us that, unfortunately, it’s not that simple.

“At this moment no one has yet been prosecuted for creating a deep fake with the intention of harming both an adult, a teenager, or a child and the law needs to catch up to that,” Silberman said.

She explained that there are currently no laws, state or federal, that address the creation of deepfake images.

“What makes this story so heartbreaking is that this is someone’s likeness, this is someone’s identity, this is someone’s reputation and someone’s future and the law needs to catch up and address this very soon,” Silberman said.

She said it would be a similar situation if victims were to file a civil suit.

“I have not yet seen a civil case which has gone after someone who abuses the images of someone to create a deep fake naked photo,” she said. “This is really another form of violence, of technology violence that’s being used the majority of the time against women.”
