AI Incident Database

Report 3567

Associated Incidents

Incident 597 · 57 Reports
Female Students at Westfield High School in New Jersey Reportedly Targeted with Deepfake Nudes

Sharing Fake Nude Images Could Become a Federal Crime Under Proposed Law
wsj.com · 2024

It could eventually be a federal crime to share digitally altered nude images of real people.

Rep. Joseph Morelle (D., N.Y.) on Tuesday re-proposed the "Preventing Deepfakes of Intimate Images Act," which would outlaw the nonconsensual sharing of digitally altered intimate images. He had previously introduced the bill but has since added Rep. Tom Kean, a Republican from New Jersey, as a co-sponsor. Kean had introduced a bill in November called the AI Labeling Act of 2023, which would require AI-generated content to carry clear labeling identifying it as such.

The bipartisan move Tuesday comes in response to an incident at Westfield High School in New Jersey. Boys there were sharing AI-generated nude images of female classmates without their consent. When the girls found out, they reported it to school administrators, who referred to it in an email to parents as a "very serious incident."

Francesca Mani, a 14-year-old student who was told by the school that her photo had been included in some of the generated images, got angry and decided to advocate for other victims. She and her mother, Dorota Mani, have spent the past two months meeting with lawmakers and were present in Washington, D.C., on Tuesday when Morelle and Kean announced the legislation. 

"What happened to me and my classmates was not cool, and there's no way I'm just going to shrug and let it slide," Francesca said in a joint statement issued by Morelle's office. "I'm here, standing up and shouting for change, fighting for laws so no one else has to feel as lost and powerless as I did."

While people have been able to doctor images with Photoshop and similar software for years, new AI image-makers make it easy to produce entirely fabricated photos. There are now dozens of free or cheap face-swapping and "clothes-removing" tools that can be used to doctor real photos, and it is hard for the human eye to tell real from fake, according to AI experts. Any image can easily be shared widely on social and messaging platforms with a few taps.

Still, faked sexual images of real people are so new that federal law is lagging, legal experts say. A handful of states, including Virginia, California, Minnesota and New York, have outlawed the distribution of faked porn or given victims the right to sue its creators in civil court.

In addition to making the sharing of digitally altered intimate images a criminal offense, Morelle and Kean's proposed legislation also would allow victims to sue offenders in civil court. 

In the statement, Morelle said: "Let's not wait for the next mass incident to make the news. This is happening every day to women everywhere, and it's time to give them back their power." 

Morelle's office didn't immediately respond to a request for further comment.

Read the Source


2024 - AI Incident Database
