AI Incident Database

Incident 1439: Former New Orleans Isidore Newman School Teacher Allegedly Used AI to Create Fake Nude Images from Social Media Photos of Girls, Including Students

Responded
Description: Louisiana authorities alleged that former Isidore Newman School teacher Benoit Cransac used an online AI platform to alter social media photos of girls and generate fake nude images, including collages. Local reports said he was rearrested on 60 unlawful-deepfake charges, and that investigators believed some depicted girls from the New Orleans area, with later coverage referring to teenage girls and students.


Entities

Alleged: Unknown image generator developers and Unknown deepfake technology developers developed an AI system deployed by Benoit Cransac, which harmed minors, Isidore Newman School community, Girls from New Orleans, Girls, and Epistemic integrity.
Alleged implicated AI systems: Unknown image generator technology and Unknown deepfake technology

Incident Stats

Incident ID
1439
Report Count
3
Incident Date
2026-01-08
Editors
Daniel Atherton

Incident Reports

Newman School teacher arrested on suspicion of possessing child sexual abuse material
fox8live.com · 2026
FOX 8 Staff · post-incident response

NEW ORLEANS (WVUE) - The Louisiana Attorney General's Office confirmed Thursday that Benoit Cransac, 49, has been arrested on 22 counts of possessing child sexual abuse material.

Cransac was identified as a teacher and coach at Isidore Newm…

Former New Orleans teacher accused of using AI to make fake nude images from social media photos
fox8live.com · 2026

NEW ORLEANS (WVUE) - A former New Orleans teacher used artificial intelligence to create fake nude images of females after obtaining their photos from social media, according to investigators.

Benoit Cransac, 49, was arrested on 60 counts o…

New Orleans teacher previously arrested in child sex abuse investigation now facing new deepfake charges
wdsu.com · 2026

A former Isidore Newman School teacher who has been arrested multiple times in connection with an investigation linked to child sexual abuse material is facing additional charges.

According to Louisiana Attorney General Liz Murrill's office…

Variants

A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.

Similar Incidents

By textual similarity

  • Detroit Police Wrongfully Arrested Black Man Due To Faulty FRT (Jan 2020 · 11 reports)
  • Predictive Policing Program by Florida Sheriff's Office Allegedly Violated Residents' Rights and Targeted Children of Vulnerable Groups (Sep 2015 · 12 reports)
  • Defamation via AutoComplete (Apr 2011 · 28 reports)

