Associated Incidents
Newtown Township police and the Bucks County District Attorney’s Office are investigating allegations that a middle school student used AI technology to create pornographic deepfake images of classmates.
The incident reportedly occurred in March and involved a student at Newtown Middle School; as many as 12 underage girls had their photos used to create phony nude images, according to individuals familiar with the allegations.
It was not immediately known if the images were shared with others or available online.
Council Rock School District spokeswoman Andrea Mangold declined comment on the incident other than to say that Newtown Township police are handling the matter.
Newtown Township police Capt. Jason Harris confirmed that an active investigation is underway involving a juvenile, but said no information is being released to avoid compromising the case.
Harris confirmed that Council Rock staff reported the matter to police. It is unknown whether ChildLine, the state’s child abuse hotline, was notified.
A Bucks County district attorney spokesman also confirmed that the office is aware of the allegations. He said the office is investigating but cannot release more information, citing the involvement of a juvenile.
If the allegations are confirmed, the student could be among the first prosecuted under a new Pennsylvania law that outlawed the distribution of so-called pornographic deepfakes.
What are deepfakes?
Deepfakes are images, videos and audio generated by artificial intelligence tools to depict both real and non-existent people. These deepfakes can be made to look real using someone's face or voice. In some cases, a person's image has been placed on a nude body or pornographic image in no way associated with them and shared online.
The high-tech digital forgeries have been used to commit financial scams, hurt personal reputations and disrupt the political process.
What does PA law say about using AI-generated digital deepfakes?
Last year, Pennsylvania became the first state to crack down on digital deepfakes.
The new Pennsylvania law makes it a crime to harass someone by distributing a deepfake image of them without consent while in a state of nudity or engaged in a sexual act. The offense would be more serious if the victim is a minor.
AI-generated material was added to enhance the state's existing law that makes it a crime to distribute intimate images without consent.
State Sen. Tracy Pennycuick, R-Montgomery, who chairs the Senate Communications and Technology Committee, said the bill was partly inspired by a 2023 incident in Westfield, New Jersey, where AI-generated nude images of high school students were disseminated on social media.
On Tuesday, members of the state’s Senate Judiciary Committee passed a new AI-related bill that would include forged digital likenesses and impersonation under the state’s forgery statute.
Members of Congress are also taking a harder look at deepfake technology.
Last year, members of the Senate passed the Defiance Act, a federal bill that would allow victims of non-consensual sexually explicit deepfakes to sue people who create, share and receive the images. The House version of the bill was recently reintroduced.
Earlier this year, members of the House passed a similar bill, the Take It Down Act, which gained Senate approval and was signed by President Trump last month. The law criminalizes non-consensual deepfake porn and requires platforms to take down such material within 48 hours of being served notice.
How are schools handling deepfake images among students?
In its most recent survey, the Center for Democracy and Technology found that nearly 40% of high school students polled had heard about people making and sharing sexual images of others without permission, including deepfake images.
“Kids are doing this to each other most often,” said Kristin Woelfel, policy counsel for the center, a nonpartisan nonprofit focused on civil rights and civil liberties in the digital age.
The center’s research has also found that most students caught sharing sexual images, real or deepfake, faced consequences — typically law enforcement referral and long-term school suspensions.
But according to Woelfel, schools are falling short on victim assistance and prevention. Only about 10% of students surveyed in 2023-24 said their school provided resources to victims depicted in images.
The survey also found that a majority of the schools had not shared any policies or procedures addressing real or deepfake sexual image sharing with students, staff or parents, Woelfel said.
More than half of the teachers surveyed didn’t know if such a policy existed in their district.
The center also found that schools were more than twice as likely to develop a policy regarding the sharing of sexual images after an incident occurred, Woelfel said.