Artificial intelligence was not something Elliston Berry was thinking about, beyond a vague understanding that students were not allowed to use it to cheat in exams.
She had just started her first year at high school in Texas and was throwing herself into student life, but on the morning of October 2 Elliston had a terrifying introduction to the technology. Photographs, taken from her Instagram account, had been put through AI software to render her naked, and the fake images were being shared among her classmates on Snapchat.
"I was totally shocked, I had no thoughts going through my head, I was just terrified," she says. She was 14 years old.
A then-unknown person was also sharing fake images of other female classmates, made with easy-to-access online tools that allow someone to upload a real photograph and use AI to render how that person would look without clothes on. The software creates a new image, placing the person's head on an AI-generated naked body.
In the past year, schools across the US have been forced to contend with this emerging form of sexual exploitation without any clear legal framework on how to combat it.
"It was so real," says Elliston's mother, Anna McAdams. "That's what was so shocking. "Here's my daughter, I know her, I know the picture and where it was taken. I just didn't have words, other than anger --- how could somebody do this?"
McAdams then went into what she calls "Mama Bear mode". Liaising with the parents of the other targeted girls, she arranged a meeting that morning with police and staff at Aledo High School. It was then that she had her second shock of the day.
"They didn't know what to do with this," she says. "There's nothing in the student code of conduct. And by law, there isn't anything either."
While it is difficult to quantify the scale of the problem as it often goes unreported, cases of AI nudes being shared in schools have been documented in Texas, New Jersey, Washington DC, Florida, California and Washington State.
"We have heard from at least a half a dozen schools that have had this very problem, and that's just the tip of the iceberg," says Brian Hughes, executive director of the Polarisation and Extremism Research and Innovation Lab (Peril) at American University in Washington. "No one is getting a handle on this completely unregulated space, and our regulators and our legislators have really been derelict in their responsibilities to protect children," he says.
Since the leap forward in AI capabilities with the release of ChatGPT in 2022, the ability of the technology to generate explicit images has been exploited around the world, with many public figures such as politicians and actors targeted. In the UK, a Channel 4 News investigation found that 255 people had been the victims of what are known as deepfake images.
Stefan Turkheimer, vice-president of public policy at the Rape, Abuse and Incest National Network, says he is particularly concerned that doctored, explicit images of children could enter the commercial marketplace. "If someone doesn't do something about this relatively soon, we could get into a commercial exploitation of these images," he says.
The creation of the images can have a particularly devastating impact on young victims, experts warn. "There's anxiety, there's depression, there's shame, they don't want to go to school, they often quit extracurriculars," says Turkheimer, adding that the consequences can last a lifetime. "Some of them are scared to take jobs outside their house, it can be a real lifetime problem. And then there are instances where people are harassed when those images are associated with their names and with their identity. In the worst cases, it can even lead to suicide."
Too often, he says, schools are reluctant to take action against the perpetrator: "The base reaction in American schools at the moment is to actually protect the person who created the deepfakes at the expense of the person who is subject to the deepfakes."
This was the experience of Elliston Berry and her mother. The day after the nude images were shared, the person behind the Snapchat account that had distributed them began posting disturbing messages about "going out with a bang" and "ruining the girls", prompting parents to keep their girls out of school.
The following day, the perpetrator was identified when he posted using the school's WiFi. He was charged as a juvenile with the distribution of harmful material to a minor. The parents claim that the school took little action, and that it was left to them to have him removed.
"[The school's] response was, 'well, we have a moral obligation to educate him and support and protect him', and really took him like he was the victim and not the girls," McAdams says.
Elliston describes her mental state when she returned to school later in the week as "all over the place". She says she "was so paranoid that everyone has seen these images, it gave me so much anxiety". So far, the school has not provided any mental health support. "After that week, they kind of just forgot about it. It was just like a silent battle that me and all the girls had to deal with," she says.
A spokesman from the Aledo Independent School District said that it had "disciplined the student involved in accordance with the Aledo ISD Student Code of Conduct and current state law". The district said it was reviewing its code of conduct in light of the incident.
The patchy response from police and the school prompted Elliston and her mother to approach Ted Cruz, the Republican Texas senator who is himself a father to teenage girls. Within months, he had collaborated on a bipartisan bill called the Take It Down Act, which criminalises the publication of non-consensual AI-created nudes and requires online platforms to remove such content within 48 hours.
Cruz and his Democratic colleague, Amy Klobuchar, introduced the bill in the Senate last month, and a bipartisan group has also introduced it in the House of Representatives. In June, under pressure from Cruz, Snapchat finally removed the images of Elliston Berry, which she describes as "a really big relief". "These images have been stuck in my mind, and every day I live with the fear that they will resurface," she says.
Hughes, of the American University's Peril lab, says that schools must also put resources into ensuring boys and girls have spaces in which they can discuss consent and sexual exploitation without fear or shame. "It's very important to integrate this into a much broader way of raising our children to respect the rights and the autonomy of other people," he says.
Elliston Berry shows remarkable empathy for the young man who caused her such distress, though she has no idea what motivated him. "I really do hope that he's able to get the help that he needs," she says.