Elliston Berry woke up on a Monday morning last October to some alarming text messages. Friends were asking if she had seen nude photos of herself that were circulating among students at her Texas high school.
One sent a screenshot. She was shocked. The image showed her face---but it wasn't her body.
A male classmate, she would later learn, had taken at least two photos from her private Instagram account and rendered her naked using artificial-intelligence-powered clothes-removal software. Two of her friends were also victims of the photo manipulation.
Using anonymous Snapchat accounts, the boy shared the photos with other students at Aledo High School, in a small town outside of Fort Worth. As the day went on, he allegedly created and shared fake nudes of six other girls in Berry's friend group.
"I had never even thought about someone doing something like this," says Berry, now 15 years old. "To wake up to that was surreal."
Teen girls across the country are dealing with the aftermath of this disturbing new trend. Berry and her family are trying to prevent others from experiencing the embarrassment and worry such images---sometimes referred to as deepfakes---can create. They shared their story with Sen. Ted Cruz (R., Texas).
On Tuesday, Cruz, along with Sen. Amy Klobuchar (D., Minn.) and a bipartisan group of senators, introduced a bill that would criminalize the publication of nonconsensual nude images---real or fake. It would also require social-media companies and other websites to remove the pictures within 48 hours of receiving notice from the victim.
The bill adds to a growing body of proposed state and federal legislation intended to stop this new form of harassment.
'I couldn't focus'
Berry says the photos looked convincing. One image of Berry was rendered from a real photo taken of her standing on a cruise ship. Some original photos of her friends had been taken at a beach. The only giveaway that the nude versions were doctored was that the subjects wouldn't ever have been without clothes in such public settings.
New AI image-generating software makes it easy to produce believable-looking photos. Google and Apple have been removing nude-generating AI apps from their app stores.
Berry and other victims of fake nudes say they worry about the long-term ramifications of such images surfacing when they apply for college or for jobs. There are immediate concerns, too. Berry says she was worried about telling her mom and stepdad.
"It's so hard to admit to your parents even though the pictures aren't real," she says.
Her parents were supportive and vowed to do everything they could to put an end to her nightmare. Berry stayed home from school for a few days, but had to go back that Friday to attend volleyball practice.
"I was so in my head about these images, I couldn't focus," she says.
When she walked down the school hallways, she wondered who had seen the images and whether they would believe the pictures were fake. "I felt shame and fear," she says.
Student code of conduct vs. state law
Anna McAdams, Berry's mother, reported the incident to school administrators, who she says conducted an investigation that revealed the perpetrator to be a fellow freshman. The teen boy, McAdams says, completed the semester in a separate building on campus where students do detention, and left the school after winter break.
Jeff Swain, district attorney for Parker County, Texas, says the boy was sanctioned within the juvenile justice system.
Even though Berry didn't have to see the boy anymore, she found it hard to fully engage at school.
"It threw me off guard and created so much anxiety," she says. "I'm a social person but I took a step back and became more closed off."
A spokeswoman for the Aledo Independent School District says the district assisted the county sheriff's department with its investigation and disciplined the boy in accordance with the student code of conduct and state law.
"The district is reviewing its student code of conduct with the possible use of AI by students in mind," the spokeswoman says. Harsher consequences for student misconduct are established by state law and apply to all Texas public schools, she adds, so it's up to the state to revise those laws.
"I do think there is a lot of confusion in the law-enforcement community about cases involving deepfakes," says Swain, the district attorney. Two state laws potentially cover them, including one that was updated to criminalize child-abuse images manipulated with conventional or AI-powered software. Since the laws are new, enforcement is a work in progress, he adds.
The TAKE IT DOWN Act
McAdams reached out to Dorota Mani, the mother of another teen victim of fake nudes, after I wrote about a similar situation that occurred around the same time in Westfield, N.J.
Mani had spoken with local and federal lawmakers and was instrumental in getting Rep. Joseph Morelle (D., N.Y.) to reintroduce the "Preventing Deepfakes of Intimate Images Act" in January. A spokeswoman for Morelle says the bill now has more than 50 co-sponsors and is awaiting consideration by the House Judiciary Committee.
McAdams began calling state and federal officials, including Cruz. He recently called her back. She and her daughter were with him in Washington, D.C., on Tuesday when he introduced his bill: "Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks," aka TAKE IT DOWN.
After the fake nudes of Berry were shared, she removed all her Instagram posts for a few months and culled her followers. The perpetrator had been an acquaintance who requested to follow her private account on Instagram the year before. She says she now only accepts people she knows well.
Berry says the experience left her less trusting and more cynical.
"I have learned to be ready for anything," she says.