Associated Incidents
This report primarily concerns Incident 799, but it begins by referencing this incident as well.
In February, students at Beverly Vista Middle School were investigated by the Beverly Hills Police Department for creating fake nude pictures of their classmates. School and district officials acknowledged their awareness of the "AI-generated nude photos."
Deepfakes are artificial-intelligence-generated images, videos, and audio clips that depict real or nonexistent people. These AI-produced images are often used to spread misinformation, especially about celebrities.
Elliston Berry, a 15-year-old girl from Aledo, Texas, found artificial nude pictures of herself and her friends posted on social media. "I had woken up with many, many text messages from my friends just telling me that there were these images of mine circling," Elliston said.
In recent years, the rise of deepfake nudes among young people has become an alarming issue. There have been numerous incidents around the country in which teenagers created fake sexual content depicting their peers, using AI as a tool of bullying.
"It was so realistic. It is child porn," Elliston's mom, Anna McAdams said in an interview with WFAA. "We really could not take care of her. We, in that moment, were helpless ... more and more pictures were coming out throughout that first day into the second day."
This fake sexual content disproportionately harms young girls, who make up 90% of deepfake victims. Deepfakes can also target historically marginalized groups: in one case, a student in New York made a fake video of their principal shouting racist slurs and threatening to hurt students of color. These violations of reputation and privacy can trigger serious mental health issues in students, making them feel unsafe around others and in school environments. Identity-based attacks leave students feeling humiliated and detached from themselves.
Some states have implemented regulations to address harassment through deepfake images.
In June, Texas Senator Ted Cruz introduced the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (Take It Down) Act, a federal bill that would make publishing realistic, artificial pornographic images illegal. States including Mississippi, Louisiana, South Dakota, and New Mexico have also passed bills to combat revenge porn.
In 2019, the Senate passed the Deepfake Report Act, which would require the Department of Homeland Security to report on digital forgeries. The House also introduced the Deepfake Accountability Act, dedicated to providing resources for victims of deepfakes. Other federal bills have been passed to mitigate the effects of artificial images, but no federal law outright bans or regulates deepfakes.
Until deepfake pornography can be effectively regulated, teens remain exposed to this new form of cyberbullying, and victims are left scarred by their experiences.
"I was a freshman, and I was only fourteen," said Elliston. "Even today, I'm still fearful that these images will resurface."