Students at a Kansas high school sometimes worry as they write class presentations or emails to their teachers. They stop and consider their words. They ask each other: "Will this get Gaggled?"
Anything students at Lawrence High write or upload to their school accounts can get "Gaggled" --- flagged by Gaggle Safety Management, a digital safety tool the Lawrence, Kansas, high school purchased in 2023. Gaggle uses artificial intelligence to scan student documents and emails for signs of unsafe behavior, such as substance abuse or threats of violence or self-harm, and it deletes the flagged material or reports it to school staff.
Students say it's doing much more than that. Since Gaggle came online in Lawrence, it has deleted part of an art student's portfolio --- a photo of girls wearing tank tops --- after mistakenly flagging it as child pornography. Another student was questioned by administrators after writing that they were "gonna die" because they ran a fitness test wearing Crocs shoes.
When Suzana Kennedy, 19, emailed the school a records request last year for a report of student material flagged by Gaggle, the system blocked her attempt to investigate it, she said. Gaggle flagged and intercepted the school's response containing the records, and Kennedy never received the reply.
This is what some students say life in high school is like under the watch of an AI-powered safety tool like Gaggle, which boasts partnerships with around 1,500 school districts across the country. The Illinois-based company advertises its round-the-clock monitoring as a bulwark against a litany of threats to today's students, such as gun violence, mental health struggles and sexual assault.
In school board meetings, Lawrence officials have called Gaggle a vital aid in bolstering safety procedures. The program has enabled staff to intervene in several instances where students were at risk of suicide, school board members said.
But Gaggle has also come under scrutiny for the reach of its monitoring and complaints about intrusions into students' privacy. Former and current Lawrence students, including Kennedy, sued the school district in August to stop its use, alleging that Gaggle's surveillance is unconstitutional and prone to misfires.
Instead of aiding their safety, the lawsuit says, Gaggle's monitoring has had a chilling effect on students. They wonder whether discussing mental health or using the wrong words could lead to their being reported to teachers and having schoolwork deleted.
"There was always that fear," said Natasha Torkzaban, 19, a former student who is a plaintiff in the lawsuit. "Who else, other than me, is looking at this document?"
Lawrence Public Schools declined to comment but shared a statement from former superintendent Anthony Lewis in response to previous criticism of Gaggle in April 2024, when student journalists requested to be exempt from Gaggle monitoring to protect their sources.
"The information we have gleaned through the use of Gaggle has helped our staff intervene and save lives," Lewis said at the time.
The Lawrence Public Schools website states that the district uses the software to scan for "signs of self-harm, depression, thoughts of suicide, substance abuse, cyberbullying, credible threats of violence against others, or other harmful situations."
Gaggle did not respond to requests for comment from The Washington Post and has not yet responded to the substance of the lawsuit in court. In news releases on its website, the company says it has a "staunch commitment to supporting student safety without compromising privacy."
The company's product is part of a wave of AI-powered school security systems that use machine learning to detect safety risks in the classroom. Some products, like Gaggle, monitor students' activity on school accounts and devices. Others scan security camera feeds to flag guns and fights in hallways.
The Gaggle Safety Management tool can review the contents of a student's Google or Microsoft account, including emails, documents, links to websites and calendar entries. "Trained safety professionals" evaluate any flagged material for false positives before reporting it to schools, according to Gaggle, though the Lawrence lawsuit alleges that reviews are outsourced to third-party contractors.
An investigation of Gaggle by the Seattle Times and the Associated Press this past spring found that the system carried security risks and privacy concerns. Reporters were temporarily able to view screenshots of flagged student material that wasn't password-protected, the investigation found. In other instances, LGBTQ students in North Carolina and British Columbia were potentially outed to family members and school officials when Gaggle flagged messages about their sexual identity or mental health.
Amanda Klinger, the director of operations at the Educator's School Safety Network, said Gaggle and similar AI-powered systems can be a "valuable tool" to spot concerning behavior in students, particularly when school staff is overtaxed. But Klinger added that poor implementation risks making students feel excessively surveilled.
"I don't envy the position that educators are in," Klinger said, adding, "But we just need to be really clear-eyed about the limitations of these tools and the unintended consequences."
The Lawrence school board voted unanimously to ink a three-year deal with Gaggle in August 2023 for around $160,000. The district, which does not permit students to opt out of Gaggle's surveillance on school devices, quickly drew controversy. In 2024, Lawrence High School administrators summoned several art, journalism and photography students and accused them of uploading images that featured indecent exposure or child pornography, according to the lawsuit.
Opal Morris, an 18-year-old Lawrence graduate and one of the former students suing Lawrence Public Schools, was among them. She said she was pulled out of class by security guards to be questioned. She said she told administrators she'd recently uploaded a portfolio of photography, and they let her go.
"It was very formal and very accusatory in the beginning," Morris said. "And then kind of just like, 'Okay, be quiet about this and go back to class.'"
Morris said she later found that a photo from the portfolio had been deleted from her school account. She determined that the offending image was a portrait of two girls wearing tank tops. None of the other students called in that day were disciplined, according to the lawsuit.
That spring, Lawrence High School's student newspaper, the Budget, complained to the school that Gaggle's scanning of students' reporting notes could violate Kansas law by exposing their sources to school officials.
The leaders of the Budget also flagged other instances when they were alarmed by Gaggle's reach. Torkzaban, a former co-editor in chief, said the program scanned a college admission essay on a friend's personal Google account when Torkzaban edited it while logged into her school account. She was Gaggled because the essay contained the words "mental health," she said.
In the fall of 2024, Kennedy, another former co-editor in chief for the Budget, submitted her records request for Gaggle data. After waiting more than a month, Kennedy said, administrators told her that they discovered their responses were being blocked by Gaggle, but they did not know why.
The records, which Kennedy eventually obtained through a teacher and shared with The Post, showed that Gaggle flagged more than 1,200 instances of "questionable" student content across the district between November 2023 and September 2024. Keywords that Gaggle flagged included expletives, terms related to gun violence and self-harm, and words like "sex," "drunk," "get in a fight" or "bomb."
Eighteen incidents were reported to law enforcement. Around 800 were classified as "nonissues," though other incidents were addressed after a teacher questioned a student or reprimanded them for their choice of words. Among the nonissues was a student who was questioned last August for writing a message with the phrase "I wanted to kill."
"She said that she was sending an email to her grandma and was referencing a fly that she wanted her to kill," the report read.