AI Incident Database

Report 3721

Associated Incidents

Incident 646 · 1 Report
Snapchat's Algorithm Alleged to Link Minor with Sex Offenders
Snapchat isn’t liable for connecting 12-year-old to convicted sex offenders
arstechnica.com · 2024

A judge has dismissed a complaint from a parent and guardian of a girl, now 15, who was sexually assaulted when she was 12 years old after Snapchat recommended that she connect with convicted sex offenders.

According to the court filing, the abuse that the girl, C.O., experienced on Snapchat happened soon after she signed up for the app in 2019. Through its "Quick Add" feature, Snapchat "directed her" to connect with "a registered sex offender using the profile name JASONMORGAN5660." After a little more than a week on the app, C.O. was bombarded with inappropriate images and subjected to sextortion and threats before the adult user pressured her to meet up, then raped her. Cops arrested the adult user the next day, resulting in his incarceration, but his Snapchat account remained active for three years despite reports of harassment, the complaint alleged.

Two years later, at 14, C.O. connected with another convicted sex offender on Snapchat, a former police officer who offered to give C.O. a ride to school and then sexually assaulted her. The second offender is also currently incarcerated, the judge's opinion noted.

The lawsuit painted a picture of Snapchat's ongoing neglect of minors it knows are being targeted by sexual predators. Prior to C.O.'s attacks, both adult users sent and requested sexually explicit photos, seemingly without the app detecting any child sexual abuse materials exchanged on the platform. C.O. had previously reported other adult accounts sending her photos of male genitals, but Snapchat allegedly "did nothing to block these individuals from sending her inappropriate photographs."

Among other complaints, C.O.'s lawsuit alleged that the algorithm behind Snapchat's "Quick Add" feature was the problem. The algorithm allegedly detects when adult accounts are seeking to connect with young girls and, by design, sends more young girls their way—continually directing sexual predators toward vulnerable targets. Snapchat is allegedly aware of these abuses and, therefore, should be held liable for harms caused to C.O., the lawsuit argued.

Although C.O.'s case raised difficult questions, Judge Barbara Bellis ultimately agreed with Snapchat that Section 230 of the Communications Decency Act barred all claims and shielded Snap because "the allegations of this case fall squarely within the ambit of the immunity afforded to" platforms publishing third-party content.

According to Bellis, C.O.'s family had "clearly alleged" that Snap had failed to design its recommendation systems to block young girls from receiving messages from sexual predators. Even so, Section 230 immunity shields Snap from liability in this case because Bellis considered the messages exchanged to be third-party content. Snapchat designing its recommendation systems to deliver that content is a protected activity, Bellis ruled.

Internet law professor Eric Goldman wrote in his blog that Bellis' "well-drafted and no-nonsense opinion" is "grounded" in precedent. Pointing to an "extremely similar" 2008 case against MySpace—"which reached the same outcome that Section 230 applies to offline sexual abuse following online messaging"—Goldman suggested that "the law has been quite consistent for a long time."

However, as this case was being decided, a seemingly conflicting ruling in a Los Angeles court found that "Section 230 didn't protect Snapchat from liability for allegedly connecting teens with drug dealers," MediaPost noted. Bellis acknowledged this outlier opinion but did not appear to consider it persuasive.

Yet, at the end of her opinion, Bellis seemed to take aim at Section 230 as perhaps being too broad.

She quoted a ruling from the First Circuit Court of Appeals, which noted that some Section 230 cases, presumably like C.O.'s, are "hard" for courts not because "the legal issues defy resolution," but because Section 230 requires that the court "deny relief to plaintiffs whose circumstances evoke outrage." She then went on to quote an appellate court ruling on a similarly "difficult" Section 230 case that warned "without further legislative action," there is "little" that courts can do "but join with other courts and commentators in expressing concern" with Section 230's "broad scope."

Ars could not immediately reach Snapchat or lawyers representing C.O.'s family for comment.

Senator: Snapchat is a “perfect tool for predators”

Bellis' decision comes a month after a Senate Judiciary Committee hearing grilling Big Tech CEOs over child safety concerns.

During the hearing, Ars noted at the time that US Senator Dick Durbin (D-Ill.) confronted Snap CEO Evan Spiegel with Snapchat's alleged reputation among law enforcement since at least 2017 as a "go-to tool" for pedophiles.

"Did you fail to see it was a perfect tool for predators?" Durbin asked Spiegel.

Durbin also directly confronted Spiegel over lawsuits against Snap dismissed under Section 230, like C.O.'s, asking Spiegel if Snapchat would "have implemented better safeguards if not for [a] shield?"

In response, Spiegel appeared to defend Snapchat's current safeguards, which he said "already make it very difficult" for predators to target young users.

At the hearing, Spiegel offered some assurances, telling the committee that Snapchat responds within 15 minutes to reports of harassment or sexual content and coordinated last year with law enforcement investigations that led to more than 100,000 arrests.

But Spiegel also told the senators that Snap "will be honest about our shortcomings," taking a moment to acknowledge families attending the hearing whose children had been harmed on Snapchat.

Spiegel said that while "no legislation is perfect," Snapchat would prefer if lawmakers would provide some "rules of the road" to combat child safety concerns. According to Spiegel, Snap supports the Kids Online Safety Act (KOSA) for that reason.

Kids Online Safety Act advancing

Durbin's point was that platforms like Snap could be held more accountable if Section 230 immunity wasn't a factor in cases over design features that connect kids to harmful content. Passing a law like KOSA, though strongly criticized for its own flaws, could perhaps provide a path to appease parents by combating known harms to kids while leaving Section 230 protections in place.

Last week, the Senate garnered enough votes to pass KOSA, The Verge reported, marking a "major milestone" for the law, which has for years failed to gain traction with enough lawmakers to advance.

If the Senate indeed follows through and passes KOSA, it will be thanks to updates in its text intended to address concerns that KOSA will empower officials to broadly censor content they dislike online just by deeming it harmful for kids. Now the law more narrowly gives the Federal Trade Commission (FTC) power to protect consumers from harms caused by design features like infinite scrolling or constant notifications—supposedly not because they funnel kids to harmful content but because they addict minors by encouraging them to spend more time on apps.

Longtime KOSA critic the Electronic Frontier Foundation (EFF)—a nonprofit that advocates for digital rights—has said that the latest draft of KOSA doesn't remedy flaws that make it an "unconstitutional censorship bill that continues to empower state officials to target services and online content they do not like."

On its blog, the EFF also said that "this still allows a small group of federal officials appointed by the President to decide what content is dangerous for young people. Placing this enforcement power with the FTC is still a First Amendment problem: no government official, state or federal, has the power to dictate by law what people can read online."

The bill would almost certainly restrict speech, the EFF warned, and likely mandate age verification. Where the prior version of the bill "outlined a wide collection of harms to minors that platforms had a duty to prevent and mitigate through 'the design and operation' of their product," the latest version redefines the "duty of care" to say "that a platform shall 'exercise reasonable care in the creation and implementation of any design feature' to prevent and mitigate" harms to kids.

"This language still means increased liability merely for hosting and distributing otherwise legal content that the government—in this case the FTC—claims is harmful," the EFF's blog said.

Specifically, KOSA requires platforms to limit design features for infinite scrolling, auto-play, rewards for time spent on the platform, push notifications, personalized recommendation systems, in-game purchases, and appearance-altering filters. The EFF argued that some of these design features, like infinite scrolling and auto-play, are protected by the First Amendment because they are content designs.

Evan Greer, director of digital rights group Fight for the Future, told The Verge that KOSA's "duty of care" regarding design features "should be further clarified to apply in a 'content neutral manner.' As we have said for months, the fundamental problem with KOSA is that its duty of care covers content specific aspects of content recommendation systems, and the new changes fail to address that."

While debate continues over KOSA, some tech companies have already embraced the law, including Snap, X, and Microsoft. What's more, Snapchat has "already implemented many" of KOSA's "core provisions," Spiegel told the Senate Judiciary Committee last month, according to a TechCrunch report.

KOSA still has a long way to go before becoming law. It needs "significant support" in the House of Representatives to survive this legislative year, The Washington Post reported. So far, the House hasn't even introduced its version of the bill for consideration, apparently clashing with the Senate over which tech bills to prioritize. But The Post noted that a bipartisan push in the Senate could motivate the House to embrace the bill.
