AI Incident Database

Report 4398

Associated Incidents

Incident 8697 Report
TikTok Algorithms Allegedly Linked to Minors' Exposure to Harmful Content

French families sue TikTok over alleged promotion of self-harm content
politico.eu · 2024

PARIS --- Seven French families are hoping to set a precedent by holding TikTok liable for insufficient content moderation they say put their children at risk.

The families, part of a collective called Algos Victima, are suing the social media platform whose parent company ByteDance is based in Beijing. They accuse it of promoting content tied to self-mutilation, suicide or eating disorders.

"Our challenge is to see TikTok held accountable for its lack of moderation, which makes the service defective," said Laure Boutron-Marmion, the lawyer for the collective, confirming the civil lawsuit first reported by Franceinfo.

The collective includes the families of seven teenage girls, two of whom died by suicide. In September 2023, the family of 15-year-old Marie filed criminal charges against TikTok after her death, accusing the platform of "inciting suicide," "failure to assist a person in danger," and "promoting and advertising methods for self-harm."

TikTok's algorithm, Boutron-Marmion said, had trapped Marie in a bubble of toxic content linked to bullying she experienced because of her weight.

The case draws on a British precedent, in which a coroner concluded that Molly Russell, a 14-year-old who took her own life after scrolling on Pinterest and Instagram, had been systematically exposed to graphic content depicting self-harm and suicide.

In the United States, current and former Presidents Joe Biden and Donald Trump have spoken out in favor of repealing Section 230 of the 1996 Communications Decency Act, which protects companies from being sued for user-posted content housed on their platforms.

Democratic candidate Kamala Harris has not taken a public stance, but in 2018 voted for a bill that created an exception to Section 230 to hold platforms accountable for sex trafficking.

With this new civil suit, Algos Victima aims to bring TikTok's algorithm and business model under scrutiny in court.

In response to POLITICO, TikTok's French division said it had not been in contact with anyone regarding this case and declined to comment on ongoing legal proceedings. TikTok prohibits "exposing, promoting, or sharing plans for suicide or self-harm," the company added.

For TikTok, which claims to have 21 million users in France, the lawsuit risks setting a precedent on the platform's responsibility for user-generated content that can have severe consequences for those exposed to it.

Over the summer, French President Emmanuel Macron opposed a blanket ban on TikTok but called for "open academic research that can really say what's under the hood" of the app's algorithm.
