AI Incident Database

Report 4395

Associated Incidents

Incident 8697 Report
TikTok Algorithms Allegedly Linked to Minors' Exposure to Harmful Content

Seven French families sue TikTok over harmful content that allegedly led to suicides
euronews.com · 2024

The lawyer representing the plaintiffs told Euronews that TikTok's "addictive" design and limited content moderation are harmful to young people.

Seven French families are taking TikTok to court, accusing the social network of "direct damage" to young people's health that led to two of them taking their own lives at the age of 15, their lawyer said. 

Laure Boutron-Marmion, a lawyer representing the Algos Victima collective of families, filed a lawsuit before a judge.

"I'm leveraging the essential principles of French civil liability," she told Euronews Next, adding that under the law, for "any fault committed by another person, you have to make amends".

The combination of the "addictive" design of the application with the lack of content moderation "makes for a truly problematic product," Boutron-Marmion said.

"Virality mixes with harmful content," she added.

TikTok said that parents have been able to control and limit their teenagers' use of the application through 'Family Connection' mode since 2020, and highlighted that there are more than 630 French-speaking moderators on the platform.

"When people carry out a search that includes words such as suicide, they are immediately redirected to a page with dedicated resources, as well as a number for a local helpline or prevention hotline," TikTok told Euronews Next in an email.

The state of health of teenagers

Boutron-Marmion also underlined the vulnerability of children and teenagers, who can be confronted with content promoting self-harm or suicide, for example.

Stéphanie Mistre, the mother of one of the teenagers who took their own life, told French media that her "daughter could still be here today" if it weren't for the application.

Mistre filed a complaint in September 2023 for 'incitement to suicide', 'failure to assist a person in danger', and 'propaganda or advertising of means of committing suicide', which is still under judicial investigation, according to French media.

"The deterioration in the state of health of the teenagers is very, very serious. These are children who have either come close to death or are no longer here," Boutron-Marmion said.

She pointed out that the version of the application deployed in China, Douyin, has safeguards in place for minors' accounts, such as a daily time limit or a digital curfew.

Legal action against social media platforms

"TikTok very directly today is going to be brought before the French court and will have to respond with arguments. After that, of course, it's up to the judges to settle," she said.

Last month, more than a dozen US states sued TikTok for allegedly harming young people's mental health.

ByteDance's application isn't the only one under fire: Meta, the parent company of Facebook and Instagram, and Snapchat have been targeted by legal action regarding the platforms' potential impact on their young users.

