AI Incident Database

Incident 869: TikTok Algorithms Allegedly Linked to Minors' Exposure to Harmful Content

Description: Seven French families are suing TikTok, alleging its algorithm exposed minors to harmful content promoting self-harm, eating disorders, and suicide. Two teenagers reportedly died by suicide after viewing such content, while others allegedly attempted suicide or developed mental health issues. The case seeks to establish TikTok's legal liability for failing to protect minors from harmful algorithmic content.


Entities

Alleged: TikTok and TikTok recommendation algorithms developed and deployed an AI system, which harmed TikTok users, Seven French families, and Minors using TikTok.
Alleged implicated AI system: TikTok recommendation algorithms

Incident Stats

Incident ID: 869
Report Count: 7
Incident Date: 2024-11-04
Editors: Daniel Atherton
Applied Taxonomies: MIT

MIT Taxonomy Classifications (Machine-Classified)

Risk Subdomain

A further 23 subdomains create an accessible and understandable classification of hazards and harms associated with AI.

1.2. Exposure to toxic content

Risk Domain

The Domain Taxonomy of AI Risks classifies risks into seven AI risk domains: (1) Discrimination & toxicity, (2) Privacy & security, (3) Misinformation, (4) Malicious actors & misuse, (5) Human-computer interaction, (6) Socioeconomic & environmental harms, and (7) AI system safety, failures & limitations.

1. Discrimination and Toxicity

Entity

Which, if any, entity is presented as the main cause of the risk.

AI

Timing

The stage in the AI lifecycle at which the risk is presented as occurring.

Post-deployment

Intent

Whether the risk is presented as occurring as an expected or unexpected outcome from pursuing a goal.

Unintentional

Incident Reports


French families sue TikTok over alleged failure to remove harmful content
reuters.com · 2024

PARIS, Nov 4 (Reuters) - Seven French families have filed a lawsuit against social media giant TikTok, accusing the platform of exposing their adolescent children to harmful content that led to two of them taking their own lives at 15, thei…

French families sue TikTok over harmful content
bbc.com · 2024

TikTok is being sued by seven families in France, who accuse the social media giant of exposing their children to harmful content - leading two to take their own lives.

The case alleges the video platform's algorithm exposed them to content…

French parents whose children took own lives sue TikTok over harmful content
theguardian.com · 2024

Seven French families have filed a lawsuit against TikTok, accusing the platform of exposing their adolescent children to harmful content that led to two of them taking their own lives at 15, their lawyer said.

The lawsuit alleges TikTok’s …

French families sue TikTok over alleged promotion of self-harm content
politico.eu · 2024

PARIS --- Seven French families are hoping to set a precedent by holding TikTok liable for insufficient content moderation they say put their children at risk.

The families, part of a collective called Algos Victima, are suing the social me…

TikTok sued in France over harmful content that allegedly led to two suicides
cnn.com · 2024

Seven French families have filed a lawsuit against social media giant TikTok, accusing the platform of exposing their adolescent children to harmful content that led to two of them taking their own lives at 15, their lawyer said Monday.

The…

France: TikTok taken to court over content policing
dw.com · 2024

Seven French families have filed a lawsuit against TikTok, accusing the social media giant of exposing their children to harmful content that led to two of them taking their own lives at 15, their lawyer said.

According to the lawsuit, TikT…

Seven French families sue TikTok over harmful content that allegedly led to suicides
euronews.com · 2024

The lawyer representing the plaintiffs told Euronews that TikTok's "addictive" design and limited content moderation is harmful to young people.

Seven French families are taking TikTok to court, accusing the social network of "direct damage…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity


  • Defamation via AutoComplete (Apr 2011 · 28 reports)
  • A Tesla Taxi Cab Involved in an Accident in Paris with Twenty Injuries (Dec 2021 · 7 reports)
  • Google’s YouTube Kids App Presents Inappropriate Content (May 2015 · 14 reports)


2024 - AI Incident Database
