Incident 382: Instagram's Exposure of Harmful Content Contributed to Teenage Girl’s Suicide

Description: A coroner in the UK ruled that Instagram contributed to the death of a teenage girl, allegedly through its exposure and recommendation of suicide, self-harm, and depressive content.

Alleged: Instagram developed and deployed an AI system, which harmed Molly Rose Russell, the Russell family, teenage girls, and teenagers.

Incident Stats

Incident ID: 382
Report Count: 1
Incident Date: 2017-11-21
Editors: Khoa Lam

Incident Reports

British Ruling Pins Blame on Social Media for Teenager’s Suicide
nytimes.com · 2022

Sitting in the witness box of a small London courtroom this week, a Meta executive faced an uncomfortable question: Did her company contribute to the suicide of a 14-year-old named Molly Russell?

Videos and images of suicide, self-harm and …

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.