Incident 382: Instagram's Exposure of Harmful Content Contributed to Teenage Girl’s Suicide

Description: A UK judge ruled that Instagram contributed to the death of a teenage girl through its exposure to and recommendation of suicide, self-harm, and depressive content.

Alleged: Instagram developed and deployed an AI system, which harmed Molly Rose Russell, the Russell family, teenage girls, and teenagers.

Incident Stats

Incident ID: 382
Report Count:
Incident Date:
Editor: Khoa Lam

Incident Reports

British Ruling Pins Blame on Social Media for Teenager’s Suicide (2022)

Sitting in the witness box of a small London courtroom this week, a Meta executive faced an uncomfortable question: Did her company contribute to the suicide of a 14-year-old named Molly Russell?

Videos and images of suicide, self-harm and …


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.