Incident 268: Permanent Removal of Social Media Content via Automated Tools Allegedly Prevented Investigative Efforts

Description: Automated, permanent removal of violating social media content related to terrorism, violent extremism, and hate speech, without archival, allegedly prevented its potential use in investigating serious crimes and hampered criminal accountability efforts.


Incident Stats

Incident ID: 268
Report Count:
Incident Date:
Editor: Khoa Lam

Incident Reports

Reports Timeline

Social Media Platforms Remove War Crimes Evidence · 2020

Social media platforms are taking down online content they consider terrorist, violently extremist, or hateful in a way that prevents its potential use to investigate serious crimes, including war crimes, Human Rights Watch said in a report…

“Video Unavailable” · 2020

In recent years, social media platforms have been taking down online content more often and more quickly, often in response to government demands, but in a way that prevents the use of that content to investigate people suspected of…


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of an incident under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.