Incident 215: Facebook Content Moderators Demand Better Working Conditions Due to Allegedly Inadequate AI Content Moderation

Description: Content moderators and employees at Facebook demand better working conditions, as the automated content moderation system allegedly failed to achieve sufficient performance and exposed human reviewers to psychologically hazardous content such as graphic violence and child abuse.

Alleged: Facebook developed and deployed an AI system, which harmed Facebook content moderators.

Incident Stats

Incident ID: 215
Report Count: 1
Incident Date: 2020-04-01
Editors: Khoa Lam

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.