Incident 343: Facebook, Instagram, and Twitter Failed to Proactively Remove Targeted Racist Remarks via Automated Systems

Description: Facebook's, Instagram's, and Twitter's automated content moderation systems failed to proactively remove racist remarks and posts directed at Black football players after a finals loss, with the platforms allegedly relying largely on user reports of harassment.


Alleged: Facebook, Instagram, and Twitter developed and deployed an AI system, which harmed Marcus Rashford, Jadon Sancho, Bukayo Saka, Facebook users, Instagram users, and Twitter users.

Incident Stats

Incident ID: 343
Report Count
Incident Date
Editors: Khoa Lam


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity
