Incident 300: TikTok's "For You" Algorithm Allegedly Abused by Online Personality to Promote Anti-Women Hate

Description: TikTok’s “For You” algorithm was allegedly manipulated by an online personality to artificially boost his content, which promotes extreme misogynistic views to teenage and adult male users, despite that content breaking TikTok’s rules.
Alleged: TikTok developed and deployed an AI system, which harmed TikTok, TikTok male teenager users, TikTok male users, TikTok teenage users, and TikTok users.

Suggested citation format

Lam, Khoa. (2022-01-15) Incident Number 300. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
300
Report Count
2
Incident Date
2022-01-15
Editors
Khoa Lam

Reports Timeline

Incident Reports

How TikTok bombards young men with misogynistic videos

An Observer investigation has revealed how TikTok is promoting misogynistic content to young people despite claiming to ban it.

Videos of the online personality Andrew Tate, who has been criticised by domestic abuse campaigners for normalis…

Inside the violent, misogynistic world of TikTok’s new star, Andrew Tate

Andrew Tate says women belong in the home, can’t drive, and are a man’s property.

He also thinks rape victims must “bear responsibility” for their attacks and dates women aged 18–19 because he can “make an imprint” on them, according to vid…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting as evidence external to the Incident Database. Learn more from the research paper.