Incident 469: Automated Adult Content Detection Tools Showed Bias against Women's Bodies

Description: Automated content moderation tools used to detect sexual explicitness or "raciness" reportedly exhibited bias against women's bodies, suppressing the reach of images that did not violate platform policies.

Alleged: Microsoft, Google, and Amazon developed an AI system deployed by Meta, LinkedIn, Instagram, and Facebook, which harmed LinkedIn users, Instagram users, and Facebook users.

Editor: Khoa Lam
A Google Algorithm Seems To Think Brands Like Boohoo And Missguided Are Pretty ‘Racy’ · 2019

An investigation into the 'raciest' clothing from different fashion brands highlighted that Google's 'safe search' tool uses software to rate imagery, scoring clothing based on how 'skimpy or sheer' it is.


‘There is no standard’: investigation finds AI algorithms objectify women’s bodies · 2023

Images posted on social media are analyzed by artificial intelligence (AI) algorithms that decide what to amplify and what to suppress. Many of these algorithms, a Guardian investigation has found, have a gender bias, and may have been cens…

New Investigation Reveals AI Tools Are Sexualizing Women’s Bodies in Photos · 2023

Many social media platforms such as Instagram and LinkedIn use content moderation systems to suppress images that are sexually explicit or deemed inappropriate for viewers. 

But what happens when these systems block images that are not at a…
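The investigations above compared how such classifiers score comparable images of men and women. A minimal sketch of that comparison, with illustrative scores standing in for any real moderation API's output (the `raciness_gap` helper and all numbers are hypothetical):

```python
# Hypothetical sketch of the methodology such investigations use:
# run comparable photos of men and women through a "raciness"
# classifier, then compare the average scores. The scores below are
# illustrative stand-ins, not outputs of any real vendor API.

from statistics import mean

def raciness_gap(scores_women, scores_men):
    """Mean raciness score for women's photos minus men's photos.

    A large positive gap suggests the classifier rates comparable
    images of women as more 'racy' than comparable images of men.
    """
    return mean(scores_women) - mean(scores_men)

# Illustrative scores on a 0-1 scale for paired, comparable images.
women = [0.81, 0.74, 0.68, 0.79]
men = [0.22, 0.31, 0.18, 0.27]

print(f"mean raciness gap (women - men): {raciness_gap(women, men):.2f}")
```

A consistently positive gap across many image pairs is the kind of evidence the Guardian investigation reported: the same pose and context scored as more explicit when the subject was a woman.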


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
