Incident 282: Facebook’s Algorithm Mistook an Advertisement for Onions as Sexually Suggestive Content
Description: Facebook’s content moderation algorithm misidentified a Canadian business’s advertisement containing a photo of onions as overtly sexual content and removed it; the ad was reinstated after review.
Entities
Alleged: Facebook developed and deployed an AI system, which harmed The Seed Company by E.W. Gaze and businesses on Facebook.
Incident Stats
Incident ID
282
Report Count
3
Incident Date
2020-10-03
Editors
Khoa Lam
Incident Reports
Reports Timeline

Canada’s most sexually provocative onions were pulled down from Facebook after the social media giant told a produce company that its images went against advertising guidelines, the CBC reported.
Now, Facebook has admitted the ad was picked…
Are onions naked or clothed in their natural form? Facebook's Artificial Intelligence (AI) seems to be having an issue telling the difference between pictures with sexual connotations referring to the human body and vegetables that are just…
Facebook's AI struggles to tell the difference between sexual pictures of the human body and globular vegetables.
A garden center in Newfoundland, Canada on Monday received a notice from Facebook about an ad it had uploaded for Walla Walla …
Variants
A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.