Incident 282: Facebook’s Algorithm Mistook an Advertisement for Onions as Sexually Suggestive Content

Description: Facebook’s content moderation algorithm misidentified a Canadian business’s advertisement containing a photo of onions as overtly sexual content and removed it; the ad was reinstated after review.
Alleged: Facebook developed and deployed an AI system, which harmed The Seed Company by E.W. Gaze and businesses on Facebook.

Suggested citation format

Giallella, Thomas. (2020-10-03) Incident Number 282. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 282
Report Count: 3
Incident Date: 2020-10-03
Editors: Khoa Lam

Incident Reports

Canada’s most sexually provocative onions were pulled down from Facebook after the social media giant told a produce company that its images went against advertising guidelines, the CBC reported.

Now, Facebook has admitted the ad was picked up by an errant algorithm, and will be restored.

The offending ad — for Gaze Seed Company’s Walla Walla onion seeds — shows an image of a handful of onions in a wicker basket. According to a Facebook notice sent to the company in recent days, the onions were positioned in a “sexually suggestive manner.”

“I guess something about the two round shapes there could be misconstrued as boobs or something, nude in some way,” Jackson McLean, a manager at Gaze Seed Company, told the CBC.

Gaze is a St. John’s business that sells seeds, soil and other supplies. The business pays Facebook for advertising, and was preparing to put out its spring advertisements for onions when the image was pulled.

“I just thought it was funny,” said McLean. “You’d have to have a pretty active imagination to look at that and get something sexual out of it.”

It's just onions

Ironically, the error made for better publicity than the actual ad ever could have. On its Facebook page, the seed company had fun creating mock-ups of what Facebook thought it was seeing.

McLean felt the decision was likely automated, which was a good guess. He said his company was seeking to get actual human eyes on the photo so a reviewer could see there was nothing sexual involved. “It’s just onions,” he said.

And in an email to the National Post on Wednesday, Facebook explained that an algorithm was in fact at fault. Like McLean, the tech giant saw the funny side.

“We use automated technology to keep nudity off our apps, but sometimes it doesn’t know a walla walla onion from a, well, you know,” said Meg Sinclair, head of communications at Facebook Canada. “We restored the ad and are sorry for the business’ trouble.”

From April to June 2020, Facebook’s algorithms removed 35.7 million pieces of content that violated its adult nudity and sexual activity policies.

Nudity algorithm wrongly blocked company's onion images, Facebook admits, says adverts will be restored

Facebook's AI struggles to tell the difference between sexual pictures of the human body and globular vegetables.

A garden center in Newfoundland, Canada on Monday received a notice from Facebook about an ad it had uploaded for Walla Walla onion seeds that contained a photo of some onions.

Facebook's notice said the ad broke its rules on "products with overtly sexual positioning," clarifying: "listings may not position products or services in a sexually suggestive manner."

Facebook on Wednesday told Canada's CBC News the ad had been reinstated after review. The mistake had been made by its AI moderation tech, which automatically takes down content it thinks contains nudity, it said.

"We use automated technology to keep nudity off our apps. But sometimes it doesn't know a walla walla onion from a, well, you know. We restored the ad and are sorry for the business' trouble," Meg Sinclair, Facebook Canada's head of communications told CBC.

She did not clarify what she meant by a "you know."

This is not the first time Facebook's automated systems have over-zealously removed content later reinstated by human moderators. In 2018 its systems took down a post containing excerpts from the Declaration of Independence after it flagged the post as containing hate speech.

Facebook's nudity-spotting AI mistook a photo of some onions for 'sexually suggestive' content

Are onions naked or clothed in their natural form? Facebook's Artificial Intelligence (AI) seems to be having trouble telling the difference between sexually suggestive pictures of the human body and vegetables that are simply round in shape. A garden centre in Newfoundland, Canada, on Monday received a notice from Facebook about an ad it had uploaded for Walla Walla onion seeds, which contained photos of some onions. Facebook's notice said the ad broke its rules on "products with overtly sexual positioning," clarifying: "listings may not position products or services in a sexually suggestive manner."

The Seed Company by E.W. Gaze shared pictures of the onions with the caption, "So we just got notified by Facebook that the photo used for our Walla Walla Onion seed is 'Overtly Sexual' and therefore cannot be advertised to be sold on their platform...Can you see it?" CBC News quoted Jackson McLean, a manager at the Newfoundland-based firm, as saying, "I guess something about the two round shapes there could be misconstrued as boobs or something, nude in some way. You’d have to have a pretty active imagination to look at that and get something sexual out of it." He later asked the site to review the ban but did not receive a reply.

So we just got notified by Facebook that the photo used for our Walla Walla Onion seed is "Overtly Sexual" and therefore cannot be advertised to be sold on their platform... 😂 Can you see it?

Posted by The Seed Company by E.W. Gaze on Saturday, October 3, 2020

Meg Sinclair, Facebook Canada's head of communications, told CBC, "We use automated technology to keep nudity off our apps. But sometimes it doesn't know a walla walla onion from a, well, you know. We restored the ad and are sorry for the business' trouble." Facebook's automated systems have removed similar content in the past, only for it to be reinstated by human moderators. In 2018, they took down a post containing excerpts from the Declaration of Independence after flagging it as hate speech.

Too Sexy! Facebook’s AI Mistakes Ad for Onions As Nude Content, Blocks It for Being ‘Overtly Sexual’
