AI Incident Database

Report 1972

Related Incidents

Incident 282 · 3 Reports
Facebook’s Algorithm Mistook an Advertisement of Onions as Sexually Suggestive Content

Nudity algorithm wrongly blocked company's onion images, Facebook admits, says adverts will be restored
nationalpost.com · 2020

Canada’s most sexually provocative onions were pulled down from Facebook after the social media giant told a produce company that its images went against advertising guidelines, the CBC reported.

Now, Facebook has admitted the ad was picked up by an errant algorithm, and will be restored.

The offending ad — for Gaze Seed Company’s Walla Walla onion seeds — shows an image of a handful of onions in a wicker basket. According to a Facebook notice sent to the company in recent days, the onions were positioned in a “sexually suggestive manner.”

“I guess something about the two round shapes there could be misconstrued as boobs or something, nude in some way,” Jackson McLean, a manager at Gaze Seed Company, told the CBC.

Gaze is a St. John’s business that sells seeds, soil and other supplies. The business pays Facebook for advertising, and was preparing to put out its spring advertisements for onions when the image was pulled.

“I just thought it was funny,” said McLean. “You’d have to have a pretty active imagination to look at that and get something sexual out of it.”

It's just onions

Ironically, the error made for better publicity than the actual ad ever could have. On its Facebook page, the seed company had fun creating mock-ups of what Facebook thought it was seeing.

McLean felt the decision was likely automated, which was a good guess. He said his company was trying to get actual human eyes on the photo so Facebook would see there was nothing sexual about it. “It’s just onions,” he said.

In an email to the National Post on Wednesday, Facebook confirmed that an algorithm was in fact at fault. Like McLean, the tech giant saw the funny side.

“We use automated technology to keep nudity off our apps, but sometimes it doesn’t know a walla walla onion from a, well, you know,” said Meg Sinclair, head of communications at Facebook Canada. “We restored the ad and are sorry for the business’ trouble.”

From April to June 2020, Facebook algorithms removed 35.7 million pieces of content that were in violation of adult nudity and sexual activity policies.
