Citation record for Incident 282

Description: Facebook’s content moderation algorithm misidentified a Canadian business’s advertisement containing a photo of onions as overtly sexual content and removed it; the ad was later reinstated after review.
Alleged: An AI system developed and deployed by Facebook harmed The Seed Company by E.W. Gaze and businesses on Facebook.

Incident Stats

Incident ID
282
Report Count
3
Incident Date
2020-10-03
Editors
Khoa Lam
Nudity algorithm wrongly blocked company's onion images, Facebook admits, says adverts will be restored
nationalpost.com · 2020

Canada’s most sexually provocative onions were pulled down from Facebook after the social media giant told a produce company that its images went against advertising guidelines, the CBC reported.

Now, Facebook has admitted the ad was picked…

Too Sexy! Facebook’s AI Mistakes Ad for Onions As Nude Content, Blocks for Being ‘Overtly Sexual’
ca.movies.yahoo.com · 2020

Are onions naked or clothed in their natural form? Facebook's Artificial Intelligence (AI) seems to be having an issue telling the difference between pictures with sexual connotations referring to the human body and vegetables that are just…

Facebook's nudity-spotting AI mistook a photo of some onions for 'sexually suggestive' content
businessinsider.com · 2020

Facebook's AI struggles to tell the difference between sexual pictures of the human body and globular vegetables.

A garden center in Newfoundland, Canada on Monday received a notice from Facebook about an ad it had uploaded for Walla Walla …

Variants

A "variant" is an incident that shares the same causative factors as an existing AI incident, produces similar harms, and involves the same intelligent systems. Rather than indexing variants as entirely separate incidents, we list them as variations under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting as evidence external to the Incident Database. See this research paper for more details.

Similar Incidents

By textual similarity
