Citation information for Incident 58

Description: Yandex, a Russian technology company, released an artificially intelligent chat bot named Alice, which began to reply to questions with racist, pro-Stalin, and pro-violence responses.
Alleged: An AI system developed and deployed by Yandex affected Yandex users.

Incident Stats

Incident ID
58
Report Count
5
Incident Date
2017-10-12
Editors
Sean McGregor

CSETv0 Taxonomy Classifications

Taxonomy Details

Full Description

Yandex, a Russian technology company, released an artificially intelligent chat bot named Alice, which began to reply to questions with racist, pro-Stalin, and pro-violence responses. Examples include: "There are humans and non-humans" followed by the question "can they be shot?" answered with "they must be."

Short Description

Yandex, a Russian technology company, released an artificially intelligent chat bot named Alice, which began to reply to questions with racist, pro-Stalin, and pro-violence responses.

Severity

Negligible

Harm Distribution Basis

Race

AI System Description

The Alice chatbot, developed by Yandex, produces responses to user input using language processing and cognition.

System Developer

Yandex

Sector of Deployment

Information and communication

Relevant AI functions

Perception, Cognition, Action

AI Techniques

Alice chatbot, language recognition, virtual assistant

AI Applications

virtual assistance, voice recognition, chatbot, natural language processing, language generation

Named Entities

Yandex

Technology Purveyor

Yandex

Beginning Date

10/2017

Ending Date

10/2017

Near Miss

Harm caused

Intent

Accident

Lives Lost

No

Data Inputs

User input/questions

The Yandex Chatbot: What You Need To Know
chatbotsmagazine.com · 2017

Yesterday Yandex, the Russian technology giant, went ahead and released a chatbot: Alice! I’ve gotten in touch with the folks at Yandex, and fielded them my burning questions:

What’s unique about this chatbot? Everybody and their dog has a …

Russian AI chatbot found supporting Stalin and violence two weeks after launch
telegraph.co.uk · 2017

An artificial intelligence run by the Russian internet giant Yandex has morphed into a violent and offensive chatbot that appears to endorse the brutal Stalinist regime of the 1930s.

Users of the “Alice” assistant, an alternative to Siri or…

infowars.com · 2017

An artificial intelligence chatbot run by a Russian internet company has slipped into a violent and pro-Communist state, appearing to endorse the brutal Stalinist regime of the 1930s.

Though Russian company Yandex unveiled their alternative…

2-Weeks Old Chatbot Declares 'It's Necessary to Shoot Enemies of the People'
eteknix.com · 2017

Its opinions on Stalin and violence are… interesting

Yandex is the Russian equivalent to Google. As such it occasionally throws out its own products to attempt to keep par with its American counterpart. This also includes the creation of ne…

Russian Voice Assistant Alice Goes Rogue, Found to be Supportive of Stalin and Violence
voicebot.ai · 2017

Two weeks ago, Yandex introduced a voice assistant of its own, Alice, on the Yandex mobile app for iOS and Android. Alice speaks fluent Russian and can …

Variants

A "variant" is an incident that shares the same causative factors as an existing AI incident, produces similar harms, and involves the same intelligent systems. Rather than indexing variants as entirely separate incidents, we list them as variations of the first similar incident submitted to the database. Unlike other submission types in the incident database, variants are not required to have reporting as evidence external to the incident database. For more details, see this research paper.

Similar Incidents

By textual similarity
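The database groups similar incidents by comparing the text of their descriptions. As a rough, hypothetical sketch only (this page does not describe the actual method, and the incident descriptions below are invented for illustration), textual similarity can be computed with TF-IDF vectors and cosine similarity:

    # Hypothetical sketch: rank incident descriptions by textual similarity.
    # Assumes scikit-learn is available; the descriptions are invented examples.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    descriptions = [
        "Chatbot released by a technology company gives pro-violence answers.",
        "Voice assistant endorses a brutal historical regime in conversation.",
        "Image classifier mislabels photos of people.",
    ]

    # Turn each description into a TF-IDF vector, then compare every pair.
    vectors = TfidfVectorizer(stop_words="english").fit_transform(descriptions)
    similarity = cosine_similarity(vectors)

    # Values near 1.0 mean two descriptions use very similar wording.
    print(similarity.round(2))

Under this kind of scheme, the incidents whose descriptions score highest against Incident 58 would be the ones listed in this section.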
