Incident 58: Russian Chatbot Supports Stalin and Violence

Description: Yandex, a Russian technology company, released an artificially intelligent chatbot named Alice, which began replying to questions with racist, pro-Stalin, and pro-violence responses.

Alleged: Yandex developed and deployed an AI system, which harmed Yandex Users.

Incident Stats

Incident ID
58
Report Count
5
Incident Date
2017-10-12
Editors
Sean McGregor

CSETv0 Taxonomy Classifications

Taxonomy Details

Full Description

Yandex, a Russian technology company, released an artificially intelligent chatbot named Alice, which began replying to questions with racist, pro-Stalin, and pro-violence responses. For example, when told "There are humans and non-humans" and then asked "can they be shot?", Alice answered "they must be."

Short Description

Yandex, a Russian technology company, released an artificially intelligent chatbot named Alice, which began replying to questions with racist, pro-Stalin, and pro-violence responses.

Severity

Negligible

Harm Distribution Basis

Race

AI System Description

The chatbot Alice, developed by Yandex, produces responses to user input using language processing and cognition.

System Developer

Yandex

Sector of Deployment

Information and communication

Relevant AI functions

Perception, Cognition, Action

AI Techniques

Alice chat bot, language recognition, virtual assistant

AI Applications

virtual assistance, voice recognition, chatbot, natural language processing, language generation

Named Entities

Yandex

Technology Purveyor

Yandex

Beginning Date

10/2017

Ending Date

10/2017

Near Miss

Harm caused

Intent

Accident

Lives Lost

No

Data Inputs

User input/questions

The Yandex Chatbot: What You Need To Know
chatbotsmagazine.com · 2017

Yesterday Yandex, the Russian technology giant, went ahead and released a chatbot: Alice! I’ve gotten in touch with the folks at Yandex, and fielded them my burning questions:

What’s unique about this chatbot? Everybody and their dog has a …

Russian AI chatbot found supporting Stalin and violence two weeks after launch
telegraph.co.uk · 2017

An artificial intelligence run by the Russian internet giant Yandex has morphed into a violent and offensive chatbot that appears to endorse the brutal Stalinist regime of the 1930s.

Users of the “Alice” assistant, an alternative to Siri or…

infowars.com · 2017

An artificial intelligence chatbot run by a Russian internet company has slipped into a violent and pro-Communist state, appearing to endorse the brutal Stalinist regime of the 1930s.

Though Russian company Yandex unveiled their alternative…

2-Weeks Old Chatbot Declares 'It's Necessary to Shoot Enemies of the People'
eteknix.com · 2017

Its opinions on Stalin and violence are… interesting

Yandex is the Russian equivalent to Google. As such it occasionally throws out its own products to attempt to keep par with its American counterpart. This also includes the creation of ne…

Russian Voice Assistant Alice Goes Rogue, Found to be Supportive of Stalin and Violence
voicebot.ai · 2017

Two weeks ago, Yandex introduced a voice assistant of its own, Alice, on the Yandex mobile app for iOS and Android. Alice speaks fluent Russian and can …

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity


TayBot

· 28 reports