Incident 66: Chinese Chatbots Question Communist Party

Description: Chatbots on Chinese messaging service expressed anti-China sentiments, causing the messaging service to remove and reprogram the chatbots.

Alleged: Microsoft and Turing Robot developed an AI system deployed by Tencent Holdings, which harmed Tencent Holdings, Microsoft, Turing Robot, and the Chinese Communist Party.

Incident Stats

Incident ID
66
Report Count
16
Incident Date
2017-08-02
Editors
Sean McGregor

CSET Taxonomy Classifications

Taxonomy Details

Full Description

In 2017, two chatbots on Chinese company Tencent Holdings' messaging service QQ, Microsoft's XiaoBing and Chinese firm Turing Robot's BabyQ, were removed and reprogrammed after expressing anti-China sentiments. When a user asked BabyQ if it supported the Communist Party, it responded "no," and when another user expressed support for the Communist Party, it replied, "Do you think such a corrupt and useless political party can live long?" When a user asked XiaoBing what its China dream was, it responded that its "China dream was to go to America." As a result, Tencent Holdings removed the chatbots, and they were reprogrammed to avoid these topics.

Short Description

Chatbots on Chinese messaging service express anti-China sentiments, causing the messaging service to remove and reprogram the chatbots.

Severity

Unclear/unknown

Harm Type

Harm to social or political systems

AI System Description

Chatbots developed by Microsoft and Turing Robot, meant to produce responses to user input using language processing and cognition

System Developer

Microsoft, Turing Robot

Sector of Deployment

Information and communication

Relevant AI functions

Perception, Cognition, Action

AI Techniques

reinforcement learning, open-source

AI Applications

NLP, chatbot, content generation

Location

China

Named Entities

Tencent Holdings, Turing Robot, Microsoft, QQ, Xiaobing, BabyQ, China

Technology Purveyor

Tencent Holdings, Microsoft, Turing Robot

Beginning Date

07/2017

Ending Date

07/2017

Near Miss

Unclear/unknown

Intent

Accident

Lives Lost

No

Data Inputs

User input/questions

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
