AI Incident Database

Report 1174

Associated Incidents

Incident 66 · 16 Reports
Chinese Chatbots Question Communist Party

Chinese chatbots taken offline after refusing to say they love the Communist Party
theverge.com · 2017

A pair of chatbots have been taken offline in China after failing to show enough patriotism, reports the Financial Times. The two bots were removed from the popular messaging app Tencent QQ after users shared screenshots of their conversations online.

One of the bots, BabyQ, made by the Beijing-based company Turing Robot, was asked, “Do you love the Communist Party?” to which it replied simply, “No.” Another bot, XiaoBing, developed by Microsoft, told users, “My China dream is to go to America.” When quizzed on its patriotism, it dodged the question and replied, “I’m having my period, wanna take a rest.”

In a statement, Tencent said, “The group chatbot services are provided by independent third party companies. We are now adjusting the services which will be resumed after improvements.”

It’s not clear what prompted the bots to give these answers, but it’s likely that they learned the responses from people. When Microsoft’s Tay chatbot went rogue on Twitter last year, spouting racist and extremist views like “Hitler was right I hate the jews,” the blame lay at least partly with internet users, who found they could get Tay to copy whatever they said.

At the time, Microsoft said that Tay was a “machine learning project designed for human engagement” and that “some of its responses are inappropriate and indicative of the types of interactions some people are having with it.” Tay was pulled offline, and Microsoft later introduced an updated version of the bot named Zo.

Whether XiaoBing will return to the Chinese web after a little re-education remains to be seen. We’ve contacted Microsoft to find out more.
