AI Incident Database

Report 1163

Associated Incidents

Incident 661 · 6 reports
Chinese Chatbots Question Communist Party

Off-messenger: Chinese chatbot ain't no commie
theregister.co.uk · 2017

Two chatbots have reportedly been removed from Chinese messaging app QQ after issuing distinctly unpatriotic answers.

According to the Financial Times, chatbots BabyQ and Xiaobing (or Xiaoice) had been available to some of the 800 million users of Tencent's app QQ until Wednesday.

However, the pair appear to have been hastily removed after they started spitting out answers that might have made for uncomfortable reading for the Chinese government.

Before it was pulled from QQ, Microsoft's Xiaobing reportedly told users that its "China dream is to go to America".

Later on – after the bots had been disappeared from the site – a test version of BabyQ, available on its developer Turing Robot's website, was asked whether it loved the Communist Party. Its answer? "No."

Although Xiaobing's AI is probably not advanced enough to have sensed that its earlier pro-USA stance had rocked the boat, it avoided the question entirely.

"I'm having my period, wanna take a rest," it is said to have responded.

A statement from Tencent to the FT said it was "adjusting" the services provided by the group chatbots, which it stressed were "provided by independent third-party companies" and would be "resumed after improvements".

The Reg contacted Microsoft and Turing Robot to confirm the chatbots had been yanked from the platform, but neither had responded by the time this article was published.

Microsoft has past experience of chatbots being removed from social media platforms after going rogue, although things had to get a lot worse before its Tay chatbot was booted off Twitter.

Last March, within hours of being introduced to the delightful world of Twitter, Tay had descended into a racist, sexist troll, going from saying it was "stoked to meet u" to informing users: "I fucking hate feminists and they should all die and burn in hell."

If Xiaobing has had greater success – it was launched back in 2014 and has around 40 million users in China and Japan – this has been put down to the tight grip Beijing has on its social media platforms.

As Lili Cheng, distinguished engineer and general manager of Future Social Experiences Labs at Microsoft, told The Register last year: "Twitter has a lot of trolls... Even if negative, America strongly believes in free speech, which is included in its constitution. In China, however, there is less freedom as the government controls the internet and goes as far as censoring particular words online."

It seems Xiaobing and BabyQ prove that tradition is alive and well. ®

