AI Incident Database

Report 1925

Associated Incidents

Incident 2165 Report
WeChat’s Machine Translation Gave a Racist English Translation for the Chinese Term for “Black Foreigner”

WeChat translates 'black foreigner' into racial slur
bbc.com · 2017

Chinese messaging app WeChat has apologised after its software used the N-word as an English translation for the Chinese for "black foreigner".

The company blamed its algorithms for producing the error.

It was spotted by Ann James, a black American living in Shanghai, when she texted her Chinese colleagues to say she was running late.

Ms James, who uses WeChat's translation feature to read Chinese responses, got the reply: "The [racial slur] is late."

Horrified, she checked the Chinese phrase - "hei laowai" - with a co-worker and was told it was a neutral expression, not a profanity.

WeChat acknowledged the error to China-focused news site Sixth Tone, saying: "We're very sorry for the inappropriate translation. After receiving users' feedback, we immediately fixed the problem."

The app's software uses artificial intelligence that has been trained on huge reams of text to help it pick the best translations.

Because those translations are chosen based on context, the system sometimes selects insulting phrases when describing negative events.

Local outlet That's Shanghai tested the app, and found that when used to wish someone happy birthday, the phrase "hei laowai" was translated as "black foreigner". But when a sentence included negative words like "late" or "lazy," it produced the racist insult.
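The pattern That's Shanghai observed can be sketched with a toy model. This is purely illustrative and not WeChat's actual system, which is a neural network; the candidate list, co-occurrence counts, and function name below are all invented. The sketch shows how a translator that scores each candidate by how often it co-occurred with the surrounding words in biased training text will surface the slur exactly when negative context words appear:

```python
# Illustrative sketch only: hypothetical co-occurrence counts a model
# might have absorbed from biased training text.
# Maps candidate translation -> {context word: co-occurrence count}.
LEARNED_COUNTS = {
    "black foreigner": {"birthday": 50, "happy": 40, "late": 5, "lazy": 2},
    "<racial slur>":   {"birthday": 1,  "happy": 1,  "late": 30, "lazy": 25},
}

def translate_hei_laowai(context_words):
    """Pick the candidate translation with the highest total
    co-occurrence score against the surrounding words."""
    def score(candidate):
        counts = LEARNED_COUNTS[candidate]
        return sum(counts.get(word, 0) for word in context_words)
    return max(LEARNED_COUNTS, key=score)

# Positive context: score 90 vs 2, so the neutral phrase wins.
print(translate_hei_laowai(["happy", "birthday"]))  # black foreigner
# Negative context: score 5 vs 30, so the slur wins.
print(translate_hei_laowai(["late"]))               # <racial slur>
```

The point of the sketch is that neither output is hard-coded: both follow mechanically from skewed statistics in the training data, which matches the reported behaviour of the phrase flipping with sentiment.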

Almost a billion people use WeChat, which lets users play games, shop online, and pay for things as well as sending messages. It resembles another popular chat app, WhatsApp, but is subject to censorship.

A research group at the University of Toronto analysed the terms blocked on WeChat in March, and found they included "Free Tibet", "Down with the Communist Party", and many mentions of Nobel laureate Liu Xiaobo, who was China's most prominent human rights advocate.

