Incident 216: WeChat’s Machine Translation Gave a Racist English Translation for the Chinese Term for “Black Foreigner”

Description: The Chinese platform WeChat provided an inappropriate and racist English translation for the Chinese term for “black foreigner” in its messaging app.
Alleged: WeChat developed and deployed an AI system, which harmed Black WeChat users.

Suggested citation format

Dickinson, Ingrid. (2017-10-10) Incident Number 216. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
216
Report Count
5
Incident Date
2017-10-10
Editors
Khoa Lam

Tools

New Report
Discover

Incidents Reports

Chinese messaging app WeChat has apologized for an error in its algorithm that provided the N-word as a translation for a neutral Chinese term for black foreigners.

“We’re very sorry for the inappropriate translation,” a WeChat spokesperson told Sixth Tone. “After receiving users’ feedback, we immediately fixed the problem.”

The issue was discovered by Shanghai-based theater director Ann James, who texted her Chinese colleagues in their messaging group this morning to say she was running late. As usual, the black American typed in English, using WeChat’s in-app translation feature to read the Chinese responses. She gasped when she saw that the next message contained a racist obscenity: “The n----- is late.”

“I was just horrified,” James told Sixth Tone. Yet she doubted that her colleague would use such a slur, and another friend confirmed that the original Chinese message used a neutral term: hei laowai, or “black foreigner.”

A local English-language media outlet, That’s Shanghai, reported the story and found that the translator gave neutral translations in some instances but used the slur when the phrase in question included a negative term, such as “late” or “lazy.” Sixth Tone’s own testing on Wednesday evening found similar results.

While testing the translation for ‘hei laowai’ on an Apple device, in some cases the phrase is translated as ‘foreigner’ or ‘black foreigner.’ But in other cases — when combined with a negative adjective, for example — the phrase is rendered as the N-word.

WeChat, an app with an estimated 1 billion active users, is ubiquitous in China, not only as a messaging tool but also as a cashless payment provider and a social media and online publishing platform. The company behind it, Tencent, is now the world’s 10th most valuable public firm, worth $275 billion, according to The Economist. But as Chinese technology firms expand globally, their cross-cultural aptitude will be put to the test.

The Chinese colleague who had sent the original message was shocked, but the theater director reassured her. “I said, ‘No problem, I know it’s not you — it’s something in the programming,’” James recalled, though she questions how the algorithm came to present such a profanity in the first place: “Why is that word even in the translator?”

The spokesperson from WeChat explained that the app used neural machine translation, though the engine was constantly being refined to provide “more accurate, faithful, expressive, and elegant” results.

Many modern translation apps take advantage of big data sets and machine learning techniques, basing their translations on existing usage without a human filter. But such processes can introduce a risk of artificial intelligence picking up offensive associations and language. In 2016, Twitter users taught an AI account to enthusiastically support Hitler, and a report that examined software used by U.S. courts claims it picked up the racial bias of institutions whose data it used.

James, who recently played a role in the Chinese blockbuster “Wolf Warrior 2,” found the translation issue disheartening but unsurprising. “If you’re a black person in China, you’ve come up against some craziness,” she says, explaining that she was often touched and photographed in public without her consent even before her film appearance.

In 2016, a Chinese laundry detergent advertisement was widely criticized for showing a black man “washed” into a light-skinned Chinese man. And on Saturday, international skin care and cosmetics brand Dove apologized for a similar gaffe.

“I know there’s a lot of curiosity and a lot of ignorance about black people [in China],” said James, who was quick to emphasize that she loves the country she’s called home for five years. “I just think that we need to have more open discussion between Chinese people and black people.”

WeChat Apologizes for Translating ‘Black Foreigner’ as N-Word

The translation service in China’s biggest messaging app, WeChat, is being retooled after offering a racist slur as a translation for the phrase “black foreigner.”

Ann James, a black theater director based in Shanghai, messaged her colleagues in English on Wednesday to say she was running late. When a coworker replied in Chinese, WeChat translated their message into English as “The nigger is late.” As Sixth Tone explains, “hei laowai,” the term the coworker actually used, is a neutral phrase meaning “black foreigner.” But until the issue was raised following James’ rude awakening, WeChat sometimes translated it as the n-word.

WeChat sent Sixth Tone the following apology, but gave no further explanation: “We’re very sorry for the inappropriate translation. After receiving users’ feedback, we immediately fixed the problem.” The platform boasts a staggering 700 million users worldwide and, in China, is used for everything from booking plane tickets to paying utility bills to office communications.

WeChat confirmed their software uses neural machine translation, AI that’s been trained on vast quantities of text to gain new vocabulary and, crucially, discern the specific contexts in which to use these new words. That second part may be what triggered the slur. From Sixth Tone:

A local English-language media outlet, That’s Shanghai, reported the story and found that the translator gave neutral translations in some instances but used the slur when the phrase in question included a negative term, such as “late” or “lazy.” Sixth Tone’s own testing on Wednesday evening found similar results.

Recognizing patterns is the core of language AI. Neural language processing AI picks up on patterns between associated words, then spits them back out. In 2016, for example, researchers used algorithms trained on Google News copy to uncover associations the news crawler was picking up. As the algorithm determined, “Emily” is to “Ebony” as “pancakes” are to “fried chicken.” In another case, it found “man” is to “woman” as “doctor” is to “nurse.”

This is essentially how you “teach” AI to be racist. AI literalizes negative connotations. If the phrases “black foreigner” or “black person” are used as slurs in conjunction with words like “lazy” or “slow” in the source text, the AI picks up on those patterns and makes them explicit. All the AI does is repeat the associations buried in its source.
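The association mechanism described above can be illustrated with a toy example. The sketch below solves the analogy “man is to woman as doctor is to ?” by vector arithmetic over word embeddings, the technique the Google News study used. The three-dimensional vectors are hypothetical values invented for illustration; real models learn hundreds of dimensions from billions of words of text, which is where the biased associations come from.

```python
import numpy as np

# Hypothetical toy embeddings, hand-picked to illustrate the mechanism.
# Real embeddings are learned from co-occurrence patterns in large corpora.
vectors = {
    "man":    np.array([ 1.0, 0.2, 0.1]),
    "woman":  np.array([-1.0, 0.2, 0.1]),
    "doctor": np.array([ 1.0, 0.9, 0.8]),
    "nurse":  np.array([-1.0, 0.9, 0.8]),
    "table":  np.array([ 0.0, -0.9, 0.3]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' via b - a + c, excluding the inputs."""
    target = vectors[b] - vectors[a] + vectors[c]
    candidates = [w for w in vectors if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(vectors[w], target))

print(analogy("man", "woman", "doctor"))  # → nurse
```

The model has no notion of whether an association is fair or offensive; it simply returns whatever word sits nearest the computed point. If a slur co-occurs with certain phrases in the training text, the same arithmetic will surface it.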

Interestingly, the derogatory translations seem to have been provided by an unspecified service, while the inoffensive translations were explicitly performed by Microsoft Translator. That’s Shanghai couldn’t replicate the offensive translation in either Bing Translator or Microsoft’s Neural Machine Translation system. “Hei laowai,” even when coupled with words like “lazy” or “rude,” still produced “black foreigner.” One could infer that Microsoft’s platform either removed the slur already or that association had never been made.

Neural language processing was invented so AI could speak and think more like humans. Sadly, it’s learning the worst of what we offer.

China's Most Popular App Apologizes After Translating 'Black Foreigner' as the N-Word

China’s most popular chat app has apologised after its software used the N-word to translate a Chinese phrase that commonly means “black foreigner”.

WeChat, which has almost 900 million users, blamed the use of the racial slur on an error in the artificial intelligence software that translates between Chinese and English.

The issue was first noticed by an American living in Shanghai, Ann James, when her friend discussed being late in Chinese in a group chat. James used WeChat’s built-in translation feature, which produced the message: “The nigger is late.”

“If you’re a black person in China, you’ve come up against some craziness,” James told the news website Sixth Tone, adding that she is frequently touched and photographed in public. “I know there’s a lot of curiosity and a lot of ignorance about black people.”

In subsequent tests, users found the app used the racial slur almost exclusively in negative contexts, including with the words late, lazy, and thief. But in many neutral sentences the phrase – hei laowai in Chinese – was translated into English as its literal meaning of “black foreigner”.

“We’re very sorry for the inappropriate translation,” a WeChat spokesman told local media. “After receiving users’ feedback, we immediately fixed the problem.” Tests by the Guardian showed the translation software had been retooled and no longer produced racial slurs.

The company uses AI and machine learning, feeding computers huge amounts of data to train them to pick the best translations based on context. But the process also removes human oversight, which can lead to incorrect and even offensive words being used.

This is not the first time Chinese companies and media have been accused of being tone deaf when it comes to race. Han people constitute approximately 92% of the population of China, and most of the country’s ethnic minorities live in the far west, away from the populated cities along the eastern coast.

Last year a television advert for laundry detergent showed a black man covered in paint going into a washing machine and coming out as a sparkling Asian man. The video went viral around the world and caused outrage for its insensitive messaging.

Over the summer China’s state news agency published a video during a border standoff with India featuring an offensive parody of a Sikh man, complete with a turban and fake beard.

China's WeChat app translates 'black foreigner' to N-word

Chinese messaging app WeChat has apologised after its software used the N-word as an English translation for the Chinese for "black foreigner".

The company blamed its algorithms for producing the error.

It was spotted by Ann James, a black American living in Shanghai, when she texted her Chinese colleagues to say she was running late.

Ms James, who uses WeChat's translation feature to read Chinese responses, got the reply: "The [racial slur] is late."

Horrified, she checked the Chinese phrase - "hei laowai" - with a co-worker and was told it was a neutral expression, not a profanity.

WeChat acknowledged the error to China-focused news site Sixth Tone, saying: "We're very sorry for the inappropriate translation. After receiving users' feedback, we immediately fixed the problem."

The app's software uses artificial intelligence that has been fed huge reams of text to help it pick the best translations.

These are based on context, so it sometimes uses insulting phrases when talking about negative events.

Local outlet That's Shanghai tested the app, and found that when used to wish someone happy birthday, the phrase "hei laowai" was translated as "black foreigner". But when a sentence included negative words like "late" or "lazy," it produced the racist insult.

Almost a billion people use WeChat, which lets users play games, shop online, and pay for things as well as sending messages. It resembles another popular chat app, WhatsApp, but is subject to censorship.

A research group at the University of Toronto analysed the terms blocked on WeChat in March, and found they included "Free Tibet", "Down with the Communist Party", and many mentions of Nobel laureate Liu Xiaobo, who was China's most prominent human rights advocate.

WeChat translates 'black foreigner' into racial slur

WeChat, the Chinese messaging app, has apologised for translating “black foreigner” into the N-word.

It was noticed by Ann James, a black American director and actor who featured in China’s highest-grossing film ever, this summer’s Wolf Warrior 2.

Ms James recently texted Chinese colleagues to tell them she was running late. When Ms James translated their Chinese response into English using WeChat’s translation feature, it read: “The [N-word] is late.”

“I was just horrified”, Ms James, who has lived in China for five years, told news site Sixth Tone.

But a Chinese colleague assured Ms James that the original Chinese used – “hei laowai” – was a neutral phrase.

Local news outlet That’s Shanghai tested the app. It found that in some sentences the phrase “black foreigner” was translated neutrally, but when the phrase was used in a negative context, the app translated it into the N-word.

WeChat admitted the error.

"We're very sorry for the inappropriate translation”, a spokesperson told Sixth Tone.

“After receiving users' feedback, we immediately fixed the problem."

The app's translation software uses artificial intelligence. It learns how to use language in context by analysing huge volumes of material, which is why it may choose insulting language to translate negative sentences.

Ms James questioned why WeChat included material containing the N-word in its machine learning process.

“Why is that word even in the translator?”

Nearly a billion people use WeChat for chatting, shopping and gaming. It is censored by the Chinese government. WhatsApp, another messaging app which WeChat resembles, has been blocked in China.

WeChat’s parent company, Tencent, is worth $275 billion, making it the world’s tenth most valuable public company, according to The Economist.

WeChat translates 'black foreigner' into the N-word