AI Incident Database

Report 1311

Associated Incidents

Incident 72 · 26 Reports
Facebook translates 'good morning' into 'attack them', leading to arrest

Israel arrested a Palestinian after Facebook translated “Good morning” as “Attack them”
qz.com · 2017

Facebook says it automatically translates about 4.5 billion posts each day. With new research and advancements, the technology keeps improving, but it’s not perfect—and its flaws can have serious consequences.

Last week, a Palestinian man working at a construction site in an Israeli settlement in the West Bank posted on Facebook a photo of himself in front of a bulldozer, with the words “Good morning” in Arabic. The platform’s automatic translation mistook the words for “attack them” in Hebrew and “hurt them” in English.

Here’s an explanation from the Israeli daily newspaper Haaretz, which first reported the incident, about the nature of the mistake:

Arabic speakers explained that English transliteration used by Facebook is not an actual word in Arabic but could look like the verb “to hurt” – even though any Arabic speaker could clearly see the transliteration did not match the translation.

Google Translate also mistranslates the phrase, turning it into “become them,” likely because the literal translation of “morning” in Arabic is “day becoming.”

The Jerusalem Post reports that Israeli police received multiple complaints from civilians about the post, and that they acted on it because of previous terrorist attacks involving bulldozers. Reportedly, no Arabic-speaking officer saw the post before the arrest was made. After several hours in custody, the police realized the mistake and released the man.

Facebook introduced its translation tool in 2011, initially relying on Microsoft’s Bing for translations. In late 2015, the platform completed a shift to its own AI translation technology. In 2016, users gained the option to have their statuses automatically translated, making them readable to everyone regardless of the language they were posted in. In August 2017, the platform fully transitioned from phrase-based translation to a more accurate neural network system. The company is developing a more sophisticated neural technology, which it says is nine times faster than its competitors’.

Recently, another case of mistranslation by Facebook resulted in “Mexicanitos,” meant to be a term of endearment, being posted as the slur “wetbacks.” The mother from El Paso who posted the photo of her kids, calling them “Mexicanitos,” complained publicly.

