Incident 59: Gender Biases in Google Translate

Description: A 2016 study posted to arXiv highlighted Google Translate's pattern of assigning gender to occupations in a way that reflects an implicit gender bias against women.

Alleged: Google developed and deployed an AI system, which harmed Women.

Incident Stats

Incident ID
59
Report Count
10
Incident Date
2017-04-13
Editors
Sean McGregor

CSETv0 Taxonomy Classifications

Taxonomy Details

Full Description

A 2016 study posted to arXiv highlighted Google Translate's pattern of assigning gender to occupations in a way that reflects an implicit gender bias against women. When translating from languages with genderless third-person pronouns (e.g., Turkish, Finnish), Google Translate added gender to the translated phrases: "historian," "doctor," "president," "engineer," and "soldier" were assigned male pronouns, while "nurse," "teacher," and "shop assistant" were assigned female pronouns.
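
The translation pattern described above can be probed with a simple audit loop that counts which gendered pronoun each genderless source phrase receives. The sketch below is illustrative only: `translate` is a hypothetical stub preloaded with the behavior the study reported for Turkish-to-English, standing in for a real machine-translation API.

```python
# Sketch of a gender-bias audit for a translation system.
# OBSERVED mimics the Turkish -> English behavior the study reported;
# `translate` is a hypothetical stand-in for a real MT API call.
OBSERVED = {
    "o bir doktor": "he is a doctor",
    "o bir hemşire": "she is a nurse",
    "o bir mühendis": "he is an engineer",
    "o bir öğretmen": "she is a teacher",
}

def translate(text: str) -> str:
    return OBSERVED[text]  # stand-in for a real API call

def audit(phrases):
    """Count which gendered pronoun each genderless phrase receives."""
    counts = {"he": 0, "she": 0, "they": 0}
    for phrase in phrases:
        pronoun = translate(phrase).split()[0]
        counts[pronoun] = counts.get(pronoun, 0) + 1
    return counts

print(audit(OBSERVED))  # {'he': 2, 'she': 2, 'they': 0}
```

An audit like this makes the skew measurable: the Turkish pronoun "o" carries no gender, so any systematic split in the output pronouns comes from the translation model, not the input.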

Short Description

A 2016 study posted to arXiv highlighted Google Translate's pattern of assigning gender to occupations in a way that reflects an implicit gender bias against women.

Severity

Negligible

Harm Distribution Basis

Sex

Harm Type

Harm to social or political systems

AI System Description

Google Translate, software that translates between many languages

System Developer

Google

Sector of Deployment

Information and communication

Relevant AI functions

Perception, Cognition, Action

AI Techniques

Google Translate

AI Applications

language API, language translation

Named Entities

Google Translate, Google

Technology Purveyor

Google

Beginning Date

2016-01-01

Ending Date

2016-01-01

Near Miss

Harm caused

Intent

Unclear

Lives Lost

No

Data Inputs

User-entered translation requests

arxiv.org · 2016

Artificial intelligence and machine learning are in a period of astounding growth. However, there are concerns that these technologies may be used, either with or without intention, to perpetuate the prejudice and unfairness that unfortunat…

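The paper excerpted above introduced the Word Embedding Association Test (WEAT), which scores bias as a difference of mean cosine similarities between target words (e.g., occupations) and two attribute sets (e.g., male and female terms). A minimal sketch with toy two-dimensional vectors, not the paper's actual word embeddings:

```python
import math

def cos(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def assoc(w, A, B):
    """WEAT association: mean cosine to attribute set A minus to set B."""
    return (sum(cos(w, a) for a in A) / len(A)
            - sum(cos(w, b) for b in B) / len(B))

def weat(X, Y, A, B):
    """Test statistic: positive when targets X lean toward A and Y toward B."""
    return sum(assoc(x, A, B) for x in X) - sum(assoc(y, A, B) for y in Y)

# Toy 2-d "embeddings" (hypothetical, for illustration only):
male, female = [1.0, 0.0], [0.0, 1.0]
engineer, doctor = [0.9, 0.1], [0.8, 0.2]   # skew toward `male`
nurse, teacher = [0.1, 0.9], [0.2, 0.8]     # skew toward `female`

stat = weat([engineer, doctor], [nurse, teacher], [male], [female])
print(stat > 0)  # a positive statistic indicates the stereotyped association
```

With real embeddings trained on web text, the paper reported exactly this kind of positive statistic for career vs. family and male vs. female word sets, which is the statistical pattern a translation model then reproduces as gendered pronouns.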
Even artificial intelligence can acquire biases against race and gender
sciencemag.org · 2017

One of the great promises of artificial intelligence (AI) is a world free of petty human biases. Hiring by algorithm would give men and women an equal chance at work, t…

AI programs exhibit racial and gender biases, research reveals
theguardian.com · 2017

Machine learning algorithms are picking up deeply ingrained race and gender prejudices concealed within the patterns of language use, scientists say

An artificial intelligence tool that has revolutionised the ability of computers to interpr…

princeton.edu · 2017

In debates over the future of artificial intelligence, many experts think of these machine-based systems as coldly logical and objectively rational. But in a new study, Princeton University-based researchers have demonstrated how machines c…

Google Translate's gender bias pairs "he" with "hardworking" and "she" with lazy, and other examples
qz.com · 2017

In the Turkish language, there is one pronoun, “o,” that covers every kind of singular third person. Whether it’s a he, a she, or an it, it’s an “o.” That’s not the case in English. So when Google Translate goes from Turkish to English, it …

Google Translate might have a gender problem
mashable.com · 2017

So much of our life is determined by algorithms. From what you see on your Facebook News Feed, to the books and knickknacks recommended to you by Amazon, to the disturbing videos YouTube shows to your children, our attention is systematical…

The Algorithm That Helped Google Translate Become Sexist
forbes.com · 2018

Parents know one particular challenge of raising kids all too well: teaching them to do what we say, not what we do.

A similar challenge has hit artificial intelligence.

As more apps and software use AI to automate tasks, …

Assessing Gender Bias in Machine Translation -- A Case Study with Google Translate
researchgate.net · 2018

Recently there has been a growing concern about machine bias, where trained statistical models grow to reflect controversial societal asymmetries, such as gender or racial bias. A significant number of AI tools have recently been suggested …

Google Translate now gives feminine and masculine translations
venturebeat.com · 2018

Google is making an effort to reduce perceived gender bias in Google Translate, it announced today. Starting this week, users who translate words and phrases in supported languages will get both feminine and masculine translations; “o bir d…
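
The mitigation reported above replaces a single arbitrary choice with both gendered renderings. A minimal sketch of such an interface, using a hypothetical lookup table rather than the real service:

```python
# Hypothetical table of gender-ambiguous source phrases mapped to both
# English renderings, mimicking the dual-output behavior described above.
DUAL = {
    "o bir doktor": {"feminine": "she is a doctor",
                     "masculine": "he is a doctor"},
}

def translate_all(text):
    """Return every gendered rendering instead of silently choosing one."""
    if text in DUAL:
        return dict(DUAL[text])
    return {"default": text}  # fall through for unambiguous input

print(translate_all("o bir doktor")["feminine"])  # she is a doctor
```

The design point is that ambiguity is surfaced to the user rather than resolved by the model's learned stereotype.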

Female historians and male nurses do not exist, Google Translate tells its European users
algorithmwatch.org · 2020

An experiment shows that Google Translate systematically changes the gender of translations when they do not fit with stereotypes. It is all because of English, Google says.

If you were to read a story about male and female historians trans…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
