Incident 47: LinkedIn Search Prefers Male Names

Description: An investigation by The Seattle Times in 2016 found a gender bias in LinkedIn's search engine.
Alleged: LinkedIn developed and deployed an AI system, which harmed Women.

Suggested citation format

Yampolskiy, Roman. (2016-09-06) Incident Number 47. in McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
47
Report Count
9
Incident Date
2016-09-06
Editors
Sean McGregor


CSET Taxonomy Classifications

Taxonomy Details

Full Description

In 2016, an investigation by The Seattle Times found that LinkedIn's search feature potentially exhibited gender bias: when users searched for names shared by both male and female members, male profiles were presented first. In addition, searches for common female names prompted the user to ask whether they meant the male equivalent of the name. The same did not occur when searching the 100 most common male names.

Short Description

An investigation by The Seattle Times in 2016 found a gender bias in LinkedIn's search engine.

Severity

Negligible

Harm Distribution Basis

Sex

AI System Description

LinkedIn uses search engines trained on and guided by the relative frequencies of words appearing in past queries and member profiles.

System Developer

LinkedIn

Sector of Deployment

Information and communication

Relevant AI functions

Perception, Cognition, Action

AI Techniques

Machine learning, natural language processing model

AI Applications

recommendation engine, decision support

Location

Global

Named Entities

LinkedIn, The Seattle Times, Microsoft

Technology Purveyor

LinkedIn

Beginning Date

2016-08-31

Ending Date

2016-08-31

Near Miss

Unclear/unknown

Intent

Accident

Lives Lost

No

Data Inputs

words appearing in user past queries and member profiles

Incident Reports


LinkedIn's search engine may reflect a gender bias

– Search for a female ­contact on LinkedIn, and you might get a curious result. The professional networking website asks if you meant to search for a similar-looking man's name.

A search for "Stephanie Williams," for example, brings up a prompt asking if the searcher meant to type ­"Stephen Williams" instead.

It's not that there aren't any people by that name — about 2,500 profiles included Stephanie Williams.

But similar searches of popular female first names, paired with placeholder last names, bring up LinkedIn's suggestion to change "Andrea Jones" to "Andrew Jones," Danielle to ­Daniel, Michaela to Michael and Alexa to Alex.

The pattern repeats for at least a dozen of the most common female names in the United States.

Searches for the 100 most common male names in the U.S., on the other hand, bring up no prompts asking if users meant predominantly female names.

LinkedIn said its suggested results are generated automatically by an analysis of the tendencies of past searchers. "It's all based on how ­people are using the platform," spokeswoman Suzi Owens said.

The Mountain View, Calif., company, which Microsoft is buying in a $26.2 billion deal, doesn't ask users their gender at registration, and doesn't try to tag users by assumed gender or group results that way, Owens said. LinkedIn is reviewing ways to improve its predictive ­technology, she said.

Owens didn't say whether LinkedIn's members, which total about 450 million, skewed more male than female. A Pew Research survey last year didn't find a large gap in the gender of LinkedIn users in the U.S. About 26 percent of male internet users used LinkedIn, compared with 25 percent of all female internet users, Pew said.

LinkedIn's female-to-male name prompts come as some researchers and technologists warn that software algorithms, used to inform everything from which businesses show up in search results to policing strategies, aren't immune from human biases.

"Histories of discrimination can live on in digital platforms," Kate Crawford, a Microsoft researcher, wrote in the New York Times earlier this year. "And if they go unquestioned, they become part of the logic of everyday algorithmic systems."

There's plenty of evidence of that recently. A Google photo application made headlines last year in mistakenly identifying black people as gorillas.

More recently, Tay, a chatbot Microsoft designed to engage in mindless ­banter on Twitter, was taken offline after other internet users persuaded the software to repeat racist and sexist slurs.

The effect of machine-learning algorithms isn't limited to the digital world.

A Bloomberg analysis found that Amazon.com's same-day delivery service, relying on data specifying the concentration of Amazon Prime members, had excluded predominantly nonwhite neighborhoods in six U.S. cities. Meanwhile, ProPublica found that software used to predict the tendencies of repeat criminal offenders was likely to falsely flag black defendants as future criminals.

People who work in machine intelligence said one of the challenges in constructing bias-free algorithms is a workforce in the field that skews ­heavily white and male.

"It really comes down to, are you putting the correct training data into the system," said Kieran Snyder, chief executive of Textio, a Seattle start-up that builds a tool designed to detect patterns, including evidence of bias, in job listings. "A broader set of people [working on the software] would have figured out how to get a broader set of data in the first place."

A few months ago, Snyder said a Textio analysis found that job postings for machine-learning roles contained language more likely to appeal to male applicants than the average ­technology industry job post.

"These two trains of conversation, one around inclusion in technology, the other around [artificial intelligence], have only been growing in momentum for the last couple of years," she said.
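The kind of pattern detection described above can be sketched as a simple word-list check. This is a minimal, illustrative sketch only; the word lists and the "coded" labels are invented for demonstration and are not Textio's actual method:

```python
# Illustrative word lists (assumptions, not Textio's real data).
MASCULINE_CODED = {"competitive", "dominant", "rockstar", "ninja", "aggressive"}
FEMININE_CODED = {"collaborative", "supportive", "nurturing", "interpersonal"}

def gender_coding(posting: str) -> str:
    """Classify a job posting by which coded word list it draws on more."""
    words = set(posting.lower().replace(",", " ").split())
    m = len(words & MASCULINE_CODED)
    f = len(words & FEMININE_CODED)
    if m > f:
        return "masculine-coded"
    if f > m:
        return "feminine-coded"
    return "neutral"

print(gender_coding("Seeking a competitive, dominant rockstar engineer"))
# → "masculine-coded"
```

A real tool would use validated word lists and weightings rather than a raw set intersection, but the underlying idea — scanning listings for language patterns associated with one gender — is the same.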

Tech companies work toward website searches free of bias

Until Sep. 7, LinkedIn users searching for female contacts on the site may have noticed some strange results. Searches for common female names were yielding suggestions for male names as well.

Take a LinkedIn search for “Stephanie Williams.” Earlier this week, that query returned the result, “did you mean Stephen Williams?” (in addition to the 2,500-plus users actually named Stephanie Williams). A search for “Stephen Williams,” however, simply displayed the 7,200 results for people with that name.

The same was true of searches for at least a dozen other popular female first names in the US, a Seattle Times investigation revealed. LinkedIn wondered whether users searching for Andrea meant Andrew, Danielle meant Daniel, and Alexa meant Alex. Searches for the US' 100 most common male names didn't return suggestions for female names.

LinkedIn’s “did you mean” results are fueled by an algorithm designed to suggest names with similar spellings. The algorithm makes recommendations based on how frequently names have shown up in past queries of the company’s more than 450 million member profiles, says spokesperson Suzi Owens. “It is not anything to do with gender,” she said.

All the same, on Sept. 7 the Silicon Valley-based company rolled out a change to the algorithm that enables it to explicitly recognize popular names as such, so that the algorithm doesn’t try to correct them.

It appears to be working: Searches for first names like Dana, Joan, Danielle, Alexa, and Stephanie no longer return any “did you mean” results.
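The behaviour the reports describe — frequency-driven spelling suggestions, and a fix that exempts recognized names — can be sketched in a few lines. This is a hypothetical toy model, not LinkedIn's implementation; the query counts, the `KNOWN_NAMES` list, and the similarity cutoff are all invented for illustration:

```python
from difflib import get_close_matches

# Toy query-frequency table: in this invented scenario, "stephen" has been
# searched more often than "stephanie" (mirroring the profile counts reported).
QUERY_COUNTS = {"stephen": 7200, "stephanie": 2500, "daniel": 9000, "danielle": 3100}

# Hypothetical fix: given names the system explicitly recognizes and never "corrects".
KNOWN_NAMES = {"stephen", "stephanie", "daniel", "danielle"}

def did_you_mean(query, protect_names=False):
    """Suggest a similarly spelled term with a higher past-query frequency."""
    q = query.lower()
    if protect_names and q in KNOWN_NAMES:
        return None  # recognized name: show results as-is, no suggestion
    candidates = get_close_matches(q, QUERY_COUNTS, n=3, cutoff=0.7)
    better = [c for c in candidates if QUERY_COUNTS.get(c, 0) > QUERY_COUNTS.get(q, 0)]
    return max(better, key=QUERY_COUNTS.get) if better else None

print(did_you_mean("stephanie"))                      # → "stephen" (frequency bias)
print(did_you_mean("stephanie", protect_names=True))  # → None (fix applied)
```

Note the asymmetry falls out of the frequency table alone: because "stephen" outnumbers "stephanie" in past queries, the suggestion only ever fires in one direction, with no gender field anywhere in the logic.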

The issue underscores the biases present in artificially intelligent systems that learn from other users’ behavior. Earlier this year, Microsoft was forced to take its millennial chatbot offline after it learned to make racist and sexist remarks from users on Twitter. (Microsoft is also acquiring LinkedIn.)

“As with all machine learned systems, there are always edge cases and we are constantly working hard to improve and create the best possible experience for our members,” says Owens.

LinkedIn’s search algorithm apparently favored men until this week

Samantha Cooney is the content strategy editor at TIME.

A week after the Seattle Times reported that LinkedIn’s search engine may reflect a gender bias, the networking platform announced a tweak to its search algorithm.

The Times reported on Aug. 31 that users who searched for at least a dozen common female names were asked if they meant to search for a male name instead — but that no such prompt appeared if users searched for any of the 100 most common male names. For instance, the Times reported, users who searched for one of the 2,500 “Stephanie Williams” LinkedIn profiles were asked if they meant to look for “Stephen Williams” instead. LinkedIn told the Times in August that the algorithm was dependent upon a user’s past searches and that gender wasn’t a factor.


On Thursday, LinkedIn announced a change to its algorithm that addresses the issue, the Times reported. Now, the search engine won’t prompt any user who searches for a full name if they meant to look up a different name.


But a LinkedIn spokesperson reiterated to BBC on Thursday that the algorithm wasn’t ever colored by gender bias.

“Suggestions of similar spelt names that are frequently searched for on LinkedIn will follow the search query,” the spokesperson told BBC. “The search algorithm is guided by relative frequencies of words appearing in past queries and member profiles, it is not anything to do [with] gender.”

[The Seattle Times]


LinkedIn Tweaks Search Engine After Gender Bias Allegations

The question of whether a computer can be biased or not may seem frivolous, but it could make all the difference when it comes to being found online.

Now, an investigation by a US newspaper has suggested that this bias may be present on the biggest professional networking site in the world.

It found that a search of common female names on LinkedIn returned suggestions for related male names.


According to the report in The Seattle Times, the same pattern works for at least a dozen common female names in the US.

What’s more, the apparent bias seems to be a one-way street.

When searching for common male names, there are no suggestions of women with a similar name.

It claims that a search for the common name ‘Stephanie Williams’ suggests ‘Stephen Williams’.

Other examples include a search for ‘Andrea Jones’ which brings back many details for ‘Andrew Jones’.

MailOnline was able to replicate some of the results.

LinkedIn is the largest professional networking platform in the world, claiming more than 450 million users. It is set to be bought by software giant Microsoft in a deal worth an estimated £17.7 billion ($26.2 bn).

Responding to the claims, LinkedIn said that suggestions are based on common search queries by its 450 million users and on similarly spelled names, having nothing to do with a user’s sex.



CAN COMPUTERS BE BIASED? When teaching machines to process language, programmers can use word-embedding algorithms. These programs enable computers to use machine learning to process language based on learned examples. One example is asking a computer to find related words using the 'she is to he' comparison. This can produce accurate pairs, such as sister:brother or queen:king. But training on real-world sources, such as news articles and websites, can let gender biases creep in. For example, occupations associated with 'he' included philosopher, fighter pilot and boss, while occupations associated with 'she' included homemaker, socialite, receptionist and hairdresser. Researchers are trying to combat this bias by teaching machines to ignore certain relationships between words.
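The analogy behaviour described in the box above can be illustrated with toy vectors. The vectors below are hand-made assumptions, not trained embeddings; they only demonstrate how a "gender direction" in vector space produces both the accurate pairs (queen:king) and the biased ones (boss:homemaker):

```python
import numpy as np

# Toy 2-D word vectors (illustrative assumptions, not a trained model):
# dimension 0 ≈ topic, dimension 1 ≈ a learned "gender direction".
vecs = {
    "he":        np.array([0.0,  1.0]),
    "she":       np.array([0.0, -1.0]),
    "king":      np.array([1.0,  1.0]),
    "queen":     np.array([1.0, -1.0]),
    "boss":      np.array([0.5,  0.9]),   # bias absorbed from training text
    "homemaker": np.array([0.5, -0.9]),
}

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' via vector arithmetic (b - a + c)."""
    target = vecs[b] - vecs[a] + vecs[c]

    def cos(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Nearest neighbour by cosine similarity, excluding the input words.
    return max((w for w in vecs if w not in (a, b, c)),
               key=lambda w: cos(vecs[w], target))

print(analogy("he", "she", "king"))  # → "queen": the classic accurate pair
print(analogy("he", "she", "boss"))  # → "homemaker": the gender direction encodes bias
```

Debiasing approaches of the kind the box mentions work by neutralizing this gender direction for words (like occupations) that should not carry it.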

A spokesperson for the networking platform told MailOnline: ‘The search algorithm is guided by relative frequencies of words appearing in past queries and member profiles; it's not to do with gender.

‘To fix unintended spelling suggestions that are similar sounding we rolled out a change that explicitly recognises people's names so that the algorithm doesn't try to correct them into another name – of the same or different gender.

‘As with all machine-learned systems, there are always edge cases and we are constantly working hard to improve and create the best possible experience for our members.’

As machine learning algorithms are increasingly used to deal with complex queries, instances of inherent bias are coming to light.

One example is algorithms used to predict rates of recidivism in former criminals in the US, based on social and societal factors.

Some reports have claimed these predictive algorithms may skew results according to race, with African Americans facing more negative outcomes.

While other examples include a technique called word embedding, which teaches machines how to process language by finding relationships between words.

But when the computer searches real world sources, the embedding approach can pick up on inherent gender stereotypes.

LinkedIn investigation claims searches for female professionals end up suggesting MEN

LinkedIn launched in 2002.

LinkedIn has denied that its search algorithm has been biased towards suggesting male versions of female names in searches on its website.

A Seattle Times investigation found searching for "Stephanie Williams" on the professional networking service would trigger a prompt for "Stephen Williams" instead, for example.

At least a dozen of the most common female names in the US were affected.

LinkedIn has updated its algorithm to avoid proposing alternative names.

Prior to the update, searches for 100 of the most common male names in the US did not result in prompts suggesting female versions of those names, the Seattle Times said.

'Not gender related'

"Suggestions of similar spelt names that are frequently searched for on LinkedIn will follow the search query," said a LinkedIn spokeswoman.

"The search algorithm is guided by relative frequencies of words appearing in past queries and member profiles, it is not anything to do [with] gender."

A fix had been rolled out to "explicitly recognise people's names" so that alternative names - of the same or a different gender - would not be proposed, she added.

Microsoft announced that it would purchase LinkedIn for $26.2bn (£19.6bn) in June.

Social network algorithms have faced much scrutiny over alleged hints of bias recently.

Last month, Facebook overhauled its Trending feature - which recommends online content to users - after some complained that it was biased towards left-wing stories.

LinkedIn denies gender bias claim over site search


Last week, a Seattle Times investigation revealed that LinkedIn’s search function seems to have a pretty pronounced gender bias. It turns out, when you search for a woman’s name on LinkedIn, the site has a pesky habit of asking whether you’re actually looking for a similarly spelled man’s name instead.

For example, on a search for “Stephanie Williams,” LinkedIn helpfully asks if you meant to look up “Stephen Williams” instead — even though, according to the Seattle Times, there are approximately 2,500 profile results for the name “Stephanie Williams.” A similar pattern holds for at least a dozen common American women’s names. Type in “Andrea” — are you sure you didn’t mean Andrew? And “Michaela” — what about Michael? “Danielle” — Daniel, right?

Curiously, searches for the 100 most-common U.S. male names did not result in any suggestions for their female counterparts.

LinkedIn maintains that its algorithm doesn’t have a gender bias, explaining that the suggested male names are simply based on the site’s most-common searches. But today the BBC reports that LinkedIn has updated its search function to avoid proposing any alternative names, which sounds like a good plan!

LinkedIn Denies Gender-Bias Problem

Have you ever searched for a contact on LinkedIn only to have the networking site automatically prompt you to look up a man with a similar name? You're not alone. A recent investigation by the Seattle Times revealed a pervasive gender bias within the site's search algorithm that pointed users toward male profiles when they searched for some of the most common female names.

The Times cited the relatively-common female name "Stephanie Williams." Rather than yield results for the 2,500 women of the same name, a search for Stephanie would instead bring up a prompt asking if the user meant to search for "Stephen Williams" instead. Similar situations occurred when searching for an "Andrea Jones" (users are asked if they're looking for "Andrew Jones"), as well as for women named Danielle (Daniel), Michaela (Michael), and Alexa (Alex). But when users searched for men with one of the 100 most common male names, the site did not ask if they were looking for a female user with a similar name.

LinkedIn, however, denies any gender bias in its search tool. "The search algorithm is guided by relative frequencies of words appearing in past queries and member profiles, it is not anything to do [with] gender," a spokeswoman told the BBC.

But, to make absolutely certain that no such bias exists, the professional networking site has updated its search algorithm to remove any prompts suggesting alternative names. We can only assume this will mean big things for all the Stephanie Williamses of the world who missed out on countless new and exciting professional opportunities because recruiters couldn't actually search for them.

Bye Bye, Gender Bias! LinkedIn Will No Longer Ask Users If They Meant to Search for a Man

By Tim Sandle, Sep 19, 2016, in Technology

The ‘professional’ social networking site LinkedIn has been accused of having a gender bias, by surfacing more male professionals in its search results than female ones. Of course, a computer program itself isn’t biased; any ‘bias’ comes from the algorithms that drive the way the software functions. It is the way that data is extracted in relation to key search terms that has brought LinkedIn under criticism.

The charge against LinkedIn (soon to be a Microsoft-owned company) is that when names used by both men and women (such as Hilary) are entered into its search field, it is most often male professionals that appear at the top of the search results rather than women. The charge also contends that even when a female name is entered, a male name often appears. This is based on an experiment conducted by the Seattle Times. In a test, journalists entered the name "Stephanie Williams" on the professional networking service and found that it triggered a prompt for "Stephen Williams" instead. The same effect was repeated with other female names. In all, the newspaper ran up to twelve female names and found that, each time, male names appeared at the top of the search. Conversely, running through 100 male names, no female-name alternatives were offered.

LinkedIn has denied that its search algorithm is biased and says the search outcomes are not gender-related. However, the company has also updated its algorithm to avoid proposing alternative names; the fix works by more explicitly recognizing people's names. The BBC quotes a LinkedIn spokesperson as saying: "Suggestions of similar spelt names that are frequently searched for on LinkedIn will follow the search query." The spokesperson further explained that the LinkedIn search algorithm is shaped by the relative frequencies of words appearing in past queries and member profiles.

Microsoft purchased LinkedIn for $26.2 billion this summer. The deal is expected to be completed by the end of 2016.

Does LinkedIn have a gender bias?
