AI Incident Database

Report 829

Related Incidents

Incident 479 Report
LinkedIn Search Prefers Male Names

LinkedIn's search engine may reflect a gender bias
stuff.co.nz · 2016

Search for a female contact on LinkedIn, and you may get a curious result. The professional networking website asks if you meant to search for a similar-looking man's name.

A search for "Stephanie Williams," for example, brings up a prompt asking if the searcher meant to type "Stephen Williams" instead.

It's not that there aren't any people by that name — about 2,500 profiles included Stephanie Williams.

But similar searches of popular female first names, paired with placeholder last names, bring up LinkedIn's suggestion to change "Andrea Jones" to "Andrew Jones", Danielle to Daniel, Michaela to Michael and Alexa to Alex.

The pattern repeats for at least a dozen of the most common female names.

Searches for the 100 most common male names, on the other hand, bring up no prompts asking if users meant predominantly female names.

LinkedIn says its suggested results are generated automatically by an analysis of the tendencies of past searchers. "It's all based on how people are using the platform," spokeswoman Suzi Owens said.

The company, which Microsoft is buying in a US$26.2 billion deal, doesn't ask users their gender at registration, and doesn't try to tag users by assumed gender or group results that way, Owens said. LinkedIn is reviewing ways to improve its predictive technology, she said.
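LinkedIn has not published how its suggestion feature works, but the behavior Owens describes, proposing a similar-looking query that past users searched for more often, can be sketched with a simple frequency-plus-similarity heuristic. The names, counts, and threshold below are entirely hypothetical, chosen only to illustrate how an unexamined popularity signal can reproduce a gender skew in the underlying search logs.

```python
from difflib import SequenceMatcher

# Hypothetical past-search counts; a real system would aggregate
# these from user query logs at much larger scale.
PAST_SEARCH_COUNTS = {
    "stephen williams": 9000,
    "stephanie williams": 2500,
    "andrew jones": 8000,
    "andrea jones": 3000,
}

def suggest(query, counts, similarity_threshold=0.8):
    """Return a similar-looking query with a higher past-search count, if any.

    Note the audit point: if historical searches skew toward male
    names, a purely frequency-driven rule like this will systematically
    offer male names as "corrections" for female names.
    """
    query = query.lower()
    best = None
    for candidate, count in counts.items():
        if candidate == query:
            continue
        similarity = SequenceMatcher(None, query, candidate).ratio()
        if similarity >= similarity_threshold and count > counts.get(query, 0):
            if best is None or count > counts[best]:
                best = candidate
    return best

print(suggest("Stephanie Williams", PAST_SEARCH_COUNTS))
# With these made-up counts, the more-searched "stephen williams"
# is proposed over the query the user actually typed.
```

Nothing in the article confirms this is LinkedIn's actual mechanism; the sketch only shows that no explicit gender tagging is needed for the reported pattern to emerge from usage data alone.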

Owens didn't say whether LinkedIn's members, who total about 450 million, skewed more male than female.

LinkedIn's female-to-male name prompts come as some researchers and technologists warn that software algorithms, used to inform everything from which businesses show up in search results to policing strategies, aren't immune from human biases.

"Histories of discrimination can live on in digital platforms," Kate Crawford, a Microsoft researcher, wrote earlier this year. "And if they go unquestioned, they become part of the logic of everyday algorithmic systems."

There has been plenty of recent evidence of that.

A Google photo application made headlines last year for mistakenly identifying black people as gorillas.

More recently, Tay, a chatbot Microsoft designed to engage in mindless banter on Twitter, was taken offline after other internet users persuaded the software to repeat racist and sexist slurs.

The impact of machine-learning algorithms isn't limited to the digital world.

A Bloomberg analysis found that Amazon's same-day delivery service, relying on data specifying the concentration of Amazon Prime members, had excluded predominantly nonwhite neighbourhoods in six cities.

Meanwhile, ProPublica found that software used to predict the tendencies of repeat criminal offenders was likely to falsely flag black defendants as future criminals.
