AI Incident Database

Report 1442

Related Incidents

Incident 1174 · Report
TikTok's "Suggested Accounts" Algorithm Allegedly Reinforced Racial Bias through Feedback Loops

Is TikTok’s algorithm actually pretty racist?
dailydot.com · 2020

According to an experiment performed by artificial intelligence researcher Marc Faddoul, the algorithm TikTok uses to suggest new users to follow might have a racial bias.

Faddoul, an AI researcher from the University of California, Berkeley, who specializes in algorithmic fairness, first pointed out his findings on Twitter this week.

“A TikTok novelty: FACE-BASED FITLER BUBBLES [sic],” Faddoul wrote. “The AI-bias techlash seems to have had no impact on newer platforms. Follow a random profile, and TikTok will only recommend people who look almost the same.”

Faddoul explained to BuzzFeed News that when a TikTok user follows an account, they are then suggested a series of other accounts to follow. Faddoul said he noticed similarities across these suggested accounts: users tended to be of the same race, have the same hair color, and share similar appearances.

Faddoul said he repeated the experiment with a new account and got similar results.

“Clearly, recommendations are very physiognomic,” Faddoul said. “But it’s not just gender and ethnicity, you can get much more niche facial profiling. TikTok adapts ‘recommendability’ on hair style, body profile, age, how (un)dressed the person is, and even whether they have visible disabilities.”

A representative from TikTok told BuzzFeed that the algorithm is based not on race or the account's profile picture, but on the account's content. According to the representative, this approach is called collaborative filtering, a process similar to those used by YouTube and Netflix.

“Our recommendation of accounts to follow is based on user behavior: users who follow account A also follow account B, so if you follow A you are likely to also want to follow B,” a representative told BuzzFeed.
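The co-follow logic the representative describes can be sketched in a few lines. This is a hypothetical toy model, not TikTok's actual system: it recommends accounts that are most often followed alongside the one you just followed, using an invented follow graph.

```python
from collections import Counter

# Hypothetical follow graph: user -> set of accounts they follow.
follows = {
    "u1": {"A", "B"},
    "u2": {"A", "B", "C"},
    "u3": {"A", "C"},
    "u4": {"B"},
}

def recommend(account, follows, top_n=2):
    """Rank accounts by how often they are co-followed with `account`."""
    counts = Counter()
    for followed in follows.values():
        if account in followed:
            # Every other account in this user's follow set is a co-follow.
            for other in followed - {account}:
                counts[other] += 1
    return [acct for acct, _ in counts.most_common(top_n)]

print(recommend("A", follows))  # accounts most often co-followed with A
```

Note that nothing in this sketch looks at faces or race; any demographic skew would come entirely from correlations already present in the follow graph, which is exactly the point Faddoul makes next.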

But according to Faddoul, if this is the case, it could still render a racial bias.

“A risk is to reinforce a ‘coverage bias’ with a feedback loop,” Faddoul said. “If most popular influencers are say, blond, it's will be easier [sic] for a blond to get followers than for a member of an underrepresented minority. And the loop goes on…”
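The feedback loop Faddoul describes is a rich-get-richer dynamic, and a minimal simulation shows how it plays out. The groups and starting counts below are invented for illustration: each new follower picks a creator group with probability proportional to its current follower count, a toy stand-in for recommendation-driven exposure.

```python
import random

random.seed(0)

# Hypothetical creator groups; the majority group starts with more followers.
followers = {"majority": 90, "minority": 10}

# Each round, one new user follows a group with probability proportional to
# its current follower count -- a toy model of popularity-weighted suggestions.
for _ in range(1000):
    pick = random.choices(list(followers), weights=list(followers.values()))[0]
    followers[pick] += 1

print(followers)
```

Because the majority group captures most new follows each round, the absolute gap between the groups keeps widening even though no step of the process references appearance directly.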

This is not the first time the company has found itself in hot water; back in December, TikTok admitted it was burying content made by queer, fat, and disabled users.

Read the Source


2024 - AI Incident Database
