AI Incident Database
Entities

LinkedIn

Incidents involved as both Developer and Deployer

Incident 47 · 9 Reports
LinkedIn Search Prefers Male Names

2016-09-06

A 2016 investigation by The Seattle Times found gender bias in LinkedIn's search engine.


Incident 168 · 2 Reports
Collaborative Filtering Prone to Popularity Bias, Resulting in Overrepresentation of Popular Items in the Recommendation Outputs

2022-03-01

Collaborative filtering is prone to popularity bias, resulting in overrepresentation of popular items in the recommendation outputs.

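The popularity bias described in Incident 168 can be illustrated with a toy example (the data and the deliberately simple co-occurrence recommender below are illustrative assumptions, not the systems named in the incident): a collaborative filter scores unseen items by how often they co-occur with a user's history, so an item most users already like tends to dominate the scores.

```python
from collections import Counter

# Toy interaction data: each user's set of liked items.
# "hit" is popular (liked by most users); "nicheX" items are long-tail.
users = {
    "u1": {"hit", "nicheA"},
    "u2": {"hit", "nicheB"},
    "u3": {"hit", "nicheC"},
    "u4": {"hit", "nicheA"},
    "u5": {"nicheB", "nicheC"},
}

def recommend(target, users, k=1):
    """Score each item the target has not seen by summing, over all
    other users, the overlap between their history and the target's
    (a crude co-occurrence form of collaborative filtering)."""
    seen = users[target]
    scores = Counter()
    for other, items in users.items():
        if other == target:
            continue
        overlap = len(seen & items)  # crude user-user similarity
        for item in items - seen:
            scores[item] += overlap
    return [item for item, _ in scores.most_common(k)]

# u5 has only niche tastes, yet the popular item wins because it
# co-occurs with everything: the recommender amplifies popularity.
print(recommend("u5", users))
```

Running this recommends `hit` to u5 even though u5 has interacted only with niche items, which is the overrepresentation pattern the incident describes: items that are already popular accumulate co-occurrence mass and crowd out long-tail items.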

Incidents involved as Deployer

Incident 469 · 3 Reports
Automated Adult Content Detection Tools Showed Bias against Women Bodies

2006-02-25

Automated content moderation tools used to detect sexual explicitness or "raciness" reportedly exhibited bias against women's bodies, suppressing the reach of their content even when it did not break platform policies.


Related Entities
Other entities involved in the same incidents. For example, if this entity is the developer of an incident and another entity is the deployer, the latter is listed as a related entity.

Entity

Women

Incidents Harmed By
  • Incident 47
    9 Reports

    LinkedIn Search Prefers Male Names

Entity

Facebook

Incidents involved as both Developer and Deployer
  • Incident 168
    2 Reports

    Collaborative Filtering Prone to Popularity Bias, Resulting in Overrepresentation of Popular Items in the Recommendation Outputs

Incidents involved as Deployer
  • Incident 469
    3 Reports

    Automated Adult Content Detection Tools Showed Bias against Women Bodies

Entity

YouTube

Incidents involved as both Developer and Deployer
  • Incident 168
    2 Reports

    Collaborative Filtering Prone to Popularity Bias, Resulting in Overrepresentation of Popular Items in the Recommendation Outputs

Entity

Twitter

Incidents involved as both Developer and Deployer
  • Incident 168
    2 Reports

    Collaborative Filtering Prone to Popularity Bias, Resulting in Overrepresentation of Popular Items in the Recommendation Outputs

Entity

Netflix

Incidents involved as both Developer and Deployer
  • Incident 168
    2 Reports

    Collaborative Filtering Prone to Popularity Bias, Resulting in Overrepresentation of Popular Items in the Recommendation Outputs

Entity

Facebook users

Incidents Harmed By
  • Incident 469
    3 Reports

    Automated Adult Content Detection Tools Showed Bias against Women Bodies

  • Incident 168
    2 Reports

    Collaborative Filtering Prone to Popularity Bias, Resulting in Overrepresentation of Popular Items in the Recommendation Outputs

Entity

LinkedIn users

Incidents Harmed By
  • Incident 469
    3 Reports

    Automated Adult Content Detection Tools Showed Bias against Women Bodies

  • Incident 168
    2 Reports

    Collaborative Filtering Prone to Popularity Bias, Resulting in Overrepresentation of Popular Items in the Recommendation Outputs

Entity

YouTube users

Incidents Harmed By
  • Incident 168
    2 Reports

    Collaborative Filtering Prone to Popularity Bias, Resulting in Overrepresentation of Popular Items in the Recommendation Outputs

Entity

Twitter users

Incidents Harmed By
  • Incident 168
    2 Reports

    Collaborative Filtering Prone to Popularity Bias, Resulting in Overrepresentation of Popular Items in the Recommendation Outputs

Entity

Netflix users

Incidents Harmed By
  • Incident 168
    2 Reports

    Collaborative Filtering Prone to Popularity Bias, Resulting in Overrepresentation of Popular Items in the Recommendation Outputs

Entity

Meta

Incidents involved as Deployer
  • Incident 469
    3 Reports

    Automated Adult Content Detection Tools Showed Bias against Women Bodies

Entity

Instagram

Incidents involved as Deployer
  • Incident 469
    3 Reports

    Automated Adult Content Detection Tools Showed Bias against Women Bodies

Entity

Microsoft

Incidents involved as Developer
  • Incident 469
    3 Reports

    Automated Adult Content Detection Tools Showed Bias against Women Bodies

Entity

Google

Incidents involved as Developer
  • Incident 469
    3 Reports

    Automated Adult Content Detection Tools Showed Bias against Women Bodies

Entity

Amazon

Incidents involved as Developer
  • Incident 469
    3 Reports

    Automated Adult Content Detection Tools Showed Bias against Women Bodies

Entity

Instagram users

Incidents Harmed By
  • Incident 469
    3 Reports

    Automated Adult Content Detection Tools Showed Bias against Women Bodies


2024 - AI Incident Database
