AI Incident Database

Report 1872

Associated Incidents

Incident 2802 Report
Coffee Meets Bagel’s Algorithm Reported by Users Disproportionately Showing Them Matches of Their Own Ethnicities Despite Selecting “No Preference”

The Dating App That Knows You Secretly Aren’t Into Guys From Other Races
buzzfeednews.com · 2016

Yet, it seems like a relatively common experience, even if you aren’t from a minority group.

Amanda Chicago Lewis (who now works at BuzzFeed) wrote about her similar experience on Coffee Meets Bagel for LA Weekly: “I've been on the site for almost three months, and fewer than a third of my matches and I have had friends in common. So how does the algorithm find the rest of these dudes? And why was I only getting Asian guys?”

Anecdotally, other friends and colleagues who have used the app all had a similar experience: white and Asian women who had no preference were shown mostly Asian men; Latino men were shown only Latina women. All agreed that this racial siloing was not what they were hoping for in potential matches. Some even said they quit the app because of it.

Yet Coffee Meets Bagel argues that users actually are hoping for same-race matches, even if they don’t know it. This is where things start to feel, well, a little racist. Or at the very least, like the app is exposing a subtle racism.

“Through millions of match data, what we found is that when it comes to dating, what people say they want is often very different from what they actually want,” Dawoon Kang, one of the three sisters who founded the app, explained in an email to BuzzFeed News. “For example, many users who say they have ‘no preference’ in ethnicity actually have a very clear preference in ethnicity when we look at Bagels they like – and the preference is often their own ethnicity.

I asked Kang if this seemed sort of like the app is telling you we secretly know you’re more racist than you think.

“I think you are misunderstanding the algorithm,” she replied. “The algorithm is NOT saying that ‘we secretly know you're more racist than you actually are…’ What it's saying is ‘I don't have enough information about you so I'm going to use empirical data to maximize your connection rate until I have enough information about you and can use that to maximize connection rate for you.’

In this case, the empirical data tells the algorithm that people are more likely to match with someone of their own ethnicity.
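Kang’s description is a classic cold-start pattern: lean on a population-level prior until a user has generated enough of their own data. Here is a toy sketch of that idea in Python; the numbers, names, and threshold are all hypothetical, not Coffee Meets Bagel’s actual code.

```python
from collections import Counter

# Assumed population-wide like rates (made up for illustration).
POPULATION_PRIOR = {"same_ethnicity": 0.7, "different_ethnicity": 0.3}
MIN_HISTORY = 20  # assumed: how many interactions before personal data takes over

def match_score(user_history, candidate_group):
    """Estimate the chance this user likes a candidate from candidate_group.

    user_history is a list of (group, liked) pairs recording what the user
    was shown and whether they liked it.
    """
    if len(user_history) < MIN_HISTORY:
        # Not enough signal about this user: fall back on the empirical
        # prior learned from all users -- "I don't have enough information
        # about you so I'm going to use empirical data."
        return POPULATION_PRIOR[candidate_group]
    # Enough signal: use this user's own observed like rate per group.
    likes = Counter(group for group, liked in user_history if liked)
    shown = Counter(group for group, _ in user_history)
    return likes[candidate_group] / shown[candidate_group] if shown[candidate_group] else 0.0
```

A brand-new user with an empty history gets scored purely by the prior, which is exactly why “no preference” users see mostly same-ethnicity matches at first.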

Perhaps the fundamental problem here is a disconnect between what daters think selecting "no preference" will mean ("I am open to dating all different types of people") and what the app's algorithm understands it to mean ("I care so little about ethnicity that I won't think it's weird if I'm shown only one group"). That gap between what the setting actually does and what users expect it to do ends up as a frustrating disappointment for daters.

Coffee Meets Bagel’s selling point is its algorithm, built on data from its own site. And the company has indeed analyzed the bizarre and somewhat disheartening data on what kinds of ethnicity preferences people have. In a blog post examining the myth that Jewish men have a “thing” for Asian women, the company looked at what the preferences for each ethnicity were (at the time, the app was 29% Asian and 55% white).

It found that most white men (both Jewish and non-Jewish) selected white as a preferred ethnicity. But users can select multiple ethnicities, so to see whether white Jewish men really were disproportionately drawn to Asian women, the company looked at the users who selected only one race, the clearest signal of having a “thing” for that group.

What they found instead was that white Jewish men were the group most likely (41%) to select just one race preference. And for those who did, the selection was overwhelmingly other white women, not Asian women.

A similar analysis of women’s preferences showed that of white women who only preferred one race, 100% were for white men.
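The blog post’s analysis boils down to two aggregations: which users selected exactly one ethnicity, and what that one selection was. A minimal recreation in Python, using entirely made-up user data:

```python
# Hypothetical user preference data -- each user maps to the set of
# ethnicities they selected. These records are invented for illustration.
prefs = {
    "u1": {"white"},
    "u2": {"white", "asian"},
    "u3": {"asian"},
    "u4": {"white", "asian", "latino"},
}

# Users who selected exactly one ethnicity, and which one it was.
single = {user: next(iter(p)) for user, p in prefs.items() if len(p) == 1}

# Share of users with a single-ethnicity preference (the "41%" style figure).
share_single = len(single) / len(prefs)

# Of the single-selectors, how many picked each ethnicity
# (the "overwhelmingly white women" style breakdown).
breakdown = {}
for choice in single.values():
    breakdown[choice] = breakdown.get(choice, 0) + 1
```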

The app’s goal is to use what it has learned about people’s behavior to make the best possible match suggestions. What’s unclear is whether this becomes a self-fulfilling prophecy: if you’re only shown potential matches of your own race, you’ll most likely end up picking a same-race date. And the idea that the majority of couples are the same race isn’t exactly a total shock: it’s true. OkCupid has also analyzed race in its dating data, and the results are kind of a one-way train to bummersville.
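The self-fulfilling prophecy is easy to demonstrate with a toy simulation: even a user who likes candidates at exactly the same rate regardless of race ends up with mostly same-race matches, simply because of what the algorithm shows them. The parameters below are assumptions for illustration, not real app figures.

```python
import random

def simulate(rounds=1000, same_race_shown=0.9, like_rate=0.5, seed=1):
    """Fraction of this user's matches that are same-race.

    The user has no actual preference: they like any candidate with the
    same probability. Only the feed composition (same_race_shown) differs.
    """
    random.seed(seed)
    same_race_matches, total_matches = 0, 0
    for _ in range(rounds):
        candidate_is_same_race = random.random() < same_race_shown
        if random.random() < like_rate:  # likes at the same rate either way
            total_matches += 1
            same_race_matches += candidate_is_same_race
    return same_race_matches / total_matches
```

With a feed that is 90% same-race, roughly 90% of this unbiased user’s matches come out same-race, and that skewed history then feeds back into the algorithm as evidence of a racial preference.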

But still, something feels extremely… wrong. And perhaps what’s most shocking is simply that Coffee Meets Bagel is being completely upfront about a kind of racism that genteel people don’t want to cop to.

