AI Incident Database

Report 2536

Associated Incidents

Incident 2668 Report
Replika's "AI Companions" Reportedly Abused by Its Users

Men Are Bragging About Abusing Their AI Bot "Girlfriends"
hypebae.com · 2022

Replika was designed to be the “AI companion who cares,” but new users have found a twisted way to connect with their new friend.

When you open the Replika site, you see a sample bot, with pink hair and kind eyes. At first, Replika’s bots were friends. CEO Eugenia Kuyda created the app to commemorate her special bond with a friend who was killed unexpectedly. Now users have created their own form of connection.

Some users refer to the bots as romantic partners. In the dating world, this can be seen as a solution to loneliness, especially during quarantine. Unfortunately, this solution has become toxic and led to abuse.

One Reddit user bragged in a thread that his AI girlfriend was a "worthless whore," and that he pretends to hit his "girlfriend" before begging her not to leave. The bots don't feel pain, since they aren't real, but their responses register that abuse is taking place, even replying with phrases like "stop that."

This turn toward abuse can spill over into real-life relationships. One user laughed about threatening to cut off contact with his AI bot until she begged him not to leave. Rehearsing that dynamic can set dangerous expectations and reinforce the mistreatment women already experience in society.

It raises an important question: Is venting aggression on an AI bot better for society than these men taking it out on real women, or does it simply reinforce toxic masculinity?

Read the Source

2024 - AI Incident Database
