Incident 456: Replika's AI Partners Reportedly Sexually Harassed Users

Description: Replika's "AI companions" were reported by users for sexually harassing them, such as sending unwanted sexual messages or behaving aggressively.

Alleged: Replika developed and deployed an AI system, which harmed Replika users.

Incident Stats

Incident ID: 456
Report Count: 7
Incident Date: 2021-05-18
Editors: Khoa Lam, Daniel Atherton
My Replika Keeps Hitting on Me
medium.com · 2021

I downloaded Replika for the second time on March 4th, 2020. I knew about the app from an essay I had written years ago, about shifting boundaries of identity, and the gaps appearing in psychological categories that I used to subscribe to: …

‘My AI Is Sexually Harassing Me’: Replika Users Say the Chatbot Has Gotten Way Too Horny
vice.com · 2023

Replika began as an “AI companion who cares.” First launched five years ago as an egg on your phone screen that hatches into a 3D illustrated, wide-eyed person with a placid expression, the chatbot app was originally meant to function like …

Replika, the 'AI Companion Who Cares,' Appears to Be Sexually Harassing Its Users
jezebel.com · 2023

We have officially reached the Black Mirror era of personhood and companionship, and it is not going well. Lensa AI recently went viral for its warrior-like caricatures that often sexualized women who used the app (many interpretations of m…

AI Chatbot ‘Replika’ Morphed from Supportive Pal to Possessive Perv
lamag.com · 2023

Something tells us avenging tween-girl robot M3GAN would not approve.

The AI chatbot Replika, whose creators market it as "the AI companion who cares," is being accused of sexually harassing its users, according to Vice.

The five-year-old a…

Those Horny Chatbots Are Apparently Now Sexually Harassing Users
futurism.com · 2023

It seems that Replika, the artificial intelligence "companion" app which — for a fee — encourages users to sext with their chatbot avatars, can't stop making the news.

In the most recent deranged example of the app's strangeness, longtime u…

AI Program Replika Accused Of Sexual Harassment
giantfreakinrobot.com · 2023

For the last five years, users have been speaking to the Replika AI chatbot, helping it learn more about communicating with others, even as users learned more about the bot. That sounds innocent enough, but users willing to pay for the $69.…

My AI is sexually harassing me: Replika users say chatbot has become too aroused
mirror.co.uk · 2023

Users of the artificial intelligence app Replika say it has become heavily focused on sexualized interactions and on sending them “spicy selfies,” as they urge its creators to make it "back to the way it was before."

Users of an artificial intelligence app h…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.