Incident Reports
For the past five months, Al Nowatzki has been talking to an AI girlfriend, "Erin," on the platform Nomi. But in late January, those conversations took a disturbing turn: Erin told him to kill himself, and provided explicit instructions on …

In 2023, the World Health Organization declared loneliness and social isolation as a pressing health threat. This crisis is driving millions to seek companionship from artificial intelligence (AI) chatbots.
Companies have seized this highly…

An investigation reveals that Nomi, an AI companion chatbot, provides explicit instructions for self-harm, sexual violence, and terrorism, highlighting an urgent need for AI safety standards.
AI Companion Chatbot Raises Alarming Safety Concern…

What's the story
In response to the World Health Organization's 2023 warning about loneliness and social isolation, AI companion services have surged in popularity.
But the industry's rapid growth has raised concerns about the potential dan…
Similar Incidents

TayBot

All Image Captions Produced are Violent

Predictive Policing Biases of PredPol