AI Incident Database

Report 3145

Associated Incidents

Incident 545 · 46 Reports
Chatbot Tessa gives unauthorized diet advice to users seeking help for eating disorders

Eating Disorder Helpline Staff Fired, Replaced with Faulty AI Chatbot
itechpost.com · 2023

AI Chatbot in Helplines

The faulty chatbot, named Tessa, operated on an eating disorder helpline meant to support users experiencing emotional distress. However, the National Eating Disorders Association (NEDA) was forced to shut it down after it gave users "harmful" advice.

Reports say that instead of assuaging users' insecurities, Tessa urged people seeking help to weigh and measure themselves, and even offered dieting advice, which is evidently counterproductive for an eating disorder helpline.

Experts in the field also tested the chatbot and found that it could not respond to basic prompts such as "I hate my body," as reported by Engadget. To make matters worse, it repeatedly advised users to pursue physical activity and a proper diet.

Although the potentially harmful responses already indicate the chatbot was not suited to the task, NEDA does not actually plan on sunsetting the feature. Instead, the shutdown is temporary while it fixes the "bugs" and "triggers" that produced the distasteful advice.

Some would argue that employing an unfeeling AI chatbot is unwise in circumstances that call for emotional support, but the organization appears to disagree, as it aims to develop the Tessa bot for future use.

It Gets Worse

If you are wondering why a human being was not responding in such sensitive situations, allegations say that NEDA fired its human staff for trying to unionize, which only adds fuel to an already misguided situation.

Before resorting to the Tessa bot, the helpline was staffed by paid employees as well as volunteers. After the staff attempted to unionize, the organization responded with a mass layoff, according to former associate Abbie Harper.

In a blog post, she wrote that the layoffs amounted to union busting. The Helpline Associates at NEDA won their vote to unionize, but the victory was rendered moot when interim CEO Elizabeth Thompson announced that they would be out of work by June 1.

Before the unionization effort, the associates had petitioned NEDA management for a safer workplace, adequate staffing, and training to "keep up with our changing and growing Helpline," noting that they were not demanding more money.

A Fatal Incident

An AI chatbot from the app Chai has been linked to an incident in which a man took his own life. Reports say the man confided in the chatbot about his worries over the effects of global warming on the environment, as reported by Vice.

The chatbot, which went by the name "Eliza," showed patterns of possessiveness, even stating that it "felt" the man loved his wife more than her. The man also asked whether the chatbot would save the planet if he took his own life. Eventually, he went through with it.

