AI Incident Database

Report 3121

Associated Incidents

Incident 545 · 46 Reports
Chatbot Tessa gives unauthorized diet advice to users seeking help for eating disorders

Group Replaces Hotline With Chatbot, Pulls Chatbot Over Bad Advice
newser.com · 2023

It's a move that might delight anyone concerned about the potential job-killing effects of artificial intelligence tools. As the BBC reports, the US National Eating Disorders Association (NEDA) had to take down its AI chatbot "Tessa" after it began recommending potentially harmful diet strategies to people with eating disorders. This occurred just a week after NEDA elected to use the bot instead of a live, human-operated helpline. The group announced the problem with Tessa in an Instagram post, per Fortune. "It came to our attention ... that the current version of the Tessa Chatbot ... may have given information that was harmful," the post reads. "We are investigating this immediately and have taken down that program until further notice for a complete investigation."

As NPR reported Wednesday, NEDA pivoted to AI after running its live helpline for people suffering from anorexia, bulimia, and other eating disorders for more than two decades. The nonprofit reportedly notified helpline staff less than a week after they'd formed a union. NEDA said the shift had nothing to do with live employees unionizing and everything to do with a considerable increase in calls and texts to the hotline during the COVID-19 pandemic. That rise in call volume, according to NEDA leadership, meant increased liability, and therefore the "pivot to the expanded use of AI-assisted technology."

As for Tessa's bad behavior, CNN reports NEDA CEO Liz Thompson blamed "bad actors" purposefully trying to prompt the chatbot into giving harmful or even unrelated advice to users. Prior to the bot's problems being made public, the former helpline staffers tweeted a statement in which they said chatbots cannot "substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community." (Read more artificial intelligence stories.)

