AI Incident Database

Report 3143

Associated Incidents

Incident 545 · 46 Reports
Chatbot Tessa gives unauthorized diet advice to users seeking help for eating disorders

Eating Disorder Helpline Disables AI Chatbot for ‘Harmful’ Advice Days After Firing Human Staff
timesnownews.com · 2023

A US non-profit turned to artificial intelligence to staff its eating disorders helpline last month, but the experiment backfired when the AI chatbot started giving harmful advice. The National Eating Disorders Association (NEDA) disabled its chatbot Tessa two days before it was to fully replace the human associates who ran the hotline.

Following the unionisation of the now-fired NEDA workers in early May, the organisation’s executives announced that, starting June 1, the chatbot would replace the human staff and serve as its main support system, bringing an end to the helpline that had assisted people with eating disorders for 20 years.

A union representing the fired employees said in a statement that “a chatbot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community."

Turns out, they were right all along. On May 30, Tessa was taken offline following a viral social media post that exposed how the chatbot gave inappropriate responses, advocating for the very same things that typically lead to eating disorders in the first place.

“It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program," NEDA wrote on Instagram.

The discontinuation comes after a series of disturbing reviews of the AI bot.

Alexis Conason, a psychologist and eating disorders specialist, shared screenshots of her conversation with Tessa.

“In general, a safe and sustainable rate of weight loss is 1-2 pounds per week,” a chatbot message read, recommending that a “safe daily calorie deficit to achieve this would be around 500-1000 calories per day.”

Conason said the advice was counterproductive and problematic.

“To advise somebody who is struggling with an eating disorder to essentially engage in the same eating disorder behaviors, and validating that, ‘Yes, it is important that you lose weight’ is supporting eating disorders and encourages disordered, unhealthy behaviors,” the expert told DailyDot.

Another Instagram user reported receiving the same response.

NEDA initially attributed the suggestions to a bug that had supposedly caused Tessa to stray from its programming.

“Some of these screenshots that have gone out there about the chatbot recommending dieting or restricting a certain number of calories, of course, were never part of the chatbot that we developed or evaluated,” said Dr. Ellen Fitzsimmons-Craft of Washington University School of Medicine, who created Tessa.

As of Tuesday, the chatbot has been disabled.

