AI Incident Database

Report 3144

Associated Incidents

Incident 545 · 46 Reports
Chatbot Tessa gives unauthorized diet advice to users seeking help for eating disorders

Eating Disorder AI Chatbot Suspended After Telling People to Lose Weight
themessenger.com · 2023

The National Eating Disorders Association (NEDA) has suspended an AI chatbot after it dispensed potentially damaging advice to individuals seeking help for their eating disorders.

The chatbot, named Tessa, was launched by NEDA as a support tool after the non-profit announced in March that its helpline would cease operations. Intended to provide guidance and support to individuals with eating disorders, Tessa was decommissioned just a few months after its introduction; NEDA announced the decision via Instagram.

According to NEDA CEO Liz Thompson, the problem with Tessa lay in its use of language, which violated the organization's core principles and policies on eating disorders. Alexis Conason, a psychologist specializing in eating disorders, tested Tessa by confiding that she had gained weight and was unhappy with her body. In response, Tessa advised Conason to lose weight and proposed a daily deficit of 500 to 1,000 calories.

"That’s all really contrary to any kind of eating disorder treatment and would be supporting the eating disorder symptoms," Conason commented, as reported by Wired.

Unlike ChatGPT, Tessa was not built on generative AI. Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University School of Medicine who contributed to Tessa's development, said the intention behind Tessa was always to help individuals and to combat these serious issues.

