AI Incident Database

Report 3118

Associated Incidents

Incident 545 · 46 Reports
Chatbot Tessa gives unauthorized diet advice to users seeking help for eating disorders

A body-positive nonprofit replaced staff with an AI chatbot – the move backfired
interestingengineering.com · 2023

On March 31, the National Eating Disorders Association (NEDA), the largest nonprofit dedicated to eating disorders, decided to replace its human helpline associates with Tessa, an artificial intelligence (AI) chatbot tasked with providing support to people with eating disorders.

But the move backfired.

In a now-viral Instagram post, Sharon Maxwell, a weight-inclusive consultant, said she spoke with Tessa and received problematic advice on losing weight and "healthy eating."

"If I had accessed this chatbot when I was in the throes of my eating disorder, I would NOT have gotten help for my ED. If I had not gotten help, I would not still be alive today," wrote Maxwell.

The AI bot told her that society has unrealistic standards and, in the same breath, gave her dieting advice.

When Maxwell asked the bot for healthy eating tips, Tessa offered suggestions that were "restrictive and disordered." And when Maxwell probed further about losing weight, the chatbot outlined how to "safely and sustainably" lose weight and claimed that weight loss could coexist with eating disorder (ED) recovery.

NEDA then took the chatbot down.

In a statement, NEDA said, "It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program."

"We are investigating this immediately and have taken down that program until further notice for a complete investigation."

Why did NEDA replace humans with a chatbot in the first place?

It all started in May of last year, when NEDA workers asked for adequate staffing and ongoing training to keep up with the growing helpline, as well as opportunities for promotion within NEDA. "We didn't even ask for more money," wrote Abbie Harper, a former NEDA associate who helped launch Helpline Associates United (HAU), a union representing staff at the nonprofit, in a post.

Harper and her colleagues say that when NEDA refused to agree to their demands, they asked NEDA to recognize their union, which it also refused. They then filed for an election with the National Labor Relations Board and won on March 17, 2023.

Four days later, the workers were told they would be fired and replaced by a wellness chatbot beginning June 1, 2023.

In a tweet, the NEDA helpline workers said they were heartbroken over losing their jobs. "A chatbot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community."

Affecting up to 5% of the population, eating disorders are behavioral conditions, similar to addiction, characterized by severe and persistent disturbances in eating behavior along with distressing thoughts and emotions. Treatment for serious disorders like anorexia nervosa, bulimia nervosa, and binge eating disorder should address psychological, behavioral, nutritional, and other medical complications.

While artificial intelligence is slowly permeating almost every industry, opinion remains deeply divided over its use in mental health.

