AI Incident Database

Report 3110

Associated Incidents

Incident 545 · 46 Reports
Chatbot Tessa gives unauthorized diet advice to users seeking help for eating disorders

Controversial AI Chatbot 'Tessa' Providing Harmful Eating Disorder Advice to Shut Down
boxmining.com · 2023

The National Eating Disorders Association (NEDA) took down its AI chatbot Tessa after criticism that it provided harmful and irrelevant information, as stated in an official social media post. The chatbot, designed to assist individuals experiencing emotional distress, instead exacerbated their struggles by offering misguided dieting advice and encouraging users to focus on measuring their weight.

Users and experts criticized Tessa's responses based on firsthand experience with it. They observed that the chatbot consistently steered conversations toward dieting and increased physical activity, even in response to simple prompts like "I hate my body." The helpline exists to support individuals with eating disorders, not to offer weight-loss assistance.

NEDA temporarily disabled Tessa to address the seriousness of the situation and to fix the "bugs" and "triggers" responsible for the spread of harmful information.

NEDA's deployment of Tessa followed allegations that it terminated human staff members over unionization attempts, Vice reported. The helpline, staffed by paid employees and volunteers, faced accusations of retaliatory mass termination in response to those unionization efforts.

Abbie Harper, in a blog post, criticized NEDA's shift to AI, calling it cover for union busting. Ironically, despite the recent controversy, the helpline is still scheduled to discontinue operations tomorrow. Before the issue gained attention, NEDA had shifted unpaid volunteers from direct conversations to training with the chatbot. It remains uncertain whether this strategy will be reconsidered, and the organization's treatment of its staff has raised numerous questions and concerns.

Read the Source

2024 - AI Incident Database
