AI Incident Database

Report 3141

Associated Incidents

Incident 545 · 46 Reports
Chatbot Tessa gives unauthorized diet advice to users seeking help for eating disorders

US-based NGO takes down AI chatbot on preventing eating disorders. Here's why
hindustantimes.com · 2023

A New York-based NGO dedicated to preventing eating disorders has taken down an AI chatbot following reports that it provided harmful advice.

The National Eating Disorder Association (NEDA) is under fire for its decision to fire four employees who worked on its helpline, which allowed people concerned about eating disorders to reach volunteers offering support. The NGO's chatbot 'Tessa' ran into problems after an activist posted on Instagram claiming that it offered her 'healthy eating tips' and advice on losing weight, the Guardian reported.

The activist, Sharon Maxwell, was advised to maintain a calorie deficit of 500-1,000 calories a day and to weigh herself weekly to track her weight. Maxwell claimed that if she had accessed the chatbot while in the throes of her disorder, she would not have received help and might not be alive.

The NGO said that those who diet moderately are five times more likely to develop an eating disorder. NEDA said the current version of the chatbot may have given information that was harmful and unrelated to its Body Positivity programme. The organisation said it is carrying out a probe and has taken down the programme until further notice.


Last month, a former NEDA helpline employee claimed that the helpline had seen a 107 per cent rise in calls and messages since the start of the Covid-19 pandemic, mostly pertaining to reports of suicidal thoughts, self-harm and child abuse. The union had asked for adequate staffing and training to keep up with the helpline's demands.

According to the report, NEDA worked with psychology researchers and the AI company Cass AI, which develops chatbots focused on mental health. Ellen Fitzsimmons, a psychologist, said in a post on the NEDA website that the chatbot 'Tessa' was conceived as a way to make eating disorder prevention widely available.

NEDA CEO Liz Thompson told the website that 'Tessa' was not meant to replace the helpline but was created as a separate programme. She clarified that the chatbot is not run on OpenAI's ChatGPT and is not a highly functional AI system.
