AI Incident Database

Report 3111

Related Incidents

Incident 545 · 46 Reports
Chatbot Tessa gives unauthorized diet advice to users seeking help for eating disorders

Eating disorder helpline shuts down AI chatbot that gave bad advice
cbsnews.com · 2023

An AI-powered chatbot that replaced employees at an eating disorder hotline has been shut down after it provided harmful advice to people seeking help. 

The saga began earlier this year when the National Eating Disorder Association (NEDA) announced it was shutting down its human-run helpline and replacing workers with a chatbot called "Tessa." That decision came after helpline employees voted to unionize.

AI might be heralded as a way to boost workplace productivity and even make some jobs easier, but Tessa's stint was short-lived. The chatbot ended up providing dubious and even harmful advice to people with eating disorders, such as recommending that they count calories and strive for a deficit of up to 1,000 calories per day, among other "tips," according to critics.

"Every single thing Tessa suggested were things that led to the development of my eating disorder," wrote Sharon Maxwell, who describes herself as a weight inclusive consultant and fat activist, on Instagram. "This robot causes harm."

In an Instagram post on Wednesday, NEDA announced it was shutting down Tessa, at least temporarily. 

"It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program," the group said in a statement. "We are investigating this immediately and have taken down that program until further notice for a complete investigation."

The statement didn't note whether NEDA would replace Tessa with a human-operated helpline. The organization didn't immediately return a request for comment. 

The NEDA helpline workers who had been fired and replaced by Tessa wrote on May 26 that they were disappointed in the decision to replace them with AI technology. "A chatbot is no substitute for human empathy," they said in a tweet, adding that the decision would harm people with eating disorders. 

Other advice that Tessa provided included recommending purchasing skin calipers to test body composition, even suggesting where to buy the calipers, Maxwell wrote. The chatbot "pointed out that society has unrealistic beauty standards, while she gave me dieting advice," she said.

Read the Source
