AI Incident Database

Report 3113

Related Incidents

Incident 545 · 46 Reports
Chatbot Tessa gives unauthorized diet advice to users seeking help for eating disorders

National Eating Disorder Association shuts down A.I. chatbot it planned to use to replace humans, saying it ‘may have given’ harmful information
fortune.com · 2023

Less than a week after it announced plans to replace its human helpline staff with an A.I. chatbot named Tessa, the National Eating Disorder Association (NEDA) has taken the technology offline.

“It came to our attention [Monday] night that the current version of the Tessa Chatbot…may have given information that was harmful,” NEDA said in an Instagram post. “We are investigating this immediately and have taken down that program until further notice for a complete investigation.”

The chatbot was set to completely replace human associates on the organization’s hotline on June 1. It’s unclear how the organization plans to staff that helpline at this point.

The problems with Tessa were made public by an activist named Sharon Maxwell, who said: “Every single thing Tessa suggested were things that led to the development of my eating disorder.” NEDA officials initially called those claims a lie in a social media post, but deleted it after Maxwell sent screenshots of the interaction, she said.

Alexis Conason, a psychologist who specializes in treating eating disorders, was able to re-create the issues, posting screenshots of a conversation with the chatbot on Instagram.

“Imagine vulnerable people with eating disorders reaching out to a robot for support because that’s all they have available and receiving responses that further promote the eating disorder,” she wrote.

NEDA introduced Tessa after the hotline staff decided to unionize, following a slew of pandemic-era calls that led to mass staff burnout. The six paid employees oversaw a volunteer staff of roughly 200 people, who handled calls (sometimes multiple ones) from nearly 70,000 people last year.

NEDA officials told NPR the decision had nothing to do with the unionization. Instead, said Vice President Lauren Smolar, the growing call volume and largely volunteer staff were creating more legal liability for the organization, and wait times for people who needed help were increasing. Former workers, however, called the move blatantly anti-union.

The creator of Tessa says the chatbot, which was specifically designed for NEDA, isn’t as advanced as ChatGPT. Instead, it’s programmed with a limited number of responses meant to help people learn strategies to avoid eating disorders.

“It’s not an open-ended tool for you to talk to and feel like you’re just going to have access to kind of a listening ear, maybe like the helpline was,” Dr. Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University’s medical school who helped design Tessa, told NPR.

