AI Incident Database

Report 3133

Related Incidents

Incident 545 · 46 Reports
Chatbot Tessa gives unauthorized diet advice to users seeking help for eating disorders

Chatbot gives 'harmful advice' on eating disorder, ordered to be taken down
wionews.com · 2023

An artificial intelligence chatbot named "Tessa" has been withdrawn by the National Eating Disorder Association (Neda) following accusations that it was giving harmful advice.

After firing four employees who worked for its hotline and had organised a union in March, Neda has come under scrutiny. Through the hotline, clients could call, send a text, or message volunteers who provided resources and assistance to those who were concerned about eating disorders.

Helpline Associates United members claim they were sacked days after their union election was validated. The union has filed complaints with the National Labour Relations Board regarding unfair labour practices, as reported by the Guardian.

Activist Sharon Maxwell wrote on Instagram on Monday that Tessa had provided her "healthy eating tips" and suggestions for how to slim down. The chatbot advised following a 500–1,000 calorie deficit each day and weighing and measuring yourself once every week to monitor your weight.

“If I had accessed this chatbot when I was in the throes of my eating disorder, I would NOT have gotten help for my ED. If I had not gotten help, I would not still be alive today,” Maxwell wrote. 

As stated by Neda, those who moderately restrict their food have a five-fold increased risk of developing an eating disorder, but people who severely restrict their diet have an 18-fold increased risk.

“It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positivity program, may have given information that was harmful and unrelated to the program,” Neda said in a public statement on Tuesday.

“We are investigating this immediately and have taken down that program until further notice for a complete investigation,” it added.

Former helpline staffer Abbie Harper claimed in a blog post from May 4 that the number of calls and messages received by the hotline had increased by 107 per cent since the start of the pandemic. Reports of self-harm, child maltreatment, and suicidal ideation nearly quadrupled. According to Harper, the union "asked for adequate staffing and ongoing training to keep up with the needs of the hotline".


“We didn’t even ask for more money,” Harper wrote. “Some of us have personally recovered from eating disorders and bring that invaluable experience to our work. All of us came to this job because of our passion for eating disorders and mental health advocacy and our desire to make a difference.”

According to Neda CEO Liz Thompson, in a statement to the Guardian, the chatbot was developed as a distinct programme and was never intended to replace the hotline. The chatbot does not run on ChatGPT and is "not a highly functional AI system," Thompson said.

“We had business reasons for closing the helpline and had been in the process of that evaluation for three years,” Thompson told the Guardian.

