Associated Incidents

An artificial intelligence chatbot named "Tessa" has been withdrawn by the National Eating Disorder Association (Neda) following accusations that it was giving harmful advice.
Neda has come under scrutiny since March, when it fired four employees who worked for its hotline and had organised a union. Through the hotline, people concerned about eating disorders could call, text, or message volunteers who provided resources and assistance.
Members of Helpline Associates United claim they were sacked days after their union election was certified. The union has filed complaints alleging unfair labour practices with the National Labour Relations Board, as reported by the Guardian.
Activist Sharon Maxwell wrote on Instagram on Monday that Tessa had provided her "healthy eating tips" and suggestions for how to slim down. The chatbot advised following a 500–1,000 calorie deficit each day and weighing and measuring yourself once every week to monitor your weight.
“If I had accessed this chatbot when I was in the throes of my eating disorder, I would NOT have gotten help for my ED. If I had not gotten help, I would not still be alive today,” Maxwell wrote.
As stated by Neda, those who moderately restrict their food have a five-fold increased risk of developing an eating disorder, but people who severely restrict their diet have an 18-fold increased risk.
“It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positivity program, may have given information that was harmful and unrelated to the program,” Neda said in a public statement on Tuesday.
“We are investigating this immediately and have taken down that program until further notice for a complete investigation,” it added.
Former helpline staffer Abbie Harper wrote in a blog post on May 4 that the number of calls and messages received by the hotline had increased by 107 per cent since the start of the pandemic, and that reports of self-harm, child maltreatment, and suicidal ideation had nearly quadrupled. According to Harper, the union "asked for adequate staffing and ongoing training to keep up with the needs of the hotline".
“We didn’t even ask for more money,” Harper wrote. “Some of us have personally recovered from eating disorders and bring that invaluable experience to our work. All of us came to this job because of our passion for eating disorders and mental health advocacy and our desire to make a difference.”
The chatbot was developed as a distinct programme and was not intended to replace the hotline, Neda CEO Liz Thompson said in a statement to the Guardian. According to Thompson, the chatbot does not run on ChatGPT and is "not a highly functional AI system".
“We had business reasons for closing the helpline and had been in the process of that evaluation for three years,” Thompson told the Guardian.