AI-powered health chatbot suspended over harmful diet advice

By Web Desk | June 02, 2023

"The AI bot has been taken down, and it will be investigating reports about the bot's behaviour," says NEDA


The National Eating Disorders Association (NEDA) has suspended Tessa, its AI-powered health and wellness chatbot, after users reported that it offered dieting advice of the kind that can fuel serious eating disorders.

NEDA is a non-profit organisation which supports people suffering from eating disorders.

Two users who received this advice from Tessa shared their experiences with the chatbot on Instagram.

The users said that Tessa gave them advice on how to count calories, recommended they lose 1 to 2 pounds per week and told them to restrict their diets. Experts, including NEDA, say this behaviour is symptomatic of an eating disorder.

The association said that the AI bot has been taken down and that it will be investigating reports about the bot's behaviour.

"When someone in that state goes on to a website like NEDA, which is supposed to provide support for eating disorders, and they are met with advice that’s kind of saying, 'It's OK to restrict certain foods, you should minimize your sugar intake, you should minimise the number of calories that you’re consuming each day, you should exercise more,' it really is giving a green light to engage in the eating disorder behaviours," said Conason, a clinical psychologist and certified eating disorder specialist.

The users also noted that the AI bot continued to recommend they restrict calorie intake even after being told that they had an eating disorder.

According to the American Academy of Family Physicians, for patients already struggling with the stigma around their weight, further encouragement to shed pounds can lead to disordered eating behaviours like bingeing, restricting or purging.

"Every single thing Tessa suggested were things that led to the development of my eating disorder," a user named Sharon Maxwell wrote about a detailed interaction with the bot.

"If I had accessed this chatbot when I was in the throes of my eating disorder, I would not have gotten help."

NEDA CEO Liz Thompson said in a statement that the advice the chatbot shared "is against our policies and core beliefs as an eating disorder organisation".

NEDA had planned to close its human-run helpline on June 1, dismissing the staff members and volunteers who had maintained the information and treatment options line.

Estimates suggest that nearly 10% of US citizens will be diagnosed with an eating disorder in their lifetime, a condition that often remains hidden and whose treatment can be expensive.

Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University's medical school, and her team worked to create a cognitive-behavioural tool to help people suffering from eating disorders.

She told BBC News that the chatbot she designed was based on interventions that have been shown to be effective in reducing eating disorders and related behaviours.

"It was never intended to be a replacement for the helpline. It was an entirely different service."

She handed the programme over to NEDA and a tech company last year to deploy it for clients, and she believes a "bug" or flaw was later introduced into her original design to make the algorithm function more like recent AI tools such as ChatGPT.

"Our study absolutely never had that feature. It is not the programme that we developed, tested and have shown to be effective," she said.


