The National Eating Disorders Association (NEDA) is letting go of hundreds of helpline staffers and volunteers after they voted to unionize, and many of those positions will be replaced by a chatbot named Tessa. Cheddar's Ashley Mastronardi spoke to one eating disorder expert who thinks the move may have harmful consequences.

Update: The chatbot was taken down after reports that it was offering harmful advice. In response, NEDA sent Cheddar News the following statement:

The Tessa chatbot was taken down over the weekend after it came to our attention that it provided "off-script" language. This was not how the chatbot was programmed, and X2AI/Cass' arrangement was to run the Body Positive program with zero opportunity for generative programming. We now know that over the weekend the chatbot was hacked and somehow was able to go off the pre-approved programmed responses. We will not be putting Tessa back on our website until we are confident this is not a possibility again.

Tessa has been available on our site since February 2022 and has had incredibly positive outcomes both in testing it before we launched on our website, as well as during the last year it has been available to NEDA users. Right now, the current program runs the Body Positive program for individuals at risk for an eating disorder - it is not a replacement for treatment and was never intended to be. It is designed to fill a gap for individuals with shape and weight concerns interested in tools before their thoughts and behaviors may progress to an eating disorder and need traditional professional interventions.
