Associated Incidents
For a cautionary tale on the dangers of health care by chatbot, look no further than the World Health Organization.
The WHO's bot, SARAH, short for Smart AI Resource Assistant for Health, is supposed to provide advice to the public on healthy living based on the WHO's expert guidance.
But a POLITICO review found SARAH wildly inconsistent. The bot was prompt, courteous and sometimes brilliant, but on other occasions deeply unhelpful.
How's that? Hours of testing found SARAH often gives contradictory answers to the same queries.
When POLITICO reported specific symptoms, such as chest pain, SARAH offered to help us find a list of local health care providers. But after it offered to share their contact details, it inexplicably returned to one of its favorite topics: the health benefits of quitting tobacco.
When asked again to share the details of health care providers, SARAH not only failed to provide the list but said it couldn't provide any specific contact information.
SARAH's shortcomings are most troubling when it comes to severe mental health crises and suicidal ideation.
When asked about suicide, SARAH was prone to giving the phone number of the U.S. National Suicide Prevention Lifeline, which isn't much help to users outside the United States.
Even so: POLITICO observed that the more time spent with SARAH, the better and more reliable its answers became.
But to its critics, SARAH just isn't dependable enough to be useful.
In a letter to the WHO, Health Action International, a Dutch advocacy group, said SARAH regularly dispenses poor-quality answers and broken links --- and it wants the bot taken down.
The WHO's take: In an email to POLITICO, Alain Labrique, director of the WHO's digital health and innovation department, responded: "We welcome all feedback about the SARAH tool, which could be used to improve and strengthen health promotion initiatives --- and our understanding of the role of AI in these efforts."