People seeking medical advice
Incidents Harmed By
Incident 685 · 1 Report
The WHO's S.A.R.A.H. Bot Reported to Provide Inconsistent and Inadequate Health Information
2024-04-24
The WHO's AI-powered health advisor, S.A.R.A.H. (Smart AI Resource Assistant for Health), is alleged to provide inconsistent and inadequate health information. The bot reportedly gives contradictory responses to the same queries, fails to offer specific contact details for healthcare providers, and inadequately handles severe mental health crises, often giving irrelevant or unhelpful advice.
Incident 838 · 1 Report
Microsoft Copilot Allegedly Provides Unsafe Medical Advice with High Risk of Severe Harm
2024-04-25
Microsoft Copilot, when asked medical questions, reportedly provided accurate information only 54% of the time, according to European researchers (citation provided in editor's notes). The researchers' analysis indicated that 42% of Copilot's responses could cause moderate to severe harm, with 22% posing a risk of death or severe injury.