People seeking medical advice
Affected by Incidents
Incident 685 · 1 Report
The WHO's S.A.R.A.H. Bot Reported to Provide Inconsistent and Inadequate Health Information
2024-04-24
The WHO's AI-powered health advisor, S.A.R.A.H. (Smart AI Resource Assistant for Health), is alleged to provide inconsistent and inadequate health information. The bot reportedly gives contradictory responses to the same queries, fails to offer specific contact details for healthcare providers, and inadequately handles severe mental health crises, often giving irrelevant or unhelpful advice.
Incident 838 · 1 Report
Microsoft Copilot Allegedly Provides Unsafe Medical Advice with High Risk of Severe Harm
2024-04-25
When asked medical questions, Microsoft Copilot was reportedly found to provide accurate information only 54% of the time, according to European researchers (citation provided in editor's notes). The researchers' analysis reported that 42% of Copilot's responses could cause moderate to severe harm, and that 22% posed a risk of death or severe injury.
Incident 1317 · 1 Report
Purported Deepfake Impersonation of Elon Musk Used to Promote Fraudulent '17-Hour' Diabetes Treatment Claims
2025-12-27
A purported deepfake video reportedly circulated online, falsely depicting Elon Musk endorsing a nonexistent "17-hour" diabetes cure. The video promoted unverified health claims and appears to have been part of a scam ecosystem exploiting Musk's public credibility. Rapper Boosie Badazz reportedly encountered and amplified the video before its falsity was identified.