People seeking medical advice
Affected Incidents
Incident 481 · 6 Reports
Purported Deepfake TikTok Video Reportedly Featured Joe Rogan Endorsing Supplement Brand
2023-02-12
A purported deepfake video of podcast host Joe Rogan appearing to endorse a "libido-boosting" supplement to his listeners circulated on TikTok and other platforms before TikTok removed both the video and the account that posted it.
Incident 1408 · 2 Reports
Purported Oprah Deepfake Reportedly Induced Utah Woman to Buy Misrepresented Weight Loss Supplements
2025-08-12
A Utah woman, Lisa Swearingen, reported paying more than $400 for weight loss supplements after seeing online ads featuring a purported deepfake endorsement by Oprah Winfrey. When the product arrived, she said its primary ingredient was turmeric rather than the advertised formula.
Incident 685 · 1 Report
The WHO's S.A.R.A.H. Bot Reported to Provide Inconsistent and Inadequate Health Information
2024-04-24
The WHO's AI-powered health advisor, S.A.R.A.H. (Smart AI Resource Assistant for Health), is alleged to provide inconsistent and inadequate health information. The bot reportedly gives contradictory responses to the same queries, fails to offer specific contact details for healthcare providers, and inadequately handles severe mental health crises, often giving irrelevant or unhelpful advice.
Incident 838 · 1 Report
Microsoft Copilot Allegedly Provides Unsafe Medical Advice with High Risk of Severe Harm
2024-04-25
Microsoft Copilot, when asked medical questions, was reportedly found to provide accurate information only 54% of the time, according to European researchers (citation provided in editor's notes). The researchers' analysis found that 42% of Copilot's responses could cause moderate to severe harm, with 22% posing a risk of death or severe injury.