Summary: OpenAI's AI-powered transcription tool Whisper, used to translate and transcribe audio content such as patient consultations with doctors, is advertised as having near "human level robustness and accuracy." However, software engineers, developers, and academic researchers have alleged that it is prone to making up chunks of text or even entire sentences, and that some of these hallucinations can include racial commentary, violent rhetoric, and even imagined medical treatments.
Alleged: An AI system developed and provided by OpenAI, which harmed patients reliant on Whisper and medical practitioners reliant on Whisper.