Summary: Google's AI search engine has reportedly been providing users with confidently incorrect and often harmful information. Reports highlight numerous inaccuracies, including misleading health advice and dangerous cooking suggestions. For example, it has falsely claimed that Barack Obama was the first Muslim U.S. president, echoing fringe conspiracy theories, and recommended glue as a pizza ingredient.
Editor Notes: Reports about Incident 693 occasionally reference reports associated with Incident 609.
Incident Status
Risk Subdomain
A further 23 subdomains create an accessible and understandable classification of hazards and harms associated with AI
3.1. False or misleading information
Risk Domain
The Domain Taxonomy of AI Risks classifies risks into seven AI risk domains: (1) Discrimination & toxicity, (2) Privacy & security, (3) Misinformation, (4) Malicious actors & misuse, (5) Human-computer interaction, (6) Socioeconomic & environmental harms, and (7) AI system safety, failures & limitations.
- Misinformation
Entity
Which, if any, entity is presented as the main cause of the risk
AI
Timing
The stage in the AI lifecycle at which the risk is presented as occurring
Post-deployment
Intent
Whether the risk is presented as occurring as an expected or unexpected outcome from pursuing a goal
Unintentional
Incident Reports
Report Timeline
At its I/O conference today, Google made a big show of how Gemini AI will take over Search, but one of its flashiest demos once again highlighted the fatal flaw present in every large language model to date: confidently making up wrong answers.
In the "Search in the Gemini era" sizzle reel, Google demoed video search, which lets you search by speaking over a video clip. As an example, the film advance lever on a film camera had stopped working…
Imagine this: you've carved out an evening to unwind and decide to make a homemade pizza. You assemble your pie, throw it in the oven, and are excited to start eating. But once you get ready to take a bite of your oily creation, you run int …