Incident 693: Google AI Reportedly Delivering Confidently Incorrect and Harmful Information

Responded
Description: Google's AI search engine has reportedly been providing users with confidently incorrect and often harmful information. Reports highlight numerous inaccuracies, including misleading health advice and dangerous cooking suggestions. For example, it has falsely claimed Barack Obama was the first Muslim U.S. President, reflecting fringe conspiracy theories, and has recommended glue as a pizza ingredient.
Editor Notes: Reports about Incident 693 occasionally reference reports associated with Incident 609.

Alleged: Google developed and deployed an AI system, which harmed Google users and the general public.

Incident Stats

Incident ID: 693
Report Count: 7
Incident Date: 2024-05-14
Editors: Daniel Atherton
Google’s Gemini video search makes factual error in demo
theverge.com · 2024

Google made a lot of noise about its Gemini AI taking over search at its I/O conference today, but one of its flashiest demos was once again marked by the ever-present fatal flaw of every large language model to date: confidently making up …

Google promised a better search experience — now it’s telling us to put glue on our pizza
theverge.com · 2024

Imagine this: you've carved out an evening to unwind and decide to make a homemade pizza. You assemble your pie, throw it in the oven, and are excited to start eating. But once you get ready to take a bite of your oily creation, you run int…

Google's AI search feature suggested using glue to keep cheese sticking to a pizza
businessinsider.com · 2024

Google's new search feature, AI Overviews, seems to be going awry.

The tool, which gives AI-generated summaries of search results, appeared to instruct a user to put glue on pizza when they searched "cheese not sticking to pizza."

A screens…

Google’s AI Is Churning Out a Deluge of Completely Inaccurate, Totally Confident Garbage
futurism.com · 2024

Google's AI search, which swallows up web results and delivers them to users in a regurgitated package, delivers each of its AI-paraphrased answers to user queries in a concise, coolly confident tone. Just one tiny problem: it's wrong. A lo…

Why Google’s AI might recommend you mix glue into your pizza
washingtonpost.com · 2024

You probably have a sense that new forms of artificial intelligence can be dumb as rocks.

Hilariously wrong information from Google's new AI is showing you just how dumb.

In search results, Google's AI recently suggested mixing glue into pi…

Google’s A.I. Search Errors Cause a Furor Online
nytimes.com · 2024

Last week, Google unveiled its biggest change to search in years, showcasing new artificial intelligence capabilities that answer people's questions in the company's attempt to catch up to rivals Microsoft and OpenAI.

The new technology has…

Google Rolls Back A.I. Search Feature After Flubs and Flaws
nytimes.com · 2024
Nico Grant post-incident response

When Sundar Pichai, Google's chief executive, introduced a generative artificial intelligence feature for the company's search engine last month, he and his colleagues demonstrated the new capability with six text-based queries that the pub…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.