AI Incident Database

Report 6219

Associated Incidents

Incident 1211 · 1 Report
Google AI Overviews Reportedly Misrepresented Pizza Specials at Stefanina's in Wentzville, Missouri

‘Please do not use Google AI to find out our specials,’ Wentzville restaurant asks patrons
firstalert4.com · 2025

WENTZVILLE, Mo. (First Alert 4) - A popular local restaurant says an artificial intelligence tool designed to enhance Google search results is causing confusion for customers.

Stefanina's Wentzville warned patrons that search results included misleading or downright false information.

The post read:

"Please do not use Google AI to find out our specials. Please go on our Facebook page or our website. Google AI is not accurate and is telling people specials that do not exist which is causing angry customers yelling at our employees. We cannot control what Google posts or says and we will not honor the Google AI specials."

Eva Gannon, part of the family that owns the restaurant, said the issue was that customers had been learning about deals or offers at Stefanina's that weren't available.

One example she shared was an AI Overview showing that Stefanina's would offer a large pizza for the same price as a small. She said certain menu items generated by the tool were also incorrect.

"It's coming back on us," Gannon said. "As a small business, we can't honor a Google AI special."

Google did not respond to FirstAlert4's messages about the situation. But the company's online guide to using AI warns that results may sometimes be inaccurate due to the tool making mistakes or misunderstanding the data it's searching through.

Jonathan Hanahan, a Washington University professor who uses and teaches about artificial intelligence, described it as a tool that, while useful, still requires skepticism from users and a willingness to fact-check.

"There's a big difference between using AI and working with AI," he said.

Hanahan used a golf scoring app he's been working on as an example. Using AI, he can quickly incorporate course information that might normally take him hours to gather. But for the app to be effective, he still needs to cross-check the information the software is gathering.

He said that an issue with using AI to search for information is that certain wording in the prompt may skew information, risking confirmation bias.

"It will sometimes take liberties to get you what you're looking for," he explained.

He said the takeaway for users should be to think carefully about the kinds of questions and prompts they're using to gather information, and to avoid what he described as a binary relationship.

"Think about how are you asking prompts, how are you asking questions," he said. "Thinking about the greater conversation and not just the things you're expecting to find out."

