AI Incident Database

Report 4828

Associated Incidents

Incident 960 · 11 Reports
Plaintiffs' Lawyers Admit AI Generated Erroneous Case Citations in Federal Court Filing Against Walmart

Lawyers admit AI "hallucinated" cases in motion filed against Walmart
5newsonline.com · 2025

WYOMING, USA — Lawyers representing a Wyoming family in a lawsuit against Walmart and Jetson Electric Bikes admitted to using artificial intelligence to generate a pretrial motion after a federal judge said nearly all the cases they cited were nonexistent, court records show. 

Judge Kelly Rankin said that the language of the plaintiffs' pretrial motion, along with its citation of eight unidentifiable cases, led her to believe that AI had been used to write the court documents. Defense attorneys suspected the motion had been written with ChatGPT after discovering some of the fake cases on the AI platform, the filing says.

On Feb. 6, the judge ordered the attorneys — Rudwin Ayala and T. Michael Morgan from Morgan & Morgan and Taly Goody from Goody Law Group — to explain how the motion was created and why they shouldn’t be disciplined. In their response on Feb. 10, the attorneys admitted the cases presented "were not legitimate" and were "hallucinated" by the firm's internal AI platform. 

"This matter comes with great embarrassment and has prompted discussion and action regarding the training, implementation, and future use of artificial intelligence within our firm," the filing says. "This serves as a cautionary tale for our firm and all firms, as we enter this new age of artificial intelligence."

The case stems from a lawsuit filed by a Wyoming family in July 2023, claiming that a hoverboard they bought from Walmart malfunctioned and caused a fire that destroyed their home in February 2022. The family, including four children, says they suffered severe burns and emotional trauma. They accuse Walmart and Jetson of negligence and breach of warranty for selling a defective product that wasn’t as safe as advertised.

