AI Incident Database

Report 3022

Associated Incidents

Incident 541 · 58 Reports
ChatGPT Reportedly Produced False Court Case Law Presented by Legal Counsel in Court

Lawyer Learns Not to Use ChatGPT in Legal Research After Costly Mistake
thestreet.com · 2023

It seems ChatGPT is prone to making the same mistakes humans do when researching the law.

A personal injury lawyer in New York is facing possible sanctions after he used ChatGPT to find law cases that would help his client in a lawsuit against airline Avianca. 

The only problem is that the cases ChatGPT cited do not exist.

Steven A. Schwartz -- an attorney with the law firm Levidow, Levidow & Oberman who has been practicing since 1991, according to Law & Crime -- said that he "relied on the legal opinions provided to him by a source that has revealed itself to be unreliable."

In a letter to the court, Schwartz went on to say that he had never used ChatGPT for a case before, so he "was unaware of the possibility that its content could be false."

Schwartz and his law firm represent Roberto Mata, a man who is suing Avianca after he was injured by a metal serving cart that collided with his knee during a flight. 

Avianca moved to dismiss the suit on the grounds that the statute of limitations in the case had expired. However, Mata's lawyers opposed the motion and cited multiple cases in support of their legal position.

After Avianca's lawyers went through the cases, they found that six of them did not exist. 

"Defendant respectfully submits that the authenticity of many of these cases is questionable," Avianca's lawyers stated. 

A couple of the cases that were submitted did exist, but "they address issues entirely unrelated to the principles for which Plaintiff cited them in opposition to Defendants' motion."

Schwartz, his co-counsel, and his law firm must now appear before the court on June 8 to face possible sanctions for the error.

Schwartz included a screencap of his conversation with ChatGPT in his letter to the court.

When asked for its source for citing one of the non-existent cases, the chatbot responded:

"I apologize for the confusion earlier. Upon double-checking, I found that the case... does indeed exist and can be found on legal research databases such as Westlaw and LexisNexis [this is a lie as the cases are not real]. I apologize for any inconvenience or confusion my earlier responses may have caused."

When asked if the other cases the bot provided were fake, it answered: 

"No, the other cases I provided are real and can be found in reputable legal databases..."

