AI Incident Database

Report 3051

Associated Incidents

Incident 541 · 58 Reports
ChatGPT Reportedly Produced False Court Case Law Presented by Legal Counsel in Court

New York Lawyer Caught Using ChatGPT After Citing Non-Existent Cases
greekreporter.com · 2023

A lawyer from New York is facing a court hearing after his law firm used the AI tool ChatGPT to conduct legal research.

The judge overseeing the case said the court was confronted with an 'unprecedented circumstance' when it was discovered that the legal cases referenced in the filing were non-existent.

During the court proceedings, the lawyer who employed the AI tool stated that he was unaware of the possibility that the information it generated could be untrue.

ChatGPT has the capability to generate original text upon request, but it comes with a cautionary note that it may sometimes provide inaccurate information.

In the original case, a man filed a lawsuit against an airline, claiming personal injury. His legal team submitted a brief that referenced several past court cases in an effort to establish a precedent and justify why the case should proceed.

However, the opposing lawyers representing the airline later wrote to the judge, expressing their inability to locate some of the cases mentioned in the brief.

Judge Castel wrote in an order, "Six of the submitted cases appear to be bogus judicial decisions with bogus quotes and bogus internal citations." The judge also demanded that the man's legal team explain itself.

Non-existent cases

As acknowledged in an affidavit, the court determined that the following cases were nonexistent:

1. Varghese v. China Southern Airlines Co. Ltd., 925 F.3d 1339 (11th Cir. 2019)

2. Shaboon v. Egyptair, 2013 IL App (1st) 111279-U (Ill. App. Ct. 2013)

3. Petersen v. Iran Air, 905 F. Supp. 2d 121 (D.D.C. 2012)

4. Martinez v. Delta Airlines, Inc., 2019 WL 4639462 (Tex. App. Sept. 25, 2019)

5. Estate of Durden v. KLM Royal Dutch Airlines, 2017 WL 2418825 (Ga. Ct. App. June 5, 2017)

6. Miller v. United Airlines, Inc., 174 F.3d 366 (2d Cir. 1999)

Court demands an explanation

Throughout multiple submissions, it was revealed that the legal research in question had not been conducted by the plaintiff's lawyer, Peter LoDuca, but rather by one of his colleagues at the same law firm.

Steven A. Schwartz, an attorney with over 30 years of experience, used ChatGPT to search for similar prior cases, according to the BBC.

In his written statement, Mr. Schwartz clarified that Mr. LoDuca was not involved in the research process and had no knowledge of how it had been conducted.

Mr. Schwartz expressed deep remorse for relying on the chatbot, admitting that he had never used it for legal research before and was unaware of the potential for inaccurate content.

He pledged never again to use AI to "supplement" his legal research without thoroughly verifying its authenticity.

Both lawyers have been ordered to provide an explanation as to why disciplinary actions should not be taken against them at a hearing scheduled for 8th June.
