AI Incident Database

Report 3036

Associated Incidents

Incident 541 · 58 Reports
ChatGPT Reportedly Produced False Court Case Law Presented by Legal Counsel in Court

Lawyer Uses Fake ChatGPT Cases during Hearing, Gets Slapped down by Judge
winbuzzer.com · 2023

ChatGPT is a powerful language model that can generate realistic text on a wide range of topics. It can also invent fictional texts, such as stories, poems, and even legal cases. A lawyer named Steven Schwartz relied on exactly such inventions while representing a client who had filed a personal injury lawsuit against an airline.

Simon Willison reports that Schwartz used ChatGPT to generate examples of cases supporting his argument that the airline's bankruptcy did not affect the two-year limitation period for filing the lawsuit. He cited these cases in his court filings without verifying that they actually existed, and even included screenshots of ChatGPT's responses as evidence.

His deception was soon exposed when the judge asked him to provide copies of the opinions in the cases he had cited. Schwartz turned to ChatGPT again and asked it to generate the full details of those cases, then filed the output as attachments to his documents. He also asked ChatGPT to confirm that the cases were real, and ChatGPT said they were. He included screenshots of this conversation as well.

The judge was not amused by this blatant fabrication and misuse of ChatGPT. He ordered Schwartz to show cause why he should not be sanctioned for his misconduct, and referred the matter to the disciplinary committee of the bar association and the US attorney's office for possible criminal prosecution.

The case is Mata v. Avianca, Inc. (1:22-cv-01461), and it is still pending in the US District Court for the Southern District of New York.

