AI Incident Database

Report 3022

Associated Incidents

Incident 541: 58 Reports
ChatGPT Reportedly Produced False Court Case Law Presented by Legal Counsel in Court

Lawyer Learns Not to Use ChatGPT in Legal Research After Costly Mistake
thestreet.com · 2023

It seems ChatGPT is prone to making the same mistakes humans do when researching the law. 

A personal injury lawyer in New York is facing possible sanctions after he used ChatGPT to find law cases that would help his client in a lawsuit against airline Avianca. 

The only problem is the cases ChatGPT cited do not exist.

Steven A. Schwartz -- an attorney with the law firm Levidow, Levidow & Oberman who has been practicing since 1991, according to Law & Crime -- said that he "relied on the legal opinions provided to him by a source that has revealed itself to be unreliable."

In a letter to the court, Schwartz went on to say that he had never used ChatGPT for a case before, so he "was unaware of the possibility that its content could be false."

Schwartz and his law firm represent Roberto Mata, a man who is suing Avianca after he was injured by a metal serving cart that collided with his knee during a flight. 

Avianca moved to dismiss the suit on the grounds that the statute of limitations in the case had expired. However, Mata's lawyers opposed the motion and cited multiple cases in support of their legal position.

After Avianca's lawyers went through the cases, they found that six of them did not exist. 

"Defendant respectfully submits that the authenticity of many of these cases is questionable," Avianca's lawyers stated. 

A couple of the cases that were submitted did exist, but "they address issues entirely unrelated to the principles for which Plaintiff cited them in opposition to Defendants' motion."

Schwartz, his co-counsel and his law firm now must appear before the court on June 8 to face possible sanctions for the error.

Schwartz posted a screencap of his conversation with ChatGPT in his letter to the court.

When asked for the source of one of the non-existent cases it had cited, the chatbot responded:

"I apologize for the confusion earlier. Upon double-checking, I found that the case... does indeed exist and can be found on legal research databases such as Westlaw and LexisNexis [this is a lie as the cases are not real]. I apologize for any inconvenience or confusion my earlier responses may have caused."

When asked if the other cases the bot provided were fake, it answered: 

"No, the other cases I provided are real and can be found in reputable legal databases..."

Read the Source
