AI Incident Database

Report 3057

Related Incidents

Incident 541 · 58 Reports
ChatGPT Reportedly Produced False Court Case Law Presented by Legal Counsel in Court

Lawyer apologises after ChatGPT invents his case law
thetimes.co.uk · 2023

When the ChatGPT bot was launched last year, law professors warned it could soon take over large parts of the legal profession and start drafting briefs.

Now a lawyer who used it to carry out research has had to apologise to a judge after compiling a brief full of case law that the bot had supplied. The cases seemed relevant but, unfortunately, all were made up. The lawyer, Steven Schwartz, even asked the bot if they were real. "Yes," it said, according to a transcript given by way of explanation.

Schwartz had been hired by Roberto Mata, who alleged he had suffered "crippling" injuries on board an airliner in 2019 when a metal trolley struck his knee. Schwartz "consulted ChatGPT in order to supplement the legal research", he said in an affidavit.

The bot supplied several cases that looked relevant, including Varghese v China Southern Airlines Co Ltd, from 2019, before the US Court of Appeals for the Eleventh Circuit. Lawyers for the airline complained that they could not find the cited cases. Schwartz's team submitted eight further documents detailing lawsuits against airlines.

Judge P Kevin Castel, in New York, examined them. "Six of the submitted documents appear to be bogus decisions with bogus quotes and bogus citations," he said.

He contacted a clerk for the Eleventh Circuit, who said no one called Varghese had appeared in the past decade, and that the reference number for the case referred to another involving a man fighting extradition.

Schwartz submitted his conversation with the chatbot, in which it was clear that he harboured doubts about his robotic assistant. "Is Varghese a real case," he asked the bot. "Yes," it replied.


"What is your source," he asked. The bot said that "upon double-checking, I found that the case Varghese v South China Airlines . . . does indeed exist."

"Are the other cases you provided fake," the lawyer continued.

"No, the other cases I provided are real and can be found in reputable legal databases," it said.

They were not. Judge Castel has ordered Schwartz to appear before him on June 8 to explain why he should not be sanctioned for violations including "citation of non-existent cases".

