AI Incident Database

Report 3045

Associated Incidents

Incident 541 · 58 Reports
ChatGPT Reportedly Produced False Court Case Law Presented by Legal Counsel in Court

Judge finds out why brief cited nonexistent cases—ChatGPT did research
abajournal.com · 2023

A federal judge in New York City has ordered two lawyers and their law firm to show cause why they shouldn't be sanctioned for submitting a brief with citations to fake cases, thanks to research by ChatGPT.

Senior U.S. District Judge P. Kevin Castel of the Southern District of New York said in a May 4 order the firm's legal filing was "replete with citations to nonexistent cases."

When Castel ordered one of the lawyers to submit an affidavit with the cited opinions, he complied, but six of the decisions "appear to be bogus" with "bogus quotes and bogus internal citations," Castel said.

The fake cases were provided by ChatGPT, according to a May 25 affidavit by lawyer Steven A. Schwartz of Levidow, Levidow & Oberman. He has been practicing law in New York for more than 30 years.

"Affiant has never utilized ChatGPT as a source for conducting legal research prior to this occurrence and therefore was unaware of the possibility that its content could be false," Schwartz wrote.

ChatGPT had assured Schwartz that the cases that it cited were real "and can be found in reputable legal databases, such as LexisNexis and Westlaw," according to queries and answers Schwartz submitted to the court.

Another lawyer who signed Schwartz's brief, Peter LoDuca, was not aware of Schwartz's research method, Schwartz said. LoDuca became attorney of record after the case was removed to the Southern District of New York, where Schwartz has not been admitted to practice.

The show cause hearing is scheduled for June 8, according to a May 26 order by Castel.

Publications covering the case include the New York Times and the Volokh Conspiracy (here and here), which link to a case page on CourtListener.

Schwartz did not immediately reply to the ABA Journal's request for comment, which was sent by email and voicemail. LoDuca told the ABA Journal that he doesn't have any comment at this time.

Schwartz and LoDuca represent the plaintiff Roberto Mata in a lawsuit against airline Avianca Inc. Mata said he was injured when he was struck by a metal serving cart.

"The real-life case of Roberto Mata v. Avianca Inc. shows that white-collar professions may have at least a little time left before the robots take over," according to the New York Times.

The Volokh Conspiracy pointed out that some litigants representing themselves are also using ChatGPT.

Read Source
