AI Incident Database

Report 5482

Associated Incidents

Incident 1138 · 4 Reports
South African Legal Team Reportedly Relied on Unverified ChatGPT Case Law in Johannesburg Body Corporate Defamation Matter

South African court calls out lawyers for using ChatGPT references
techpoint.africa · 2023

AIID editor's note: Please visit the original source for the full article. What has been excerpted here is the relevant subsection.

Lawyers called out for using ChatGPT

A woman's lawyers have been chastised by a South African court, the Johannesburg Regional Court, for citing phoney references generated by ChatGPT.

Story time: The woman sued a body corporate located in Parkwood, Johannesburg for defamation.

While the defendant's lawyers argued that the organisation could not be sued for defamation, her lawyer, Michelle Parker, disagreed, saying that earlier rulings had already addressed the matter; her team had simply been unable to retrieve them due to time constraints.

Consequently, Magistrate Arvin Chaitram postponed the case until late May 2023 to give both parties enough time to gather the evidence to prove their cases.

Over the following weeks, her lawyers attempted to track down the judgments they had been referring to.

ChatGPT supplied the names of real cases, but the accompanying citations referred to entirely different cases. Moreover, none of them applied to defamation suits between body corporates and individuals.

In his ruling, Chaitram said the names, citations, facts, and decisions in the cases presented by the plaintiff's lawyers were fictitious. He then ordered the woman to pay costs to the other side.

In Chaitram's opinion, the efficiency of modern technology must be supplemented with good old-fashioned independent reading when it comes to legal research.

In related news, in June 2023, two New York-based lawyers were fined $5,000 for citing non-existent cases generated by ChatGPT.

