AI Incident Database

Report 5295

Associated Incidents

Incident 1074 · 3 Reports
Citation Errors in Concord Music v. Anthropic Attributed to Claude AI Use by Defense Counsel

Anthropic's lawyers take blame for AI 'hallucination' in music publishers' lawsuit
reuters.com · 2025

An attorney defending artificial-intelligence company Anthropic in a copyright lawsuit over music lyrics told a California federal judge on Thursday that her law firm Latham & Watkins was responsible for an incorrect footnote in an expert report caused by an AI "hallucination."

Ivana Dukanovic said in a court filing that the expert had relied on a legitimate academic journal article, but Dukanovic created a citation for it using Anthropic's chatbot Claude, which made up a fake title and authors in what the attorney called "an embarrassing and unintentional mistake."

"Unfortunately, although providing the correct publication title, publication year, and link to the provided source, the returned citation included an inaccurate title and incorrect authors," Dukanovic said.

The lawsuit from music publishers Universal Music Group (UMG.AS), Concord and ABKCO over Anthropic's alleged misuse of their lyrics to train Claude is one of several high-stakes disputes between copyright owners and tech companies over the use of their work to train AI systems.

The publishers' attorney Matt Oppenheim of Oppenheim + Zebrak told the court during a hearing on Tuesday that Anthropic data scientist Olivia Chen may have used an AI-fabricated source to bolster the company's argument in a dispute over evidence.

U.S. Magistrate Judge Susan van Keulen said at the hearing that the allegation raised "a very serious and grave issue," and that there was "a world of difference between a missed citation and a hallucination generated by AI."

Dukanovic responded on Thursday that Chen had cited a real article from the journal The American Statistician that supported her argument, but the attorneys had missed that Claude introduced an incorrect title and authors.

A spokesperson for the plaintiffs declined to comment on the new filing. Dukanovic and a spokesperson for Anthropic did not immediately respond to requests for comment.

Several attorneys have been criticized or sanctioned by courts in recent months for mistakenly citing nonexistent cases and other incorrect information hallucinated by AI in their filings.

Dukanovic said in Thursday's court filing that Latham had implemented "multiple levels of additional review to work to ensure that this does not occur again."

The case is Concord Music Group Inc v. Anthropic PBC, U.S. District Court for the Northern District of California, No. 5:24-cv-03811.

For the music publishers: Matt Oppenheim of Oppenheim + Zebrak

For Anthropic: Sy Damle of Latham & Watkins

Read more:

Music publishers sue AI company Anthropic over song lyrics

Anthropic reaches deal on AI 'guardrails' in lawsuit over music lyrics

Anthropic expert accused of using AI-fabricated source in copyright case
