AI Incident Database

Incident 1145: MyPillow Defense Lawyers in Coomer v. Lindell Reportedly Sanctioned for Filing Court Document Allegedly Containing AI-Generated Legal Citations

Description: In February 2025, lawyers Christopher I. Kachouroff and Jennifer T. DeMaster, representing Mike Lindell, reportedly used generative AI to draft a court brief that contained nearly 30 defective or fabricated citations. The error-filled filing violated federal court rules requiring factual and legal accuracy. The judge fined both lawyers $3,000 each, citing either the improper use of AI or gross carelessness as the cause of the misleading legal content.
Editor Notes: Timeline note: This incident is dated 2025-02-25 because that is the date on which Mike Lindell's defense counsel filed the Brief in Opposition to Plaintiff's Motion in Limine containing nearly 30 reportedly defective citations allegedly produced or contaminated by generative artificial intelligence. While the underlying Motion in Limine was filed by the plaintiff on February 10, the relevant AI-related conduct reportedly occurred when the error-filled opposition was submitted on February 25. See this court document: https://storage.courtlistener.com/recap/gov.uscourts.cod.215068/gov.uscourts.cod.215068.309.0.pdf.


Entities

Alleged: Unnamed large language model developer developed an AI system deployed by Jennifer T. DeMaster, Christopher I. Kachouroff, and McSweeny Synkar and Kachouroff PLLC, which harmed Jennifer T. DeMaster, Christopher I. Kachouroff, McSweeny Synkar and Kachouroff PLLC, MyPillow, Mike Lindell, Legal integrity, Judicial integrity, and Epistemic integrity.
Alleged implicated AI system: Unnamed large language model

Incident Stats

Incident ID: 1145
Report Count: 5
Incident Date: 2025-02-25
Editors: Daniel Atherton

Incident Reports

MyPillow CEO Mike Lindell’s legal team accused of submitting inaccurate, AI-generated brief to Colorado court
kdvr.com · 2025

DENVER --- MyPillow CEO Mike Lindell and his legal team have to explain themselves to a federal judge in Colorado after she discovered a recent brief they submitted pointed to fake court cases as evidence. 

According to court documents, fed…

MyPillow CEO Mike Lindell's lawyers use AI for court filing, push to move defamation trial
usatoday.com · 2025

Lawyers representing MyPillow CEO Mike Lindell have asked a judge to postpone the defamation trial against him after mistakenly filing a document made with artificial intelligence.

In an April 28 filing with the U.S. District Court for Colo…

Judge Fines Lawyers for MyPillow Founder for Error-Filled Court Filing
nytimes.com · 2025

A federal judge has sent another message to lawyers who may be tempted to use generative artificial intelligence: Always check your work.

In a decision issued on Monday, Judge Nina Y. Wang of the U.S. District Court for the District of Colo…

MyPillow CEO Mike Lindell's lawyers fined for AI-generated court filing
usatoday.com · 2025

A federal judge has ordered the attorneys for MyPillow founder Mike Lindell to pay fines for using artificial intelligence to prepare court documents that contained several errors, including citations to nonexistent cases and misquotations …

MyPillow CEO Mike Lindell’s attorneys fined for inaccurate, AI-generated brief
thehill.com · 2025

DENVER (KDVR) -- Two attorneys who were representing MyPillow CEO Mike Lindell in a defamation case in Denver are facing thousands of dollars in fines for submitting an inaccurate, AI-generated brief to the court in April.

The McSweeny Synk…

Variants

A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.

Similar Incidents

By textual similarity

  • Defamation via AutoComplete (Apr 2011 · 28 reports)
  • Northpointe Risk Models (May 2016 · 15 reports)
  • COMPAS Algorithm Performs Poorly in Crime Recidivism Prediction (May 2016 · 22 reports)
