AI Incident Database

Incident 1071: Student Reportedly Files Complaint Over Professor’s Undisclosed Use of Generative AI at Northeastern University

Description: A student at Northeastern University reportedly filed a complaint after discovering that a professor had used generative AI tools, including ChatGPT, to produce course materials despite university policies discouraging undisclosed AI use. The student alleged hypocrisy, citing a syllabus that barred unauthorized AI use by students. The professor acknowledged having used AI-generated materials and expressed regret over not reviewing them more carefully. No tuition refund was reportedly granted.
Editor Notes: This incident record includes the name of the professor because he spoke on the record to The New York Times and offered a public reflection on the event. His comments were framed as part of an ongoing learning process around AI use in higher education. Naming him here acknowledges his own contribution to that process and situates the episode within a society-wide moment of institutional and pedagogical transition.

Timeline note: The key events in this incident span from February 2025, when the student reportedly discovered her professor's use of generative AI in course materials, to May 2025, when she received the university's formal response denying her tuition reimbursement request. The New York Times report was published on May 14, 2025, and reflects developments through the end of the academic semester, including retrospective comments from the professor and updates to institutional AI policy. This incident record takes the publication date of that report as its incident date.


Entities

Alleged: OpenAI, Perplexity, and Gamma.app developed an AI system deployed by Rick Arrowood, which harmed Rick Arrowood, Ella Stapleton, Northeastern University students, and Northeastern University.
Alleged implicated AI systems: ChatGPT, Perplexity AI, and Gamma

Incident Stats

Incident ID: 1071
Report Count: 1
Incident Date: 2025-05-14
Editors: Daniel Atherton

Incident Reports

The Professors Are Using ChatGPT, and Some Students Aren’t Happy About It
nytimes.com · 2025

In February, Ella Stapleton, then a senior at Northeastern University, was reviewing lecture notes from her organizational behavior class when she noticed something odd. Was that a query to ChatGPT from her professor?

Halfway through the do…

Variants

A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.
