Incident 574: AI-Generated Articles at G/O Media Allegedly Diminish Reputation of Human Staff

Description: G/O Media began publishing AI-generated articles, against staff advice, that contained errors and quality issues. The first such article, a list of Star Wars movies, failed to maintain chronological order, causing internal concerns over journalistic credibility and ethics. Staff expressed that the AI was "actively hurting our reputations and credibility" and accused management of "wasting everyone's time."


Alleged: OpenAI and Google developed an AI system deployed by G/O Media, which harmed Gizmodo journalists.

Incident Stats

Incident ID: 574
Editor: Daniel Atherton

Incident Reports

Gizmodo’s staff isn’t happy about G/O Media’s AI-generated content · 2023

G/O Media, which owns the popular tech site Gizmodo along with a slew of other outlets, began publishing AI-generated articles last week, despite strong objections from many of its staff members, according to The Washington Post. The artic…


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity

