Incident 561: Lawsuit Alleges OpenAI Violated Users' Privacy Rights by Training AI on Private Info Without Informed Consent

Description: A lawsuit alleged that OpenAI's products, such as ChatGPT and DALL-E, used stolen private information from internet users without their informed consent or knowledge.


Alleged: OpenAI developed and deployed an AI system, which harmed internet users, children, and social media users.

Incident Stats

Incident ID
Report Count
Incident Date
Editor: Khoa Lam

Incident Reports

ChatGPT maker OpenAI faces a lawsuit over how it used people’s data · 2023

SAN FRANCISCO – A California-based law firm is launching a class-action lawsuit against OpenAI, alleging the artificial-intelligence company that created popular chatbot ChatGPT massively violated the copyrights and privacy of countless peo…

Class Action Complaint · 2023


On October 19, 2016, University of Cambridge Professor of Theoretical Physics Stephen Hawking predicted, “Success in creating AI could be the biggest event in the history of our civilization. But it could also be the last, unle…

OpenAI is being sued for training ChatGPT with 'stolen' personal data · 2023

A California law firm has filed a class-action lawsuit against OpenAI for allegedly "stealing" personal data to train ChatGPT.

Clarkson Law Firm, in a complaint filed in the Northern District of California on Wednesday, alleges that ChatGPT and Dall…


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting as evidence external to the Incident Database. Learn more from the research paper.