Incident 525: Tesla Vehicle Running on Self-Driving Mode Crashes on City Streets

Description: A Tesla vehicle running in self-driving mode outside the operating conditions supported by the software crashed and injured the driver. Subsequently, the driver filed a lawsuit against Tesla and a jury found no damages were warranted.


New ReportNew ReportNew ResponseNew ResponseDiscoverDiscoverView HistoryView History
Alleged: Tesla developed and deployed an AI system which harmed Justine Hsu.

Incident Stats

Incident ID: 525
Report Count: 3
Incident Date: 2019
Editor: Sean McGregor

Incident Reports

Reports Timeline

Tesla's Autopilot mode is on trial in California · 2023

What appears to be the first trial over a robotic technology's alleged threat to human life is underway in a California court. The case concerns Tesla's Autopilot software, which the plaintiff alleges caused an accident on a city road in 2019.

According to Reuters, the plaintiff in the case…

Tesla wins bellwether trial over Autopilot car crash · 2023

LOS ANGELES, April 21 (Reuters) - A California state court jury on Friday handed Tesla Inc (TSLA.O) a sweeping win, finding the electric vehicle maker's Autopilot feature did not fail in what appeared to be the first trial related to a cras…

Exclusive: Tesla's Autopilot never claimed to be self-pilot, juror says · 2023

LOS ANGELES, April 21 (Reuters) - Jurors in what appears to be the first trial related to a crash involving Tesla's Autopilot feature told Reuters after the verdict on Friday that the electric-vehicle maker clearly warned that the partially…


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.