Incident 304: Tesla on FSD Reportedly Drove into the Wrong Lane in California

Description: A Tesla Model Y in Full Self-Driving (FSD) mode reportedly drove into the wrong lane after making a left turn, despite the driver's alleged attempt to take over control, resulting in a non-fatal collision with another vehicle in Brea, California.
Alleged: Tesla developed and deployed an AI system, which harmed an unnamed Tesla driver and Tesla drivers.

Suggested citation format

Pednekar, Sonali. (2021-11-03) Incident Number 304. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
304
Report Count
1
Incident Date
2021-11-03
Editors
Khoa Lam

Incident Reports

Tesla vehicle in ‘Full Self-Driving’ beta mode ‘severely damaged’ after crash in California

A Tesla Model Y in “Full Self-Driving” (FSD) beta mode allegedly crashed on November 3rd in Brea, a city southeast of Los Angeles, marking what is likely to be the first incident involving the company’s controversial driver assist feature. …

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting evidence external to the Incident Database. Learn more from the research paper.