AI Incident Database

Report 2929

Associated Incidents

Incident 525 · 3 Reports
Tesla Vehicle Running on Self-Driving Mode Crashes on City Streets

Tesla's Autopilot mode is on trial in California
qz.com · 2023

The first trial over robotic technology's threat to human life is underway in a California court. The case concerns Tesla's Autopilot software, which allegedly caused an accident on a city road in 2019.

According to Reuters, the plaintiff in the case is Los Angeles resident Justine Hsu, who first sued Tesla in 2020 after her Model S, driving in the semi-autonomous mode, swerved into a barrier. She says in court filings that her airbag deployed with so much force that it "knocked out teeth, and caused nerve damage to her face" and broke her jaw. Hsu claims the airbag system, and the entire design of the Autopilot system Tesla launched in 2015, was flawed. She is seeking more than $3 million in damages.

The electric car maker has denied any wrongdoing. It defends itself in part by pointing out that Hsu activated Autopilot on a city street, despite the car’s user manual warning against that. Tesla maintains that its cars are not fully autonomous, and that drivers should be ready “to take over at any moment.”

Tesla attorney Michael Carey argues that Hsu had time to brake but still drove straight into the barrier. "The evidence proving distraction is pretty straightforward," he said.

Autopilot’s safety record

Tesla vehicles running on the Autopilot software were involved in 273 crashes in 2021, according to data from the National Highway Traffic Safety Administration. That means Tesla vehicles made up nearly 70 percent of the 392 crashes involving advanced driver-assistance systems during that year.

Tesla CEO Elon Musk has long promoted Tesla's "Full Self-Driving" (FSD) software, selling it as a $15,000 add-on to the company's vehicles. Automation is a major part of the company's plans for future revenue growth, so investors and shareholders are likely to monitor the outcome of the trial closely. The company's shares dropped by 8% when the incident was reported in 2019.

While previous Tesla Autopilot flaws have been linked to deaths around the world, none has ever gone to trial, making the outcome of this California case a critical point in how robotic car software will be designed in the future, and a major precedent for similar trials.


2024 - AI Incident Database
