AI Incident Database

Report 3669

Associated Incidents

Incident 638 · 17 Reports
Fatal Crash Involving Tesla Full Self-Driving Claims Employee's Life

Tesla employee with full self-driving enabled killed in car crash
newsbytesapp.com · 2024

What's the story

Back in 2022, Tesla employee and Elon Musk fan Hans von Ohain tragically died when his Model 3 crashed and caught fire. Now, Erik Rossiter, a survivor of the accident, has claimed that the Full Self-Driving (FSD) feature was active during the crash. If true, this could be the first death involving FSD, a feature that has already caught regulators' attention.

Investigation

FSD feature under scrutiny

The Washington Post has confirmed that von Ohain's car had FSD, which he received for free as an employee perk. His widow, Nora Bass, said he used it often. However, Tesla's vehicles are not fully autonomous yet, and drivers must be prepared to take control. The National Highway Traffic Safety Administration is already investigating Tesla's Autopilot after several accidents involving emergency response vehicles.

Fault

Autopilot fatalities and misleading marketing

Last year, the Washington Post found that fatal crashes involving Tesla's Autopilot mode have increased since 2019, with at least 17 of more than 700 crashes being deadly. Von Ohain's autopsy showed a blood alcohol level of 0.26, more than three times the legal limit. Experts argue that Tesla's misleading marketing might give drivers a false sense of security, even without alcohol involved.

Problems

Ethical questions and Tesla's responsibility

Von Ohain's death raises questions about responsibility. Is Tesla's misleading marketing at fault, or was it the driver's reckless behavior? Bass told the Washington Post, "Regardless of how drunk Hans was, Musk has claimed that this car can drive itself and is essentially better than a human. We were sold a false sense of security." Tesla has not publicly acknowledged von Ohain's death, and the FSD feature is still in development, far from achieving full autonomy.

Future

FSD is Tesla's trump card

Tesla's future depends on FSD's success. In 2022, Musk claimed that FSD is "the difference between Tesla being worth a lot of money and being worth basically zero." He claims that the firm will achieve Level 5 autonomy in less than a year, at which point the car will not require a steering wheel or brake pedal. However, the feature has yet to surpass Level 2 autonomy and requires the driver to be ready to take the wheel at any time.

