AI Incident Database

Report 3664

Associated Incidents

Incident 638 · 17 Reports
Fatal Crash Involving Tesla Full Self-Driving Claims Employee's Life

Musk Says 2022 Tesla Crash Driver Didn’t Have Full-Self Driving Tech
carscoops.com · 2024

Supporters of self-driving car technologies claim that autonomous features like Tesla's Full Self-Driving system make cars safer. However, regardless of how safe they are -- and opinions are split on the Tesla package -- someone was always going to be the first to die in an FSD-related crash.

Earlier this week, The Washington Post reported that the person was Tesla employee Hans von Ohain, who was killed in a fiery collision when his Model 3 left the road and burst into flames after smashing into a tree. However, Tesla CEO Elon Musk has taken to social media to dispute The Post's story, claiming that von Ohain's car wasn't equipped with FSD capability.

The Post said that the purchase order for von Ohain's EV showed it was equipped with features only available to buyers who purchased the FSD system. Additionally, friends and family of the driver said he used the car's autonomous capabilities wherever he went, proudly showing them off to passengers. But Musk insists von Ohain's car wasn't equipped with Tesla's top-line driver-assist package.

"He was not on FSD," Musk wrote on X. "The software had unfortunately never been downloaded. I say 'unfortunately', because the accident probably would not have happened if FSD had been engaged."

In a separate tweet, Tesla's policy boss, Rohan Patel, seconded Musk's comments about von Ohain's car, again insisting that the FSD beta software package wasn't downloaded to the Model 3 before it crashed in Evergreen, Colorado.

Von Ohain was found to have been over the drink-drive limit at the time of the crash, as was his passenger, who survived the accident. The pair had been drinking during the day at a golf course and the passenger told emergency responders that the driver had been using the "auto-drive feature on the Tesla" and that the Model 3 "just ran straight off the road," The Post reported.

