AI Incident Database

Report 3664

Associated Incidents

Incident 638 · 17 Reports
Fatal Crash Involving Tesla Full Self-Driving Claims Employee's Life

Musk Says 2022 Tesla Crash Driver Didn’t Have Full-Self Driving Tech
carscoops.com · 2024

Supporters of self-driving car technologies claim that autonomous features like Tesla's Full Self-Driving system make cars safer. However, regardless of how safe they are -- and opinions are split on the Tesla package -- someone was always going to be the first to die in an FSD-related accident.

Earlier this week, the Washington Post reported that the person was Tesla employee Hans von Ohain, who was killed in a fiery collision when his Model 3 left the road and burst into flames after smashing into a tree. However, Tesla CEO Elon Musk has taken to social media to dispute The Post's story, claiming that von Ohain's car wasn't equipped with FSD capability.

The Post said that the purchase order for von Ohain's EV showed it was equipped with features only available to buyers who purchased the FSD system. Additionally, friends and family of the driver said he used the car's autonomous capabilities wherever he went, proudly showing them off to passengers. But Musk insists von Ohain's car wasn't equipped with Tesla's top-line driver-assist package.

"He was not on FSD," Musk wrote on X. "The software had unfortunately never been downloaded. I say 'unfortunately', because the accident probably would not have happened if FSD had been engaged."

In a separate tweet, Tesla's policy boss, Rohan Patel, seconded Musk's comments about von Ohain's car, again insisting that the FSD beta software package wasn't downloaded to the Model 3 before it crashed in Evergreen, Colorado.

Von Ohain was found to have been over the drink-drive limit at the time of the crash, as was his passenger, who survived the accident. The pair had been drinking during the day at a golf course and the passenger told emergency responders that the driver had been using the "auto-drive feature on the Tesla" and that the Model 3 "just ran straight off the road," The Post reported.

