AI Incident Database

Report 212

Associated Incidents

Incident 321 · 14 Reports
Tesla Model X on Autopilot Crashed into California Highway Barrier, Killing Driver

Tesla in fatal California crash was on Autopilot
bbc.com · 2018

Electric carmaker Tesla says a vehicle involved in a fatal crash in California was in Autopilot mode, raising further questions about the safety of self-driving technology.

One of the company's Model X cars crashed into a roadside barrier and caught fire on 23 March.

Tesla says Autopilot was engaged at the time of the accident involving the driver, 38, who died soon afterwards.

But they did not say whether the system had detected the concrete barrier.

"The driver had received several visual and one audible hands-on warning earlier in the drive," a statement on the company's website said.

"The driver's hands were not detected on the wheel for six seconds prior to the collision."

"The driver had about five seconds and 150m (490ft) of unobstructed view of the concrete divider... but the vehicle logs show that no action was taken," the statement added.

Tesla's Autopilot system does some of the things a fully autonomous machine can do: it can brake, accelerate and steer by itself under certain conditions. It is, however, classified as a driver-assistance system and is not intended to operate independently, so the driver is expected to keep their hands on the wheel at all times.

In 2016, a Tesla driver was killed in Florida when his car failed to spot a lorry crossing its path.

It led the company to introduce new safety measures, including turning off Autopilot and bringing the car to a halt if the driver lets go of the wheel for too long.

The accident in California comes at a difficult time for self-driving technology.

Earlier this month, Uber was forbidden from resuming self-driving tests in the US state of Arizona.

It followed a fatal crash in the state in which an autonomous vehicle hit a woman who was walking her bike across the road.

It was thought to be the first time an autonomous car had been involved in a fatal collision with a pedestrian.

The company suspended all self-driving tests in North America after the accident.

