AI Incident Database

Report 1209

Associated Incidents

Incident 67 · 23 Reports
Sleeping Driver on Tesla AutoPilot

Drunk Tesla Driver Tells Cops Autopilot Was in Charge
fortune.com · 2018

The California Highway Patrol (CHP) says a driver was found passed out in his Tesla with a very high blood alcohol content on San Francisco’s Bay Bridge on Friday. The driver, according to CHP, claimed the car had been “set on autopilot” in an apparent attempt to defend himself.

The highway patrol, seemingly unimpressed, arrested the unnamed driver on suspicion of driving under the influence and towed his car, noting on Twitter that “no it didn’t drive itself to the tow yard.”

When u pass out behind the wheel on the Bay Bridge with more than 2x legal alcohol BAC limit and are found by a CHP Motor. Driver explained Tesla had been set on autopilot. He was arrested and charged with suspicion of DUI. Car towed (no it didn’t drive itself to the tow yard). pic.twitter.com/4NSRlOBRBL — CHP San Francisco (@CHPSanFrancisco) January 19, 2018

Tesla did not confirm that the driver had actually engaged the autopilot system, though it has in the past used driver data in accident investigations. The autopilot system is designed to get a driver’s attention if it detects a challenging situation and to bring the car to a stop if the driver does not respond.


At the most abstract level, the incident invites us to ask questions about driver responsibility in the age of autonomous vehicles. In the future, will it be okay for us to get in our cars while inebriated, and let them take us home?

Maybe—but for now, that hardly matters. Tesla’s “autopilot” is not fully autonomous driving, though it can look like it for short stretches and under specific conditions. Tesla is clear that drivers using autopilot should remain alert and retain responsibility for their vehicle.

But Tesla drivers don’t always seem to get that message. Over-reliance on autopilot might have contributed to a 2016 fatality involving a distracted driver. The investigation following the crash concluded, in part, that Tesla didn’t have sufficient safeguards to ensure driver attention while using autopilot.

There’s still no confirmation that autopilot was in fact involved in Friday’s incident, and with no apparent accident or injuries, it is unlikely to prompt further official investigation into driver responsibility. The episode is nonetheless concerning in light of the ongoing rollout of Tesla’s Model 3, which offers autopilot as an option. It appears Tesla may still have some work to do in educating its customers about the limitations of autopilot, or in implementing further controls to prevent drivers from misusing it.

This article has been updated to reflect communication with Tesla.

