AI Incident Database

Report 1202

Associated Incidents

Incident 672 · 3 Reports
Sleeping Driver on Tesla AutoPilot

Who's driving? Autonomous cars may be entering the most dangerous phase
theguardian.com · 2018

Autopilot controls are not yet fully capable of functioning without human intervention – but they’re good enough to lull us into a false sense of security

When California police officers approached a Tesla stopped in the centre of a five-lane highway outside San Francisco last week, they found a man asleep at the wheel. The driver, who was arrested on suspicion of drunk driving, told them his car was in “autopilot”, Tesla’s semi-autonomous driver assist system.

In a separate incident this week, firefighters in Culver City reported that a Tesla rear-ended their parked fire truck as it attended an accident on the freeway. Again, the driver said the vehicle was in autopilot.

CHP San Francisco (@CHPSanFrancisco) When u pass out behind the wheel on the Bay Bridge with more than 2x legal alcohol BAC limit and are found by a CHP Motor. Driver explained Tesla had been set on autopilot. He was arrested and charged with suspicion of DUI. Car towed (no it didn’t drive itself to the tow yard). pic.twitter.com/4NSRlOBRBL

The oft-repeated promise of driverless technology is that it will make the roads safer by reducing human error, the primary cause of accidents. However, automakers have a long way to go before they can eliminate the driver altogether.

What’s left is a messy interim period when cars are being augmented incrementally with automated technologies such as obstacle detection and lane centering. In theory, these can reduce the risk of crashes, but they are not failsafe. As a Tesla spokeswoman put it: “Autopilot is intended for use only with a fully attentive driver.”

However, research has shown that drivers get lulled into a false sense of security to the point where their minds and gazes start to wander away from the road. People become distracted or preoccupied with their smartphones. So when the car encounters a situation where the human needs to intervene, the driver can be slow to react.

At a time when there is already a surge in collisions caused by drivers distracted by their smartphones, we could be entering a particularly dangerous period of growing pains with autonomous driving systems.

“People are already inclined to be distracted. We’re on our phones, eating burgers, driving with our knees,” said Nidhi Kalra, senior information scientist at the Rand Corporation. “Additional autonomy gives people a sense that something else is in control, and we have a tendency to overestimate the technology’s capabilities.”

Steven Shladover, of the University of California, Berkeley’s Path programme, was more sharply critical of car manufacturers: “These companies are overselling the capabilities of the systems they have and the public is being misled.”

Waymo, Google’s self-driving car spin-off, discovered the handoff problem when it was testing a “level 3” automated driving system – one that can drive itself under certain conditions, but in which the human still needs to take over if the situation becomes tricky. The next level, four, is what most people consider “fully autonomous”.

Most of the advanced driver assist features introduced by Tesla, Mercedes, BMW and Cadillac are categorised as level 2 automation.

During testing, Waymo recorded what its CEO, John Krafcik, described as “sort of scary” video footage of drivers texting, applying makeup and even sleeping behind the wheel while their cars hurtled down the freeway. This led Waymo to decide to leapfrog level 3 automation altogether, and focus on full autonomy instead.

“We found that human drivers over-trusted the technology and were not monitoring the roadway carefully enough to be able to safely take control when needed,” said the company in its 2017 safety report.

Ian Reagan from the Insurance Institute for Highway Safety (IIHS) shares Waymo’s caution, although he acknowledges that the safety potential for automated systems is “huge”.

“There are lots of potential unintended consequences, particularly with level 2 and 3 systems,” he said, explaining how the IIHS had bought and tested several cars with level 2 automation including vehicles from Tesla, Mercedes and BMW. “Even the best ones do things you don’t expect,” he said.

During tests the IIHS recorded a Mercedes having problems when the lane on the highway forked in two. “The radar system locked onto the right-hand exit lane when the driver was trying to go straight,” he said.

Tesla’s autopilot suffered from a different, repeatable glitch that caused it to veer into the guardrail when approaching the crest of a hill. “If the driver had been distracted, that definitely would have caused a crash,” he said.

Concern over this new type of distracted driving is forcing automakers to introduce additional safety features to compensate. For example, GM has introduced eye-tracking technology to monitor whether the driver is watching the road.