Incident 67: Sleeping Driver on Tesla AutoPilot

Description: A Tesla Model S remained on Autopilot while being operated by a drunk, sleeping driver whose hands were not on the wheel. The police had to slow the car down by braking gradually in front of the vehicle, using its 'driver assist' feature to bring it to a stop.
Alleged: Tesla developed an AI system deployed by Tesla and motorists, which harmed motorists.

Suggested citation format

Olsson, Catherine. (2018-12-01) Incident Number 67. in McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
67
Report Count
24
Incident Date
2018-12-01
Editors
Sean McGregor

CSET Taxonomy Classifications

Taxonomy Details

Full Description

A Tesla Model S continued on Autopilot at 70 mph on a California highway in November 2018 even though the driver's hands were not on the wheel, a requirement for using the Autopilot system. The California Highway Patrol was unable to wake the driver and had to drive in front of the Tesla for approximately seven minutes, relying on its 'driver assist' feature to slow the vehicle to a stop. The driver was allegedly asleep, with a blood alcohol content twice the legal limit.

Short Description

A Tesla Model S remained on Autopilot while being operated by a drunk, sleeping driver whose hands were not on the wheel. The police had to slow the car down by braking gradually in front of the vehicle, using its 'driver assist' feature to bring it to a stop.

Severity

Negligible

Harm Type

Harm to physical health/safety

AI System Description

Tesla Autopilot is an advanced driver assistance system intended to enhance safety and convenience behind the wheel. It uses 8 external cameras, a radar, 12 ultrasonic sensors, and a powerful onboard computer to sense the vehicle's surroundings and control its functions. Autopilot comprises two main features: Traffic-Aware Cruise Control, which matches the car's speed to that of surrounding traffic, and Autosteer, which assists with steering within a clearly marked lane while Traffic-Aware Cruise Control is active.
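
To make the Traffic-Aware Cruise Control behaviour described above concrete, here is a minimal illustrative sketch of a time-gap following controller in Python. It is not Tesla's implementation; the function name, gains and thresholds are assumptions chosen only for illustration.

```python
# Illustrative sketch only; not Tesla's implementation. It shows the general
# idea behind a traffic-aware cruise controller: hold the driver's set speed
# unless a slower lead vehicle forces the car to keep a safe time gap instead.

def tacc_target_speed(set_speed, lead_speed=None, gap=None,
                      time_gap=2.0, min_gap=5.0, k=0.5):
    """Return the speed (m/s) the controller should aim for on this cycle.

    set_speed:  driver-selected cruise speed
    lead_speed: speed of the vehicle ahead (None if the lane is clear)
    gap:        distance to the vehicle ahead in metres (None if clear)
    time_gap:   desired following distance expressed in seconds
    min_gap:    hard minimum distance to keep, in metres
    k:          proportional gain on the gap error
    """
    if lead_speed is None or gap is None:
        return set_speed                       # clear road: cruise at set speed

    desired_gap = max(min_gap, lead_speed * time_gap)
    gap_error = gap - desired_gap              # positive when we are too far back
    follow_speed = lead_speed + k * gap_error  # close or open the gap smoothly
    return max(0.0, min(set_speed, follow_speed))
```

The property that matters for this incident is the last line: if the vehicle ahead slows to a stop, the returned target speed falls toward zero as the gap closes, which is exactly the behaviour the highway patrol relied on to bring the car to a halt.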

System Developer

Tesla

Sector of Deployment

Transportation and storage

Relevant AI functions

Perception, Cognition, Action

AI Techniques

Tesla Autopilot

AI Applications

autonomous driving, environmental sensing

Location

Palo Alto, CA

Named Entities

Tesla, Model S, California, California Highway Patrol

Technology Purveyor

Tesla

Beginning Date

2018-11-30T08:00:00.000Z

Ending Date

2018-11-30T08:00:00.000Z

Near Miss

Near miss

Intent

Accident

Lives Lost

No

Data Inputs

360 Ultrasonic Sonar, Image Recognition Camera, Long Range Radar, traffic patterns

Incident Reports

The California Highway Patrol says a man found passed out Friday behind the wheel of his Tesla on the Bay Bridge told them his car was on autopilot.

When California Highway Patrol officers found a man passed out Friday behind the wheel of his vehicle on the Bay Bridge, he had an explanation.

His Tesla was on autopilot, the man said, according to a CHP post on Twitter. Nonetheless, officers arrested him on suspicion of drunken driving after finding his blood alcohol content was twice the legal level of .08.

The Tesla “didn’t drive itself to the tow yard,” joked the CHP post. Officers posted a night-time photo of the arrest to Twitter at 10:23 a.m. Friday, but didn’t say what time officers encountered the man.

When u pass out behind the wheel on the Bay Bridge with more than 2x legal alcohol BAC limit and are found by a CHP Motor. Driver explained Tesla had been set on autopilot. He was arrested and charged with suspicion of DUI. Car towed (no it didn’t drive itself to the tow yard). pic.twitter.com/4NSRlOBRBL — CHP San Francisco (@CHPSanFrancisco) January 19, 2018

Tesla models do offer limited self-driving capabilities on highways and parking lots – including cameras, radar and ultrasonic sensors to guide the car past other vehicles or obstacles, reports Business Insider. The cars can control their own speed, change lanes and take off-ramps, plus park themselves. The autopilot requires the driver to keep his or her hands on the wheel or else pulls over and stops.

Fully autonomous self-driving systems await regulatory changes and additional software development, the site reports.

A writer who test-drove a Tesla with autopilot enabled found the experience both exciting and “terrifying,” reported Business Insider. “Autopilot is slightly terrifying when you first activate it, just from the fact that it goes against every instinct and muscle memory you’ve been taught about driving,” wrote Tony Yoo.

Drunken driving suspect tells cops his Tesla was on autopilot

Late last week, a Tesla driver fell asleep in his car around 5:30 p.m. on San Francisco’s Bay Bridge. He was spotted by motorists, and when police arrived on the scene (and, probably, woke him up) he tried to get out of trouble by claiming that his Tesla was on Autopilot, the vehicle’s proprietary, but still in-development, autonomous driving technology. Unsurprisingly, the excuse didn’t work.

He was found to have twice the legal blood alcohol level and was arrested for a suspected DUI. In the subsequent tweet, the San Francisco division of the California Highway Patrol was not amused, though it did throw him some serious shade: “Car towed (no it didn’t drive itself to the tow yard).”

When u pass out behind the wheel on the Bay Bridge with more than 2x legal alcohol BAC limit and are found by a CHP Motor. Driver explained Tesla had been set on autopilot. He was arrested and charged with suspicion of DUI. Car towed (no it didn’t drive itself to the tow yard). pic.twitter.com/4NSRlOBRBL — CHP San Francisco (@CHPSanFrancisco) January 19, 2018

Australia’s National Transport Commission brought up the question of whether or not automated vehicles should change drunk driving laws last year, as Inverse previously reported. In its initial report, it suggested that laws against driving under the influence should no longer apply. However, the Australian government has yet to change any laws as a result.

Meanwhile, in the United States, while lawyers have made some attempt to sift through the laws on the books to figure out whether someone could get a DUI in a self-driving car, there’s no question about where the law currently stands: You can 100 percent be charged with drinking under the influence while in an autonomous vehicle.

Besides, one point worth keeping in mind here is that if an autonomous car is operating correctly, there really shouldn’t be any reason for the police to pull it over on suspicion of driving under the influence. A self-driving car isn’t going to swerve between lanes just because its occupant is drunk. It’s certainly possible the car’s software could malfunction, but it’s far more likely that the human is going to be blamed for any reckless driving the police might see, especially if they are found to be intoxicated.

“Hopefully a self-driving car would be operating safely enough not to draw the attention of law enforcement,” Christopher Coble, an attorney, wrote in a blog post, “but if you’re pulled over while drunk in an autonomous vehicle, it’s probably going to be on you.”

And if your car is stopped in the middle of a busy bridge at rush hour and you’re sitting in it passed out? It’s definitely going to be on you.

Drunk Driver in Tesla Tells Police Officer the Car Was on Autopilot

The California Highway Patrol (CHP) says a driver was found passed out in his Tesla with a very high blood alcohol content on San Francisco’s Bay Bridge on Friday. The driver, according to CHP, claimed the car had been “set on autopilot” in an apparent attempt to defend himself.

The highway patrol, seemingly unimpressed, arrested the unnamed driver, charged him with suspicion of driving under the influence, and towed his car, noting on Twitter that “no it didn’t drive itself to the tow yard.”

When u pass out behind the wheel on the Bay Bridge with more than 2x legal alcohol BAC limit and are found by a CHP Motor. Driver explained Tesla had been set on autopilot. He was arrested and charged with suspicion of DUI. Car towed (no it didn’t drive itself to the tow yard). pic.twitter.com/4NSRlOBRBL — CHP San Francisco (@CHPSanFrancisco) January 19, 2018

Tesla did not confirm that the driver had actually engaged the autopilot system, though it has in the past used driver data in accident investigations. The autopilot system is designed to get a driver’s attention if it detects a challenging situation and brings the car to a stop if a driver does not respond.

At the most abstract level, the incident invites us to ask questions about driver responsibility in the age of autonomous vehicles. In the future, will it be okay for us to get in our cars while inebriated, and let them take us home?

Maybe—but for now, that hardly matters. Tesla’s “autopilot” is not fully autonomous driving, though it can look like it for short stretches and under specific conditions. Tesla is clear that drivers using autopilot should remain alert and retain responsibility for their vehicle.

But Tesla drivers don’t always seem to get that message. Over-reliance on autopilot might have contributed to a 2016 fatality involving a distracted driver. The investigation following the crash concluded, in part, that Tesla didn’t have sufficient safeguards to ensure driver attention while using autopilot.

There’s still no confirmation that autopilot was in fact involved in the Friday incident, and with no apparent accident or injuries resulting, it’s unlikely it will lead to further official investigations into driver responsibility. However, it should be concerning in light of the ongoing rollout of Tesla’s Model 3, which features optional autopilot features. It appears Tesla may still have some work to do in educating its customers about the limitations of autopilot, or implementing further controls to prevent drivers from misusing it.

This article has been updated to reflect communication with Tesla.

Drunk Tesla Driver Tells Cops Autopilot Was in Charge

Autopilot controls are not yet fully capable of functioning without human intervention – but they’re good enough to lull us into a false sense of security

When California police officers approached a Tesla stopped in the centre of a five-lane highway outside San Francisco last week, they found a man asleep at the wheel. The driver, who was arrested on suspicion of drunk driving, told them his car was in “autopilot”, Tesla’s semi-autonomous driver assist system.

In a separate incident this week, firefighters in Culver City reported that a Tesla rear-ended their parked fire truck as it attended an accident on the freeway. Again, the driver said the vehicle was in autopilot.

CHP San Francisco (@CHPSanFrancisco) When u pass out behind the wheel on the Bay Bridge with more than 2x legal alcohol BAC limit and are found by a CHP Motor. Driver explained Tesla had been set on autopilot. He was arrested and charged with suspicion of DUI. Car towed (no it didn’t drive itself to the tow yard). pic.twitter.com/4NSRlOBRBL

The oft-repeated promise of driverless technology is that it will make the roads safer by reducing human error, the primary cause of accidents. However, automakers have a long way to go before they can eliminate the driver altogether.

What’s left is a messy interim period when cars are being augmented incrementally with automated technologies such as obstacle detection and lane centering. In theory, these can reduce the risk of crashes, but they are not failsafe. As a Tesla spokeswoman put it: “Autopilot is intended for use only with a fully attentive driver.”

However, research has shown that drivers get lulled into a false sense of security to the point where their minds and gazes start to wander away from the road. People become distracted or preoccupied with their smartphones. So when the car encounters a situation where the human needs to intervene, the driver can be slow to react.

At a time when there is already a surge in collisions caused by drivers distracted by their smartphones, we could be entering a particularly dangerous period of growing pains with autonomous driving systems.

“People are already inclined to be distracted. We’re on our phones, eating burgers, driving with our knees,” said Nidhi Kalra, senior information scientist at the Rand Corporation. “Additional autonomy gives people a sense that something else is in control, and we have a tendency to overestimate the technology’s capabilities.”

Steven Shladover, of the University of California, Berkeley’s Path programme, was more sharply critical of car manufacturers: “These companies are overselling the capabilities of the systems they have and the public is being misled.”

Waymo, Google’s self-driving car spin-off, discovered the handoff problem when it was testing a “level 3” automated driving system – one that can drive itself under certain conditions, but in which the human still needs to take over if the situation becomes tricky. The next level, four, is what most people consider “fully autonomous”.

Most of the advanced driver assist features introduced by Tesla, Mercedes, BMW and Cadillac are categorised as level 2 automation.

During testing, Waymo recorded what its CEO, John Krafcik, described as “sort of scary” video footage of drivers texting, applying makeup and even sleeping behind the wheel while their cars hurtled down the freeway. This led Waymo to decide to leapfrog level 3 automation altogether, and focus on full autonomy instead.

“We found that human drivers over-trusted the technology and were not monitoring the roadway carefully enough to be able to safely take control when needed,” said the company in its 2017 safety report.

Ian Reagan from the Insurance Institute for Highway Safety (IIHS) shares Waymo’s caution, although he acknowledges that the safety potential for automated systems is “huge”.

“There are lots of potential unintended consequences, particularly with level 2 and 3 systems,” he said, explaining how the IIHS had bought and tested several cars with level 2 automation including vehicles from Tesla, Mercedes and BMW. “Even the best ones do things you don’t expect,” he said.

During tests the IIHS recorded a Mercedes having problems when the lane on the highway forked in two. “The radar system locked onto the right-hand exit lane when the driver was trying to go straight,” he said.

Tesla’s autopilot suffered from a different, repeatable glitch that caused it to veer into the guardrail when approaching the crest of a hill. “If the driver had been distracted, that definitely would have caused a crash,” he said.

Concern over this new type of distracted driving is forcing automakers to introduce additional safety features to compensate. For example, GM has introduced eye-tracking technology in its Cadillac Super Cruise system to check that the driver is watching the road.

Who's driving? Autonomous cars may be entering the most dangerous phase

THE OWNER of a Tesla Model S who was caught on camera using his car’s self-driving Autopilot function while sitting in the passenger seat, as the car travelled at speed on the M1 motorway, says he is “the unlucky one who got caught”.

The admission, from Bhavesh Patel, who was banned from driving for 18 months, suggests he believes other owners of Tesla cars have pulled the dangerous stunt.

Tesla’s Autopilot system does not give cars fully autonomous, self-driving capabilities. When activated, it can accelerate, brake and steer the vehicle, under certain conditions, but in the UK the driver is required by law to remain in control of the vehicle at all times and must keep their hands on the steering wheel.

Patel, from Nottingham, pleaded guilty to dangerous driving at St Albans Crown Court and admitted he had been ‘silly’.

A passenger of a passing vehicle had spotted the Tesla driving in traffic with nobody at the steering wheel. Patel was said to have been sitting in the passenger seat, with his hands behind his head.

They filmed the incident, which was posted to social media and soon went viral. Hertfordshire Constabulary spotted the offence and issued a Notice of Intended Prosecution.

The dangerous stunt also saw Patel hit with 100 hours of unpaid work, 10 days of rehabilitation and £1,800 in costs to the Crown Prosecution Service.

PC Kirk Caldicutt, the investigating officer, said Patel’s behaviour could have “ended in tragedy”.

He said: “He not only endangered his own life but the lives of other innocent people using the motorway on that day.

“This case should serve as an example to all drivers who have access to Autopilot controls and have thought about attempting something similar.

“I want to stress that they are in no way a substitute for a competent motorist in the driving seat.”

Other Tesla owners go hands-free

Driving.co.uk found other examples of such stunts: several drivers have filmed themselves using Tesla’s Autopilot system while sitting in the passenger seat, or otherwise not in control of the vehicle.

In one video, an owner films the car in self-driving mode as they sit in the back seat. In another, a driver turns on Autopilot, then moves to the passenger seat, and the car parks itself despite the fact nobody is ready to take control in the event of danger. And in a third film, a couple can be seen playing cards and pretending to sleep, paying little attention to their surroundings.

In America, there have been several fatal accidents involving Tesla cars that were being operated in Autopilot mode.

In late March, 38-year old Walter Huang was killed when his Model X, operating in Autopilot mode, left the Californian freeway lane it was travelling in and struck a crash barrier and concrete dividing wall.

And in 2016, in Florida, 40-year old Joshua Brown was killed after his Model S’ Autopilot system failed to recognise a truck which pulled across his path.

After Brown’s crash, the US National Transportation Safety Board said Tesla’s Autopilot was partly to blame, and added that Tesla “lacked understanding” of the system’s limitations.

As long ago as 2015, The Sunday Times reported that, during its tests, a Tesla Model S tried to pull into the path of a faster car approaching from its rear when performing a lane change.

“Be prepared to take corrective action at all times”

When asked for comment, a Tesla spokesperson pointed to a statement from a Tesla engineer, obtained by officers investigating the Patel case, that described Autopilot as a “suite of driver assistance features”.

The engineer stated that the system involves “hands-on” features intended to provide assistance to a “fully-attentive driver”.

They stated that Traffic-Aware Cruise Control (TACC) assists with acceleration and deceleration of the vehicle while Autosteer provides assistance with steering of the vehicle.

Further literature provided by Tesla explains that drivers should “never depend on TACC to adequately slow down model S, always watch the road in front of you and be prepared to take corrective action at all times. Failure to do so can result in serious injury or death”.

Tesla owner filmed in passenger seat using Autopilot says he was "unlucky one who got caught"

A Northern California man was arrested Friday after he was caught sleeping and drunk behind the wheel of his Tesla, which was zooming down the highway, police said.

California Highway Patrol officers said they noticed 45-year-old Alexander Samek of Los Altos napping in the driver’s seat around 3:30 a.m. when officers drove next to him on Highway 101 heading south, KCBS reports. His grey Tesla Model S was going 70 miles per hour in a 65 zone.

Officers said nothing they did would rouse the man — not flashing police lights or blaring sirens, according to KCBS. Finally an officer had to make bumper-to-bumper contact to stop him.

“One of the officers basically ended up going in front of the vehicle and basically tried to slow it down,” California Highway Patrol spokesman Art Montiel told the radio station.

The Tesla pulled over on the roadway and Samek was taken to a Palo Alto gas station, ABC 7 reports. Samek was arrested on charges of driving under the influence after he failed a field sobriety test, the San Francisco Chronicle reports.

The incident ended roughly seven miles from the spot where officers initially realized the Tesla driver was fast asleep, according to NBC Bay Area.

Officers said they suspect the autopilot feature of the car was being used, the Chronicle reports. But Tesla warns that the technology isn’t the same as a truly self-driving car, telling drivers to hold on to the steering wheel even when the autopilot function is engaged.

“It’s great that we have this technology,” Montiel said, according to the Mountain View Voice. “However, we need to remind people that ... even though this technology is available, they need to make sure they know they are responsible for maintaining control of the vehicle.”

Samek serves on the Los Altos Planning Commission, NBC reported.

Tesla’s autopilot feature has gotten other Californians in trouble, too.

Earlier this year, a man was arrested on the Bay Bridge, which connects San Francisco and Oakland, after highway patrol officers said the driver was drunk and asleep behind the wheel.

The driver “explained Tesla had been set on autopilot,” according to police.

The man was charged with driving under the influence.

Tesla on autopilot drove drunk sleeping man, CA cops say

The California Highway Patrol on Friday pulled over a Tesla Model S that was traveling down the road—but whose driver appeared to be asleep at the wheel. The vehicle was traveling southbound on Highway 101 in Palo Alto.

Officers said that they were unable to get the man's attention.

"One of the officers basically ended up going in front of the vehicle and basically tried to slow it down," a California Highway Patrol spokesman told KCBS radio. The process took about seven minutes, and the car traveled for about seven miles before coming to a stop.

The driver was Alexander Samek, who serves on the Los Altos Planning Commission. He was arrested for driving under the influence.

So how was the vehicle able to travel for more than seven minutes with an apparently sleeping driver? The obvious theory is that the Model S had its Autopilot system turned on, but officials said on Friday that they hadn't confirmed that yet. It's quite possible that Autopilot saved Samek's life.

The situation is a bit of a puzzle because Autopilot is supposed to detect if a driver's hands are on the wheel and disengage if they're not. Tesla has steadily tightened up these rules, with recent revisions of the software warning drivers in as little as 30 seconds. So if the driver did fall asleep at the wheel the car should have started slowing down on its own within a few minutes.

In a similar case back in January, police encountered a man asleep behind the wheel of a Tesla car on the San Francisco–Oakland Bay Bridge. When police woke him up, he insisted that everything was fine because his vehicle was "on autopilot." Unfortunately for him, there's no autopilot exception to drunk-driving laws.

It took seven miles to pull over a Tesla with a seemingly asleep driver

"When a pair of California Highway Patrol officers pulled alongside a car cruising down Highway 101 in Redwood City before dawn Friday, they reported a shocking sight: a man fast asleep behind the wheel," reports the San Francisco Chronicle. Tesla declined to comment on the incident, but John Simpson, privacy/technology project director for Consumer Watchdog, calls this proof that Tesla has wrongly convinced drivers their cars' "autopilot" function really could perform fully autonomous driving... "They've really unconscionably led people to believe, I think, that the car is far more capable of self-driving than actually is the case. That's a huge problem."

A Sleeping Driver's Tesla Led Police On A 7-Minute Chase

A Tesla Model S driver was pulled over and arrested by the California Highway Patrol yesterday after the police officers saw him seemingly sleeping at the wheel.

It took about 7 minutes and 7 miles for the police to be able to pull the car over, which was allegedly on Autopilot. The driver was arrested for drunk driving.

The driver was Alexander Samek, the chair of the Los Altos Planning Commission, according to Palo Alto Online.

They reported:

“At approximately 3:37 a.m., a California Highway Patrol officer was driving south on Highway 101 near Whipple Avenue in Redwood City and noticed a gray Tesla driving at 70 miles per hour, above the speed limit, according to Montiel. The officer pulled up next to the car and noticed that Samek “appeared to be asleep at the wheel,” he said. The officer pulled behind the Tesla and attempted to pull Samek over, using the patrol car’s lights and sirens, but Samek was “unresponsive,” Montiel said.”

Police officers followed the car for about 7 minutes while it drove 7 more miles before it came to a stop by itself on the side of the road.

The officers said that it took a while for them to wake up Samek. Once they did, they drove him to a nearby gas station where he failed a sobriety test.

California Highway Patrol Public Information Officer Art Montiel added:

“It’s great that we have this technology; however, we need to remind people that … even though this technology is available, they need to make sure they know they are responsible for maintaining control of the vehicle,”

When drivers activate Autopilot features, Tesla reminds them that they are responsible for the vehicle and need to be ready to take control at all times.

The system sends out alerts to hold the steering wheel every few seconds if it doesn’t detect torque being applied to the wheel.

If the driver fails to respond to an alert, the car will eventually slow down and come to a stop on the side of the road if the environment allows it.
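
As a rough sketch of the escalation behaviour described in the two paragraphs above, the function below maps the time since the steering wheel last registered driver torque to an action. The state names and timing thresholds are placeholders, not Tesla's actual values.

```python
# Hypothetical escalation policy; the thresholds are made up for illustration.

def autopilot_nag_action(seconds_without_torque,
                         visual_after=15, audible_after=30, stop_after=60):
    """Map time (s) since driver torque was last detected to a system action."""
    if seconds_without_torque < visual_after:
        return "normal"                      # hands-on detected recently
    if seconds_without_torque < audible_after:
        return "visual_warning"              # flash 'hold the steering wheel'
    if seconds_without_torque < stop_after:
        return "audible_warning"             # chime repeatedly
    return "slow_to_stop_with_hazards"       # driver unresponsive
```

Any detected steering torque resets the clock, so anything that keeps a steady force on the wheel keeps a policy like this in its "normal" state indefinitely, which is one way a car could keep driving with an unresponsive driver.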

It’s unclear how Samek avoided that for so long if he was asleep at the wheel. We contacted Tesla about the issue, but a spokesperson said that they don’t know how.

The CHP is investigating the use of Autopilot in relation to the incident.

Electrek’s Take

First of all, great job by the police here. They managed to spot him sleeping and it looks like they even put a patrol car in front of the Tesla in order to slow it down.

Also, Autopilot likely made the situation safer here: had the driver been driving the car himself, he would most likely have fallen asleep anyway, and that could have resulted in an accident.

That said, this is obviously not the intended use of Autopilot in its current form, and you should never get behind the wheel after drinking.

What I find most interesting in this case is how the Model S on Autopilot kept driving for miles seemingly without input from the driver to cancel the “Autopilot nag.”

Was he putting pressure on the wheel while he was asleep? Did he use one of those Tesla Autopilot ‘buddy’ hacks to avoid ‘nag’?

Tesla on Autopilot drove 7 miles with driver drunk asleep, police say

Police in the Northern California town of Redwood City arrested a man they found sleeping behind the wheel of his Tesla Model S as it drove down a highway early Friday morning.

The electric luxury sedan had been travelling south on Highway 101, going about 70 mph, California Highway Patrol Officer Art Montiel told Business Insider.

Montiel said officers believed the Tesla was operating on Autopilot because the driver, Alexander Samek, did not respond to their lights and sirens when they tried to pull the vehicle over.

Police stopped traffic behind the Tesla while another officer travelling in front of the car gradually slowed down, forcing the semi-autonomous sedan, which can respond to varying traffic speeds and accelerate or slow down accordingly, to a complete stop.

Police in the Northern California town of Redwood City arrested a man who was travelling on Highway 101 early Friday morning while sleeping behind the wheel of his Tesla Model S.

Officers first spotted the electric luxury sedan driving south at about 70 mph around 3:40 a.m., California Highway Patrol Officer Art Montiel told Business Insider on Friday night.

Montiel said the officers took action when it became clear that the driver, 45-year-old Alexander Samek, was sleeping.

“The driver wasn’t responding to lights and sirens,” Montiel said.

The officers believed the Tesla may have been operating on Autopilot, a semi-autonomous-driving feature that allows Teslas to drive and change lanes in traffic with minimal human input.

In order to get the sleeping driver’s Tesla to stop, Montiel said officers blocked traffic behind the vehicle while another officer travelling in front of the car gradually slowed down, forcing the Tesla, which can respond to varying traffic speeds and accelerate or slow down accordingly, to a complete stop.

“Once the vehicle came to a stop, the officers got out of their patrol cars, approached the Tesla, and knocked on the windows to wake up the driver,” Montiel said.

Officers placed Samek in a patrol car, while another officer drove the intoxicated man’s Tesla off the freeway and parked it at a nearby gas station.

Samek was arrested on suspicion of driving under the influence. Montiel applauded the CHP’s “quick thinking” to get the Tesla and its driver out of harm’s way.

Several Teslas have crashed while operating on Autopilot in recent months. A man was killed when his Model X SUV slammed into a highway barrier in Mountain View, California, in March.

Teslas equipped with Autopilot cannot drive themselves. The system deploys an escalating series of warnings if it detects that the driver does not have their hands on the steering wheel. If the driver does not respond, the system deactivates itself.

Tesla declined to comment on the incident.

Police in the San Francisco Bay Area took an unusual approach to stop a Tesla operating on Autopilot as a drunk driver slept behind the wheel

It took the California Highway Patrol seven minutes to pull over a Tesla driver. The driver appeared to be asleep at the wheel.

Alexander Samek had allegedly dozed off while operating his Tesla Model S when an officer tried to pull him over. In the end, the police had to surround the car to slow it down and then arrested Samek on suspicion of driving under the influence.

The Model S is thought to have been in Autopilot, Tesla's semi-autonomous driving mode. This is what would have enabled a sleeping Samek to keep driving for the seven minutes it took Highway Patrol to stop him. It's not currently clear if Autopilot was active, however.

Tesla steering wheels are fitted with sensors that can detect when a driver's hands let go of the wheel. A summer update to the Autopilot software made it so built-in alarms go off as frequently as every 15 or 20 seconds if the sensors can no longer detect pressure on the steering wheel. The cars are also programmed to come to a gradual stop if too many Autopilot warnings go unheeded.

It's not clear how Samek could have been asleep for the whole seven minutes it took to pull him over, given Tesla's built-in safeguards.

CORRECTION: Dec. 3, 2018, 9:10 a.m. PST An earlier version of this article incorrectly referred to Tesla Autopilot as a self-driving feature. We regret the error.

Cops struggle to pull over allegedly drunk, sleeping Tesla driver

When a pair of California Highway Patrol officers pulled alongside a car cruising down Highway 101 in Redwood City before dawn Friday, they reported a shocking sight: a man fast asleep behind the wheel.

The car was a Tesla, the man was a Los Altos planning commissioner, and the ensuing freeway stop turned into a complex, seven-minute operation in which the officers had to outsmart the vehicle’s autopilot system because the driver was unresponsive, according to the CHP.

The arrest of 45-year-old Alexander Samek on suspicion of drunken driving reignited questions about the uses, and potential abuses, of self-driving technology.

Reached by phone Friday afternoon, Samek, a real estate developer who runs the Kor Group, said, “I can’t talk right now,” before hanging up.

Officers observed Samek’s gray Tesla Model S around 3:30 a.m. as it sped south at 70 mph on Highway 101 near Whipple Avenue, said Art Montiel, a CHP spokesman. When officers pulled up next to the car, they allegedly saw Samek asleep, but the car was moving straight, leading them to believe it was in autopilot mode.

The officers slowed the car down after running a traffic break, with an officer behind Samek turning on emergency lights before driving across all lanes of the highway, in an S-shaped path, to slow traffic down behind the Tesla, Montiel said.

He said another officer drove a patrol car directly in front of Samek before gradually slowing down, prompting the Tesla to slow down as well and eventually come to a stop in the middle of the highway, north of the Embarcadero exit in Palo Alto — about 7 miles from where the stop was initiated.

Authorities said the entire operation took about seven minutes.

Officers then walked up to the Tesla and “attempted to wake up Samek by knocking on the window and giving verbal commands,” Montiel said. “After Samek woke up and got out of the Tesla, he was placed in the back of the patrol car and taken off the freeway.”

At a nearby gas station, Montiel said, Samek was given a field sobriety test before being arrested.

Tesla, whose autopilot technology assists in steering, changing lanes and parking, has seen its vehicles involved in several notable accidents in the last few years.

In 2016, a man was killed in Florida after the Model S he was driving collided with a tractor trailer while the Tesla was in autopilot mode. In January, a man was arrested on suspicion of driving under the influence on the Bay Bridge. He told officers he had been using his Tesla’s autopilot mode. Also this year, a man driving a Model X on Highway 101 in Mountain View was killed after his vehicle, which was in autopilot mode, struck a concrete divider.

The company declined to comment on Friday’s incident.

On its website, Tesla notes that its autopilot technology is not synonymous with self-driving cars or autonomous vehicles, and that drivers must keep their hands on the wheel at all times.

William Riggs, a professor of transportation, technology and engineering at the University of San Francisco, said most cars have some variation of autonomous features, and as the technology advances, people will feel more comfortable being distracted when driving.

“The software and technology is so good that humans are going to violate it,” Riggs said. “The future of autonomy really invites a lot of opportunities to disengage from the driving activity as we see more and more AI in control of vehicles.”

John Simpson, privacy and technology project director with Consumer Watchdog, saw Friday’s incident as further evidence of Tesla drivers being inappropriately led to believe the autopilot function is akin to fully autonomous driving.

In May, Simpson’s organization and the Center for Auto Safety asked the Federal Trade Commission to investigate “deceptive and unfair” practices in the advertising and marketing of the autopilot feature.

“They’ve really unconscionably led people to believe, I think, that the car is far more capable of self-driving than actually is the case. That’s a huge problem,” Simpson said. “In this case, it sounds as if it played out with some incredible work by the state police — the Highway Patrol — to get the car to stop. But it’s really amazing they were able to do that.”

Jim McPherson, a Bay Area attorney and industry analyst, said the autopilot feature is supposed to cut out if the driver doesn’t keep a hand on the wheel and exert “a minimal amount of torque.”

“But the length of the warning is variable. It could be a few seconds, it could be a minute or two,” McPherson said. “It’s really quite a miracle that this guy continued for 7 miles without autopilot cutting out.”

McPherson said he would expect the autopilot system to have alerted the driver to keep his hands on the wheel.

“Also, if he’s asleep, any more than a slight nudge of the wheel or touch of the pedal is going to cancel autopilot,” McPherson said. “So for it to get as far as it did, it’s a surprise to me.”

Riggs said the best time to use autopilot technology is d

CHP: Drunk driver slept while Tesla appeared to drive Hwy 101 on autopilot

There may come a day when cars can truly drive themselves and you can take a nap during your commute, but we’re not there yet. In the meantime, the ever-advancing state of self-driving technology has made some people a bit too comfortable turning over control to a system that can’t respond as effectively as a human driver. The California Highway Patrol recently pulled over a Tesla with some difficulty after police spotted the driver passed out behind the wheel.

The incident took place on Highway 101 when a patrol car spotted a Tesla Model S with a driver that appeared to be asleep. While the Tesla Autopilot system can handle some basic driving tasks, it’s not good enough that you can take a nap. The driver, 45-year-old Alexander Samek, was drunk and had passed out on his way home. You could argue that having the Autopilot system driving was probably safer than giving the heavily impaired human control of the car, but neither situation is legal.

Autonomous driving systems generally fall into one of five levels. Level one includes basic automation like lane assistance. At level two, a car can take over one or more tasks without the driver’s constant interaction — for example, many modern cars can steer and brake for several seconds on the highway without your hands on the wheel. Level three is where most research is currently focused. These vehicles use advanced sensors to scan the environment and drive for extended periods while responding to changing conditions. At levels four and five, cars can drive well enough that you don’t have to pay attention at all. Level four can handle most types of driving, and level five is full automation that you never need to think about.
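
For quick reference, the levels walked through above can be captured as a simple lookup; this is a paraphrase of the paragraph, not the formal SAE definitions.

```python
# Paraphrase of the automation levels described above.
AUTOMATION_LEVELS = {
    1: "Basic automation such as lane assistance.",
    2: "Car handles one or more tasks (steering, braking) for short stretches.",
    3: "Drives for extended periods, responding to changing conditions.",
    4: "Handles most types of driving without driver attention.",
    5: "Full automation; the occupant never needs to think about driving.",
}
```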

Tesla’s Autopilot system is somewhere between level two and three, so it’s certainly not good enough to take a nap behind the wheel. Patrol cars following Samek’s Tesla were unable to rouse him from his alcohol-fueled slumber, which made stopping the Tesla rather tricky. They eventually worked out a method to use the car’s own sensors to stop it. One patrol car stayed behind the Tesla, driving in a sweeping S-curve to keep other cars from getting in the way. Meanwhile, another cruiser maneuvered in front of the Model S and gradually slowed down. The Tesla’s radar saw a slower car in front, so it too slowed down. Eventually, it stopped in the middle of the highway.
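
A toy simulation, using assumed numbers rather than real vehicle dynamics, shows why the manoeuvre works: a follower that only ever matches the speed of the car ahead is dragged down to a stop as the patrol car in front brakes.

```python
# Toy simulation with made-up parameters; not real vehicle dynamics.
# The patrol car ahead brakes gently and the following car, which simply
# keeps a time gap behind it, is forced to slow until it stops as well.

def simulate_lead_vehicle_stop(set_speed=31.0, lead_speed=31.0, gap=40.0,
                               lead_decel=1.0, time_gap=2.0, dt=0.5):
    """Speeds in m/s (31 m/s is roughly 70 mph); returns seconds to a near-stop."""
    own_speed = set_speed
    t = 0.0
    while own_speed > 0.5:                                   # ~walking pace
        lead_speed = max(0.0, lead_speed - lead_decel * dt)  # patrol car brakes
        desired_gap = max(5.0, lead_speed * time_gap)        # keep a time gap
        target = lead_speed + 0.5 * (gap - desired_gap)      # follow the lead car
        target = max(0.0, min(set_speed, target))            # never exceed set speed
        delta = max(-3.0 * dt, min(2.0 * dt, target - own_speed))  # accel limits
        own_speed = max(0.0, own_speed + delta)
        gap += (lead_speed - own_speed) * dt
        t += dt
    return t

print(f"Follower is essentially stopped after about {simulate_lead_vehicle_stop():.0f} s")
```

With these invented parameters the follower is stopped in well under a minute; the real stop reportedly took about seven minutes, presumably because the officers slowed down far more gradually.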

It took about seven miles for police to bring the Model S to a stop. Samek was promptly arrested for drunk driving, but a cunning lawyer might attempt to point out he wasn’t technically driving. Whatever the outcome, this is a brave new world.

Police Pull Over Self-Driving Tesla with Sleeping Man Behind the Wheel

California Highway Patrol officers in Redwood City stopped a Tesla Model S they suspected was running Autopilot with a drunk driver asleep at the wheel. The incident occurred last Friday, November 30th, at 3:37AM PT, when officers observed a car going 70 mph on Highway 101 with a driver who appeared to be asleep.

After flashing their lights and sirens in an attempt to pull the car over, the officers deployed a strategy based around their assumption that the Tesla Model S was running on Autopilot. According to the CHP incident report, two unit cars pulled up in front of and behind the Tesla to get the car to gradually come to a stop after a seven-mile chase. A statement from the CHP reads, “We cannot confirm at this time if the “driver assist” feature was activated but considering the vehicle’s ability to slow to a stop when [the driver] was asleep, it appears the “driver assist” feature may have been active at the time.”

It’s difficult to determine whether Autopilot was actually on at the time, as the feature requires drivers to keep a firm grip on the steering wheel for it to stay engaged. It’s possible that the driver had another Tesla feature on, like Traffic-Aware Cruise Control, which manages the car’s speed relative to the vehicle in front of it.

It’s not confirmed whether it was actually on Autopilot

It’s not clear which exact feature was engaged, as Teslas have several different autonomous driving features and it can be confusing to keep track of all of them. Most people, even cops and some Tesla drivers, aren’t totally sure what Teslas can do. Tesla warns that Autopilot is only meant to be used on highways, and still requires the driver to remain fully alert while driving, but cases like these show that drivers will continue to abuse Autopilot features and misinterpret them as “self-driving.”

CHP public information officer Art Montiel told the LA Times that there was no training for the situation the officers encountered, and he attributed the outcome to their “quick thinking.” So while there isn’t yet a standard plan for pulling over a car with an unresponsive driver using some of this technology, it seems likely that police officers will devise one.

Police may have used Tesla’s Autopilot feature to stop driver asleep at the wheel

Tesla on Autopilot drove 7 miles with sleeping drunken driver, police say

It’s not uncommon for police officers to find a driver asleep in their car, but finding a driver asleep in one that’s motoring along at 70 mph … well, that’s something else.

The alleged incident took place on Highway 101 in Palo Alto, California at just after 3 a.m. on Friday, November 30.

At first, the episode sounds just plain weird, but the car was a Tesla Model S, which can be driven on Autopilot, a mode that offers a certain degree of self-driving functionality.

Understandably unhappy about the occupant apparently sleeping at the wheel of a moving car, officers with the California Highway Patrol had to work out how to bring the vehicle to a halt as it motored along the road.

They opted to call in additional patrol cars, several of which drove behind the Tesla to slow down traffic that was coming up the rear. At the same time, one of the cars took up a position just ahead of the Tesla before gradually slowing down, causing the Model S to follow suit.

In all, it took seven minutes and seven miles to bring the Tesla to a safe stop.

Local media reported that cops then had to bang on the window to rouse the driver, who was later identified as Los Altos planning commissioner Alexander Samek. He was arrested at the scene on suspicion of driving under the influence.

It has yet to be confirmed that the Tesla was on Autopilot, though the description of the incident seems to suggest that this was the case.

The curious part of the story, however, is that in Autopilot mode the driver still has to keep their hands on the wheel in order for the car to proceed, with Tesla’s system issuing multiple alerts if it detects otherwise. If the driver fails to respond to the alerts, the car should automatically slow to a halt until the driver demonstrates that they have overall control of the vehicle by taking the wheel.

The officers’ efforts to bring the Model S to a safe stop are clearly commendable, while some will praise Tesla’s technology in the way that it appeared to prevent an accident from occurring when the driver wasn’t in full control of the vehicle. The company will of course be keen to learn the full facts of the case, and to understand how the car was able to operate if the driver was truly asleep.

Commenting on the unusual incident, California Highway Patrol public information officer Art Montiel said: “It’s great that we have this technology; however, we need to remind people that … even though this technology is available, they need to make sure they know they are responsible for maintaining control of the vehicle.”

Cops Chased a Tesla for 7 Miles While the Driver Apparently Slept

That’s sophisticated cruiser control!

Late last week at 3:30 in the morning, the California Highway Patrol (CHP) noticed the driver of a gray Tesla Model S seemingly asleep in his car. Unfortunately, it was going 70 miles an hour down Highway 101 in Redwood City at the time. Fortunately, it was on Autopilot, which kept it in a single lane and responsive to traffic ahead of it.

The officer’s quick thinking led to a very elegant solution to getting the unresponsive driver, along with other motorists, out of a dangerous situation. Over seven minutes, they created a running traffic break, slowing down all the lanes behind the Tesla while another cruiser maneuvered in front of the electric sedan. That police vehicle then began to slow to a complete stop, which caused the Model S to also slow and stop.

Officers then knocked on the window and gave verbal commands, rousing the driver, one Alexander Samek, from his apparent slumber. He was placed in a cruiser and taken to a gas station (of all places) where a breathalyzer test was administered. He was then arrested. Samek, who runs a multi-billion-dollar real estate outfit, The Kor Group, was contacted by SFGate on Friday, but refused to talk to reporters.

The incident raises some interesting questions, the most puzzling of which is how Samek managed to keep the car operating while seemingly asleep. If Autopilot does not detect a driver’s hands on the wheel it will ask for some input using visual and audio prompts. This usually occurs every 30 seconds or so. If it does not receive any feedback, it will slow the car to a stop and turn on the hazard lights. This didn’t seem to occur in this case.

Tesla CEO Elon Musk addressed the situation on Twitter (tweet embedded below), reiterating how the car should behave under Autopilot and saying that he is looking into this particular situation. In another tweet (embedded below) he also added “…adding police car, fire truck & ambulance to the Tesla neural net in coming months,” meaning, we believe, that Tesla vehicles will then be able to distinguish first responder vehicles from others.

Some have taken advantage of the situation to criticize Tesla Autopilot — and, by extension, the Advanced Driving Assistance Systems (ADAS) available in many other vehicles — claiming that it encourages people to drive under the influence. We think that while this may or may not be true (and we hope to see the integration of a driver-focused camera as is used in Cadillac’s Super Cruise which can tell if a vehicle operator is alert and watching the road), it is still, overall, a safety benefit.

Although automakers should certainly consider how to counter criminal misuse of their vehicles, ADAS can greatly diminish the risk of a crash if a driver suffers a medical emergency and loses consciousness. While there have been a couple of high-profile fatal accidents in Tesla vehicles functioning on Autopilot, the company takes the position it is a positive contributor to safety. We have also seen a number of videos which appear to show the system avoiding crashes. It is also worth noting that Autopilot is constantly being improved and the entire fleet that has the system enabled is updated with any new changes on a regular basis.

Exactly. Default Autopilot behavior, if there’s no driver input, is to slow gradually to a stop & turn on hazard lights. Tesla service then contacts the owner. Looking into what happened here. — Elon Musk (@elonmusk) December 3, 2018

We’re adding police car, fire truck & ambulance to the Tesla neural net in coming months — Elon Musk (@elonmusk) December 3, 2018

Source: SFGate, YouTube

Watch How Cops Use Tesla Autopilot To Stop Alleged Drunk Driver

As technology advances, so must policing. Last week, when a couple of California Highway Patrol officers spotted a man apparently sleeping in the driver’s seat of a Tesla Model S going 70 mph down Highway 101 in Palo Alto around 3:30 am, they moved behind the car and turned on their siren and lights. When the driver didn’t respond, the cops went beyond their standard playbook. Figuring the Tesla might be using Autopilot, they called for backup to slow traffic behind them, then pulled in front of the car and gradually started braking. And so the Tesla slowed down, too, until it was stopped in its lane.

“Our officers’ quick thinking got the vehicle to stop,” says CHP public information officer Art Montiel. The officers arrested the driver, identified in a police report as 45-year-old Alexander Joseph Samek of Los Altos, for driving under the influence of alcohol.

Neither the cops nor Tesla has confirmed whether the Model S had Autopilot engaged at the time. It seems likely it was, though, since the vehicle was staying in its lane and responding to vehicles around it, even though its driver didn’t wake up until the cops knocked on his window.

Tesla clearly tells its customers who pay the extra $5,000 for Autopilot that they are always responsible for the car’s driving, and that they must remain vigilant at all times. Driving drunk is illegal. And the vehicle’s sorta-self-driving tech may have prevented a crash. But if Autopilot did allow a slumbering and allegedly drunk driver to speed down the highway, it brings up another question: Is Elon Musk’s car company doing enough to prevent human abuse of its technology?

It’s a long-standing but still-relevant criticism. Last year, a National Transportation Safety Board investigation into the 2016 death of an Ohio man whose Tesla hit a semi-truck while Autopilot was engaged concluded that Tesla bore some of the blame. When the oncoming truck turned across the path of the Tesla, the sedan didn’t slow down until impact. “The combined effects of human error and the lack of sufficient system controls resulted in a fatal collision that should not have happened,” NTSB Chairman Robert Sumwalt said at the time.

Since then, Tesla has restricted how long a driver can go without touching the steering wheel before receiving a warning beep. If they don’t respond, the system will eventually direct the car to stop and hit its hazard lights. That makes this incident a bit confusing, as Musk noted in a tweet.

The sensors in the steering wheel that register the human touch, though, are easy to cheat, as YouTube videos demonstrate. A well-wedged orange or water bottle can do the trick. Posters in online forums say they have strapped weights onto their wheels and experimented with Ziplock bags and “mini weights.” For a while, drivers could even buy an Autopilot Buddy “nag reduction device,” until the feds sent the company a cease-and-desist letter this summer.

All of which makes the design of similar systems offered by Cadillac and Audi look rather better suited to the task of keeping human eyes on the road, even as the car works the steering wheel, throttle, and brakes. Cadillac’s Super Cruise includes a gumdrop-sized infrared camera on the steering column that monitors the driver’s head position: Look away or down for too long, and the system issues a sharp beep. Audi’s Traffic Jam Pilot does the same with an interior gaze-monitoring camera.

Humans being human, they will presumably find ways to cheat those systems (perhaps borrowing inspiration from Homer Simpson), but it’s clear a system that monitors where a driver is looking is more robust for this purpose than one that can be fooled by citrus.
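
The difference between the two monitoring approaches discussed above can be shown in a few hypothetical lines: a constant weight on the wheel satisfies a torque check forever, but it cannot convince a camera-based monitor that the driver's eyes are on the road. The function names and thresholds are illustrative, not any manufacturer's real values.

```python
# Hypothetical comparison of the two attention checks discussed above.

def torque_check_passes(steering_torque_nm, threshold=0.3):
    """Torque-based check: any steady force on the rim counts as 'hands on'."""
    return abs(steering_torque_nm) >= threshold

def gaze_check_passes(seconds_eyes_off_road, limit=4.0):
    """Camera-based check: fails if the driver looks away for too long."""
    return seconds_eyes_off_road < limit

# A wedged water bottle supplies constant torque, so the torque check passes
# indefinitely; a sleeping driver still fails the gaze check.
print(torque_check_passes(0.5))    # True:  fooled by a weight on the wheel
print(gaze_check_passes(120.0))    # False: driver clearly not watching the road
```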

It’s possible Tesla will give it a shot. The Model 3 comes with an interior camera mounted near the rearview mirror, and though the automaker hasn’t confirmed what it’s for, don’t be surprised if an over-the-air software update suddenly gives those cars the ability to creep on their human overlords.

And if that doesn’t work, well, there’s always the Ludovico Treatment. Or a car that does all the driving, no human needed.

A Sleeping Tesla Driver Highlights Autopilot's Biggest Flaw

It took the police officers in two vehicles seven minutes to outsmart Tesla's Autopilot system.

December 3, 2018

This story originally appeared on PCMag

One day in the not too distant future, sleeping at the wheel will become commonplace because we'll all be traveling around in autonomous vehicles. However, in 2018 that's not the case, even if Tesla Autopilot is capable of driving a drunk man home.

As HotHardware reports, California Highway Patrol officers recently spotted a Tesla Model S driving south on Highway 101 with what looked to be a person asleep behind the wheel. Sure enough, when officers looked more closely they discovered a man who was both asleep and unresponsive. That man was 45-year-old Alexander Samek, a Los Altos planning commissioner, and he was drunk.

Tesla's Autopilot is quite an advanced autonomous driving aid, but it's not fully-autonomous and requires an alert driver behind the wheel at all times. Samek had decided in his drunken state to entrust his drive home to Tesla's system and clearly relaxed a little too much during the journey.

The problem officers had when they couldn't wake Samek was how to go about stopping the car. In the end it took two patrol cars around seven minutes to bring it to a halt. First the officers slowed traffic down behind the vehicle to create a gap, then one patrol vehicle drove in front of the car while the other drove behind and slowly lowered their speed. In the end, the Model S was brought to a standstill in the middle of the highway thinking it was stuck in a traffic jam.

Samek was then woken up with some loud knocks on the driver's side door. He was asked to carry out a field sobriety test and then promptly arrested. If this is Samek's first DUI, he faces up to six months in jail, fines and penalties of up to $1,000, and a potential six-month license suspension. However, he may also face additional charges because he was asleep at the wheel, which may count as reckless driving.

Police Catch Tesla Autopilot Driving Home Sleeping Drunk

In an exciting first, the autopilot feature in a Tesla car managed to save rather than kill its occupant.

At 0300 PT on Friday, the highway patrol pulled alongside a grey Tesla Model S travelling at 70 miles an hour on a freeway down toward Silicon Valley and noticed that the driver – 45-year-old Alexander Samek – appeared to be fast asleep.

So the police car pulled behind the Tesla and flashed its lights and siren in an effort to wake him up. He didn't budge, most likely because he was dead drunk; the police said in a statement he "was placed under arrest for driving under the influence of alcohol."

The officers figured that since the car was following the freeway lanes – rather than veering and swerving as you'd expect if no one was driving – the Tesla's "driver assist" feature was on, and so did a very smart thing: They maneuvered in front of the car and then gradually slowed down, finally coming to a stop in the lane.

The officers then tried to rouse Samek by banging on the window and shouting at him – he eventually woke up, at which point they quickly determined he was intoxicated and arrested him. An officer then drove the car off the freeway.

Highlighting just how incredibly lucky Samek is to be alive: back in March, a Tesla on Autopilot was responsible for killing its sober and conscious driver when it got confused and accelerated into a barrier where the same highway splits.

That deadly crash was only four miles from where Samek was pulled over – and he would have reached the exact same spot three minutes later if the cops hadn't intervened.

Of course, this being Tesla, the fact that the car continued to zoom along the freeway with an unconscious occupant was painted by CEO Elon Musk as a great thing.

So safe, so safe

The company itself says that the system is supposed to check for constant signals that the driver is capable of intervening – something that seems very unlikely given that a police car's lights and siren weren't enough to rouse its occupant.

But the Tesla fanbois were out in full force. "This is actually pretty amazing Marketing for Tesla. No one died and the autopilot works. Cool," tweeted one, prompting another to respond: "Idea @elonmusk for Autopilot: if the AP can detect whether or not if a driver is conscious then pull over and use hazard lights, send a signal to police or EMTs. Also make AP able to detect if police are trying to pull you over."


Which in turn prompted the publicity shy CEO of Tesla to tweet: "Exactly. Default Autopilot behavior, if there’s no driver input, is to slow gradually to a stop & turn on hazard lights. Tesla service then contacts the owner. Looking into what happened here."

Which is, of course, not what happened. The cops say they had to force the car to slow down by getting in front of it and putting themselves and their vehicle in harm's way.

In Musky World, however, the system yet again worked perfectly. Like when a Tesla slammed into a stationary firetruck at around 65mph on Interstate 405 in Culver City, California. Or when one crashed into a parked police car in Laguna Beach, also in California. Or when Joshua Brown was killed in Florida after his Tesla crashed into a truck. Or when one ended up on its roof in a swamp in Minnesota. Each time Autopilot was involved.

We've asked Tesla for comment on this matter and will update as more details come in. ®


Tesla autopilot saves driver after he fell asleep at wheel on the freeway

A man from northern California was arrested after police caught him sleeping behind the wheel.

If, like me, you’re not a car lover, you may be wondering how that’s possible. Well, he was driving a Tesla Model S – a luxury electric sedan that offers semi-autonomous driving features.

Meanwhile, my car barely drives.

California Highway Patrol Officer Art Montiel told Business Insider the Tesla was driving at about 70 mph (113 km/h) at 3:40 on Friday morning. When officers attempted to pull the car over using lights and sirens, it didn't respond, alerting police to the possibility the car was in self-drive mode.

In a strategic move, one officer blocked traffic behind the Tesla while another drove in front, gradually slowing down until all three cars came to a complete stop. The strategy was safe as the Model S is capable of responding to varying traffic speeds (when did we enter the future?).
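That ability to respond to varying traffic speeds is the whole trick: traffic-aware cruise control continuously adjusts to hold a gap behind the vehicle ahead, so a lead car that brakes gently drags the follower down to a stop with it. The toy simulation below illustrates the idea with an invented proportional controller; the gains, time gap, and vehicle dynamics are assumptions for illustration, not Tesla's actual control law.

```python
# Toy simulation of why slowing a lead vehicle also stops a car that is
# following it under traffic-aware cruise control. Gains, time gap, and
# dynamics are invented; this is not any manufacturer's controller.

DT = 0.1            # simulation step (s)
TIME_GAP = 2.0      # desired following gap (s)
MIN_GAP = 5.0       # desired standstill gap (m)
KP_GAP, KP_SPEED = 0.2, 0.5   # hypothetical controller gains

def follower_accel(gap_m: float, ego_speed: float, lead_speed: float) -> float:
    """Accelerate or brake to hold a speed-dependent gap behind the lead car."""
    desired_gap = MIN_GAP + TIME_GAP * ego_speed
    return KP_GAP * (gap_m - desired_gap) + KP_SPEED * (lead_speed - ego_speed)

lead_speed, ego_speed, gap = 31.0, 31.0, 70.0   # ~70 mph, 70 m apart
for _ in range(1200):                           # two simulated minutes
    lead_speed = max(0.0, lead_speed - 0.5 * DT)   # patrol car brakes gently
    ego_speed = max(0.0, ego_speed + follower_accel(gap, ego_speed, lead_speed) * DT)
    gap += (lead_speed - ego_speed) * DT

print(f"ego speed after 2 min: {ego_speed:.1f} m/s, gap: {gap:.1f} m")
```

In this sketch, the following car ends up stationary a few metres behind the stopped lead car, which is essentially what the officers relied on.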

“Once the vehicle came to a stop, the officers got out of their patrol cars, approached the Tesla, and knocked on the windows to wake up the driver,” said Montiel.

45-year-old Alexander Samek was the man found allegedly sleep-driving. Unsurprisingly, the police claim he was also intoxicated.



Montiel praised the “quick thinking” displayed by the officers to get Samek off the road before he or any other driver was hurt.

However, Montiel issued a warning to drivers during an interview with CBS.

“It’s great we have this technology, however we need to remind people that they need to be responsible,” he said.

A crucial point which may put the brakes on the Jetsons future you’re excited for is that Tesla’s autopilot feature doesn’t mean the car can literally drive itself. In most models, the system will warn the driver if they don’t have their hands on the wheel and deactivate itself if the driver doesn’t respond. Enhanced features allow the vehicle to automatically stay within a lane, change lanes when necessary and even exit the freeway.

In Samek’s case, it’s impossible to know how long the autopilot had been engaged and if it was likely to deactivate, or whether he had somehow fallen asleep with his hands resting on the wheel.

This is not the first time Tesla vehicles have been involved in driving incidents due to the nature of the autopilot feature – in one case, a man was killed in a collision with a highway barrier after ignoring his car’s automated warnings.

Neither of these features sounds very foolproof – or safe – to me. I daresay we haven't seen the last of Tesla's road incidents.

The brilliant way cops pulled over a drunk driver passed out in a Tesla on autopilot

In the early morning hours, California Highway Patrol chased a grey Tesla S for an unfathomable seven miles down Highway 101 as the driver slept, police said.

Redwood City Area CHP officers said they observed Alexander Joseph Samek, a local Los Altos politician, driving at around 3:30 a.m. PST on Nov. 30. Police followed Samek with lights and siren on, but he remained “unresponsive,” and “appeared to be asleep at the wheel,” according to the arrest report.

Assuming that the car was on Autopilot, police drove in front of Samek and "began slowing directly in front of the Tesla in hopes that the ‘driver assist’ feature had been activated and the Tesla would slow to a stop as the patrol vehicle came to a stop," the arrest report said. Samek was charged on suspicion of driving under the influence.

But what is befuddling transportation analysts and Tesla watchers is that the chase could even go on for that long. Tesla's "Autopilot" feature requires a driver to touch the steering wheel every minute, or the system alerts the driver and gradually brings the car to a stop. It seems that in this case, Autopilot may not have worked, or the driver somehow subverted the process, experts say.

[Photo: The wheel of a Tesla Model S P100D, May 8, 2018. Credit: Artur Widak/NurPhoto via Getty Images]

Tesla declined to comment on the accident or confirm the car was in Autopilot mode. But on Sunday night, Musk tweeted: "Default Autopilot behavior, if there’s no driver input, is to slow gradually to a stop & turn on hazard lights. Tesla service then contacts the owner. Looking into what happened here."

Exactly. Default Autopilot behavior, if there’s no driver input, is to slow gradually to a stop & turn on hazard lights. Tesla service then contacts the owner. Looking into what happened here. — Elon Musk (@elonmusk) December 3, 2018

In a follow-up tweet, Musk said that Autopilot could not distinguish between different types of emergency vehicles, but that it would be able to in the near future. "We’re adding police car, fire truck & ambulance to the Tesla neural net in coming months," he wrote.

We’re adding police car, fire truck & ambulance to the Tesla neural net in coming months — Elon Musk (@elonmusk) December 3, 2018

Redwood City CHP is familiar with the Tesla Autopilot feature in part because of a fatal crash the agency investigated in March. A 38-year-old engineer at Apple died after he did not place his hands on the wheel in time when the car was in Autopilot mode, Tesla said.

The March crash is being investigated by the National Transportation Safety Board.

Dan Edmunds, director of vehicle testing at Edmunds, an automotive research firm, has been reviewing partially automated vehicles, and called Tesla’s Autopilot a misleading term for an "overhyped automated cruise control system." He said it was difficult to come up with an explanation for such a long car chase, and it underscored shortcomings with Tesla's safety features.

“Certainly somebody could defeat the one-minute timeout that allows you to put your hands on the wheel and the car could go longer,” Edmunds told ABC News. “Cadillac's Super Cruise system would not have allowed you to behave this way because Super Cruise does something that Tesla doesn't do and should do. It has sensors that look at your head to see which way it's pointed to make sure your chin's up and not down against your shirt, and also looks at your eyeballs to see where they're looking. So even if your head's up, and you look off to the side, it will warn you and eventually disengage.”

"The fact that it doesn't monitor the driver's head position and line of sight is really a major shortcoming," Edmunds said. "Just because somebody has their hands on the wheel, maybe the guy's leaning on it, passed out, with just enough force to make it think that he's got his hands on the wheel. The car isn't really sure what the driver's looking at. It doesn't matter if you have your hands on the wheel or not, it matters if you're looking out the windshield at the cars ahead."

Police chased the 'unresponsive' driver of a Tesla S that was on Autopilot for 7 miles in California. How can that happen?

We have been hearing stories about how autonomous systems in cars have failed and, in some cases, even killed people. How about we talk about a life-saving incident for a change? Cops spotted a 45-year-old man fast asleep in his Tesla Model S as it cruised down an expressway. They chased the car for about seven minutes before successfully stopping it. Imagine the dire consequences had it not been for the Autopilot system.

According to the San Francisco Chronicle, cops in California arrested a 45-year-old man who was sleeping in his Tesla Model S while it cruised down Highway 101. The driver was in an inebriated state, and Autopilot took over once he stopped giving inputs to the car. Although it is still unclear whether the car was actually running on Autopilot, it took officers several miles and about seven minutes to stop the Model S.

The driver, Alexander Samek, was detained after police spotted his gray Model S driving at 70 miles per hour southbound on Highway 101 at 3:30 a.m. and gave chase to stop it.

HOW AUTOPILOT IS SUPPOSED TO WORK

Tesla did not comment on whether Autopilot was engaged, but if it actually was, then the technology saved someone's life. The automaker said only that it was “looking into what happened here.” Since all companies that install autonomous technologies in their cars insist that drivers should always remain in control of the vehicle, these kinds of incidents raise concerns about how the technology is used by drivers.

The way the system is designed, it is supposed to detect if the driver is holding the steering wheel, and, if they’re not, provide a series of warnings.

If the driver provides no response at all, the car is supposed to start slowing down on its own within a few minutes, before coming to a complete stop and turning on its hazard lights.
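In other words, the advertised behavior is a timed escalation ladder: nag, nag louder, then slow to a stop with the hazards on. The sketch below shows the general shape of that kind of policy; the states, timings, and function names are made up for illustration and are not Tesla's implementation.

```python
# Sketch of a timed escalation policy for missing driver input.
# States, thresholds, and actions are hypothetical; the real system's
# timings are not described at this level of detail in the reports above.

from enum import Enum, auto

class State(Enum):
    NORMAL = auto()
    VISUAL_WARNING = auto()
    AUDIBLE_WARNING = auto()
    SLOWING_TO_STOP = auto()

def escalate(seconds_without_input: float) -> State:
    """Map time since the last detected steering input to an escalation state."""
    if seconds_without_input < 30:
        return State.NORMAL
    if seconds_without_input < 60:
        return State.VISUAL_WARNING       # e.g. flash a message on the display
    if seconds_without_input < 90:
        return State.AUDIBLE_WARNING      # e.g. chime repeatedly
    return State.SLOWING_TO_STOP          # brake gradually, turn on hazard lights

for t in (10, 45, 75, 120):
    print(t, escalate(t).name)
```

The open question in this incident is simply whether that final state was ever reached, or whether something (a hand resting on the wheel, for instance) kept resetting the timer.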

WHAT THEY HAD TO SAY


This is what Chronicle’s report said, “When officers pulled up next to the car, they allegedly saw Samek asleep, but the car was moving straight, leading them to believe it was in autopilot mode. The officers slowed the car down after running a traffic break, with an officer behind Samek turning on emergency lights before driving across all lanes of the highway, in an S-shaped path, to slow traffic down behind the Tesla, Montiel said. He said another officer drove a patrol car directly in front of Samek before gradually slowing down, prompting the Tesla to slow down as well and eventually come to a stop in the middle of the highway, north of the Embarcadero exit in Palo Alto — about 7 miles from where the stop was initiated. Authorities said the entire operation took about seven minutes.”

California Highway Patrol Public Information Officer Art Montiel told the Mountain View Voice that “It’s great that we have this technology; however, we need to remind people that ... even though this technology is available, they need to make sure they know they are responsible for maintaining control of the vehicle.”

OUR TAKE


Such instances are definitely heart-warming to read, especially when technology prevented what could easily have been a fatal incident. On the other hand, people are handing themselves over completely to a technology that is there to assist with driving, not to drive the car on its own. No matter how sophisticated and complex, it is “artificial” intelligence at the end of the day. So don't be foolish enough to let the technology take over for you. What is your take on this whole episode? Do you think this one-off incident was enough to shut down Autopilot critics? Let us know your thoughts in the comments section below.


Someone give the Autopilot a cape already

A TESLA driver was arrested in California after police found the motorist asleep behind the wheel at 70mph.

A California Highway Patrol officer spotted the Tesla Model S driver out for the count along a highway south of San Francisco, and was unable to stir the driver from his slumber with flashing lights or sirens.

Assuming the Tesla’s semi-autonomous “Autopilot” driver aids were turned on, the police positioned squad cars either side and ahead of the Model S, and slowed down to simulate a traffic jam, bringing the dormant driver’s electric saloon to a complete stop safely.

After waking the driver up, police conducted a breathalyser test on the man and found he was above the drink-drive limit. He was then charged with driving under the influence of alcohol and arrested.

This isn’t the first time this year the California Highway Patrol have encountered a drowsy Tesla driver. In January 2018, officers arrested a different man who fell asleep in his Model S on the San Francisco-Oakland Bay Bridge, and was later found to be over twice the drink drive limit.

When u pass out behind the wheel on the Bay Bridge with more than 2x legal alcohol BAC limit and are found by a CHP Motor. Driver explained Tesla had been set on autopilot. He was arrested and charged with suspicion of DUI. Car towed (no it didn’t drive itself to the tow yard). pic.twitter.com/4NSRlOBRBL — CHP San Francisco (@CHPSanFrancisco) 19 January 2018

In July 2018, the UK car safety body Thatcham Research called out the “deeply unhelpful” names that some car makers give their driver aids, and highlighted systems such as Tesla’s Autopilot for potentially “lulling drivers into a false sense of security”.


Drunk Tesla driver caught asleep behind the wheel at 70mph
