Citation record for Incident 20

Suggested citation format

Anonymous. (2016-06-30) Incident Number 20. in McGregor, S. (ed.) Artificial Intelligence Incident Database. Partnership on AI. Retrieved on July 30, 2021 from incidentdatabase.ai/cite/20.

Incident Stats

Incident ID: 20
Report Count: 31
Incident Date: 2016-06-30

CSET Taxonomy Classifications

Taxonomy Details

Full Description

Multiple unrelated car accidents, resulting in varying levels of harm, have occurred while a Tesla's autonomous driving mode was in use. The vehicle's driving modes range from fully human-controlled to fully autonomous, with the system able to control speed, direction, acceleration, deceleration, and lane changes. In most cases, the driver was warned prior to impact, alerting them to the need to intervene.

Short Description

Multiple unrelated car accidents, resulting in varying levels of harm, have occurred while a Tesla's Autopilot was in use.

Severity

Severe

Harm Type

Harm to physical health/safety, Financial harm

AI System Description

Tesla Autopilot is an autonomous driving system that allows a vehicle to determine speed, acceleration, deceleration, direction, and lane changes.

System Developer

Tesla

Sector of Deployment

Transportation and storage

Relevant AI functions

Perception, Cognition, Action

AI Techniques

Autonomous vehicle; Tesla autopilot

AI Applications

autonomous driving

Named Entities

Tesla, Joshua Brown, Wei Huang, Nicolas Ciarlone, Elaine Herzberg

Technology Purveyor

Tesla

Beginning Date

2013-01-01T00:00:00.000Z

Ending Date

2018-01-01T00:00:00.000Z

Near Miss

Harm caused

Intent

Accident

Lives Lost

Yes

Infrastructure Sectors

Transportation

Data Inputs

360 Ultrasonic Sonar; Image Recognition Camera; Long Range Radar; traffic patterns

Incidents Reports

Tesla driver killed in crash with Autopilot active, NHTSA investigating

theverge.com · 2016

A Tesla Model S with the Autopilot system activated was involved in a fatal crash, the first known fatality in a Tesla where Autopilot was active. The company revealed the crash in a blog post posted today and says it informed the National Highway Transportation Safety Administration (NHTSA) of the incident, which is now investigating.

The accident occurred on a divided highway in central Florida when a tractor trailer drove across the highway perpendicular to the Model S. Neither the driver — who Tesla notes is ultimately responsible for the vehicle’s actions, even with Autopilot on — nor the car noticed the big rig or the trailer "against a brightly lit sky" and brakes were not applied. In a tweet, Tesla CEO Elon Musk said that the vehicle's radar didn't help in this case because it "tunes out what looks like an overhead road sign to avoid false braking events."

Because of the high ride-height of the trailer, as well as its positioning across the road, the Model S passed under the trailer and the first impact was between the windshield and the trailer. Tesla writes that if the car had impacted the front or rear of the trailer, even at high speed, the car’s safety systems "would likely have prevented serious injury as it has in numerous other similar incidents."

"Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert."

The accident occurred May 7th in Williston, Florida with 40-year-old Ohio resident Joshua Brown driving. The truck driver was not injured.

Tesla says Autopilot has been used for more than 130 million miles, noting that, on average, a fatality occurs every 94 million miles in the US and every 60 million miles worldwide. The NHTSA investigation, Tesla says, is a "preliminary evaluation" to determine if the Autopilot system was working properly, which can be a precursor to a safety action like a recall.
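
As a rough plausibility check of the mileage comparison Tesla makes here, the figures can be put on a common per-mile basis. The sketch below assumes the single fatality known at the time is the only one in the 130 million Autopilot miles; all numbers are Tesla's own, not independent statistics.

```python
# Back-of-the-envelope comparison of the fatality rates quoted above.
# Assumes one fatality over the 130 million Autopilot miles Tesla reported.

autopilot_miles = 130e6            # miles driven with Autopilot engaged
autopilot_fatalities = 1           # the single fatality known at the time

us_miles_per_fatality = 94e6       # US average, per Tesla's post
world_miles_per_fatality = 60e6    # worldwide average, per Tesla's post

autopilot_miles_per_fatality = autopilot_miles / autopilot_fatalities

print(f"Autopilot:  one fatality per {autopilot_miles_per_fatality / 1e6:.0f}M miles")
print(f"US average: one fatality per {us_miles_per_fatality / 1e6:.0f}M miles")
print(f"Worldwide:  one fatality per {world_miles_per_fatality / 1e6:.0f}M miles")
```

With only one event in the numerator, the Autopilot figure carries very wide statistical uncertainty.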

Our condolences for the tragic loss https://t.co/zI2100zEGL — Elon Musk (@elonmusk) June 30, 2016

In the blog post, Tesla reiterates that customers are required to agree that the system is in a "public beta phase" before they can use it, and that the system was designed with the expectation that drivers keep their hands on the wheel and that the driver is required to "maintain control and responsibility for your vehicle." Safety-critical vehicle features rolled out in public betas are new territory for regulators, and rules haven't been set.

The first fatality in a Tesla in Autopilot mode

Some autonomous driving experts have criticized Tesla for introducing the Autopilot feature so early, with a Volvo engineer saying the system "gives you the impression that it's doing more than it is." In other words, the car handles most situations so smoothly that drivers are led to believe that the car can handle any situation it might encounter. That is not the case, and the driver must remain responsible for the actions of the vehicle, even with Autopilot active. Several automakers working on systems similar to Autopilot — GM with Super Cruise, for instance — have only tested the feature privately and have said they won't deploy until they're ready.

Volvo has said that it will take full legal liability for all its cars when they are operating in fully autonomous mode, and plans to launch a limited trial of its autonomous Drive Me technology next year.

NHTSA issued the following statement to The Verge:...

Tesla driver killed in crash with Autopilot active, NHTSA investigating
Two Years On, A Father Is Still Fighting Tesla Over Autopilot And His Son's Fatal Crash

jalopnik.com · 2018

In early 2016, 48-year-old Gao Jubin had high hopes for the future, with plans to ease up on running a logistics company and eventually turn over control of the business to his son, Gao Yaning. But his plans were abruptly altered when Yaning died at the wheel of a Tesla Model S that crashed on a highway in China while the car—Jubin believes—was traveling in Autopilot, the name for Tesla’s suite of driving aids. Now, Jubin says, “Living is worse than death.”

The last time Jubin saw Yaning, it was Jan. 20, 2016, days before the Chinese New Year. Jubin’s family had just left a wedding reception, and his son was in high spirits. The ceremony was for his girlfriend’s brother, so he’d spent the day keeping busy and helping out when needed.

Given that the wedding involved his potential future in-laws, Chinese tradition meant Yaning should enthusiastically help out with planning the ceremony, Jubin told Jalopnik this month, and he did. “He did a lot of logistics of the wedding, preparing transport and accommodating the guests,” Jubin said.

Following the ceremony, Yaning arranged transportation for his parents to get home. He took Jubin’s Tesla Model S and went to meet some friends. Before departing, Jubin said, his son offered up a word of caution: “Be careful driving.”

In retrospect, it was an omen. On Yaning’s way home, the Model S crashed into the back of a road-sweeping truck while traveling on a highway in the northeastern province of Hebei. At the time, his family believes, the car was driving in Autopilot, which allows those cars to automatically change lanes after the driver signals, manage speed on roads and brake to avoid collisions.

Yaning, 23, died shortly after the crash. Local police reported there was no evidence the car’s brakes were applied.

The crash in China came at a precarious time for Tesla. In the summer of 2016, the automaker and its semi-autonomous system were facing intense scrutiny after it revealed a Florida driver had died earlier that year in a crash involving a Model S cruising in Autopilot. An investigation by U.S. auto regulators was already underway.

But no one knew about Yaning’s death—which happened months before the Florida crash—until that September, when Jubin first went public about a lawsuit he’d filed against Tesla over the crash. Police had concluded Yaning was to blame for the crash, but Jubin’s suit accused Tesla and its sales staff of exaggerating Autopilot’s capabilities.

He asked a local court in Beijing to authorize an independent investigation to officially conclude that Autopilot was engaged. The suit sought an apology from Tesla for how it promoted the feature.

Multiple U.S. news outlets covered Jubin’s case after Reuters wrote about the suit in 2016, but interest almost immediately faded. The case, however, is still ongoing, according to Jubin, who spoke with Jalopnik this month by Skype in his first interview with a U.S. media outlet.

He hopes the suit will bring more attention to the system’s limited capabilities and force Tesla to change the way it deploys the technology before it’s refined. (A federal lawsuit filed last year in the U.S. echoed that concern, alleging the automaker uses drivers as “beta testers of half-baked software that renders Tesla vehicles dangerous.” Tesla called the suit “inaccurate” and said it was a “disingenuous attempt to secure attorney’s fees posing as a legitimate legal action.”)

In September 2016, the company said the extensive damage made it impossible to determine whether Autopilot was engaged. Following the incident, Tesla updated the system so that if a driver ignores repeated warnings to resume control of the vehicle, they would be locked out from using Autopilot during the rest of the trip.

Even if the system was engaged during the collision, Tesla told the Wall Street Journal, Autopilot warns drivers to keep their hand on the steering wheel, which is reinforced by repeated warnings to “take over at any time.” Yaning, Tesla said, took no action even though the road sweeper “was visible for nearly 20 seconds.”

A Tesla spokesperson said in a statement to Jalopnik that “We were deeply saddened to learn that the driver of the Model S lost his life in the incident.” A police investigation, the spokesperson said, found the “main cause of the traffic accident” was Yaning’s failure “to drive safely in accordance with operation rules,” while the secondary cause was that the street sweeper had “incomplete safety facilities.”

“Since then, Tesla has been cooperating with an ongoing civil case into the incident, through which the court has required that a third-party appraiser review the data from the vehicle,” the statement said. “While the third-party appraisal is not yet complete, we have no reason to believe that Autopilot on this vehicle ever functioned other than as designed.”

Within the first year of introducing Autopilot, in October 2015, Tesla faced intense criticism for labeling the featur...

Two Years On, A Father Is Still Fighting Tesla Over Autopilot And His Son's Fatal Crash
Watch Tesla Autopilot Tackle This ‘Curve of Death’ in Impressive Video

inverse.com · 2018

Tesla’s Autopilot is getting smarter. The semi-autonomous driving system, set to form the basis of CEO Elon Musk’s self-driving car ambitions, has been receiving regular software updates to improve the underlying artificial intelligence. The company’s vehicles running on the Hardware 2 system can now tackle some of the toughest corners on the road.

In a video shared by Electrek on Thursday, Tesla Model X owner Mike switches on Autopilot along a curve of Virginia highway near Chester. The Tesla Model S P100D loaner car that he’s using effortlessly guides its way through a tricky corner, where previously his car struggled to make the turn.

It’s a promising step for Tesla as it prepares to roll out autonomous driving. The Hardware 2 suite of cameras and sensors, which has shipped with every car since October 2016, is expected to power the new mode through software updates alone. Musk has suggested that coast-to-coast autonomy is achievable in the next six months.

Before that point, Tesla needs to get the existing Autopilot features moving past the Mobileye systems used in previous vehicles. In the Enhanced Autopilot feature set, Tesla promises advanced features like moving to the correct lane, a feature this video suggests is coming soon.

“Once on the freeway, your Tesla will determine which lane you need to be in and when,” the company says in its description of Enhanced Autopilot. “In addition to ensuring you reach your intended exit, Autopilot will watch for opportunities to move to a faster lane when you’re caught behind slower traffic. When you reach your exit, your Tesla will depart the freeway, slow down and transition control back to you.”

Watch the feat in action below:

In a second video, Mike tried the curve again with his Model X at night. It sticks closer to the left side of the curve than right, but ultimately offers a far smoother ride than before:

“This is a huge improvement for the vehicle,” Mike says in the second video. “One of the best runs ever by this Model X.”

Perhaps autonomy this year isn’t so unfeasible after all....

Watch Tesla Autopilot Tackle This ‘Curve of Death’ in Impressive Video
Who's to blame when robot cars kill?

trustedreviews.com · 2018

It’s the year 2025. Your driverless car has just crashed into a tree at 55mph because its built-in computer valued a pedestrian’s life above your own. Your injuries are the result of a few lines of code that were hacked out by a 26-year-old software programmer in the San Francisco Bay Area back in the heady days of 2018. As you wait for a paramedic drone, bleeding out by the roadside, you ask yourself – where did it all go wrong?

The above scenario might sound fanciful, but death by driverless car isn’t just inevitable – it’s already happening. Most recently, an Uber self-driving car hit and killed a pedestrian in Arizona, while in May 2016, semi-autonomous software failed in a similarly tragic way when Joshua Brown’s Tesla Model S drove under the trailer of an 18-wheel truck on a highway while in Autopilot mode.

Tesla admits that its system sensors failed to distinguish the white trailer against a bright sky, resulting in the untimely death of the 40-year-old driver. But Tesla also says that drivers need to keep their hands on the wheel to stop accidents like this from happening, even when Autopilot is activated. Despite the name, it’s a semi-autonomous system.

Uber, on the other hand, may not be at fault, according to a preliminary police report, which lays the blame on the victim.

It’s a sad fact that these tragedies are just a taste of what’s to come. In writing this article, I’ve realised how woefully unprepared we are for the driverless future – expected as soon as 2020. What’s more worrying is that this future is already spilling out into our present, thanks to semi-autonomous systems like Tesla’s Autopilot and Uber’s (now halted) self-driving car tests.

Tomorrow’s technology is here today, and with issues like ethics and liability now impossible to avoid, car makers can’t afford not to be ready.

What happens when a driverless car causes an accident and, worse still, kills someone?

Understanding death by computer

To tackle liability, we need to ask how and why a driverless car could kill someone. Unlike humans, cars don’t suffer fatigue, they don’t experience road rage, and they can’t knock back six pints of beer before hitting the highway – but they can still make mistakes.

Tesla’s Model S features semi-autonomous Autopilot technology

Arguably, the most likely cause of “death-by-driverless-car” would be if a car’s sensors were to incorrectly interpret data, causing the computer to make a bad driving decision. While every incident, fatal or otherwise, will result in fixes and improvements, tracing the responsibility would be a long and arduous legal journey. I’ll get to that later.

The second possible cause of death-by-driverless-car is much more difficult to resolve, because it’s all about ethics.

Picture this scenario: You’re riding in a driverless car with your spouse, travelling along a single-lane, tree-lined B-road. There are dozens upon dozens of B-roads like this in the UK. The car is travelling at 55mph, which is below the 60mph national speed limit on this road.

A blind bend is coming up, so your car slows down to a more sensible 40mph. As you travel around the bend, you see that a child has run out onto the road from a public footpath hidden in the trees. His mother panicked and followed him, and now they’re both in the middle of the road. It’s a windy day and your car is electric, so they didn’t hear you coming. The sensors on your car didn’t see either of them until they were just metres away.

There’s no chance of braking in time, so the mother and child are going to die if your car doesn’t swerve immediately. If the car swerves to the left, you go off-road and hit a tree; if the car swerves right, you hit an autonomous truck coming in the opposite direction. It’s empty, so you and your spouse would be the only casualties.

In this situation, the car is forced to make a decision – does it hit the pedestrians, almost certainly killing them, or does it risk the passengers, on the chance that they may survive the accident? The answers will have been decided months (or even years) before, when the algorithms were originally programmed into your car’s computer system – and they could very well end your life. While the car won’t know it, this is in effect an ethical decision.

To demonstrate the overwhelming difficulty of coding ethics, have a go at the Massachusetts Institute of Technology’s Moral Machine. It’s a quiz that aims to track how humans react to moral decisions made by self-driving cars. You’re presented with a series of scenarios where a driverless car has to choose between two evils (i.e. killing two passengers or five pedestrians) and you have to choose which one you think is most acceptable. As you’ll quickly realise, it’s really hard.

Which scenario would you choose? MIT’s Moral Machine is a self-driving nightmare

If all of this scares you, you’re not alone. In March, a survey by US motoring organisation AAA revealed that three out of four US drivers are “afraid” of riding i...

Who's to blame when robot cars kill?
Can You Sue a Robocar?

theatlantic.com · 2018

Advocates of autonomy tend to cite overall improvements to road safety in a future of self-driving cars. Ninety-four percent of car crashes are caused by driver error, and both fully and partially autonomous cars could improve that number substantially—particularly by reducing injury and death from speeding and drunk driving. Even so, crashes, injuries, and fatalities will hardly disappear when and if self-driving cars are ubiquitous. Robocars will crash into one another occasionally and, as the incident in Tempe illustrates, they will collide with pedestrians and bicyclists, too. Overall, eventually, those figures will likely number far fewer than the 37,461 people who were killed in car crashes in America in 2016.

The problem is, that result won’t be accomplished all at once, but in spurts as autonomous technology rolls out. During that period, which could last decades, the social and legal status of robocar safety will rub up against existing standards, practices, and sentiments. A fatality like the one in Tempe this week seems different because it is different. Instead of a vehicle operator failing to see and respond to a pedestrian in the road, a machine operating the vehicle failed to interpret the signals its sensors received and process them in a way that averted the collision. It’s useful to understand, and even to question the mechanical operation of these vehicles, but the Tempe fatality might show that their legal consequences are more significant than their technical ones.

Arizona Governor Doug Ducey has turned the state into a proving ground for autonomous cars. Ducey, a businessman who was the CEO of the franchised ice-cream shop Cold Stone Creamery before entering politics, signed an executive order in 2015 instructing state agencies to undertake “any necessary steps to support the testing and operation of self-driving cars on public roads within Arizona.” While safety gets a mention, the order cites economic development as its primary rationale. Since then, Uber, Waymo, Lyft, Intel, GM, and others have set up shop there, testing self-driving cars in real-world conditions—a necessity for eventually integrating them into cities.

The 2015 order outlines a pilot program, in which operators are required to “direct the vehicle’s movement if necessary.” On March 1, 2018, Ducey issued an updated order, which allowed fully autonomous operation on public roads without an operator, provided those vehicles meet a “minimal risk condition.” For the purposes of the order, that means that the vehicle must achieve a “reasonably safe state ... upon experiencing a failure” in the vehicle’s autonomous systems. The new order also requires fully autonomous vehicles to comply with registration and insurance requirements, and to meet any applicable federal laws. Furthermore, it requires the state Departments of Transportation and Public Safety, along with all other pertinent state agencies, to take steps to support fully autonomous vehicles. In this case, “fully autonomous” means a level four or five system by SAE standard, or one that a human need not operate at all, but which can be taken over by a human driver if needed....
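
For readers unfamiliar with the SAE scale referenced in the order, the sketch below summarizes the six J3016 driving-automation levels; the one-line descriptions are paraphrases for illustration, not the standard's wording.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (descriptions paraphrased)."""
    NO_AUTOMATION = 0       # human driver does everything
    DRIVER_ASSISTANCE = 1   # steering OR speed support, e.g. adaptive cruise control
    PARTIAL_AUTOMATION = 2  # steering AND speed support, driver must supervise (Tesla Autopilot)
    CONDITIONAL = 3         # system drives in limited conditions, driver must take over on request
    HIGH_AUTOMATION = 4     # no driver needed within a defined operational domain
    FULL_AUTOMATION = 5     # no driver needed anywhere

# Arizona's 2018 order treats "fully autonomous" as Level 4 or 5:
fully_autonomous = {level for level in SAELevel if level >= SAELevel.HIGH_AUTOMATION}
print(sorted(fully_autonomous))
```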

Can You Sue a Robocar?
How An Uber Self-Driving Car Killing A Pedestrian Could Impact Tesla's Stock

forbes.com · 2018

A self-driving Uber car killed a pedestrian in Arizona on Sunday evening. The car was in self-driving mode, the crash occurred around 10 p.m., and there was a vehicle operator in the front seat. In an emailed statement, an Uber spokesperson wrote, “Our hearts go out to the victim’s family. We are fully cooperating with authorities in their investigation of this incident.” This is a tragedy for the person and her family. It could also bring to the forefront an issue that everyone in the self-driving industry knew it would eventually have to address.

Tesla is one of the most “exposed” companies to the self-driving initiative, and it has been vocal about providing Level 5 autonomous driving, or completely hands-free operation under all conditions (see chart below). At the SXSW conference just over a week ago Elon Musk, Tesla’s CEO, said, “I think in the next year, self-driving will encompass essentially all modes of driving.” He said they would make roads safer, adding, “At least, a hundred to two hundred percent safer than a person [a human driver] by the end of next year. We’re talking maybe 18 months from now.” Tesla’s current autonomous driving system, or Autopilot 2.0, will be “at least two or three times better” than a human driver, Musk concluded. While it wasn’t a Tesla vehicle involved in the accident, the fatality could affect Tesla and other automobile companies.

[Chart: levels of autonomous driving. Source: Governors Highway Safety Association]

Musk’s comments significantly helped the shares

Musk made these comments on Sunday, March 11. Tesla’s stock increased by almost $20 or over 5% going from the closing price of $327.17 on Friday, March 9, to close on Monday, March 12, at $345.51. This compares to the NASDAQ that rose 0.4% last Monday.

Tesla has lost all of last Monday’s gain and then some. Over the past week its shares have fallen 9.2% to $313.56, while the Nasdaq has dropped 3.2%. However, it was only down a bit more than the Nasdaq today, 2.4% vs. 1.8%, the first day of trading after the accident.

Tesla’s stock supported by the promised Model 3 production ramp and autonomous driving

There are a handful of major reasons for the very high valuation of Tesla’s shares. Probably the more important one is the Model 3 production ramp that the company is trying to achieve. It should provide preliminary results for its March quarter in about two weeks. If it hits the 2,500 per week production goal by the end of this month, or comes very close, and keeps the 5,000 per week by the end of June guidance in place, the stock should at least hold its ground, if not increase.

One of the next most important reasons Tesla’s shares have done so well is being a perceived leader with fully autonomous vehicles. Pretty much every auto manufacturer and supplier is working on this critical feature. The leaders should be rewarded and the laggards punished by lower sales and stock valuations.

The NTSB, or National Transportation Safety Board, has launched an investigation into yesterday’s fatality. Depending on how long it takes and if it comes out with any recommendations or regulations, this could delay the testing of autonomous vehicles (Uber has paused its self-driving operations in Phoenix, Pittsburgh, San Francisco and Toronto) and approval of them. If the delay is years and not months, or if the additional costs to achieve Level 5 become around $5,000 or more, this could put a damper on Model 3 demand.

This brings us to demand for the Model 3

I believe that eventually self-driving cars will gain traction and become a force in the automobile market, and they should be safer than ones driven by individuals. Depending on how visible this fatality becomes and what delays it causes for all auto manufacturers, this could give some Model 3 reservation holders pause.

Reservation holders may decide that it is better to wait for “Version 2” of the Model 3. It is fairly typical for the initial production year or model of a car to have more issues than later ones. It is also typical of the first version of a tech product to have more “bugs”. If this feeling starts to take hold with reservation holders, it would be bad for Tesla’s stock.

Tesla is charging Model 3 buyers $5,000 for the “Enhanced Autopilot” feature when they buy one, even though the software is not available. And if the buyer wants to add it later the cost is $6,000.

While it wasn’t a Tesla vehicle involved with the fatality, it wouldn’t be surprising to see some Model 3 buyers decide to wait until the NTSB provides some information about the accident. It would not be a comfortable position to have bought a Model 3, found out that not all the hardware is installed that will be required to get to Level 5 and see the value of their car decrease, potentially significantly....

How An Uber Self-Driving Car Killing A Pedestrian Could Impact Tesla's Stock
Tesla Model X driver dies in Mountain View crash

engadget.com · 2018

It's not certain whether or not Autopilot was involved, or the degree to which the battery was involved in destroying the vehicle. Fires are common in crashes regardless of the power plant. We've asked Tesla for comment, but the California Highway Patrol believed the battery might have played a role.

Regardless of the causes, the crash highlights the uncertainties surrounding the safety of electric cars, especially as they introduce autonomous features. First responders still aren't certain how to deal with EV crashes, and there's now the potential for autonomous driving systems to play as much of a role as humans.

A Good Samaritan at the scene of the Tesla Model X car crash described the car to be "actively emitting full flames from the battery bank." https://t.co/n78v5ekcgV pic.twitter.com/EVGqKJnhcR — NBC Bay Area (@nbcbayarea) March 24, 2018...

Tesla Model X driver dies in Mountain View crash
Tesla defends Autopilot record amid investigation into crash that killed driver

independent.co.uk · 2018

Tesla has defended its Autopilot driver-assist technology after a probe was launched into a fatal crash involving a Tesla Model X SUV.

The accident took place on the afternoon of Friday 23 March on a motorway in Mountain View, California. The 38-year-old driver died at a nearby hospital shortly after the crash.

Tesla vehicles have a system called Autopilot which handles some driving tasks, but it is not clear if Tesla’s automated control system was driving the car.

According to the National Transportation Safety Board (NTSB) and police, the accident involved two other cars.

The NTSB announced it was investigating the accident on Tuesday. It said it would be looking into how the Model X caught fire in the wake of the collision and would take steps to make the vehicle safe before removing it from the scene of the incident.

Tesla, which specialises in electric vehicles, energy storage and solar panel manufacturing and which was founded by Elon Musk, has sought to defend the technology in spite of the NTSB saying it was unclear whether the Model X Autopilot system was engaged before the accident.

In a blog post, the company said: “Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of.”

Tesla also argued that part of the reason the fatal crash was so severe was that a crash-attenuating barrier on the highway had either been removed or damaged.

Tesla defends Autopilot record amid investigation into crash that killed driver
Tesla's Self-Driving Autopilot Was Turned On In Deadly California Crash

wired.com · 2018

Tesla now has another fatality to hang on its semi-autonomous driving system. The company just revealed that its Autopilot feature was turned on when a Model X SUV slammed into a concrete highway lane divider and burst into flames on the morning of Friday, March 23. The driver, Wei Huang, died shortly afterwards at the hospital.

This is the second confirmed fatal crash on US roads in which Tesla’s Autopilot system was controlling the car. It raises now familiar questions about this novel and imperfect system, which could make driving easier and safer, but relies on constant human supervision.

In a blog post published this evening, Tesla says the logs in the car’s computer show Autopilot was on, with the adaptive cruise control distance set to the minimum. The car stays in its lane and a fixed distance from the vehicle ahead, but the driver is supposed to keep his hands on the wheel and monitor the road, too. Take your hands off the wheel for too long, and you get a visual warning on the dashboard. Ignore that, and the system will get your attention with a beep. If you’re stubborn or incapacitated, the car will turn on its flashers and slow to a stop.
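
The escalation sequence described above (visual warning, then a beep, then flashers and a controlled stop) can be thought of as a simple threshold ladder. The sketch below is purely illustrative; the timing values are invented and are not Tesla's.

```python
# Illustrative sketch of the hands-off escalation described above:
# visual warning -> audible beep -> hazard flashers and a controlled stop.
# The thresholds are invented for illustration; they are not Tesla's values.

ESCALATION_STEPS = [
    (10, "show a visual warning on the dashboard"),
    (20, "sound an audible alert"),
    (30, "turn on the hazard flashers and slow to a stop"),
]

def escalation_action(seconds_hands_off: float) -> str:
    """Return the strongest action warranted after this many hands-off seconds."""
    action = "no action"
    for threshold, step in ESCALATION_STEPS:
        if seconds_hands_off >= threshold:
            action = step
    return action

print(escalation_action(25))  # -> "sound an audible alert"
```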

Based on data pulled from the wrecked car, Tesla says Huang should have had about five seconds and 150 meters of unobstructed view of the concrete barrier before the crash. Huang’s hands were not detected on the wheel for six seconds prior to the impact. Earlier in the drive, he had been given multiple visual warnings and one audible warning to put his hands back on the wheel.
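
A quick back-of-the-envelope calculation shows what speed those two figures imply:

```python
# What the "five seconds and 150 meters" figures imply about travel speed.
distance_m = 150.0   # unobstructed view of the barrier, per Tesla's post
time_s = 5.0         # time available, per Tesla's post

speed_ms = distance_m / time_s
print(f"{speed_ms:.0f} m/s = {speed_ms * 3.6:.0f} km/h = {speed_ms * 2.237:.0f} mph")
# -> 30 m/s = 108 km/h = 67 mph, i.e. ordinary freeway speed
```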

The car’s manual reminds Tesla drivers that Autopilot is a driver assistance tool, not a replacement, and that they retain responsibility for driving safely. (The big center screen conveys the same message when you engage Autopilot for the first time.) But critics say the ease with which Tesla’s system handles regular freeway driving can lull a driver into thinking it’s more capable than it is, and allow them to become distracted or take their eyes off the road.

Drivers need to be ready to grab the wheel if the lane markings disappear, or lanes split, which may have been a contributing factor in this crash. Systems like Autopilot have known weaknesses. The manual also warns that it may not see stationary objects, a shortcoming highlighted when a Tesla slammed into a stopped firetruck near Los Angeles in January. The systems are designed to discard radar data about things that aren’t moving, to prevent false alarms for every overhead gantry or street-side trash can.
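
The stationary-object filtering described here is a generic radar-processing trade-off. A minimal sketch of the idea follows; the data fields, threshold, and logic are hypothetical and are not Tesla's or any vendor's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class RadarDetection:
    range_m: float        # distance to the target, meters
    range_rate_ms: float  # closing speed measured by the radar (negative = approaching)

def keep_detection(det: RadarDetection, ego_speed_ms: float,
                   stationary_margin_ms: float = 2.0) -> bool:
    """Keep a radar return only if the target appears to be moving over the ground.

    For a target straight ahead, its ground speed is roughly the ego speed plus
    the measured range rate; if that is near zero, the object is treated as
    stationary clutter (overhead sign, roadside trash can) and discarded to
    avoid false braking events. Hypothetical logic, for illustration only.
    """
    target_ground_speed = ego_speed_ms + det.range_rate_ms
    return abs(target_ground_speed) > stationary_margin_ms

# At 30 m/s, an overhead gantry closes at -30 m/s -> ground speed ~0 -> dropped.
print(keep_detection(RadarDetection(range_m=120.0, range_rate_ms=-30.0), ego_speed_ms=30.0))  # False
# A lead car doing 20 m/s closes at -10 m/s -> ground speed ~20 m/s -> kept.
print(keep_detection(RadarDetection(range_m=60.0, range_rate_ms=-10.0), ego_speed_ms=30.0))   # True
```

The trade-off the article points to falls out directly: the same rule that rejects an overhead gantry also rejects a genuinely stationary obstacle in the lane, such as a stopped fire truck or a concrete barrier.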

Autopilot was first enabled on Tesla’s cars, via an over-the-air software update, in October 2015. The system combines radar-controlled cruise control with automatic steering to stay within painted lane lines. The first person known to die using Autopilot was Joshua Brown, whose Model S crashed into a truck that turned across his path in Florida, in May 2016. Neither he nor the car’s computers saw the white truck against the bright sky.

LEARN MORE The WIRED Guide to Self-Driving Cars

Federal investigators pored over the crash site and the vehicle logs, as they are doing with this second fatality. The National Highway Traffic Safety Administration concluded that the system was operating as intended, wasn’t defective, and that Tesla didn’t need to recall any cars. The crash, in other words, was Brown’s fault. It went further, and said that crashes dropped 40 percent in Tesla cars equipped with the autosteer feature.

The National Transportation Safety Board was more damning, saying Tesla should bear some of the blame for selling a system that is too easy to misuse.

After Brown’s death, Tesla modified Autopilot to rely more on data from its radar, and less on the camera, to spot obstacles in the car’s path. It also sent out a software update that sharply curtailed the length of time a driver can let go of the wheel, and introduced brighter, flashing warnings. That length of time varies according to speed and road conditions, but can still be a few minutes.

Autopilot was groundbreaking when Tesla introduced it, and Elon Musk promises his cars are capable of even more, from changing lanes all by themselves, to full self-driving. Other luxury car makers have introduced similar systems with varying restrictions—and far less grand promises. Cadillac’s Super Cruise uses an infrared camera to monitor the driver’s head position (so it knows when he’s looking at the road), instead of relying on torque sensors in the steering wheel.

The federal investigations into Huang’s crash are ongoing, and may not produce reports for several months (the NTSB typically takes 12 to 18 months to finalize and publish its findings). In the meantime, Tesla used its blog post to point out some extreme circumstances in this accident. The barrier that Huang hit was supposed to have a crash attenuator, which crumples to absorb some of the impact. But it had been crushed in a previous accident, and not replaced, the company says. “We have never seen this level...

Tesla's Self-Driving Autopilot Was Turned On In Deadly California Crash
Tesla says crashed vehicle had been on autopilot prior to accident

cnbc.com · 2018

Tesla said on Friday that a Tesla Model X involved in a fatal crash in California last week had activated its Autopilot system, raising new questions about the semi-autonomous system that handles some driving tasks.

Tesla also said vehicle logs from the accident showed no action had been taken by the driver soon before the crash and that he had received earlier warnings to put his hands on the wheel.

"The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken," Tesla said.

The statement did not say why the Autopilot system apparently did not detect the concrete divider.

The fatal crash and vehicle fire of the Tesla near Mountain View, California, involved two other cars and delayed traffic for hours. The 38-year-old Tesla driver died at a nearby hospital shortly after the crash.

The National Highway Traffic Safety Administration, which launched an investigation into the crash earlier this week, did not immediately comment late Friday. The National Transportation Safety Board is also investigating the fatal crash.

Autopilot allows drivers to take their hands off the wheel for extended periods under certain conditions. Tesla requires users to agree to keep their hands on the wheel "at all times" before they can use autopilot, but users routinely tout the fact they can use the system to drive hands-free.

The NTSB faulted Tesla in a prior fatal autopilot crash.

In September, NTSB Chairman Robert Sumwalt said operational limitations in the Tesla Model S played a major role in a May 2016 crash that killed a driver using autopilot.

That death — the first fatality in a Tesla vehicle operating in Autopilot mode — raised questions about the safety of systems that can perform driving tasks for long stretches with little or no human intervention, but which cannot completely replace human drivers.

The NTSB said Tesla could have taken further steps to prevent the system's misuse, and faulted the driver for not paying attention and for "overreliance on vehicle automation."

In January, NHTSA and NTSB launched investigations into a Tesla vehicle, apparently traveling in semi-autonomous mode, that struck a fire truck in California. Neither agency nor Tesla has offered any update.

The government probes raise the risk for Tesla and automakers at a time when the industry is seeking federal legislation that would ease deployment of self driving cars.

The crash comes soon after an Uber vehicle in Arizona in self-driving mode struck and killed a pedestrian in the first death linked to an autonomous vehicle.

Tesla said late Friday that "Autopilot does not prevent all accidents — such a standard would be impossible — but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists."

Tesla said that in the United States "there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware."

Tesla in September 2016 unveiled improvements to Autopilot, adding new limits on hands-off driving.

On Thursday, Tesla said it was recalling 123,000 Model S sedans built before April 2016 in order to replace bolts in the power steering component that can begin to corrode after contact in cold temperatures with road salt. No accidents or injuries were reported....

Tesla says crashed vehicle had been on autopilot prior to accident
Tesla Driver Allegedly Reported Autopilot Issues to Dealer Several Times Prior to Fatal Crash

thedrive.com · 2018

Tesla confirmed that its self-driving feature, Autopilot, was engaged in a fatal accident that occurred in California. Walter Huang, the driver of the Model X, struck a compressed traffic attenuator and was killed despite having Autopilot engaged. Prior to the data being revealed, the man's family told local outlet Mercury News that he had reportedly brought his Model X to Tesla several times with complaints about the Autopilot system.

Tesla's Autopilot is far from a perfect driver. Drivers have complained about the company's latest hardware revision, called AP2, for many months, often saying they felt the system was not as accurate as the company's first iteration, AP1. The updated technology has in the past struggled to avoid drifting into other lanes; however, Tesla's over-the-air firmware updates have seemingly improved it since the introduction of AP2.

Huang's family reportedly told local news that Huang made several complaints to his local Tesla dealer regarding the vehicle veering off the road with Autopilot engaged. What seemingly makes matters worse is that the dealer was allegedly told that it wasn't just any stretch of road Huang experienced the problem with, but the same stretch of road where the accident occurred.

A Tesla spokesperson told local news that they could find no record suggesting that Huang ever reported the Autopilot performance complaints to Tesla.

The manufacturer's investigation over the crash revealed that Huang had his hands off of the steering wheel when the accident occurred, and despite receiving warnings for over six seconds prior to the crash, no action was said to be taken by the driver. Tesla goes on to defend Autopilot by stating that it is 3.7 times less likely to be involved in a fatal accident than a human driver.

Judging by the new information found by Tesla and the opposing details from Huang's family over past Autopilot complaints, it's quite likely that we will continue to read about this accident for some time. Autonomy is under a microscope after a self-driving Uber car struck and killed a pedestrian several weeks ago. Lawmakers will likely use the two cases to help decide the future legislation governing autonomous driving....

Tesla Driver Allegedly Reported Autopilot Issues to Dealer Several Times Prior to Fatal Crash
Tesla says crashed vehicle had been on autopilot prior to accident

reuters.com · 2018

LOS GATOS, California (Reuters) - Tesla Inc (TSLA.O) said on Friday that a Tesla Model X involved in a fatal crash in California last week had activated its Autopilot system, raising new questions about the semi-autonomous system that handles some driving tasks.

FILE PHOTO: Rescue workers attend the scene where a Tesla electric SUV crashed into a barrier on U.S. Highway 101 in Mountain View, California, March 25, 2018. KTVU FOX 2/via REUTERS

Tesla also said vehicle logs from the accident showed no action had been taken by the driver soon before the crash and that he had received earlier warnings to put his hands on the wheel.

“The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken,” Tesla said.

The statement did not say why the Autopilot system apparently did not detect the concrete divider.

The fatal crash and vehicle fire of the Tesla near Mountain View, California, involved two other cars and delayed traffic for hours. The 38-year-old Tesla driver died at a nearby hospital shortly after the crash.

The National Highway Traffic Safety Administration, which launched an investigation into the crash earlier this week, did not immediately comment late Friday. The National Transportation Safety Board is also investigating the fatal crash.

Autopilot allows drivers to take their hands off the wheel for extended periods under certain conditions. Tesla requires users to agree to keep their hands on the wheel “at all times” before they can use autopilot, but users routinely tout the fact they can use the system to drive hands-free.

The NTSB faulted Tesla in a prior fatal autopilot crash.

In September, NTSB Chairman Robert Sumwalt said operational limitations in the Tesla Model S played a major role in a May 2016 crash that killed a driver using autopilot.

That death — the first fatality in a Tesla vehicle operating in Autopilot mode — raised questions about the safety of systems that can perform driving tasks for long stretches with little or no human intervention, but which cannot completely replace human drivers.

The NTSB said Tesla could have taken further steps to prevent the system’s misuse, and faulted the driver for not paying attention and for “overreliance on vehicle automation.”

In January, NHTSA and NTSB launched investigations into a Tesla vehicle, apparently traveling in semi-autonomous mode, that struck a fire truck in California. Neither agency nor Tesla has offered any update.

The government probes raise the risk for Tesla and automakers at a time when the industry is seeking federal legislation that would ease deployment of self driving cars.

The crash comes soon after an Uber vehicle in Arizona in self-driving mode struck and killed a pedestrian in the first death linked to an autonomous vehicle.

Tesla said late Friday that “Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.”

Tesla said that in the United States “there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware.”

Tesla in September 2016 unveiled improvements to Autopilot, adding new limits on hands-off driving.

On Thursday, Tesla said it was recalling 123,000 Model S sedans built before April 2016 in order to replace bolts in the power steering component that can begin to corrode after contact in cold temperatures with road salt. No accidents or injuries were reported....

Tesla says crashed vehicle had been on autopilot prior to accident
Tesla in fatal California crash was on Autopilot

bbc.com · 2018

[Image caption: The driver of the Tesla Model X died shortly after the crash. Image copyright: Reuters]

Electric carmaker Tesla says a vehicle involved in a fatal crash in California was in Autopilot mode, raising further questions about the safety of self-driving technology.

One of the company's Model X cars crashed into a roadside barrier and caught fire on 23 March.

Tesla says Autopilot was engaged at the time of the accident involving the driver, 38, who died soon afterwards.

But they did not say whether the system had detected the concrete barrier.

"The driver had received several visual and one audible hands-on warning earlier in the drive," a statement on the company's website said.

"The driver's hands were not detected on the wheel for six seconds prior to the collision."

"The driver had about five seconds and 150m (490ft) of unobstructed view of the concrete divider... but the vehicle logs show that no action was taken," the statement added.

Tesla's Autopilot system does some of the things a fully autonomous machine can do. It can brake, accelerate and steer by itself under certain conditions, but it is classified as a driver-assistance system: it is not intended to operate independently, and the driver is meant to keep their hands on the wheel at all times.

In 2016, a Tesla driver was killed in Florida when his car failed to spot a lorry crossing its path.

It led the company to introduce new safety measures, including turning off Autopilot and bringing the car to a halt if the driver lets go of the wheel for too long.

Federal investigators said last year that Tesla "lacked understanding" of the semi-autonomous Autopilot's limitations.

[Video caption: Uber dashcam footage shows moment before fatal impact]

The accident in California comes at a difficult time for self-driving technology.

Earlier this month, Uber was forbidden from resuming self-driving tests in the US state of Arizona.

It followed a fatal crash in the state in which an autonomous vehicle hit a woman who was walking her bike across the road.

It was thought to be the first time an autonomous car had been involved in a fatal collision with a pedestrian.

The company suspended all self-driving tests in North America after the accident....

Tesla in fatal California crash was on Autopilot
Fatal Tesla Crash Raises New Questions About Autopilot System

nytimes.com · 2018

The company said the driver, Wei Huang, 38, a software engineer for Apple, had received several visual and audible warnings to put his hands back on the steering wheel but had failed to do so, even though his Model X S.U.V. had the modified version of the software. His hands were not detected on the wheel for six seconds before his Model X slammed into a concrete divider near the junction of Highway 101 and 85 in Mountain View, and neither Mr. Huang nor the Autopilot activated the brakes before the crash.

The accident renews questions about Autopilot, a signature feature of Tesla vehicles, and whether the company has gone far enough to ensure that it keeps drivers and passengers safe.

“At the very least, I think there will have to be fundamental changes to Autopilot,” said Mike Ramsey, a Gartner analyst who focuses on self-driving technology. “The system as it is now tricks you into thinking it has more capability than it does. It’s not an autonomous system. It’s not a hands-free system. But that’s how people are using it, and it works fine, until it suddenly doesn’t.”

On Saturday, Tesla declined to comment on the California crash or to make Mr. Musk or another executive available for an interview. In its blog post on Friday about the crash, the company acknowledged that Autopilot “does not prevent all accidents,” but said the system “makes them much less likely to occur” and “unequivocally makes the world safer.”

For the company, the significance of the crash goes beyond Autopilot. Tesla is already reeling from a barrage of negative news. The value of its stock and bonds has plunged amid increasing concerns about how much cash it is using up and the repeated delays in the production of the Model 3, a battery-powered compact car that Mr. Musk is counting on to generate much-needed revenue....

Fatal Tesla Crash Raises New Questions About Autopilot System
Tesla car that crashed and killed driver was running on Autopilot, firm says

theguardian.com · 2018

Tesla has said a car that crashed in California last week, killing its driver, was operating on Autopilot.

The 23 March crash on highway 101 in Mountain View is the latest accident to involve self-driving technology. Earlier this month, a self-driving Volvo SUV that was being tested by the ride-hailing service Uber struck and killed a pedestrian in Arizona.

Federal investigators are looking into the California crash, as well as a crash in January of a Tesla Model S that may have been operating under the Autopilot system.

In a blogpost, Tesla said the driver of the sport-utility Model X that crashed in Mountain View, 38-year-old Apple software engineer Wei Huang, “had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.

“The driver had about five seconds and 150 meters of unobstructed view of the concrete divider … but the vehicle logs show that no action was taken.”

Tesla also said the concrete highway divider had previously been damaged, increasing its impact on the car. The vehicle also caught fire, though Tesla said no one was in the vehicle when that happened.

The company said its Autopilot feature can keep speed, change lanes and self-park but requires drivers to keep their eyes on the road and hands on the wheel, in order to be able to take control and avoid accidents.

Autopilot does not prevent all accidents, Tesla said, but it does make them less likely.

“No one knows about the accidents that didn’t happen,” Tesla said, “only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe.

“There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year.”
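
Tesla does not show its arithmetic, but the "about 900,000 lives saved" figure is roughly what you get by scaling worldwide road deaths by the ratio of the two per-mile fatality rates the company cites in the Reuters and CNBC reports above (one fatality per 86 million miles for all vehicles versus one per 320 million miles for Autopilot-equipped Teslas). A quick reconstruction, offered only as a plausibility check:

```python
# Reconstructing the "about 900,000 lives saved" estimate from the per-mile
# fatality rates Tesla cites elsewhere in this incident's reports. This is a
# plausibility check, not Tesla's published method.

worldwide_deaths = 1_250_000              # "about 1.25 million automotive deaths worldwide"
miles_per_fatality_all = 86e6             # all vehicles, per Tesla
miles_per_fatality_autopilot_hw = 320e6   # Autopilot-hardware vehicles, per Tesla

rate_ratio = miles_per_fatality_all / miles_per_fatality_autopilot_hw   # ~0.27
deaths_at_tesla_rate = worldwide_deaths * rate_ratio                    # ~336,000
lives_saved = worldwide_deaths - deaths_at_tesla_rate                   # ~914,000

print(f"~{lives_saved:,.0f} lives saved per year")  # close to the claimed "about 900,000"
```

The same ratio, 320/86 ≈ 3.7, matches the "3.7 times less likely" figure quoted in The Drive report above.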

The company added that it “care[s] deeply for and feel[s] indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety.

“None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.”...

Tesla car that crashed and killed driver was running on Autopilot, firm says
Tesla Model X in Autopilot Killed a Driver. Officials Aren’t Pleased With How Tesla Handled It.

futurism.com · 2018

Tesla is taking PR very seriously after one of its vehicles in autonomous mode recently killed its driver.

The crash occurred at 9:27 AM on Highway 101 near Mountain View, California. Walter Huang was in the driver’s seat of the Model X, which was in autonomous mode. The car hit a concrete highway divider, marked with black and yellow chevrons, at full force. Huang didn’t take any action. The SUV crumpled like a tin can, and Huang didn’t make it.

The investigation into the fatal #Tesla crash continued today with the #CHP & #NTSB digging through the scorched EV https://t.co/rfdgY88bn7 pic.twitter.com/vd2YzFmAZ0 — Dean C. Smith (@DeanCSmith) March 29, 2018

Other information has been hard to come by, due to the severity of the damage. So far we don’t know if his death was a result of negligence, a fatal nap, or simply being distracted by the fireworks of warning lights and sounds. But one thing is clear: the crash proves that audio and visual cues on the dashboard can, after all, be insufficient to prevent a crash.

Huang wasn’t the first to die in a Tesla with Autopilot active. In 2016, Joshua Brown crashed his Model S into a truck, marking the first fatal collision while Autopilot was engaged.

The timing for this particular crash isn’t exactly ideal (from Tesla’s perspective). Uber is already doing damage control after its self-driving car killed a pedestrian in Arizona on March 19, four days before Huang’s fatal collision.

Interestingly, officials aren’t too pleased about Tesla’s PR offensive. On Sunday, a spokesperson for the U.S. National Transportation Safety Board (NTSB) told the Washington Post:

At this time the NTSB needs the assistance of Tesla to decode the data the vehicle recorded. In each of our investigations involving a Tesla vehicle, Tesla has been extremely cooperative on assisting with the vehicle data. However, the NTSB is unhappy with the release of investigative information by Tesla.

Presumably, investigators aren’t happy because they’d like to get as much information as they can, then release a report.

But Tesla might have jumped the gun. Failing to comply with the NTSB’s investigation processes and deadlines could bring its technological advancements (and security improvements) to a screeching halt.

After the Uber car’s crash, the company was banned from further testing in Arizona (though other companies were allowed to continue). Many people feared that the crash would fray the public’s trust in autonomous vehicles, and that largely has not come to pass, at least not yet.

But if the crashes continue, that could change. The market for autonomous cars could dry up before the technology becomes reliable enough to make them widespread.

Tesla’s Autopilot is Level 2 autonomy, while Uber’s self-driving car is a Level 4. So the technology isn’t even really the same. Still, a turn in the tide of public opinion could sweep both up with it.

Autonomous vehicles aren’t the best at sharing the unpredictable road with imprecise humans. Yes, once fully autonomous vehicles roll out all over the country and make up 100 percent of the vehicles on the road, American roads will inevitably become safer.

But we’re not there yet. If crashes like these keep happening, and the public loses trust, we might never be.

Update: Tesla CEO Elon Musk took to Twitter to respond to comments from NTSB and reiterate Tesla’s priorities:...

Tesla Model X in Autopilot Killed a Driver. Officials Aren’t Pleased With How Tesla Handled It.
Tesla, Uber Deaths Raise Questions About the Perils of Partly Autonomous Driving

wsj.com · 2018

Earlier this week, the Tempe, Ariz., police released a video of the fatal accident involving a pedestrian and an Uber self-driving vehicle. We asked experts to analyze the footage and explain what factors may have caused systems to fail. Photo: National Transportation Safety Board

Two recent fatal crashes of cars with varying levels of autonomous-driving technology are focusing attention on vehicles that vest control in both humans and machines.

U.S. investigators are still completing their probes of an Uber Technologies Inc. self-driving vehicle with a safety operator behind the wheel that hit and killed a pedestrian March 18 in Tempe, Ariz., and of a Tesla Inc. Model X sport-utility with its semiautonomous system engaged that collided with a highway barrier on March 23 near Mountain View, Calif....

Tesla, Uber Deaths Raise Questions About the Perils of Partly Autonomous Driving
Tesla: Autopilot was on during crash

aljazeera.com · 2018

Electric carmaker Tesla has confirmed its "Autopilot" feature was engaged during a fatal crash last week, a development set to exacerbate concerns over the safety of futuristic vehicles.

Autopilot is still far from a completely autonomous driving system, which would not require any involvement by a human.

Autopilot is considered part of the second of five levels of autonomous driving, with the fifth being fully autonomous - something once featured in futuristic cartoons but which has moved closer to reality.

Car crash

A Tesla Model X - the latest model - collided with a highway barrier near the town of Mountain View in California on March 23, catching fire before two other cars struck it.

The driver was identified by The Mercury News as a 38-year-old man, Wei Huang, an engineer for Apple. He later died in hospital.

Tesla issued a blog post late Friday saying the driver had activated the Autopilot but ignored several warnings.

"In the moments before the collision ... Autopilot was engaged with the adaptive cruise control follow-distance set to minimum," Tesla said.

"The driver had received several visual and one audible hands-on warning earlier in the drive and the driver's hands were not detected on the wheel for six seconds prior to the collision.

"The driver had about five seconds and 150 meters (164 yards) of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken."

Tesla added the reason the car sustained such great damage was because a highway barrier "had been crushed in a prior accident without being replaced".

"We have never seen this level of damage to a Model X in any other crash," it said.

The company, founded 15 years ago by Elon Musk, sought to downplay fears over its technology.

"Over a year ago, our first iteration of Autopilot was found by the US government to reduce crash rates by as much as 40 percent," it said.

Pedestrian killed

In January last year, the US Transportation Department closed an investigation into the fatal 2016 crash in Florida of a Tesla Model S on Autopilot, finding that no "safety-related defect" had caused that accident, the first of its kind.

The latest fatal Tesla crash came the same week a collision involving an autonomous Uber vehicle in Arizona killed a pedestrian and caused that company to temporarily halt its self-driving car programme.

Circumstances of the two crashes are different: Tesla's Autopilot is a driver assistance feature, while the Uber vehicle was designed to operate autonomously but with a driver behind the wheel to correct mistakes.

Dashcam footage released by police showed that the operator appeared to be distracted seconds before the car hit the woman.

The nonprofit group Consumer Watchdog has argued that autonomous vehicles are not ready for roads and the public should not be put at risk to test such technology.

After the Uber accident, Democratic Senator Richard Blumenthal said, "Autonomous vehicle technology has a long way to go before it is truly safe for the passengers, pedestrians and drivers."

Competition

Both Uber and Tesla are rivals in the multibillion-dollar drive to develop vehicles which, in the future, will not need any driver intervention.

Among other contenders, General Motors has asked to test a car with no steering wheel on roads beginning next year. Google-owned Waymo is also intensifying its self-driving efforts.

While the final, fifth stage of autonomous driving is still distant, microprocessor manufacturer NVIDIA unveiled an artificial intelligence platform several months ago intended to enable that goal.

The system can perform 320 trillion operations a second, completely independently of a vehicle's passengers.

California-based NVIDIA provided some technology in the Uber car which crashed in Arizona, prompting the chip firm to suspend its road tests pending more information about the incident....

Tesla: Autopilot was on during crash
Apple engineer killed in Tesla car operating in driverless mode

dezeen.com · 2018

Electric car company Tesla has confirmed that a recent fatal crash involving one of its vehicles occurred while the car was in autopilot mode.

Tesla released a blog post last week to provide more details about the accident on 23 March 2018, which took place in Mountain View, California, and killed the car's driver – Apple software engineer Wei Huang, 38.

The statement said that the Model X sports utility vehicle's autopilot function was engaged, and its adaptive cruise control follow-distance was set to minimum.

Driver failed to take control

It also suggested that the driver was given ample warning to override the function before the collision, but failed to take action.

"The driver had received several visual and one audible hands-on warning earlier in the drive and the driver's hands were not detected on the wheel for six seconds prior to the collision," said the statement.

"The driver had about five seconds and 150 metres of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken."

Tesla also placed some of the blame on the road infrastructure, which had not been repaired since an earlier incident.

"The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced," it said. "We have never seen this level of damage to a Model X in any other crash."

Autopilot still safer says Tesla

The company maintained that its autopilot function makes accidents less likely, not more, but said it requires drivers to remain vigilant and keep their hands on the wheel to help avoid potential collisions.

The company's founder, billionaire entrepreneur Elon Musk, called the Model X the "safest SUV ever" when it launched in 2015. A US government inquiry, initiated after the first death in a crash that occurred while the autopilot function was in use in 2016, found that the function reduces accidents by 40 per cent.

"The consequences of the public not using autopilot, because of an inaccurate belief that it is less safe, would be extremely severe," Tesla's statement said. "There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year."

"We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars," the statement added, ending with condolences to the victim's family and friends.

The news comes at a troubling time for Tesla. Musk has reportedly taken over the management of its Model 3 electric car production after failing to meet first-quarter targets.

It has also been a bad period for autonomous cars. Last month, an Uber taxi killed a woman in the first fatal accident between a pedestrian and a self-driving car, while a crash involving a Tesla Model S that may have been in autopilot mode in January 2018 is currently under investigation....

Apple engineer killed in Tesla car operating in driverless mode
Family of Tesla driver killed in Model X crash on Autopilot is preparing to sue Tesla

electrek.co · 2018

The safety of Tesla Autopilot came back into focus after it was confirmed that the driver assist system was on during the fatal accident that killed a Model X owner in Mountain View last month.

Now the family of the deceased say that they are prepared to sue Tesla after a media interview.

As we previously reported, the Model X was driving on Autopilot when it entered the median of a ramp on the highway as if it was a lane and hit a barrier about a hundred and fifty meters after going into the median.

The impact was quite severe because there was no functioning crash attenuator; it had already been destroyed in a previous crash. The driver was rushed to the hospital but, unfortunately, died of his injuries.

Sevonne Huang, the wife of the driver, Walter Huang, gave an interview to ABC7 yesterday.

During the interview, she said that her husband had previously complained about Autopilot’s behaviour at that exact location:

Sevonne Huang: “And he want to show me, but a lot of time it doesn’t happen.”
Dan Noyes: “He told you that the car would drive to that same barrier?”
Sevonne: “Yes.”
Noyes: “The same barrier that he finally hit?”
Sevonne: “Yeah, that’s why I saw the news. I knew that’s him.”

The family hired attorney Mark Fong and say that they are prepared to sue Tesla.

Fong commented:

“Unfortunately, it appears that Tesla has tried to blame the victim here. It took him out of the lane that he was driving in, then it failed to brake, then it drove him into this fixed concrete barrier. We believe this would’ve never happened had this Autopilot never been turned on.”

Tesla responded to the interview in a statement:

“We are very sorry for the family’s loss. According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so. The fundamental premise of both moral and legal liability is a broken promise, and there was none here. Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged. If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang’s drive that day. We empathize with Mr. Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive.”

Electrek’s Take

I can’t blame the family for having this reaction or any kind of reaction after such a tragic loss, but when it comes to the lawsuit, it looks like they are destroying their own case.

As Tesla said in the statement and as it was confirmed by other Tesla owners recreating the circumstances of the crash, an attentive driver would have plenty of time to go back to the correct lane after the car enters the median, which means that Huang was most likely not paying attention.

On top of that, his wife says that he was aware that Autopilot had difficulties handling this specific situation, and yet he decided to activate it anyway, apparently without paying close attention.

In my opinion, as long as Tesla is being clear about drivers needing to stay attentive and keep their hands on the steering wheel, there’s not much of a case here that Tesla is responsible for the accident.

But with this said, as Tesla Autopilot improves it seems that some drivers are growing more confident with the driver assist system and are putting too much trust in it.

I think it’s important to remind everyone that as long as Tesla doesn’t claim Autopilot is anything more than a level 2 driving system, and it’s not thoroughly tested to be anything more than that, drivers should always stay vigilant and be ready to take control.

Family of Tesla driver killed in Model X crash on Autopilot is preparing to sue Tesla
Tesla Criticized for Blaming Autopilot Death on Driver

industryweek.com · 2018

Consumer-safety advocates and autonomous-vehicle experts criticized Tesla Inc. for issuing another statement about the death of a customer that pinned the blame on driver inattentiveness.

Days after publishing a second blog post about the crash involving Walter Huang, a 38-year-old who died last month in his Model X, Tesla issued a statement in response to his family speaking with San Francisco television station ABC7. The company said the “only” explanation for the crash was “if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.”

“I find it shocking,” Cathy Chase, president of the group Advocates for Highway and Auto Safety, said by phone. “They’re claiming that the only way for this accident to have occurred is for Mr. Huang to be not paying attention. Where do I start? That’s not the only way.”

Groups including Advocates for Highway and Auto Safety and Consumer Reports have criticized Tesla for years for naming its driver-assistance system Autopilot, with the latter calling on the company to choose a different moniker back in July 2016. The two organizations share the view of the National Transportation Safety Board, which has urged carmakers to do more to ensure drivers using partially autonomous systems like Autopilot remain engaged with the task of driving. The U.S. agency is in the midst of two active investigations into Autopilot-related crashes.

It’s Tesla’s responsibility to provide adequate safeguards against driver misuse of Autopilot, including by sending visual and audible warnings when the system needs a human to take back over, Chase said. “If they’re not effective in getting someone to re-engage -- as they say that their drivers have to -- then they’re not doing their job.”

High Stakes

The stakes for Tesla’s bid to defend Autopilot are significant. The NTSB’s investigation of the March 23 crash involving Huang contributed to a major selloff in the company’s shares late last month. Chief Executive Officer Elon Musk claimed almost 18 months ago that the system will eventually render Tesla vehicles capable of full self-driving, and much of the value of the $51 billion company is linked to views that it could be an autonomous-car pioneer.

Tesla has declined to say how long drivers can now use Autopilot between visual or audible warnings to have a hand on the wheel. It’s also refused to comment on how many alerts can be ignored before the system disengages, what version of Autopilot software was in Huang’s Model X, or when the car was built.

“Just because a driver does something stupid doesn’t mean they -- or others who are truly blameless -- should be condemned to an otherwise preventable death,” said Bryant Walker Smith, a professor at the University of South Carolina’s School of Law, who studies driverless-car regulations. “One might consider whether there are better ways to prevent drivers from hurting themselves or, worse, others.”

Under Investigations

The NTSB is looking into the crash that killed Huang, as well as a collision in January involving a Tesla Model S that rear-ended a fire truck parked on a freeway near Los Angeles with Autopilot engaged. The agency said after Tesla’s second blog post about the Huang incident that it was unhappy with the company for disclosing details during its investigation.

In its latest statement, Tesla said it is “extremely clear” that Autopilot requires drivers to be alert and have hands on the steering wheel. The system reminds the driver this every time it’s engaged, according to the company.

“Tesla’s response is reflective of its ongoing strategy of doubling down on the explicit warnings it has given to drivers on how to use, and not use, the system,” said Mike Ramsey, an analyst at Gartner Inc. “It’s not the first time Tesla has taken this stance.”

Huang’s wife told ABC7 he had complained before the fatal crash that his Model X had steered toward the same highway barrier he collided with on March 23. The family has hired Minami Tamaki LLP, which said in a statement Wednesday that it believes Tesla’s Autopilot is defective and likely caused Huang’s death. The San Francisco-based law firm declined to comment on Tesla’s statement.

Crash-Rate Claim

The National Highway Traffic Safety Administration, which has the power to order recalls and fine auto manufacturers, found no defect after investigating the May 2016 crash involving a Tesla Model S driven on Autopilot by Josh Brown, a former Navy SEAL. The agency closed its probe in January 2017.

According to data Tesla gave NHTSA investigators prior to the agency’s decision against any recall, Autopilot’s steering system may reduce the rate of crashes per million miles driven by about 40 percent, a figure the company cited in its latest statement.

“We empathize with Mr. Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm...

Tesla Criticized for Blaming Autopilot Death on Driver
After Several Deaths, Tesla Is Still Sending Mixed Messages About Autopilot

digg.com · 2018

After making repeated statements on an ongoing government investigation into a fatal crash involving Autopilot, Tesla has been kicked out of the probe being conducted by the National Transportation Safety Board for violating its legal agreement with the NTSB. Adding to the drama is the fact that Tesla may have lied before the NTSB publicly announced the decision, claiming that it voluntarily left the investigation because the agency wouldn't allow Tesla to "release information about Autopilot to the public." On Thursday afternoon, Tesla released another statement refuting the NTSB's version of events and again claiming it left the investigation of its own accord.

Tesla's statements artfully package the company's exit from the investigation as a matter of information freedom. But the company's statements on Autopilot incidents, including in its most recent investigations, have consistently placed blame on drivers rather than investigating its own technology.

Last month, a Tesla Model X crashed into a concrete median near Mountain View, California, killing driver Walter Huang, sparking the NTSB investigation. In a statement a week after the crash, Tesla acknowledged that Huang was using Autopilot during the crash, but squarely blamed him for his own death, saying:

The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

Nowhere in the post did Tesla acknowledge why its Autopilot feature had directed the car to crash into a concrete barrier, except that "the adaptive cruise control follow-distance" was "set to minimum."

In another fatal crash in May 2016, a tractor-trailer crossed in front of a Tesla using Autopilot. The Tesla did not brake and smashed into the semi, effectively peeling off the car's roof and killing the driver. Tesla acknowledged Autopilot's role in the crash, saying "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied." Despite this, Tesla ultimately blamed the driver:

When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot 'is an assist feature that requires you to keep your hands on the steering wheel at all times,' and that 'you need to maintain control and responsibility for your vehicle' while using it.

In another Autopilot crash in January, a Tesla slammed into a parked firetruck. Tesla's response was that Autopilot is "intended for use only with a fully attentive driver.”

The two deaths and crash fit into a series of accidents and viral videos that show the imperfections of Tesla's Autopilot technology.

Public scrutiny of these crashes has been comparatively reserved next to the reaction garnered by the one death caused by Uber's self-driving car, and Tesla's responses have been much more defensive.

A key aspect of Tesla's responses to its Autopilot crashes is the fact that Autopilot fits the bill of a Level 2 automation system in the Society of Automotive Engineers' six-level automation framework. A Level 2 system will manage speed and steering in certain conditions but requires that drivers pay attention and be ready to take over. Tesla attempts to enforce this by alerting the driver whenever their hands aren't on the wheel, and eventually disabling Autopilot if a driver's hands are off the wheel long enough (although drivers have figured out ways to hack this). A Level 3 car will only alert the driver when it detects a situation it can't handle.
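To make the framework the article is invoking easier to follow, here is a minimal Python sketch paraphrasing the six SAE driving-automation levels; the one-line summaries are simplifications written for this citation record, not official SAE or Tesla wording.

# Rough paraphrase of the SAE J3016 driving-automation levels referenced above.
# The one-line summaries are illustrative simplifications, not official SAE text.
SAE_LEVELS = {
    0: "No automation: the human driver does everything.",
    1: "Driver assistance: the system helps with steering OR speed; the driver does the rest.",
    2: "Partial automation: the system handles steering AND speed, but the driver must monitor at all times (where Tesla Autopilot sits).",
    3: "Conditional automation: the system monitors the road and asks the driver to take over when it cannot cope.",
    4: "High automation: no driver attention needed within a defined operating domain (the level the article ascribes to Uber's test vehicles).",
    5: "Full automation: the vehicle drives itself everywhere, under all conditions.",
}

for level, summary in sorted(SAE_LEVELS.items()):
    print(f"Level {level}: {summary}")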

Tesla has conveniently used this fact in its responses to high-profile crashes, but it has also repeatedly advertised its cars' self-driving capabilities as higher than Level 2, sending mixed signals to drivers and the public.

On its Autopilot website, Tesla touts "Full Self-Driving Hardware on All Cars," despite its insistence that the software is only meant for assistance.

On the same page, a video shows a Tesla being operated with no hands, detecting its environment. A caption reads "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself."

Only at the bottom of the page does Tesla specify that "Full Self-Driving Capability" is a software add-on that hasn't actually been released yet and doesn't apply to Tesla's existing "Enhanced Autopilot."

Adding to the confusion is the fact that Tesla CEO Elon Musk has repeatedly said that Tesla's cars on the market that are equipped for Autopilot will eventually be able to achieve "approximately human-level autonomy," and could possibly facilitate "full automation." In 2016 he c...

After Several Deaths, Tesla Is Still Sending Mixed Messages About Autopilot
Tesla Model S was on Autopilot, Utah driver tells police

usatoday.com · 2018

The National Transportation Safety Board is investigating a crash and fire involving a Tesla Model S. Two teens died in the Fort Lauderdale, Florida, crash on Tuesday. The probe is not expected to involve Tesla's semi-autonomous Autopilot system. (May 11) AP

A Tesla sedan with a semi-autonomous Autopilot feature rear-ended a fire department truck at 60 mph (97 kph) apparently without braking before impact on May 11, 2018, but police say it's unknown if the Autopilot feature was engaged. (Photo: South Jordan Police Department/AP)

SAN FRANCISCO — A Tesla Model S that crashed into a stopped fire truck at high speed was operating in Autopilot mode, the driver of the car told Utah police officials.

Tesla says it continues to work with police on the investigation, and has not yet released details of the incident based on the car's computer logs.

The driver of the vehicle, a 28-year-old woman from Lehi, Utah, slammed into the truck in South Jordan, Utah on Friday. The woman also told police she was looking at her phone prior to the collision and estimated her speed at 60 mph, which is consistent with eyewitness accounts, according to a police statement issued late Monday.

The result of the violent crash was an accordioned front end for the electric car, but only a broken foot for the driver, according to Sgt. Sam Winkler of the South Jordan Police Department.

The driver of the United Fire Authority mechanic truck was evaluated for whiplash and was not checked into the hospital.

Tesla said the company's previous response to the crash still stood, which noted that Autopilot — a semi-autonomous system that works like a souped-up cruise control — requires constant vigilance and is not meant to take over driving responsibilities while the driver focuses on other chores.

Winkler said that South Jordan police was continuing to investigate the crash, and would be working with Tesla to gather vehicle information from the Model S's computers over the coming days. Police officials also said they were getting technical assistance from National Transportation Safety Board officials.

Eyewitness accounts indicate the Model S did not slow down or swerve as it rammed into the back of the truck, which was stopped at a traffic light in the far right lane.

Autopilot has been in the crosshairs of federal crash investigators, dating back to a 2016 crash of Tesla Model S in Autopilot mode that killed its driver after the car failed to stop for a tractor trailer that cut across its path.

More recently, the NTSB was called in to review details of a March crash in which a Tesla Model X slammed into a highway divider in Mountain View, Calif. The driver died.

Tesla has said the California driver ignored the car's warnings to take back control of the vehicle. But the driver's family is considering suing on the grounds that Tesla ignored the driver's previously raised concerns about Autopilot acting up on that same stretch of Silicon Valley highway.

NTSB and National Highway Traffic Safety Administration officials also are investigating a recent Tesla Model S crash in Florida in which two teens died and one was injured.

The car hit a concrete barrier at high speed in a residential neighborhood and burst into flames. Autopilot is not thought to be a factor, but investigators are looking into the ensuing battery fire.

Just prior to Utah police announcing that the driver indicated Autopilot had been in use, Tesla CEO Elon Musk posted a series of tweets that emphasized the safety of his product.

What’s actually amazing about this accident is that a Model S hit a fire truck at 60mph and the driver only broke an ankle. An impact at that speed usually results in severe injury or death. — Elon Musk (@elonmusk) May 14, 2018

"What’s actually amazing about this accident is that a Model S hit a fire truck at 60mph and the driver only broke an ankle," Musk tweeted (although initially reported as an ankle injury, South Jordan officials said the injury was a broken foot). "An impact at that speed usually results in severe injury or death."

Musk also lamented media coverage that he said glossed over the 40,000 annual U.S. road deaths, and acknowledged that while no technology is perfect "a system that, on balance, saves lives & reduces injuries should be released."

Tesla Model S was on Autopilot, Utah driver tells police
A Closer Inspection of Tesla’s Autopilot Safety Statistics

medium.com · 2018

The automotive industry is at the beginning of a grand experiment. If completely successful, humanity could be ushered into a new economy where driving is a hobby, only for sunny days along clear roads with a view. The struggles and tedium of the daily commute could be handled by autonomous vehicles, traffic accidents could fall to nil, passengers could focus on working and relaxing in their mobile offices, and the elderly, disabled, and blind could have considerable mobility and autonomy. If a complete failure, automobile companies would have invested billions of dollars in computer vision, sensors, and automated driving systems only to have no effect on or actually increase the number of traffic accidents and fatalities by introducing new risks. This would cause a public backlash and require regulators to impose a slow, costly review process that slows the pace of innovation, so that after an initial roll-out to a few hundred thousand vehicles, further roll-outs are halted. Then, autonomous vehicle technology may follow the same path as the U.S. nuclear power industry, which has stopped building new power plants since the Three Mile Island accident in 1979. Which scenario unfolds, or whether something in between does, depends on good design, as well as careful understanding and communication of the safety of autonomous driving technology and the path from partially autonomous to fully autonomous vehicles. And understanding the safety of autonomous vehicles (AV) is a very thorny statistics problem.

Recent fatal and injury-producing crashes involving vehicles with Tesla’s Autopilot and Uber’s self-driving pilot have led to significant disagreement among experts, reporters, automakers, and regulators about safety statistics for partial autonomy technologies[1]. Tesla, in particular, has made recent headlines after two crashes and one fatality with its Autopilot-equipped partial autonomy vehicles in the past few months. Tesla claims that its technology is 3.7x safer than the existing U.S. vehicle fleet, stating a fatality rate of 1 death per 86 million miles for conventional vehicles versus 1 death per 320 million miles for Autopilot-equipped vehicles, but many experts question the methodology and data behind these statistics. In this article, I’ll review the data, methods, and the three main criticisms of Tesla’s methodology for conventional vehicle fatality rates, provide my best estimates, and make recommendations for regulators and automakers on the safety of autonomous vehicles. I don’t have access to data to verify the fatality rate for Tesla Autopilot-equipped vehicles, but the company has promised to release public Autopilot safety data in future quarters.
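To make the arithmetic behind those headline numbers explicit, the short Python sketch below simply reproduces the rates quoted above. The 86-million and 320-million figures are Tesla's reported claims, used here for illustration only; the calculation shows where the 3.7x multiple comes from, and says nothing about whether the underlying methodology is sound.

# Reproduces the rate comparison quoted above; the inputs are Tesla's claims
# as reported in this article, used purely for illustration.
conventional_miles_per_fatality = 86e6   # ~1 death per 86 million miles (US fleet, per Tesla)
autopilot_miles_per_fatality = 320e6     # ~1 death per 320 million miles (Autopilot-equipped fleet, per Tesla)

# Express both as fatalities per 100 million vehicle miles traveled.
conventional_rate = 1e8 / conventional_miles_per_fatality
autopilot_rate = 1e8 / autopilot_miles_per_fatality
safety_multiple = autopilot_miles_per_fatality / conventional_miles_per_fatality

print(f"Conventional fleet: {conventional_rate:.2f} deaths per 100M miles")  # ~1.16
print(f"Autopilot-equipped: {autopilot_rate:.2f} deaths per 100M miles")     # ~0.31
print(f"Implied safety multiple: {safety_multiple:.1f}x")                    # ~3.7x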

1. What’s an Autopilot mile?

One informal complaint I’ve heard among analysts is the question of which miles should be included as an ‘Autopilot mile’ in Tesla’s statistic of 1 fatality per 320 million miles. Some analysts argue that one should only compare miles driven in a vehicle with Autopilot engaged to manually-driven vehicle-miles to obtain a fatality rate. Instead, Tesla’s methodology includes all miles driven with an Autopilot-enabled vehicle, whether or not the functionality was engaged.

I agree with Tesla’s methodology on Autopilot mileage because the road conditions under which a partial autonomy system is rated for operation (highways, clear lane markings, etc) are systematically different from manually-driven miles. If one only used Autopilot-enabled miles in the fatality rate calculation, a comparable baseline of miles for a manual vehicle driven under similar road conditions would be difficult to obtain and there are already considerable gaps in the vehicle mileage data needed to compute good partial autonomy safety statistics (more below).

Because the characteristics of manually-driven miles in Autopilot-enabled vehicles are very different than the miles driven in a manually-driven vehicle — more curves, poor lane markings, rain or poor-visibility weather, etc. — it could be possible that crashes are more likely to occur when an Autopilot-enabled vehicle turned over operation to the driver, because road conditions were worse. If that hypothesis were true, these types of crashes should be included as an Autopilot crash, as it pertains to the road coverage of Autopilot and the hand-off between autonomous and manual control, which is related to Tesla’s design choices.

So, unless the owner of an Autopilot-enabled vehicle never or rarely chose to enable the functionality, the proper comparison for fatality rate safety statistics should be made between Autopilot vehicles and all other vehicles.

2. What are comparable vehicles and fatalities?

Another criticism of Tesla’s Autopilot safety statistics is aimed at its choice of comparable baseline vehicles in the 1 fatality per 86 million miles statistic. Analysts believe this statistic was obtained from the Insurance Insti...

A Closer Inspection of Tesla’s Autopilot Safety Statistics
Elon Musk says Tesla crashes shouldn’t be front page news. Here’s why they should.

recode.net · 2018

Proponents of self-driving-car technology often tout one statistic: More than 37,000 people died in automotive-related accidents every year for at least the past two years.

Self-driving cars would help reduce those accidents, these people say.

The logic is simple. The largest share of auto-related deaths is due to drunk driving, speeding and not wearing a seat belt. This can ostensibly be solved by taking the human out of the front seat of the car.

But a high level of human-driving-related deaths doesn’t mean that the current versions of semi-autonomous technology are safer.

In fact, some industry experts say it’s actually less safe to introduce technology that only takes over some of the driving task, since it relies on two imperfect systems: Technology that is inherently limited in its capabilities, and humans.

Still, some continue to highlight the safety of semi-autonomous tech on the road today by citing that statistic. Earlier this week, Tesla CEO Elon Musk chided the Washington Post for writing about a Tesla crash in which the driver involved said Autopilot — the company’s driver-assist technology — was engaged.

It’s super messed up that a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in US auto accidents alone in past year get almost no coverage https://t.co/6gD8MzD6VU — Elon Musk (@elonmusk) May 14, 2018

“It’s super messed up that a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in U.S. auto accidents alone in past year get almost no coverage,” Musk tweeted.

The National Highway Traffic Safety Administration is now investigating that crash, as Reuters first reported. This will be the third Autopilot-related crash NHTSA is investigating this year alone.

Before dissecting the numbers, it’s important to address Musk’s larger point, which is that the media unfairly covers Tesla crashes more than human-driven crashes. By Musk’s own admission, Tesla’s driver-assist technology is still unproven and is being tested — on real humans — so it’s important to track its progress. One way to do that is to tally accidents and fatalities.

That’s especially vital since Musk has said on multiple occasions that Teslas are almost “4x better” than average cars, with only one fatality per 320 million miles in cars equipped with Autopilot. Some question these company-provided statistics. It’s also unclear if all those miles were driven in Autopilot mode or just account for those driven in cars that come equipped with the driver-assist technology.

(We may know more next quarter, which is when Musk said he will start to publish safety reports.)

By comparison, those approximately 40,000 vehicle deaths in a year happened across the 3.2 trillion vehicle miles that people travelled on public roads in 2016, the most recent year for which a full set of data is available. That’s about one death per 80 million miles driven.
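As a quick back-of-envelope check of the figures in the preceding paragraph, the sketch below uses the article's rounded numbers; it is illustrative only and, as the next paragraph notes, not an apples-to-apples comparison.

# Back-of-envelope check of the national rate quoted above, using the article's
# rounded figures; illustrative only, not an apples-to-apples comparison.
us_deaths = 40_000                     # approximate annual US traffic deaths
us_vehicle_miles = 3.2e12              # vehicle miles traveled on public roads in 2016
tesla_claim_miles_per_death = 320e6    # Tesla's claimed rate for Autopilot-equipped cars

national_miles_per_death = us_vehicle_miles / us_deaths
print(f"US fleet overall: one death per {national_miles_per_death / 1e6:.0f} million miles")  # ~80
print(f"Tesla's claim:    one death per {tesla_claim_miles_per_death / 1e6:.0f} million miles")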

But we can’t compare the two apples to apples. It stands to reason that manually driven vehicles, which operate on all types of roads — not just on highways, like Autopilot — and which have been driven millions of miles more than Teslas, have a higher likelihood of getting into accidents.

On top of that, the fatality rate that NHTSA puts out every year includes driver deaths, pedestrian deaths, motorcycle deaths and bicycle deaths. Tesla’s just includes known driver and pedestrian fatalities.

As of November 2016, Tesla’s fleet of vehicles on the road had driven 1.3 billion miles using Autopilot. While we don’t have updated numbers yet, Musk said on the company’s most recent earnings call that that number is steadily increasing and makes up one-third of all highway driving in Tesla vehicles.

The company also claims that NHTSA said that Autopilot resulted in 40 percent fewer crashes than Tesla cars that didn’t have the technology.

But even these numbers can be misleading. The agency itself said that it did not test the safety of Autopilot during a 2016 investigation. It just compared the crash rates of cars that had Autopilot installed to those that didn’t; it did not assess whether Autopilot was engaged during those miles driven.

Right now there is no definitive means of quantifying how safe autonomous technology truly is, or how much safer than a human driver a robot driver needs to be for it to be ready to hit public roads. One study conducted at the University of Michigan says that in order to be 80 percent confident that self-driving tech is 90 percent safer than human-driven cars, test vehicles need to be driven 11 billion miles.

No autonomous vehicle company has yet driven that many miles in the real world or in simulation....

Elon Musk says Tesla crashes shouldn’t be front page news. Here’s why they should.
FACT CHECK: Tesla safety claims aren't quite right

phys.org · 2018

In this Sept. 29, 2015, file photo, Elon Musk, CEO of Tesla Motors Inc., introduces the Model X car at the company's headquarters in Fremont, Calif. For years, Tesla has boasted that its cars and SUVs are safer than other vehicles on the roads, and Musk doubled down on the claims in a series of tweets this week. (AP Photo/Marcio Jose Sanchez, File)

For years, Tesla has boasted that its cars and SUVs are safer than other vehicles on the roads, and CEO Elon Musk doubled down on the claims in a series of tweets this week.

The electric vehicles are under intense scrutiny from federal investigators, who have been looking into post-crash battery fires and the performance of Tesla's Autopilot semi-autonomous driving system. On Wednesday, they traveled to Utah to open another inquiry into a Tesla crash—their fourth this year—in which a Model S slammed into a firetruck that was stopped at a red light.

A look at the tweets and Tesla's past claims about the safety of its vehicles and Autopilot:

MUSK (from his tweets Monday): "According to (National Highway Traffic Safety Administration), there was an automotive fatality every 86M miles in 2017 (~40,000 deaths). Tesla was every 320M miles. It's not possible to be zero, but probability of fatality is much lower in a Tesla."

THE FACTS: This is based on a Tesla analysis of U.S. fatal crashes per miles traveled in 2017. The company's math is correct on the fatality rate involving all of the nation's 272 million vehicles, about 150,000 of which are Teslas, according to sales estimates from Ward's Automotive. But Tesla won't say how many fatalities occurred in its vehicles or how many miles they were driven.

We don't know of any Tesla fatalities in 2017, but the numbers can vary widely from year to year. There have been at least three already this year and a check of 2016 NHTSA fatal crash data—the most recent year available—shows five deaths in Tesla vehicles.

In this Dec. 2, 2015, file photo, Tesla Motors Inc. CEO Elon Musk delivers a speech at the Paris Pantheon Sorbonne University as part of the United Nations Climate Change Conference in Paris. For years, Tesla has boasted that its cars and SUVs are safer than other vehicles on the roads, and CEO Elon Musk doubled down on the claims in a series of tweets this week. (AP Photo/Francois Mori, File)

Statistically, experts say Musk's tweet analysis isn't valid. While Teslas could have a lower death rate, it may speak more about the demographics of Tesla drivers than it does about safety of the vehicles, says Ken Kolosh, manager of statistics for the National Safety Council.

Expensive Teslas tend to be driven by middle-age affluent people who are less likely to get in a crash than younger people, Kolosh said. Also, Tesla drivers tend to live in urban areas and travel on roads with lower speeds, where fatality rates are lower, he said.

Musk also is comparing a fleet of older, less-expensive vehicles to his newer and more costly models, Kolosh said. Most Teslas on the road are six years old or less. The average vehicle in the U.S. is 11.6 years old, according to IHS Markit. Older, less-expensive vehicles often aren't maintained like newer ones and would have more mechanical problems.

___

MUSK (from his tweets Monday in reference to the Utah crash): "What's actually amazing about this accident is that a Model S hit a fire truck at 60 mph and the driver only broke an ankle. An impact at that speed usually results in severe injury or death."

THE FACTS: It's true that the driver in the Utah crash sustained minor injuries considering how fast her car was traveling. The same is true for a January freeway crash near Los Angeles in which the driver was not hurt. But not all Tesla crashes end the same way.

In this April 15, 2018, photo, unsold 2018 models sits amid a series of charging stations on a Tesla dealer's lot in the south Denver suburb of Littleton, Colo. For years, Tesla has boasted that its cars and SUVs are safer than other vehicles on the roads, and CEO Elon Musk doubled down on the claims in a series of tweets this week. (AP Photo/David Zalubowski, File)

In March, the driver of a Tesla Model X was killed in California when his SUV hit a barrier while traveling at "freeway speed." NHTSA and the National Transportation Safety Board are investigating that case, in which the Autopilot system was engaged. Autopilot was also engaged in the Utah crash, according to a summary of data from the car.

Last week, the NTSB opened a probe into an accident in which a Model S caught fire after crashing into a wall at a high speed in Florida. Two 18-year-olds were trapped in the vehicle and died in the flames. The agency has said it does not expect Autopilot to be a focus of that investigation.

___

TESLA (from a March 30 press release): "Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40 percent."

THE FACTS: The government says it did not assess how effective Autop...

FACT CHECK: Tesla safety claims aren't quite right
HWY18FH011 Preliminary

ntsb.gov · 2018

Executive Summary

On Friday, March 23, 2018, about 9:27 a.m., Pacific daylight time, a 2017 Tesla Model X P100D electric-powered passenger vehicle, occupied by a 38-year-old driver, was traveling south on US Highway 101 (US-101) in Mountain View, Santa Clara County, California. As the vehicle approached the US-101/State Highway (SH-85) interchange, it was traveling in the second lane from the left, which was a high-occupancy-vehicle (HOV) lane for continued travel on US-101. According to performance data downloaded from the vehicle, the driver was using the advanced driver assistance features traffic-aware cruise control and autosteer lane-keeping assistance, which Tesla refers to as “autopilot.” As the Tesla approached the paved gore area dividing the main travel lanes of US-101 from the SH-85 exit ramp, it moved to the left and entered the gore area.[1] The Tesla continued traveling through the gore area and struck a previously damaged crash attenuator at a speed of about 71 mph.[2] The crash attenuator was located at the end of a concrete median barrier. The speed limit on this area of roadway is 65 mph. Preliminary recorded data indicate that the traffic-aware cruise control speed was set to 75 mph at the time of the crash.[3] The impact rotated the Tesla counterclockwise and caused a separation of the front portion of the vehicle. The Tesla was involved in subsequent collisions with two other vehicles, a 2010 Mazda 3 and a 2017 Audi A4 (see figure 1).

Figure 1. Southbound view of US-101 depicting Tesla, Audi, and Mazda vehicles at final rest. (Source: S. Engleman)

A preliminary review of the recorded performance data showed the following:

The Autopilot system was engaged on four separate occasions during the 32-minute trip, including a continuous operation for the last 18 minutes 55 seconds prior to the crash.

During the 18-minute 55-second segment, the vehicle provided two visual alerts and one auditory alert for the driver to place his hands on the steering wheel. These alerts were made more than 15 minutes prior to the crash.

During the 60 seconds prior to the crash, the driver’s hands were detected on the steering wheel on three separate occasions, for a total of 34 seconds; for the last 6 seconds prior to the crash, the vehicle did not detect the driver’s hands on the steering wheel.

At 8 seconds prior to the crash, the Tesla was following a lead vehicle and was traveling about 65 mph.

At 7 seconds prior to the crash, the Tesla began a left steering movement while following a lead vehicle.

At 4 seconds prior to the crash, the Tesla was no longer following a lead vehicle.

At 3 seconds prior to the crash and up to the time of impact with the crash attenuator, the Tesla’s speed increased from 62 to 70.8 mph, with no precrash braking or evasive steering movement detected.

During the collision sequence, the Tesla’s 400-volt lithium-ion high-voltage battery was breached, and a postcrash fire ensued (see figure 2). The driver was found belted in his seat. Bystanders removed him from the vehicle before it was engulfed in fire. The driver was transported to a local hospital, where he died from his injuries. The driver of the Mazda sustained minor injuries, and the driver of the Audi was uninjured.​

Figure 2. Northbound view of US-101 depicting Tesla postcrash fire (left) and remains of Tesla after initial fire was extinguished (right). (Source: S. Engleman)

The Mountain View Fire Department applied approximately 200 gallons of water and foam during a period of fewer than 10 minutes to extinguish fires involving the vehicle interior and the exposed portion of the high-voltage battery. Technical experts from Tesla responded to the scene to assist in assessing high-voltage hazards and fire safety. After being allowed to cool, the vehicle was transported with a fire engine escort to an impound lot in San Mateo. The highway was reopened at 3:09 p.m. Around 4:30 p.m. that afternoon, at the impound lot, the Tesla battery emanated smoke and audible venting. The battery was monitored with a thermal imaging camera, but no active fire operations were conducted. On March 28, 5 days after the crash, the battery reignited. The San Mateo Fire Department responded and extinguished the fire. The crash attenuator was an SCI smart cushion attenuator system, which was previously damaged on March 12, 2018, in a single-vehicle crash involving a 2010 Toyota Prius (see figure 3).

Figure 3. Undamaged attenuator (left) next to crash-damaged attenuator (right).

The NTSB continues to work with the California Highway Patrol and the California Department of Transportation to collect and analyze data, including all pertinent information relating to the vehicle operations and roadway configuration. All aspects of the crash remain under investigation as the NTSB determines the probable cause, with the intent of issuing safety recommendations to prevent similar crashes.

Probable Cause

The information in...

HWY18FH011 Preliminary
NTSB Releases Its Investigation Results of the March Tesla Crash

digitaltrends.com · 2018

On Friday, March 23, a Tesla Model X crashed into a concrete divider on U.S. Highway 101 in Mountain View, California. The driver died as a result of the crash — which occurred despite numerous warnings, the car company said.

Now, the National Transportation Safety Board has released its own preliminary report about the crash, noting that the driver did not have his hands on the steering wheel for a full six seconds before the vehicle crashed into the barricade. The NTSB also concluded that the Tesla Model X had its Autopilot function on and engaged during the crash.

The federal agency’s report does not differ from Tesla’s own investigation into the crash, though the NTSB has yet to point to a concrete decision about what ultimately caused the fatal accident. In the report, investigators note that all “aspects of the crash remain under investigation as the NTSB determines the probable cause, with the intent of issuing safety recommendations to prevent similar crashes.”

Some new details that have emerged from the NTSB include that the Autopilot feature was set to maintain a driving speed of 75 mph. We also now know that about eight seconds before the crash, the car was behind another vehicle moving at 65 mph, which probably caused the Model X to slow down a bit. Four seconds before the collision, however, the Tesla stopped following the aforementioned vehicle, causing it to accelerate just before it hit the barricade.

Prior to the crash, the Model X did send two visual and an auditory cue to the driver to take the wheel. “These alerts were made more than 15 minutes prior to the crash,” the NTSB noted.

In Tesla’s initial report following the accident, the company included photos of a crash attenuator at the accident site before retrieving the car’s computer logs. One photo shows the safety device appearing in proper condition on an unstated date. The second image, taken on March 22 by a dash cam in a car driven by a witness to the accident, shows the same barrier crushed from an earlier crash.

According to Tesla, “the reason this crash was so severe is that the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had either been removed or crushed in a prior accident without being replaced.”

In a follow-up report after Tesla retrieved the car’s logs, the company stated, “In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Reacting to criticism that Tesla lacks empathy for crash tragedy when it quotes the relative statistical safety of driving in Telsa vehicles, the company stated, “Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.”

Tesla’s Autopilot warnings, and the company’s admonitions about not using the system without keeping hands on the wheel and eyes on the road, do not mean the system is safe from misuse. Following the first Tesla fatality in 2016, the National Transportation Safety Board reported the driver was at fault for not paying attention and for “overreliance on vehicle automation.” The NTSB also reported at the time that Tesla “could have taken further steps to prevent the system’s misuse,” Reuters reported.

The National Highway Traffic Safety Administration (NHTSA) is also investigating the accident.

Human error is widely considered at least partially responsible for more than 90 percent of fatal accidents each year, as documented by a study published by Stanford Law School. Tesla states there is one automobile fatality for all vehicles for every 86 million driving miles, but one fatality for every 320 million miles with Teslas equipped with Autopilot hardware. According to Tesla’s figures, Tesla drivers with Autopilot hardware are 3.7 times less likely to be in a fatal accident.

Updated on June 8 to include findings from the NTSB’s investigation....

NTSB Releases Its Investigation Results of the March Tesla Crash
Police: Tesla That Hit Mass. State Trooper's Vehicle Was In Autopilot

www.nbcboston.com · 2020

Maria Smith sat on her front porch in Holbrook and recalled her terrifying experience last December while driving home from college.

A Massachusetts State Police trooper had just stopped the 21-year-old at about 10 p.m. on Route 24 in West Bridgewater. As the trooper approached her driver's side window, Smith reached for her vehicle registration. Then, she was suddenly jolted by a loud collision.

"It just happened so quick," Smith said. "Before I knew it, my car was flying forward. I looked behind me, and my whole back windshield was blown out. There was glass in my hair."

According to court documents, a Weston man driving a Tesla slammed into a Massachusetts State Police cruiser that was stopped in the left lane of the road, propelling the SUV forward into Smith's vehicle before spinning out.

The driver, Nicholas Ciarlone, is now facing a negligent driving charge. Once courthouses reopen, he is scheduled to be arraigned in September.

Ciarlone declined to answer questions about the incident when the NBC10 Boston Investigators approached him outside the Brockton District Court following a clerk magistrate hearing.

However, court documents point to Tesla's emerging driver assistance technology as an element in the crash.

A trooper who responded to the scene wrote that Ciarlone said his Tesla was set to Autopilot mode and he "must not have been paying attention."

"I thought that was terrifying," Smith said. "To think the sensors are not equipped enough to pick up a police car with its sirens and lights on the highway."

When Autopilot is engaged, the car helps with steering and matches your speed to surrounding traffic. All Tesla models come equipped with the feature, which doesn't make the vehicle fully autonomous, but assists with what Tesla describes as the "most burdensome parts of driving."

But there are a growing number of examples in which Autopilot didn't prevent crashes, including:

A deadly collision with a tractor trailer in Florida in 2016;

A fatal crash into a concrete barrier on a California highway in 2018;

A Utah driver who slammed her Tesla into a stopped fire truck at a red light;

A collision involving a stopped Connecticut State Police vehicle last year.

Nationally-recognized auto safety watchdog Sean Kane said he believes Tesla is testing out Autopilot in real time on public roadways.

"We are all involved in a clinical trial that we didn't sign up for," Kane said.

NBC10 Boston first took a closer look at Autopilot last November, when a Newburyport driver said he accidentally fell asleep for 14 miles behind the wheel as his vehicle navigated the highway.

That driver shared his story to highlight the different ways Tesla owners were taking detours around the car's warning system, which reminds drivers to keep their hands on the wheel. Online videos showed drivers rigging the steering wheel with everything from weights to water bottles to a piece of fruit to trick the software.

Following that report, Massachusetts Sen. Ed Markey had a strong reaction, sending a letter to Tesla CEO Elon Musk.

Markey later called on Tesla to rebrand Autopilot, arguing the name caused drivers to rely too heavily on the technology.

The Massachusetts lawmaker also raised the issue at a hearing in Washington, D.C., peppering the head of the National Highway Traffic Safety Administration with questions about the federal agency's oversight.

An NHTSA spokeswoman told NBC10 the agency is aware of the Massachusetts crash and is gathering details from Tesla and law enforcement.

"All forms of distracted driving — including by drivers who abuse their vehicles' advanced safety features — are dangerous and put the driver and other road users at risk," an NHTSA statement reads.

Tesla did not respond to questions from NBC10 Boston about the crash. But the electric carmaker did recently roll out improvements designed to help its vehicles identify stop signs and traffic lights while in Autopilot.

Tesla has also maintained drivers need to pay attention while the vehicle is in Autopilot and be ready to take over at a moment's notice.

Kane believes that message contradicts human nature.

"You can't call something 'Autopilot' and then have the driver fully engaged. That doesn't make any sense at all," he said. "Unfortunately, the regulators are allowing it to happen right before their eyes."

Smith and the state trooper both went to the hospital following the crash in December, but were not seriously hurt.

Smith, who saw the trooper get knocked to the ground next to a highway barrier, told NBC10 Boston it all came down to a matter of inches.

"If my car had pushed forward any more, he probably would've ended up getting crushed by it," she said.

...

Police: Tesla That Hit Mass. State Trooper's Vehicle Was In Autopilot
Tesla on ‘Autopilot’ hits police vehicle which hits ambulance, driver possibly drunk: police

www.mercurynews.com · 2020

Police are probing possible drunk-driving in the case of a Tesla driver in Arizona who said he was using the Bay Area electric car maker’s controversial “Autopilot” system when his sedan smashed into an unoccupied police vehicle, which then hit an ambulance.

The crash occurred Tuesday on an Arizona highway, according to the state’s Department of Public Safety. “We can confirm the driver indicated to troopers the Tesla was on autopilot at the time of the collision,” the department tweeted, adding that the 23-year-old male driver was being investigated for driving under the influence.

The police sergeant who had driven the department’s SUV was not in it at the time of the crash, and the ambulance occupants were not hurt, the department said. The Tesla driver was hospitalized with serious but not life-threatening injuries, police said.

Tesla did not immediately respond to a request for comment. After a fatal 2018 accident involving a Tesla on Autopilot in Mountain View, the Palo Alto company said that “Autopilot can be safely used on divided and undivided roads as long as the driver remains attentive and ready to take control,” the National Transportation Safety Board noted in a report. In a 2018 blog post, Tesla claimed Autopilot makes crashes “much less likely to occur,” arguing that “No one knows about the accidents that didn’t happen, only the ones that did.”

Crashes involving Tesla’s Autopilot driver-assistance system have sparked multiple investigations by the federal safety board. The agency found a Tesla driver’s over-reliance on the automated system was a factor in a 2016 fatal Model S crash in Florida, and determined that in 2018 in Mountain View, Autopilot steered a Tesla Model X SUV into a Highway 101 barrier, a collision that caused the driver’s death.

After another fatal Florida crash, between a Model 3 sedan and a truck in March 2019, the agency blamed the driver’s over-reliance on automation and Tesla’s design of the Autopilot system as well as “the company’s failure to limit the use of the system to the conditions for which it was designed,” it said in a report.

The report noted that after the 2016 Florida crash, which involved a collision between a Tesla and a truck, the agency recommended that Tesla and five other car makers using automated systems develop technology to “more effectively sense the driver’s level of engagement and alert the driver when engagement is lacking.” However, while the five other companies responded with descriptions of their planned solutions, “Tesla was the only manufacturer that did not officially respond,” the report said.

A Tesla rear-ended a fire truck parked to respond to an accident on I-405 in Culver City, Calif., on Jan. 22, 2018, according to the Culver City Fire Department and California Highway Patrol. (Culver City Firefighters Local 1927)

The agency also found Autopilot was a factor when a Model S slammed into the back of a fire truck on I-405 in Culver City near Los Angeles in 2018. The driver was also to blame in the non-injury collision, for using Autopilot in “ways inconsistent with guidance and warnings from the manufacturer,” the agency reported.

In December, the National Highway Traffic Safety Administration had announced its 12th investigation into a Tesla crash possibly tied to Autopilot. In that accident, a Model 3 rear-ended a parked police car in Connecticut....

Tesla on ‘Autopilot’ hits police vehicle which hits ambulance, driver possibly drunk: police
Tesla with Autopilot hits cop car—driver admits he was watching a movie

arstechnica.com · 2020

Police in North Carolina have filed charges against a driver whose Tesla crashed into a police car early Wednesday morning, Raleigh's CBS 17 television reports. The driver admitted to officers that he had activated the Autopilot technology on his Model S and was watching a movie on his phone at the time of the crash.

"A Nash County deputy and a trooper with the Highway Patrol were on the side of the road while responding to a previous crash when the Tesla slammed into the deputy’s cruiser," CBS 17 reports. "The impact sent the deputy’s cruiser into the trooper’s vehicle—which pushed the trooper and deputy to the ground."

Thankfully, no one was seriously injured by the crash.

The driver was charged with a violation of the state's "move over" law and with having a television in the car.

It's an important reminder that no car on the market today is fully self-driving. Drivers need to pay attention to the road at all times, regardless of what kind of car they have or what kind of driver-assistance technology their car has.

Tesla could use better driver monitoring technology

In the last year, there have been at least three similar incidents involving Tesla vehicles crashing into police cars: in Arizona in July, and in Connecticut and Massachusetts last December.

To be fair, this isn't just a Tesla problem. Studies have found that driver-assistance systems like Autopilot—from Tesla and other automakers—are not good at stopping for stationary vehicles. A study earlier this month found that driver-assistance systems from BMW, Kia, and Subaru failed to consistently stop for stationary vehicles on a test track.

Still, Tesla clearly has room for improvement. Obviously, it would be good if Autopilot could actually detect stopped vehicles. But Tesla could also use better driver monitoring technology.

Tesla vehicles use a steering wheel torque sensor to try to detect whether a driver is paying attention. This kind of sensor is easy to defeat. It's also possible to keep a hand on the wheel without actually paying attention to the road.

Tesla could learn from Cadillac, whose Super Cruise technology includes an eye-tracking camera that verifies that the driver is looking at the road. An eye-tracking system like this would likely prevent incidents like Wednesday's crash in North Carolina. If the driver had tried to watch a movie while Autopilot was engaged, the system would have detected that he was not watching the road, warned the driver, and eventually deactivated itself....
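To make the contrast concrete, here is a minimal, hypothetical sketch of the warn-then-disengage escalation a gaze-based monitor could apply, as described above. The class, the thresholds, and the eyes_on_road input are assumptions for illustration, not Tesla's or Cadillac's actual implementation.

WARN_AFTER_S = 3.0        # seconds of inattention before an audible warning (assumed value)
DISENGAGE_AFTER_S = 8.0   # seconds of inattention before assistance hands back control (assumed value)

class AttentionMonitor:
    """Hypothetical gaze-based monitor with warn-then-disengage escalation."""

    def __init__(self):
        self.inattentive_since = None

    def update(self, eyes_on_road: bool, now: float) -> str:
        if eyes_on_road:
            self.inattentive_since = None
            return "ok"
        if self.inattentive_since is None:
            self.inattentive_since = now
        elapsed = now - self.inattentive_since
        if elapsed >= DISENGAGE_AFTER_S:
            return "disengage"   # hand control back to the driver
        if elapsed >= WARN_AFTER_S:
            return "warn"        # audible/visual alert
        return "ok"

# Example escalation while the driver keeps looking away (times in seconds):
monitor = AttentionMonitor()
print(monitor.update(eyes_on_road=False, now=0.0))   # ok
print(monitor.update(eyes_on_road=False, now=4.0))   # warn
print(monitor.update(eyes_on_road=False, now=9.0))   # disengage

A torque-only check, by contrast, is satisfied by anything that puts force on the wheel, which is why the weights and water bottles described earlier can defeat it.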

Tesla with Autopilot hits cop car—driver admits he was watching a movie