Incident 232: Tesla Model X on Autopilot Missed Parked Vehicles and Pedestrians, Killing Motorcyclist in Japan

Description: A Tesla Model X operating on Autopilot reportedly failed to recognize the parked motorcycles, pedestrians, and van in its path in Kanagawa, Japan, and ran over a motorcyclist who had stopped after a member of his motorcycle group was involved in an accident.
Alleged: Tesla developed and deployed an AI system, which harmed Yoshihiro Umeda, pedestrians, and Tesla drivers.

Suggested citation format

Lam, Khoa. (2018-04-29) Incident Number 232. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 232
Report Count: 4
Incident Date: 2018-04-29
Editors: Khoa Lam

Incident Reports

Tesla Inc. was sued on Tuesday by the family of a Japanese man who was killed when a driver fell asleep behind the wheel of a Model X and the vehicle 'suddenly accelerated.'

The case concerns the 'first Tesla Autopilot-related death involving a pedestrian,' according to court documents.

Documents filed in San Jose federal court by widow Tomomi Umeda and daughter Miyu Umeda claimed Yoshihiro Umeda, 44, was the victim of a 'patent defect' in Tesla's technology.

In April 2018, Umeda was among a group of motorcyclists who had stopped behind a small van in the far-right lane of the Tomei Expressway in Kanagawa, Japan.

They initially stopped after a member of the group was involved in an accident.

Meanwhile, the driver of a Tesla Model X turned on the vehicle's Autopilot technologies, including its Traffic Aware Cruise Control, Autosteer, and Auto Lane Change features, when he entered the highway.

During this time, the driver began to fall asleep behind the wheel and lost sight of the road ahead.

'At approximately 2:49 p.m., the vehicle that the Tesla had been tracking in front slowed down considerably and indicated by its traffic blinkers that it was preparing to switch to the immediate left-hand lane, in order to avoid the group of parked motorcycles, pedestrians, and van that were ahead of it,' court documents said.

When the lead vehicle 'cut out' of the lane, the Tesla Model X accelerated from approximately nine miles per hour to 23 miles per hour.

That's when the Tesla ran Umeda over.

'The Tesla Model X’s sensors and forward-facing cameras did not recognize the parked motorcycles, pedestrians, and van that were directly in its path, and it continued accelerating forward until striking the motorcycles and Mr. Umeda, thereby crushing and killing Mr. Umeda as the Tesla Model X ran over his body,' documents said.
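The behavior the complaint describes matches a known hazard pattern for adaptive cruise systems, sometimes called a 'cut-out' scenario: the controller regulates speed against a tracked lead vehicle, and when that vehicle leaves the lane it resumes the driver's set speed unless perception reports an obstacle ahead. The Python sketch below is purely illustrative, not Tesla's implementation; the names, thresholds, and control logic are all assumptions made for clarity.

```python
# Hypothetical sketch of a "cut-out" scenario for a traffic-aware cruise
# controller; illustration only, not Tesla's implementation. The controller
# regulates speed against a tracked lead vehicle and, when no object is
# detected ahead, resumes the driver's set speed.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Track:
    distance_m: float  # gap to the detected object ahead, in metres
    speed_mps: float   # that object's speed, in metres per second


def target_speed(set_speed_mps: float, lead: Optional[Track]) -> float:
    """Commanded speed for one control step of a naive cruise controller."""
    if lead is None:
        # No object detected ahead: resume the driver's set speed.
        # This is the dangerous branch if perception misses stationary
        # obstacles, as the complaint alleges happened here.
        return set_speed_mps
    # Otherwise track the slower of the set speed and the lead's speed,
    # backing off further when the gap is short.
    commanded = min(set_speed_mps, lead.speed_mps)
    if lead.distance_m < 20.0:
        commanded = min(commanded, lead.speed_mps * 0.5)
    return commanded


# Before the cut-out: tracking a slowing lead car, the controller crawls.
print(target_speed(27.0, Track(distance_m=15.0, speed_mps=4.0)))  # 2.0
# After the cut-out: the lead changes lanes, perception reports nothing
# ahead, and the controller commands a return to the set speed, matching
# the acceleration described in the narrative (roughly 9 mph to 23 mph).
print(target_speed(27.0, None))  # 27.0
```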

The Tesla also reportedly hit a van, other pedestrians and motorcycles.

Court documents allege that the incident occurred without any input or action by the driver beyond his hands resting on the wheel.

Tesla's Autopilot system was referred to as a 'half-baked, non-market-ready product that requires the constant collection of data in order to improve upon the existing virtual world that Tesla is trying to create.'

It went on to explain that Tesla's issues with the Autopilot feature stem from uncertainty about the vehicle's ability to adapt to roadways in real time.

'The inherent problem and issue with Tesla’s Autopilot technology and suite of driver assistance features is that this technology will inevitably be unable to predict every potential scenario that lie ahead of its vehicles,' court documents said.

'In other words, in situations that occur in the real world but are uncommon and have not been “perceived” by Tesla’s system, or in “fringe cases” involving specific scenarios that the system cannot or has not processed before and pose a great risk to human safety such as in the instant case, actual deaths will occur.'

The plaintiffs claimed that Tesla should have known it was selling dangerous vehicles.

But 'Tesla has refused to recall its cars and continues to fail to take any corrective measures.'

Tomomi and Miyu Umeda said that they expect Tesla to 'lay all of the blame' on the driver.

'If Tesla's past behavior of blaming its vehicles' drivers is any example, Tesla likely will portray this accident as the sole result of a drowsy, inattentive driver in order to distract from the obvious shortcomings of its automated... technology,' court documents said.

'Any such effort would be baseless. Mr. Umeda's tragic death would have been avoided but for the substantial defects in Tesla's Autopilot system and suite of technologies.'

The court documents make clear that the plaintiffs believe Tesla should be held accountable for its conduct and continual marketing of its product.

'Tesla should be held culpable for its conduct and acts committed in marketing its vehicles with reckless disregard for motorists and the general public around the world,' the documents said.

DailyMail.com has reached out to the Umedas' attorney for further comment.

Autopilot, a feature available on new Teslas, allows the vehicle to steer, accelerate, and brake automatically within a lane.

Drivers can completely disengage Autopilot in a Tesla by pushing up a stalk near the steering wheel or by tapping the brakes. They can also take control of the steering wheel to switch away from Autopilot.

According to a Bloomberg survey of Tesla owners, 13 percent said Autopilot had put them in a dangerous situation, while 28 percent said it had saved their lives.

Sixty-one percent of respondents were 'very satisfied' with Autopilot's safety, and 42 percent felt 'somewhat satisfied' with the reliability of Navigate on Autopilot.

One person wrote: 'The car detected a pile-up in fog and applied the brakes/alerted driver and began a lane change to avoid it before I took over. I believe it saved my life.'

However, Tesla has received its share of negative coverage, including videos of its cars narrowly missing highway medians and ignoring driver commands.

In 2018, an Apple Inc. engineer was killed when his Tesla Model X slammed into a concrete barrier in Silicon Valley while on Autopilot.

Walter Huang, a father-of-two, died in hospital following the fiery crash after his Tesla veered off U.S. 101 in Silicon Valley and into a concrete barrier.

Data from Huang's crash showed his SUV did not brake or try to steer around the barrier in the three seconds before the crash. The car also sped up from 62 mph to 71 mph just before crashing.

Documents said that Huang was using his phone at the time of the crash and did not have his hands on the steering wheel.

According to the investigative documents, Huang had earlier complained to his wife that Autopilot had previously veered his SUV toward the same barrier where he would later crash.

Last year, footage surfaced of a Tesla on Autopilot crashing into the back of a truck in California.

The driver wrote that the vehicle was only going about 10 miles per hour and that the 'Autopilot' following distance was set to three car lengths.

The driver added that the collision resulted after the 'cameras, radar, and sensors' the Autopilot relies on 'suddenly ignored the giant semi'.

In Delray Beach, Florida, Jeremy Banner's 2018 Tesla Model 3 slammed into a semi-truck, killing the 50-year-old driver.

NTSB investigators said Banner turned on the Autopilot feature about 10 seconds before the crash, and the Autopilot did not execute any evasive maneuvers to avoid the collision.

The three other fatal crashes date back to 2016.

Tesla, on its website, calls Autopilot 'the future of driving'.

'All new Tesla cars come standard with advanced hardware capable of providing Autopilot features today, and full self-driving capabilities in the future—through software updates designed to improve functionality over time.'

The National Highway Traffic Safety Administration's special crash program began probing the twelfth Tesla crash linked to Autopilot after a Model 3 rear-ended a police car.

Tesla has continued to remind drivers that although Autopilot can handle certain navigation, drivers should always be alert and diligent behind the wheel.

As technologies like artificial intelligence continue to permeate different sectors of society, Tesla CEO Elon Musk has said that companies using the technology should be regulated.

Musk's opinion on the dangers of letting AI proliferate unfettered was prompted by a report published in MIT Technology Review about the changing company culture at OpenAI, an AI research company.

Elon Musk formerly helmed the company but left due to conflicts of interest.

The report claims that OpenAI has shifted from its goal of equitably distributing AI technology to a more secretive, funding-driven company.

'OpenAI should be more open imo,' he tweeted. 'All orgs developing advanced AI should be regulated, including Tesla.'

In the past Musk has likened artificial intelligence to ‘summoning the demon’ and has even warned that the technology could someday be more harmful than nuclear weapons.

Speaking at the Massachusetts Institute of Technology (MIT) AeroAstro Centennial Symposium in 2014, Musk described artificial intelligence as our ‘biggest existential threat’.

‘I think we should be very careful about artificial intelligence. If I had to guess at what our biggest existential threat is, it’s probably that. So we need to be very careful with artificial intelligence.

‘I’m increasingly inclined to think that there should be some regulatory oversight, maybe at the national and international level, just to make sure that we don’t do something very foolish.'

He continued by likening the act of creating an AI to a horror movie.

‘With artificial intelligence we’re summoning the demon. You know those stories where there’s the guy with the pentagram, and the holy water, and … he’s sure he can control the demon? Doesn’t work out.’

In February, Tesla's stock jumped 40 percent in two days, sending the company's market value to more than $140 billion.

The surge included the stock's largest one-day gain since 2013.

After testing on public roads, Tesla is rolling out a new feature of its partially automated driving system designed to spot stop signs and traffic signals.

The feature will slow the car whenever it detects a traffic light, including those that are green or blinking yellow.

It will notify the driver of its intent to slow down and stop, and drivers must push down the gear selector and press the accelerator pedal to confirm that it's safe to proceed.
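As a rough illustration of this confirm-to-proceed flow, here is a hypothetical sketch in Python; it is not Tesla's code, and the function name and return values are assumptions. The conservative default is to warn and slow for every detected light, green and blinking yellow included, unless the driver explicitly confirms it is safe to continue.

```python
# Hypothetical sketch of the confirm-to-proceed flow described above;
# not Tesla's code, and the names here are assumptions. The conservative
# default is to notify and slow for every detected light unless the
# driver explicitly confirms it is safe to continue.

def traffic_light_response(light_detected: bool, driver_confirmed: bool) -> list[str]:
    """Return the ordered control actions for one control-loop step."""
    if not light_detected:
        return ["maintain speed"]
    if driver_confirmed:
        # Confirmation gestures reported for the feature: pressing the
        # accelerator or pushing down the gear selector.
        return ["proceed through intersection"]
    return ["notify driver", "slow toward stop"]

print(traffic_light_response(light_detected=True, driver_confirmed=False))
# -> ['notify driver', 'slow toward stop']
```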

The update of the electric car company's cruise control and auto-steer systems is a step toward Musk's pledge to convert cars to fully self-driving vehicles later this year.

Tesla is sued by family of man who was killed by car using Autopilot

The Daily Mail reports that a Japanese motorcyclist’s family is suing Tesla after he was killed by a Tesla Model X in April 2018.

According to documents filed for the lawsuit, 44-year-old Yoshihiro Umeda was with a group of motorcyclists at the side of the road in Kanagawa, Japan, sorting out the aftermath of a crash. The documents state that he was struck by a Tesla Model X after the car’s driver engaged the autonomous driving system and fell asleep. The Model X had allegedly been following another car, which changed lanes to avoid the earlier crash scene. According to the Daily Mail, the court documents state:

The Tesla Model X’s sensors and forward-facing cameras did not recognize the parked motorcycles, pedestrians, and van that were directly in its path, and it continued accelerating forward until striking the motorcycles and Mr. Umeda, thereby crushing and killing Mr. Umeda as the Tesla Model X ran over his body

Umeda’s family says Tesla released cars with defective self-driving technology onto public roads. It will now be up to the court to decide what actually happened and who was, or was not, ultimately responsible.

It’s not the first fatal crash involving a self-driving Tesla. There have been at least four other incidents in which the car’s autonomous driving system was accused of being partially responsible, although driver error also played a major part, and Tesla has always maintained that the self-driving feature requires drivers to stay vigilant and be able to take over at any time. Umeda’s death was the first fatality involving a motorcyclist; the other victims were all Tesla drivers.

The implications of self-driving car technology for motorcyclists and other vulnerable road users were debated quite fiercely a few years back, but the buzz seems to have died off in the past few months. Has the media grown tired of it, or have road users simply decided the tech is safe? Whatever the case, this legal battle is going to bring the issue back to the forefront.

Tesla sued after motorcyclist's death

Tesla has been sued by the family of a 44-year-old Japanese man who was killed when a Model X using Autopilot crashed into a group of people standing to the side of an expressway near Tokyo, Bloomberg reports.

According to the complaint filed in federal court in San Jose, California, the driver of the Tesla Model X fell asleep shortly before the crash. When a vehicle ahead of the Model X changed lanes to avoid the group of people, the Model X allegedly accelerated and ran into the group, killing Yoshihiro Umeda.

The lawsuit adds that the accident was the result of issues with the Autopilot system, in particular inadequate monitoring of whether the driver is alert, as well as a lack of safeguards against unforeseen traffic situations.

“If Tesla’s past behavior of blaming its vehicles’ drivers is any example, Tesla likely will portray this accident as the sole result of a drowsy, inattentive driver in order to distract from the obvious shortcomings of its automated driver assistance technology,” the widow and daughter of Umeda said in the complaint.

Umeda was with a group of motorcyclists who were standing behind a van at the far right side of the Tomei Expressway following an earlier traffic collision. The complaint says this is the first case of a Tesla Autopilot-related pedestrian fatality.

This is not the first time the inadequacies of Tesla’s Autopilot system have been blamed for a crash. Earlier this year, the National Transportation Safety Board said the fatal crash involving a Model X in March 2018 occurred in part because of Tesla’s “ineffective monitoring of driver engagement.”

Tesla Autopilot Blamed On Fatal Japanese Model X Crash

Napping behind the wheel? Nope. Even with a “self-driving” car, the driver has to be awake and aware. Self-driving cars are just over the horizon, with Tesla leading the way. There have been some serious bumps in the road, most notably a pedestrian fatality that occurred when a Model X driver fell asleep at the wheel.

The first Tesla Autopilot pedestrian fatality

Elon Musk, the mind behind the electric car brand Tesla, is working hard to engineer a self-driving car, one that cruises along on Autopilot while the driver theoretically sits back and enjoys the ride. But the reality of the self-driving car is still confined to test drives and Disneyland, as a lawsuit stemming from a tragic accident in Japan recently highlighted, per Car Complaints.

A Tesla Model X hit a motorcyclist in Kanagawa, Japan, in April 2018. The lawsuit alleges that several of the Autopilot sensors in the car failed and that the driver was unaware of the failures, having apparently dozed off prior to the accident. The victim was standing in a group of parked motorcyclists when the Tesla plowed into the group, fatally injuring him.

How did this happen?

According to the court filings, the Tesla failed to respond when the car in front of it signaled a lane change and slowed down to avoid hitting the pedestrians ahead. The Tesla had been traveling at about 9 mph but accelerated to 24 mph when the car in front moved over, and struck the crowd of motorcyclists.

The driver of the Tesla had his hands on the steering wheel (as registered by Tesla’s Autosteer technology) but did nothing to stop, slow, or steer the car. Notably, the lawsuit does not fault the driver; it instead targets Tesla for building a defective car.

There were multiple alleged failures in the Autopilot systems. The cameras did not sense the vehicle in front slowing down or changing lanes, and the emergency braking system failed to detect the parked motorcycles ahead as the car accelerated.

The NTSB has found that Tesla’s technology is insufficient for total driver disengagement, and that monitoring the driver’s interaction with the steering wheel is a poor indicator of engagement. The lawsuit, filed in Tesla’s home state of California, alleges that Tesla willfully ignored the NTSB’s safety recommendations, and seeks damages on behalf of the victim’s family.
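To make the NTSB’s point concrete, here is a deliberately naive, hypothetical sketch (not any manufacturer’s actual monitor; the threshold and names are invented) of torque-based hands-on-wheel detection, which registers resting hands as “engagement” regardless of whether the driver is awake.

```python
# Deliberately naive, hypothetical torque-based "hands on wheel" check
# (not any manufacturer's actual monitor; threshold and names invented).
# It only proves the wheel feels force, which resting hands provide even
# if the driver is asleep, illustrating the NTSB's criticism above.

TORQUE_THRESHOLD_NM = 0.3  # assumed detection threshold, newton-metres


def hands_on_wheel(torque_nm: float) -> bool:
    """Report 'engaged' whenever measured steering torque exceeds the threshold."""
    return abs(torque_nm) >= TORQUE_THRESHOLD_NM


# A drowsy driver whose hands rest on the wheel passes the check...
print(hands_on_wheel(0.5))  # True, yet this says nothing about attention
# ...while an attentive driver holding the wheel lightly can fail it.
print(hands_on_wheel(0.1))  # False, producing spurious "hands off" warnings
```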

Other Tesla fatalities have involved the driver; this was the first in which a pedestrian was the victim. Tesla has always maintained that the driver is ultimately responsible for driving the vehicle and that the Autopilot function is simply an enhancement.

Will self-driving cars really happen?

If you’re thinking that self-driving cars are some futuristic fantasy, think about all the features of your new car that were unheard of 20 years ago. Cruise control was just the beginning of automated driving; now your car lets you know when you’re following too closely, when the car ahead of you slows down, and when it’s safe to change lanes.

So you’re practically piloting an autonomous vehicle (AV) now, but you’re still in control and wouldn’t dream of napping or watching a movie while you’re behind the wheel.

AVs are safer

AVs will happen, and the roads will be safer when they do. One of the biggest advantages of a self-driving car is that it knows the rules of the road, follows the speed limit and isn’t tempted to text and drive. Transitioning from current technologies to AVs will be the hard part, but once all the cars on the road are smarter than you, those roads are going to be a lot less dangerous.

According to the USDOT, over 94 percent of crashes are caused by human error. The endgame for AVs is to not only reduce traffic fatalities but also to reduce the number of cars on the roads. AVs will not necessarily be private vehicles, but part of the on-demand rideshare economy.

Good news for Tesla

The Japanese lawsuit isn’t going away, but there is some good news for Tesla’s Autopilot software. The owner of a 2018 Tesla Model 3 AWD drove his car on Autopilot on the Blue Ridge Parkway in North Carolina for over an hour, in foggy weather, on a winding mountain road. The drive included tunnels and a bridge hanging over the side of the mountain, and Autopilot handled both with ease.

Tesla Autopilot Technology Killed a Man in Japan, According to This Lawsuit