Incident 218: Tesla on Autopilot Crashed into Flipped Truck on Taiwan Highway

Description: On a highway in Taiwan, a Tesla sedan, reportedly operating in Autopilot mode, crashed into a large overturned truck, barely missing a pedestrian.
Alleged: Tesla developed and deployed an AI system, which harmed delivery trucks, pedestrians, and Tesla drivers.

Suggested citation format

Dickinson, Ingrid. (2020-06-01) Incident Number 218. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
218
Report Count
3
Incident Date
2020-06-01
Editors
Khoa Lam

Incident Reports

Today, a video that surfaced on Twitter shows a Tesla Model 3 driving, without wavering or slowing, directly into a flipped-over truck on the highway. The clip, recorded in Taiwan, has some viewers speculating about what happened to the driver, whether he was paying attention, and whether any Tesla safety functions were in use at the time.

The incident occurred this morning on National Freeway 1 in Taiwan. According to cna.com.tw, a box truck carrying salad and breakfast ingredients lost control and flipped over. While the driver of the truck stood on the side of the road waiting for assistance, a Tesla Model 3 entered the frame in the innermost lane, headed straight for the truck. Despite the truck lying across the highway in the middle of a clear and sunny day, the Tesla did not flinch and plowed into the top of the box.

Only one person was in the Model 3, and he was not seriously injured. More angles of the crash are seen here: 

side better quality pic.twitter.com/9r3HpAdBzD

— hand washing rooster 🐓 (@jsin86524368) June 1, 2020

According to a roughly translated excerpt from CNA, the driver was expecting the car to brake without his input. 

"The police said that the driver of the Tesla electric vehicle, Huang, claimed to have the vehicle assist system turned on, and the speed was fixed at 110 kilometers per hour," the article says. "He thought that the car itself would detect the obstacle and automatically brake, but he was surprised that the car did not slow down." 

From this passage, it sounds like the driver was using either adaptive cruise control or some form of Tesla's Autopilot driver-assistance technology, though it was not directly named or confirmed. This type of speculation stems from Tesla's previous issues with its Autopilot system, which is neither autonomous nor self-driving. Autopilot remains a safety-focused driver-assistance program that requires driver attention at all times. Self-driving cars do not exist yet.

Earlier this year, the National Transportation Safety Board found Tesla at fault for not providing enough Autopilot safeguards after a deadly Tesla crash. Visit CNA or ET Today for more information or photos.

Watch a Model 3 drive straight into a flipped semi on a Taiwan highway

TAIPEI (Taiwan News) — A Tesla driving on autopilot on National Highway 1 barely missed a pedestrian and collided with an overturned truck on Monday (June 1).

At 6:35 a.m. on Monday (June 1), a delivery truck overturned at the 268.3-kilometer mark, and its 34-year-old driver, surnamed Yeh (葉), stepped out onto the highway near the median to wait for a salvage vehicle. At 6:44 a.m., a white Tesla sedan suddenly roared down the road completely ignoring the driver, who had to leap out of the way to avoid being struck, reported UDN.

The driver of the Tesla then apparently noticed the delivery truck directly ahead and finally slammed on the brakes. However, it was too late, and the electric vehicle plowed halfway into the cargo area of the truck.

The National Highway Police Bureau (NHPB) said the driver of the Tesla was a 53-year-old man surnamed Huang (黃), who claimed that he had set the car to run on autopilot at a speed of 110 kilometers per hour prior to the accident. He said he had thought that under autopilot mode, the car would detect any obstacles and slow down or stop, but it continued to maintain a steady speed.

He said that once he realized the car was not going to react to the truck, he stepped on the brakes at the last second. Unfortunately, by the time he applied the brakes, there was not enough room to fully decelerate or avoid the vehicle.

Yeh confessed to the NHPB that he had been trailing a car very closely and that, when it suddenly decelerated, he did not have enough time to apply the brakes. To avoid smashing into the car, he swerved, causing the truck to roll over.

Yeh said that he had placed a sign 100 meters behind the vehicle to warn motorists of the overturned truck. He said that most vehicles decelerated and switched lanes, so he was shocked when the Tesla continued to hurtle forward.

In traffic camera footage of the accident, the white Tesla can be seen making a beeline for the truck driver as he stands in the inside lane. As it becomes clear that the Tesla will not stop or even decelerate, the driver jumps out of the way.

A light puff of smoke can be seen as the car applies the brakes to some extent, but its speed does not diminish much and it crashes into the truck. One Tesla owner, surnamed Chen (陳), speculated that based on the footage, the car's sensors apparently could not detect the odd shape of the overturned truck, and the braking system only kicked in at the last moment, thus causing the accident, according to the report.

Fortunately, Huang was wearing a seatbelt at the time of the accident and did not suffer any serious injuries, and Yeh also came away from both accidents unscathed. Police say that both men submitted to breathalyzer tests and that neither had alcohol in their systems, reported CNA.

China Times cited a car expert as saying that Tesla's "automatic brake function" is actually only for "moving vehicles." If it encounters a stationary object, the function will only provide a warning and will not stop automatically.

Other Tesla car owners suggested a couple of other possible causes for this accident: the Tesla's sensors did not detect the truck because it is painted white, or the reflection of the sun made it difficult to scan.

Police are still investigating the exact cause of the accident.

On its website, Tesla explains that although the autopilot has steering and braking capabilities, it does not make the vehicle autonomous and is only meant to be used when the driver is "fully attentive" and prepared to take the wheel at any time.

"Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous."

Video shows Tesla on autopilot slam into truck on Taiwan highway

Video from Taiwan reveals a disturbing Tesla crash, where the vehicle plows directly into the top of a large truck lying on its side, straddling two lanes of a freeway. The driver states the vehicle was in Autopilot mode. He did not hit the brakes himself until far too late, indicating he was probably not paying attention. The road has light traffic and visibility is very good. Nobody was injured.

The video shows the event from several angles, and raises several questions:

  1. Why does Tesla autopilot not perceive such a large obstacle on the road as this, with its use of cameras and radar?

  2. Why does the Tesla emergency braking system not brake for the driver of the truck, who is standing in the lane in a misguided effort to direct cars away from hitting his truck? When the Tesla does not stop, he jumps into the shoulder and is unharmed.

  3. Would maps or LIDAR have prevented this accident?

  4. How much attention did the Tesla driver pay to the road, and what bearing does that have?

  5. Was Autopilot actually on as the Tesla driver claims?

We can begin with #4 — Tesla’s Autopilot is a driver assist system: it requires drivers to pay attention to the road at all times and puts responsibility for accidents upon those drivers. Drivers must wiggle the wheel from time to time to prove they are keeping hands on it, but their gaze is not tracked as it is in some cars. This driver could not have been paying much attention — the truck is as plain as an obstacle can be on a fairly empty road — though he does hit the brakes just before impact. So there is clearly driver fault here. However, Tesla regularly says that its systems are very close to being capable of real full self-driving, and a system that does this is nowhere close to being ready for that.

This failure to perceive is so glaring that one has to wonder if the Autopilot was actually on. Tesla declined to comment on that. However, given that the driver clearly was not attentive and the car drove fairly straight in the lane prior to the crash, it seems it is likely to have been on. In addition, Tesla’s “Automatic Emergency Braking” is generally never off unless manually disabled, as are the collision warning systems.

Missing a Truck

The answer to missing a truck is reasonably well known. Computer vision systems recognize things they have been trained on, and seeing the roof of a truck on the road is not a common event. Tesla’s image classifiers probably have not been trained extensively on trucks lying sideways on the road. The back of a truck they will identify, and by now perhaps the side of an upright one.
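To make that concrete, here is a toy sketch (purely illustrative, not Tesla’s actual pipeline) of how a trained detector can simply drop an unfamiliar object: detections are filtered by a confidence threshold, and a shape the network has rarely or never seen tends to score below it. The labels, scores, and threshold are assumptions invented for this example.

```python
# Toy sketch (not Tesla's actual pipeline): a trained detector only reports
# classes it knows, and low-confidence detections are discarded, so an
# unfamiliar shape such as a truck roof can simply vanish from the output.
# Labels, scores, and the threshold below are invented for illustration.
CONFIDENCE_THRESHOLD = 0.5

def usable_detections(raw_detections):
    """Keep only detections confident enough to act on."""
    return [d for d in raw_detections if d["score"] >= CONFIDENCE_THRESHOLD]

frame = [
    {"label": "car",        "score": 0.92},  # ordinary vehicle: well represented in training data
    {"label": "truck_rear", "score": 0.12},  # overturned truck: weak, wrong-looking match
    {"label": "pedestrian", "score": 0.47},  # person near the median, just under the threshold
]

print(usable_detections(frame))  # only the confidently recognized car survives
```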

Simulations done by Cognata of this crash suggest that it is possible the camera was set to too bright an exposure, and the bright white truck was just an overexposed blob. Video logs from a Tesla would be needed to confirm if that were true.

The second issue is radar. Generally, the radar in the Tesla will have received strong reflections off this truck. (Note the roof is not metal, though.) However, these radar returns would indicate the truck is a stationary object, just like the hedge in the middle of the road, which also will be giving radar returns. Radar resolution can’t always tell something in the median of the road from something in the left lane. In addition, the road curves to the right here, and so there are median objects directly along the path of the lane, and the car must not brake when detecting those.
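A minimal sketch of that stationary-target problem follows; the field names, thresholds, and numbers are assumptions for illustration rather than anything from Tesla’s stack.

```python
# Illustrative sketch: a radar-based cruise system keeps targets that are
# moving over the ground and drops stationary returns, which lumps an
# overturned truck in with median hedges, signs, and bridges.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float      # distance to the target
    doppler_mps: float  # velocity relative to our car (negative = closing)

def moving_targets(returns, ego_speed_mps, min_ground_speed_mps=2.0):
    """Keep only targets that are themselves moving over the ground."""
    kept = []
    for r in returns:
        ground_speed = ego_speed_mps + r.doppler_mps  # ~0 for anything stationary
        if abs(ground_speed) >= min_ground_speed_mps:
            kept.append(r)
    return kept

ego_speed = 110 / 3.6  # ~30.5 m/s, the speed the driver reported
returns = [
    RadarReturn(range_m=120.0, doppler_mps=-110 / 3.6),  # overturned truck (stationary)
    RadarReturn(range_m=80.0,  doppler_mps=-110 / 3.6),  # median hedge (stationary)
    RadarReturn(range_m=60.0,  doppler_mps=-5.0),        # slower car ahead (moving)
]

print(moving_targets(returns, ego_speed))  # only the moving car survives the filter
```

Under a filter like this, the overturned truck and the median hedge look identical, which is why fusion with vision, or with a map, is needed to tell them apart.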

Below, we’ll explore how maps could resolve that radar question.

Missing a Pedestrian

One reason to wonder if Autopilot was on is that the vision system is well trained on pedestrians and should by now almost never miss them. Pedestrians are not expected on freeways (which is why human drivers often hit them), but they should still be detected. There also would have been radar returns from the pedestrian, but he was not moving along the road, so the same stationary-target radar problem emerges.

Even Tesla’s AEB system (which is usually never off) should have reacted to that pedestrian. If it did, it should have alerted the driver, and hit the brakes itself much sooner.

Maps and LIDAR

Sadly, once again, two technologies Tesla has deprecated as “crutches,” LIDAR and detailed lane-level maps, could have saved the day. LIDAR would have detected the truck very clearly and triggered braking quickly; there is little doubt of that. Even the most basic non-scanning LIDAR would have done so.
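For illustration only, a crude geometric check of the kind even a basic LIDAR enables might look like the sketch below; the lane width, deceleration, and point coordinates are assumptions, not data from this crash.

```python
# Crude geometric check of the kind even a basic LIDAR enables. The lane
# width, deceleration, and point coordinates are assumptions for illustration.
def lidar_obstacle_in_path(points, ego_speed_mps, lane_half_width_m=1.8,
                           decel_mps2=6.0, margin_m=10.0):
    """True if any point sits in our lane corridor closer than stopping distance.

    Purely geometric: nothing has to be recognized as an "overturned truck"
    for its points to count as an obstacle.
    """
    stopping_distance = ego_speed_mps ** 2 / (2 * decel_mps2) + margin_m
    for x, y, z in points:  # x forward, y left, z up, in meters
        if 0.3 < z < 4.0 and abs(y) < lane_half_width_m and 0 < x < stopping_distance:
            return True
    return False

# A stationary truck roof roughly 75 m ahead while travelling at ~110 km/h:
points = [(75.0, 0.4, 1.5), (76.2, -0.6, 2.1), (74.5, 1.0, 2.8)]
print(lidar_obstacle_in_path(points, 110 / 3.6))  # True -> brake now
```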

Maps could have described the radar profile of that section of road. While they would have told the vehicle to expect radar returns from the median, it would know their character, and the returns from a big truck like this would be stronger. In particular, the fact that the truck extends across two lanes should have been enough to detect that something was stopped in the lane to the right, a cue for caution. Maps would also have revealed the curve in the road ahead, which puts radar targets in the median directly along the path of the lane, but they would report the exact distance to those targets, which would be much further than the distance to the truck or the truck driver.
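A hypothetical sketch of that map-matching idea follows; the field names, mapped positions, and tolerance are invented for illustration and are not drawn from any real map product.

```python
# Hypothetical sketch of the map idea above: the map stores where static radar
# clutter is expected along this stretch, so a strong stationary return that is
# much closer than anything mapped is treated as a real obstacle rather than
# median clutter. Field names, positions, and the tolerance are invented.
def expected_clutter_ranges(map_segment, ego_position_m):
    """Distances (m) from the car to known static reflectors stored in the map."""
    return [lm - ego_position_m
            for lm in map_segment["static_reflectors_m"]
            if lm > ego_position_m]

def unexplained_stationary_return(observed_range_m, map_segment, ego_position_m,
                                  tolerance_m=15.0):
    """True if a stationary radar return matches no mapped reflector."""
    expected = expected_clutter_ranges(map_segment, ego_position_m)
    return not any(abs(observed_range_m - e) <= tolerance_m for e in expected)

segment = {"static_reflectors_m": [268_450.0, 268_520.0]}  # mapped median targets ahead
ego_position = 268_300.0                                   # roughly the 268.3 km mark
truck_return = 90.0                                        # stationary return 90 m ahead

# Mapped clutter is expected at ~150 m and ~220 m, so a stationary return at
# 90 m is unexplained and warrants braking, or at least a warning.
print(unexplained_stationary_return(truck_return, segment, ego_position))  # True
```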

It is baffling why a large flat surface and the large body of metal did not provide sufficient radar signature to trigger Tesla’s radar system.

Driver Attention

Tesla’s instructions require drivers to pay attention to the road. Some don’t, however. Some products attempt to prevent that by monitoring the driver’s gaze and warning when the driver looks away from the road for too long. Tesla has decided not to use that approach. It points to a good safety record for Autopilot as evidence this is not necessary. (Tesla refuses, however, to clarify what its safety statistics actually mean, because its comparisons do not say what counts as an accident or what the record is on different road types.)

The debate about how much driver monitoring is needed will continue for some time.

As a driver assist system, Tesla Autopilot is not expected to catch everything on the road, and so if drivers do not pay attention, there will be accidents. But missing a giant truck and a pedestrian is a bit much for a system that is also the foundation of a purported “full self driving” system. This reduces confidence about when such a system can actually ship. It is unknown whether this Tesla had Tesla’s newer hardware able to run the more advanced Autopilot software. Tesla declined to comment.

Tesla In Taiwan Crashes Directly Into Overturned Truck, Ignores Pedestrian, With Autopilot On