Incident 231: A Tesla Crashed into a Road Sweeper on a Highway in China, Killing the Driver

Description: A Tesla Model S collided with a road sweeper truck on a highway near Handan, China, killing the driver; Tesla previously said it was not able to determine whether Autopilot was operating at the time of the crash.
Alleged: Tesla developed and deployed an AI system, which harmed Gao Yaning and Tesla drivers.

Suggested citation format

Anonymous. (2016-01-20) Incident Number 231. in Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
231
Report Count
4
Incident Date
2016-01-20
Editors
Khoa Lam

Incident Reports

Tesla Motors came under renewed questioning about the safety of its Autopilot technology after news emerged on Wednesday of a fatal crash in China that may have occurred while the automated driver-assist system was operating.

The crash took place on Jan. 20 and killed Gao Yaning, 23, when the Tesla Model S he was driving slammed into a road sweeper on a highway near Handan, a city about 300 miles south of Beijing, according to a report broadcast on Wednesday by the Chinese government news channel CCTV.

The report includes in-car video looking through the windshield as the car travels in the left lane at highway speed just before ramming into a parked or slow-moving orange truck. The video, apparently shot by a camera mounted on the rearview mirror, recorded no images, sounds or jolts that would suggest the driver or the car hit the brakes before impact. At that point, the in-car video ends.

“When it was approaching the road sweeper, the car didn’t put on the brake or avoid it,” a police officer said in the CCTV report. “Instead, it crashed right into it.”

In an emailed statement, Tesla said on Wednesday that it had not been able to determine whether Autopilot was active at the time of the Handan accident. The company declined to say when it learned of the fatality in China, or whether it had reported the crash to United States safety officials, who are investigating a fatal accident in Florida on May 7 in which Autopilot was engaged.

So far, that Florida accident is the only confirmed death involving a Tesla with Autopilot turned on. In that accident, there was no sign that the driver or Autopilot had applied the brakes before the car collided at high speed with a tractor-trailer that had turned in front of it.

Although Tesla learned of the Florida accident a few weeks after it happened, it did not publicly disclose it until late June, when the National Highway Traffic Safety Administration announced it was investigating the crash.

News of the Chinese crash will renew questions about when the company should disclose information about accidents in cars equipped with Autopilot and what information should be shared.

“Because of the damage caused by the collision, the car was physically incapable of transmitting log data to our servers, and we therefore have no way of knowing whether or not Autopilot was engaged at the time of the crash,” a Tesla spokeswoman, Alexis Georgeson, said in the company’s statement. “We have tried repeatedly to work with our customer to investigate the cause of the crash, but he has not provided us with any additional information that would allow us to do so,” she said of the car’s owner, Mr. Gao’s father.

She said Tesla was saddened to learn of the death of Mr. Gao. “We take any incident with our vehicles very seriously and immediately reached out to our customer when we learned of the crash,” she said.

Tesla and Autopilot have been under scrutiny since the disclosure of the May fatality. That crash killed Joshua Brown, 40, whose 2015 Model S was traveling 74 miles per hour when it collided with a tractor-trailer that had turned left and was crossing a highway near Williston, Fla. Autopilot’s radar and cameras failed to recognize the white truck against a bright sky.

News of the China crash comes just three days after Tesla’s chief executive, Elon Musk, outlined changes planned for Autopilot that he said would have prevented Mr. Brown’s accident and that he contended would make the Model S one of the safest cars on the road.

The changes include refinements in Autopilot’s radar that improve its ability to spot and identify obstacles down the road and additional warnings to force drivers to keep their hands on the steering wheel and eyes on the road while the system is active.

Tesla has said Autopilot is not meant to take over completely for a human driver. When Autopilot is turned on, drivers are given audio and text warnings to remain alert and engaged while using it.

In August, Reuters reported that Tesla removed a Chinese term for “self-driving” from its China website after a driver in Beijing had a nonfatal crash while Autopilot was engaged. That driver later complained that the carmaker had oversold Autopilot’s capability.

The video of the January accident indicates the type of unexpected problem that can crop up at highway speeds. Critics of Autopilot say a driver can be lulled into complacency, leaving too little time to take back control of the vehicle.

Mr. Gao was traveling in the left lane of a three-lane highway with another car ahead of him. When the car ahead moved into the center lane, it revealed the orange truck, which was straddling the road’s left shoulder. Mr. Gao’s car never slowed before plowing into the truck.

Police investigators concluded that Mr. Gao was responsible for the accident, CCTV reported. But in July his family sued the dealer who had sold the Tesla.

The driver’s father, Gao Jubin, told CCTV he thought his son had been relying on Autopilot to drive the car and so was not watching the road when the crash took place.

The lawsuit was filed “to let the public know that self-driving technology has some defects,” the family’s lawyer said in the report. “We are hoping Tesla, when marketing its products, will be more cautious. Don’t just use self-driving as a selling point for young people.”

Autopilot Cited in Death of Chinese Tesla Driver

In early 2016, 48-year-old Gao Jubin had high hopes for the future, with plans to ease up on running a logistics company and eventually turn over control of the business to his son, Gao Yaning. But his plans were abruptly altered when Yaning died at the wheel of a Tesla Model S that crashed on a highway in China while the car—Jubin believes—was traveling in Autopilot, the name for Tesla’s suite of driving aids. Now, Jubin says, “Living is worse than death.”

The last time Jubin saw Yaning, it was Jan. 20, 2016, days before the Chinese New Year. Jubin’s family had just left a wedding reception, and his son was in high spirits. The ceremony was for his girlfriend’s brother, so he’d spent the day keeping busy and helping out when needed.

Given that the wedding involved his potential future in-laws, Chinese tradition meant Yaning should enthusiastically help out with planning the ceremony, Jubin told Jalopnik this month, and he did. “He did a lot of logistics of the wedding, preparing transport and accommodating the guests,” Jubin said.

Following the ceremony, Yaning arranged transportation for his parents to get home. He took Jubin’s Tesla Model S and went to meet some friends. Before departing, Jubin said, his son offered up a word of caution: “Be careful driving.”

In retrospect, it was an omen. On Yaning’s way home, the Model S crashed into the back of a road-sweeping truck while traveling on a highway in the northeastern province of Hebei. At the time, his family believes, the car was driving in Autopilot, which allows those cars to automatically change lanes after the driver signals, manage speed on roads and brake to avoid collisions.

Yaning, 23, died shortly after the crash. Local police reported there was no evidence the car’s brakes were applied.

The crash in China came at a precarious time for Tesla. In the summer of 2016, the automaker and its semi-autonomous system were facing intense scrutiny after the company revealed that a Florida driver had died earlier that year in a crash involving a Model S cruising in Autopilot. An investigation by U.S. auto regulators was already underway.

But no one knew about Yaning’s death—which happened months before the Florida crash—until that September, when Jubin first went public about a lawsuit he’d filed against Tesla over the crash. Police had concluded Yaning was to blame for the crash, but Jubin’s suit accused Tesla and its sales staff of exaggerating Autopilot’s capabilities.

He asked a local court in Beijing to authorize an independent investigation to officially conclude that Autopilot was engaged. The suit sought an apology from Tesla for how it promoted the feature.

Multiple U.S. news outlets covered Jubin’s case after Reuters wrote about the suit in 2016, but interest almost immediately faded. The case, however, is still ongoing, according to Jubin, who spoke with Jalopnik this month by Skype in his first interview with a U.S. media outlet.

He hopes the suit will bring more attention to the system’s limited capabilities and force Tesla to change the way it deploys the technology before it’s refined. (A federal lawsuit filed last year in the U.S. echoed that concern, alleging the automaker uses drivers as “beta testers of half-baked software that renders Tesla vehicles dangerous.” Tesla called the suit “inaccurate” and said it was a “disingenuous attempt to secure attorney’s fees posing as a legitimate legal action.”)

In September 2016, the company said the extensive damage made it impossible to determine whether Autopilot was engaged. Following the incident, Tesla updated the system so that if a driver ignores repeated warnings to resume control of the vehicle, they would be locked out from using Autopilot during the rest of the trip.

Even if the system was engaged during the collision, Tesla told the Wall Street Journal, Autopilot warns drivers to keep their hands on the steering wheel, a warning reinforced by repeated reminders to “take over at any time.” Yaning, Tesla said, took no action even though the road sweeper “was visible for nearly 20 seconds.”

A Tesla spokesperson said in a statement to Jalopnik that “We were deeply saddened to learn that the driver of the Model S lost his life in the incident.” A police investigation, the spokesperson said, found the “main cause of the traffic accident” was Yaning’s failure “to drive safely in accordance with operation rules,” while the secondary cause was the street sweeper’s “incomplete safety facilities.”

“Since then, Tesla has been cooperating with an ongoing civil case into the incident, through which the court has required that a third-party appraiser review the data from the vehicle,” the statement said. “While the third-party appraisal is not yet complete, we have no reason to believe that Autopilot on this vehicle ever functioned other than as designed.”

Within the first year of introducing Autopilot in October 2015, Tesla faced intense criticism over the feature’s name. Regulators across the globe expressed concern that the name is misleading. Tesla has always maintained that Autopilot is only a driving aid—not a replacement—and that motorists must pay attention to the road while it’s engaged.

But Tesla’s Autopilot messaging has, at times, been criticized as conflicting and ultimately confusing. And Tesla drivers continue to push the system to its limits, with some producing videos that show Autopilot being used in ways the automaker wouldn’t officially endorse.

In the months following the crash, concerns over Autopilot seemed to dissipate, as carmakers ratcheted up efforts to develop semi-autonomous driving aids of their own that could challenge Tesla. Autonomy, the auto industry’s way of thinking went, was the way of the future.

The National Highway Traffic Safety Administration, in early 2017, also gave Tesla some peace of mind, after it cleared the automaker and Autopilot in the fatal Florida crash. But the U.S. National Transportation Safety Board later pinned blame partially on Autopilot and an over-reliance by the motorist on Tesla’s driving aids. (The NTSB can only make safety recommendations, while NHTSA is authorized to order recalls or issue fines.)

In the Florida crash, NTSB Chairman Robert Sumwalt said, “Tesla’s system worked as designed. But it was designed to perform limited tasks in a limited range of environments. The system gave far too much leeway to the driver to divert his attention to something other than driving.”

There are few known crashes involving Tesla drivers with Autopilot engaged, but given how new autonomous technology is in commercially available cars, regulators have taken an interest in how these systems function, even in minor accidents. The legal system has yet to seriously weigh in. A federal lawsuit over Tesla’s Autopilot rollout is pending; the first case over an accident involving an autonomous car—filed by a motorcyclist against General Motors—emerged only in January.

Tesla previously said that “any vehicle can be misused.”

“For example, no car prevents someone from driving at very high speeds on roads where that is not appropriate or from using cruise control on city streets,” the automaker said. “In contrast, Tesla has taken many steps to prevent a driver from using Autopilot improperly.”

But Tesla owners have continued to post examples of the system being misused, raising concerns that some either don’t understand Autopilot’s limitations, or rely on it far too much.

Those concerns were rekindled after a Tesla driver in January slammed into the back of a parked firetruck on a California freeway while his Model S was reportedly driving in Autopilot.

In response, the NTSB and NHTSA both launched new investigations into the use of Autopilot, highlighting some of the criticisms Jubin first raised after his son’s crash two years ago.

In particular, does Tesla do enough to ensure drivers won’t misuse Autopilot?

Officially, it hasn’t been concluded if Autopilot was engaged at the time of Yaning’s crash. Tesla claimed Jubin’s car’s damage made it physically incapable of transmitting log data, so the company had “no way of knowing whether or not Autopilot was engaged at the time of the crash.”

But Jubin believes he has more than enough evidence—recorded in-car video footage, a report from an expert who examined the clips, anecdotal comments from other Tesla owners—to prove it.

The formal inspection hasn’t been completed, but Jubin’s attorney, Cathrine Guo, told Jalopnik on Tuesday that Tesla’s U.S. headquarters has turned over a document that decodes data produced on an SD memory card installed in the Model S. The document, Guo said by email, recorded the actions of both the car and driver, and shows that Autopilot was on at the time of the crash.

When asked for a response, Tesla referred to the spokesperson’s statement, which said: “While the third-party appraisal is not yet complete, we have no reason to believe that Autopilot on this vehicle ever functioned other than as designed.”

In the dashcam clips, the Model S appears to drive smoothly, centered in lanes. “Even when the road was rugged and bumpy,” Jubin said, “it didn’t change course.”

Jubin also spoke with Tesla owners and experts who, he said, agreed that Autopilot must have been engaged.

And at an initial court hearing, Jubin’s attorney said the speed of the Model S remained consistent for eight minutes before the collision. Jubin said he found a professor at Beijing Jiaotong University who’s an expert in autonomous driving. After reviewing the video and accompanying documents, Jubin said, “The professor felt that ... Autopilot had to be the cause of the accident.” Jubin’s attorney also said that Yaning hadn’t been drinking that day.

A judge from the Chaoyang District People’s Court of Beijing recently granted Jubin’s request for a third-party inspection to confirm whether Autopilot was engaged, and the probe could begin as early as this month, attorney Guo said.

“Based on all the information I have,” Jubin said through his attorney, who translated the conversation, “I have no doubt that accident [was] caused by the Autopilot.”

Throughout the nearly two-hour interview with Jalopnik, Jubin had to pause several times to compose himself. Yaning was a “very kind, selfless, altruistic” individual, he said.

“When he was a kid, when he was playing with his friends, he always took snacks from home and shared them,” Jubin said. “When some of the kids didn’t get it, he would come back home and get more.”

“He had been very understanding of us parents, helping us feed his little sister, or washing her clothes,” he went on. “Every time the four of us went out, it was like three parents with a kid... it was a really happy, beautiful family, and all of a sudden it was gone, like that.”

Yaning’s family grew up in Handan city, in the Hebei Province of China, where Jubin worked in the coal trade. Eventually, Jubin said, he started a logistics company and took on some public service work.

Jubin said his son spent two years in the army. He returned home and, eventually, applied to a local college to study business administration. The plan, Jubin said, was for him to one day take over the logistics firm.

“I did not expect everything was in vain,” he said.

Yaning’s last moments alive were captured on a camera that was apparently installed on the dashboard of the Model S. The lens looked out through the windshield and onto the road. Jubin’s attorney provided Jalopnik with about two-and-a-half minutes of footage that was first published in September 2016 by Chinese state broadcaster CCTV.

The video shows the white sedan cruising along a four-lane highway in relatively clear weather. At one point, Yaning can be heard jubilantly singing, as the car merges from the middle to the left lane.

The car continues along the road, appearing to travel at the same speed. At one point, a car ahead of the Model S moves into the center lane, leaving an orange street sweeper straddling the road’s left shoulder directly in Yaning’s path.

Autopilot is designed to adjust speeds using adaptive cruise control. If, say, an object appears in the car’s path, an escalating series of visual and audio warnings is supposed to go off, signaling for the driver to resume control of the wheel. Based on the video, no warning alert went off and the Model S never slowed before Yaning slammed into the truck.

Tesla vehicles come equipped with automatic emergency braking technology regardless of whether Autopilot is turned on, which is supposed to alert the driver of a potential hazard. If the driver doesn’t react in time, the car should automatically apply its brakes.

But Tesla’s owner manual states that Autopilot isn’t adept when it comes to recognizing stationary vehicles at high speeds, as Wired notes. “Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead,” according to the manual.
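To make the behavior described in the preceding paragraphs concrete, here is a minimal, purely illustrative Python sketch of the kind of decision logic the articles describe: escalating driver alerts, a lockout after repeated ignored warnings, and an emergency-braking path that can miss stationary obstacles at highway speed when a lead vehicle cuts out of the lane. Every name, threshold, and structure here is an assumption for illustration only; this is not Tesla's implementation.

```python
# Illustrative sketch only -- NOT Tesla's code. It models the behavior the
# articles describe: escalating alerts, an Autopilot lockout after ignored
# warnings, and the stationary-obstacle gap noted in the owner's manual.
from dataclasses import dataclass

MAX_IGNORED_WARNINGS = 3              # assumed threshold, for illustration
STATIONARY_DETECTION_LIMIT_MPH = 50   # speed limit cited in the manual excerpt

@dataclass
class Obstacle:
    distance_m: float
    moving: bool
    revealed_by_cutout: bool  # a followed car just moved aside, exposing it

def escalate_warnings(ignored: int) -> str:
    """Escalating visual/audio alerts; lockout for the rest of the trip."""
    levels = ["visual alert", "audible chime", "loud alarm"]
    if ignored >= MAX_IGNORED_WARNINGS:
        return "autopilot locked out for remainder of trip"
    return levels[min(ignored, len(levels) - 1)]

def emergency_brake_engages(speed_mph: float, obs: Obstacle) -> bool:
    """AEB path: per the manual, stationary objects may go undetected above
    ~50 mph when a vehicle you are following moves out of your path."""
    if obs.moving:
        return obs.distance_m < 50.0
    if speed_mph > STATIONARY_DETECTION_LIMIT_MPH and obs.revealed_by_cutout:
        return False  # the failure mode described in the Handan crash
    return obs.distance_m < 50.0

if __name__ == "__main__":
    sweeper = Obstacle(distance_m=40.0, moving=False, revealed_by_cutout=True)
    print(escalate_warnings(ignored=2))            # -> loud alarm
    print(emergency_brake_engages(70.0, sweeper))  # -> False
```

Under these assumed rules, a stationary sweeper revealed by a cut-out at 70 mph triggers no braking, which is the scenario the dashcam footage and the manual excerpt both describe.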

Jubin had been home at the time of the crash. In the interview, Jubin said a Tesla customer service representative called and told him “we detected your airbag exploded.”

“I said, ‘My son was driving today. I don’t know the details, you should hang up and call my son, and find out what is going on,’” Jubin said.

Shortly after, a friend of Yaning’s, who had been driving near him in a separate car at the time of the crash, called Jubin and said Yaning had been in a serious accident. Jubin and his wife immediately left and drove to the site of the accident.

When they arrived, Jubin discovered first responders addressing a gruesome scene.

“I got there and saw the car was in pieces, and a lot of blood was streaming down to the ground from the car,” he said. “The blood covered a big area on the ground. We felt very sad. It was a very bad feeling, and my wife [was] praying in my car for my son’s safety. We got to the hospital and the doctor told me they tried and couldn’t save him.”

The intervening months took a toll. He suspended work at the logistics company and let every employee go. Jubin is now focused solely on the lawsuit and taking care of his family.

“He’s gone and we don’t know how to carry on,” Jubin said. “Living is worse than death.”

Jubin’s main complaint is what he called Tesla’s misleading advertising of Autopilot. Since introducing the system in 2015, the automaker has taken flak from consumer advocates, as videos showed Tesla drivers reading or even sleeping while their cars were in motion.

Jubin said Tesla is to blame for how some customers have perceived the capabilities of Autopilot.

In particular, he pointed to a conversation he had with Yaning after purchasing the Model S. Yaning, he said, explained that a Tesla salesperson told him that Autopilot can virtually handle all driving functions.

“If you are on Autopilot you can just sleep on the highway and leave the car alone; it will know when to brake or turn, and you can listen to music or drink coffee,” Jubin said, summarizing the salesperson’s purported remarks.

This tracks with reporting after Yaning’s death went public. Some of Tesla’s Chinese sales staff, for instance, took their hands off the wheel during Autopilot demonstrations, according to a report from Reuters. (Tesla’s Chinese sales staff were later told to make the limitations of Autopilot clear.)

But Jubin said his son was “misled” by salespeople who oversold Autopilot’s capabilities. It continued even after Yaning’s death, he claimed.

“When I was at a Tesla retail store, they were still advertising, and online too, how you can sleep or drink coffee and everything,” he said.

After Jubin initially filed his suit in July 2016, Tesla removed the term “Autopilot” and a Chinese term for “self-driving” from its China website and marketing materials. The phrase zi dong jia shi means the car can drive itself, the Wall Street Journal reported at the time. Tesla changed it to zi dong fu zhu jia shi, meaning a driver-assist system.

“When you hear autonomous driving, or self-driving, when you hear it described as that, as safe, especially on expressways, it’s totally different from the description of assisted autonomous driving,” said Guo, Jubin’s attorney. “That’s one of the reasons we sued Tesla.”

Automakers are currently testing fully autonomous cars, but no one in the industry expects them to be available to buy for years to come. Jubin supports the movement toward autonomy, but he urged drivers around the world to be cautious and to fully understand the systems’ limitations.

“I hope more Tesla owners become aware of it,” he added, “and avoid accidents like this.”

The suit initially asked for about $2,000 in compensation for the family’s grief over Yaning’s death. But the complaint has since been amended, and now asks for 5 million yuan (roughly $750,000). If he prevails, Jubin said, he plans to use some of the money to start a charity fund “to warn more Tesla owners not to use Autopilot.”

“We hope there would be no more tragic families like ours,” he said.

Jubin believes the industry’s technological advancement toward fully autonomous driving is certain, but he feels Autopilot was released prematurely.

“Tesla should release the feature after it’s fully developed,” Jubin said, “not in the process of perfecting it.”

UPDATE: In a response to this story sent after publication, a Tesla spokesperson said that the driver’s father told Tesla personnel that his son knew Autopilot very well and had read the owner’s manual for the Model S “over and over again.”

Furthermore, the automaker asked Jalopnik to note that the car warns that Autosteer is in beta, requires hands on the steering wheel, should not be used on roads with sharp turns or questionable lane markings, and refers drivers to the manual, which additionally states that the driver is responsible for minding the system.

Two Years On, A Father Is Still Fighting Tesla Over Autopilot And His Son's Fatal Crash

Vehicle maker Tesla has confirmed that one of its electric vehicles involved in a fatal crash in northern China two years ago did have its "Autopilot" function engaged at the time of the crash, reports China Central Television (CCTV).

The Tesla Model S crashed into a street sweeper on a highway near the city of Handan, Hebei Province, on January 20, 2016. The car’s 23-year-old driver, Gao Yaning, was killed in the incident.

The victim's family filed suit against Tesla, blaming the company for their son's death.

The new revelation would make the case in Handan the first known fatality involving Tesla’s “Autopilot” technology. The fatality previously considered the first occurred four months later, in May 2016, in the US state of Florida.

Tesla has since stopped using the term "Autopilot" in its descriptions for its Model S vehicles on its Chinese website.

Tesla confirms 'Autopilot' engaged in fatal crash in China

Shanghai (Gasgoo)- A lawsuit over a fatal Tesla crash in China has recently seen new developments. Confronted with substantial evidence, Tesla Motors admitted that the vehicle was in “Autopilot” mode when the fatal crash happened, according to local reporters who obtained information from the lawyers involved. Meanwhile, the automaker stated that the investigation report on the accident is incomplete and that it will cooperate fully with the court until the appointed appraisal agency delivers its final results.

On January 20, 2016, a Tesla Model S slammed into a road sweeper on a highway near Handan, a city about 300 miles south of Beijing, killing the driver, Gao Yaning, instantly, according to a report by CCTV. Traffic police determined that Gao Yaning bore primary liability for the rear-end collision.

Gao Jubin, the owner of the Tesla involved and Gao Yaning’s father, sued Tesla’s China sales company in the Chaoyang District People’s Court of Beijing over his son being killed while the car’s automated driver-assist system was operating, and asked for RMB 10,000 in compensation. The case went to trial on September 20, 2016.

The plaintiff’s lawyer stated that Gao’s lawsuit was not about how much compensation he could get, but was intended to warn others that Tesla’s “Autopilot” system has defects and should not be tried lightly.

At the time, Tesla said that it had not been able to determine whether the “Autopilot” function was active when the Handan accident occurred.

This is not the only accident involving Tesla’s autonomous driving system. On May 7, 2016, a Tesla Model S plowed into a tractor-trailer on a highway near Williston, Florida, killing the driver; that crash had been reported as the world’s first fatal accident involving an autonomous driving system.

Tesla fatal crash case in China: company admits the car was in “Autopilot” when the accident happened