Incident 153: Tesla Driver on Autopilot Ran a Red Light, Crashing into a Car and Killing Two People in Los Angeles

Description: In 2019, a Tesla Model S driver on Autopilot mode reportedly went through a red light and crashed into a Honda Civic, killing two people in Gardena, Los Angeles.
Alleged: Tesla developed and deployed an AI system, which harmed Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez.

Suggested citation format

Zhu, Helen. (2019-12-29) Incident Number 153. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
153
Report Count
3
Incident Date
2019-12-29
Editors
Khoa Lam

Incident Reports

DETROIT (AP) — California prosecutors have filed two counts of vehicular manslaughter against the driver of a Tesla on Autopilot who ran a red light, slammed into another car and killed two people in 2019.

The defendant appears to be the first person to be charged with a felony in the United States for a fatal crash involving a motorist who was using a partially automated driving system. Los Angeles County prosecutors filed the charges in October, but they came to light only last week.

The driver, Kevin George Aziz Riad, 27, has pleaded not guilty. Riad, a limousine service driver, is free on bail while the case is pending.

The misuse of Autopilot, which can control steering, speed and braking, has occurred on numerous occasions and is the subject of investigations by two federal agencies. The filing of charges in the California crash could serve notice to drivers who use systems like Autopilot that they cannot rely on them to control vehicles.

The criminal charges aren’t the first involving an automated driving system, but they are the first to involve a widely used driver technology. Authorities in Arizona filed a charge of negligent homicide in 2020 against a driver Uber had hired to take part in the testing of a fully autonomous vehicle on public roads. The Uber vehicle, an SUV with the human backup driver on board, struck and killed a pedestrian.

By contrast, Autopilot and other driver-assist systems are widely used on roads across the world. An estimated 765,000 Tesla vehicles are equipped with it in the United States alone.

In the Tesla crash, police said a Model S was moving at a high speed when it left a freeway, ran a red light in the Los Angeles suburb of Gardena and struck a Honda Civic at an intersection on Dec. 29, 2019. Two people who were in the Civic, Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez, died at the scene. Riad and a woman in the Tesla were hospitalized with non-life-threatening injuries.

Criminal charging documents do not mention Autopilot. But the National Highway Traffic Safety Administration, which sent investigators to the crash, confirmed last week that Autopilot was in use in the Tesla at the time of the crash.

Riad’s defense attorney did not respond to requests for comment last week, and the Los Angeles County District Attorney’s Office declined to discuss the case. Riad’s preliminary hearing is scheduled for Feb. 23.

NHTSA and the National Transportation Safety Board have been reviewing the widespread misuse of Autopilot by drivers, whose overconfidence and inattention have been blamed for multiple crashes, including fatal ones. In one crash report, the NTSB referred to its misuse as “automation complacency.”

The agency said that in a 2018 crash in Culver City, California, in which a Tesla hit a firetruck, the design of the Autopilot system had “permitted the driver to disengage from the driving task.” No one was hurt in that crash.

Last May, a California man was arrested after officers noticed his Tesla moving down a freeway with the man in the back seat and no one behind the steering wheel.

Teslas that have had Autopilot in use also have hit a highway barrier or tractor-trailers that were crossing roads. NHTSA has sent investigation teams to 26 crashes involving Autopilot since 2016, involving at least 11 deaths.

Messages have been left seeking comment from Tesla, which has disbanded its media relations department. Since the Autopilot crashes began, Tesla has updated the software to try to make it harder for drivers to abuse it. It’s also tried to improve Autopilot’s ability to detect emergency vehicles.

The company has said that Autopilot and a more sophisticated “Full Self-Driving” system cannot drive themselves and that drivers must pay attention and be ready to react at any time. “Full Self-Driving” is being tested by hundreds of Tesla owners on public roads in the U.S.

Bryant Walker Smith, a law professor at the University of South Carolina who studies automated vehicles, said this is the first U.S. case to his knowledge in which serious criminal charges were filed in a fatal crash involving a partially automated driver-assist system. Tesla, he said, could be “criminally, civilly or morally culpable” if it is found to have put a dangerous technology on the road.

Donald Slavik, a Colorado lawyer who has served as a consultant in automotive technology lawsuits, including many against Tesla, said he, too, is unaware of any previous felony charges being filed against a U.S. driver who was using partially automated driver technology involved in a fatal crash.

The families of Lopez and Nieves-Lopez have sued Tesla and Riad in separate lawsuits. They have alleged negligence by Riad and have accused Tesla of selling defective vehicles that can accelerate suddenly and that lack an effective automatic emergency braking system. A joint trial is scheduled for mid-2023.

Lopez’s family, in court documents, alleges that the car “suddenly and unintentionally accelerated to an excessive, unsafe and uncontrollable speed.” Nieves-Lopez’s family further asserts that Riad was an unsafe driver, with multiple moving infractions on his record, and couldn’t handle the high-performance Tesla.

Separately, NHTSA is investigating a dozen crashes in which a Tesla on Autopilot ran into several parked emergency vehicles. In the crashes under investigation, at least 17 people were injured and one person was killed.

Asked about the manslaughter charges against Riad, the agency issued a statement saying there is no vehicle on sale that can drive itself. And whether or not a car is using a partially automated system, the agency said, “every vehicle requires the human driver to be in control at all times.”

NHTSA added that all state laws hold human drivers responsible for operation of their vehicles. Though automated systems can help drivers avoid crashes, the agency said, the technology must be used responsibly.

Rafaela Vasquez, the driver in the Uber autonomous test vehicle, was charged in 2020 with negligent homicide after the SUV fatally struck a pedestrian in suburban Phoenix in 2018. Vasquez has pleaded not guilty. Arizona prosecutors declined to file criminal charges against Uber.

Felony charges are 1st in a fatal crash involving Autopilot

A Tesla driver who had his car on Autopilot in a crash that killed two people will stand trial on two counts of manslaughter in Los Angeles, Fox Business reported.

The fatal accident in 2019 occurred when Kevin George Aziz Riad, 27, was driving a Tesla Model S at 74 mph in Gardena, Los Angeles.

The Tesla driver, who previously pleaded not guilty, will go on trial for vehicular manslaughter. The case may be the first time a driver is facing a court trial for using semi-automated technology in a fatal crash.

Riad's car went through a red light and crashed into a Honda Civic in a collision that killed Gilberto Alcazar Lopez, 40, and Maria Guadalupe Nieves-Lopez, 39, the report said.

Prosecutors said the Tesla's Autopilot features, including Autosteer and Traffic-Aware Cruise Control, were in use when the driver crashed into the Honda.

Crash data showed that no brakes were applied in the six minutes before the collision. But sensors appeared to show that the driver had a hand on the steering wheel, according to a Tesla engineer who testified.

The driver will now be tried on two counts of vehicular manslaughter, according to a Fox 11 LA report. Riad and a female passenger in the Tesla were treated for injuries in hospital.

A number of car crashes have been recorded while drivers used Tesla Autopilot functions, and the first self-driving-related death was recorded in 2016.

The National Highway Traffic Safety Administration is examining a dozen crashes that involved Tesla drivers using Autopilot features amid scrutiny over its advanced driver-assistance functions.

Tesla's Autopilot and Full Self-Driving features need "active driver supervision", do not make the car autonomous, and are intended for "fully attentive" drivers, the company says on its website.

Tesla did not immediately respond to Insider's request for comment.

A Tesla driver who had his car on Autopilot in a fatal crash faces manslaughter charges, report says

GARDENA, Calif. (KABC) -- A Tesla driver who was behind the wheel with autopilot engaged when his vehicle crashed and killed two people in Gardena pleaded not guilty Thursday.

Kevin George Aziz Riad, 27, is accused of running a red light and slamming into a Honda Civic at 74 mph back in 2019, killing the driver and a passenger.

He faces two counts of vehicular manslaughter.

The crash killed Gilberto Alcazar Lopez, 40, of Rancho Dominguez and Maria Guadalupe Nieves-Lopez, 39, of Lynwood, who were in the Civic and were on their first date that night, relatives told the Orange County Register.

Riad and a woman in the Tesla were hospitalized with non-life-threatening injuries.

This is the first felony case in the United States against a driver using Tesla's automated driving system.

A Tesla engineer testified that Riad had a hand on the steering wheel, but did not apply brakes in the six minutes before the crash.

A police officer testified in May that several traffic signs warning motorists to slow down were posted near the end of the freeway.

Tesla has said that autopilot and a more sophisticated "full self-driving" system cannot drive themselves and that drivers must pay attention and be ready to react at any time.

Tesla's autopilot system in general is the subject of two federal safety investigations.

Tesla driver pleads not guilty in deadly crash while autopilot was engaged
