Incident 208: Tesla Phantom Braking Complaints Surged, Allegedly Linked to Tesla Vision Rollout

Suggested citation format

Lam, Khoa. (2021-05-01) Incident Number 208. in Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 208
Report Count: 8
Incident Date: 2021-05-01
Editors: Khoa Lam

Incident Reports

Tesla is seeing an increase in complaints over serious and dangerous phantom braking events plaguing Autopilot in the latest software updates.

Phantom braking is a term used to describe when an advanced driver assist system (ADAS) or a self-driving system applies the brakes for no good reason.

The system falsely detects an object on the road or anticipates a collision that won’t actually happen, and applies the brakes to try to avoid it.

Obviously, phantom braking is something you want to avoid, since it can cause accidents if someone is following too closely behind you.

For Tesla owners, it’s always been part of Autopilot, but it has been manageable for the most part. Events would be few and far between.

But things have been seemingly getting worse lately for many Tesla owners.

Last month, Tesla briefly pulled a new version of its Full Self-Driving (FSD) Beta software after many testers reported constant phantom braking issues – a problem that CEO Elon Musk himself acknowledged.

FSD Beta is only being tested by a limited number of Tesla owners, but now it’s clear that the phantom braking is also becoming a significant problem for Autopilot users.

Over the last few weeks, Electrek received many reports from Tesla owners claiming to have experienced an unusual number of phantom braking events.

Following up on the issue, we found that the National Highway Traffic Safety Administration (NHTSA) has seen a significant increase in complaints from Tesla owners regarding the same phantom braking issue over the last month.

One of the recent NHTSA complaints reads:

“Upon accepting delivery at the end of May we have accumulated 9,000 miles on the car and have a had horrible experiences with the traffic aware cruise control slamming on the brakes for no apparent reason with nothing ahead or passing cars. Behavior can be 5-10 mph slowdowns or in some cases FULL brake pressure which puts us in danger of being rear ended. Multiple times we have been close to rear-ended.”

The complaints are all very similar to this one with drivers saying that their vehicles are having a significant number of phantom braking events on Autopilot.

Some of the owners said that they reached out to Tesla regarding the issue, but they were told that it is due to “software evolving”:

“While driving using cruise control the vehicle will occasionally brake suddenly for unknown reasons. In one instance I was worried that the car following me would either hit my car or be forced to take other action possibly causing an accident. When I contacted Tesla regarding my concern they said something about the software program evolving…no fix available.”

The uptick in complaints appears to have started around the time Tesla transitioned to its vision-based Autopilot and dropped the use of radar.

That started in May 2021, but there’s also an even more significant increase in complaints over the last few weeks.

Electrek’s take

I have also been experiencing this phantom braking problem personally on my Model 3 recently.

I am an avid Tesla Autopilot user and phantom braking has always been something to watch out for. It would happen every now and again, but not frequently.

To me, it was just one more reason to be careful and always be paying attention when using the system.

But following the 2021.40 software update that I received on my Tesla Model 3 last week, I am seeing a significant increase in phantom braking events. I’m getting several of them per drive. As many as one every 10 km on Autopilot.

It’s very similar to what is described by the other owners above.

Sometimes it’s a strange deceleration for no reason and other times the car applies the brakes abruptly for no good reason.

An interesting example that I noted last week was when I auto lane changed to the left lane on the highway because I saw a car coming up on the ramp that was going to merge around the time I was passing the ramp.

I was driving on Autopilot and let it move to the left lane to let the car merge, which it did without issue, but as I was passing it, Autopilot decided to slam on the brakes as if the car were both merging and changing lanes to the left, which it wasn’t.

Luckily, there was no car behind me and I was able to fight back against the braking event quickly, but it was scary.

I talked with other Tesla owners on the latest updates, and it seems that some are having the same issues while others aren’t.

There’s undeniably a significant uptick in phantom braking events, but it doesn’t seem to be affecting all cars the same way.

Tesla has a serious phantom braking problem in Autopilot

Owners say their cars are suddenly slamming the brakes at high speeds, nearly causing crashes in many cases.

Teslas are unexpectedly slamming on their brakes in response to imagined hazards — such as oncoming traffic on two-lane roads — which has prompted their terrified owners to lodge a surge of complaints with the National Highway Traffic Safety Administration over the past three months, according to a Washington Post analysis of federal auto safety data.

The phenomenon, known as “phantom braking,” has been a persistent issue for Tesla vehicles.

The automaker was forced to recall a version of its Full Self-Driving software in October over false positives in its automatic emergency-braking system that it said were triggered by the software update. Complaints soared after the recall and remain elevated, signaling continued owner concern.

Owner reports of phantom braking to NHTSA rose to 107 complaints in the past three months, compared with only 34 in the preceding 22 months.
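Expressed as monthly rates, the scale of that jump is easier to see. A quick back-of-the-envelope check using only the figures cited above:

```python
# Monthly complaint rates implied by the NHTSA figures cited above.
recent_rate = 107 / 3      # complaints per month, past three months
baseline_rate = 34 / 22    # complaints per month, preceding 22 months
increase = recent_rate / baseline_rate

print(f"recent: {recent_rate:.1f}/month, baseline: {baseline_rate:.1f}/month")
print(f"increase: ~{increase:.0f}x")  # roughly a 23-fold jump in the complaint rate
```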

In addition to the safety recall in late October, the timing of the complaints coincides with a period in which Tesla has stopped using radar sensors in its vehicles to supplement the suite of cameras that perceive their surroundings. Tesla announced last year that it would stop equipping Tesla Model Y and Model 3 vehicles built in North America with radar beginning in May 2021. Tesla’s new approach is known as “Tesla Vision.”

Tesla vehicles are equipped with eight surround-view cameras that the automaker says “provide 360 degrees of visibility around the car at up to 250 meters of range.” It also leverages 12 ultrasonic sensors to detect objects around the vehicle. Tesla eventually wants to transition its fleet to Tesla Vision, and some owners have been left wondering whether their cars’ radar sensors will be disabled.

Drivers and safety experts said they believe the systems began acting erratically after the changes.

Several of the owners who filed complaints with regulators said their cars seemed overly sensitive to trucks in the opposite lane. One owner described how around noon on a straight road, the car lurched from 50 mph to a near-stop seemingly in response to a large truck.

“[It] was scary to almost stop in the middle of my lane,” the owner wrote.

“Phantom braking is what happens when the developers do not set the decision threshold properly for deciding when something is there versus a false alarm,” said Phil Koopman, a Carnegie Mellon University professor who focuses on autonomous vehicle safety. “What other companies do is they use multiple different sensors and they cross-check between them — not only multiple cameras, but multiple types of sensors,” such as radar and lidar, a type of sophisticated sensor that uses laser lights to paint a dot matrix mapping the environment.

“With only one sensor type, it’s harder to be sure because you do not have the cross-check from a different type of sensor,” he said.
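Koopman’s point about decision thresholds and cross-checking can be sketched with a toy simulation. Everything here is invented for illustration — the false-alarm rate, the number of frames, and the assumption that the two sensor types fail independently — and this is in no way Tesla’s actual pipeline. The idea is simply that requiring two independent sensor types to agree before braking drives the false-alarm rate toward the product of the individual rates:

```python
import random

random.seed(0)

FALSE_ALARM_RATE = 0.01   # assumed per-frame false-positive rate per sensor
FRAMES = 100_000          # simulated frames with no real obstacle present

def detector_fires() -> bool:
    """A single sensor's detector spuriously crossing its decision threshold."""
    return random.random() < FALSE_ALARM_RATE

# Single sensor type: brake whenever the one detector fires.
single = sum(detector_fires() for _ in range(FRAMES))

# Cross-checked: brake only if two independent sensor types both fire.
fused = sum(detector_fires() and detector_fires() for _ in range(FRAMES))

# Expected counts: FRAMES * 0.01 = 1,000 vs FRAMES * 0.01**2 = 10.
print(f"phantom events, single sensor: {single}")
print(f"phantom events, cross-checked: {fused}")
```

Raising the threshold on a single sensor trades missed obstacles for fewer false alarms; cross-checking a second, independent sensor type suppresses false alarms without that trade-off, which is the crux of the radar-removal criticism.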

The NHTSA complaints are not individually verified by the agency. Owners submit their description of the issue, their vehicle identification number and other identifying information when they report their problems to the agency. NHTSA spokeswoman Lucia Sanchez said the agency is engaging in a dialogue with Tesla over the phantom braking reports.

“NHTSA is aware of complaints received about forward collision avoidance and is reviewing them through our risk-based evaluation process,” she said. “This process includes discussions with the manufacturer, as well as reviewing additional data sources, including Early Warning Reporting data. If the data show that a risk may exist, NHTSA will act immediately.”

Tesla, which disbanded its public relations department in 2020, did not respond to a request for comment. The automaker has contended in the past that its driver-assistance feature suite, Autopilot, is safer than typical driving when crash data is compared. Autopilot is a system primarily intended for highway use, so the data cannot be directly compared with vehicle crashes overall. Tesla’s chief executive, Elon Musk, has called Autopilot “unequivocally safer.”

But the company faces mounting scrutiny from regulators, including recalls and safety investigations that called into question the responsibility and performance of its driver-assistance approach.

Tesla initiated a safety recall of its Full Self-Driving driver-assistance software late last month because of a feature that enabled it to conduct “rolling stops,” proceeding through intersections with stop signs without fully halting. Tesla had met with NHTSA officials twice in January to discuss the issue, part of growing safety concerns about the cars’ automation-targeted software. NHTSA opened a safety investigation in August over about a dozen reports of crashes with parked emergency vehicles while Autopilot was activated.

The Post analysis covered more than a year of data for Tesla vehicles from the four most recent model years.

Owners of the 2022 Tesla Model 3 complained 20 times about the issue known as phantom braking, where the car suddenly slows or stops with no external cause, out of 22 total complaints involving the model.

The events are typically triggered by false positives of the forward collision warning and automatic emergency-braking systems. Owners often cited their use of Tesla’s Autopilot driver-assistance system or traffic-aware cruise control in their complaints. They also commonly referred to issues on two-lane highways, where the system was triggered by an oncoming truck in the opposite lane.

The bulk of recent complaints about Tesla cars have been related to the phantom braking issue — 107, or 57 percent, of 189 complaints since November about the 2020 through 2022 Tesla Model Y and Model 3, along with the 2019 Tesla Model 3.

Some drivers recalled instances of phantom braking even when they were not using Autopilot. Other owners said in complaints to NHTSA that they were shaken by the incidents and feared being rear-ended.

“These events are hair raising for me and passengers, let alone for a driver behind me,” one owner wrote in a report to the agency. “If he/she does not pay attention at that very moment, the result could even be disastrous. I would never have expected such a serious safety issue with a Tesla.”

Owners also said they experienced the phenomenon one or multiple times while on extended trips. Some said their road trip experiences were marred by the brakes suddenly triggering or by the frequent jolts made by their cars.

“My wife has requested that I don’t use cruise control or autopilot while she’s in the car, as we experienced an unwarranted, aggressive automatic braking episode which caused great pressure against her pregnant belly on a previous road trip,” one owner said in a report.

Owners’ comments reflected apparent exasperation with the recurrence of the problem and a seeming inability to get it fixed.

“[These] things are happening with NOTHING present in front of my vehicle, and sometimes with nothing around me at all,” one wrote.

Luis Fernandez, who drives a 2022 Tesla Model Y, said he was at Taylor Street and Pine Street in San Francisco recently when his car spotted a plastic bag several feet in front of him. The bag didn’t pose a hazard and was soon out of his view, said Fernandez, who uses the vehicle to drive for Uber.

But his car jolted him from 25 mph to 15 mph before he could intervene.

“Suddenly the car kind of locked, but it immediately released because the plastic bag moved away. . . . The car just completely took precaution,” he said. “Automatically, it braked.”

A 2021 Tesla Model Y owner, Ben Morris, confirmed that he was one of the people who filed a complaint with NHTSA. He said this was his third Tesla, and the issues presented themselves more than they ever had on his previous models.

“We primarily drove the car on two-lane highways, which is where the issues would show themselves consistently,” he said in an email. “Although my 2017 Model X has phantom braked before, it is very rare, the vision-based system released May 2021 is night and day. We were seeing this behavior every day.”

He recalled an instance when his wife was driving at highway speeds of 55 to 60 mph and “it slammed on the brakes hard, sending our children’s booster seats slamming into the front seats.”

Thankfully, he said, the children weren’t in the car.

Tesla drivers report a surge in ‘phantom braking’

Over 100 complaints have been made in the last three months.

Tesla vehicles are inexplicably slamming on their brakes for no reason, frightening owners and eliciting over 100 complaints to the federal government in the last three months alone, according to The Washington Post.

It’s been a persistent issue for the automaker. Last October, Tesla CEO Elon Musk said on Twitter that the company was forced to “roll back” version 10.3 of its Full Self-Driving beta software because of issues with forward collision warnings and phantom braking.

But since then, the number of complaints about Tesla’s braking has spiked. According to The Washington Post’s analysis, reports from Tesla owners about phantom braking to the National Highway Traffic Safety Administration (NHTSA) rose to 107 complaints in the past three months, compared with only 34 in the preceding 22 months.

“Using adaptive cruise control with autopilot steering (as well as without Autosteer), multiple episodes of severe ‘phantom breaking [sic]’ where the car slams on the breaks [sic] for no apparent reason,” a Model Y owner from Sterling, Ill., wrote in a November 16th complaint. “No other cars around. Flat, clear open freeway.”

Another Model Y owner, who reported installing FSD in October 2021, said they “immediately” experienced issues with Autopilot and Traffic-Aware Cruise Control after the update was installed, including “spurious forward collision warnings.” “These warnings involved the standard warning beeps and red indicators on the driving display, and at one point included an unnecessary emergency braking incident when no obstacle was in front of me,” this person wrote. “As such, I had reverted to driving the car in manual mode, not on autopilot.”

A Model 3 owner in San Ramon, Calif., reported “numerous phantom braking events when on [A]utopilot. These seemingly happen out of nowhere, various conditions, and for no apparent reason.”

The problem may be traced to the controversial decision last year to remove radar sensors from new Model 3 and Model Y vehicles. The decision came after Musk publicly expressed a desire to rely on cameras to power the company’s advanced driver assistance system.

Tesla has drawn intense scrutiny from safety advocates and regulators for its willingness to allow its customers to test what is essentially an unfinished version of a product that Musk has long promised will lead to fully autonomous vehicles on the road. Earlier this week, the company was forced to issue a software update to remove an FSD feature that allows cars to perform a “rolling stop” — a maneuver in which the vehicle moves slowly through a stop sign without coming to a full stop. (A rolling stop is a common driving maneuver despite being illegal in all 50 states in the US.)

A spokesperson for NHTSA said the agency was “aware of complaints received about forward collision avoidance and is reviewing them through our risk-based evaluation process. This process includes discussions with the manufacturer, as well as reviewing additional data sources, including Early Warning Reporting data. If the data show that a risk may exist, NHTSA will act immediately.”

During an earnings call last week, Musk cited FSD as “a primary area of focus.” FSD is a beta version of an advanced driver-assist system that controls some of the car’s functions on local roads but still requires human supervision. In contrast, autonomous vehicles are cars that can operate on public roads without any human intervention or supervision.

Still, the company claims that FSD will lead to more profits in the future thanks to the “higher utilization of our vehicles.” Musk has said that once Tesla’s cars are able to drive themselves, the company will leverage that capability into a robotaxi fleet. The goal is to make it so that each Tesla customer’s car can double as an autonomous vehicle that other people can hail while the owner isn’t using it.

Tesla said it released seven over-the-air software updates for FSD over the quarter and that there are currently 60,000 vehicles operating with the advanced driver assist system in the US.

Last fall, complaints of problems with some Tesla vehicles began to surface on social media. Owners said the 10.3 update of FSD introduced phantom forward collision warnings, while others noticed a disappearing Autosteer option, traffic-aware cruise control (TACC) problems, and occasional Autopilot panic.

Now it’s clear some owners were also filing complaints with NHTSA. To be sure, the agency does not individually verify each complaint. Owners submit a description of the issue, their vehicle identification number, and other identifying information when they report their problems to the agency.

Late last year, a Tesla Model Y with FSD allegedly crashed southeast of Los Angeles. No one was injured in the crash, but the vehicle was reportedly “severely damaged.” The incident was reported to NHTSA, but there were no media reports of the crash, leading some Tesla fans to dismiss the incident as fake.

Tesla owners report dozens of instances of ‘phantom braking’

The so-called “phantom braking” increased after Tesla both made a software update and stopped using radar sensors in October.

Some Tesla drivers say they're experiencing an increase in "phantom braking," in which their cars make random, jolting stops because they misinterpret hazards like trash on the road, trucks in nearby lanes and oncoming traffic on two-lane roads. In the past three months, 107 Tesla drivers have filed complaints with the National Highway Traffic Safety Administration, according to federal data reviewed by The Washington Post; only 34 complaints had been filed in the preceding 22 months.

“My wife has requested that I don’t use cruise control or autopilot while she’s in the car, as we experienced an unwarranted, aggressive automatic braking episode which caused great pressure against her pregnant belly on a previous road trip,” one driver said in their report.

Tesla's Full Self-Driving tech has continued to be controversial and occasionally problematic, even as Elon Musk has touted the tech's features and potential. Tesla recalled one iteration of the software in October after a surge in this so-called “phantom braking.” According to the Post, complaints have stayed elevated since the recall.

Tesla also recalled 54,000 vehicles this week because a more aggressive Full Self-Driving mode allowed vehicles to roll through stop signs. The feature also warned that the car might “perform more frequent lane changes [and] will not exit passing lanes.”

The timing of the complaints also aligns with when Tesla stopped using radar sensors in its vehicles in October. The company announced that it would be switching over to a self-driving system entirely dependent on just camera sensors last year, which it calls “Tesla Vision.” Many experts warn the shift may decrease safety.

107 drivers recently complained about their Teslas making random, jolting stops

After opening strong in pre-market trading, Tesla’s stock (TSLA) fell this morning as more attention is brought to its serious phantom braking issue on Autopilot.

Back in November, Electrek released a report called ‘Tesla has a serious phantom braking problem in Autopilot.’

It highlighted a significant increase in Tesla owners reporting dangerous phantom braking events on Autopilot.

Phantom braking is a term used to describe when an advanced driver assistance system (ADAS) or a self-driving system applies the brakes for no good reason.

The system falsely detects an object on the road or anticipates a collision that won’t actually happen, and applies the brakes to try to avoid it.

Obviously, phantom braking is something you want to avoid since it can create accidents if someone is following too closely behind you.

This issue is not new in Tesla’s Autopilot, but our report focused on Tesla drivers noticing an obvious increase in instances, which also showed in complaints to the NHTSA.

Our report made the rounds in a few other outlets, but it didn’t really go mainstream until now.

The Washington Post just published a very similar report focusing on the NHTSA complaints, which it compiled into a chart.

Tesla’s stock (TSLA) dropped by more than 3% following the report this morning, after it was up more than 1% in pre-market trading.

In the report, NHTSA spokeswoman Lucia Sanchez said that they are talking with Tesla about the complaints:

“NHTSA is aware of complaints received about forward collision avoidance and is reviewing them through our risk-based evaluation process. This process includes discussions with the manufacturer, as well as reviewing additional data sources, including Early Warning Reporting data. If the data show that a risk may exist, NHTSA will act immediately.”

Tesla has not commented on the issue. Elon Musk did admit that one of the Full Self-Driving Beta updates had a real issue with phantom braking, but this particular problem is with the Autopilot suite of features and not the FSD Beta.

Many owners still report the same problematic rate of phantom braking, but at least the rate of complaints to NHTSA has declined over the last two months.

Tesla (TSLA) falls as more attention is brought to Autopilot's serious phantom braking issue

DETROIT (AP) — More than 750 Tesla owners have complained to U.S. safety regulators that cars operating on the automaker’s partially automated driving systems have suddenly stopped on roadways for no apparent reason.

The National Highway Traffic Safety Administration revealed the number in a detailed information request letter to Tesla that was posted Friday on the agency’s website.

The 14-page letter dated May 4 asks the automaker for all consumer and field reports it has received about false braking, as well as reports of crashes, injuries, deaths and property damage claims. It also asks whether the company’s “Full Self Driving” and automatic emergency braking systems were active at the time of any incident.

The agency began investigating phantom braking in Tesla’s Models 3 and Y last February after getting 354 complaints. The probe covers an estimated 416,000 vehicles from the 2021 and 2022 model years. In February, the agency said it had no reports of crashes or injuries.

The letter gives Tesla a deadline of June 20 to respond to the information request but says the company can ask for an extension.

Shares of Tesla Inc. tumbled more than 9% Friday.

A message was left early Friday seeking comment from Tesla.

In opening the probe, the agency said it was looking into vehicles equipped with automated driver-assist features such as adaptive cruise control and “Autopilot,” which allows them to automatically brake and steer within their lanes.

“Complainants report that the rapid deceleration can occur without warning, and often repeatedly during a single drive cycle,” the agency said.

Many owners wrote in their complaints that they feared a rear-end crash on a freeway.

In the letter, NHTSA asks for the initial speed of when the cars began to brake, the final speed, and the average deceleration. It also asks if the automated systems detected a target obstacle, and whether Tesla has video of the braking incidents.
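The quantities the letter asks for are related by basic kinematics: average deceleration is just the change in speed divided by the time over which it occurred. As a sketch with invented numbers (not drawn from any actual complaint):

```python
# Average deceleration from initial/final speed over a braking interval.
# All numbers here are hypothetical, for illustration only.
MPH_TO_MS = 0.44704                   # 1 mph in metres per second

initial_mph, final_mph = 50.0, 5.0    # speeds before and after the event
duration_s = 2.0                      # assumed duration of the slowdown

avg_decel = (initial_mph - final_mph) * MPH_TO_MS / duration_s
print(f"average deceleration: {avg_decel:.1f} m/s^2")  # 10.1 m/s^2, about 1 g
```

A sustained deceleration near 1 g is emergency-stop territory, which is why the agency pairs these figures with questions about following traffic and crash risk.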

The agency is now seeking information on warranty claims for phantom braking including the owners’ names and what repairs were made. It’s also seeking information on Tesla’s sensors, any testing or investigations into the braking problems, or if any modifications were made.

The letter focuses on Tesla’s testing of the automated systems when it comes to detecting metal bridges, s-shaped curves, oncoming and cross traffic, and different sizes of vehicles including large trucks. The agency also wants information on how cameras deal with reflections, shadows, glare and blockage due to snow or heavy rain.

The agency asks Tesla to detail its assessment of the “alleged defect” in the automated systems, including what caused the unnecessary braking, what failed, and the risk to motor vehicle safety that the problem poses. It asks Tesla “what warnings, if any, the operator and the other persons both inside and outside the vehicle would have that the alleged defect was occurring, or subject component was malfunctioning.”

The probe is another in a string of enforcement efforts by the agency that include Autopilot and “Full Self-Driving” software. Despite their names, neither feature can drive the vehicles without people supervising.

It’s the fourth formal investigation of the Texas automaker in the past three years, and NHTSA has supervised 23 Tesla recalls since January 2021.

The agency also is investigating complaints that the automatic emergency braking systems on more than 1.7 million newer Hondas can stop the vehicles for no reason.

In addition, NHTSA has a broader probe under way into crashes involving partially automated driving systems from all automakers. Since 2016, the agency has sent teams to 34 crashes in which the systems were either in use or suspected of operating. Of the 34, 28 involved Teslas.

Fifteen people died in the crashes that NHTSA is investigating, and at least 15 more were hurt. Of the deaths, 14 occurred in crashes involving Teslas, agency documents say.

NHTSA also is investigating why Teslas on Autopilot have crashed into emergency vehicles parked on roads.

Tesla and CEO Elon Musk have been fighting with U.S. and California government agencies for years, sparring with NHTSA and most notably with the Securities and Exchange Commission.

Musk has offered to buy Twitter for $44 billion and make it a private company, but says he has put the deal on hold because of allegations that the social media platform has more automated bot accounts than it has disclosed.

US has over 750 complaints of Teslas braking for no reason

Tesla’s self-driving tech continues to be under increased scrutiny.

Tesla’s Autopilot driver-assistance system is attracting more negative headlines after it emerged that 758 owners have complained to U.S. regulators about incidents of “phantom braking” – where the car has suddenly stopped for no apparent reason.

The alarming figure was revealed in an information request letter that has been sent to Tesla by the National Highway Traffic Safety Administration and was posted on the NHTSA website.

In the 14-page letter, dated May 4, the agency asks the automaker to supply all reports it has received of phantom braking, plus details of any crashes, injuries, deaths and property damage claims. It also inquires whether Tesla’s Full Self Driving mode – which some campaigners are trying to ban – and automatic emergency braking systems were active at the time of any incident.

Phantom braking happens when a driver assistance system applies the brakes for no clear reason. Sometimes the system has falsely detected what it thinks might be an object on the road and brakes in an attempt to avoid it. Sudden stops can then present problems for vehicles following behind, increasing the likelihood of a crash.

While there have been reports circulating of phantom braking on Teslas for some time, the agency only began officially investigating it on the Model 3 and Model Y last February after receiving 354 complaints. The NHTSA’s inquiries cover an estimated 416,000 vehicles from the 2021 and 2022 model years, and earlier this year the agency said it had no reports of crashes or injuries.

The level of detail the letter asks Tesla to provide is extensive. Among the requests for each episode of phantom braking are: “Software, firmware and hardware versions in place at the time of the incident, along with vehicle and mileage and date of installation.” The agency also asks for the speeds before and after braking, the average deceleration, the maximum deceleration, whether video exists and whether an obstacle was detected.

The request for information on warranty claims includes the names of owners who made them and the nature of any repairs that may have been carried out. The NHTSA has also requested details on Tesla’s sensors and wants to know if the company has conducted any investigations into the braking problems and performed any modifications as a result.

The agency is also keen to know more about the “robustness” of the automaker’s testing program, with particular queries on how metal bridges and S-shape curves, oncoming traffic, large trucks and more are detected, and inquiries regarding how the camera system copes with horizon glare, dirt, bad weather and more. Tesla has until June 20 to respond.

In recent months, Tesla’s self-driving tech has come under increasing scrutiny. There have been suggestions the Autopilot and Full Self Driving labels are giving drivers a false sense of security, with billionaire U.S. businessman Dan O’Dowd campaigning for FSD to be banned, while the NHTSA is conducting several investigations into crashes involving Teslas.

Tesla CEO Elon Musk has crossed swords with the NHTSA before, famously calling the agency “the fun police” earlier this year.

Hundreds of Tesla Owners Report Phantom Braking

The National Highway Traffic Safety Administration (NHTSA) has released a detailed information request concerning Tesla’s Level 2 driver assist systems, called Autopilot and Full Self-Driving. Over 750 Tesla owners have reported that their vehicles mysteriously stopped on roadways for no clear reason. While that should be a concern for Tesla, it’s also far from the only safety problem the automaker’s semi-autonomous technology has faced.

Since introducing features with names like “Autopilot” and “Full Self-Driving,” Tesla has faced criticism for overstating the capabilities of what are still merely driver-assist systems requiring constant vigilance from the person behind the wheel. Linguistic concerns are only part of the problem, though; the very basis of the technology has been riddled with faults that are being discovered by ordinary consumers beta-testing the software on public roads.

In this most recent phantom braking issue, NHTSA has requested more information from Tesla about the 750 complaints. From the Associated Press:

In the letter, NHTSA asks for the initial speed of when the cars began to brake, the final speed, and the average deceleration. It also asks if the automated systems detected a target obstacle, and whether Tesla has video of the braking incidents.

The agency is now seeking information on warranty claims for phantom braking including the owners’ names and what repairs were made. It’s also seeking information on Tesla’s sensors, any testing or investigations into the braking problems, or if any modifications were made.

The letter focuses on Tesla’s testing of the automated systems when it comes to detecting metal bridges, s-shaped curves, oncoming and cross traffic, and different sizes of vehicles including large trucks. The agency also wants information on how cameras deal with reflections, shadows, glare and blockage due to snow or heavy rain.

In 2017, Autopilot steered a man into a concrete barrier at 113 km/h, a fatal accident; the driver, it was found, had been using his cell phone and may not have noticed that his Tesla had taken a sharp turn. However, the National Transportation Safety Board found that, in this case, Tesla Autopilot had likely not been programmed to recognise concrete barriers and therefore would not stop for one.

The inability to recognise certain objects has resulted in deaths of two drivers whose vehicles didn’t know to stop for tractor trailers. Teslas also didn’t know to stop for emergency vehicles that may have been parked on the side of the road or in a lane of traffic, which resulted in at least 12 reported accidents. When Full Self-Driving Beta was released, the quality of a Tesla’s left turns decreased as users continued to test the software; the cars also aimed at packed lanes and scraped against bushes. Consumer Reports even compared FSD Beta to a drunk driver.

Of course, we can’t ignore the human component in these situations; had the drivers been paying attention, they likely would have realised a dangerous situation was developing and been able to make evasive manoeuvres to prevent a crash. After all, drivers are technically supposed to have their hands on the wheel and their butts in a seat in order to engage Tesla’s driver-assist software.

But as Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University who studies automated vehicles, told CBS News: “It’s very easy to bypass the steering pressure thing. It’s been going on since 2014. We have been discussing this for a long time now.” We at Jalopnik covered all sorts of ways a driver could add steering wheel pressure without actually having their hands on the wheel. And that pressure sensor was only added after Tesla was called out for it; the company initially avoided installing one to save money.

Whatever NHTSA finds in this new phantom braking case, the very fact that Tesla’s semi-autonomous driver-assist systems consistently face so much scrutiny should be a red flag to the automaker itself, consumers, other drivers and regulatory bodies. It should also raise important questions as the autonomous vehicle trend continues: How much testing is required before a semi-autonomous vehicle hits the road? What regulations should be required to guarantee the safety of these technologies? And why are we using everyday drivers as beta testers for software that perplexes everyone from engineers to ethicists?

Teslas Are Braking for No Reason, But That’s Not Autopilot’s Only Problem