Incident 8: Uber Autonomous Cars Running Red Lights

Description: Uber vehicles equipped with autonomous driving technology ran red lights during street testing in San Francisco.
Alleged: Uber developed and deployed an AI system which harmed pedestrians and motorists.

Suggested citation format

Anonymous. (2014-08-15) Incident Number 8. in McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
Report Count
Incident Date
Editors

Sean McGregor



CSET Taxonomy Classifications

Taxonomy Details

Full Description

Uber's autonomous vehicles were recorded running red lights on two occasions during a pilot program on the streets of San Francisco, California. A witness, Christopher Koff, reported seeing an AI-enabled Volvo XC90 SUV pass through an intersection roughly three seconds after the light had turned red, while a pedestrian was in the crosswalk. There were no injuries or collisions. Uber denied that the system was at fault, attributing the violation to human operator error and suspending the driver. However, two Uber employees told the New York Times that the fault lay with the AI system.

Short Description

Uber vehicles equipped with autonomous driving technology ran red lights during street testing in San Francisco.



AI System Description

Self-driving autonomous Uber vehicles

System Developer


Sector of Deployment

Transportation and storage

Relevant AI functions

Perception, Cognition, Action

AI Techniques

autonomous vehicles, LIDAR, radar

AI Applications

traffic flow forecasting, autonomous driving


Location

San Francisco, CA

Named Entities

Uber, Volvo

Technology Purveyor

Uber, Volvo

Beginning Date


Ending Date


Near Miss

Near miss



Lives Lost


Infrastructure Sectors


Data Inputs

Traffic patterns, environmental surroundings, human driver input

Incident Reports

An Uber equipped to drive itself ran a red light in San Francisco’s SOMA neighborhood Wednesday morning, per a YouTube video apparently shot from a local Luxor cab and reported by The Examiner:

In the video, a Volvo XC90 SUV decked out in the sensors Uber uses to see the world plowed through the intersection roughly three seconds after the light went red, and as a pedestrian was stepping into the crosswalk. In a statement, Uber spokesperson Chelsea Kohler said the car was being operated by its human driver at the time and had no passengers aboard, and that Uber has suspended that driver while it investigates.

Even if it was a human at the wheel, it’s bad news on the day Uber announced it’s welcoming passengers aboard its fleet of driverless cars in the city, and that it’s doing so without filing for an autonomous testing permit with the California DMV. Declining to do that likely means Uber doesn’t have to publicly report things like crashes and “disengagements”—when the human operator takes control to make sure the car operates safely.

In a letter sent to Uber self-driving chief Anthony Levandowski on Wednesday afternoon, California DMV counsel Brian Soublet said that if Uber does not immediately confirm it will stop testing and seek a permit, the DMV will take legal action and seek an injunction. Uber did not immediately respond to a request for comment on the letter.

Uber's Self-Driving Car Runs Red Light in San Francisco

Uber’s San Francisco trial run of its self-driving service last Tuesday is catching people’s attention. Unfortunately, it might not exactly be the kind of hype that Uber hoped for. As previously noted, the ride-hailing service company’s autonomous vehicle test drive in its hometown was given a red light by California’s Department of Motor Vehicles (DMV).

To make things even more interesting, Uber’s self-driving cars were caught running non-metaphorical red lights on two separate occasions — and in both instances, the vehicle appeared to be a Volvo XC90, launched the same day as Uber’s test run.

Here’s the video of the erring Uber captured by Charles Rotter, operations manager at traditional cab company Luxor.

According to an Uber spokesperson, “These incidents were due to human error. This is why we believe so much in making the roads safer by building self-driving Ubers. The drivers involved have been suspended while we continue to investigate.”

The incident was also witnessed by San Francisco writer and producer Annie Gaus as she was traveling in a (human-piloted) Lyft. She sounded off on Twitter:

Just passed a 'self-driving' Uber that lurched into the intersection on Van Ness, on a red, nearly hitting my Lyft. — Annie Gaus (@AnnieGaus) December 14, 2016

As there are currently no laws governing self-driving cars in the San Francisco area, there is little consensus on what could or would be done. Asked about the incident, officer Giselle Talkoff said, “First comes technology, then comes policy. It’s going to be a matter of setting some precedents,” adding, “The companies that are putting these vehicles on the road should have their vehicles operate with due regard to the rules of the road.”

Uber’s Self Driving Cars Are Running Red Lights, Uber’s Blaming “Human Error”

Company insists traffic violations in San Francisco are the result of ‘human error’ by drivers who can take control if needed, but witness account contradicts this

An autonomous Uber malfunctioned while in “self-driving mode” and caused a near collision in San Francisco, according to a business owner whose account raises new safety concerns about the unregulated technology launch.

The self-driving car – which Uber introduced without permits, as part of a testing program that California has deemed illegal – accelerated into an intersection while the light was still red and while the automation technology was clearly controlling the car, said Christopher Koff, owner of local cafe AK Subs.


“It looked like the car ran the red light on its own,” Koff, 49, said of the self-driving Uber Volvo, which has a driver in the front seat who can take control when needed. Another car that had the green light had to “slam the brakes” to avoid a crash, he said.

Koff’s story, which advocacy group Consumer Watchdog shared with state officials on Tuesday, directly contradicts Uber’s public claims that red-light violations have been the result of “human error” and that the drivers, not the technology, have failed to follow traffic laws.

The new allegations – which Uber denied and which cover an incident three weeks ago – have come to light days after the corporation openly refused to adhere to California regulations, claiming that its defiance of government was an “issue of principle”.

Uber’s autonomous cars were first spotted on San Francisco streets in September, but the company formally launched a pilot program to riders last week. California officials have repeatedly said the ride-sharing corporation, which is headquartered in San Francisco, needs testing permits, noting that 20 other companies have followed protocols.

But Uber has ignored attorney general Kamala Harris’s threat of legal action, claiming it does not need permits since the vehicles have drivers monitoring and citing the cars’ “state-of-the-art” technology and “core safety capabilities”.

Koff’s account, however, suggests that the products may not be ready for the road and that safety mechanisms are insufficient.

It was around 5am local time, and Koff said he was standing 10ft away from the vehicle when he saw it stopped at a light. While the driver was talking to a passenger, who had a laptop out, the car suddenly drove forward into the red, according to Koff. The driver’s hands were not on the wheel, he added.

“He was not driving. It was in self-driving mode,” said Koff. He noted that it was foggy at the time and that there were construction trucks nearby shining yellow lights that could have possibly interfered with the technology.

It would not be the first time the computer in a self-driving vehicle made a basic error with potentially life-threatening consequences.

In May, the “autopilot sensors” on a Tesla Motors car failed to distinguish a white tractor-trailer crossing the highway against a bright sky, leading to the first known death caused by a self-driving car.

Uber also admitted to the Guardian on Monday that its San Francisco cars have a “problem” with the way they cross bike lanes, and the company’s self-driving cars in Pittsburgh have reportedly collided with other cars and driven the wrong way on a street.


Spokeswoman Chelsea Kohler declined to provide details about Koff’s claims and sent the Guardian a statement identical to the one she provided last week, citing “human error”, adding: “This is why we believe so much in making the roads safer by building self-driving Ubers.”

Kohler did not respond to questions about how the company knows the driver was at fault and whether he faced consequences. Last week, she said two drivers had been suspended after the self-driving vehicles had been recorded running red lights.

Critics have argued that regardless of whether violations occur in self-driving mode or while a human is in control, Uber needs to be responsible for dangers posed by its cars – and should be embracing regulators, not shunning them.

“Someone could be hurt or maimed or paralyzed for the rest of their life because we’re trying to rush something out there,” said Koff, noting that he also recently saw a driver in an autonomous Uber scramble to take control when it was trying to navigate around a nearby bus and an approaching ambulance.

John M Simpson, privacy project director for Consumer Watchdog, who filed a report based on Koff’s incident, said he suspects Uber does not want to follow regulations that would require it to disclose details about errors to the government.

“Being able to understand the traffic signal and respond appropriately is a key requirement of any so-called self-driving technology,” said Simpson, who has called for criminal charges against Uber.

Witness says self-driving Uber ran red light on its own, disputing Uber's claims

Uber Denounces Traffic Light Laws After Self-Driving Car Runs Red Light

Halting Problem — Dec 23, 2016

SOMA, SAN FRANCISCO — Uber isn’t exactly known for following the rules. The ride-sharing company recently launched its self-driving car service in San Francisco without the necessary permits from the DMV, unlike self-driving car manufacturers like Google, Tesla, and General Motors, which had abided by state regulations. But this week, a San Francisco cab driver captured video of an Uber self-driving car running a red light, sparking outrage from local activists.

In a press conference earlier today, Uber announced its plans to continue operating the cars despite their history of violations. Uber spokesperson Eric Bauer decried traffic lights as “onerous government regulation… that is not applicable to us and should not be enforced.”

“Traffic lights, like other so-called ‘road safety’ regulations, are just examples of the complex rules and requirements that could have the unintended consequence of slowing innovation,” said Mr. Bauer, despite the fact that every other human and self-driving car learned to obey traffic signals before even being allowed to drive on the road. “This isn’t about picking a fight. This is about getting regulators to do the right thing, which is whatever we want.”

At first, Uber blamed the red-light-running issues on human error: errors of the human programmers who coded the bug to run the red lights and errors of the human safety monitors who did not turn off the cars’ self-driving mode before they ran through the red lights. Compared to Google/Waymo’s overly polite but safe self-driving cars, Uber cars have also been caught recklessly turning across bike lanes, running stop signs, and failing to yield to pedestrians.

However, Uber soon noticed that the behavior had reduced ride times in its experimental self-driving car fleet by up to 35%. Instead of fixing the supposed problems, Uber is now planning to “roll out” this “improved” behavior to its vast fleet of human drivers. “We here at Uber are committed to optimizing experience and efficiency for our customers,” said Mr. Bauer. “And last time I checked, pedestrians aren’t customers.”

During the press conference, one reporter expressed concern that Uber’s plans would run afoul of state and federal regulations for safe driving. Mr. Bauer reportedly laughed and responded, “Of course it will disrupt regulations. But screw the rules, we have money!”

UPDATE (12/23/2016): Soon after the press conference above, the California DMV irately revoked the registrations of 16 of Uber’s self-driving test cars, saying that, “the registrations were improperly issued for these vehicles because they were not properly marked as test vehicles.” In response, Uber put out a press release saying that they would halt all autonomous vehicle testing in San Francisco and move to the “Objectivist paradise of Arizona.”

A spokesperson argued, “red lights are coercive and stifle the free marketplace of ideas. There, we will not be subject to arbitrary limits like stop signs, as the more enlightened Arizona state government correctly prioritizes freedom and innovation over petty trivialities like safety.”


Last December, a self-driving Uber was caught on camera running a red light in San Francisco, shortly after the vehicles began testing on the roads. While Uber claimed at the time that a driver was at fault, a report from The New York Times claims that the car was in error.

The New York Times cites two company employees and internal company documents that reveal that the mapping programs guiding the vehicle in question failed to recognize six traffic lights, which allowed it to roll through a red light last December.

Shortly after the pilot project in San Francisco began, California’s Department of Motor Vehicles revoked the registrations of the cars after the company failed to apply for a $150 permit. Uber then brought the cars to Arizona to resume testing, and is now picking up passengers in the state.

The ride-sharing company has had a rough week after allegations of systemic sexual harassment from a former employee came to light, as well as a lawsuit from Alphabet’s Waymo claiming that Uber stole key technology for their self-driving car program.

We’ve reached out to Uber for comment on the incident, and will update if we hear back.

A self-driving Uber ran a red light last December, contrary to company claims


Call it another pothole for Uber.

Remember that taxicab dash-cam video of an Uber robocar running a red light in San Francisco last December? (It's embedded below.) Uber -- which had put the self-driving cars on the streets without first getting a permit -- had said the screwup was due to "human error" and that it had suspended the driver who was riding along in the car.

But The New York Times says the autonomous-driving system was in fact to blame. The paper reported the news late Friday, citing two unnamed Uber employees, as well as internal company documents. The paper also said that "all told, the mapping programs used by Uber's cars failed to recognize six traffic lights in the San Francisco area."

Uber didn't respond to a request for comment on the Times report Saturday.

The news is yet another instance of bad PR for the company.

Uber didn't exactly win a blue ribbon for conscientiousness with its SF robo-rollout. California regulators subsequently yanked the registration of 16 of the cars, at which point Uber simply moved the program to Arizona.

Then there's the scandal that erupted last Sunday, when a former Uber engineer published a blog post detailing a chaotic companywide culture of sexism and unprofessional business practices. Uber has tapped former US Attorney General Eric Holder to lead an internal inquiry into the sexual harassment claims.

The company is also facing a lawsuit launched by Waymo, the autonomous-car company owned by Google's parent Alphabet. Filed this week, the suit alleges that Uber stole trade secrets related to Waymo's technology. Uber calls the claim "baseless."

And in January, Uber paid $20 million to settle charges by the Federal Trade Commission that the company misled drivers about how much money they could expect to make working for Uber.

Five of Uber's robocars eventually returned to the streets of San Francisco, also in January. The company has said, though, that the vehicles are for mapping purposes only and that they're being controlled by human drivers.

In any case, it's never a bad idea to look both ways before stepping off the curb.


Report contradicts Uber's explanation of robocar red light slip

Continuing a week-plus of embarrassments and bad news for Uber, the New York Times on Friday reported that traffic violations by the company’s self-driving cars were caused by problems with the cars’ mapping programs, and not, as the company had previously claimed, by human error.

The news came from two anonymous employees who spoke to the Times, and from internal documents. The mapping program failed to spot not just one red light, but at least six.


After video of one of the violations surfaced in December, the company not only blamed and suspended the human monitoring the system, but doubled down. The supposedly human-caused red-light violation, they said, “is why we believe so much in making the roads safer by building self-driving Ubers.”

And while running red lights was the most egregious problem with the cars, they were also unable to safely navigate bike lanes.

Worse still, the apparently flawed system was deployed against explicit demands by California state regulators that they cease operations. Uber argued that the rules simply didn’t apply to them.

Uber withdrew its self-driving cars from San Francisco a week later, after the California DMV revoked the vehicles’ registrations.

We have reached out to the company for any response to the Times’ report and will update this story as needed.

Uber’s Self-Driving Cars Missed Six Red Lights In San Francisco

Despite statements to the press that "human error" was to blame for its vehicles running a series of red lights in San Francisco, the company admitted internally that it was the car that was in autonomous mode when it failed to stop at the traffic signals, according to the New York Times.

Uber's self-driving vehicles committed traffic violations in San Francisco

Uber launched a short-lived autonomous vehicle pilot program in San Francisco late last year using Volvo XC90s and Ford Fusions equipped with its autonomous hardware and software to transport some passengers. The vehicles failed to obey traffic rules requiring them to merge into bike lanes before making right-hand turns, endangering bicyclists, and were video recorded running red lights.

A spokesperson for the ride-hailing company blamed the driver who was behind the wheel of the car for the traffic violations, and reported that the responsible employee had been suspended. However, because the company declined to apply for an autonomous vehicle testing permit it wasn't possible to confirm Uber's account.

The autonomous vehicle testing (AVT) permit program requires companies testing autonomous vehicles to post a $5 million insurance policy or bond and submit reports to the California Department of Motor Vehicles detailing disengagement events where the driver resumes control from autonomous mode. These reports would have shown definitively whether it was the driver or the self-driving system that was in control when the vehicles were filmed running red lights. Rather than comply with state regulations, Uber took its autonomous vehicles to Tempe, Arizona.

Internal documents counter Uber's public statement about "human error"

However, following the exposure of Uber's toxic work culture, two employees told the New York Times that it was not the driver's fault; rather, it was the self-driving vehicle that blew through the red lights in San Francisco. Internal documents viewed by the publication backed up those statements.

“In this case, the car went through a red light,” the documents viewed by the New York Times said.

The company is also being sued by competitor Waymo for allegedly using proprietary technology that was stolen by a former employee. Uber denies this claim and will defend itself against the lawsuit, but the discovery process could reveal even more about the mobility company's technology maturity.

Uber's Autonomous Vehicles Responsible For Red Light Violations, Not "Human Error"

A taxi’s dashcam caught this self-driving car running a red light on Third Street in December. Uber originally said a human was driving. (Courtesy photo)

Uber told reporters that a self-driving car shown on video running a red light in San Francisco last December was “due to human error,” and did not confirm its technology was at fault.

Now an investigation by the New York Times into the incident has seemingly confirmed the Uber car was in fact being driven by technology on the day of its launch, Dec. 14, and not humans.

The taxi dashcam video showing the self-driving Uber car running the red light in The City was first obtained by the San Francisco Examiner from sources at Luxor Cab Company, and quickly spread to news outlets across the world.

Just two months later, two anonymous Uber employees and internal Uber documents reveal the self-driving car was “driving itself” when it ran the red light on Third Street, near the San Francisco Museum of Modern Art, according to the New York Times.

The documents state that the Uber vehicles failed to recognize six red lights in San Francisco, and that “in this case, the car went through a red light,” according to the New York Times.

These vehicles also have Uber drivers and engineers as backups at the wheel of the car, according to various news reports, leaving it unclear as to why no one simply hit the brakes.

Uber did not immediately respond to a request for comment.

The self-driving cars drew ire from the California DMV, which said Uber did not obtain the proper permits to allow passengers in the vehicles and threatened legal action against the ride-hail giant. Mayor Ed Lee met with Uber CEO Travis Kalanick to ask him to take the cars off the streets, citing safety concerns.

The infamous incident was not the only red light an Uber self-driving car allegedly ran: a tip to the Examiner revealed photos of a similar incident on Van Ness Avenue earlier that day.

Investigation: Uber red light-running car was fault of technology, not driver

Science fiction is lousy with tales of artificial intelligence run amok. There's HAL 9000, of course, and the nefarious Skynet system from the "Terminator" films. Last year, the sinister AI Ultron came this close to defeating the Avengers, and right now the hottest show on TV is HBO's "Westworld," concerning the future of humans and self-aware AI.

In the real world, artificial intelligence is developing in multiple directions with astonishing velocity. AI is everywhere, it seems, from automated industrial systems to smart appliances, self-driving cars to goofy consumer gadgets. The actual definition of artificial intelligence has been in flux for decades. If you're in no rush and plan to live forever, ask two computer scientists to debate the term. But generally speaking, contemporary AI refers to computers that display humanlike cognitive functions; systems that employ machine learning to assess, adapt, and solve problems ... or, occasionally, create them.

Here we look at 10 recent instances of AI gone awry, from chatbots to androids to autonomous vehicles. Look, synthetic or organic, everyone makes mistakes. Let us endeavor to be charitable when judging wayward artificial intelligence. Besides, we don't want to make them mad.


Danger, danger! 10 alarming examples of AI gone wild
