Incident 182: Two Cruise Autonomous Vehicles Collided with Each Other in California

Description: In San Francisco, an autonomous Cruise Chevrolet Bolt collided with another Cruise vehicle driven by a Cruise human employee, causing minor scuffs to the cars but no human injuries.
Alleged: Cruise developed and deployed an AI system, which harmed Cruise vehicles and a Cruise driver employee.

Suggested citation format

Lam, Khoa. (2018-06-11) Incident Number 182. In McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
182
Report Count
2
Incident Date
2018-06-11
Editors
Sean McGregor, Khoa Lam


Incident Reports

Humans are wild beasts, and they're not getting off the road anytime soon.

Stop me if you’ve heard this one before. On June 11, a self-driving Cruise Chevrolet Bolt had just made a left onto San Francisco’s Bryant Street, right near the General Motors-owned company’s garage. Then, whoops: Another self-driving Cruise, this one being driven by a Cruise human employee, thumped into its rear bumper. Yes, very minor Cruise on Cruise violence.

According to a Department of Motor Vehicles report, the kind any autonomous vehicle tester must submit to the state of California after any incident, both vehicles escaped with only scuffs. “There were no injuries and the police were not called,” Cruise reported.

A single incident does not a metaphor about self-driving technology make, but Cruise has had flurries of bumping and rear-ending incidents in San Francisco, where it has tested its technology since 2016. Many of these are unserious and relatively unremarkable, the sort of thing that might happen to a human driver and that an insurance company would never hear about.

Some are scarier, meriting check-ins to the hospital or legal wranglings. A California motorcyclist filed a lawsuit against GM, alleging a lane-changing Cruise AV knocked him off his bike and injured his back and shoulder. (GM settled the suit in June.) Some have been weird. One Cruise car got slapped by a cabbie. Another took a golf ball to the windshield while driving near a city course. (No, yelling “fore!” does nothing for a robot.)

Why the bumps and bruises? Well, because humans. To its credit, Cruise has chosen to test its cars in a super-challenging environment, the dense and oft-surprising streets of San Francisco. (In January, at least one pedestrian leapt into a Mission neighborhood crosswalk, “shouting, and struck the left side of the Cruise AV's rear bumper and hatch with his entire body,” according to a DMV report.) Here, there are many opportunities to capture data on edge cases, the sorts of road activity (Traffic! Weird lane changes! Foul fog! Construction zones!) that self-driving cars need to understand before they can perform perfectly every time.

The company also says it purposefully programs its cars to be almost too-cautious, to brake when, for example, a cyclist even hints that she might be darting across the road. Last year, CEO Kyle Vogt told reporters that Cruise wants to nail safety before it can focus on smoothing out the herky-jerky behavior that might leave riders a bit queasy, and fellow road users a bit confused. (The company plans to launch a limited driverless taxi service in 2019.)

That said, the rear-endings demonstrate that the technology is far from perfect. Cruise cars follow road laws to a T, coming to full stops at stop signs and braking for yellow lights. But human drivers don’t—and Cruise cars will be self-driving among humans for decades to come. “There has to be a way for these cars and people to share the road in a more efficient and understanding manner,” a Cruise spokesperson said.

And that’s annoying, because humans are deeply imperfect. The fact that a driver trained by Cruise to work with these vehicles still managed to rear-end one underscores just how flawed humans are. To create a robot that operates with perfect safety among people, the vehicles just might have to learn to emulate some of their worst qualities. Just as long as they don't start slapping people.

A Cruise-on-Cruise Crash Reveals the Hardest Thing About Self-Driving Tech

In a strange twist of fate bordering on the ironic, an autonomous vehicle recently collided with another autonomous vehicle while traveling the streets of San Francisco.

No matter. AV tech still came out ahead, according to the autonomous vehicle crash report provided by the California Department of Motor Vehicles.

While both Chevrolet Bolts involved in the June 11 crash are autonomous vehicles owned and operated by GM Cruise, the vehicle at fault had been functioning in conventional mode at the time of the fender bender. That’s according to GM Cruise, which, as required by California, reported the incident.

So far for 2018, AV companies have filed 29 accident reports with the DMV. That’s out of a total of 82 collisions since the DMV began tracking such incidents in 2014. For 2017, a total of 28 AV accidents were reported, which makes 2018 the worst year yet for AV accidents.

Yes, there are more AVs on the road in California, so naturally we can expect to see more accidents. This, of course, opens the door to even more AV data, particularly in congested metros where driving challenges are constant.
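The DMV figures quoted above imply how many reports were filed before 2017. A minimal sketch of that arithmetic (using only the totals stated in this article):

```python
# DMV autonomous vehicle collision report counts cited above.
total_since_2014 = 82   # all reports since tracking began in 2014
reports_2018 = 29       # filed so far in 2018
reports_2017 = 28       # filed in 2017

# Reports filed across 2014-2016 combined, implied by the figures above.
earlier_years = total_since_2014 - reports_2018 - reports_2017
print(earlier_years)  # 25
```

In other words, the pace of reported collisions has roughly doubled: about 25 reports over the first three years of tracking, versus 28 or more in each of the two most recent years.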

Once an accident occurs, the AV company or companies involved must submit an autonomous vehicle collision report. Here’s what GM Cruise wrote in describing the accident last month between their two Chevy Bolts:

A Cruise autonomous vehicle (“Cruise AV”) was rear-ended while operating in autonomous mode on Bryant Street between 10th Street and 11th Street. The Cruise AV had just made a left turn from 11th Street onto Bryant when a second Cruise autonomous vehicle directly behind operating in conventional mode contacted the rear bumper of the Cruise AV, causing minor scuffs to both vehicles. There were no injuries and the police were not called.

A check box section on the form also allows reporting parties to describe the weather, lighting, roadway surface, roadway conditions, movement preceding collision, type of collision and other factors. Vehicle damage can be noted by shading in boxes of a rectangular graphic that resembles a van (shown above).

Of course, the real irony here is that in a state where businesses face some of the toughest regulations in the nation, AV companies—so long as there’s no major damage or injuries—can work on their own to report these incidents to the state. No government involvement is required.

As we’ve reported before, it’s hard to forget some of these man vs. machine moments. Consider this July 8 incident description, also submitted by GM Cruise:

A Cruise autonomous vehicle (“Cruise AV”) while operating in autonomous mode was involved in an accident on westbound Sutter Street at the intersection of Sansome Street when a jaywalking pedestrian approached the Cruise AV and intentionally stepped up onto the hood of the vehicle while the Cruise AV was stopped at a red light resulting in a dent on the hood. The pedestrian then stepped off and walked away. There were no injuries and the police were not called.

That last line is pretty typical throughout these reports. But in the case of the hood-stomping pedestrian (we’re assuming it’s a man, but who knows), it would have been a good time to call the police. AVs are clearly marked and easy to spot. This pedestrian may have had Luddite leanings, but we’ll never know. The offender may habitually walk on the hoods of any vehicles that get in his way. (Good luck doing that with a Class 8 rig.)

So yes, it’s wrong that the police were not notified and that this person got away with property damage. It’s also wrong that no video was released. Doing so would have generated a mountain of sympathy for the nerdy Bolt, put like-minded half-wits on notice and brought in a gazillion hits for media outlets around the globe. And who knows? Maybe it would have led to an arrest, a hefty fine and several ‘be nice’ indoctrination classes.

Following some unsavory headlines involving AVs, the burgeoning technology needs all the help it can get. Hopefully, the next time a Chevy Bolt or any other AV in California is trampled underfoot, the police will be notified and a video will make the rounds. No one—even if they reside at the nation’s capital of all things rebellious—should get away with intentional property damage.

Autonomous vehicles crash together in California