Incident 70: Self-driving cars in winter

Description: Volvo's autonomous-driving XC90 SUVs experienced issues in Jokkmokk, Sweden, when sensors used for automated driving iced over during the winter, rendering them useless.
Alleged: Volvo developed and deployed an AI system, which harmed Volvo, drivers in Jokkmokk, and drivers in Sweden.

Suggested citation format

Anonymous. (2016-02-10) Incident Number 70. In McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
70
Report Count
5
Incident Date
2016-02-10
Editors
Sean McGregor, Khoa Lam

CSET Taxonomy Classifications

Taxonomy Details

Full Description

Volvo's autonomous-driving XC90 SUVs experienced issues in Jokkmokk, Sweden, when sensors used for automated driving iced over during the winter, rendering them useless. In response, Volvo moved the sensors behind the windshield, where the wipers can clear away snow and ice in winter weather.

Short Description

Volvo's autonomous-driving XC90 SUVs experienced issues in Jokkmokk, Sweden, when sensors used for automated driving iced over during the winter, rendering them useless.

Severity

Unclear/unknown

AI System Description

Volvo XC90 autonomous driving cars using radar, LIDAR, and sonar

System Developer

Volvo

Sector of Deployment

Transportation and storage

Relevant AI functions

Perception, Cognition, Action

AI Techniques

Lidar technology, radar sensors, environmental sensing

AI Applications

autonomous driving, self-driving vehicle, environmental sensing

Location

Jokkmokk, Sweden

Named Entities

Volvo

Technology Purveyor

Volvo

Beginning Date

2016-02-10T08:00:00.000Z

Ending Date

2016-02-10T08:00:00.000Z

Near Miss

Near miss

Intent

Unclear

Lives Lost

No

Infrastructure Sectors

Transportation

Data Inputs

traffic patterns, radar, LIDAR, video camera footage

Incidents Reports

DETROIT (Bloomberg) -- In Jokkmokk, a tiny hamlet just north of the Arctic Circle in Sweden, where temperatures can dip to 50 below, Volvo Cars’ self-driving XC90 SUV met its match: frozen flakes that caked on radar sensors essential to reading the road. Suddenly, the SUV was blind.

“It’s really difficult, especially when you have the snow smoke from the car in front,” said Marcus Rothoff, director of Volvo’s autonomous-driving program. “A bit of ice, you can manage. But when it starts building up, you just lose functionality.”

After moving the sensors around to various spots on the front, Volvo engineers finally found a solution. Next year, when Swedish drivers take their hands off the wheel of leased XC90s in the world’s first public test of autonomous technology, the radar will be nestled behind the windshield, where wipers can clear the ice and snow.

As automakers race to get robot cars on the road, they’re encountering an obstacle very familiar to humans: Old Man Winter. Simple snow can render the most advanced computing power useless and leave vehicles dead on the highway. That’s why major players including Volvo Cars, owned by Zhejiang Geely Holding Group Co.; Google, a unit of Alphabet Inc.; and Ford Motor Co. are stepping up their efforts to prevent snow blindness.

'A lot of hype'

“There’s been a lot of hype in the media and in the public mind’s eye” about the technology for self-driving cars “being nearly solved,” said Ryan Eustice, an associate professor of engineering at the University of Michigan who is working with Ford on snow testing. “But a car that’s able to do nationwide, all-weather driving, under all conditions, that’s still the Holy Grail.”

The struggle to cure snow blindness is among a number of engineering problems still to be resolved, including training cars not to drive too timidly, causing humans to crash into them, and ethical dilemmas such as whether to hit a school bus or go over a cliff when an accident is unavoidable.

With about 70 percent of the U.S. population living in the snow belt, learning how to navigate in rough weather is crucial for driverless cars to gain mass appeal, realize their potential to reduce road deaths dramatically and overcome growing traffic congestion.

“If your vision is obscured as a human in strong flurries, then vision sensors are going to encounter the exact same obstacles,” said Jeremy Carlson, an IHS Automotive senior analyst who specializes in autonomy.

High-speed sensors

Driverless cars “see” the world around them using data from cameras, radar and lidar, which bounces laser light off objects to assess shape and location. High-speed processors crunch the data to provide 360-degree detection of lanes, traffic, pedestrians, signs, stoplights and anything else in the vehicle’s path. That enables it to decide, in real time, where to go.

Winter makes this harder. Snow can shroud cameras and cover the lane lines they must see to keep a driverless car on course. Lidar also is limited because the light pulses it emits reflect off flakes, potentially confusing a curtain of falling snow with something to avoid, causing the vehicle to hit the brakes.

Radar, which senses objects by emitting electromagnetic waves, is better. It also has the longest track record: It’s been used since 1999 in adaptive cruise control to maintain a set distance from other vehicles.

Self-driving cars succumb to snow blindness as driving lanes disappear

In Jokkmokk, a tiny hamlet just north of the Arctic Circle in Sweden, where temperatures can dip to 50 below, Volvo Cars’ self-driving XC90 sport-utility vehicle met its match: frozen flakes that caked on radar sensors essential to reading the road. Suddenly, the SUV was blind.

“It’s really difficult, especially when you have the snow smoke from the car in front,” said Marcus Rothoff, director of Volvo’s autonomous-driving program. “A bit of ice, you can manage. But when it starts building up, you just lose functionality.”

After moving the sensors around to various spots on the front, Volvo engineers finally found a solution. Next year, when Swedish drivers take their hands off the wheel of leased XC90s in the world’s first public test of autonomous technology, the radar will be nestled behind the windshield, where wipers can clear the ice and snow.

As automakers race to get robot cars on the road, they’re encountering an obstacle very familiar to humans: Old Man Winter. Simple snow can render the most advanced computing power useless and leave vehicles dead on the highway. That’s why major players including Volvo Cars, owned by Zhejiang Geely Holding Group Co.; Google, a unit of Alphabet Inc.; and Ford Motor Co. are stepping up their efforts to prevent snow blindness.

‘A Lot of Hype’

“There’s been a lot of hype in the media and in the public mind’s eye” about the technology for self-driving cars “being nearly solved,” said Ryan Eustice, an associate professor of engineering at the University of Michigan who is working with Ford on snow testing. “But a car that’s able to do nationwide, all-weather driving, under all conditions, that’s still the Holy Grail.”

The struggle to cure snow blindness is among a number of engineering problems still to be resolved, including training cars not to drive too timidly, causing humans to crash into them, and ethical dilemmas such as whether to hit a school bus or go over a cliff when an accident is unavoidable.

With about 70 percent of the U.S. population living in the snow belt, learning how to navigate in rough weather is crucial for driverless cars to gain mass appeal, realize their potential to reduce road deaths dramatically and overcome growing traffic congestion.

“If your vision is obscured as a human in strong flurries, then vision sensors are going to encounter the exact same obstacles,” said Jeremy Carlson, an IHS Automotive senior analyst who specializes in autonomy.

High-Speed Sensors

Driverless cars “see” the world around them using data from cameras, radar and “lidar,” which bounces laser light off objects to assess shape and location. High-speed processors crunch the data to provide 360-degree detection of lanes, traffic, pedestrians, signs, stoplights and anything else in the vehicle’s path. That enables it to decide, in real time, where to go.

Winter makes this harder. Snow can shroud cameras and cover the lane lines they must see to keep a driverless car on course. Lidar also is limited because the light pulses it emits reflect off flakes, potentially confusing a curtain of falling snow with something to avoid, causing the vehicle to hit the brakes.

Radar, which senses objects by emitting electromagnetic waves, is better. It also has the longest track record: It’s been used since 1999 in adaptive cruise control to maintain a set distance from other vehicles.

Key Element

“If everything else fails, I can follow the preceding traffic,” said Kay Stepper, vice president and head of the automated-driving unit at German supplier Robert Bosch LLC. “The radar is the key element of that because of its ability to work robustly in inclement weather.”

One sensor alone will never be enough, however. “You need different types of sensors looking at the same thing, detecting the same object, to very confidently allow the vehicle to do what you expect,” Carlson said.
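As a concrete illustration of Carlson's point about cross-sensor agreement, the short sketch below only "confirms" an object when detections from several sensor types line up. It is a toy example: the sensor names, confidence scores, and thresholds are invented for illustration and are not drawn from Volvo's, Bosch's, or any production stack.

```python
from dataclasses import dataclass

# Toy illustration of cross-sensor agreement: an object is only "confirmed"
# when enough *different* sensor types report it with reasonable confidence.
# All names, scores, and thresholds here are invented for illustration.

@dataclass
class Detection:
    sensor: str        # e.g. "radar", "lidar", "camera"
    confidence: float  # 0.0-1.0, as reported by that sensor's own pipeline

def object_confirmed(detections: list[Detection],
                     min_sensor_types: int = 2,
                     min_confidence: float = 0.5) -> bool:
    """Accept an object only if several distinct sensor types agree on it."""
    agreeing = {d.sensor for d in detections if d.confidence >= min_confidence}
    return len(agreeing) >= min_sensor_types

# In heavy snow the lidar return is weak, so radar alone is not enough to act on;
# in clear weather all three sensor types agree and the object is confirmed.
snowy = [Detection("radar", 0.9), Detection("lidar", 0.3)]
clear = [Detection("radar", 0.9), Detection("lidar", 0.8), Detection("camera", 0.7)]
print(object_confirmed(snowy))  # False
print(object_confirmed(clear))  # True
```

Real stacks typically fuse tracks probabilistically rather than with a hard vote, but the underlying idea of requiring cross-sensor agreement is the same.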

Google, based in Mountain View, California, is searching for solutions by logging snow miles with its self-driving Lexus SUVs near Lake Tahoe, on the Nevada-California border. Ford is testing driverless Fusion sedans in snowstorms at the University of Michigan’s Mcity, a 32-acre (13-hectare) faux neighborhood for robot cars on the Ann Arbor school’s North Campus. Both companies declined interview requests.

Ford believes it has found a solution to snow-blanketed lane lines, it said in a press release. It scans roads in advance with lidar to create high-definition 3-D maps that are much more accurate than images from global-positioning satellites, which can be 10 meters (33 feet) off.

Pinpoint Location

Eustice, who has worked with the Dearborn, Michigan, company on the problem since 2012, said they’ve also found a way to filter the “noise” created by falling snowflakes. The filtered data combined with information from the 3-D maps enable the car to pinpoint its location to within “tens of centimeters,” he said.

“That’s high enough accuracy that we know exactly what lane we’re in,” and “helps the robot to underst
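The article does not spell out how the falling-snowflake "noise" is filtered, but one generic family of techniques drops sparse, isolated lidar returns (stray flakes) while keeping dense clusters (vehicles, walls, road furniture). The sketch below illustrates that idea only; the radius and neighbour thresholds are arbitrary, and this is not the actual filter used by Ford and the University of Michigan.

```python
import numpy as np

def remove_sparse_returns(points: np.ndarray,
                          radius: float = 0.3,
                          min_neighbors: int = 4) -> np.ndarray:
    """Keep only points with at least `min_neighbors` other points within `radius` metres.

    Falling flakes tend to show up as isolated returns; solid objects form dense
    clusters. Thresholds are illustrative, not tuned values from any real system.
    """
    # O(n^2) pairwise distances are fine for a small illustrative cloud;
    # a real pipeline would use a spatial index (k-d tree, voxel grid, ...).
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    neighbor_counts = (dists <= radius).sum(axis=1) - 1  # exclude the point itself
    return points[neighbor_counts >= min_neighbors]

# Example: a dense "wall" of returns plus scattered "snow" clutter.
rng = np.random.default_rng(0)
wall = rng.normal(loc=[10.0, 0.0, 1.0], scale=0.05, size=(200, 3))
snow = rng.uniform(low=[0.0, -5.0, 0.0], high=[20.0, 5.0, 3.0], size=(100, 3))
cloud = np.vstack([wall, snow])
filtered = remove_sparse_returns(cloud)
print(f"{len(cloud)} points -> {len(filtered)} after filtering")  # snow points mostly removed
```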

Self-Driving Vehicles Meet Their Match When Snow Creates Sensor Blindness

There will come a day when automated driving technology steps out of the controlled testing environments and into the real world. For Volvo, that day will arrive in 2017. But are self-driving cars ready to face the elements?

Volvo describes its upcoming Drive Me project as “the first real world trial of autonomous cars.” Yes, there are currently self-driving cars circulating the streets of San Francisco and other locations. But the difference with Drive Me is: these will not be test vehicles. These will be production XC90s with real customers behind the wheel. In 2017, Volvo will start to lease 100 SUVs equipped with IntelliSafe Autopilot which will take to the suburban roads of Gothenburg, Sweden. The technology will finally allow them to experience fully automated driving (SAE Level 4) while on some of the city’s main highway-like commuter routes. Real world testing is an important step in the journey towards safer roads – as Dr Erik Coelingh, Senior Technical Leader for Safety and Driver Support at Volvo explains: “Drive Me is not just about developing a demo car or concept car; it is about developing something that ordinary people can use in their daily lives on the roads as we know them today.”

But with the real world roll-out comes a real world problem: the weather. “When you go and deploy the technology in the real world, you have to be able to deal with all weather conditions that may occur,” Coelingh says, and concedes: “We also know that the technology will not work in all weather conditions – so we have to prepare for that. It might be different in the long-term future, but on day one there will be limitations as to what the car can and cannot deal with.”

Prevailing problems

So what are these limitations? As drivers, most of us don't like seeing a weather report that reads: "Heavy snow throughout the day. Motorists advised to take extreme caution." Driving rain or a blanket of snow on the ground suddenly makes driving a whole lot harder. Dealing with poor visibility, loss of traction and buried reference points tests even the best drivers. But this is not only the case for humans. Such conditions also challenge the sensors that are fundamental to automated driving.

Typically, LiDAR sensors, which emit short pulses of laser light, co-operate with cameras to sense nearby objects and allow the automated vehicle to create a real-time, high-definition 3D image of its surroundings. This works extremely well in fine weather. But what happens when the cameras and sensors can't see the road markings because they are buried in snow? What if the sensor lens is covered by dirt or the snowflakes are mistaken for objects? Then, the autonomous technology has a real problem. Not usually an issue in sunny California, of course - but Gothenburg has been known to host some harsh winters. Since safety is paramount for the Drive Me project, Volvo's approach is to know its enemy. "Knowing the limitations of the technology, you have to get information about the weather conditions - lighting conditions, visibility conditions etc. - in order to answer the question: should the self-driving mode be made available, yes or no?" Dr Coelingh explains.

Snowy, with a chance of driverless cars

Dr Coelingh’s team has therefore created a mechanism by which the Drive Me cars will make this decision. “We have designed it so that each car will make an assessment to see if the conditions are ok for self-driving. That assessment will then be sent via a connectivity link to the Volvo cloud. Here, data from all cars will be aggregated and a decision is made if the weather in Gothenburg is appropriate for self-driving or not.” Dr Coelingh goes on to explain how self-driving mode will be allowed if an approval signal is sent back to the car, after which the driver is offered autonomous mode via the interface. This signal will only be sent when several elements align: the car must have the correct map versions and connection to the server. The traffic and weather conditions must be within scope. “The challenge then becomes making that scope sufficiently big. That means being robust in different weather conditions,” Coelingh explains.

For this, often the simple ideas are the best: “One thing we have already done for the new XC90 is to place the main radar behind the windscreen so that the area in front will always be cleared by the wipers.” But there are more ways to make the car more weather-resistant: “We will have active cleaning on the sensors on the outside of the car to keep them free from any dirt or debris. Moreover, the XC90 will have optimized cameras which will mean a better performance in low light.”
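Coelingh’s description amounts to a simple availability-gating protocol: each car assesses its local conditions, the Volvo cloud aggregates assessments across the fleet, and autonomous mode is offered only when map version, server connection, and weather/traffic scope all line up. The hypothetical sketch below mirrors that logic; every field name, threshold, and version string is invented for illustration, as Volvo's real interfaces are not described in the article.

```python
from dataclasses import dataclass

# Hypothetical sketch of the Drive Me gating logic described above:
# self-driving is offered only when the car's own checks pass AND the cloud,
# after aggregating the fleet's weather assessments, approves. All names and
# thresholds are invented for illustration.

REQUIRED_MAP_VERSION = "2017.1"  # assumed value

@dataclass
class CarStatus:
    map_version: str
    server_connected: bool
    weather_ok: bool   # the car's local assessment (visibility, precipitation, ...)
    traffic_ok: bool

def cloud_approves(fleet_weather_reports: list[bool], min_fraction_ok: float = 0.8) -> bool:
    """Aggregate weather assessments from all cars; approve only if most report OK."""
    if not fleet_weather_reports:
        return False
    return sum(fleet_weather_reports) / len(fleet_weather_reports) >= min_fraction_ok

def offer_autonomous_mode(car: CarStatus, fleet_weather_reports: list[bool]) -> bool:
    """Autonomous mode is offered to the driver only when every precondition aligns."""
    return (car.map_version == REQUIRED_MAP_VERSION
            and car.server_connected
            and car.weather_ok
            and car.traffic_ok
            and cloud_approves(fleet_weather_reports))

# Example: this car is happy locally, but most of the fleet reports heavy snow,
# so the cloud withholds approval and self-driving mode is not offered.
car = CarStatus("2017.1", server_connected=True, weather_ok=True, traffic_ok=True)
print(offer_autonomous_mode(car, fleet_weather_reports=[False, False, True, False]))  # False
```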

The extended forecast

One fact remains: at present, there is no state-of-the-art automated driving technology that works well on snow-covered roads. Yet the ability to do so is somethi

Self-driving cars in winter

Fully automated cars don’t drink and drive, fall asleep at the wheel, text, talk on the phone or put on makeup while driving. With their sensors and processors, they navigate roads without any of these human failings that can result in accidents.

But there is something self-driving cars do not yet deal with very well – the unexpected. The human brain is still better than any computer at making decisions in the face of sudden, unforeseen events on the road – a child running into the street, a swerving cyclist or a fallen tree limb.

Here are five situations that, for now at least, often confound self-driving cars and the engineers working on them.

5 Things That Give Self-Driving Cars Headaches

Concerns raised about future testing as footage suggests fatal collision in Arizona was a failure of the system’s most basic functions

Video of the first self-driving car crash that killed a pedestrian suggests a “catastrophic failure” by Uber’s technology, according to experts in the field, who said the footage showed the autonomous system erring on one of its most basic functions.

Days after a self-driving Uber SUV struck a 49-year-old pedestrian while she was crossing the street with her bicycle in Tempe, Arizona, footage released by police revealed that the vehicle was moving in autonomous mode and did not appear to slow down or detect the woman even though she was visible in front of the car prior to the collision. Multiple experts have raised questions about Uber’s Lidar technology, which is the system of lasers that the autonomous cars use to “see” the world around them.

“This is exactly the type of situation that Lidar and radar are supposed to pick up,” said David King, an Arizona State University professor and transportation planning expert. “This is a catastrophic failure that happened with Uber’s technology.”

The videos of the car hitting Elaine Herzberg also demonstrated that the “safety driver” inside the car did not seem to be monitoring the road, raising concerns about the testing systems Uber and other self-driving car companies have deployed in cities across the US.

“This safety driver was not doing any safety monitoring,” said Missy Cummings, a Duke University engineering professor who has testified about the dangers of self-driving technology. Research has shown that humans monitoring an automated system are likely to become bored and disengaged, she said, which makes this current phase of semi-autonomous testing particularly dangerous.

“The problem of complacent safety drivers is going to be a problem for every company.”

The footage “strongly suggests a failure by Uber’s automated driving system and a lack of due care by Uber’s driver”, Bryant Walker Smith, a University of South Carolina law school professor and autonomous vehicle expert, said in an email. He noted that the victim is visible about two seconds before the collision, saying: “This is similar to the average reaction time for a driver. That means an alert driver may have at least attempted to swerve or brake.”

The car was traveling at 38 miles per hour at 10pm on Sunday, according to the Tempe police chief, Sylvia Moir, who told a reporter that she thought the video showed Uber was not at fault. Experts who reviewed the footage, however, said the opposite appeared to be true.
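As a rough sanity check on Smith’s two-second figure, the arithmetic below shows how much road that window covers at the reported speed. The deceleration value is an assumed, illustrative figure and does not come from the report.

```python
# Back-of-the-envelope check of the two-second window described above.
# 38 mph comes from the report; the braking deceleration is an assumed round
# figure for a passenger car on dry pavement, used only to show the scale.

MPH_TO_MS = 0.44704
speed = 38 * MPH_TO_MS                              # ~17.0 m/s
window = 2.0                                        # seconds the victim was reportedly visible
assumed_decel = 7.0                                 # m/s^2, illustrative hard-braking value

distance_covered = speed * window                   # ~34 m travelled during that window
braking_distance = speed**2 / (2 * assumed_decel)   # ~21 m to stop once braking begins

print(f"Covered in {window:.0f} s: {distance_covered:.0f} m")
print(f"Illustrative braking distance: {braking_distance:.0f} m")
```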

“I really don’t understand why Lidar didn’t pick this up,” said Ryan Calo, a University of Washington law professor and self-driving expert. “This video does not absolve Uber.”

An Uber self-driving Volvo fitted with ‘Lidar’ technology. Photograph: Uber Handout/EPA

Even though the video appeared dark, King said there was likely more visibility than the footage suggested and noted that the darkness should not affect the car’s detection abilities.

“Shadows don’t matter to Lidar,” added Cummings. “There is no question it should have been able to see her.”

Police have emphasized that the victim was not in a crosswalk at the time of the crash, but experts said the technology still should have stopped the vehicle, a Volvo, and King noted that the exact section where Herzberg entered the street is a common area for pedestrians to cross near a local park.

John Simpson, the privacy and technology project director with Consumer Watchdog, said the video revealed a “complete failure” of Uber’s technology and its safety protocols, and said all testing programs on public roads should be suspended while the case is under investigation.

“Uber appears to be a company that has been rushing and taking shortcuts to get these things on the road,” said Simpson, noting that Arizona leaders lured the corporation to its state with promises of fewer regulations, after Uber fought with California over its vehicles running red lights. “It’s inexcusable.”

Uber, which temporarily suspended testing, declined to comment on the causes of the crash. A spokesperson said in a statement that the video was “disturbing and heartbreaking”, adding: “Our cars remain grounded, and we’re assisting local, state and federal authorities in any way we can.”

Uber crash shows 'catastrophic failure' of self-driving technology, experts say
