To navigate snowy roads, Ford autonomous vehicles are equipped with high-resolution 3D maps – complete with information about the road and what’s above it, including road markings, signs, geography, landmarks and topography. Photo courtesy of Ford.

Snow-covered roads in Michigan have provided a rigorous testing ground for Ford Fusion Hybrid autonomous research vehicles this past winter, and the challenging road conditions have also helped Ford demonstrate the versatility of the 3D digital mapping capabilities made possible by recent advances in LiDAR (light detection and ranging) systems.

To operate in snow, Ford Fusion Hybrid autonomous vehicles first need to scan the environment to create high-resolution 3D digital maps. By driving the test route in ideal weather, an autonomous vehicle creates highly accurate digital models of the road and surrounding infrastructure using four LiDAR scanners. These scanners generate a total of 2.8 million laser points a second, according to Ford.
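
Ford hasn't published its mapping pipeline, but the basic idea can be sketched in a few lines of Python: each LiDAR sweep, tagged with the vehicle's pose from the survey drive, is transformed into a common map frame, and the accumulated cloud is thinned so the prior map stays a manageable size. The function names, 10-centimeter voxel size and synthetic sweeps below are illustrative assumptions, not Ford's code.

```python
# Minimal sketch of offline map building (illustrative only, not Ford's
# pipeline). Each LiDAR sweep is assumed to come with an accurate vehicle
# pose recorded during the ideal-weather survey drive.
import numpy as np

def transform_points(points_xyz, rotation, translation):
    """Move points from the sensor frame into the world (map) frame."""
    return points_xyz @ rotation.T + translation

def build_map(sweeps, voxel_size=0.10):
    """Accumulate posed sweeps, then keep one point per 10 cm voxel
    so the stored map stays a manageable size."""
    world_points = np.vstack([
        transform_points(points, rotation, translation)
        for points, rotation, translation in sweeps
    ])
    voxel_ids = np.floor(world_points / voxel_size).astype(np.int64)
    _, keep = np.unique(voxel_ids, axis=0, return_index=True)
    return world_points[keep]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two fake sweeps stand in for real LiDAR data: same scene, two poses.
    sweep = rng.uniform(-20.0, 20.0, size=(5000, 3))
    pose_a = (np.eye(3), np.zeros(3))
    pose_b = (np.eye(3), np.array([1.0, 0.0, 0.0]))  # vehicle moved 1 m
    prior_map = build_map([(sweep, *pose_a), (sweep, *pose_b)])
    print(prior_map.shape)
```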

The resulting map then serves as a baseline that’s used to identify the car’s position when driving in autonomous mode. By scanning the environment in real time with its LiDAR sensors, the car can later locate itself within the mapped area, even when the road is covered in snow.

While mapping their environment, Ford autonomous vehicles collect and process a diverse set of data about the road and surrounding landmarks — signs, buildings, trees and other features. The car collects up to 600 gigabytes per hour, which it uses to create a high-resolution 3D map of the landscape, Ford said.

Ford’s autonomous vehicles generate so many laser points from the LiDAR sensors that some can even bounce off falling snowflakes or raindrops, returning the false impression that there’s an object in the way. Of course, there’s no need to steer around precipitation, so Ford — working with University of Michigan researchers — created an algorithm that recognizes snow and rain, filtering them out of the car’s vision so the vehicle can continue along its path.
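
The Ford and University of Michigan algorithm itself hasn't been published, but the general idea can be illustrated: snowflakes and raindrops tend to show up as isolated laser returns, while real obstacles form dense clusters, so sparse points can be discarded. The radius and neighbor-count thresholds in this sketch are illustrative assumptions, not the researchers' values.

```python
# Illustrative precipitation filter (not the actual Ford/University of
# Michigan algorithm). Snowflakes and raindrops usually produce isolated
# returns, so points with too few nearby neighbors are treated as
# precipitation and removed.
import numpy as np
from scipy.spatial import cKDTree

def filter_precipitation(points_xyz, radius=0.5, min_neighbors=4):
    """Keep only points with at least `min_neighbors` returns (including
    themselves) within `radius` meters; dense clusters from real objects survive."""
    tree = cKDTree(points_xyz)
    neighbor_counts = tree.query_ball_point(points_xyz, r=radius, return_length=True)
    return points_xyz[neighbor_counts >= min_neighbors]

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    wall = np.column_stack([np.full(2000, 10.0),      # dense planar "wall"
                            rng.uniform(-5, 5, 2000),
                            rng.uniform(0, 3, 2000)])
    flakes = rng.uniform(-10, 10, size=(200, 3))      # sparse airborne returns
    cloud = np.vstack([wall, flakes])
    print(len(cloud), "->", len(filter_precipitation(cloud)))
```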

When the subject of vehicle navigation comes up, most people think of GPS. But where current GPS is accurate to just more than 10 yards, autonomous operation requires precise vehicle location. By scanning their environment for landmarks, then comparing that information to the 3D digital maps stored in their databanks, Ford’s autonomous vehicles can precisely locate themselves to within a centimeter, Ford said.
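
Ford hasn't described its localizer in detail, but the map-matching concept can be sketched: start from the coarse GPS fix, test candidate positions around it, and keep the one where the live LiDAR scan best lines up with the stored map. The search window, step size and synthetic data below are illustrative assumptions; a real system would also estimate heading and run continuously.

```python
# Toy map-matching localizer (illustrative, not Ford's system): refine a
# coarse GPS fix by scoring candidate positions against the stored 3D map.
import numpy as np
from scipy.spatial import cKDTree

def localize(map_points, live_scan, gps_guess, search_radius=2.0, step=0.25):
    """Grid-search x/y offsets around the GPS guess; score each candidate by
    the mean distance from the shifted scan to its nearest map point."""
    map_tree = cKDTree(map_points)
    offsets = np.arange(-search_radius, search_radius + step, step)
    best_position, best_score = None, np.inf
    for dx in offsets:
        for dy in offsets:
            candidate = np.asarray(gps_guess, dtype=float) + np.array([dx, dy, 0.0])
            distances, _ = map_tree.query(live_scan + candidate)
            if distances.mean() < best_score:
                best_position, best_score = candidate, distances.mean()
    return best_position

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    prior_map = rng.uniform(-30.0, 30.0, size=(20000, 3))
    true_position = np.array([1.5, -0.75, 0.0])
    # The live scan is what the prior map looks like from the true position.
    scan = prior_map[rng.choice(len(prior_map), 2000, replace=False)] - true_position
    scan += rng.normal(scale=0.02, size=scan.shape)  # sensor noise
    print(localize(prior_map, scan, gps_guess=[0.0, 0.0, 0.0]))
```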

In addition to LiDAR sensors, Ford uses cameras and radar to monitor the environment around the vehicle, with the data generated from all of those sensors fused together in a process known as sensor fusion. This process results in 360-degree situational awareness, according to Ford.
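
Ford doesn't disclose how its fusion stack works, but the core idea can be shown with a toy example: independent, noisy measurements of the same quantity are combined, with each sensor weighted by how much it can be trusted. The distance readings and variances below are invented for illustration.

```python
# Generic illustration of sensor fusion (Ford's actual fusion stack is not
# public): combine independent range estimates from LiDAR, camera and radar
# into one estimate, weighting each sensor by the inverse of its noise
# variance, so the most trustworthy sensor counts the most.
def fuse_measurements(measurements):
    """measurements: list of (value, variance) pairs from different sensors."""
    weights = [1.0 / variance for _, variance in measurements]
    fused_value = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)  # fused estimate is tighter than any single sensor
    return fused_value, fused_variance

# Example: distance to a vehicle ahead, in meters, with per-sensor variances.
lidar = (24.9, 0.05)   # precise range
radar = (25.3, 0.20)   # robust in snow, but a noisier range
camera = (24.5, 0.60)  # least precise range measurement of the three
print(fuse_measurements([lidar, radar, camera]))
```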

Sensor fusion means that the loss of a single sensor, perhaps because of ice, snow, grime or debris buildup on its lens, doesn’t necessarily hinder autonomous driving. Still, Ford autonomous vehicles monitor all LiDAR, camera and radar systems to identify any deterioration in sensor performance. This helps keep the sensors in ideal working order, Ford said. Eventually, the cars might be able to handle ice and grime buildup themselves through self-cleaning or defogging measures.
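
Ford hasn't detailed its diagnostics either, but one simple way to spot a deteriorating sensor is to compare its data rate against a known-good baseline, since an iced-over or grime-covered LiDAR returns far fewer points. The class below is a hypothetical sketch; the 2.8-million-points-per-second baseline is Ford's own figure, while the window and threshold are assumptions.

```python
# Hypothetical health monitor (not Ford's diagnostics): flag a sensor whose
# data rate drops well below its normal baseline, which is what ice or grime
# buildup on a lens tends to cause.
from collections import deque

class SensorHealthMonitor:
    def __init__(self, expected_rate, window=50, min_ratio=0.6):
        self.expected_rate = expected_rate  # e.g., LiDAR points/sec when clean
        self.recent = deque(maxlen=window)  # rolling window of observed rates
        self.min_ratio = min_ratio          # below 60% of normal -> degraded

    def update(self, observed_rate):
        self.recent.append(observed_rate)
        average = sum(self.recent) / len(self.recent)
        return average >= self.min_ratio * self.expected_rate  # True = healthy

monitor = SensorHealthMonitor(expected_rate=2_800_000)  # Ford's quoted LiDAR rate
print(monitor.update(2_750_000))  # True: normal output
print(monitor.update(900_000))    # True: one low reading doesn't trip the threshold yet
print(monitor.update(900_000))    # False: a sustained drop flags degradation
```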

The company's winter weather road testing takes place in Michigan, including at Mcity, a 32-acre, full-scale simulated urban environment at the University of Michigan. Ford’s testing on this purpose-built campus supports the company’s mission to advance autonomous driving.
