"A person can look at a picture and pick out a feature instantly," Reinholtz said. "Our brains are great at doing that. But it's still really hard for a camera-based system to extract the same kind of information from a series of images."
Engineers are continuing to work on it. Infrared cameras enable vehicles to create thermal images of the scene ahead, making it easier to recognize animals or pedestrians in the vehicle's path. Researchers are also developing software that interprets camera images and warns the vehicle of approaching obstacles.
Radar-based systems are making similar advances. High-frequency units operating at up to 77 GHz can detect obstacles hundreds of meters ahead of a vehicle. Such systems are already making inroads in production vehicles for applications such as adaptive cruise control and collision alerts.
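The article doesn't say how these 77 GHz radars measure range, but production automotive radars commonly use frequency-modulated continuous-wave (FMCW) chirps: the frequency difference ("beat") between the transmitted and reflected chirp is proportional to the target's distance. A minimal sketch, with made-up chirp parameters chosen only for illustration:

```python
# Illustrative FMCW range calculation; the bandwidth and chirp duration below
# are hypothetical examples, not figures from any specific 77 GHz product.

C = 299_792_458.0  # speed of light, m/s

def beat_to_range(beat_hz: float, bandwidth_hz: float, chirp_s: float) -> float:
    """Range from the measured beat frequency: R = c * f_beat / (2 * slope),
    where slope is the chirp's frequency sweep rate in Hz/s."""
    slope = bandwidth_hz / chirp_s
    return C * beat_hz / (2.0 * slope)

# With a 300 MHz sweep over 40 microseconds, a 7.5 MHz beat frequency
# corresponds to a target roughly 150 m ahead.
print(beat_to_range(7.5e6, 300e6, 40e-6))
```

The key design point is that range is read out in the frequency domain, so one FFT over the received signal resolves many targets at once.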
Velodyne LIDAR's HDL-64E uses 64 lasers on a spinning turret
to create a "point cloud" of obstacles on the road ahead.
(Source: Velodyne LIDAR)
For autonomous vehicles, the biggest step forward lies in the use of LIDAR (light detection and ranging) devices. A pulsed laser sends out light, which reflects off obstacles and bounces back to an onboard receiver. Measuring the light's time of flight enables the system to determine the distance to approaching obstacles. Systems such as Velodyne LIDAR's HDL-64E use up to 64 separate lasers on a rotating turret, emitting collimated light pulses at high speed and enabling the vehicle's computers to build a three-dimensional "point cloud" of obstacles.
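The time-of-flight arithmetic behind that measurement is simple: the pulse travels out to the obstacle and back, so the one-way distance is half the round-trip path. A minimal sketch (not any vendor's actual firmware):

```python
# Converting a laser pulse's round-trip time of flight into a distance.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_to_distance(round_trip_seconds: float) -> float:
    """One-way distance to the reflecting obstacle: the pulse covers the
    sensor-to-obstacle path twice, so divide the round-trip length by two."""
    return C * round_trip_seconds / 2.0

# A reflection arriving 667 nanoseconds after the pulse left corresponds
# to an obstacle roughly 100 m away.
print(tof_to_distance(667e-9))
```

The timescales involved explain why LIDAR receivers need sub-nanosecond timing: each additional nanosecond of delay corresponds to only about 15 cm of range.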
"If you're going to make an autonomous decision in real-time, you need to have enough information, just as we do with our eyes, ears, and nose," said Stuart Woods, general manager and executive vice president of Velodyne LIDAR. "And you have to bring the data in very quickly in order to do that."
The road to zero
For engineers, the emergence of such technologies creates a giant challenge. Computing systems (software in particular) must be able to interpret the river of sensor data that arrives every second. "It's all about the software," said Tom Baer, executive director of photonics research at Stanford University. "The integration process -- taking data from many disparate sources -- is the key part. Somehow, you have to take that data and overlap it to form a coherent spatial mosaic that the vehicle can use."
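One concrete piece of the "spatial mosaic" Baer describes is coordinate alignment: each sensor reports obstacles in its own frame, and the software must transform them all into a common vehicle frame before they can be overlaid. A minimal 2-D sketch, with hypothetical mounting positions (not real calibration data):

```python
import math

def to_vehicle_frame(x: float, y: float, yaw: float, tx: float, ty: float):
    """Rotate a 2-D sensor point by the sensor's mounting yaw, then shift it
    by the sensor's mounting offset relative to the vehicle's origin."""
    xv = x * math.cos(yaw) - y * math.sin(yaw) + tx
    yv = x * math.sin(yaw) + y * math.cos(yaw) + ty
    return (xv, yv)

# Hypothetical mounting poses: a forward-facing radar 3.8 m ahead of the
# vehicle origin, and a roof LIDAR 1.5 m forward, rotated 90 degrees.
radar_obstacle = to_vehicle_frame(50.0, 0.0, 0.0, 3.8, 0.0)
lidar_obstacle = to_vehicle_frame(10.0, 0.0, math.pi / 2, 1.5, 0.0)

# Once both detections live in the same frame, they can be overlaid into
# one obstacle map for the vehicle's planner.
mosaic = [radar_obstacle, lidar_obstacle]
print(mosaic)
```

Real systems do this in three dimensions with calibrated rotation matrices and also align the sensors in time, but the principle is the same: disparate readings only become a coherent picture after they share a frame of reference.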