Learning to self-navigate
Making that happen won't be easy. Automakers will need to rely on developments outside the prescribed boundaries of the auto industry -- in universities and at supplier facilities.
The biggest (and possibly most important) participants are the sensor suppliers. Makers of accelerometers and gyroscopes are already working with automotive teams on systems such as dead reckoning, which lets a vehicle know where it is. For dead reckoning, engineers blend Global Positioning System (GPS) data with inertial data from the car's onboard sensors, giving the vehicle a sense of where it is on a map at any given moment. Both sources are necessary, engineers say, because GPS alone doesn't update fast enough to pinpoint location: a GPS receiver delivers about five position fixes per second, while inertial sensors may update at rates of 1 kHz or 2 kHz.
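The blending described above can be sketched in a few lines: integrate the high-rate inertial data between fixes, then nudge the estimate toward each low-rate GPS fix. This is a minimal one-dimensional illustration, not any supplier's actual algorithm; the blend gain and helper names are assumptions, while the update rates come from the article.

```python
# Minimal 1-D dead-reckoning sketch (illustrative only): high-rate
# inertial integration between low-rate GPS fixes, with a simple
# complementary-filter correction. The blend gain is an assumption.

IMU_RATE_HZ = 1000   # inertial sensor update rate (~1 kHz, per the article)
GPS_RATE_HZ = 5      # GPS fix rate (~5 per second, per the article)
DT = 1.0 / IMU_RATE_HZ

def dead_reckon(accel_samples, gps_fixes, blend=0.2):
    """Integrate acceleration at IMU rate; correct toward GPS at GPS rate."""
    pos, vel = 0.0, 0.0
    steps_per_fix = IMU_RATE_HZ // GPS_RATE_HZ  # IMU steps between GPS fixes
    gps_iter = iter(gps_fixes)
    for i, accel in enumerate(accel_samples):
        vel += accel * DT            # integrate acceleration -> velocity
        pos += vel * DT              # integrate velocity -> position
        if (i + 1) % steps_per_fix == 0:
            fix = next(gps_iter, None)
            if fix is not None:
                pos += blend * (fix - pos)  # pull drift back toward GPS
    return pos
```

With a real filter (a Kalman filter is the usual choice), the correction weight would vary with the estimated uncertainty of each source rather than being a fixed constant.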
"There's a function called map matching," Muddiman said. "You know all the physical entities -- streets, driveways, intersections -- and you use the inertial data to confirm that you've traversed from one digitized point on the map to another digitized point."
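Map matching of the kind Muddiman describes can be reduced, in its simplest form, to snapping a dead-reckoned position onto the nearest digitized point on the road network. The sketch below is a toy version under that assumption; the map points and function name are hypothetical.

```python
# Toy map-matching sketch: snap a dead-reckoned (x, y) estimate to the
# nearest digitized map point. Real systems also use road connectivity
# and heading, not just distance.

import math

def match_to_map(estimate, map_points):
    """Return the digitized map point closest to the position estimate."""
    x, y = estimate
    return min(map_points, key=lambda p: math.hypot(p[0] - x, p[1] - y))
```

A production matcher would also check that the matched point is reachable from the previous one along the road graph, which is how the traversal "from one digitized point to another" gets confirmed.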
Radar technology, which is already making inroads in adaptive cruise control and collision avoidance, could be employed for forward object detection in autonomous vehicles.
(Source: Freescale Semiconductor)
To get the inertial part of that equation, engineers draw data from accelerometers and gyroscopes. They typically use so-called low-G accelerometers, which sense subtle changes in acceleration and direction down to tenths of a G. These sensors let the vehicle resolve distance and position finely enough that even a small movement, such as a lane change, registers. Gyroscopes add to that knowledge by measuring the vehicle's attitude (pitch, roll, and yaw), filling in data the accelerometers miss.
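The gyroscope's contribution is conceptually simple: it reports angular rates, and integrating those rates over time yields the attitude angles (pitch, roll, yaw) mentioned above. A minimal sketch, with the 1 kHz sample rate taken from the article and everything else assumed:

```python
# Integrate gyroscope angular-rate samples (rad/s) into attitude angles.
# Illustrative only: real systems fuse gyro output with accelerometer
# data to keep the integration from drifting.

GYRO_DT = 0.001  # sample period for a 1 kHz gyro

def integrate_attitude(rate_samples, dt=GYRO_DT):
    """rate_samples: iterable of (pitch_rate, roll_rate, yaw_rate)."""
    pitch = roll = yaw = 0.0
    for p_rate, r_rate, y_rate in rate_samples:
        pitch += p_rate * dt
        roll += r_rate * dt
        yaw += y_rate * dt
    return pitch, roll, yaw
```

Pure integration drifts as small rate errors accumulate, which is one more reason the accelerometer and GPS data are blended back in.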
For engineers, the real trick lies in blending that mountain of data into a coherent picture. That job falls to processors. The sensors themselves may contain onboard processors that filter the digital data and pass it to a baseband applications processor, which also examines the GPS data. Typically, such computing chores can be handled by dual- or quad-core processors. "The application processor takes all the data and compares it to determine if the information it's getting from the GPS system is accurate," Muddiman said.
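The accuracy check Muddiman describes can be pictured as a plausibility gate: a GPS fix that disagrees too sharply with the inertially dead-reckoned position is flagged as suspect. This sketch is an assumption about how such a comparison might look; the 15 m threshold is invented for illustration.

```python
# Sketch of a GPS plausibility check: flag a fix that disagrees with
# the inertial dead-reckoned estimate by more than a threshold.
# The threshold value is an illustrative assumption.

import math

def gps_fix_plausible(gps_pos, inertial_pos, max_gap_m=15.0):
    """True if the GPS fix lies within max_gap_m of the inertial estimate."""
    dx = gps_pos[0] - inertial_pos[0]
    dy = gps_pos[1] - inertial_pos[1]
    return math.hypot(dx, dy) <= max_gap_m
```

In practice the gate would be statistical (e.g. a chi-squared test on the filter's innovation) rather than a fixed radius, but the idea is the same: use the high-rate inertial picture to vet the low-rate GPS one.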
Figuring out a car's location is only one of the chores that make autonomy possible. A larger, more complex task is determining what's in front of the vehicle and whether it's time to stop or go. To do that, researchers are employing stereo vision cameras, radar systems, and lasers.
Vision, which is not yet playing a big role in autonomous vehicles, uses cameras similar to those employed in smartphones. Engineers say those cameras could provide important information, but researchers haven't figured out how to make sense of it all.