Newshounds and auto aficionados were baffled last week after seeing what looked like a coffee maker atop the new prototype version of Google’s self-driving car. What, some wondered, was that doing up there?
As it turns out, Google isn’t brewing espresso. The cylindrical unit mounted on a bracket atop the car’s roof is a spinning drum containing 64 lasers. The drum, which rotates 20 times per second, gives the car’s computers a 360-degree view of its surroundings. Taking 1.3 million measurements every second, it enables Google’s autonomous car to drive forward, back up, change lanes, and make turns by “seeing” virtually every object in its vicinity before it acts.
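A quick back-of-the-envelope check shows what those quoted figures imply per sweep. This is an illustrative sketch using only the numbers in the article, not Velodyne specifications:

```python
# Rough arithmetic on the figures quoted above (illustrative only)
MEASUREMENTS_PER_SEC = 1_300_000  # quoted sensor throughput
ROTATIONS_PER_SEC = 20            # quoted drum spin rate
LASERS = 64                       # laser channels in the drum

# Points captured in one full 360-degree sweep of the drum
per_revolution = MEASUREMENTS_PER_SEC / ROTATIONS_PER_SEC

# Points each individual laser contributes per sweep
per_laser = per_revolution / LASERS

print(f"{per_revolution:.0f} points per revolution, "
      f"~{per_laser:.0f} per laser")
```

At the quoted rates, each revolution yields roughly 65,000 range measurements, or on the order of a thousand per laser line.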
Google’s self-driving vehicle draws information from the front, sides, and rear by using a rotating laser system atop its roof.
“The car needs to know what’s happening around it,” Wolfgang Juchmann of Velodyne LIDAR told Design News. “The most important focus is forward, of course, because that’s the direction you usually drive. But it needs to know if someone is coming up alongside, or passing from behind.”
Built by Velodyne and known as the HDL-64E LIDAR, the unit operates by making “time of flight” calculations -- it sends out a pulse of light and measures how long the pulse takes to return. From that round-trip time, it knows the distance to every object within a 100-meter radius. Its 64 laser lines are oriented at different angles, including up, down, and everything in between. A user-selectable frame rate of 5 Hz to 15 Hz lets users tailor how much data the system captures.
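The time-of-flight principle itself is simple arithmetic: the pulse travels to the object and back at the speed of light, so the one-way distance is half the round trip. A minimal sketch of the idea (not Velodyne’s firmware):

```python
# Time-of-flight ranging: distance = (speed of light x round-trip time) / 2
C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, given the pulse's round-trip time."""
    return C * round_trip_seconds / 2.0

# A pulse returning after ~667 nanoseconds came from roughly 100 m away,
# the sensor's stated range limit.
print(tof_distance_m(667e-9))
```

The tiny timescales involved are the point: at 100 meters, the entire round trip takes well under a microsecond, which is how the unit can fit over a million measurements into each second.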
Velodyne LIDAR’s HDL-64E uses “time of flight” calculations to measure the distance to every object within 100 meters.
(Source: Velodyne LIDAR)
In demonstrations of the HDL-64E, Juchmann typically uses a laptop computer to process the data. Google, however, must provide additional computing power to enable the car to process data from other sensors -- such as cameras, gyroscopes, and accelerometers -- and to help it make driving decisions. “The car needs a more powerful computer,” Juchmann told us. “It has to recognize, is there a pedestrian in my path? Is it a bicycle? How fast is it going? Should I brake? Should I turn?”
Still, Juchmann said that the view of the autonomous car as a “supercomputer on wheels” is fading. The insides of Google’s self-driving Lexus and autonomous Prius are “very clean,” not cluttered with computers, he said.
In a blog post on its website, Google said it is building approximately 100 of the prototype vehicles. All are designed from the ground up and have no steering wheels or foot pedals. The company said it plans to start testing early versions of them later this summer.
But the auto industry’s migration toward autonomous vehicles will be a long process, Juchmann warned. “Because Google is making everyone aware of the self-driving car, it’s coming along faster than anyone expected 10 years ago,” he said. “But the car companies are doing this step by step.”