Does Uncle Sam need fully autonomous robotic vehicles? With billions of development and procurement dollars now going into unmanned military vehicles for the air, land and sea, you might think the answer is an unqualified “yes.” But the answer is not so straightforward when you throw the notion of complete autonomy into the mix.
The military already uses thousands of unmanned vehicles and will likely use many more in the coming years. Ground troops in Iraq and Afghanistan, for example, have employed small robots for tasks such as bomb disposal. The Army’s Future Combat Systems program, meanwhile, includes small and large unmanned ground vehicles, from a 30-lb Small Unmanned Ground Vehicle to 2.5-ton Multifunctional Utility/Logistics and Equipment vehicles. Unmanned underwater vehicles are likewise being pressed into Naval service for tasks such as searching for mines.
And the skies above our war zones swarm with unmanned aircraft. Steve Zaloga, a senior military analyst with the Teal Group and one of the authors of the group’s 2008 study on unmanned aerial vehicles (UAVs), says the military had 520 UAVs in 2006, up from just 127 in 2002. “Not only are there a lot more UAVs but they’re actually being used more often, too,” Zaloga says, adding that these unmanned aircraft chalked up more than 160,000 flight hours in 2006, up from 30,000 hours in 2002.
The Teal Group study projects nearly $55 billion will be spent on UAVs over the next 10 years, with spending growing from $3.4 billion in 2008 to $7.3 billion by 2017. “Almost one hundred percent of that spending will come from the military. There’s almost no civilian UAV market at the moment,” Zaloga says.
Unmanned, however, doesn’t necessarily mean autonomous. Unmanned vehicles tend to fall somewhere on a spectrum of autonomy — one that ranges from full remote control by a human operator to independent action based entirely on programming. According to Zaloga, unmanned aircraft developers have shied away from total autonomy because of the military’s belief that humans can best process real-time intelligence data collected by reconnaissance UAVs — not to mention make the best firing decisions in the case of armed UAVs. “Even newer UAV platforms that could be operated without any human control probably won’t be,” he predicts.
Ground vehicles, by contrast, recently moved in the opposite direction when it comes to autonomous operations. Last month, the prospects for truck-sized robotic vehicles took a big step forward when the Defense Advanced Research Projects Agency (DARPA) held its Urban Challenge race, the most difficult test of full-size robotic vehicles to date.
Unlike two previous races that took place on fixed courses in the desert, the Urban Challenge took place on the streets of George Air Force Base, a Victorville, CA, facility the military has used to train troops for urban warfare. The race required the robotic vehicles, 11 finalists in all, to navigate through a course that simulated a variety of city driving conditions. The robots had to merge into moving traffic, make their way through intersections and traffic circles, find and drive into parking spots — all the while avoiding other moving vehicles and static obstacles. And in what may be a big improvement over the many human drivers who barely passed their driver’s test, these robots had to obey California driving laws to avoid disqualification.
The vehicles had to run the entire race with no one in the driver’s seat or at the end of a remote-control joystick. Instead, these vehicles made driving decisions on their own, using massive amounts of on-board computing power, banks of sensors and sophisticated software. “All the vehicles out there at the challenge were simply amazing. Eleven autonomous vehicles were able to drive on a complicated course while interacting with each other and 50 other human-driven vehicles. It was a big day for robotics,” says Chris Urmson, technology director for Carnegie Mellon’s Tartan Racing team, the race’s winner.
Averaging about 14 mph over 55 miles, Tartan’s “Boss,” a robotized 2007 Chevy Tahoe bristling with sensors (see photo below), finished the race about 20 minutes ahead of the second-place team, Stanford Racing. Tartan Racing took home a $2 million cash prize for its efforts. And DARPA ended up with a successful demonstration of robot technologies that could make large, fully autonomous vehicles a deployable reality.
Yet, the biggest winner may ultimately be consumers. Whether the technologies found on the Boss and the other finalists make it to the battlefield or not, some will likely end up on civilian automobiles in the coming years. “Some of the systems we used on Boss are very similar to what’s already available on commercial driver assistance systems,” says Michael Darms, a project engineer with Continental Automotive and a member of the Tartan Racing team. “But the challenge also provided a lot of insight into using new autonomous features in ways that could help drivers in the future.” These features might include improved assistance with braking, lane-changing, parking and intersection safety.
One interesting aspect of the race relates to what Tartan Racing and the other finalists were able to accomplish with technology that’s fully commercial right now, or about to be as soon as the next automotive model year.
Boss includes a variety of commercial and next-generation radar systems from Continental, as well as laser sensors from Velodyne and SICK. Like many of the finalists, Boss relied on an off-the-shelf Applanix vehicle positioning system that combines global positioning and inertial data to keep track of the vehicle location. The system gets its processing power from a rack of 10 Intel Core Duo processors and Tartan Racing’s engineers used common software development tools. “It’s true that we used many readily available technologies,” says Urmson. “But it’s how we used them that made all the difference.”
Urmson points to two tough robotics challenges Tartan Racing managed to overcome with its winning vehicle. One relates to system integration and control architecture. “The scope of the system is kind of large,” Urmson says.
And “large” is something of an understatement. Boss takes in, analyzes and acts on information collected by 19 different sensors and a positioning system. And it does so within a 0.2-sec overall response time, which roughly equals the reaction time of a typical human driver. Urmson says Boss’ distributed control system at any given time has about 100 processes running on that bank of Intel Core Duos. “About half of them are logging data and the other half are involved in decision making, motion planning and high level perception,” he says. Such is the complexity of fully autonomous driving that Boss generates about a terabyte of telemetry data every 12 hours. Its software consists of about 300,000 lines of code. “Actually given the complexity of what we’re trying to do, we were kind of proud that the software was only 300,000 lines,” Urmson says.
The second robotics challenge, perhaps one with much wider implications than Tartan Racing’s one-off systems integration, comes down to the team’s ability to give Boss human-like driving smarts, a task that required breakthroughs in the robot’s sensor-based perception system as well as in its motion-planning and behavioral algorithms.
The perception problem is a particularly difficult one in Urmson’s view. “Understanding what’s out there in the world, what’s moving, what’s not and where the vehicle is relative to all of that is a challenge,” he says. It’s a challenge Tartan engineers solved with a technology that integrates data from complementary sensors into a detailed picture of the robot’s environment. As Urmson explains, radar units primarily handle the long-range sensing, out to about 150 m, while the laser sensors provide the close-range sensing. Typically the two kinds of sensors work together, with some overlap, to identify moving vehicles and static obstacles. “A radar sensor alone can’t necessarily differentiate a car from a pop can or other metallic object in the distance,” he says. “So at close range we use the laser sensor to confirm the radar input or reject any false positives.”
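The confirm-or-reject scheme Urmson describes can be sketched in a few lines of Python. The sketch below is purely illustrative, with made-up ranges, thresholds and class names rather than Tartan Racing’s actual code: it keeps distant radar hits as-is and, at close range, keeps a radar hit only when a laser return corroborates it.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str       # "radar" or "lidar"
    range_m: float    # distance to the object
    bearing_deg: float

def fuse(radar_hits, lidar_hits, confirm_radius=2.0, close_range=30.0):
    """Keep long-range radar hits as-is; at close range, keep a radar hit
    only if a lidar return falls near the same spot, rejecting false positives."""
    confirmed = []
    for r in radar_hits:
        if r.range_m > close_range:
            confirmed.append(r)      # beyond lidar reach: trust radar alone
            continue
        if any(abs(l.range_m - r.range_m) < confirm_radius and
               abs(l.bearing_deg - r.bearing_deg) < 2.0 for l in lidar_hits):
            confirmed.append(r)      # lidar corroborates the radar return
    return confirmed
```

A real fusion system would also track objects over time and estimate their velocities; this captures only the false-positive filtering Urmson mentions.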
Darms, whose day job at Continental and doctorate in engineering both involve driver assistance technology, calls this sensing system Sensor Fusion and argues it makes Boss unique from a perception standpoint. “This is the first time so many sensors of different types have been combined so successfully in an automotive application,” he says.
As for motion planning, Team Tartan engineers wrote complex algorithms that can generate optimal trajectories for the vehicle in both roadways and the lane-less spaces such as parking lots. Urmson says Boss is capable of calculating 1,000 trajectories/sec. “On the roads, the idea is that we roll out dynamically feasible trajectories in front of the vehicle, ones the vehicle could drive given the bounds of its acceleration. We then check each one to see if it intersects with any obstacle. From the trajectories that haven’t hit anything, we pick one that keeps us the most central in the lane,” he says. In parking lots, an even more complicated algorithm is used to account for all the different ways a vehicle can traverse open space.
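Urmson’s roll-out-and-check description maps onto a simple sampling planner. The sketch below is a toy version with invented steering options, step sizes and clearance radius, not the team’s algorithm: it rolls a small fan of constant-steer arcs forward, discards any arc that passes too close to an obstacle, and picks the survivor whose endpoint stays nearest the lane center.

```python
import math

def rollout_and_pick(x, y, heading, speed, obstacles, lane_center_y=0.0,
                     steer_options=(-0.2, -0.1, 0.0, 0.1, 0.2),
                     dt=0.1, steps=20):
    """Roll out a fan of constant-steer arcs, drop any that come within
    1.5 m of an obstacle, and pick the one ending closest to lane center."""
    best_steer, best_offset = None, float("inf")
    for steer in steer_options:
        px, py, ph = x, y, heading
        hit = False
        for _ in range(steps):
            ph += steer * dt                    # simple kinematic update
            px += speed * math.cos(ph) * dt
            py += speed * math.sin(ph) * dt
            if any(math.hypot(px - ox, py - oy) < 1.5 for ox, oy in obstacles):
                hit = True                      # trajectory intersects an obstacle
                break
        if not hit:
            offset = abs(py - lane_center_y)    # how far off lane center we end up
            if offset < best_offset:
                best_steer, best_offset = steer, offset
    return best_steer
```

Boss evaluated on the order of 1,000 such trajectories per second; this fan of five is only meant to show the shape of the idea.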
Boss’ list of robotic breakthroughs also includes a new behavioral system that governs what Urmson calls “tactical driving decisions.” This system, for example, helps Boss deal with unexpected circumstances in intersections — not just “who goes first” but “what if the wrong car goes first” or “what if a car stops in the middle of the intersection.”
This system, which wasn’t present in Tartan Racing vehicles used in DARPA’s previous Grand Challenges, represents Boss’ common sense. Urmson describes the behavioral system as “basically rule-based” but adds that it has other types of intelligence built in, as well. He’s seen it return some helpful emergent behaviors — that is, behaviors not covered explicitly in Boss’ programming. “This is a very active area of research in robotics in general. How do you give robots common sense or the appearance of common sense,” he says.
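A rule-based behavioral layer of the kind Urmson describes can be caricatured as a precedence check plus a recovery rule. The rules and the timeout below are hypothetical, not Boss’ actual logic; the last rule is the sort of thing that lets a robot recover when the car with precedence never moves.

```python
def intersection_action(my_arrival_rank, intersection_clear, wait_time_s,
                        precedence_timeout_s=10.0):
    """Toy rule-based precedence logic for a four-way stop (illustrative only):
    go when it's our turn and the box is clear; if a car with precedence
    stalls, eventually claim the turn rather than wait forever."""
    if not intersection_clear:
        return "wait"                    # a vehicle is stopped inside the box
    if my_arrival_rank == 0:
        return "go"                      # we arrived first: our turn
    if wait_time_s > precedence_timeout_s:
        return "go"                      # car with precedence never moved: recover
    return "wait"
```

In practice the interesting behaviors, the emergent ones Urmson mentions, come from many such rules interacting with perception and planning, not from any single check.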
If there was any aspect of Boss’ design that didn’t give the engineering team much trouble, it was the electromechanical actuation of the vehicle controls. Throttle control took place via the Tahoe’s engine control module. “We send a few bits over CANbus,” Urmson says. Brake, steering wheel and shifter actuation all required small electric motors in the cockpit, though Urmson thinks that in a few years these three actuation tasks will all be done by wire, as the throttle is.
Urmson notes the actuation technology is better understood in the engineering community than robotic perception and behavior technologies. But he doesn’t downplay the importance of getting the actuation to work well. “Nothing on a project like this is simple,” he says. “It’s all varying degrees of hard.”
All that hard work doubtless has some military value, but it won’t necessarily put any robot trucks on the battlefield for a few years.
As the Teal Group’s Zaloga points out, full-sized robotic ground vehicles may initially encounter some of the same objections associated with UAVs. “There’s still some fear that fully autonomous ground vehicles may lack the situational awareness and common sense of a human driver,” Zaloga says. And a lack of common sense might manifest itself as diminished ability to adapt to difficult terrain, avoid capture or mount a defense against attacks.
With fully autonomous vehicles still in development, it’s too soon to tell whether these objections will win out over the potential troop-safety benefits. In the meantime, don’t be too surprised if you see a driverless SUV pulling into a parking space at your local supermarket.