Some day, you’ll see videos of it rescuing battlefield casualties. Or maybe you’ll pass it as it strides down a crowded hospital corridor.
Either way, you’re likely to be unprepared for the first sighting of the Bear (Battlefield Extraction-Assist Robot). This, after all, is no automotive paint robot. Developed by a Massachusetts-based start-up for the U.S. Army, it’s only remotely related to the tens of thousands of industrial robots that have served as its ancestors. On the robotics scale, it’s less like a production-line assembly robot and more like Star Wars’ C-3PO or Schwarzenegger’s Terminator.
In many respects, it seems to be drawn straight from the annals of science fiction. Unlike its predecessors, the Bear’s mobility doesn’t depend on slides or gantries or rotary tables. It has legs, knees, elbows and a face. Soon, it will squat, pick up a 250-lb man and carry him down a flight of stairs. Moreover, it can keep its balance if a wounded soldier moves in its arms. It’s strong; it’s smart; it’s mobile; it’s … humanoid.
“The farther we go with this, the closer it comes to a human android-type form,” notes Gary Gilbert, chief of knowledge engineering for the U.S. Army Telemedicine and Technology Research Center. “The humanoid form has certain characteristics that enable it to do a lot of things we want to do. It turns out maybe evolution knew what it was doing.”
Serving A Market Gap
More than three years in the making, the Bear has indeed adopted the humanoid form, and it’s looking more human all the time. U.S. Army engineers say they wanted those human characteristics because the battlefield tasks it will perform require agility and strength. Initially, those tasks will include rescuing battlefield casualties and disposing of bombs – operations that are better done by robots than by live soldiers.
“It needs to get to a site, perform its mission, return safely, and protect casualties from threats in the environment,” Gilbert says.
While early versions of the Bear used wheels for mobility, the current prototype employs jointed, track-based legs, which will enable it to carry out next-level tasks. Those include attaching itself to a ground vehicle, riding out to a battle scene, getting off the vehicle, finding the casualty, and loading him or her onto a stretcher for evacuation.
Daniel Theobald, president of Vecna Technologies Inc., says he invented the Bear robot because he saw a gap in the mobile robotics marketplace. The market, he says, was made up of two broad classes of robots: small “virtual presence robots” that could scoot beneath a car while searching for bombs; and large, remotely operated vehicles, including tanks and drones.
“We saw a real vacuum between them,” Theobald says. “There was a need for a robot that could go into tight spaces – in buildings and up stairs, where vehicles can’t go. But, at the same time, there was also a need for that robot to be strong enough to manipulate the environment.”
Theobald foresaw his new robot being able to lift hundreds of pounds, thus exceeding the strength of small mobile robots, which typically cannot lift more than about seven pounds. By endowing the robot with such strength, Theobald believed it could fill an important niche – hoisting wounded soldiers, lifting small vehicles, rescuing civilians during nuclear or biological attacks, and checking for bombs beneath the carcasses of dead animals. He even foresaw it being used in hospital settings, where it could move patients in their beds and aid amputees or the elderly.
“You have to be strong in order to manipulate the environment in a significant way,” Theobald says. “You need the ability to pick up people, move rubble, or even lift a car to help someone who’s trapped.”
Making that happen, however, was no small engineering feat. To accomplish it, Theobald considered it critical that the robot use hydraulics to power its upper body. He employed 1,500-psi hydraulic cylinders from Quincy Ortman Cylinders, fed by valves from HydraForce Inc. Vecna’s engineering team also custom-designed an orientation-independent hydraulic reservoir for the system, mainly so that hydraulic fluid wouldn’t spill from the robot when it bent down or assumed an unusual position. The hydraulic system enables the Bear’s upper arms and torso to lift approximately 400 pounds.
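The numbers the article quotes – a 1,500-psi supply lifting roughly 400 lb – can be sanity-checked with basic hydraulics (force = pressure × piston area). The sketch below estimates a cylinder bore; the lever ratio and safety factor are illustrative assumptions, not Vecna’s actual design values.

```python
import math

# Back-of-envelope hydraulic sizing for the figures quoted above.
# Only SUPPLY_PSI and PAYLOAD_LB come from the article; the rest are
# assumed for illustration.
SUPPLY_PSI = 1_500        # system pressure from the article
PAYLOAD_LB = 400          # quoted upper-body lift capacity
LEVER_RATIO = 3.0         # assumed mechanical disadvantage at the joint
SAFETY_FACTOR = 1.5       # assumed design margin

# Cylinder must overcome the payload amplified by the joint geometry
required_force_lb = PAYLOAD_LB * LEVER_RATIO * SAFETY_FACTOR

# Force = pressure x area, so area = force / pressure
required_area_in2 = required_force_lb / SUPPLY_PSI
bore_in = 2.0 * math.sqrt(required_area_in2 / math.pi)

print(f"required cylinder force: {required_force_lb:.0f} lbf")
print(f"piston area: {required_area_in2:.2f} in^2, bore: {bore_in:.2f} in")
```

Even with generous margins, the bore comes out around an inch, which shows why hydraulics can pack so much lifting force into a joint-sized actuator.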
“There’s definitely a trade-off to using hydraulics,” says Theobald, whose graduate work at MIT included developing Web-based control algorithms for a robotic Mars explorer. “Hydraulic systems are fairly heavy and you pay a penalty for that. But the benefit is that you can get all the power into one joint at one time. Whereas, if you distributed electric motors around the robot, you would only get what’s available from each individual motor. To get equal power, you would need huge electric motors.”
Even as it lifts massive weights, however, the Bear faces a separate challenge: balance. To keep it from tipping as it lifts a wounded soldier, steps over a log, or walks down a hill, the Bear’s engineers also employed dynamic balancing.
“When you scale down the footprint of a robot while keeping it fairly massive, you end up with a balance problem,” Theobald says.
Dynamic balancing addressed that issue by keeping the robot’s center of gravity over its legs (or wheels, as the case may be). Much like a human, the Bear had to learn to shift its weight while standing. If it tipped forward, it had to learn to lean back to place its center of gravity over its feet. Conversely, if it tipped backward, it had to lean forward.
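The weight-shifting behavior described above amounts to leaning against the measured tilt so the center of gravity stays over the support base. A minimal sketch of that idea, assuming a simple proportional correction (the gain is an illustrative assumption, not Vecna’s actual control law):

```python
# Minimal sketch of dynamic balancing: oppose any measured tilt with a
# torso lean in the other direction, shifting the center of gravity back
# over the feet. The gain is an assumed, illustrative value.

def balance_correction(tilt_deg, gain=0.8):
    """Return a torso lean command (degrees) opposing the measured tilt.

    A forward tilt (positive) commands a backward lean (negative),
    and vice versa.
    """
    return -gain * tilt_deg

# Tipping forward 5 degrees -> lean back 4 degrees
forward_cmd = balance_correction(5.0)
# Tipping backward 3 degrees -> lean forward 2.4 degrees
backward_cmd = balance_correction(-3.0)
```

A real implementation would also use rate feedback and the support-polygon geometry, but the sign logic – lean opposite the tip – is the heart of it.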
To accomplish that, Theobald and robotics product manager Jamie Nichol incorporated sensors that enable the Bear to keep track of its limbs, torso and legs. A torso-based inertial measurement unit from SpaceAge Control Inc. detects the robot’s attitude, while optical joint encoders from U.S. Digital track angular displacement of body parts. Signals from the encoders are sent to 21 PIC microcontrollers from Microchip Technology Inc., which are incorporated in the robot’s joints. Joint microcontrollers are segregated into five sub-networks: left arm, right arm, left leg, right leg, and torso.
“We wanted to give every joint its own smarts,” notes Nichol, whose Ph.D. work at Stanford included mechatronics and kinematics. “By doing that, we reduced the amount of network traffic in the robot.”
Reducing that network traffic was critical, Vecna engineers say, because the robot employs a central computer that must perform the highest-level chores. The Linux-based central processing computer, an EPIA-M motherboard from Via Technologies, does the more intense computing, including running the custom-designed balancing algorithms and high-level coordination programs. Based on decisions made by those programs, the CPU sends signals to the microcontrollers at the nodes, which “talk” to the motors, valves, and other actuators that power the robot.
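The hierarchy described above – a central computer making high-level decisions, then fanning setpoints out to the five joint sub-networks – can be sketched as follows. The message shape and function names are assumptions for illustration; the article does not describe Vecna’s actual protocol.

```python
# Illustrative sketch of the CPU-to-node dispatch described above.
# The five sub-network names come from the article; everything else
# (message format, function names) is assumed.

SUB_NETWORKS = ("left_arm", "right_arm", "left_leg", "right_leg", "torso")

def dispatch(setpoints):
    """Route per-sub-network setpoints from the central computer toward
    the joint microcontrollers, rejecting any unknown destination."""
    packets = []
    for subnet, value in setpoints.items():
        if subnet not in SUB_NETWORKS:
            raise ValueError(f"unknown sub-network: {subnet}")
        packets.append((subnet, value))
    return packets

# e.g., a balancing decision leans the torso and bends one knee
packets = dispatch({"torso": -4.0, "left_leg": 0.5})
```

Keeping per-joint servo loops on the local microcontrollers means the network only carries these coarse setpoints, which is how the design holds traffic down.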
The Bear’s legs, or wheels, also take direction from microcontrollers located at the knees and hips. While early versions of the Bear employed wheels for movement, current prototypes incorporate four motor-driven tracks. The tracks – located above and below the knees on each leg – are powered by a 2-hp brush-type permanent magnet motor from MagMotor Corp. through a planetary gearbox reduction. In essence, the tracks form the robot’s legs, enabling it to stand straight up and walk, climb stairs, or step over obstacles.
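Only the 2-hp motor figure comes from the article, but it is enough for a rough sense of what the planetary reduction buys. The motor speed, gear ratio, and efficiency below are assumed for illustration, using the standard relation torque (lb-ft) = hp × 5,252 / rpm.

```python
# Rough sketch of the track drive: a 2 hp motor through a planetary
# reduction. MOTOR_HP is from the article; speed, ratio, and efficiency
# are assumed, illustrative values.

MOTOR_HP = 2.0
MOTOR_RPM = 3_000      # assumed brush-motor operating speed
REDUCTION = 20.0       # assumed planetary gear ratio
EFFICIENCY = 0.9       # assumed gearbox efficiency

# Torque (lb-ft) = hp * 5252 / rpm
motor_torque_lbft = MOTOR_HP * 5252 / MOTOR_RPM
output_torque_lbft = motor_torque_lbft * REDUCTION * EFFICIENCY
output_rpm = MOTOR_RPM / REDUCTION

print(f"motor torque: {motor_torque_lbft:.2f} lb-ft")
print(f"track output: {output_torque_lbft:.1f} lb-ft at {output_rpm:.0f} rpm")
```

Trading speed for torque this way is what lets a small, fast motor move a several-hundred-pound robot and payload up a staircase at a controlled pace.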
“We designed two independent legs with tracks, so it’s able to crawl up the stairs while maintaining positive contact with the top of every stair,” Theobald says. By doing so, the massive robot doesn’t break off a chunk of stairway as it ascends or descends. “Obviously, when you’re carrying a human being, you don’t want to take risks you don’t need to take,” he says.
Next Challenge: Autonomy
For reasons such as those, U.S. Army engineers say they’re pleased with the humanoid configuration of the Bear.
“A lot of people said, ‘You don’t need that,’” says Gilbert of the U.S. Army. “They told us, ‘Just use four wheels or a forklift.’ But it turns out those forms can’t negotiate stairs; they can’t corner sharply enough; and they’re not gentle enough.”
Gilbert adds that the Army also hopes to endow future Bear-type robots with autonomous intelligence. Today, he says, the Bear is still tele-operated through remote control. Eventually, though, Army engineers hope to use laser, radar and sonar-type sensors to give future robots the ability to sense, understand and deal with their environments.
“Robots are still in their infancy,” Gilbert says. “And autonomy is still the biggest challenge in robotics.”
For now, though, the Bear’s engineers have succeeded in attaining the program’s first two goals: proof-of-concept and the ability to climb stairs. It can now stand up on its jointed, track-style legs and run. Still ahead, however, is a long list of goals: reacting to unexpected obstacles, running sideways down a hill, and sensing the environment around it.
“The ultimate objective of this program is complete autonomy,” Gilbert says. “And the only way to get there is to start, and that’s what we’ve done.”