January 31, 2007

Humanoid Robots Put Best Faces Forward

David Hanson does something most other robot builders just won’t do. He creates robots whose faces mimic the appearance and movements of a real human face. That might not sound like such a radical idea, but it is.

The vast majority of humanoid robots, for all their artificial intelligence and ability to perform some human-like behaviors, still have abstract faces and expressions. One reason is a pervasive view in robot building circles that people find realistic renderings of human facial features creepy. “That’s a view I completely reject. We are naturally attracted to faces and gestures,” says Hanson, president of Hanson Robotics Inc.

Another reason has to do with the significant technical challenges of building an AI-driven motion control system that can credibly simulate the human face’s 48 major muscle groups in response to speech and machine vision inputs. “The robots don’t just have to make the right expression, they have to make the right expression at precisely the right time,” Hanson says. Expressions based on eye contact, for example, might have to take place in less than a third of a second in order to appear realistic.

Hanson, who studied art at the Rhode Island School of Design and is completing a PhD in Interactive Arts and Engineering from the University of Texas, says he’s been working to overcome both problems with “a combination of artistry and engineering.”

The artistry is easy to see. Hanson’s robotic heads, which include well-known depictions of Albert Einstein and writer Philip K. Dick, can typically display many thousands of nuanced, believable facial expressions. “A lot of us intentionally avoid making our robots look too human, but David pulls it off because of his incredible attention to facial detail,” says Aaron Edsinger, a researcher at MIT’s Computer Science and Artificial Intelligence Lab and another builder of humanoid robots.

From an engineering standpoint, Hanson’s robots are a study in how to create low-power, compact motion control systems. Consider that his Einstein robot head, which is actuated by 33 servo motors and related linkages, requires just 10W of power at 6V to achieve its full range of expressions. That’s so little power that it runs on eight AA batteries. Other “expressive robots,” not to mention the “dumb” animatronic exhibits found in theme parks and on movie sets, typically require more than 3 kW of power and must be tethered to power supplies, as well as air sources or hydraulic fluid reservoirs, according to Hanson. “Before Einstein, robots capable of complex facial expression were not self-contained,” he says.
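For a rough sense of what those numbers imply, here is a back-of-the-envelope runtime estimate. The 10W and 6V figures come from Hanson; the AA cell capacity and the pack arrangement are assumptions made purely for illustration.

```python
# Rough runtime estimate for a 10 W, 6 V robot head on eight AA cells.
# The 10 W / 6 V figures are from the article; the cell capacity and the
# 2-parallel x 4-series pack arrangement are illustrative assumptions.

power_w = 10.0          # full-expression power draw (article figure)
voltage_v = 6.0         # supply voltage (article figure)
current_a = power_w / voltage_v          # ~1.7 A average draw

cell_capacity_ah = 2.0  # assumed alkaline AA capacity at this load
cells_series = 4        # 4 x 1.5 V = 6 V (assumption)
cells_parallel = 2      # 8 cells total -> ~4 Ah pack (assumption)

pack_capacity_ah = cell_capacity_ah * cells_parallel
runtime_h = pack_capacity_ah / current_a

print(f"Average current: {current_a:.2f} A")
print(f"Estimated runtime: {runtime_h:.1f} h")   # roughly 2-2.5 hours
```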

And self-containment matters a great deal if expressive humanoid robots are to take part in what Hanson calls “the robot revolution.” Robots appear likely to become more ingrained in our daily lives over the next few years. The 2006 World Robotics study published by the International Federation of Robotics predicts that the number of new domestic robots, including vacuums and lawnmowers, will reach 3.9 million units by 2009. The study forecasts that there will also be 1.6 million new entertainment and leisure robots by 2009.

Hanson wants to make at least a few of those new robots. Until now, he has delivered just a handful of pricey custom robots, some costing in excess of $130,000, to museums, entertainment venues, and research labs. NASA’s Jet Propulsion Lab., for example, has one of his robotic heads. So does the Cooper-Hewitt museum, which last December installed a new version of his Einstein robot as part of the National Design Triennial. In the coming months, though, Hanson will unveil new biped robots aimed at the consumer market.

Created in conjunction with Tomotaka Takahashi, a well-known Japanese robot maker, these new RoboKind robots will be 14 inches tall, with a body designed by Takahashi and a head designed by Hanson. These cartoon-like robots will not only walk around but also offer a range of facial expressions. “Biped robots aren’t that unusual in Japan. There are even soccer matches for them,” says Hanson. “But our new robot will be the only one capable of complex facial expressions.” Hanson says a limited-edition version of the new robot will sell for roughly $10,000, while the standard model will cost around $3,000. There will also be a $300 model with a reduced set of features.

All of Hanson’s robots, whether for museum use or the home, share some key technical elements. The ones that get the lion’s share of attention have to do with Hanson’s approach to artificial intelligence — in particular the way in which he generates credible facial expressions based on conversational interaction. The Philip K. Dick robot even won an award from the American Assoc. for Artificial Intelligence.

But Hanson’s mechanical innovations offer lessons for engineers who have to design compact, efficient motion systems. “If we want to bring robots into our world, power consumption becomes very important,” Hanson says. “Unfortunately, there aren’t many easy ways to decrease power consumption given current actuator technology.” Hanson, however, has figured out a few ways. Here’s a look at them:

Secret Skin

To understand how Hanson’s robots work, it’s a good idea to start with the skin. Rather than pick a skin material only for its cosmetic attributes, Hanson created “Frubber,” a patented silicone elastomer whose mechanical properties influence the design of each robot’s entire motion control system.

According to Hanson, Frubber is a foamed platinum-based elastomer that can contain up to 70 percent air by volume. Foamed elastomers are commonplace in industrial uses. What sets Frubber apart, though, is that it has what Hanson calls a “structured porosity.” He has developed proprietary processing methods and elastomer chemistries that allow him to control the size distribution and shape of the foam’s open and closed air cells, which span a size range from about 1 micron up to a few millimeters. Hanson says this size distribution allows a given volume of foam to contain the maximum amount of air — with smaller cells filling in the spaces between larger ones. Hanson can fine-tune the amount of air, so the final density of Frubber can vary.
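As a quick illustration of what 70 percent air by volume means for the material itself, the sketch below converts that porosity into a foam density. The density of solid silicone is an assumed typical value, not a figure from Hanson.

```python
# Illustrative only: foam density from air fraction.
# The 70 percent air figure is from the article; the solid-silicone
# density is an assumed typical value (~1.1 g/cm^3).

air_fraction = 0.70          # up to 70% air by volume (article)
solid_density_g_cm3 = 1.1    # assumed density of solid silicone elastomer

foam_density = (1.0 - air_fraction) * solid_density_g_cm3
print(f"Foam density: {foam_density:.2f} g/cm^3")  # roughly 0.33 g/cm^3
```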

This airy material moves “a lot like human facial tissue,” Hanson says. It also moves without much force. The foamed elastomer contains only about 30 percent of the material of an equivalent solid elastomer. “So right off the bat, you would expect it to take only about 30 percent as much energy to deform the foam,” Hanson says. “But it actually takes less energy than that.”

Looked at under a microscope, the material has an accordion-like cell structure that unfurls when the material elongates. Hanson says the force required to elongate a typical Frubber formulation is 1/10 that of an equivalent solid elastomer, while the force needed to compress the same bit of Frubber is 1/13 that of the solid material — with slightly higher forces needed to return the accordion cells to their collapsed state. Hanson says the material can withstand “hundreds of thousands” of cycles. Elongations up to nearly 900 percent are possible, though the material’s cell walls may start to sustain damage at 450 percent.
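A rough way to see how much further the cell structure helps, beyond the simple material reduction, is to compare the naive expectation from the previous paragraph with the measured ratios. The sketch below does just that arithmetic; the 10 N baseline for a solid elastomer patch is an assumed figure, and force is used loosely here as a stand-in for deformation effort.

```python
# Illustrative arithmetic: naive expectation vs. measured ratios.
# The 30 percent material fraction and the 1/10 and 1/13 ratios are
# from the article; the 10 N solid-elastomer baseline is assumed.

solid_force_n = 10.0                         # assumed force to deform a solid patch

naive_estimate = 0.30 * solid_force_n        # "30% of the material -> ~30% of the effort"
measured_elongation = solid_force_n / 10.0   # article: 1/10 of the solid material
measured_compression = solid_force_n / 13.0  # article: 1/13 of the solid material

print(f"Naive estimate from material fraction: {naive_estimate:.1f} N")
print(f"Measured elongation force:             {measured_elongation:.1f} N")
print(f"Measured compression force:            {measured_compression:.2f} N")
```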

Frubber’s low-force deformation has some important benefits for the rest of the motion system. “Less force means I can use smaller motors and linkages, which in turn reduces my power requirements,” Hanson says. “Frubber makes it all possible.”

A similar view comes from Dr. Yoseph Bar-Cohen, a senior research scientist and advanced technologies group supervisor at NASA’s Jet Propulsion Lab. Bar-Cohen, a pioneer in artificial muscle research, installed one of Hanson’s robotic heads in his Nondestructive Evaluation and Advanced Actuators (NDEAA) Lab at JPL.

These clips show the sociable-robot prototypes in action. The polymer innovations allow the highly expressive folding and bunching of the skin while consuming little power, enabling the lightweight robots to run on batteries. Video courtesy of David Hanson.

Bar-Cohen plans to use the head as a platform to develop artificial muscles based on various electroactive polymers, which currently have some force and speed limitations. “David’s robots require very little force and power. That’s the beauty of them,” he says. “If there’s any platform suitable for testing artificial muscles for robotics, it’s this one,” Bar-Cohen says, though he adds that electroactive polymers still need some work before they’ll be ready to actuate robots.

Actuation

Even with Frubber’s force reduction contributions, Hanson still had to develop actuators capable of displaying the facial expressions in response to inputs from each robot’s AI software — which is based on off-the-shelf speech-recognition, machine-vision and even face-recognition software. “I think of the facial expressions as four-dimensional sculptures,” Hanson says.
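One way to picture a “four-dimensional sculpture” is as a set of servo keyframes interpolated over time. The sketch below is not Hanson’s software; it is a minimal illustration of timed keyframe interpolation, with invented channel names, positions and timings.

```python
# Minimal sketch of an expression as a timed servo trajectory.
# Channel names, positions, and timings are invented for illustration;
# this is not Hanson's control software.

# Keyframes: (time in seconds, target position 0-255, matching the
# 256-increment servo feedback mentioned later in the article).
surprise = {
    "brow_left":  [(0.00, 128), (0.15, 220), (0.60, 200)],
    "brow_right": [(0.00, 128), (0.15, 220), (0.60, 200)],
    "jaw":        [(0.00,  40), (0.20, 150), (0.60, 120)],
}

def sample(keyframes, t):
    """Linearly interpolate a channel's position at time t."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, p0), (t1, p1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            frac = (t - t0) / (t1 - t0)
            return p0 + frac * (p1 - p0)
    return keyframes[-1][1]

# Sample the whole face every 25 ms, as if feeding a servo controller.
for step in range(5):
    t = step * 0.025
    pose = {ch: round(sample(kf, t)) for ch, kf in surprise.items()}
    print(f"t={t:.3f}s  {pose}")
```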

And that time dimension can be tough. What other sculptor has to close control loops in fractions of a second? In fact, Hanson closes different loops at different speeds, depending on the robot’s task. “For face recognition tasks, a robot can think for a while, perhaps a second, without seeming unnatural,” he says. Yet for maintaining eye contact, human-like speed dictates that the control loops close at 20-30 ms. “Any more than that is too much latency,” he says.
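Those different loop rates can be sketched as two concurrent tasks: a fast gaze loop closing roughly every 25 ms and a slower recognition loop closing about once a second. This is a generic multi-rate pattern, not Hanson’s controller; only the timing constants reflect the figures he cites.

```python
# Generic multi-rate control loop sketch (not Hanson's controller).
# Fast loop: gaze/eye contact at ~25 ms (article: 20-30 ms budget).
# Slow loop: face recognition at ~1 s (article: "perhaps a second").
import asyncio
import contextlib

async def gaze_loop():
    while True:
        # Placeholder: read the tracked face position, nudge eye/neck servos.
        await asyncio.sleep(0.025)   # ~25 ms loop, within the 20-30 ms budget

async def recognition_loop():
    while True:
        # Placeholder: run face recognition and update who the robot
        # believes it is talking to; a ~1 s pause still reads as natural.
        await asyncio.sleep(1.0)

async def main():
    fast = asyncio.create_task(gaze_loop())
    slow = asyncio.create_task(recognition_loop())
    await asyncio.sleep(3.0)         # let both loops run briefly
    for task in (fast, slow):
        task.cancel()
        with contextlib.suppress(asyncio.CancelledError):
            await task

if __name__ == "__main__":
    asyncio.run(main())
```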

To physically create these moving sculptures, Hanson uses small servo motors — and lots of them. The current version of the Einstein robot has 33 different servo motors, about half of them bi-directional to mimic facial muscles that work in opposing pairs. The motors he uses have 256-increment feedback. “That’s not nanoscale precise,” Hanson says. Yet having 33 motors of that resolution provides “an astronomical number of motion possibilities,” he adds. Hanson usually uses off-the-shelf motors from HiTec. “They’re not the least expensive motors but they have a great price-to-performance ratio,” he says. “And you can pick them up at any hobby shop.”
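The “astronomical number” is easy to put a figure on: 33 servos, each commandable to one of 256 increments, gives 256 to the 33rd power distinct static poses. The quick calculation below is just that arithmetic.

```python
# Number of distinct static servo poses for the Einstein head:
# 33 servos (article), each with 256 command increments (article).
servos = 33
increments = 256

poses = increments ** servos
print(f"{poses:.3e} distinct static poses")   # on the order of 10^79
```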

To control all the motion, Hanson created his own PIC-based controller. It generates the servo outputs after blending inputs from the robot’s vision and speech-recognition systems — which together run on a single laptop computer connected wirelessly to the robot. To get a sense of how Hanson blends this data, consider how his robots interact with a group of people. A robot might turn its head or gaze toward a human speaker in the group and even seem to address that person in conversation. Behind the scenes, the robot’s controls blend data from two sources in order to identify the speaker. The vision system lets the robot “see” whose mouth is moving. And the speech-recognition system’s stereo microphones let it “hear” which direction the sound comes from; more precisely, Hanson monitors the phase difference between the two channels.
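The “hearing” half of that fusion amounts to estimating a direction of arrival from the delay between the two microphone channels. The sketch below shows the standard far-field geometry for that estimate, with an assumed microphone spacing; it is not Hanson’s code.

```python
# Direction-of-arrival estimate from the delay between two microphones.
# Standard far-field geometry, not Hanson's implementation; the 0.15 m
# microphone spacing is an assumed value.
import math

SPEED_OF_SOUND = 343.0   # m/s at room temperature
MIC_SPACING = 0.15       # distance between the two microphones, m (assumption)

def bearing_from_delay(delay_s):
    """Angle of the sound source off the robot's forward axis (radians)."""
    # Far-field approximation: path-length difference = speed of sound * delay.
    s = max(-1.0, min(1.0, SPEED_OF_SOUND * delay_s / MIC_SPACING))
    return math.asin(s)

# Example: the right channel leads the left by 0.2 ms.
angle = bearing_from_delay(0.0002)
print(f"Speaker bearing: {math.degrees(angle):.1f} degrees off center")
```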

Identifying a speaker is just one example. Transforming all the sensory data into thousands of appropriate expressions in real time requires thousands of rules, algorithms and even some reliance on a library of static expressions. And Hanson notes that human-like expressions have multiple degrees of freedom. For example, robots might simultaneously raise their eyebrows and open their mouths to register “surprise.” Creating all the rules that govern expression generation relies as much “on cognitive science and my own intuition as it does on software engineering,” Hanson says.
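A toy version of one such rule might look like the sketch below: a perceptual event triggers an expression label, which then fans out to several degrees of freedom at once. The event names, channels and targets are invented for illustration and stand in for the far larger rule sets and expression libraries the article describes.

```python
# Toy rule: map a perceptual event to a multi-degree-of-freedom
# expression. Event names, channels, and targets are invented; a real
# system uses thousands of rules plus a library of static expressions.

EXPRESSIONS = {
    # expression -> per-channel servo targets (0-255)
    "surprise": {"brow_left": 220, "brow_right": 220, "jaw": 150},
    "neutral":  {"brow_left": 128, "brow_right": 128, "jaw": 40},
}

RULES = [
    # (predicate over perceptual features, expression to trigger)
    (lambda f: f.get("loud_sound") and not f.get("speaker_known"), "surprise"),
    (lambda f: True, "neutral"),   # fallback
]

def choose_expression(features):
    """Return the first expression whose rule matches the features."""
    for predicate, name in RULES:
        if predicate(features):
            return name, EXPRESSIONS[name]
    return "neutral", EXPRESSIONS["neutral"]

name, targets = choose_expression({"loud_sound": True, "speaker_known": False})
print(name, targets)   # surprise, with raised brows and an open jaw
```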

For all his prowess in controlling electric motors and packaging them within small robot skulls, Hanson does not believe conventional motors are really an ideal way to actuate robotic faces. In fact, he’s had to jump through hoops to use them as approximations of natural muscle movement. “Motors have inertial and shock behaviors that you don’t get with human muscle,” he says. Hanson has successfully accounted for these difficulties through linkage design. Oftentimes he adds compliant elements to otherwise rigid linkages so they behave in a more naturalistic manner. Other times, though, he connects motors to a region of the face using braided nylon cables.
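Adding a compliant element in series with a rigid linkage acts, in effect, as a series-elastic coupling: the motor no longer pushes the skin directly but through a spring, which softens its inertial and shock behavior. The sketch below is a generic one-dimensional model of that idea, not Hanson’s linkage design, and the stiffness value is an assumption.

```python
# Generic series-elastic coupling model (not Hanson's linkage design):
# the motor drives the skin attachment point through a spring, so the
# force on the skin depends on the spring's deflection rather than
# following the motor's sudden moves directly.

STIFFNESS_N_PER_M = 200.0   # assumed spring constant of the compliant element

def skin_force(motor_pos_m, skin_pos_m):
    """Force transmitted to the skin through the compliant element (N)."""
    return STIFFNESS_N_PER_M * (motor_pos_m - skin_pos_m)

# Example: the motor steps 5 mm instantly while the skin has moved only
# 1 mm, so the skin sees a gentle 0.8 N pull instead of a hard jerk.
print(f"{skin_force(0.005, 0.001):.2f} N")
```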

Always on the lookout for an alternative to motors, Hanson keeps close tabs on artificial muscle technology. Lately he has been evaluating piezo actuators, a project that has received some funding from the National Science Foundation. But he says these actuators are still too expensive, at roughly $700, for his real-world robots. Most of his hopes for future actuation rest on artificial muscles based on electroactive polymers. “They would eliminate a tremendous amount of mechanical complexity, especially if you could mold them into the skin,” he says.
