Artificial Nervous System Gives Robots Unprecedented Sense of Touch

Researchers at the National University of Singapore have developed an artificial nervous system that can be paired with sensor-based skins to give robots and prosthetics a human-like tactile sense.

Elizabeth Montalbano

September 23, 2019


One of the key capabilities researchers are trying to achieve with robots and prosthetic devices is a sense of touch akin to how humans feel.


The Asynchronous Coded Electronic Skin (ACES) developed by Assistant Professor Benjamin Tee (far left) and his team at NUS responds 1000 times faster than the human sensory nervous system. (Image source: NUS)

We’ve already told you how a collaboration between UCLA and the University of Washington developed a flexible sensor “skin” that can be stretched over any part of a robot’s body or prosthetic. Now researchers at the National University of Singapore (NUS) have tackled the same challenge in a complementary way, developing an artificial nervous system they said can give robots and prosthetics a sense of touch on par with or even better than human skin.

The Asynchronous Coded Electronic Skin (ACES), created by a team led by Assistant Professor Benjamin Tee in the NUS Department of Materials Science and Engineering, can be paired with any type of sensor system, such as the one mentioned above, to function as an electronic skin for robots and prosthetic devices, he told Design News.

“We wanted to create the world's most advanced prosthetic skin that can transmit the richness of human tactile sensations for truly lifelike prosthetics,” Tee told us.

When the researchers set out to do this, they found that what’s missing from current technologies is the speed and efficiency with which human skin reacts, he said.

“This is because our skin sensors work all the time, and this requires a sophisticated nervous system,” Tee told Design News. “In humans, we have the peripheral nervous system that our skin cells connect to. However, such a nervous system is still missing for artificial electronic skins. Hence, we set out to create an artificial nervous system for skin sensors.”

Independence Equals Speed

ACES does that by taking a different approach to “feeling” than previously developed electronic “skins,” Tee said. “The main concept is [that] each sensor is independent, and we developed the nervous system or communication system that allows for that,” he told Design News.

Most similar technologies developed so far have relied on scanning each sensor pixel sequentially, Tee said. The NUS researchers instead created a nervous system based on individual response, which allows for faster transmission of “feeling” data, he said.

“In our approach, we created a nervous system that allows us to have each sensor pixel respond individually, without need for sequentially reading each pixel,” Tee told us. “For example, if we have 1,000 sensors, all of them can send the data out at the same time. In sequential systems, if each sensor takes 1 millisecond, that would take 1 full second to transmit all the data from every sensor pixel.”
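To make that arithmetic concrete, here is a rough Python sketch of the difference; the 1,000-pixel count and 1-millisecond read time are the illustrative figures Tee cites, and the model itself is a simplification of ours, not the ACES implementation.

```python
# Rough latency comparison between sequential polling and independent,
# concurrent reporting. The pixel count and read time are the illustrative
# figures cited above; the model is a simplification, not ACES itself.

NUM_SENSORS = 1_000
READ_TIME_S = 0.001  # assume 1 millisecond to read one sensor pixel


def sequential_scan_latency(num_sensors: int, read_time_s: float) -> float:
    """Time to collect one full frame when pixels are read one after another."""
    return num_sensors * read_time_s


def independent_latency(read_time_s: float) -> float:
    """If every sensor transmits on its own, frame time is bounded by a single
    sensor's response time rather than by the pixel count."""
    return read_time_s


print(f"Sequential scan:     {sequential_scan_latency(NUM_SENSORS, READ_TIME_S):.3f} s")  # 1.000 s
print(f"Independent sensors: {independent_latency(READ_TIME_S):.3f} s")                   # 0.001 s
```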

This independence also eliminates a limitation of typical sensor skins: an interlinked wiring system that makes them sensitive to damage and difficult to scale up, Tee said. And while the artificial nervous system detects signals much like its human counterpart, it is built differently; instead of the nerve bundles found in human skin, it is made up of a network of sensors connected via a single electrical conductor.
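For a feel of what that design implies in practice, here is a minimal, hypothetical sketch of event-driven touch reporting over a single shared channel; the event fields, queue, and sensor IDs are stand-ins of ours, since the article does not spell out ACES’s actual signaling scheme.

```python
# Hypothetical sketch of event-driven reporting over one shared channel,
# in the spirit of the single-conductor design described above. The event
# fields and queue are illustrative stand-ins; ACES's real signaling
# scheme is not detailed in this article.

import queue
import time
from dataclasses import dataclass


@dataclass
class TouchEvent:
    sensor_id: int     # which pixel fired
    timestamp: float   # when it fired
    pressure: float    # contact strength, arbitrary units


shared_channel = queue.Queue()  # stands in for the single electrical conductor


def sensor_fires(sensor_id: int, pressure: float) -> None:
    """Each sensor reports on its own, only when it detects contact; no central scan."""
    shared_channel.put(TouchEvent(sensor_id, time.monotonic(), pressure))


# Two pixels touched at nearly the same moment; a damaged pixel elsewhere
# would not break the shared wiring for the rest.
sensor_fires(sensor_id=42, pressure=0.8)
sensor_fires(sensor_id=913, pressure=0.3)

while not shared_channel.empty():
    event = shared_channel.get()
    print(f"sensor {event.sensor_id} reported pressure {event.pressure}")
```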

Researchers described their work in a paper in the journal Science Robotics.

Superhuman Skin?

In some ways, ACES performs even better than human skin in terms of tactile feedback, Tee said. The system can detect touches more than 1,000 times faster than the human sensory nervous system; it is capable of differentiating physical contact between different sensors in less than 60 nanoseconds, which researchers said is the fastest ever achieved for an e-skin technology.

At the same time, ACES also can accurately identify the shape, texture, and hardness of objects within 10 milliseconds, which is 10 times faster than the blinking of an eye, Tee told us. He attributed this capability to the high fidelity and capture speed of the system.

Researchers envision that their system could lead to better prosthetic devices that are more intuitive and give the people wearing them a more lifelike experience, he told Design News. ACES could also help create robots that perform advanced tasks, such as more complex surgeries than what’s possible with surgical robots today, Tee said.


The team plans to continue to improve the technology as well as integrate it with prosthetic hands that will be used in a clinical study to see how the system performs in a real-life setting, he added.

“We hope that in the near future, losing a limb will no longer be as debilitating,” Tee told Design News. “Maybe we can even gain greater skills with such prosthetic devices.”

Elizabeth Montalbano is a freelance writer who has written about technology and culture for more than 20 years. She has lived and worked as a professional journalist in Phoenix, San Francisco and New York City. In her free time she enjoys surfing, traveling, music, yoga and cooking. She currently resides in a village on the southwest coast of Portugal.

Correction: A prior version of this article misattributed the research to the Singapore University of Technology and Design. All research was done by the National University of Singapore. We regret the error. 

