In what could be a major advance for both industrial robots and prosthetic hands, researchers at the USC Viterbi School of Engineering have developed a specially designed robot that can outperform humans in identifying a wide range of natural materials by sensing their textures. The robot uses a new type of tactile sensor built to mimic the human fingertip, together with algorithms that imitate human strategies for identifying textures by touch.
The BioTac tactile robotic sensor is built to mimic the human fingertip and uses algorithms imitating human strategies to identify textures by touch. (Source: SynTouch)
Built by Gerald Loeb, professor of biomedical engineering and director of the USC Medical Device Development Facility, and biomedical engineering doctoral candidate Jeremy Fishel, the multimodal BioTac fingertip-shaped sensor is filled with conductive fluid. Fingerprint-like ridges on its flexible elastomeric "skin" are spaced 0.4mm apart, mimicking human fingerprints. When the BioTac slides over a textured surface, vibrations are induced in the skin that propagate through the fluid and are detected by a pressure sensor, while electrodes in contact with the fluid sense changes in its impedance as the skin deforms.
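To give a feel for the kind of signal processing this implies, the sketch below computes a simple vibration feature (spectral power in a band) from a sampled pressure-sensor trace. The sampling rate, band edges, and choice of feature are illustrative assumptions, not details of the published BioTac pipeline.

```python
import numpy as np

def vibration_power(pressure_trace, sample_rate_hz=2200.0,
                    band_hz=(60.0, 700.0)):
    """Estimate vibration power in a frequency band from a pressure-sensor trace.

    Illustrative only: the sample rate and band edges are assumptions,
    not the BioTac's actual processing parameters.
    """
    # Remove the static (DC) pressure so only the vibration remains.
    ac = pressure_trace - np.mean(pressure_trace)

    # Power spectrum via FFT.
    spectrum = np.abs(np.fft.rfft(ac)) ** 2
    freqs = np.fft.rfftfreq(len(ac), d=1.0 / sample_rate_hz)

    # Sum the power inside the band of interest.
    in_band = (freqs >= band_hz[0]) & (freqs <= band_hz[1])
    return spectrum[in_band].sum() / len(ac)
```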
The testbed apparatus consists of a stepper motor attached to a lever that raises and lowers the BioTac onto and off of textured surfaces; contact force is controlled by adjusting the stepper motor's vertical position. A special vibration-free linear stage slides the textures past the BioTac to emulate lateral motion. The textured surfaces are attached to flat, square magnets that can be quickly mounted on and removed from a steel plate attached to the linear stage (watch a video below).
The robot draws on a database of 117 different textures derived from common materials, which serves as the equivalent of the previous experiences a person would use to distinguish a new texture. After making an average of five exploratory movements, the robot could correctly identify novel materials 95 percent of the time. On pairs of very similar textures that human subjects could not tell apart, the robot was correct 99.6 percent of the time.
To differentiate a new texture from a set of plausible candidates, the discrimination algorithm adaptively selects, based on previous experience, the best exploratory movement to make and the most informative property to measure. Loeb and Fishel call this process "Bayesian exploration" in an article describing BioTac published in Frontiers in Neurorobotics.
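In spirit, Bayesian exploration means estimating, for each candidate movement, how confidently the resulting measurement would single out one texture, then performing the movement with the highest expected confidence. The sketch below illustrates that selection step; the per-movement, per-texture Gaussian models and the lists of plausible readings are hypothetical stand-ins for the experience database, not the authors' implementation.

```python
import numpy as np

def posterior(prior, measurement, means, stds):
    """Update texture probabilities after one measurement (Gaussian models assumed)."""
    likelihood = np.exp(-0.5 * ((measurement - means) / stds) ** 2) / stds
    post = prior * likelihood
    return post / post.sum()

def choose_movement(prior, movement_models, plausible_readings):
    """Select the exploratory movement expected to best separate the candidates.

    movement_models:    movement name -> (means, stds) arrays, one entry per texture
    plausible_readings: movement name -> readings that movement might plausibly produce
    Both structures are hypothetical stand-ins for the robot's experience database.
    """
    def expected_confidence(movement):
        means, stds = movement_models[movement]
        # Average the top-candidate probability over the plausible readings.
        return np.mean([posterior(prior, r, means, stds).max()
                        for r in plausible_readings[movement]])

    return max(movement_models, key=expected_confidence)
```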
"Extending this algorithm to a complete robotic system working in unstructured environments is expected to degrade the quality of measured signals, which was enhanced by the careful design of a custom-built experimental apparatus," they wrote.
"In particular, the actuators in humanoid robots are likely to be considerably noisier than our apparatus, introducing both variability into the exploratory movements and noise into the sensor signals. Additional training to better understand the characteristics of noise and variability is one way to compensate for this. We expect the Bayesian exploration method to be robust to this and evolve to make the most of available information."
The authors are also equity partners in SynTouch LLC, which develops and manufactures tactile sensors for mechatronic systems that mimic the human hand, including the BioTac.
Mike J, you're right. Every interesting development in robot R&D is being researched by more than one organization, and there are a huge number of robot labs in universities. For every subject like this one there's usually a handful of different approaches, too.
"The challenges are numerous. Interfaces must be structured so nerve fibers can grow through. They must be mechanically compatible so they don't harm the nervous system or surrounding tissues, and biocompatible to integrate with tissue and promote nerve fiber growth. They also must incorporate conductivity to allow electrode sites to connect with external circuitry, and electrical properties must be tuned to transmit neural signals."
Charles, exactly what I was thinking. If it were possible to somehow wire the finger/arm so that the signal would stimulate the brain in such a way that it would think the person was actually touching something. If they don't have this capability now, I'm sure it will be just around the corner.
This looks like an interesting development in sensors, but beyond the $15,000 cost of the development kit, you're going to have to invest a lot of time in developing algorithms to interpret the sensor signals.
From an automated grasping perspective, I can imagine a system that uses the BioTac signals to detect slippage of a grasped object and automatically tighten the gripping force to compensate. This would allow lower grasping forces in mobile robot manipulation tasks, where overly tight grasping is the norm.
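To make that idea concrete, here is a minimal sketch of such a slip-compensating grip loop. The slip indicator, threshold, gain, and force limits are invented for illustration and are not part of the BioTac interface.

```python
def grip_control_step(vibration_level, grip_force,
                      slip_threshold=0.5, gain=0.1,
                      min_force=1.0, max_force=10.0):
    """One iteration of a slip-compensating grip loop (illustrative values only).

    vibration_level: scalar slip indicator derived from the tactile sensor
    grip_force:      current commanded grip force, in newtons
    """
    if vibration_level > slip_threshold:
        # Micro-vibrations suggest the object is slipping: squeeze harder.
        grip_force += gain * (vibration_level - slip_threshold)
    else:
        # No slip detected: relax toward the minimum safe force.
        grip_force -= gain * 0.1

    # Keep the commanded force within safe bounds.
    return min(max(grip_force, min_force), max_force)
```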
Ann, a few years ago, doctors at the Rehabilitation Institute of Chicago were talking about adding touch to prosthetic limbs. I wonder if this would make it easier to do that, or if it would even be possible to send the signals from this finger to the human brain.
Future generations of this sensor, combined with sensors for temperature and pressure, will give a very close approximation of the human sensorium. Whatever sort of actuators are available at that time (Festo does have some interesting ones now) will provide movement. Dexterous, sensing fingers are the result. Sensors and actuators will likely be connected to a local network router in order to simplify the trunks feeding back to the central core of the robot.
Human-like manipulators will make for very, very useful general-purpose robots, ones that don't need custom tooling to perform a job.
Nice article and video, Ann. As the narrator notes, ultimately, the data from the robotic touch has to get between the ears of the user. I would think there is a wide range of uses for this technology.
naperlou, thanks; those are good examples of how this technology could supplement existing inspection technology. The same goes for various robotic handling and sorting functions, some of which already use machine vision and could be supplemented by robots with a sense of touch.