One more example of how technology is making robots much more human-like. But what's the business benefit of having a robot develop a sense of touch? Are there specific applications where this kind of added capability would be useful?
Beth, I can think of one right off the bat from some groups I have been talking to: automated product inspection. This is done now with vision systems, which are also used to evaluate the texture of surfaces. Adding a tactile sensor to the inspection system would be useful in a lot of situations and could make that texture evaluation more accurate.
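Just to make the idea concrete, here is a rough sketch of how tactile data might be folded into a pass/fail inspection check. Everything here, from the sensor interface to the threshold values, is made up for illustration; it assumes roughness shows up as high-frequency vibration while a finger is dragged across the part:

    # Hypothetical sketch: fusing a vision texture score with a tactile
    # vibration measurement for pass/fail surface inspection.
    # The sensor data and thresholds are assumptions, not a real API.
    import numpy as np

    ROUGHNESS_LIMIT = 0.8  # assumed pass/fail threshold (arbitrary units)

    def tactile_roughness(vibration_samples, sample_rate_hz=2000):
        """Estimate surface roughness from high-frequency vibration energy
        recorded while the finger is dragged across the part."""
        spectrum = np.abs(np.fft.rfft(vibration_samples))
        freqs = np.fft.rfftfreq(len(vibration_samples), d=1.0 / sample_rate_hz)
        # In this sketch, roughness correlates with energy above ~100 Hz.
        return spectrum[freqs > 100].sum() / spectrum.sum()

    def inspect(vision_texture_score, vibration_samples):
        """Combine the vision system's texture score with the tactile estimate."""
        roughness = tactile_roughness(np.asarray(vibration_samples, dtype=float))
        combined = 0.5 * vision_texture_score + 0.5 * roughness
        return "PASS" if combined < ROUGHNESS_LIMIT else "FAIL"

    # Example with simulated vibration data:
    rng = np.random.default_rng(0)
    print(inspect(0.3, rng.normal(size=512)))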
The main applications mentioned by the researchers are giving industrial robots a finer sense of touch, so they can distinguish more easily and quickly among the objects they handle, and adding tactile sensation to prosthetic hands.
naperlou, thanks. Those are good examples of how this technology could supplement existing inspection systems. The same goes for various robotic handling and sorting functions, some of which already use machine vision and could likewise be supplemented by robots with a sense of touch.
Nice article and video, Ann. As the narrator notes, ultimately, the data from the robotic touch has to get between the ears of the user. I would think there is a wide range of uses for this technology.
Future generations of this sensor, combined with sensors for temperature and pressure, will give a very close approximation of the human sensorium. Whatever sort of actuators are available at that time (Festo does have some interesting ones now) will provide movement. Dexterous, sensing fingers are the result. Sensors and actuators will likely be connected to a local network router to simplify the trunks feeding back to the central core of the robot.
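To illustrate the networking point, here is a toy sketch of that "one trunk" idea: local sensors report to a hub, and the hub forwards a single aggregated packet to the central core. All the names and the packet format are purely illustrative:

    # Illustrative sketch only: many finger-mounted sensors report to one
    # local hub, which forwards a single aggregated packet to the robot's
    # central controller, rather than running a separate line per sensor.
    import json
    import time

    class SensorHub:
        def __init__(self):
            self.readings = {}

        def report(self, sensor_id, value):
            """Each local sensor (touch, temperature, pressure) posts here."""
            self.readings[sensor_id] = value

        def trunk_packet(self):
            """One aggregated message travels back over the trunk."""
            return json.dumps({"timestamp": time.time(), "readings": self.readings})

    hub = SensorHub()
    hub.report("finger1.touch", 0.42)
    hub.report("finger1.temp_C", 23.7)
    hub.report("finger1.pressure_kPa", 101.4)
    print(hub.trunk_packet())  # a single trunk message for the central core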
Human-like manipulators will make for very, very useful general-purpose robots, ones that don't need custom tooling to perform a job.
Ann, a few years ago, doctors at the Rehabilitation Institute of Chicago were talking about adding touch to prosthetic limbs. I wonder if this would make it easier to do that, or if it would even be possible to send the signals from this finger to the human brain.
This looks like an interesting development in sensors, but, beyond the $15,000 cost of the development kit, you're going to have to invest a lot of time in developing algorithms to interpret the sensor signals.
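To give a flavor of that algorithm-development effort, here is a toy nearest-centroid classifier that guesses a material from a couple of tactile features. The features, labels, and numbers are all invented; interpreting real BioTac signals is far more involved:

    # Toy illustration of signal interpretation: classify a material from
    # a vector of tactile features (mean pressure, vibration energy) with
    # a nearest-centroid rule. All training values are invented.
    import numpy as np

    training = {
        "metal":  np.array([[0.90, 0.10], [0.85, 0.15]]),
        "foam":   np.array([[0.20, 0.05], [0.25, 0.10]]),
        "fabric": np.array([[0.40, 0.60], [0.45, 0.55]]),
    }
    centroids = {label: feats.mean(axis=0) for label, feats in training.items()}

    def classify(feature_vector):
        """Return the material whose centroid is closest to the measurement."""
        v = np.asarray(feature_vector, dtype=float)
        return min(centroids, key=lambda label: np.linalg.norm(v - centroids[label]))

    print(classify([0.88, 0.12]))  # -> "metal" for this toy data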
From an automated grasping perspective, I can imagine a system that uses the BioTac signals to detect slippage of a grasped object and automatically tighten the gripping force to compensate. This would allow lower grasping forces in mobile robot manipulation tasks, where overly tight grasping is the norm.
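Here is a minimal sketch of that slip-compensation loop, assuming a hypothetical interface that exposes a vibration reading (slip shows up as a high-frequency burst) and a grip-force command. The thresholds and step sizes are made up for illustration:

    # Minimal sketch of slip compensation under assumed thresholds.
    SLIP_VIBRATION_THRESHOLD = 0.3   # assumed slip signature level
    FORCE_STEP_N = 0.5               # how much to tighten per detection
    MAX_FORCE_N = 20.0               # safety ceiling on grip force

    def update_grip(current_force_n, vibration_level):
        """Start with a light grip; tighten only when slip is detected."""
        if vibration_level > SLIP_VIBRATION_THRESHOLD:
            return min(current_force_n + FORCE_STEP_N, MAX_FORCE_N)
        return current_force_n  # otherwise hold the lighter grip

    # Example: one simulated slip burst triggers a small force increase.
    force = 2.0  # N, intentionally light initial grasp
    for vibration in [0.1, 0.1, 0.45, 0.2]:  # simulated sensor readings
        force = update_grip(force, vibration)
    print(force)  # 2.5 N after the single detected slip event

The point of the design is that the gripper never squeezes harder than the task demands; force rises only in response to an actual slip signal.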