By focusing their attention on patterns created by flickering lights on a PC screen, which are associated with specific actions, users can control which actions they want a robot to perform, where the robot moves, and how it interacts with its environment. (Source: CNRS-AIST Joint Robotics Laboratory)
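The control idea described above (often called SSVEP-based brain-computer interfacing) can be sketched roughly: each flickering pattern on the screen blinks at a distinct frequency, and the dominant frequency detected in the user's EEG selects a robot command. The frequencies, command names, and signal parameters below are hypothetical placeholders, not details from the CNRS-AIST system.

```python
# Rough sketch of frequency-tagged BCI control. All frequencies and
# command names are made up for illustration.
import numpy as np

COMMANDS = {6.0: "move_forward", 8.0: "turn_left",
            10.0: "turn_right", 12.0: "grasp"}

def dominant_frequency(signal, sample_rate):
    """Return the strongest frequency component of the (simulated) EEG signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[0] = 0.0  # ignore the DC component
    return freqs[np.argmax(spectrum)]

def decode(signal, sample_rate):
    """Map the detected peak to the nearest flicker frequency's command."""
    f = dominant_frequency(signal, sample_rate)
    nearest = min(COMMANDS, key=lambda c: abs(c - f))
    return COMMANDS[nearest]

# Simulated response: user attends to the 8 Hz pattern for one second.
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 8.0 * t) + 0.3 * rng.standard_normal(fs)
print(decode(eeg, fs))  # -> "turn_left"
```

Real systems use much more robust classifiers (e.g. canonical correlation analysis over multiple electrodes), but the peak-picking above captures the basic mapping from attended flicker to action.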
Jim_E, thanks for the link to that Wired article (and I agree about print editions: Rolling Stone in the hand is very different from Rolling Stone online, for example). But trying to control the incredibly complex movements of a hand and its fingers has to be a few orders of magnitude more complicated than controlling legs well enough to make them walk. So I'm not surprised there's been little progress in that area.
Chuck, I wish we had more info on the project's engineering details, which are still under development. Considering how much work has already been done aimed at similar goals, such as various methods of motion capture, I suspect it won't take all that long to write the algorithms. Battar, thanks for the response on this subject, too. FWIW, Fujitsu started working on turning the electrical impulses from a person's thoughts into electronically controlled actions back in the late 80s to early 90s.
Interesting link, Jim_E. Thanks for posting. I would think the "bionic limb" idea would actually be easier, since it tries to use the biological processes already in place to do essentially what they were designed to do: think about moving the hand that used to be at the end of your arm, and the new hand at the end of your arm moves as the original once did. Controlling separate robots seems like a whole other ballgame.
Greg, the elderly could certainly benefit if they're among either target group, such as people confined to bed or wheelchairs. Since the technology is still being developed, most of the current learning curve is occurring among experimenters as they learn what thoughts produce what actions. Ideally, there won't be much for users.
The algorithms are far simpler than you think, because you have a man in the loop who can unconsciously compensate for fairly large errors. For example, given two systems that react with a 30-degree difference in angular movement for the same input, with one you'll just push a little harder until you get the desired result. You wouldn't even notice it. With fully autonomous systems, output must match input exactly or there will be trouble.
There is currently much discussion around the term "platform," which may be preceded by adjectives such as "mobile," "wearable," "medical," or "healthcare." Regardless of which platform is being discussed, though, they usually have one key aspect in common: they tend to be wireless. So why is this one aspect so nearly universal? The answer is convenience.
Everyone has a MEMS story. For most of us it’s probably the airbag that saved our lives or the life of a loved one. Perhaps it’s the tire pressure sensor that alerted us about deflation before we were stranded alone on a dark muddy road.
Biomimicry is not merely a helpful design tool -- it also encourages designers to think not only about how to solve design problems by imitating nature, but also about how to make the products, materials, and systems they design more ecologically sound.
Focus on Fundamentals consists of 45-minute online classes that cover a host of technologies. You learn without leaving the comfort of your desk. All classes are taught by subject-matter experts and all are archived, so if you can't attend live, attend at your convenience.