Very cool project. It's really interesting how widespread an impact gaming technology is having on so-called "serious" development, from robotics to CAD software. Kinect-like interfaces are popping up in a variety of different platforms and will push the envelope in terms of helping people interact with previously pretty inaccessible technologies.
The Kinect approach is definitely an important one for machine control, and it is also the most like human vision. Over many years (decades) I have seen attempts to create autonomous vehicles and machines, often using exotic sensors. Lately, though, there have been articles about using a Kinect system to drive these. The vision system is often coupled with a database or model of the scenario, which is much like what we humans do. Factory robots are starting to use some of this technology as well. This is a lot like the small robots that mimic insects or other creatures. Mimicking humans may be the way to go here as well.
I think the key here is the Kinect visual-based motion sensor--a picture is worth 1000 lines of code? It's analogous to talking to your computer. They are both much more natural ways of interacting with machines, at least from the human perspective.
Nice to see gesture recognition getting up to speed and gaining some traction in public awareness. Given the several mentions of various Kinect sensor implementations, it seems fair to mention another "disruptively innovative" technology that handles all the tasks this article describes. Check out the threads of commentary and information that started when a company named Leap Motion made an announcement on May 21st.
Key elements of their announcement: an inexpensive sensor device which enables position-detection, motion-detection, and gesture recognition -- with a reproducible position-detection accuracy of 0.01mm (i.e., ten micrometers, one wavelength of long-wavelength-range IR), anywhere within a "recognition space" volume of eight cubic feet. And a movement detect-and-report latency below the threshold for human perception -- USB comm latency and your monitor's refresh rate are the bottlenecks there (I'm still hoping to hear a stat for maximum trackable position rate-of-change, re effective point-measurement-rate). And an API which uses perhaps 5% of the CPU time on a nothing-to-write-home-about generic PC. ...Hey, my jaw dropped too.
I am just one of many hopeful entries in their (still open) pool of developer applicants, with thousands scheduled to be selected to receive an SDK and a free Leap device in the next three months or so. Their obvious intention is to "crowd-source" a base of usable applications by the time the device is commercially available in the first part of 2013. Devices can be pre-ordered now, for the impatient.
Look for their website, their Facebook page, their YouTube videos, and their forums. Because of patents pending, complete specs and technique info have not yet been released, but there has been some fairly credible guessing going on.
Important to note: The Leap technology will be making OUR reality "machine readable" -- If you can SEE something, you can use it as an input for consideration. No tape required. Anticipate interesting times.
Ann, I think this is a great achievement and a revolutionary idea: robots can be used in a very human-friendly way. It may also be able to detect remote motions, so we could use such technologies in disaster areas.
flared0ne, I did see the Leap announcement, but so far it's not a real product yet. If they can do what they say they want to do, it may leave Kinect technology in the dust. Also, as I stated in my article, ShapeTape was used only to test the A*Star system. It will not be required to use it: that's what Kinect is for.