Gesture interpretation for robot movements--now that's a cool idea, and one I'm sure is well underway in research labs. I don't think gesture movements are that unfamiliar to users any more. Between the new generation of smartphones (not just Apple's) and other commonplace electronic devices, more and more users are getting familiar with them. And for those up-and-coming engineers born and bred on consoles like the Xbox and Wii, this kind of interaction will be expected.
There's also the risk of unintended consequences. How many laptop users have brushed the touchpad and sent their cursor elsewhere, strewing their typing across several locations?
The article says Leap will track all ten fingers. No scratching while working! And I wonder what finger gesture will mean "CANCEL / Forget about it"?
In all seriousness, gestures in empty air can be quite difficult compared to holding something tangible with a small amount of weight. This has been compared to the Kinect--I found Wii bowling to be more realistic, with the minimal weight of the Wii controller in my hand, than the similar game on the Xbox with the Kinect.
This is amazingly cool. It's yet another example of how gaming leads the way in electronics. For years, graphics chips have trickled down from gaming to less expensive products, giving us applications such as 3D navigation. Who says gaming is for kids?
What a great idea. I hope it works with Macs. It's not typing that I find wearing so much as all the touchpad/mousing for web surfing. Aside from computer users, the other application possibility that comes to my mind is robot control. If this UI can interpret human movements, why can't it be adapted to do the same for robots? I've been wondering about gesture control/interpretation for robots ever since the Kinect debuted. So far, I've seen research where a Kinect tells a robot about its environment (http://www.designnews.com/document.asp?doc_id=240288), but what about the other direction?
I hear what you both are saying about there not being a direct link to CAD at this time. But I think that's the operative phrase--at this time. I think there's a lot in this device that portends how input devices will evolve, incorporating many of the gestures and capabilities that are becoming commonplace on our personal devices. Perhaps they'll call it the "consumerization of CAD." In any event, it's all about this company convincing CAD and design tool vendors to leverage its toolkit to take advantage of the new interaction paradigm. But this device, the Kinect, and what we see on the gaming front are, no doubt, going to heavily influence CAD and design tool development moving forward.
I was impressed. It's like taking big-screen Hollywood to the little desktop screen.
However, I am not sure how to integrate it into my CAD package. I am not sure how beneficial it would be in actual drawing conditions, as the sensitivity may not handle changes of a few tenths of a degree or a millimeter.
In an age of globalization and rapid changes through scientific progress, two of our societies' (and economies') main concerns are to satisfy the needs and wishes of the individual and to save precious resources. Cloud computing caters to both of these.
For industrial control applications, or even a simple assembly line, a machine can run almost 24/7 without a break. But what happens when the task is a little more complex? That’s where the “smart” machine comes in. The smart machine is one that has some simple (or, in some cases, complex) processing capability that lets it adapt to changing conditions. Such machines are suited to a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what’s possible with smart machines, and what tradeoffs need to be made to implement such a solution.