If you think professional engineering and design tools are immune to being influenced by the consumer world, think again. Many interactive simulation and user interface innovations that are cropping up in hardcore CAD and CAE packages are taking their cues from the gaming world.
Now, input devices are heading in the same direction. Leap Motion, a new motion-control company, is the latest to tackle the challenge of bringing accurate 3D motion control to traditional computer interaction, whether for basic computing tasks like navigation or for more creative work like precise virtual 2D and 3D drawing.
In much the same vein as Microsoft's Kinect add-on for the Xbox, which lets people game via hand and full-body movements, the Leap attaches to a computer via a USB port and turns it into a gesture-recognition device.
The device's patented software analyzes images from its built-in cameras, tracking the individual movements of all 10 of a user's fingers as well as objects like a pen. The startup team is touting its accuracy compared with other available offerings (think Kinect), claiming precision down to one hundredth of a millimeter, a level well-suited for touch-free natural gesture controls like pinch-to-zoom.
The Leap accurately senses the individual movements of all 10 of a user's fingers and tracks objects like a pen, enabling a 3D workspace that recognizes intuitive gestures. (Source: Leap Motion)
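As a rough illustration of what software might do with that kind of fingertip data, here is a minimal pinch-detection sketch. The threshold value and the point format are assumptions for the example, not Leap Motion specifics.

```python
import math

# Illustrative pinch detection from tracked fingertip positions.
# Coordinates are assumed to be 3D points in millimeters; the
# threshold below is an example value, not a documented one.
PINCH_THRESHOLD_MM = 20.0  # fingertips closer than this count as a pinch

def distance(a, b):
    """Euclidean distance between two 3D points (x, y, z) in mm."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def is_pinch(thumb_tip, index_tip):
    """True when thumb and index fingertips are close enough to pinch."""
    return distance(thumb_tip, index_tip) < PINCH_THRESHOLD_MM

print(is_pinch((0, 0, 0), (3, 4, 0)))   # 5 mm apart -> True
print(is_pinch((0, 0, 0), (80, 0, 0)))  # 80 mm apart -> False
```

With sub-millimeter tracking accuracy, a simple distance test like this can distinguish a deliberate pinch from fingers merely resting near each other.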
Founders David Holz and Michael Buckwald envision the Leap filling a gap between what they say is easy to do in the real world but hard to do digitally. Traditional input devices like mice and keyboards turn actions that are highly intuitive in the real world, like drawing a picture or manipulating 3D objects, into highly technical tasks. However, existing motion-sensing technology is still pretty crude, they said. That's where the Leap comes in.
"Like molding a piece of clay or creating a 3D model -- that inspired us to create the Leap and fundamentally change how people work with their computers," said Buckwald, Leap Motion's CEO, in a press release.
While CAD users and engineers aren't the sole target audience for this device, company officials see big potential for any professional working with 3D images and in need of large-scale 3D visualization, including scientists or energy experts working with oil exploration maps.
Once calibrated, the Leap creates a three-dimensional interaction space of four cubic feet to control the computer or device with precise gestures. Users can interact with their software tools via a mix of swipes, flicks, and pinches to move 3D models, sketch concepts, or make changes.
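One way a 3D interaction space like this might map to on-screen control can be sketched as follows. The cube bounds and screen resolution here are assumed values for illustration, not Leap Motion's actual calibration parameters.

```python
# Illustrative mapping from a hand position inside a cubic interaction
# space to 2D screen coordinates. Bounds and screen size are assumptions.
CUBE_MIN, CUBE_MAX = -200.0, 200.0   # mm, per axis (assumed bounds)
SCREEN_W, SCREEN_H = 1920, 1080

def to_screen(x_mm, y_mm):
    """Map an (x, y) position in the interaction cube to pixel
    coordinates, clamping points outside the tracked volume."""
    def norm(v):
        t = (v - CUBE_MIN) / (CUBE_MAX - CUBE_MIN)
        return min(max(t, 0.0), 1.0)
    # y grows upward in physical space but downward on screen
    return (round(norm(x_mm) * (SCREEN_W - 1)),
            round((1.0 - norm(y_mm)) * (SCREEN_H - 1)))

print(to_screen(0, 0))  # center of the cube -> (960, 540), screen center
```

Clamping keeps the cursor on screen when the hand drifts outside the tracked volume, a small detail that matters for the kind of precise manipulation described above.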
Of course, applications like CAD and sketch tools have to support the Leap device in order to enable this new world of gesture-based interaction. To that end, Leap Motion is offering a software development kit and encouraging developers to build native apps or port existing tools over to support the device.
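The SDK's actual interfaces aren't described in the article, but apps supporting a motion device typically hook into per-frame tracking events along these lines. All class, method, and gesture names below are hypothetical, sketched for illustration only.

```python
# Hedged sketch of the event-listener pattern a motion-control SDK
# commonly exposes; names are illustrative, not Leap Motion's API.

class GestureListener:
    """Receives per-frame tracking data and dispatches gesture handlers."""
    def on_frame(self, frame):
        for gesture in frame.get("gestures", []):
            handler = getattr(self, "on_" + gesture, None)
            if handler:
                handler()

class SketchTool(GestureListener):
    """A toy app that zooms its view when a pinch gesture arrives."""
    def __init__(self):
        self.zoom = 1.0
    def on_pinch(self):
        self.zoom *= 1.1  # zoom in 10 percent per pinch event

tool = SketchTool()
tool.on_frame({"gestures": ["pinch"]})
print(tool.zoom)  # prints 1.1
```

Porting an existing tool would largely mean translating events like these into the commands the application already understands, which is why vendor buy-in matters so much here.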
Here's the best news: the Leap is pretty cheap. Though it isn't available yet, those interested in taking it for a spin can pre-order the device for $69.99.
Gesture interpretation for robot movements -- now that's a cool idea, and one I'm sure is well underway in research labs. I don't think gesture movements are that unfamiliar to users any more. Between the new generation of smartphones (not just Apple's) and other commonplace electronic devices, more and more users are getting comfortable with them. And for those up-and-coming engineers born and bred on consoles like the Xbox and Wii, this kind of interaction will be expected.
There's also the risk of unintended consequences. How many laptop users have brushed the touchpad, sent their cursor elsewhere, and strewn their typing across several locations?
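One common software-side mitigation for that kind of accidental input is a dead zone: movements smaller than a threshold are simply ignored. This is a minimal sketch of the idea, with an assumed threshold value.

```python
# Illustrative dead-zone filter for suppressing accidental input.
# The threshold is an assumed example value.
DEAD_ZONE_MM = 5.0

def filter_motion(prev, curr):
    """Return curr if the hand moved past the dead zone, else keep prev."""
    dist = sum((p - c) ** 2 for p, c in zip(prev, curr)) ** 0.5
    return curr if dist >= DEAD_ZONE_MM else prev

print(filter_motion((0, 0, 0), (1, 1, 0)))   # small jitter -> unchanged
print(filter_motion((0, 0, 0), (10, 0, 0)))  # real motion -> passes through
```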
The article says Leap will track all ten fingers. No scratching while working! And I wonder what finger gesture will mean "CANCEL / Forget about it"?
In all seriousness, gestures can be quite difficult compared with holding something tangible that has a small amount of weight. The article compares the Leap to the Kinect, and I found Wii bowling, with the minimal weight of the Wii controller in my hand, more realistic than the similar game on the Xbox with Kinect.
This is amazingly cool. It's yet another example of how gaming leads the way in electronics. For years, graphics chips have trickled down from gaming to less expensive products, giving us applications such as 3D navigation. Who says gaming is for kids?
What a great idea. I hope it works with Macs. It's not typing that I find wearing so much as all the touch-pad/mousing for web surfing. Aside from computer users, the other application possibility that comes to my mind is robot control. If this UI can interpret human movements, why can't it be adapted to do the same for robots? I've been wondering about gesture control/interpretation for robots ever since the Kinect debuted. So far, I've seen research where a Kinect tells a robot about its environment (http://www.designnews.com/document.asp?doc_id=240288), but what about the other direction?
I hear what you both are saying about there not being a direct link to CAD at this time. But I think that's the operative phrase: at this time. I think there's a lot in this device that portends how input devices will evolve, incorporating many of the gestures and capabilities that are becoming commonplace on our personal devices. Perhaps they'll call it the "consumerization of CAD." In any event, it's all about this company convincing CAD and design tool vendors to leverage its toolkit to take advantage of the new interaction paradigm. But this device, the Kinect, and what we see on the gaming front are no doubt going to heavily influence CAD and design tool development moving forward.
I was impressed. It's like taking big-screen Hollywood to the little desktop screen.
However, I am not sure how to integrate it into my CAD package, or how beneficial it would be in actual drawing conditions, as the sensitivity may not handle changing a few tenths of a degree or a few tenths of a millimeter.
MIT students modified a 3D printer to enable it to print more than one object and print on top of existing printed objects. All of this was made possible by modifying a Solidoodle with a height measuring laser.
Siemens released Intosite, a cloud-based, location-aware SaaS app that lets users navigate a virtual production facility in much the same fashion as navigating Google Earth. Users can access PLM, IT, and other pertinent information for specific points on a factory floor or at an outdoor location.
Sharon Glotzer and David Pine are hoping to create the first liquid hard drive, using liquid nanoparticles that can store 1TB per teaspoon. They aren't the first to find unconventional data stores: Harvard researchers have stored 700TB inside a gram of DNA.