The $69.99 Leap USB peripheral creates a 3D interaction space of eight cubic feet, where users can interact precisely with and control software on a laptop or desktop computer in much the same vein as Microsoft's Kinect technology. (Source: Leap Motion)
Those are a lot of crazy devices. I wonder if any will "win" in the marketplace. I didn't notice tablet devices, except in the last slide as a secondary device. My experience is that these are still used by those who need to draw in any detail. Do you see a future for these?
@Naperlou: I definitely see a future for tablet use among engineers, although I guess I didn't envision it as an input device. Glad you brought it up. With more and more engineers out at customer sites or collaborating with design partners in the field, the mobility and larger screen real estate of a tablet make it easy to conduct design reviews, visualize assemblies, and do conceptual sketching, thanks to the incoming slew of mobile design apps.
Beth, I was going to say that a 3D input device is going to be limited by a 2D visualization, but InfiniteZ seems to have the answer to that, with what looks like it will be the "holotank" of science fiction.
Beth, I definitely agree with you on this one. My wife and one of our sons surprised me with a Kindle Fire for my last birthday. I have been working with this marvelous device for several days now and have found the operation remarkable in that there is no real strain on my wrists or hands. The apps that can be downloaded do just about all of the things I need done and then some. I would gladly move my computer mouse to the shelf if I could use the tablet. I can see a tremendous advantage for CAD or CAE operators who live on a computer day after day. I had no idea there were as many "options" relative to data entry. I suspect most if not all of these are on the "market" right now and can be purchased. Great post.
Steven Spielberg had it right. If you remember the futuristic 2002 movie, "Minority Report," Tom Cruise interacted with his computer in 3D fashion, mostly by pinching, drawing and waving his arms. I think this kind of technology is inevitable.
I think gesture recognition capabilities of input devices will be as revolutionary as the mouse was. The touchpad has already completely changed how I surf the web and work in my computer's OS. Gesture recognition will also be kinder to our fingers, wrists and tendons.
Steven Spielberg must have gotten a glimpse at his buddy Steve Jobs' early work in gesture interfaces for the iPhone and subsequent iPad. On a serious note, it's pretty crazy that what was considered out there 10 years ago is now pretty mainstream. All you have to do is hand an iPhone to a four-year-old and right off the bat, they intuitively know how to size and scroll through screens with gestures and pinch movements.
If you haven't seen Minority Report, Beth, you should see it. Tom Cruise opens and closes screens and moves things around by waving his arms and using his hands, in a way that I imagine is similar to what you've described here. Also, I have a hunch you're right about Spielberg's connection to Jobs. I don't know anyone else who could have imagined that so accurately years before it actually happened.
If he imagined the gesture interface of today back then (without a peek at his buddy Steve Jobs' work), then Spielberg missed out on the opportunity to count tech genius among his many talents. I'll have to check out the movie, Chuck. Thanks for the heads up!
I think that as people get more accustomed to these new movements, it just becomes a more natural way of interacting with the computer. I've tried to use my daughter's laptop (which is my old MacBook) and I immediately get stymied because the gesturing and pinching movements supported by my new MacBook, which now don't seem strange to me at all, don't work on her system.
@Beth: I'll admit, at first this knob movement felt strange and awkward. However, my left hand quickly got up the learning curve and became used to the required motions. By the end of the week, I found myself subconsciously reaching for the knob to rotate and zoom the model with my left hand instead of using the standard mouse picks with my right hand mouse. Overall, I liked the idea of using two hands to manipulate and create CAD models (instead of mostly using just one).
These are really neat, useful, and not to mention fun. I wish I worked for a company that would spring for $75 mice like the one I purchased because it has 19 buttons and track resolution is completely adjustable. I would have my stereoscopic glasses at work except the monitors have a 60 Hz refresh rate. But wait... I am using a $7,000 workstation with 12 Xeon cores at nearly 3 GHz each and it came with a $10 two-button mouse/keyboard combo. I would love to see 3D devices in the workplace, but I am afraid it's the employee who will be purchasing these things. Until things get a lot cheaper, it's the massive multiplayer online gaming mouse for me. Heck, it even has that wow factor where the buttons illuminate and dim.
I think engineers are probably more open to change than most people, since we are always working with new tools and technology, but it seems that the average person is reluctant to change. I always get a kick out of people opening up their laptop and then trying to find extra space to plug a mouse into it rather than using the touchpad or other input device on the laptop.
@tekochip: No doubt people hate change, and you can count me among that mix. The idea of having to learn new things simply to handle the day-to-day tasks that you do every day is where people typically have the most opposition. But as you say, opening yourself up to new ways of working might ultimately save you time and help you do a better job in the process.
It isn't really that you have to do it that is the problem. I like learning new things, in good order. The problem is that you have to do it so often. If I have to spend a week on the learning curve, I would like to go a couple of months working efficiently. The real problem is that this week you have to learn this change, and next week you have to learn a different one. It is hard to be really effective when you are never off the learning curve. This situation is taken from annoying to explosive by managements who have to have everything yesterday. You can't keep up at peak efficiency. Being on the learning curve all the time just makes it three times worse.
@SparkyWatt: Changing work patterns and work habits is the biggest obstacle to any new technology implementation. It's up to the providers of these new devices to make them reasonably intuitive to operate alongside tried-and-true design and engineering tools; otherwise, any added utility is for naught.
I think not enough credit was given to Microsoft's Kinect system, which was originally created for the Xbox and is now being integrated into many no-touch devices due to its low cost (resulting from large-scale production).
It would not surprise me at all if this incredible device is integrated into the upcoming Windows 8 suite of computers, tablets, and phones. Exciting times are here; that is for certain!
For industrial control applications, or even a simple assembly line, that machine can go almost 24/7 without a break. But what happens when the task is a little more complex? That’s where the “smart” machine would come in. The smart machine is one that has some simple (or complex in some cases) processing capability to be able to adapt to changing conditions. Such machines are suited for a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what’s possible with smart machines, and what tradeoffs need to be made to implement such a solution.
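To make the idea of "adapting to changing conditions" concrete, here is a minimal sketch of the kind of feedback loop a smart machine's onboard processing might run. Everything in it (the function name, the load values, the gain, and the speed limits) is a hypothetical illustration, not a reference to any specific machine or controller discussed above.

```python
# Hypothetical sketch: a "smart" machine adapts its output to conditions
# instead of running a fixed program. Here, a simple proportional rule
# adjusts line speed based on a measured load reading.

def adapt_speed(current_speed, load, target_load=0.75, gain=0.5,
                min_speed=0.1, max_speed=2.0):
    """Slow down when measured load exceeds the target; speed up when
    there is headroom. Clamp the result to the machine's safe range."""
    error = target_load - load  # positive -> spare capacity, can go faster
    new_speed = current_speed + gain * error
    return max(min_speed, min(max_speed, new_speed))

# Simulate a shift where the load on the line changes over time:
# the controller nudges the speed up or down in response.
speed = 1.0
for load in [0.75, 0.90, 0.95, 0.60, 0.50]:
    speed = adapt_speed(speed, load)
    print(f"load={load:.2f} -> speed={speed:.2f}")
```

A "dumb" machine would run at a fixed speed regardless of load; the tradeoff hinted at in the paragraph above is that this adaptivity requires sensors, processing, and tuning (the gain and target values) that a fixed-program machine does not.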