At Your Fingertips: How Leap Motion Lets You Control VR With Your Bare Hands

No keyboard, no mouse, and no controller necessary. Leap Motion's sensor-based controller lets users control VR and AR using only their bare hands.

Chris Wiltz

July 24, 2017

6 Min Read

Virtual reality has an interface problem. In fact, if you asked the team at Leap Motion, they'd say all of modern technology does. But Leap Motion's sensor-based controller could change all of that by allowing us to interact in the digital world using the world's oldest and most user-friendly mode of input – our bare hands.

“We're still enslaved to the same interactions with computers as we were a decade ago,” Mira Murati, Leap Motion's VP of Product, told Design News. “When we interact with technology we are extremely input bandwidth limited.” Murati said San Francisco-based Leap Motion was founded on a simple question: Why should things be that way? “Our hands are a universal form of input. We build things, play chess, throw balls, but when it comes to technology we are limited by the touchscreen,” she said.

As convenient and ubiquitous as the touchscreen has become, the engineers at Leap Motion still find it too limiting because it only responds to a set of pre-programmed gestures and controls, some of which have to be learned by the user. “We didn't really want to make gestures the focus of interaction,” Murati said. “We wanted really rich physical interaction, no sign language or things you have to learn and remember. We're tracking fingers because we want people to interact with digital objects as they would with physical ones.”

Leap Motion's latest controller (center) can be directly embedded into any OEM's VR or AR headset. (Image source: Leap Motion)

Anyone who has tried a VR-specific controller like the Oculus Touch or HTC Vive controller has experienced the pros and cons. On one hand (no pun intended), they offer a more intuitive and functional VR input experience than a keyboard and mouse or even a game controller, but on the other hand they fall just short enough of real, intuitive hand-gesture control to make tasks like CAD design feel awkward at times, especially for users who are not already well versed in the controllers.

The first Leap Motion controller, released back in 2013, was PC-based, but Murati said Leap Motion has always seen VR as the goal. “VR was always on our mind. This lack of a natural input is really the most critical missing piece preventing VR and AR from having a rapid take off,” Murati said. “The moment you see your hands in the digital world you have a sense of embodiment and a sense of self. In many ways VR is probably the most human-centric platform today. It's really about the wearer. By bringing in hands as the main form of interaction for VR we're really designing for the human platform rather than for computers.”

We had a chance to try Leap Motion's sensor, which can be mounted on any Windows-based VR headset, at the VRLA conference and expo earlier this year. Following a recent partnership with Qualcomm, the company was demonstrating its sensor in conjunction with Qualcomm's Snapdragon 835 mobile VR development platform. The demo was a simple program that allowed users to manipulate blocks of various sizes in VR, but what was immediately noticeable was how accurately and dexterously Leap Motion's sensor tracks hand movements. Rather than recognizing a simple set of gestures, the virtual hands mimic real motion down to individual finger movements, all without the use of any hand-worn sensors.
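
As a rough illustration of how an application like that block demo might decide when a user has grabbed a virtual object, here is a short, hypothetical sketch built on the grab and pinch measures exposed by the older (v2-era) Leap Motion Python SDK (hand.grab_strength and hand.pinch_strength). The thresholds and helper functions are invented for illustration and are not Leap Motion's actual demo code.

```python
# Hypothetical sketch: deciding when a tracked hand is "holding" something.
# hand.grab_strength and hand.pinch_strength (0.0 = open hand, 1.0 = closed
# fist / full pinch) come from the v2-era Leap Motion Python SDK; the
# thresholds and helper names below are illustrative, not Leap Motion's code.
import Leap  # Leap Motion SDK Python bindings (v2 era)

GRAB_THRESHOLD = 0.8   # how closed the hand must be to count as a grab
PINCH_THRESHOLD = 0.8  # how tight the thumb-index pinch must be

def wants_to_hold(hand):
    """Return True if the tracked hand looks like it is grabbing or pinching."""
    return (hand.grab_strength >= GRAB_THRESHOLD or
            hand.pinch_strength >= PINCH_THRESHOLD)

def grabbing_hands(controller):
    """Yield each hand in the latest tracking frame that is grabbing or pinching."""
    frame = controller.frame()
    for hand in frame.hands:
        if wants_to_hold(hand):
            yield hand

if __name__ == "__main__":
    controller = Leap.Controller()
    for hand in grabbing_hands(controller):
        print("Hand %d is grabbing near %s" % (hand.id, hand.palm_position))
```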

Murati said what the team at Leap Motion has done is essentially solve a very difficult computer vision problem ... all with a controller made from off-the-shelf parts. The infrared sensor tracks hand and finger movements across a 180-degree field of view; that data is fed through the controller's software API and converted into virtual hand movements. All in, the unit comes in at about 8 mm in height, according to Leap Motion's specs. “You have an IR window, beneath that two cameras that see hands and reconstruct a model of each hand,” Murati said.
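
To give a sense of what that pipeline hands to developers, here is a minimal sketch that polls tracking frames through the Python bindings that shipped with the earlier (v2-era) Leap Motion SDK. Attribute names such as frame.hands, hand.palm_position, and finger.tip_position follow that SDK; Orion-era releases expose equivalent data through C++/C# libraries and game-engine plugins, so treat this as an approximation rather than the current API.

```python
# Minimal sketch of reading Leap Motion tracking data, based on the Python
# bindings from the older (v2-era) SDK. Positions are 3D vectors in
# millimeters relative to the sensor; newer Orion SDKs expose the same kind
# of data through different (C++/C#/engine-plugin) interfaces.
import time
import Leap

def poll_hands(controller, seconds=5.0, interval=0.1):
    """Print per-hand and per-finger positions for a few seconds."""
    end = time.time() + seconds
    while time.time() < end:
        frame = controller.frame()  # latest reconstructed tracking frame
        for hand in frame.hands:
            side = "Left" if hand.is_left else "Right"
            print("%s hand, palm at %s" % (side, hand.palm_position))
            for finger in hand.fingers:
                # Each finger is tracked individually; tip_position is the
                # fingertip location in the sensor's coordinate frame.
                print("  fingertip at %s" % finger.tip_position)
        time.sleep(interval)

if __name__ == "__main__":
    poll_hands(Leap.Controller())
```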

The real legwork is done on the software end. In 2016, Leap Motion released an update to Orion, the hand-tracking software that powers its controller, with the aim of improving performance, particularly for mobile VR. “Since we already had the hardware out we could make dramatic updates to the software,” Murati said. “That's what we did with Orion. We took a look at the tracking pipeline and asked ourselves, what would this look like in VR? And we re-did it from the ground up.”

In the past several months, Leap Motion has shifted its focus toward mobile VR – the next wave of hardware that will require no external sensors to track user movements. “To do this we needed the controller to have higher performance, lower power, and at least ten times faster hand tracking, while also being smoother and more accurate,” Murati said. “We wanted to focus on mobile VR because ultimately we want VR to be accessible. Mobile VR experiences are considered second class to PC-based VR right now and that comes down to the fact that input is very limited with things like tapping on your head to control the [Samsung] Gear VR, for example.”

Leap Motion is currently working with OEMs to get its sensor embedded into the next generation of VR and AR headsets coming to market. The company also maintains a strong presence in the DIY community, encouraging developers at all levels to undertake projects, both in and outside of VR, using its sensor technology. Researchers at the Burke Medical Research Institute in White Plains, NY, are using the Leap Motion controller for physical therapy for stroke patients. Mercedes-Benz has designed a concept car that uses the controller for infotainment and dashboard control. Developers have used the controller in conjunction with AutoCAD. And, of course, all manner of independent games have come out of Leap Motion's annual 3D Jam competition for projects developed with the controller.

With so much engagement with its developer community, and ongoing discussions with OEMs, likely the biggest question on anyone's mind is how soon Leap Motion's controller will allow for haptic feedback and touch sensitivity. Unfortunately, Murati said Leap Motion has no plans of its own to get into haptics, but she did point to companies like UltraHaptics that are focused entirely on creating virtual objects that can be felt and touched. Perhaps someday a savvy OEM will combine a VR headset, Leap Motion's controller, and some sort of tactile interface into one package.

Leap Motion also recently closed a $50 million Series C funding round and is looking to expand its operations into China and broaden its reach beyond entertainment into enterprise applications including healthcare, education, and industrial training simulation.

“When speaking with OEMs it doesn't require much convincing when it comes to showing the value proposition of hands as the main input for [VR and AR] platforms,” Murati said. “Our conversations with OEMs and partners have been very smooth. The industry understands that input needs to have a higher standard.”

Chris Wiltz is the Managing Editor of Design News.  
