The Sound of Touch: Enhancing UI and VR With Ultrasound

Chirp Microsystems has unveiled a miniature ultrasound sensor for touch-free gesture control that it says will help usher in truly mobile VR and create the next generation of user interfaces.

The sensor detects not only gestures like hand waving but also “microgestures,” such as finger movements, with 1 mm accuracy. This fine accuracy makes the ToF sensor ideal for wearables, where a device may be too small to properly utilize a touchscreen, and for hands-free applications such as sterile medical or cleanroom environments and infotainment systems.
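As a back-of-the-envelope illustration of what millimeter-level ultrasonic ranging implies (the figures below are generic acoustics, not Chirp's specifications or firmware interface), a time-of-flight sensor converts an echo's round-trip time into distance using the speed of sound. A minimal Python sketch:

# Illustrative ultrasonic time-of-flight ranging; names and numbers are
# assumptions for this sketch, not Chirp's actual API.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def echo_to_range(round_trip_s):
    """One-way distance (meters) from a round-trip echo time (seconds)."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

print(echo_to_range(583e-6))       # an echo after ~583 us => object ~0.10 m away
print(2 * 0.001 / SPEED_OF_SOUND)  # ~5.8e-6 s: change in round-trip time per 1 mm of range

In other words, resolving 1 mm of motion comes down to resolving timing differences of only a few microseconds.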

To be fair, the foundation of what Chirp is doing is not new. Researchers and businesses have been experimenting with so-called acoustic gesture control or recognition, using sound to detect movements to control an interface, for decades. Nintendo even released a gaming peripheral, the Power Glove, back in 1989 that used ultrasonic sensors to let users control video games with hand movements.

The only problem with the Power Glove was that, in technical terms, it was a piece of junk that didn't actually work (Nintendo discontinued it only a year after release).

What Chirp has done, however, represents the newest, lowest-power, and smallest-form-factor application of sound-based gesture recognition technology.

Enabling Mobile VR

Chirp Microsystems' ToF sensor enables touchless sensing for wearables and other consumer electronics, as well as "inside-out tracking," which lets VR users move anywhere in their virtual environment while their tracking moves with them. (Image source: Chirp Microsystems)

Chirp is aiming for the ToF chip to make big (sound)waves in VR/AR. Many VR applications right now focus on motion-tracking controllers like the HTC Vive controller and Oculus Touch. These controllers come in pairs and are designed to mimic the use of human hands as closely as possible. Rather than using a mouse or joystick to pick up an object, for example, HTC and Oculus controllers let users pick things up in a virtual space by actually squeezing their hands and fingers on a trigger.

The drawback to these solutions (aside from the consumer cost) is that they use light- and camera-based tracking, again leading to power consumption issues, not to mention the additional hardware required for the setup. They also restrict the user to a fixed area of space when moving around.


“We're very excited about VR and AR because we believe our solution is a disruptive solution to the tracking problem everyone is facing right now,” Kiang said. By embedding its chip into a head-mounted display (HMD) as well as a separate controller, Chirp says it can create a solution with lower power, lower latency, lower cost, and better resolution than camera-based systems for VR and AR. Kiang said Chirp's sensor offers the potential for full 360-degree immersive experiences, a wider field of view than camera-based systems, and a system that will work under any lighting conditions, unlike light-based systems.
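The article doesn't spell out Chirp's tracking math, but a ranging-based setup like this typically reduces to trilateration: given measured distances from the controller to several receivers at known positions on the headset, solve for the controller's 3D position. A minimal sketch, with made-up receiver positions and NumPy's least-squares solver standing in for whatever Chirp actually uses:

# Hedged sketch of range-based controller tracking via trilateration.
import numpy as np

def trilaterate(receivers, ranges):
    """Least-squares 3D position from at least four receiver positions (Nx3)
    and measured distances (N,). Linearized by subtracting the first
    receiver's range equation from the others."""
    p0, r0 = receivers[0], ranges[0]
    A = 2.0 * (receivers[1:] - p0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(receivers[1:]**2, axis=1) - np.sum(p0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical receiver layout across a headset (meters) and a controller
# held about half a meter in front of it.
rx = np.array([[0.0, 0.0, 0.0], [0.15, 0.0, 0.0],
               [0.075, 0.08, 0.0], [0.075, 0.0, 0.05]])
controller = np.array([0.2, -0.1, 0.5])
dists = np.linalg.norm(rx - controller, axis=1)   # ideal, noise-free ranges
print(trilaterate(rx, dists))                     # recovers ~[0.2, -0.1, 0.5]

In a real system the ranges would come from echo timing and would carry noise, but the geometry is measured from the headset itself wherever the user stands, which is what makes headset-referenced tracking possible.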

Having the sensors embedded into an HMD also offers a very attractive feature for mobile VR and AR users: inside-out tracking.
