Gesture Sensors Could Help Mechatronics Engineers

Jon Titus

September 20, 2012

A recent announcement from a company that specializes in sensors used for gesture detection sparked thoughts about using this type of control in mechatronic devices. The company mentioned using gesture controls in TV sets and set-top boxes. That type of control works for interactive games, too.

It got me thinking: Why not extend gesture controls to mechatronic devices? A Google search located many academic papers about this type of control, and some ambitious hackers have created gesture controllers for robots that use a Wii game interface or accelerometers. My thoughts tend more to real-world applications, such as teaching robots to mimic human operations (like performing tasks for disabled people based on hand, head, or eye motions) and safety applications that could shut down equipment.

Mechatronics engineers already have programming languages such as ROBOTC (based on C) and RAIL (based on Pascal) that control robot actions and sensors. But these languages use the same fundamental line-by-line code I learned in the mid-1960s! The mechatronics capabilities of equipment and robots have expanded, but programmers still control them with old-fashioned languages. National Instruments' LabVIEW software provides a higher-level graphical programming approach that better insulates engineers from low-level language details. So we have taken a step in the right direction.

Most mechatronics engineers would rather tackle new tasks than go through the same coding process to get actuators to move and motors to run. Now vision systems can capture human motions in three dimensions. Some sort of translator could convert this 3D information into similar robotic motions. Likewise, people who create mechatronic prostheses could use captured human motions to configure the response of actuators based on sensor inputs in an artificial limb. This sort of thing might seem like science fiction, but I wager we'll see gesture controls in more and more products.
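To make the idea concrete, here is a minimal sketch of what such a translator's first step might look like. The sensor interface, scale factor, and workspace limits are all made-up values for illustration, not any real gesture-sensor or robot API: a tracked hand position gets scaled down and clamped into a box-shaped region the robot can actually reach.

```c
/* Sketch only: map a hand position from a hypothetical gesture sensor
 * (reported in meters) to a target point inside a robot's workspace. */
#include <stdio.h>

typedef struct { double x, y, z; } point3d;

static double clamp(double v, double lo, double hi)
{
    return (v < lo) ? lo : (v > hi) ? hi : v;
}

/* Scale the operator's hand motion down and keep the result inside
 * a simple box-shaped workspace (limits here are invented). */
static point3d hand_to_robot_target(point3d hand)
{
    const double scale = 0.5;          /* hand moves 1 m -> robot moves 0.5 m */
    point3d target = { hand.x * scale, hand.y * scale, hand.z * scale };
    target.x = clamp(target.x, -0.4, 0.4);
    target.y = clamp(target.y, -0.4, 0.4);
    target.z = clamp(target.z,  0.0, 0.6);
    return target;
}

int main(void)
{
    point3d hand = { 0.9, -0.2, 0.5 }; /* example sensor reading */
    point3d target = hand_to_robot_target(hand);
    printf("robot target: %.2f %.2f %.2f\n", target.x, target.y, target.z);
    return 0;
}
```

A real system would go on to turn that target into joint commands and smooth the motion over time, but even this simple mapping shows how little "programming" the operator would need to do.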

In the area of safety, for example, many machines require operators to place each hand on a control switch before the controller starts any action. Instead of having operators move their hands to special switches, why not simply let them hold up their hands or fingers in front of a gesture sensor? This type of control could improve productivity, reduce the effects of repetitive motions, and improve safety.
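A gesture-based version of that two-hand interlock could be as simple as the sketch below. The sensor report structure and field names are hypothetical, and a production safety circuit would need far more than this, but the logic mirrors today's two-hand switches: the cycle may start only while both hands are visible and raised in front of the sensor.

```c
/* Sketch only: gate a machine cycle on a hypothetical gesture-sensor
 * report that says whether each of the operator's hands is raised. */
#include <stdbool.h>
#include <stdio.h>

typedef struct {
    bool left_hand_raised;
    bool right_hand_raised;
} gesture_report;

/* Both hands must be raised in the sensor's view before the cycle starts. */
static bool cycle_start_permitted(const gesture_report *g)
{
    return g->left_hand_raised && g->right_hand_raised;
}

int main(void)
{
    gesture_report g = { true, true };  /* example reading from the sensor */
    if (cycle_start_permitted(&g))
        puts("Both hands clear of the work area: cycle may start.");
    else
        puts("Hold both hands up to the sensor to start the cycle.");
    return 0;
}
```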

What else might clever engineers learn to control using gestures? Discuss in the comment section below.
