A lot of hard work goes into developing new sensor technologies. The initial applications that drive sensor companies to invest in the research to bring new technologies to market are frequently in automotive, medical, industrial or military/aerospace. However, once new sensors exist, sometimes it is all fun and games, especially for users of some of the newest toys that take advantage of built-in sensing capabilities.
Accelerometers Get on Board
Protecting lives is serious business. Carmakers and their suppliers created the low-cost semiconductor accelerometer business based on the requirement for these sensors in airbag systems. Now that low-cost accelerometers exist, they are being applied for motion detection and stabilization in a variety of consumer electronics products, from cameras to hard disk drives. The cost is low enough that new interactive games have been developed using accelerometers for motion detection. One of the more visible applications in 2006 was Nintendo’s Wii. At the 2007 Consumer Electronics Show (CES), Qmotions and Freescale Semiconductor showed attendees how to become an integral part of a snow- or skateboarding game. The Qmotions Xboard full-motion game controller uses a three-axis accelerometer developed by Freescale that measures body movement and converts it into full-body action. The accelerometer’s location in an actual board allows players to really get into the game.
With the user balancing on the board, the accelerometer provides tilt feedback data about the board’s motion, which the game translates into an on-screen character whose skateboard or snowboard tilts in the same direction. For this application, tilt recognition could have been done with a two-axis accelerometer. However, a single axis’s output varies non-linearly with tilt angle, so pairing the third (Z) axis with the X or Y measurements provides more accurate tilt data. “It is measuring gravity, so it can use both the X and the Z for that type of rotation and when it is tilting forward, it can use the Y and the Z for getting accurate tilt measurements,” says Michelle Kelsey, marketing manager for Inertial Sensors, Freescale Semiconductor.
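The axis pairing Kelsey describes corresponds to the standard two-argument arctangent tilt calculation. The sketch below is a generic formulation of that math, not Freescale’s or Qmotions’ actual firmware:

```python
import math

def tilt_angles(ax, ay, az):
    """Estimate pitch and roll (degrees) from a 3-axis accelerometer
    at rest, where the only acceleration sensed is gravity (in g)."""
    # Pitch (forward/back tilt): combine the X and Z components.
    pitch = math.degrees(math.atan2(ax, az))
    # Roll (side-to-side tilt): combine the Y and Z components.
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Level board: gravity lies entirely on the Z axis.
print(tilt_angles(0.0, 0.0, 1.0))       # → (0.0, 0.0)
# Board pitched forward 45°: gravity split equally between X and Z.
print(tilt_angles(0.707, 0.0, 0.707))   # → (45.0, 0.0)
```

Using `atan2` on two axes sidesteps the non-linearity of a single axis: near 90° of tilt, one axis’s reading barely changes per degree, but the ratio of the two axes still resolves the angle accurately.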
The accelerometer selected for the application is Freescale’s MMA7260Q, a low-g three-axis device with a g-select option that can be used to choose a range of ±1.5g, ±2g, ±4g, or ±6g. Since Qmotions specializes in developing and marketing PC/console-based active game technologies, the device also provides a platform for other products.
Future designs could take advantage of the accelerometer’s additional capabilities with a software update to the existing hardware, adding features such as shock, vibration, or fall detection. “It is pretty exciting to see how all the gaming equipment is going to be changing to be even more like a virtual reality type of input,” says Kelsey.
The innovative Roomba robotic vacuum cleaner relies on infrared (IR) sensors to detect obstacles and floor discontinuities. Four pairs of IR sensors monitor the floor to detect steps and another pair looks for walls. Using this same infrared technology, Kids Delight developed the Zig Zag Zog UFO Saucer. This toy alien avoids being captured by using three IR sensors to detect obstacles in its path and two more IR sensors in its head to warn of someone trying to reach down and bop its head — the object of the game. After being caught, the toy operates at successively higher speeds requiring even higher performance from both the sensor and detection circuitry.
Air Guitar Shirt
Not content to just go through the motions, Dr. Richard Helmer, an engineer from the Commonwealth Scientific and Industrial Research Organization (CSIRO) in Belmont, Australia, developed a textile motion sensor to capture air guitar movements and turn them into music. The sensor and custom software for interpreting gestures create a virtual instrument.
The shirt is made of highly conductive fibers. The resistance value of the fibers changes linearly with the strain and compression of stretching the fiber by various arm movements. “The sensors result in a variable voltage related to elbow bend which is digitized by electronics attached to the shirt,” says Helmer. “This is broadcast wirelessly to a computer where the signals are interpreted to deliver sounds.”
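Because the fiber’s resistance, and hence the digitized voltage, varies linearly with stretch, mapping voltage to elbow angle reduces to a clamped linear interpolation. The calibration voltages and maximum bend angle below are hypothetical illustration values, not CSIRO’s:

```python
def elbow_angle(adc_voltage, v_straight=1.0, v_bent=2.5, max_angle=150.0):
    """Map a digitized sensor voltage to an elbow bend angle in degrees,
    assuming a linear voltage-vs-bend relationship.
    v_straight / v_bent / max_angle are made-up calibration values."""
    frac = (adc_voltage - v_straight) / (v_bent - v_straight)
    # Clamp to the physical range of the joint.
    return max(0.0, min(max_angle, frac * max_angle))

print(elbow_angle(1.75))  # halfway between calibration points → 75.0
```

In practice such a sensor would also need per-wearer calibration and some smoothing of the ADC samples, but the core mapping is this simple.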
By associating a set of audio samples with different arm positions, Helmer can play a sample or part of a recorded song. For example, he can map a particular sample to a left-arm angle of 45 degrees ±10 degrees and trigger the sound as his right arm passes 45 degrees. The system provides continuity in either “arms relaxed” or “arms flopping around a bit” modes. The same piece of music can be played in different ways, providing different skill levels, such as playing verse and chorus, picking each chord, or even playing note by note. In a video clip, the inventor plays a tambourine, where the sensor material is more readily observed as a set of black sleeves over the elbows.
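The scheme described above, selecting a sample by which left-arm angle band you are in and firing it when the right arm crosses a threshold, can be sketched as follows. The band values, sample names, and rising-edge trigger logic are illustrative assumptions, not Helmer’s actual software:

```python
def select_sample(left_angle, bands):
    """Pick the sample whose angle band contains the left-arm angle.
    bands: list of (center_deg, tolerance_deg, sample_name) tuples."""
    for center, tol, name in bands:
        if abs(left_angle - center) <= tol:
            return name
    return None

class StrumTrigger:
    """Fire once when the right arm crosses the threshold angle
    on a rising edge, like a strum passing 45 degrees."""
    def __init__(self, threshold=45.0):
        self.threshold = threshold
        self.prev = 0.0

    def update(self, angle):
        fired = self.prev < self.threshold <= angle
        self.prev = angle
        return fired

# Hypothetical mapping: two chords at 45° ±10° and 90° ±10°.
bands = [(45.0, 10.0, "chord_A"), (90.0, 10.0, "chord_E")]
trigger = StrumTrigger(45.0)

played = []
for left, right in [(44.0, 30.0), (46.0, 50.0), (88.0, 40.0), (92.0, 60.0)]:
    if trigger.update(right):
        played.append(select_sample(left, bands))
print(played)  # → ['chord_A', 'chord_E']
```

The rising-edge check is what keeps one sweep of the arm from retriggering the sample on every sensor reading above the threshold.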
Big Boys and Their Toys
Loading a car with the latest sensing technologies just so it can operate autonomously seems to be a fun activity, too. DARPA’s Urban Challenge gives engineers and engineering students the opportunity to do this and win up to $2 million. The race will be held at a yet-to-be-disclosed urban location in the Western United States, where a driverless vehicle must travel 60 miles.
Stanford University won the 2005 DARPA Grand Challenge, a 132-mile desert route, with a vehicle called Stanley. David Stavens, a Ph.D. candidate at Stanford and one of the co-designers/co-creators of Stanley, is involved with Junior, a Stanford vehicle being designed for the 2007 race. “In terms of sensors, the biggest change between Stanley and Junior is that Stanley only needed to see stationary obstacles in front of him,” says Stavens. “Junior needs to see moving obstacles all around him.”
The primary sensor in Junior is Velodyne’s HD Lidar. The high-definition (HD) sensor provides 3D information about the surrounding environment. With its 64 lasers spinning at about 15 to 16 rev/sec, the unit generates 2,621,440 points/sec over about a 50m range. Because the sensor rotates much faster than anything in the environment moves, each revolution amounts to a nearly instantaneous 3D snapshot of the scene. “Sixteen cycles per second is more than fast enough for any type of ground vehicle or pedestrian or bicycle tracking,” says Stavens. To detect objects beyond the range of the Velodyne HD Lidar, an Ibeo ALASCA XT Lidar provides measurements out to distances of more than 200m. That unit returns four echoes per plane and operates in four planes, producing 16 echoes per measurement.
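The quoted throughput figures are internally consistent, and a short calculation using only the numbers in the text shows what they imply per laser. The per-revolution sample count below is derived, not a quoted spec:

```python
# Decompose the quoted Velodyne throughput: 64 lasers spinning at
# 16 rev/sec while producing 2,621,440 points/sec.
lasers = 64
rev_per_sec = 16
points_per_sec = 2_621_440

# Range samples each laser takes per revolution (derived figure).
samples_per_laser_per_rev = points_per_sec // (lasers * rev_per_sec)
print(samples_per_laser_per_rev)            # → 2560

# Implied angular spacing between consecutive samples in a sweep.
print(360 / samples_per_laser_per_rev)      # → 0.140625 (degrees)
```

At that angular resolution, a point every 0.14 degrees, the gap between neighboring returns at the 50m edge of the range is still only about 12 cm, which is why the sensor can resolve pedestrians and bicycles at speed.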
A Point Grey Ladybug2 provides the vision input. The unit uses six Sony 1/3-inch progressive scan CCD cameras with a resolution of 1024 × 768 and a frame rate of 30 frames/sec. Five cameras are positioned in a horizontal ring and one points straight up. The head unit uses a proprietary 1.2 Gbps optical link to the compressor unit that provides the ability to stream images at up to 30 frames/sec.
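A quick bit of arithmetic shows why the link between the camera head and the compressor unit needs better than a gigabit per second. The 8-bits-per-pixel figure below is an assumption about the raw sensor data, not a published Ladybug2 specification:

```python
# Estimate the raw data rate of six 1024x768 cameras at 30 frames/sec,
# assuming 8 bits per pixel of raw (Bayer-pattern) sensor data.
cameras = 6
width, height = 1024, 768
fps = 30
bits_per_pixel = 8  # assumed raw depth, not a published spec

bits_per_sec = cameras * width * height * fps * bits_per_pixel
print(f"{bits_per_sec / 1e9:.2f} Gbps")  # → 1.13 Gbps
```

Under that assumption the six streams already consume about 1.13 Gbps of the 1.2 Gbps optical link, which explains why the head unit hands raw data straight to a dedicated compressor rather than processing it on board.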
Stavens considers the Applanix POS LV 420 system a key part of the sensing technology since it determines the vehicle’s location. The unit combines two dual-frequency GPS receivers, a high-performance inertial measurement unit (IMU), wheel odometry and OmniSTAR’s satellite-based differential correction service. Fusing all the inputs together in real time provides an extremely accurate estimate of the vehicle’s location. “That turns out to be essential because when you are doing the sensor fusion of all these different sensors, one of the things that is really important is knowing precisely where the vehicle is when these measurements are being taken,” says Stavens.
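The idea of fusing a drift-free but noisy position source (GPS) with a smooth but drifting one (IMU dead reckoning) can be illustrated with a toy one-dimensional complementary filter. This is a sketch for intuition only, with made-up gains and measurements; the actual Applanix system uses a far more sophisticated, tightly coupled Kalman filter:

```python
def fuse(gps_pos, imu_vel, prev_est, dt, alpha=0.98):
    """One complementary-filter step: dead-reckon with the IMU
    velocity, then nudge the estimate toward the GPS fix.
    alpha is an illustrative blending gain, not a tuned value."""
    predicted = prev_est + imu_vel * dt   # smooth short-term motion
    return alpha * predicted + (1 - alpha) * gps_pos  # long-term anchor

# Vehicle moving at 10 m/s; GPS fixes are noisy around the true path.
est = 0.0
for gps, vel in [(1.1, 10.0), (2.0, 10.0), (2.9, 10.0)]:
    est = fuse(gps, vel, est, dt=0.1)
print(round(est, 3))  # → 3.0
```

Even this crude blend tracks the true position (3.0 m after 0.3 s) while smoothing the GPS noise; the short-term estimate rides on the IMU, and the GPS term keeps it from drifting over time.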
The sensors, particularly the Ladybug2 and the Velodyne HD Lidar, produce a phenomenal amount of data. To communicate the data, the Velodyne, Ibeo and Applanix sensors use Ethernet. The Point Grey Ladybug2 uses FireWire 800. The vehicle’s computers process the instruments’ data as frequently as 200 times per second.
Since the sensors all exist, Stavens speculates several teams in the Urban Challenge will have a similar sensor suite, some type of laser combined with some type of camera. “So it really comes down to the software,” he says. “We think of this as a software competition — who can write the best artificial intelligence software.”