During infancy, a baby's brain maps a mother's moving lips to the sound of her voice using the superior colliculus, a region that associates the direction of external stimuli with an internal visual reference. Researchers at the University of Illinois are using that idea to develop a new self-aiming camera that could help the military distinguish a flock of geese from a fleet of MiGs. "The superior colliculus serves as the visual reflex center of the brain," says Sylvian Ray, a UI professor of computer science. "It is the primary agent for deciding which direction to turn the head in response to sensory stimuli," he says. The system includes microphones and two cameras. As one camera detects motion by comparing successive frames, the system monitors audio signals from the omni-directional microphones. Sound-localization algorithms analyze the sound and pass the result to a neural network, and a second camera equipped with a long-range lens determines the position of the target. Ray says the combination of sight and sound offers a stronger stimulus than either individually. For more information, go to www.uiuc.edu.
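The article does not describe the sound-localization algorithms themselves, but a common approach with a pair of microphones is to estimate the time-difference-of-arrival (TDOA) by cross-correlation and convert it to a bearing. The sketch below is purely illustrative, not the UI team's method; the microphone spacing, sample rate, and sign convention are all assumptions.

```python
# Illustrative TDOA sound localization with two microphones (not the
# actual UI system): cross-correlate the two channels, find the lag of
# the correlation peak, and convert that delay to a bearing angle.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 C
MIC_SPACING = 0.5        # metres between the two microphones (assumed)
SAMPLE_RATE = 44_100     # Hz (assumed)

def estimate_bearing(left: np.ndarray, right: np.ndarray) -> float:
    """Bearing of the source in degrees (0 = straight ahead,
    negative = toward the left microphone)."""
    corr = np.correlate(left, right, mode="full")
    lag = int(np.argmax(corr)) - (len(right) - 1)   # delay in samples
    tdoa = lag / SAMPLE_RATE                        # delay in seconds
    # Far-field approximation: sin(theta) = tdoa * c / d
    sin_theta = np.clip(tdoa * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))

# Synthetic check: the same burst of noise reaches the right microphone
# 20 samples after the left one, so the source sits off to the left.
rng = np.random.default_rng(0)
burst = rng.standard_normal(4096)
left = np.concatenate([burst, np.zeros(40)])
right = np.concatenate([np.zeros(20), burst, np.zeros(20)])
bearing = estimate_bearing(left, right)   # roughly -18 degrees
```

In a full system along the lines described, such a bearing estimate (or the raw correlation features) would be one of the inputs fused with the motion cue by the neural network before the long-range camera is steered.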
When you think of the DARPA Robotics Challenge, you may imagine complex humanoid contraptions made of metal and wires that move like a Terminator Series T-90. But what actually happened at the much-vaunted event was something just a bit different.
Traditional dev kits are built around a manufacturer’s microcontroller, radio module, or sensor device, the idea being to help design engineers get their own IoT prototypes working as quickly as possible. A not-so-traditional IoT development kit released by Bosch aims to simplify IoT prototyping even further.
Focus on Fundamentals consists of 45-minute online classes that cover a host of technologies. You learn without leaving the comfort of your desk. All classes are taught by subject-matter experts and all are archived, so if you can't attend live, attend at your convenience.