During infancy, a baby's brain uses the superior colliculus to map a mother's moving lips to the sound of her voice, associating an external direction with an internal visual reference. Researchers at the University of Illinois are using that idea to develop a self-aiming camera that could help the military distinguish a flock of geese from a fleet of MiGs. "The superior colliculus serves as the visual reflex center of the brain," says Sylvian Ray, a UI professor of computer science. "It is the primary agent for deciding which direction to turn the head in response to sensory stimuli," he says. The system includes microphones and two cameras. As one camera detects motion by comparing successive frames, the system monitors audio signals from the omnidirectional microphones; sound-localization algorithms analyze the sound and feed the results to a neural network. A second camera equipped with a long-range lens then determines the position of the target. Ray says the combination of sight and sound offers a stronger stimulus than either alone. For more information, go to www.uiuc.edu.
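The article doesn't describe the researchers' algorithms, but the pipeline it sketches (frame-differencing motion detection, cross-correlation sound localization, and a fusion step) can be illustrated with a minimal Python sketch. The microphone spacing, sample rate, field of view, fusion weighting, and all function names below are hypothetical, not drawn from the UI system.

```python
import numpy as np

SOUND_SPEED = 343.0   # speed of sound, m/s
MIC_SPACING = 0.3     # hypothetical distance between the two microphones, m
SAMPLE_RATE = 16000   # hypothetical audio sample rate, Hz

def visual_motion_bearing(prev_frame, frame, fov_deg=60.0):
    """Locate motion by frame differencing: find the image column with the
    largest change between successive frames and map it to a bearing."""
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    col_energy = diff.sum(axis=0)            # motion energy per column
    if col_energy.max() < 1e-6:
        return None                          # no motion detected
    frac = col_energy.argmax() / (frame.shape[1] - 1)
    return (frac - 0.5) * fov_deg            # degrees off the camera axis

def audio_bearing(left, right):
    """Estimate bearing from the time difference of arrival between two
    microphones, found by cross-correlation; positive angles point toward
    the right microphone under a far-field approximation."""
    corr = np.correlate(left, right, mode="full")
    lag = corr.argmax() - (len(right) - 1)   # delay of left vs. right, samples
    tdoa = lag / SAMPLE_RATE
    s = np.clip(SOUND_SPEED * tdoa / MIC_SPACING, -1.0, 1.0)
    return np.degrees(np.arcsin(s))

def fused_bearing(visual, audio, w_visual=0.5):
    """Blend the two cues; per the article, the combination is a stronger
    stimulus than either alone. The weighting here is arbitrary."""
    if visual is None:
        return audio
    return w_visual * visual + (1.0 - w_visual) * audio

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    burst = rng.standard_normal(1024)        # synthetic noise burst
    right, left = burst, np.roll(burst, 5)   # left channel delayed 5 samples
    prev = np.zeros((64, 64)); prev[30:34, 20:24] = 255.0
    curr = np.zeros((64, 64)); curr[30:34, 40:44] = 255.0
    v, a = visual_motion_bearing(prev, curr), audio_bearing(left, right)
    print(f"visual {v:.1f} deg, audio {a:.1f} deg, "
          f"fused {fused_bearing(v, a):.1f} deg")
```

In a real system the fused bearing would drive the pan of the second, long-lens camera; here the demo simply prints the three estimates for synthetic video frames and a delayed noise burst.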
Determining the quantities and location of sensors in an Internet of Things application requires a thorough problem statement and a clear vision of success, an expert will tell engineers at the upcoming Design & Manufacturing Show in Minneapolis.
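The brief doesn't detail the speaker's method, but sensor count and placement is often framed as a coverage problem. As one common way to reason about it, here is a minimal greedy set-cover sketch in Python; the floor plan, candidate mounting points, and coverage sets are all hypothetical.

```python
# Hypothetical floor plan: candidate mounting points mapped to the machines
# each point can monitor (a point "covers" a machine within sensing range).
CANDIDATES = {
    "P1": {"press", "lathe"},
    "P2": {"lathe", "conveyor"},
    "P3": {"conveyor", "packer"},
    "P4": {"press", "packer"},
}
MACHINES = {"press", "lathe", "conveyor", "packer"}

def greedy_placement(candidates, targets):
    """Pick mounting points one at a time, each covering the most
    still-uncovered machines (the classic greedy set-cover heuristic)."""
    uncovered, chosen = set(targets), []
    while uncovered:
        best = max(candidates, key=lambda p: len(candidates[p] & uncovered))
        if not candidates[best] & uncovered:
            break                  # remaining machines are unreachable
        chosen.append(best)
        uncovered -= candidates[best]
    return chosen

print(greedy_placement(CANDIDATES, MACHINES))  # e.g. ['P1', 'P3']
```

The greedy heuristic won't always find the true minimum, but it makes the underlying point concrete: how many sensors you need falls out of a clear statement of what must be covered.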
Focus on Fundamentals consists of 45-minute online classes that cover a host of technologies. You learn without leaving the comfort of your desk. All classes are taught by subject-matter experts and all are archived, so if you can't attend live, you can attend at your convenience.