During infancy, a baby's brain maps a mother's moving lips to the sound of her voice using the superior colliculus, a brain region that associates the direction of external stimuli with an internal visual reference. Researchers at the University of Illinois are drawing on that idea to develop a self-aiming camera that could help the military distinguish a flock of geese from a fleet of MiGs. "The superior colliculus serves as the visual reflex center of the brain," says Sylvian Ray, a UI professor of computer science. "It is the primary agent for deciding which direction to turn the head in response to sensory stimuli." The system includes omni-directional microphones and two cameras. As the first camera detects motion by comparing successive frames, the system monitors audio signals from the microphones; sound-localization algorithms analyze the audio and feed the results to a neural network. A second camera equipped with a long-range lens then determines the position of the target. Ray says the combination of sight and sound offers a stronger stimulus than either does individually. For more information, go to www.uiuc.edu.
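The article gives no implementation details, but the two cues it describes are standard techniques: frame differencing for visual motion, and estimating a sound's direction of arrival from the time delay between two microphones (the peak of their cross-correlation). A minimal sketch of both, assuming a simple two-microphone array and grayscale frames (function names, parameters, and geometry are illustrative, not from the UI system):

```python
import numpy as np

def motion_centroid(prev_frame, curr_frame, threshold=25):
    """Frame differencing: return the centroid (x, y) of pixels that
    changed between successive grayscale frames, or None if no motion."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return None  # no motion detected
    return xs.mean(), ys.mean()

def sound_bearing_deg(left, right, mic_spacing=0.2, fs=8000, c=343.0):
    """Estimate direction of arrival from the inter-microphone time delay,
    found as the peak of the cross-correlation of the two channels."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)  # samples left lags behind right
    delay = lag / fs
    # Clamp to the physically possible range before taking arcsin.
    sin_theta = np.clip(delay * c / mic_spacing, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))
```

A fused system along the lines the article describes would steer the long-range camera only when the visual motion centroid and the acoustic bearing agree, which is presumably why Ray calls the combined stimulus stronger than either cue alone.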