Drivers who turn on their high-beam headlights during heavy snowfall quickly learn what the term "blinding snowstorm" really means. Light bounces off the flakes and returns to the driver's eye, ruining forward vision. Now, however, researchers at Carnegie Mellon University's (CMU) Robotics Institute believe they've found a solution. By combining projector-type headlights with camera-based sensors, they can direct light between snowflakes or raindrops, reducing the reflection that temporarily blinds the driver.
"We can illuminate the space around the particles," Srinivasa Narasimhan, associate professor of robotics at CMU, told us. "And we can do it because we now have a way to control light over space and time."
Carnegie Mellon University created a smart headlight system that directs light between rain drops or snowflakes.
CMU's development requires automakers to go beyond traditional headlights, to light sources such as light-emitting diodes (LEDs), liquid crystal displays (LCDs), or Digital Light Processing (DLP) projectors. By combining one of those light sources with camera-based sensors and a microprocessor, CMU researchers have shown that they can locate the offending droplets and turn off the individual light pixels whose beams would otherwise scatter off them.
"For a long time, the automotive headlight has been a bulb with mirrors and lenses to process the beam," Narasimhan said. "Here, we want to use a light projector instead of a bulb. That way, we have a million pixels that we can control."
The heart of the system is the software developed at CMU, which recognizes each fast-moving precipitation droplet and turns off the corresponding pixel in response. The software -- along with the light source, sensor, and an Intel quad-core i7 processor -- forms an embedded system that reads the sensor signal and orchestrates the on/off reaction at each pixel.
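The detect-and-react cycle described above can be sketched in a few lines. This is a minimal illustration, not CMU's implementation: it assumes the camera and projector share the same pixel grid, and the function names and brightness threshold are invented for the example.

```python
import numpy as np

def detect_particles(frame: np.ndarray, threshold: int = 200) -> np.ndarray:
    """Flag pixels bright enough to be a raindrop or snowflake (assumed heuristic)."""
    return frame > threshold

def projector_mask(frame: np.ndarray) -> np.ndarray:
    """Turn OFF (0) pixels that would hit a particle; leave the rest ON (1)."""
    mask = np.ones_like(frame, dtype=np.uint8)
    mask[detect_particles(frame)] = 0  # don't illuminate the droplet itself
    return mask

# One pass of the capture -> process -> project cycle:
frame = np.zeros((480, 640), dtype=np.uint8)
frame[100, 200] = 255                  # a single bright "snowflake"
mask = projector_mask(frame)           # mask is 0 at the flake, 1 elsewhere
```

In practice the system must also predict where each droplet will be by the time the projector responds, which is why the processing latency discussed next matters so much.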
The key to the system's ability to do that is the co-location of the light projector and the camera. CMU employs a beam splitter to send out light and pick up the reflection from the same pixel. It does all this -- captures the image, processes the data, and projects the light -- in a scant 13 ms. Ultimately, the researchers want to reduce that time to about 2 to 3 ms, which they said would enable the system to work in a raging thunderstorm on a car driving 60 mph. To bring the time down, they plan to incorporate the light source, sensor, and processor chip in a single embedded system. "What we need is tight integration between the LCD, camera, and embedded computer," Narasimhan said. "With everything tightly coupled, we can bring the time down by eight milliseconds."
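A back-of-the-envelope calculation shows why the latency target matters. The sketch below computes how far the scene shifts during one capture-process-project cycle; the raindrop fall speed of roughly 9 m/s is an assumed typical terminal velocity for illustration, not a figure from the CMU team.

```python
MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def travel_mm(speed_ms: float, latency_s: float) -> float:
    """Distance moved during one full sensing-to-projection cycle, in mm."""
    return speed_ms * latency_s * 1000

car = 60 * MPH_TO_MS   # ~26.8 m/s at 60 mph
rain = 9.0             # assumed raindrop terminal velocity, m/s

for latency in (0.013, 0.003):  # today's 13 ms vs. the 2-3 ms target
    print(f"{latency * 1000:.0f} ms: car moves {travel_mm(car, latency):.0f} mm, "
          f"drop falls {travel_mm(rain, latency):.0f} mm")
```

At 13 ms, a car at 60 mph closes roughly a third of a meter on each droplet before the projector can react; at 3 ms, that shrinks to about 8 cm, which is why tighter integration is the researchers' priority.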