Ever wondered what you look like on camera from a kilometer away? Now you can find out, because scientists at Heriot-Watt University in Edinburgh, Scotland, have developed a camera system capable of capturing precise 3D images from up to a kilometer away. The new system is accurate enough to detect subtle millimeter-scale depth changes, and its 3D object detection easily surpasses standard 2D camera technology.
The system was developed in a study led by Professor Gerald Buller of Heriot-Watt's School of Engineering and Physical Sciences. The team took the time-of-flight technique already used in several autonomous machine-vision applications and upgraded it for long-range imaging. Traditionally, the technique bounces a laser beam off an object and measures the time the light takes to travel back to a detector. The new system instead uses low-power infrared pulses and measures, pixel by pixel, the travel time of photons reflected from a distant object. Counting individual photons lets the system build up an entire image while resolving depth changes at the millimeter scale.
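The ranging principle described above comes down to simple arithmetic: distance is half the round-trip time multiplied by the speed of light, and millimeter depth resolution therefore implies picosecond-scale timing precision. A minimal Python sketch of that arithmetic (illustrative only; the function names and example values are assumptions, not the team's actual processing code):

```python
# Back-of-the-envelope sketch of single-photon time-of-flight ranging.
# Not the Heriot-Watt team's pipeline -- just the core relationship:
# a pulse reflects off the target, and the round-trip time t of a
# returning photon gives the distance d = c * t / 2.

C = 299_792_458.0  # speed of light in vacuum, m/s

def depth_from_round_trip(t_seconds: float) -> float:
    """Distance to the target (m) from round-trip photon travel time (s)."""
    return C * t_seconds / 2.0

def timing_resolution_for_depth(delta_d: float) -> float:
    """Round-trip timing precision (s) needed to resolve a depth step (m)."""
    return 2.0 * delta_d / C

# The 910 m figure from the article corresponds to a ~6.1 microsecond round trip:
t = 2.0 * 910.0 / C
print(depth_from_round_trip(t))           # 910.0 (meters)
# Millimeter depth resolution demands picosecond-scale timing:
print(timing_resolution_for_depth(1e-3))  # roughly 6.7e-12 (seconds)
```

The second function makes the engineering challenge concrete: to tell two surfaces a millimeter apart, the detector must time photon arrivals to within a few picoseconds.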
These figures were photographed from 910m away. The images on the left were given more time to collect data; those on the right show the noise that results from a shorter acquisition. Facial distortions are due to weak reflection from skin. (Source: Heriot-Watt University)
By using infrared rather than higher-frequency laser light, the team was able to image "uncooperative objects" -- objects that do not reflect laser pulses easily. The researchers chose a beam with a wavelength of 1,560nm -- longer and redder than visible light, but still safe for the human eye.
"Our approach gives a low-power route to the depth imaging of ordinary, small targets at very long range," Dr. Aongus McCarthy, a project group member and research fellow at Heriot-Watt, said in a press release. "While it is possible that other depth-ranging techniques will match or out-perform some characteristics of these measurements, this single-photon counting approach gives a unique tradeoff between depth resolution, range, data-acquisition time, and laser-power levels."
As of now, the team believes the primary use for the system will be monitoring static, man-made objects. Faces in the image above are distorted because human skin is not a suitable reflector for the low-power infrared light. The team is also working on making the technology available for outdoor vegetation monitoring and rock-movement detection to prevent environmental hazards. McCarthy said that, with a few hardware and software changes, the system will be able to detect the speed and direction of objects while extending its range up to 10 kilometers -- all within the next five years.
Also wouldn't be surprised if Naval Special Warfare Group is already out testing the long-range version operationally. A person camouflaged against a brushy background would stand out as a 3D image--happy hunting.
You mention GoogleEarth- that's the thought that came to my mind, but in a slightly different way: I always imagined that all the terrestrial imaging from space was simply the result of high-end optics and a huge megapixel array capturing detail from 100 miles up. (I actually don't have a clue how Google gets that resolution; I'm guessing.) But this article seems to describe more of a scanning system than an optics system. Seems like it's more like RADAR than photography.
Hmm, if it has mm resolution at a km, what's it like from 5m? Does this mean we can spin someone or something on a stool and get an accurate 3D model? Could be great for engineering a conversion of something for 3D printing. I know there are solutions out there with cameras and lasers, but I've not seen anything that was as good as a contact system.
The picture quality seems blurry, but I guess that's because the system doesn't work well with the human face. I think they ought to improve the resolution, because it goes without saying that agencies like the CIA and FBI will adopt the technology. The environmental monitoring part is pretty fascinating, since natural calamities won't catch us with our guard down.