Distance accuracy and precision are crucial components of effective advanced driver-assistance systems (ADAS) and higher levels of autonomy. One of the leading technologies for this purpose is LIDAR; however, it remains too expensive for many of these applications.
Thanks to advances in stereo-pair imaging, comparable distance accuracy and precision can be achieved by triangulating between two cameras, whether visible or thermal, to build 3D perception maps of the objects in a scene; ideally, such a system can also serve as a redundant complement to LIDAR for higher levels of autonomy. Recent breakthroughs in stereoscopic technology mean that stereo-paired cameras can be placed virtually anywhere on the vehicle and can automatically adjust and recalibrate themselves regardless of weather, atmospheric conditions, or even a bumpy roadway.
Why Thermal Imaging?
Thermal cameras detect heat, or infrared energy, produced and reflected by everything on earth, so they can see significantly farther than headlights at night. Thermal sensing excels in driving situations where other sensor technologies may be challenged, including low-visibility and high-contrast conditions: nighttime, shadows, dusk, sunrise, or facing direct sun or headlight glare. As a passive sensing modality, thermal imaging is not affected or blinded by other vehicles' active sensors such as LIDAR or radar, and it remains effective in challenging weather conditions such as fog, smoke, and dust.
Improving Detection Precision by Identifying Roadway Obstacles in Real-Time
Stereoscopic imaging works similarly to human vision: it is based on the triangulation of rays, in this case visible and/or thermal rays, from two or more viewpoints, providing depth perception by computing the distance to each object in a given scene.
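As an illustration of the triangulation described above, the standard relation for a rectified stereo pair is depth = focal length × baseline / disparity. The sketch below uses hypothetical parameter values (a 1000-pixel focal length, a 0.5 m baseline); real systems derive these from calibration.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate distance to a point from its disparity between two rectified views.

    focal_px     -- camera focal length expressed in pixels
    baseline_m   -- distance between the two camera centers, in meters
    disparity_px -- horizontal pixel shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Hypothetical example: 1000 px focal length, 0.5 m baseline, 10 px disparity
print(depth_from_disparity(1000.0, 0.5, 10.0))  # -> 50.0 (meters)
```

Note how depth resolution degrades with distance: a one-pixel disparity error matters far more at 50 m than at 5 m, which is why wider baselines and higher-resolution sensors improve long-range precision.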
To achieve a high level of situational awareness, vision sensors, whether visible or thermal, must deliver high-resolution, high-precision data about every object in the scene. In other words, the number of pixels on an object is critical both for detecting that the object is there (probability of detection) and for determining what it is (classification): the more pixels, the better. Stereoscopic imaging using thermal and visible-light cameras can greatly improve ADAS and autonomous-vehicle safety through improved situational awareness at night, in high-contrast scenes such as shadows, and in adverse weather and poor lighting conditions.
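The pixels-on-object idea can be made concrete with a back-of-the-envelope calculation: the angle each pixel covers (the instantaneous field of view) divides the angle a target subtends. The field of view, resolution, target size, and distance below are all hypothetical values chosen for illustration.

```python
import math

def pixels_on_target(hfov_deg: float, h_res_px: int,
                     target_width_m: float, distance_m: float) -> float:
    """Approximate number of horizontal pixels a target subtends in the image.

    hfov_deg       -- camera horizontal field of view, in degrees
    h_res_px       -- horizontal sensor resolution, in pixels
    target_width_m -- physical width of the target, in meters
    distance_m     -- range to the target, in meters
    """
    # Angle covered by a single pixel (instantaneous field of view).
    ifov_rad = math.radians(hfov_deg) / h_res_px
    # Small-angle approximation for the angle the target subtends.
    target_angle_rad = target_width_m / distance_m
    return target_angle_rad / ifov_rad

# Hypothetical example: 50-degree HFOV, 1280 px sensor,
# 0.5 m-wide pedestrian at 100 m
print(round(pixels_on_target(50.0, 1280, 0.5, 100.0), 1))  # -> 7.3
```

A handful of pixels may be enough to detect that something is present, but classification typically needs many more, which is why resolution and optics drive the effective range of any camera-based system.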
LIDAR, on the other hand, must be fused with a camera to achieve situational awareness at long distances, which requires additional computational power and resources; stereo imaging, by contrast, already uses vision sensors as part of the technology.
In addition to object detection, thermal stereo pairs can also be leveraged as an alternative and redundant layer of 3D perception data. By determining the distance and shape of objects around the vehicle, this data can also be used to help speed development and testing of future ADAS/AV features, including Automatic Emergency Braking (AEB), Lane Keep Assist, Active Cruise Control, as well as autopilot systems found in higher levels of autonomous vehicles.
Thermal stereo cameras are already being developed in the marketplace and can serve as a valuable tool for unmanned and autonomous aircraft, boats, and of course ground-based vehicles. More specifically, companies have already developed autonomous vehicle vision systems that rely on both visible and thermal stereo cameras to provide greater situational awareness.
At a cost of a few hundred dollars, stereoscopic imaging saves integrators a significant amount of money compared to LIDAR, which can cost thousands. These savings are then passed on to the consumer, driving increased adoption.
Although autonomous carmakers, entrepreneurs, engineers, and developers have made significant progress toward true self-driving cars, many challenges remain. These challenges are not insurmountable. By continuing to iterate, test, and validate, and by implementing emerging technologies like thermal stereo vision, it is just a matter of time before we see the world's first fully autonomous vehicle.