How truly important is sensing in autonomous vehicle technology? Indu Vijayan, Director of Product Management at AEye, will answer this question during her keynote address at DesignCon. Vijayan will examine the current market for autonomous vehicles, identify the growth drivers, and discuss ways the industry can solve the most challenging edge cases inhibiting broader rollouts. Better still, attendees will experience a real-time AI-based driving demo showing intelligent sensing in action!
To learn more about the technology and trends behind this demo, Design News invited Vijayan to share her thoughts on critical issues. What follows is a portion of the resulting discussion.
Design News: Tell us a bit more about the live AEye demo taking place during your keynote.
Indu Vijayan: I will discuss AI-driven sensing and showcase it via a real-time, interactive driving demo conducted live from the keynote stage. The demo will help illustrate how that technology is helping vehicles see smarter, respond faster, and accelerate the adoption of autonomous features.
Design News: Why is sensing so crucial in autonomous vehicles?
Indu Vijayan: For a car to see as a human does, it needs a sensor suite that equips the vehicle to detect objects and understand its environment on any roadway, in any light and weather condition, and at any speed. Sensors must replicate and, ideally, enhance human vision, enabling vehicles to make simple decisions, like finding a clear driving path, and complex choices, such as navigating roads without signage.
As automated driving features have increased, so too has the number of sensors in a passenger vehicle. This trend began in the 1990s, when radar-based adaptive cruise control was introduced, and continues through the present, when complementary sensors cover every angle around a car. The number of vehicles with sensors is expected to reach ~90 million by the end of the decade, with half of them being used for higher levels of autonomy, SAE Level 2 and above. These higher levels of autonomy will require vehicles to use a mix of sensors, including LiDAR, which can identify and precisely locate potential obstacles and threats, to navigate roadways safely.
Design News: Without giving too much away concerning your keynote, what are the most challenging cases inhibiting broader rollouts of autonomous vehicles?
Indu Vijayan: We refer to the most difficult scenarios, the 5% that are the hardest to solve in autonomy, as “edge” or “corner” cases. These can be weather-related, vehicle-related, object-related, etc. These are the types of scenarios that a sensor system must solve to be smarter than a human. Doing so will require a system-level approach that is better than the sum of its parts and surpasses the performance of both the human eye and the camera alone.
Register now for DesignCon 2021 to learn about the latest chip, board, and system technologies and implementations, plus see the latest exhibitions. Join engineers, technicians, managers, C-level executives, and exhibitors at this 14-track, 3-day conference and 2-day expo, now in its 26th year.
John Blyler is a Design News senior editor, covering the electronics and advanced manufacturing spaces. With a BS in Engineering Physics and an MS in Electrical Engineering, he has years of hardware-software-network systems experience as an editor and engineer within the advanced manufacturing, IoT and semiconductor industries. John has co-authored books related to system engineering and electronics for IEEE, Wiley, and Elsevier.