The fully autonomous vehicles of the not-so-distant future promise tremendous gains in automotive safety and transportation efficiency. But to fulfill this promise, automotive OEMs must move beyond contemporary levels of vehicle autonomy.
Making that leap will require overcoming a unique set of challenges in testing the automotive radar sensors used in advanced driver assistance systems (ADAS) and autonomous driving systems, as well as developing new methodologies for training the underlying algorithms, tasks that conventional test solutions are ill-equipped to address.
SAE International (formerly the Society of Automotive Engineers) defines six levels of vehicle autonomy, with Level 0 representing fully manual and Level 5 representing fully autonomous.
Today’s most advanced autonomous vehicle systems rate only Level 3, meaning they can make some decisions, such as accelerating or braking, without human intervention. Getting from Level 3 to Level 5 will require many breakthroughs, including closing the gap between software simulation and roadway testing, and training ADAS and autonomous driving algorithms on real-world conditions.
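As a rough illustration, the six SAE levels and the Level 3 boundary described above can be summarized in code. This is a sketch for orientation only; the one-line descriptions paraphrase SAE J3016 rather than quote it, and the enum and helper names are the author's own.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Illustrative summary of the SAE J3016 levels of driving automation.

    Descriptions are paraphrased, not official SAE wording.
    """
    NO_AUTOMATION = 0           # human performs all driving tasks
    DRIVER_ASSISTANCE = 1       # a single assist feature, e.g. adaptive cruise
    PARTIAL_AUTOMATION = 2      # combined steering and speed control; driver supervises
    CONDITIONAL_AUTOMATION = 3  # system drives in limited conditions; driver must take over on request
    HIGH_AUTOMATION = 4         # system drives in limited conditions; no takeover needed there
    FULL_AUTOMATION = 5         # system drives everywhere, in all conditions

def requires_human_fallback(level: SAELevel) -> bool:
    """At Level 3 and below, a human driver must remain available."""
    return level <= SAELevel.CONDITIONAL_AUTOMATION

print(requires_human_fallback(SAELevel.CONDITIONAL_AUTOMATION))  # True
print(requires_human_fallback(SAELevel.HIGH_AUTOMATION))         # False
```

The `requires_human_fallback` boundary is the gap the article describes: today's most advanced systems sit at Level 3, where a human must still be ready to intervene.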
Software simulation plays an important role in autonomous vehicle development. Simulating environments through software can help validate the capabilities of ADAS and autonomous driving systems. But simulation cannot fully replicate real-world driving conditions or the potential for imperfect sensor response — something that fully autonomous vehicles will inevitably have to contend with.
OEMs rely on on-road testing to validate ADAS and autonomous driving systems prior to bringing them to market. While road testing is and will continue to be a vital component of the development process, it is time-consuming, costly, and difficult to repeat, particularly when it comes to controlling environmental conditions. Relying on on-road testing alone to develop vehicles reliable enough to navigate urban and rural roadways safely 100 percent of the time would take decades. For development to occur in a realistic timeframe, the algorithms must also be trained off the road.
Validating radar-based autonomous driving algorithms is a crucial task. The sensors capture information about road and traffic conditions and feed it to processors and algorithms that decide how the vehicle should respond in any given situation. Without proper training, autonomous vehicles could make decisions that undermine driver, passenger, or pedestrian safety.
Just as people become better drivers with time and experience, autonomous driving systems improve their ability to deal with real-world driving conditions with time and training. And achieving Level 5 autonomy will require complex systems that exceed the abilities of the best human drivers.
Premature road testing of unproven ADAS and autonomous driving systems also creates risks. OEMs need the ability to validate actual sensors, electronic control unit code, artificial intelligence, and more before vehicles reach the road.
Some radar target simulation systems do not provide a true approximation of real-world driving scenarios. They have a limited field of view and cannot resolve objects at distances of less than 4 meters. Some of these systems use multiple radar target simulators, each presenting point targets to radar sensors and emulating horizontal and vertical positions by mechanically moving antennas. This mechanical automation slows overall test time. Other solutions create a wall of antennas fed by only a few target simulators, enabling an object to appear anywhere in the scene, though not multiple objects concurrently. In a static or quasi-static environment, this approach permits tests with only a handful of targets moving laterally at speeds limited by the speed of robotic arms.
Image courtesy of Thomas Goetzl, Keysight Technologies
Current simulators can emulate a maximum of just 32 objects – including vehicles, infrastructure, pedestrians, obstacles, and other objects. This is far fewer objects than a vehicle traveling on the road may encounter at any given time. Testing radar sensors against a limited number of objects delivers an incomplete view of driving scenarios and masks the complexity of the real world.
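To see why the 32-object cap matters, consider a minimal sketch of the selection problem it forces. The function and field names here are hypothetical, not any vendor's API; the point is that any selection policy applied to a dense scene necessarily discards objects the radar sensor would encounter on a real road.

```python
import random

MAX_EMULATED_OBJECTS = 32  # cap of current simulators, per the text above

def truncate_scene(scene_objects, budget=MAX_EMULATED_OBJECTS):
    """Keep only the nearest objects that fit the simulator's budget.

    Hypothetical helper for illustration: real test systems differ,
    but every policy must drop part of a scene denser than the cap.
    """
    by_range = sorted(scene_objects, key=lambda obj: obj["range_m"])
    return by_range[:budget]

# A busy urban scene can contain far more than 32 radar-visible objects:
# vehicles, pedestrians, guardrails, signage, and other obstacles.
scene = [{"id": i, "range_m": random.uniform(1.0, 250.0)} for i in range(120)]
emulated = truncate_scene(scene)
print(f"emulated {len(emulated)} objects, dropped {len(scene) - len(emulated)}")
# emulated 32 objects, dropped 88
```

Whatever the test engineer chooses to keep, the dropped 88 objects are exactly the real-world complexity that the article says current testing masks.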
Thomas Goetzl is vice president and general manager for Automotive and Energy Solutions (AES) of the Electronic Industrial Solutions Group at Keysight as well as managing director of Keysight Technologies in Germany. Tom was most recently the AES Business Manager for power applications focusing on test solutions for Smart Grid, Power Semiconductors, Batteries, and EV/HEV. During his time with HP, Agilent Technologies, and now Keysight Technologies, Tom has managed R&D projects, developed sales channels for remote server management products, and was named marketing manager in 2005. In 2012, Tom became the worldwide marketing manager for Electronic Test Division (ETD).