Uber's AVS can display information about real-world performance of autonomous vehicles. (Image source: Uber Engineering)
While Uber hasn't made a secret of its autonomous car ambitions, the ride-sharing company has been quietly making inroads in developing new technologies for the space. The latest is a new, open-source version of its Autonomous Visualization System (AVS) that will allow developers and engineers to share autonomous vehicle data in a comprehensible and standardized fashion.
“Understanding what autonomous vehicles perceive as they navigate urban environments is essential to developing the systems that will make them operate safely,” Uber engineers Xiaoji Chen, Joseph Lisee, Tim Wojtaszek, and Abhishek Gupta wrote in a blog post. “And, just as we have standards for street signs and traffic infrastructure to help human drivers, autonomous vehicle developers would be well-served by a standard visualization platform to represent input from sensors, image classification, motion inference, and all other techniques used to build an accurate image of the immediate environment.”
With the new AVS, Uber is offering engineers a web-based toolkit for building applications for analyzing perception, motion, and planning data from autonomous vehicles. By open-sourcing the toolkit, Uber wants to provide a standalone and standardized visualization layer for developers that will eliminate the need for developers to build their own custom visualization software for autonomous vehicles. “With AVS abstracting visualization, developers can focus on core autonomy capabilities for drive systems, remote assistance, mapping, and simulation,” the Uber Engineering team wrote.
Autonomous driving isn't just a challenge for automakers. Major technology companies such as Google, Microsoft, and Nvidia, as well as various academic institutions and startups, are all tackling various aspects of the equation. Visualization tools that display what autonomous vehicles “see” around them are particularly crucial for ensuring self-driving cars operate safely. And as more sophisticated sensor technologies and other hardware solutions constantly come into play, the ecosystem keeps evolving: visualization is no longer just about playing back data, but also about simulation, mapping, image collection, data labeling, and more. This in itself creates a whole infrastructure built around providing the tools engineers need for these tasks.
(Image source: Uber Engineering)
But within all of this there's been a noted lack of standards, according to the Uber Engineering team. “The lack of a visualization standard has resulted in engineers assembling custom tools around ready-made technologies and frameworks in order to deliver solutions quickly,” the Uber blog said. “However, in our experience, these attempts at developing tools around disparate, off-the-shelf components lead to systems that are challenging to maintain, inflexible, and generally not cohesive enough to form a solid foundation for a platform.”
The AVS functions in two layers. The first layer, XVIZ, is the data layer for handling the streams of information coming from various sensors in the autonomous vehicle, such as point clouds from the vehicle's LiDAR sensors. The second layer, streetscape.gl, takes all of the data from XVIZ and transforms it into visual streams in the form of 3D viewports, charts, tables, and videos, depending on the user's preferences.
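To make the division of labor concrete, the snippet below sketches what a single frame of XVIZ-style stream data might look like on the data layer before a renderer such as streetscape.gl picks it up. This is a simplified illustration, not the actual XVIZ schema: the function name, field names, and the `/lidar/points` stream name are illustrative assumptions chosen to show the pattern of named, timestamped streams.

```python
import json

def build_state_update(timestamp, lidar_points):
    """Package one frame of sensor data as named streams.

    Illustrative only: field and stream names here are simplified
    assumptions, not the exact XVIZ specification. The idea is that
    the data layer emits a standardized message, so the visualization
    layer can render it without knowing anything about the sensors.
    """
    return {
        "update_type": "snapshot",
        "updates": [
            {
                "timestamp": timestamp,
                "primitives": {
                    # A hypothetical point-cloud stream, e.g. from the
                    # vehicle's LiDAR; each point is an [x, y, z] triple.
                    "/lidar/points": {
                        "points": [{"points": lidar_points}]
                    }
                },
            }
        ],
    }

# One frame with two LiDAR returns, serialized for transport.
frame = build_state_update(1553116800.0, [[1.0, 2.0, 0.5], [1.1, 2.1, 0.5]])
print(json.dumps(frame, indent=2))
```

The key design point this illustrates is the decoupling the Uber team describes: because every producer writes to the same stream format, a 3D viewport, a chart, or a table can each subscribe to the streams it cares about without custom glue code per sensor.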
By open-sourcing AVS, Uber said it wants not only to give developers easier access, but also to encourage third parties to contribute new features to the platform. The Uber Engineering team said it hopes AVS will eventually spread beyond autonomous vehicles into other mobility-related arenas such as urban planning investment, geospatial analysis, and advanced mapping. “We find that an open data and tools strategy can help governments, developers, researchers, and the overall industry accelerate towards a smarter transportation ecosystem for the future.”
Chris Wiltz is a Senior Editor at Design News covering emerging technologies including AI, VR/AR, blockchain, and robotics.