How much longer do we have to wait for cars to become self-driving nap pods?
Well, it could be a while. Despite significant progress and thousands of miles in real-world testing, there are inherent risks in handing over the reins entirely to software in autonomous vehicles. We are absolutely moving in that direction and we should be, but there is an unprecedented challenge in maintaining and working on systems that are so interdependent. If you make a change to system A, what impact does that have on systems B, C, and D? Is that impact predictable and testable?
The challenge for making self-driving cars safe involves aligning the disparate software systems into full interoperability. (Image source: Uber)
When most people think about autonomous vehicles, they think of them as a singular self-driving capability. In actuality, they comprise multiple interlocked systems that need to inform each other in real time to make and execute the right decision.
The Systems that Make Self-Driving Possible
In March 2018, a self-driving Uber vehicle demonstrated what happens when those systems fail, hitting and killing a pedestrian who was wheeling her bike across a four-lane road. That car was operating with three autonomous systems or modules. The perception module leverages cameras, radar, and LiDAR to identify objects around the car. Next, the prediction module forecasts the movements of those objects. Lastly, a third module sets the driving policy and makes decisions based on the output of the first two modules. In the Uber case, the perception module could not identify the pedestrian in time, delaying the decision-making process until it was too late for the car to brake.
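To make the interdependence concrete, here is a minimal, deliberately simplified Python sketch of how such a three-module pipeline hands data forward. All names, numbers, and the stub logic are illustrative assumptions, not any vendor's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    object_id: int
    kind: str            # e.g. "pedestrian", "vehicle", "unknown"
    position_m: tuple    # (x, y) relative to the car, in meters
    velocity_mps: tuple  # (vx, vy) in meters per second

def perceive(sensor_frame):
    """Perception: fuse camera/radar/LiDAR returns into classified objects."""
    # Stub: a real module runs detection and classification models here.
    return [TrackedObject(1, "pedestrian", (12.0, -1.5), (0.0, 1.2))]

def predict(objects, horizon_s=2.0):
    """Prediction: extrapolate each object's motion over a short horizon."""
    return {
        o.object_id: (o.position_m[0] + o.velocity_mps[0] * horizon_s,
                      o.position_m[1] + o.velocity_mps[1] * horizon_s)
        for o in objects
    }

def decide(objects, predictions, lane_halfwidth_m=2.0):
    """Driving policy: brake if any predicted position falls in our lane ahead."""
    for o in objects:
        x, y = predictions[o.object_id]
        if abs(y) < lane_halfwidth_m and 0 < x < 30:
            return "BRAKE"
    return "CRUISE"

objects = perceive(sensor_frame=None)
predictions = predict(objects)
print(decide(objects, predictions))  # prints "BRAKE": the predicted path crosses the lane
```

The point is the coupling: a perception failure, such as an empty or late object list, starves both downstream modules, so the driving policy has nothing to act on.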
Of course, this technology is still in its relative infancy, a stage at which accidents are almost expected. Even though accident rates for human drivers remain higher, any accident caused by a self-driving vehicle will draw extra scrutiny. So, what can automotive companies do to ensure that our software chauffeurs stay safe moving forward?
An Exciting Testing Future
Like tires or any other part of a vehicle, software can be tested for quality in myriad ways. Historically, and still predominantly, automotive software is embedded: it sits unchanged in a vehicle unless a firmware update is required. Development follows a waterfall model, with quality work concentrated upfront. That approach likely won't change anytime soon, but the permanent nature of embedded software is giving way to over-the-air updates.
Tesla popularized this technology with the Model S. More recently, it wowed the industry with a software update to the Model 3 that reduced its braking distance from 60 mph by 19 feet, just nine days after Consumer Reports criticized the car's stopping performance.
Most other auto manufacturers haven't been able to implement this technology yet, but companies like HERE are now enabling over-the-air software updates via a supplementary part.
Some automotive engineers are skeptical that making changes so rapidly, without proper testing, can be done safely. On the other hand, if we extrapolate where this technology is heading, it could enable dynamic software testing: instead of bringing a vehicle in for an inspection, its software could be qualified overnight and updated if it fails. Recalls that might otherwise cost millions of dollars to remedy could be addressed with a simple over-the-air update.
With a higher volume of updates on increasingly complex systems, traceability is going to be more difficult to manage and more important to maintain. If an accident happens and software is to blame, how fast can the manufacturer or suppliers identify the problem and implement a fix that doesn’t create additional problems?
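Answering that question quickly is, at bottom, a lookup problem across linked records. Here is a rough sketch, using hypothetical commit, requirement, and review identifiers, of what a traceability index might look like:

```python
from collections import defaultdict

# Hypothetical change records: commit ID, affected module, the linked
# requirement, and the peer review that approved the change.
changes = [
    {"commit": "a1b2c3d", "module": "perception", "requirement": "REQ-114", "review": "CR-88"},
    {"commit": "d4e5f6a", "module": "prediction", "requirement": "REQ-201", "review": "CR-91"},
    {"commit": "b7c8d9e", "module": "perception", "requirement": "REQ-117", "review": "CR-95"},
]

# Index by module so an investigator can pull every linked artifact at once.
by_module = defaultdict(list)
for change in changes:
    by_module[change["module"]].append(change)

def trace(module):
    """Return (commit, requirement, review) triples for a suspect module."""
    return [(c["commit"], c["requirement"], c["review"]) for c in by_module[module]]

print(trace("perception"))
```

In practice this index would live in tooling rather than a script, but the shape of the query is the same: from a suspect module to every commit, requirement, and review that touched it.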
The Software Traceability Challenge
In a recent SmartBear report, we found that 55 percent of software development teams use Git as their source control system. Every change they make has a unique commit ID associated with it, which gives you a rough sense of when it was submitted and who might have approved the related pull request.
For the automotive industry, that is not enough. Companies need a more formal peer review structure as part of their software assurance programs. Every change to software needs a well-documented, auditable peer review. For code reviews, that means capturing additional information: a completed, time-stamped review checklist; clear conversation threads about potential defects; and review metrics such as lines of code reviewed and time spent reviewing.
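As an illustration, the kind of record such a review process needs to produce might look like the following sketch. The field names are hypothetical, not any particular tool's schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ReviewRecord:
    """One auditable peer review of a single change (fields are illustrative)."""
    change_id: str
    reviewers: list
    checklist_completed_at: Optional[datetime] = None
    defect_threads: list = field(default_factory=list)
    lines_reviewed: int = 0
    minutes_spent: int = 0

    def is_audit_ready(self):
        """A review is auditable only if every required field is populated."""
        return (bool(self.reviewers)
                and self.checklist_completed_at is not None
                and self.lines_reviewed > 0
                and self.minutes_spent > 0)

record = ReviewRecord(
    change_id="a1b2c3d",
    reviewers=["reviewer_a", "reviewer_b"],
    checklist_completed_at=datetime.now(timezone.utc),
    defect_threads=["possible off-by-one in braking-distance check"],
    lines_reviewed=240,
    minutes_spent=45,
)
print(record.is_audit_ready())  # prints True
```

The design choice worth noting is the completeness check: a review that is missing its checklist timestamp or metrics is flagged before it can be counted toward the audit trail, rather than discovered missing during an investigation.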
There are dedicated code review tools that handle most of this, but they cannot serve as a source of truth for software traceability because they don't capture all changes. If a design document, requirements document, or test plan changes, there is no record of it, or the record lives in a separate tool. Many leaders in the industry have addressed this by adopting Collaborator, a peer review tool that lets teams conduct comprehensive reviews of both code and documents in the same place.
Just as messaging and project management tools have been bundled into the collaboration tools space, code review and document review tools should be considered (within the peer review functional category) as a means to supplement versioning and solve for software traceability.
The Digital Thread for Automotive Software
Peer reviews are a large part of the solution, but still only a part. In manufacturing, the Digital Thread refers to a comprehensive communication trail that spans the full development lifecycle, from planning and design to deployment and maintenance. As this concept gains traction in software development, teams need to build their own Digital Threads. That means using integration-friendly third-party tools, creating reporting dashboards that span their development process, and enacting continuous process improvement.
Systems are only becoming more complex. When a hardware team makes a change to one part of a jet engine, that team then needs to get sign-off from the suppliers of all parts in close proximity to ensure compatibility. Development teams will need to adopt a similar mentality so that all related software systems can be tested and verified in harmony with each other. It will take a proactive and deliberate approach to reach this level of linked quality assurance, but the road forward can’t be paved another way.
Patrick Londa is the digital marketing manager for Collaborator at SmartBear Software. With a background growing agile startups in the clean tech and digital health space, Patrick is now focused on software quality, process traceability, and peer review systems for companies in highly-regulated, high-impact sectors.
SAVE THE DATE FOR PACIFIC DESIGN & MANUFACTURING 2019!
Pacific Design & Manufacturing, North America's premier conference that connects you with thousands of professionals across the advanced design and manufacturing spectrum, will be back at the Anaheim Convention Center February 5-7, 2019! Don't miss your chance to connect and share your expertise with industry peers.