The fallout from the fatal crash of an auto-piloted Tesla Model S earlier this year continues, as the automaker and its supplier trade blame, and experts caution that more such tragedies could occur as the technology evolves.
The crash, which occurred in May when a driver allegedly failed to keep his hands on the steering wheel of his automated car, is considered a by-product of a general lack of understanding of a nascent technology. The root of the problem, experts say, is that self-driving cars are at an "in-between" phase, in which they're not fully manual and not fully autonomous. As a result, drivers, and sometimes even engineers, can fall prey to misunderstandings.
"Those 'in-between' levels can be a bit tricky," Inseok Hwang, professor of aeronautics and astronautics at Purdue University, told Design News. "Today, some cars are manually driven and some are relatively automated. In these in-between years, there will be a lot of problems."
The Tesla crash serves as a flashpoint because it has fostered a public disagreement between the automaker and its supplier, Mobileye N.V., a manufacturer of camera-based advanced driver-assistance safety systems used in Tesla vehicles, including the Model S.
In July, Mobileye executives said the company would end its partnership with Tesla Motors Inc. once its current contract expires, in part due to disagreements over how the technology was deployed, especially in the fatal Tesla crash.
The accident, which occurred on May 7, happened when a Tesla Model S driver operated his vehicle in Autopilot mode, which features lane keeping, automatic braking, and automatic steering. Public agencies investigating the accident said a white tractor-trailer drove across a Florida highway and the Model S sped under it without braking. A post on Tesla's blog suggested the electric car's sensors didn't see the white trailer against the backdrop of a brightly lit sky. Worse, the driver was allegedly watching a movie at the time, and didn't have his hands on the steering wheel.
Mobileye executives expressed concern, not only because the driver was inattentive, but because the company's technology wasn't meant for such driving scenarios. "Today's collision avoidance technology, or Automatic Emergency Braking (AEB), is defined as rear-end collision avoidance, and is designed specifically for that," Dan Galves of Mobileye wrote in an e-mail to Design News. "This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon." Mobileye separately told the Reuters news service that Tesla was "pushing the envelope in terms of safety."
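The distinction Galves draws can be made concrete with a toy sketch. The code below is purely illustrative (the threshold values, field names, and logic are assumptions, not Mobileye's actual algorithm), but it shows why a threat test designed around rear-end scenarios, keyed on closing speed within the ego lane, can reject a laterally crossing vehicle:

```python
# Illustrative sketch only -- not Mobileye's actual logic. Thresholds and
# field names are assumed for the example. It shows how a rear-end-focused
# AEB threat test can ignore a laterally crossing vehicle by design.

from dataclasses import dataclass

@dataclass
class Target:
    lateral_offset_m: float    # distance from the center of the ego lane
    closing_speed_mps: float   # positive = ego vehicle closing on target
    lateral_speed_mps: float   # target's sideways (crossing) motion

LANE_HALF_WIDTH_M = 1.8        # assumed half-width of a highway lane
CROSSING_THRESHOLD_MPS = 2.0   # assumed cutoff for "crossing" motion

def aeb_should_brake(t: Target) -> bool:
    """Rear-end-style AEB: brake only for in-lane traffic we are closing on."""
    in_lane = abs(t.lateral_offset_m) < LANE_HALF_WIDTH_M
    closing = t.closing_speed_mps > 0
    crossing = abs(t.lateral_speed_mps) > CROSSING_THRESHOLD_MPS
    # A fast laterally moving target is rejected even if momentarily in-lane,
    # because the system was designed and validated for rear-end scenarios.
    return in_lane and closing and not crossing

stopped_car_ahead = Target(lateral_offset_m=0.2, closing_speed_mps=15.0,
                           lateral_speed_mps=0.0)
crossing_trailer = Target(lateral_offset_m=0.0, closing_speed_mps=15.0,
                          lateral_speed_mps=8.0)
print(aeb_should_brake(stopped_car_ahead))  # True
print(aeb_should_brake(crossing_trailer))   # False
```

In this sketch, both targets are directly ahead and closing, yet only the stopped car triggers braking; the trailer is filtered out by its lateral motion, mirroring the "not designed to actuate upon" behavior Galves describes.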
Mobileye said that it will continue to "support and maintain" current Tesla Autopilot product plans, but added that its relationship with the automaker will not extend beyond its current generation of automated driving products.
Tesla, meanwhile, has emphasized that the driver misapplied the technology by being inattentive. "When drivers activate Autopilot, the acknowledgement box explains, among other things, that Autopilot 'is an assist feature that requires you to keep your hands on the steering wheel at all times,'" the company wrote in its blog. It added, "We do this to ensure that every time the feature is used, it is used as safely as possible."
A Human-Automation Problem
Aerospace engineering experts say such scenarios are unsurprising, given that autonomous technology is still emerging. The aviation industry had similar problems during the birth of autopilot systems a half-century ago, they say. The difference, however, is that the aviation industry is strictly regulated, whereas autonomous car development is more industry-driven.
"Airline industry pilots have to go through intensive flight training," Hwang told us. "A lot of time is actually spent on how to use autopilot systems." The aviation industry has also studied the potential problems associated with autopilot systems, including inattentiveness and confusion caused by too much information.
Hwang said that such "human-automation interaction problems" need to be addressed in autos, as well as in aircraft. "The driver doesn't know what the autopilot can do or can't do or will do," he said. "The driver's expectations and the autopilot's expectations can be different. Then it can cause a dangerous situation."
Consumers and engineers will also face another problem: the rapid evolution of the technology itself. Because vehicle development sometimes takes years, sensors may be "old" when they reach the market, and therefore might not offer the same resolution and range as newer generations. In the Tesla case, for example, experts say LIDAR sensors would have helped. "Their cameras didn't perceive it correctly," Charles Reinholtz, chair of the Mechanical Engineering Department at Embry-Riddle Aeronautical University, told Design News. "They probably thought (the tractor-trailer) was open sky. If they had had LIDAR on that vehicle, they would have almost certainly seen that tractor-trailer and would have reacted correctly."
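Reinholtz's point comes down to how the two sensors detect objects. The toy sketch below (all numbers and function names are invented for illustration, not taken from the crash investigation) contrasts a camera check that depends on visual contrast with a LIDAR check that depends only on measured range:

```python
# Toy illustration with assumed numbers -- not data from the investigation.
# A contrast-based camera detector can miss a white trailer against a bright
# sky, while a LIDAR range return flags the obstacle regardless of its color.

def camera_sees_obstacle(object_brightness: float,
                         sky_brightness: float,
                         min_contrast: float = 0.2) -> bool:
    """Camera detection here depends on contrast against the background."""
    contrast = abs(object_brightness - sky_brightness) / max(sky_brightness, 1e-9)
    return contrast >= min_contrast

def lidar_sees_obstacle(range_m: float, max_range_m: float = 100.0) -> bool:
    """LIDAR measures time-of-flight distance; appearance is irrelevant."""
    return range_m < max_range_m

# A white trailer against a brightly lit sky: nearly identical brightness.
white_trailer, bright_sky = 0.95, 0.97
print(camera_sees_obstacle(white_trailer, bright_sky))  # False
print(lidar_sees_obstacle(range_m=40.0))                # True
```

The camera check fails because the trailer and sky differ by only a few percent in brightness, while the LIDAR check succeeds on any solid return within range, which is why experts argue the laser sensor "would have almost certainly seen" the trailer.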
Such problems are inevitable, largely because the technology is still evolving and because engineers can't foresee every scenario. Reinholtz said that some developers are now concerned with how the vehicles will deal with the unpredictable actions of human drivers. Some drivers, for example, may aggressively cut in front of autonomous cars, believing the sensors will always enable them to react in time. "The question is how we will deal with drivers who are unpredictable or are deliberately abusing the technology," Reinholtz said.
Suppliers are working on such issues, but the solutions aren't ready today. Mobileye, for example, will add Lateral Turn Across Path detection capabilities to its systems beginning in 2018. That technology would have almost certainly recognized the tractor-trailer in the tragic Tesla accident. Moreover, the company is developing learning software that will use neural networks to help autonomous vehicles deal with unforeseen circumstances.
All of these developments will take time, and are part of a long development process that needs to occur if vehicles are ever to reach full autonomy. In the meantime, experts say, auto companies need to effectively communicate the existing capabilities of the technology to a largely confused public and resist the temptation to over-hype it.
That will be especially important as more manufacturers, including GM, Ford, Daimler, and Volvo, adopt autonomous features. Many automakers are already deeply committed to the technology, with Ford chairman Bill Ford even saying that autonomous vehicles could have the same impact on society "as Ford's moving assembly line did 100 years ago."
Still, more accidents will occur as the technology grows, experts say. Understanding what autonomous features can and can't do, and effectively communicating that knowledge to consumers, will help prevent some of those accidents. It will also help the industry and public to put them in perspective when they inevitably happen. "It's not going to be trivial and there will be cases that are tragic but, on balance, it will save lives and money," Reinholtz said.
In the long term, automakers may have little choice in the matter. Ultimately, they'll have to come to grips with the fact that the technology is here, because they may be compelled by law to adopt it, Reinholtz added. If they don't adopt it, and the technology is shown to be practical and life-saving, automakers could be held liable for lives lost.
"As an engineer, it's so hard to implement something that you can't be one hundred percent sure of," Reinholtz said. "But sometimes that's what we have to do."
Senior technical editor Chuck Murray has been writing about technology for 32 years. He joined Design News in 1987, and has covered electronics, automation, fluid power, and autos.