If we ever do get any of these smart cars into general use, they will undoubtedly be the slowest vehicles on the road, and the most irritating to every other driver. The reason is that they will all be programmed with the lawyer sitting right next to the code writer, insisting that the software must not exercise any judgement but only do what has been proven to be safe. The result will wind up being way too safe, probably to the point of paralysis. That is what will ultimately kill the madness of the self-driving car: they will never exercise judgement, only follow the rules programmed into them. So the lawyers will destroy what the engineers build. How else could it possibly wind up?
On my last trip, I was passed by a Mustang traveling so much faster than the posted 75 mph that my SUV rocked. The driver cut around traffic on the left shoulder...sporting a space-saver spare on the right front.
Just last month, the Exxon pipeline ruptured and spilled over ten thousand gallons of tar sands crude into an Arkansas neighborhood.
My SUV spent several days in the shop just this past December after I drove through a local rain puddle in the middle of our street (it has been there for the 28 years I have lived on the street). The ignition computer AND starter shorted out from that minor splash (the splash shields were missing; the only maintenance ever done on the vehicle was by the dealer).
My point is that the success of any smart vehicle/autonomous roadway system depends 100% on built-in, carefully thought-out hardware and programs, with redundant systems and meticulous maintenance.
The weakest link in any system is people. As a species, we take shortcuts with maintenance and common sense, then apologize profusely when disaster happens.
Commercial aviation has periodic disasters even though it is run by professionals...who are funded by political whims. I doubt that the automotive corollary will see FAA-style diligence.
Driving is one of the daily joys I have in this world. In my 20-mile commute I can take any of 15 or so routes, and each route has its challenges. Each corner has its entry, apex, and exit. No matter the speed, you can hit the apex and accelerate from 21 to 24, or in your mind from 75 to 120. Some of my most entertaining conversations are with (or about) other drivers while driving. Before I will tolerate or support autonomous automobiles, I would suggest a remote shutdown that counts the number of electronic "complaints" received; once the magic number was exceeded, the automobile would stop. That way we all could vote on the behavior of our fellow drivers (and get voted on ourselves). We cannot abdicate our responsibility to drive and act civilly to a piece of technology. No matter what the lawyers try to prove, we are responsible for our actions, and we do not need more technology to further separate us from those responsibilities. This rant is in no way relevant to the technological advances demonstrated in an autonomous vehicle, just to the justification thereof.
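Tongue-in-cheek as the suggestion is, the complaint-threshold shutdown above is simple enough to sketch. The following is a minimal, purely hypothetical illustration (the class name, the "magic number" of 3, and the one-vote-per-vehicle rule are all my assumptions, not anything proposed in detail here):

```python
class ComplaintGovernor:
    """Counts electronic 'complaints' against a vehicle; signals a stop
    once the count exceeds a threshold (the 'magic number')."""

    def __init__(self, magic_number: int = 3):
        self.magic_number = magic_number      # assumed threshold, illustrative only
        self.complaints: set[str] = set()     # one vote per complaining vehicle ID

    def file_complaint(self, complainer_id: str) -> bool:
        """Record a complaint; return True if the vehicle should now pull over."""
        self.complaints.add(complainer_id)    # repeat votes from one car don't stack
        return len(self.complaints) > self.magic_number


# Usage: four distinct drivers complain, exceeding a threshold of 3.
governor = ComplaintGovernor(magic_number=3)
for other_driver in ("car-A", "car-B", "car-C", "car-D"):
    must_stop = governor.file_complaint(other_driver)
print(must_stop)  # True once more than 3 distinct complainers have voted
```

Using a set rather than a raw counter means one aggrieved driver cannot single-handedly vote a car off the road by mashing the complaint button.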
It has occurred to me that the simple specification for smart car reliability is that it "should not fail in any way at any time under any conditions." Until that condition is met, we will need drivers paying attention and able to take control.