Monday, June 5, 2017

Drivers are prone to error, but so are coders.

I'm finding it difficult to understand why, in an era when so many people interact directly with software on a regular basis — and know first hand how temperamental and buggy it very often is — we are so uncritical of the idea of the driverless car.
People are fond of saying that most car accidents are the result of human error. Consider this: 100% of software defects are the result of human error. Untold thousands of person-hours have been spent building operating systems (both smartphone and desktop) and the programs that run on them: systems that have a prescribed set of machine-readable inputs. And still we get bugs, software updates that render our computers unusable, viruses and other malware, the Blue Screen Of Death, the Spinning Beachball Of Death, core dumps, system panics, and the rest of it. No one has shown us why we should believe that the software running driverless cars — where the set of possible inputs is much larger and much less predictable — would be any better.
A driver-augmentation system that helps with lane-keeping in a prepared environment? Sure. Heads-up displays to conserve and focus driver attention? You bet. Awareness-management systems that alert drivers to potential trouble? Absolutely — as long as that much-maligned but still miraculous human situational intelligence is there to quarterback the whole thing.
Here's a funny thing about driverless cars and safety: people talk about humans being inherently dangerous behind the wheel, but we know for a fact that not all humans are equally dangerous behind the wheel. Even better, we know which humans tend to be most dangerous behind the wheel — because insurance companies know it, and they convey that data to us in the form of the premiums they charge. By a significant margin, the most dangerous drivers — that is, the most expensive to insure — are young people in their 20s. Want safer roads? We could start by raising the minimum driving age, or by requiring better driver training and stricter testing.
Software is written by humans who work for corporations. Delphi, Intel, Volvo, Google: it doesn't matter. Reality check: corporations are driven by the bottom line. I don't want to be sitting in a car knowing that the corporation behind the software hit some quarterly deadline by taking shortcuts to get something out the door, so that the board got its compensation and the shareholders got their dividends. This happens every day. That's the reality of driverless cars.
If my smart home thermostat has a bug in it, the house might be too cold or too hot, but it has zero potential to kill me. That is simply not true of a driverless car.