Monday, February 15, 2016

Seems humans are intervening in drone car testing far more than we knew.

In my one-man crusade to keep people from being blinded by hype on drone cars, here's my first post for 2016.
You're likely aware by now that any company testing drone cars in California needs a permit, and has to report many things as part of the permit process. One of those things is how many times the driver is forced to take control of the car. Well - the first round of filings is in, and if you're holding your breath for drone cars to be the hot new thing, you might want to start breathing again.
Overall, seven companies (Bosch, Delphi, Google, Nissan, Mercedes, Tesla and VW) filed 'disengagement' reports in the fifteen-month period between September 2014 and November 2015. A disengagement is defined as either the self-driving software failing and needing a human to take control, or the test driver feeling compelled to take control to avoid a dangerous situation.
If you read the report (linked below) you'll see there were 2,894 disengagements logged. Far and away the most common cause was "perception discrepancy", meaning the car failed to see something, or failed to react correctly to an external stimulus. The least common cause was "incorrect behaviour prediction of other traffic participants". That last one flies in the face of Google's (in particular) oft-touted excuse that other people are always to blame when its cars have accidents and disengagements: if other road users were really the problem, you'd expect that category to top the list, not sit at the bottom of it.
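For the curious, here's a minimal sketch (in Python) of how you'd tally the filings by cause if you transcribed them into simple (company, cause) records. The records below are illustrative placeholders, not the actual report data.

    from collections import Counter

    # Hypothetical transcription of a few filing rows as (company, cause).
    # The cause strings mirror the DMV categories quoted above; the rows
    # themselves are placeholders, not the real filings.
    records = [
        ("Google", "perception discrepancy"),
        ("Bosch", "perception discrepancy"),
        ("Delphi", "perception discrepancy"),
        ("Nissan", "incorrect behaviour prediction of other traffic participants"),
    ]

    # Tally disengagements per cause, most common first.
    by_cause = Counter(cause for _, cause in records)
    for cause, count in by_cause.most_common():
        print(f"{count:5d}  {cause}")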
The reports are not a glowing endorsement of drone cars: for the public to accept them as an alternative to human drivers, the error rate will have to be ridiculously low. And 12 potential crashes per 50,000 miles is far, far, far from perfect.
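To put that number in perspective, here's a quick back-of-the-envelope sketch. The 12-per-50,000-miles figure is from above; the roughly 10,000 miles a typical driver covers in a year is my own assumption, not from the report.

    # Back-of-the-envelope: how often would "12 potential crashes per
    # 50,000 miles" bite a typical driver? The 10,000 miles/year figure
    # is an assumption for illustration, not from the DMV report.
    potential_crashes = 12
    test_miles = 50_000
    annual_miles = 10_000

    miles_per_incident = test_miles / potential_crashes
    years_per_incident = miles_per_incident / annual_miles

    print(f"One potential crash every {miles_per_incident:,.0f} miles")
    print(f"That's one every {years_per_incident:.2f} years of typical driving")

Under those assumptions, that works out to one potential crash roughly every five months of ordinary driving, which is nowhere near good enough.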
DMV Autonomous car disengagement report