In my long-running rant against the future of drone cars, here's another snippet of anecdotal evidence that we're heading towards a bleak future. Google - in theory the current head of the pack when it comes to drone-car technology - admitted this week that even they don't know how their cars will make life-or-death decisions.
Remember my posts about having to sign a contract when you purchase a car, indicating that you're OK with the car killing you if need be? Yeah - THAT life-or-death decision.
Google also announced earlier in the year that yes, in fact, their drone car WAS responsible for the bus crash it was involved in. So we have one company that's admitted its cars aren't the infallible nirvana everyone was promised, and that it doesn't know how to endow a car with the ability to make a life-or-death decision.
And we have Tesla, who have a couple of open lawsuits on their hands, one of which looks increasingly like it was indeed their car at fault for the death of its driver.
Sure, Tesla also have stories about how their so-called "autopilot" helped get a guy to hospital, but one of those stories doesn't cancel out the fact that one of their cars killed its driver.
In the meantime, you can join the debate, because MIT have made a 'moral' game where you get to choose who gets killed in a self-driving car accident - including yourself. If you fancy seeing what the future looks like when you hand over the responsibility of driving to a two-ton robot, head over to MIT's Moral Machine and start plowing down crowds of elderly women and male athletes... My results showed an extreme intolerance of rich people and people crossing on a red light :)
Me - I'm still not looking forward to this future. Too many companies are trying to ram something down our throats because they can, rather than sitting back to decide whether they should. These situations never end well.