Monday, October 10, 2016

Even Google doesn't know how to make its cars make life-or-death decisions.

In my long-running rant against the future of drone-cars, here's another snippet of anecdotal evidence that we're heading towards a bleak future. Google - in theory the current head of the pack when it comes to drone-car technology - admitted this week that even they don't know how their cars will make life-or-death decisions.
Remember my posts where I talk about having to sign a contract when you purchase a car, indicating that you're OK with the car killing you if need be? Yeah - THAT life-or-death decision.
Google also announced earlier in the year that yes, in fact, their drone car WAS responsible for the bus crash it was involved in. So we have one company that's admitted their cars aren't the infallible nirvana everyone was promised, and that they don't know how to endow the car with the ability to make a life-or-death decision.
And we have Tesla, who have a couple of open lawsuits on their hands, one of which looks increasingly like it was indeed their car at fault for the death of its driver.
Sure, Tesla also have stories about how their so-called "autopilot" helped get a guy to hospital, but one of those stories doesn't cancel out the fact that one of their cars killed its driver.

In the meantime, you can join the debate, because MIT have made a 'moral' game where you get to choose who gets killed in a self-driving car accident, including yourself. If you fancy seeing what the future looks like when you hand over the responsibility of driving to a two-ton robot, head over to MIT's Moral Machine and start plowing down crowds of elderly women and male athletes... My results showed an extreme intolerance of rich people and people crossing on a red light :)

Me - I'm still not looking forward to this future. Too many companies are trying to ram something down our throats because they can, rather than sitting back to decide whether they should. These situations never end well.
Google admits it doesn't know how its cars will make life-or-death decisions.

2 comments:

Unknown said...

I killed everybody crossing on a red light.
And I killed animals over "hoomans".
That led to some interesting side results:
1. Apparently, when it comes to gender, I prefer females.
2. I scored maximum toward fat people, though that was not intentional :)

Oto said...

No way was the Google car guilty for the crash with the bus. The bus driver was the one driving recklessly. If both had been autopilot-driven, the crash wouldn't have happened in the first place, as the bus wouldn't have tried to squeeze in between. Also, Google tried to make the car more human (a stupid idea), so it tried to keep right, and that caused this crash. So humans are still to blame.

Regarding the moral test, I abided by our local laws: the car owner (or driver) is responsible for the car's technical condition, so a brake failure is your fault and you must die, except if it is possible to avoid the obstacle and the consequences are less severe than not trying to avoid it. If trying to avoid causes more damage than driving as per the code, then that is not avoidance but another crash unrelated to the previous situation.