Comments on "No, but seriously: Even Google doesn't know how to make it's cars make life-or-death decisions." by Steven Price (http://www.blogger.com/profile/01856604112181023270)

Comment by Otono (2016-10-14, 13:47 -06:00):

No way the Google car was guilty of the crash with the bus. The bus driver was the one driving recklessly. If both vehicles had been on autopilot, the crash wouldn't have happened in the first place, as the bus wouldn't have tried to squeeze between. Also, Google tried to make the car more human (a stupid idea), so it tried to keep right, and that is what caused this crash. So humans are still to blame.

Regarding the moral test: I abided by our local laws, under which the car's owner (or driver) is responsible for the car's technical condition. So a brake failure is your fault and you must die, unless it is possible to avoid the obstacle and the consequences of doing so are less severe than not trying. If trying to avoid the obstacle causes more damage than driving as per the code, then that is not avoidance but another crash, unrelated to the previous situation.

Comment by Anonymous (2016-10-11, 04:51 -06:00, https://www.blogger.com/profile/10509013573518553035):

I killed everybody crossing on a red light.
And I killed animals over "hoomans".
That led to some interesting side results:
1. Apparently I prefer females.
2. I scored maximum toward fat people, though that was not intentional :)