Monday, April 13, 2015

The moral dilemma facing the makers of drone cars.

Self-driving cars, drone cars, call them what you will. Long-time readers will know I view this as a bleak, joyless future. However, there's a serious question to be asked, and that is the moral dilemma facing the people programming the software. At some point, your self-driving car is going to be faced with a simple decision: does it kill you, or does it kill the pedestrian at the side of the road?
Think about it: someone jumps a red light at an intersection as you're approaching. It's obvious to the onboard systems that even with full braking you're going to hit the vehicle sitting in the intersection, and hit it hard enough that the airbags and restraints might not save you. The alternative is to steer and brake at the same time, but the sidewalks are full of pedestrians waiting for the next green light, and the crosswalks on the side streets are full of people already crossing. So at this point, what's the right decision? To kill you and save three or four pedestrians? To kill the pedestrians and save you? Or to brake as hard as possible and hit the other vehicle square in the middle, in the hope that you and its occupants will survive the impact?
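To make the dilemma concrete, here's a purely hypothetical sketch of the kind of decision logic someone would have to write. The option names, risk numbers, and especially the occupant_weight parameter are all invented for illustration; no manufacturer has published anything like this, and the whole point is that somebody has to pick that weight.

```python
# Hypothetical illustration only: the options, probabilities, and
# weights below are invented. The uncomfortable part is that someone
# has to choose the value of occupant_weight.

from dataclasses import dataclass


@dataclass
class Option:
    name: str
    occupant_fatality_risk: float    # estimated probability the occupant dies
    pedestrian_fatality_risk: float  # estimated probability per pedestrian
    pedestrians_at_risk: int


def expected_harm(option: Option, occupant_weight: float = 1.0) -> float:
    """Expected fatalities, with the occupant's life scaled by occupant_weight.

    occupant_weight = 1.0 treats the occupant and a pedestrian equally;
    any other value encodes exactly the moral choice being discussed.
    """
    return (occupant_weight * option.occupant_fatality_risk
            + option.pedestrian_fatality_risk * option.pedestrians_at_risk)


def choose(options: list[Option], occupant_weight: float = 1.0) -> Option:
    # Pick whichever manoeuvre minimises the weighted expected harm.
    return min(options, key=lambda o: expected_harm(o, occupant_weight))


if __name__ == "__main__":
    scenario = [
        Option("brake straight into the red-light runner", 0.4, 0.0, 0),
        Option("swerve toward the crowded sidewalk", 0.05, 0.3, 4),
    ]
    print(choose(scenario).name)
```

With the invented numbers above, an equal weighting sacrifices the occupant's interests to protect the crowd; nudge occupant_weight upward and the car ploughs into the intersection instead. That single parameter is the whole moral question hiding inside an otherwise mundane function.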
Would you want the job of programming that logic? More to the point, would you accept that in a very particular set of circumstances, the car that you're sitting in could be programmed to sacrifice your life to save others?