Monday, April 13, 2015

The moral dilemma facing the makers of drone cars.

Self-driving cars, drone cars, call them what you will. Long-time readers will know I view this as a bleak, joyless future. However, there's a serious question to be asked, and that is the moral dilemma facing the people programming the software. At some point, your self-driving car is going to be faced with a simple decision: does it kill you, or does it kill the pedestrian at the side of the road?
Think about it - someone jumps a red light at an intersection as you're approaching. It's obvious to the onboard systems that even with full braking, you're going to hit the vehicle sitting in the intersection, and hit it hard enough that the airbags and restraints might not save you. The alternative is to steer and brake at the same time, but the sidewalks are full of pedestrians waiting for the next green light, and the crosswalks on the side streets are full of pedestrians already crossing. So at this point, what's the right decision? To kill you and save three or four pedestrians? To kill the pedestrians and save you? Or to brake as hard as possible and hit the vehicle square in the middle, in the hope that you and the occupants of the other vehicle will survive the impact?
Would you want the job of programming that logic? More to the point, would you accept that in a very particular set of circumstances, the car that you're sitting in could be programmed to sacrifice your life to save others?

4 comments:

Paul said...

I don't see a dilemma at all, Chris. I presume there will be something in the programming that says: if there are any pedestrians, do not steer towards them. The logic in your scenario will be:
Maximum braking.
Can I steer left - No
Can I steer right - No
Steer straight.
You're also saying that the car will have some sort of awareness, which it doesn't (yet?). There is no "who do I save?" There is only code that it will follow (most of the time).
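
Something like the following sketch captures that fixed-priority logic. It's purely illustrative: every name here is invented, and no real vehicle software is implied.

```python
# A minimal sketch of the fixed-priority logic Paul describes.
# All names are hypothetical; no real autonomous-vehicle stack is implied.

def choose_maneuver(left_clear: bool, right_clear: bool) -> tuple:
    """Brake at maximum regardless; steer only into a space that is
    free of both traffic and pedestrians, otherwise stay straight."""
    braking = "maximum"
    if left_clear:                 # "Can I steer left?"
        steering = "left"
    elif right_clear:              # "Can I steer right?"
        steering = "right"
    else:
        steering = "straight"      # no safe escape: take the impact square-on
    return braking, steering

# Chris's scenario: pedestrians on both sides, so neither direction is clear.
print(choose_maneuver(left_clear=False, right_clear=False))
# -> ('maximum', 'straight')
```

The point of the sketch is that there is no weighing of lives anywhere in it: the pedestrian check simply removes options before steering is ever considered.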

Anonymous said...

When all cars are "drones", this situation won't happen, because a drone won't run a red light. (Will red lights even be necessary? Maybe just for pedestrian crossings.) The cars will exchange information with each other and with the infrastructure, so they'll know about each other's "intentions".
In the situation you described, it's possible that the self-driving car will already know from the infrastructure that a collision might happen, even before its own sensors detect it, so it will react earlier.
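
As a toy illustration of that idea (the message format, field names, and the 4-second threshold below are all invented for this sketch, not taken from any real V2X standard):

```python
# Toy sketch of the commenter's idea: the intersection broadcasts each
# approaching vehicle's declared "intention", and a car can begin braking
# on that warning alone, before its own sensors see the red-light runner.
# The message format and threshold are assumptions made for illustration.

from dataclasses import dataclass

@dataclass
class IntersectionBroadcast:
    vehicle_id: str
    will_stop: bool             # the other vehicle's declared intention
    seconds_to_conflict: float  # estimated time until paths cross

def react(msg: IntersectionBroadcast) -> str:
    if not msg.will_stop and msg.seconds_to_conflict < 4.0:
        return "brake early"    # act on infrastructure data, ahead of onboard sensing
    return "proceed"

print(react(IntersectionBroadcast("car-42", will_stop=False, seconds_to_conflict=2.5)))
# -> brake early
```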

During the transition period, however, when drones and non-drones are on the streets together, I hope the first priority will be self-preservation, just as the human instinct is "programmed".

Anyway, we can't pre-program every possible situation. Unexpected things will always happen.

Unknown said...

If I were driving a non-automated car, then the decision to swerve into pedestrians wouldn't be an option; it's my decision to drive, so I (and the driver in front) unfortunately have to live with the consequences. Swerving into the pedestrians would, in my eyes, constitute murder or manslaughter - not something the car manufacturers would want to write into their software's logic.

Thomas said...

Thanks for sharing.