I've blogged about this before, but here's a thought experiment for you (given that Tesla is now involved in two high-profile cases where its Autopilot crashed the car and it's saying it's not to blame):
It's the future, and you're in your self-driving car, minding your own business. You're on the motorway, commuting to work. A minivan full of kids on the school run is to your right; a motorcyclist is to your left. Traffic is moving at a steady pace. The truck in front of you has a badly secured load, and as it hits a pothole, a large sheet of steel is dislodged and slides off the back, catching the wind and flipping up. At this point a crash is inevitable. At the very least, someone is going to be seriously injured; at worst, they could be killed. Your self-driving car has logic built into it for a situation like this, so who is it going to choose to kill?
Does your car kill the motorcyclist to avoid the steel sheet and save your life? Does it protect the motorcyclist (because they are the most vulnerable road user in direct proximity), leaving the option of killing you or the van full of kids? If that's the case, whose life is more valuable? You on your own, or the children next to you? Can your car make that life-or-death decision in a split second, taking into account all the variables and inputs from all its sensors? What if the self-driving minivan is also trying to make the same decisions at the same time?
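To see why this is so uncomfortable, imagine the crudest possible version of that built-in logic: a chooser that just minimises the expected casualty count. Everything below is invented for illustration — the function, the manoeuvre names, and the numbers are hypothetical, not how any real vehicle works.

```python
def choose_manoeuvre(options):
    """Pick the manoeuvre with the lowest estimated casualty count.

    `options` maps a manoeuvre name to the estimated number of people
    seriously harmed if the car takes it (hypothetical values).
    """
    return min(options, key=options.get)


# The scenario from the post: swerve left into the motorcyclist,
# swerve right into the minivan full of kids, or brake straight
# and take the steel sheet yourself.
scenario = {
    "swerve_left": 1,     # the motorcyclist
    "swerve_right": 6,    # the minivan and its passengers
    "brake_straight": 1,  # you
}
```

Notice the problem: a pure casualty count can rule out the minivan, but it cannot break the tie between the motorcyclist and you — which is exactly the question the thought experiment is asking, and exactly the part no simple formula can answer.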
Think about it. When you buy a self-driving car, there will be a clause in the contract you sign indicating that you're OK with the idea that the car you've just bought might one day choose to sacrifice you for the common good.
Now how do you feel about self-driving cars?