Embedding a moral code into autonomous cars
In an interesting lecture by Michael Sandel, "What's the Right Thing to Do?", he gives an example in which a driver must decide between going straight and killing five people or swerving and killing one. The majority of his class chose the second option.
The question is: do we want to embed that kind of moral code into an autonomous car? And would you want the car to follow your own moral code, or simply go with the majority view as the default option?