This is not a purely theoretical question; in practice, autonomous vehicles face exactly this dilemma. Or rather, the manufacturers of the vehicles face it, since they are the ones who have to set the specifications.
I forget where it was from, but years ago I found an online survey from a university on autonomous cars and their decision making. It was all about deciding whether or not to swerve in a collision, with all kinds of difficult encounters: do you hit the barrier and kill the passenger, or swerve and kill the old lady? Do you hit the thin person, or swerve and hit the heavier person?
I’ve never seen a survey drill down into biases quite so deeply.
I did this as part of our ethics discussion.
My eventual answer was that you always kill the non-driver, since no one would ever buy a car that would kill them over someone else.
Easy. Prioritize who is saved based on social credit score.