Although fully autonomous cars are still in development, potential users are already thinking about how an artificial brain will make life-and-death choices in the event of an accident.
Strangely, we seem to forget that in a collision, humans have only a split second to react (and rarely make a conscious choice at all). But assuming such decisions can be made, opinions are divided: our views on the morality of self-driving cars differ by gender, age, and professional background.
When asked about the morality of self-driving cars overall, half of the women in a recent survey said such life-and-death choices could be taught to future vehicles, while two-thirds of men believed they could not.
Interestingly, IT professionals are much more optimistic: 60% think that future autonomous cars will be able to make rational decisions in the event of a collision.
Protect passengers or pedestrians?
Half of the respondents considered the safety of the car's occupants paramount, while the other half preferred to protect the “greater good” (choosing as few victims as possible). Only a small minority (3%) liked the idea of the car choosing randomly.
Two-thirds of people nonetheless declare that they would sacrifice themselves in a crash, but very few of us will ever face such a situation, and even fewer will have any choice at all.
Self-driving cars are expected to have an emergency stop button, yet two out of three respondents agree that the manufacturer (not the user) should be held responsible in case of an accident, with little variation among age groups.
Until recently, letting an artificial brain make life-and-death choices sounded like science fiction, and the possibility scares a lot of people. But who knows? Autonomous cars may yet prove to be safer and better drivers than we are.
The infographic below presents additional statistics showing how opinions on self-driving cars and AI morality vary.