Self-driving cars don’t think -- they can only do what humans have programmed them to do. And they have weightier issues to deal with than just the rules of the road. For instance: if a pedestrian jumps out into the road, and the only way to avoid them would cause a dangerous crash, how should the car respond?

Does the car sacrifice its passengers to avoid a pedestrian? Does it swerve away from one person in the road and hit several people standing on the sidewalk? Does it go straight and hit a baby stroller, or avoid it and kill an elderly woman standing on the corner? Whose lives does it give up?

To find out what people around the world think self-driving cars should do, a group of MIT researchers designed an online survey called Moral Machine. More than 2 million people from over 200 countries responded, and the analysis was published this week in Nature.

“Never in the history of humanity have we allowed a machine to autonomously decide who should live and who should die, in a fraction of a second, without real-time supervision.”
-- The Moral Machine Experiment

It turns out the choices people make depend largely on where they’re from. As the MIT Technology Review notes, respondents in mainland China, Taiwan, and other societies with Confucian traditions were less likely to spare the young over the old. The researchers believe this may be because those cultures show greater respect for older members of the community.

Another interesting finding: respondents in China were more likely to choose killing pedestrians over passengers in the self-driving car, compared with people in Western countries, who were more willing to sacrifice passengers.

The researchers admit that the study doesn’t capture the full complexity of the dilemmas facing autonomous vehicles. It assumes, for instance, that machines can tell for certain whether a person is a child or an adult -- and that anyone hit by a car is sure to die.

But the authors say they hope the results will provide a reference point for car designers and policymakers, who should be informed by public opinion without necessarily following it. The study also suggests that countries may set different rules for autonomous vehicles -- and algorithms will have to be adjusted accordingly.