If a self-driving car is about to crash into a crowd, and it can only swerve to hit a baby stroller or a large man, whom should it decide to kill?
 
Questions like that were the focus of a massive global survey by researchers who say the results could help program autonomous vehicles to behave "ethically" in split-second, life-or-death scenarios.
 
Their paper, titled "The Moral Machine Experiment," was published in the journal Nature Oct. 24. The researchers' internet platform, called the Moral Machine, asked participants to consider practical dilemmas autonomous vehicles might encounter in daily operation. Participants were asked their moral preferences about how an autonomous vehicle should respond in crisis situations, such as whether to swerve to hit a crowd of people or stay on course to hit two people.
 
Commonalities among respondents included a preference for humans over animals, for the young over the old, and for preserving the lives of many people rather than a few people. Respondents also showed a moderate preference for sparing law-abiding bystanders over jaywalkers.
 
Among the authors of the paper was Iyad Rahwan, an associate professor of media arts and sciences at the MIT Media Lab.
 
"On the one hand, we wanted to provide a simple way for the public to engage in an important societal discussion," Rahwan said, according to the MIT News Office. "On the other hand, we wanted to collect data to identify which factors people think are important for autonomous cars to use in resolving ethical tradeoffs."
 
The paper's abstract says these preferences can contribute to developing "global, socially acceptable principles for machine ethics."
 
For Stephen Barr, a Catholic scientist who is a physics and astronomy professor at the University of Delaware, the survey results have some merit.
 
"Favoring humans over animals is obviously right," he told CNA. He compared distinctions among humans envisioned in the scenario to "triage situations" in hospitals.
 
"Some of the rankings one finds in this survey make sense and are perfectly traditional. Others are a bit strange, such as disfavoring large people," he said.
 
While not necessarily endorsing the preferences, Barr said, "some rules must be given to the machines that reflect human moral judgments, and they must be the kinds of rules that a machine can apply."
 
"The rules people come up with for the machines to follow will not be perfect, but they may be better than the way a human driver in a state of panic would respond," he said.
 
The researchers' Moral Machine platform collected 40 million decisions, in ten languages, from millions of people in 233 countries and territories. More than 490,000 respondents provided demographic data, which the researchers split into subgroups by age, education, sex, income, and political and religious views.
 
The six characters respondents most strongly preferred to spare, in descending order, were a baby in a stroller, a girl, a boy, a pregnant woman, a male doctor and a female doctor.
 
Respondents ranked criminals below dogs, which in turn ranked below an old woman, an old man, a homeless person, a large man, a large woman and a male business executive.
 
Researchers documented individual variations based on respondents' demographics, as well as cross-cultural ethical variation. Their analysis found three major clusters of differences, which the researchers said "correlate with modern institutions and deep cultural traits."
 
The researchers categorized these clusters as "western," "eastern" and "southern."
 
Those in "southern" countries showed a stronger tendency to save young people over the elderly, while those in the "eastern" cluster, which included many Asian countries, showed a weaker tendency to save children instead of the elderly compared to the other two groups.
 
Catholic ethics generally hold there to be a natural law standard of ethics that transcends cultural groups.
 
Barr said that widespread moral preferences "can be in error," but the natural moral law, by definition, is not.
 
"But natural law is often reflected in very widespread and transculturally held moral ideas," he said. "The preferences for saving the child over the adult, the pregnant woman over other adults, the human being over the animal, and several other preferences seen in this survey are quite consistent with traditional Christian values and with the natural law."
 
Other preferences seemed arbitrary to Barr, such as the ranking for the large man.
 
"Since questions of the common good are clearly involved, ultimately public authorities must be involved in setting the rules," he said. "In one way or another, public opinion will play a large role, especially, but not only, in democratic societies."
 
If responsible people perceive the rules to be unreasonable, he said, "they will not be tolerated for long."
 
Because public opinion can be influenced by various authorities, the Church "should do what it can to enlighten people on the principles that should apply," said Barr.