If it has to make a choice, will your autonomous car kill you or pedestrians on the street?
The looming arrival of self-driving vehicles is likely to vastly reduce traffic fatalities, but also poses difficult moral dilemmas, researchers said in a study Thursday.
Autonomous driving systems will require programmers to develop algorithms to make critical decisions that are based more on ethics than technology, according to the study published in the journal Science.
"Figuring out how to build ethical autonomous machines is one of the thorniest challenges in artificial intelligence today," said the study by Jean-Francois Bonnefon of the Toulouse School of Economics, Azim Shariff of the University of Oregon and Iyad Rahwan of the Massachusetts Institute of Technology.
"For the time being, there seems to be no easy way to design algorithms that would reconcile moral values and personal self-interest -- let alone account for different cultures with various moral attitudes regarding life-life tradeoffs -- but public opinion and social pressure may very well shift as this conversation progresses."
The researchers said adoption of autonomous vehicles offers many social benefits such as reducing air pollution and eliminating up to 90 percent of traffic accidents.
"Not all crashes will be avoided, though, and some crashes will require AVs to make difficult ethical decisions in cases that involve unavoidable harm," the researchers said in the study.
"For example, the AV may avoid harming several pedestrians by swerving and sacrificing a passerby, or the AV may be faced with the choice of sacrificing its own passenger to save one or more pedestrians."
Programming quandary
These dilemmas are "low-probability events" but programmers "must still include decision rules about what to do in such hypothetical situations," the study said.
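The study itself publishes no such algorithm, so purely as an illustration of what a "decision rule" could look like, here is a minimal hypothetical Python sketch. Every name and number in it (Maneuver, choose_maneuver, the passenger_weight parameter, the casualty figures) is invented for this example; it simply encodes the trade-off the researchers describe as a single weight on the passenger's life.

```python
# Hypothetical illustration only -- the study publishes no such algorithm.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    pedestrian_harm: int  # expected pedestrian casualties for this maneuver
    passenger_harm: int   # expected passenger casualties for this maneuver

def choose_maneuver(options, passenger_weight=1.0):
    """Pick the maneuver with the lowest weighted expected harm.

    passenger_weight == 1.0 values every life equally (the utilitarian
    rule most survey respondents endorsed in the abstract); a weight
    above 1.0 shields the car's own occupants (what most respondents
    wanted from the car they would actually buy).
    """
    return min(options,
               key=lambda m: m.pedestrian_harm + passenger_weight * m.passenger_harm)

if __name__ == "__main__":
    options = [
        Maneuver("stay on course", pedestrian_harm=10, passenger_harm=0),
        Maneuver("swerve into barrier", pedestrian_harm=0, passenger_harm=1),
    ]
    print(choose_maneuver(options, passenger_weight=1.0).name)   # swerve into barrier
    print(choose_maneuver(options, passenger_weight=20.0).name)  # stay on course
```

The sketch makes the article's point concrete: choosing the value of that one weight is an ethical judgment, not an engineering one, and nothing in the code can settle it.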
The researchers said they are keen to see self-driving technology adopted because of its major social benefits.
"A lot of people will protest that they love driving, but us having to drive our own cars is responsible for a tremendous amount of misery in the world," Shariff told a conference call.
The programming decisions must take into account mixed and sometimes conflicting public attitudes.
In a survey conducted by the researchers, 76 percent of participants said that it would be more ethical for self-driving cars to sacrifice one passenger rather than kill 10 pedestrians.
But just 23 percent said it would be preferable to sacrifice their passenger when only one pedestrian could be saved. And only 19 percent said they would buy a self-driving car if it meant a family member might be sacrificed for the greater good.
The responses show an apparent contradiction: "People want to live in a world in which everybody owns driverless cars that minimize casualties, but they want their own car to protect them at all costs," said Rahwan.
"But if everybody thinks this way then we end up in a world in which every car will look after its own passenger's safety or its own safety and society as a whole is washed off."
Regulate or not?
One solution, the researchers said, may be regulation that sets clear guidelines for when a vehicle must prioritize the life of a passenger or the lives of others, though it is not clear whether the public would accept this.
"If we try to use regulation to solve the public good problem of driverless car programming we would be discouraging people from buying those cars," Rahwan said.
"And that would delay the adoption of the new technology that would eliminate the majority of accidents."
In a commentary in Science, Joshua Greene of Harvard University's Center for Brain Science said the research shows the road ahead remains unclear.
"Life-and-death trade-offs are unpleasant, and no matter which ethical principles autonomous vehicles adopt, they will be open to compelling criticisms, giving manufacturers little incentive to publicize their operating principles," Greene wrote.
"The problem, it seems, is more philosophical than technical. Before we can put our values into machines, we have to figure out how to make our values clear and consistent. For 21st century moral philosophers, this may be where the rubber meets the road."