
Understanding Sensitive Moral Issues Surrounding Self-Driving Cars


【Summary】In a scenario wherein both groups are on the verge of losing their lives due to mechanical failure (for example, the brakes suddenly stop working) or bad weather (causing wet, slippery roads), which one will the car’s algorithm choose?

Original Michael Cheng    Oct 19, 2016 5:45 PM PT

By Michael Cheng

Self-driving cars are forced to make a plethora of tough, controversial decisions on the open road. In addition to making sure that passengers get to their destination in one piece (without emotional or mental trauma), they must also ensure the safety of nearby pedestrians. Such choices are easy to make when only one party's life is on the line. 

But in a scenario wherein both groups are on the verge of losing their lives due to mechanical failure (for example, the brakes suddenly stop working) or bad weather (causing wet, slippery roads), which one will the car's algorithm choose? 

"Experts say that 90 percent of accidents are avoidable by technology that basically will eliminate human error," said Iyad Rahwan, a professor at MIT's Media Lab. "The other 10 percent are caused by less controllable things, like maybe bad weather conditions, or mechanical failures, or just kind of random freak accidents, that not even a very sophisticated computer can avoid."

The Driver or the Crowd Scenario

This predicament, known as "the driver or the crowd," has puzzled engineers tasked with fine-tuning the social behavior of self-driving vehicles. The scenario starts innocently, with the car taking its passengers to a predetermined destination. Approaching an intersection, the vehicle drives over a slippery patch of road. It applies the brakes, but the tires, coated with a slippery film, fail to grip. Numerous pedestrians are crossing the street ahead, and the vehicle's on-board sensors indicate that it cannot stop in time. 
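The "knows it can't stop in time" judgment comes down to simple physics: minimum braking distance grows as friction drops. The sketch below illustrates the idea with the textbook formula d = v² / (2·μ·g); the speed, gap, and friction coefficients are illustrative assumptions, not values from the article.

```python
def stopping_distance(speed_mps: float, mu: float, g: float = 9.81) -> float:
    """Minimum braking distance on a surface with friction coefficient mu:
    d = v^2 / (2 * mu * g)."""
    return speed_mps ** 2 / (2 * mu * g)

# Illustrative numbers: ~50 km/h approach, 30 m from the crosswalk.
speed = 13.9   # m/s
gap = 30.0     # m

dry = stopping_distance(speed, mu=0.7)   # dry asphalt
wet = stopping_distance(speed, mu=0.1)   # icy or contaminated surface

print(f"dry: {dry:.1f} m -> can stop: {dry <= gap}")
print(f"wet: {wet:.1f} m -> can stop: {wet <= gap}")
```

With these assumed numbers, the car stops comfortably on dry pavement (~14 m) but needs nearly 100 m on the slick surface, which is why the sensors can conclude a collision is unavoidable well before impact.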

Now it must make a decision: should it plow through the pedestrians, saving the passengers in the car, or swerve into a ditch and kill the passengers (who happen to be a pregnant woman and her five-year-old son), ultimately saving the pedestrians? Even human drivers, forced to make such last-minute, nerve-racking decisions, would struggle to assess the situation. 
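Stripped to its bones, the dilemma is a choice between outcomes with different casualty counts. The toy sketch below shows what a purely utilitarian chooser would do; the action names and counts are illustrative assumptions, and no production system reduces the problem to a tally like this.

```python
# Toy utilitarian chooser. Action names and casualty counts are
# illustrative assumptions, not from any real vehicle's software.
ACTIONS = {
    "continue": {"pedestrians": 5, "passengers": 0},  # plow through the crowd
    "swerve":   {"pedestrians": 0, "passengers": 2},  # into the ditch
}

def utilitarian_choice(actions: dict) -> str:
    # Pick the action with the fewest total expected casualties.
    return min(actions, key=lambda a: sum(actions[a].values()))

print(utilitarian_choice(ACTIONS))  # "swerve" under this counting
```

The Moral Machine results discussed below show why a pure casualty count is unsatisfying: people endorse the utilitarian answer in the abstract but reject it when they imagine themselves as the passenger.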

Moral Machine

To understand how people would make choices in such situations, MIT researchers launched the Moral Machine, an online survey that depicts nail-biting scenarios involving self-driving algorithms, like the one above. In the questionnaire, you make the decision for the car. The survey is extremely grueling and uncomfortable to take: in almost every scenario, one person or a group of people (from pregnant women to senior citizens) will be on the losing end of your decision, resulting in a loss of life. 

Surprisingly, most people (76 percent of 182 responses), according to a study conducted by MIT, chose to save pedestrians over passengers. The idea that people are willing to sacrifice one passenger to save five or eight strangers sounds heroic, because it is. Not satisfied with their initial findings, the scientists dug deeper and asked respondents whether they would choose a self-driving car that prioritizes the lives of pedestrians over passengers. In a baffling twist, most said no, suggesting that they would prefer a self-driving vehicle that saves the lives of its own passengers. 

"Just as we work through the technical challenges, we need to work through the psychological barriers," explained Rahwan.   
