Are Autonomous Cars Ready to Decide Who Dies in Accidents?


【Summary】The question of who dies in an automotive accident when autonomous cars are driving hasn’t been answered yet and is a lot more complicated than the majority of drivers can imagine.

Original   Vineeth Joel Patel  ·  Dec 03, 2017 9:15 AM PT

Some automakers and technology companies believe that autonomous cars are right around the corner. Others take a more conservative view, claiming that driverless cars are still a couple of decades away. Either way, a lot of money has gone into autonomous vehicles, and more is being poured into the field every day. 

Companies may be foaming at the mouth in the race to be the first to sell a fully autonomous vehicle to the masses, but the majority of drivers still aren't sold on the machines. And it's easy to see why, as trusting a driverless vehicle is proving a lot harder than most companies anticipated. There are still a lot of kinks to work out with self-driving cars, and one of them revolves around who will die in a dicey situation. 

Every day, humans face difficult decisions when driving a car. And on some occasions, they have to make dire choices in a matter of milliseconds. Do I swerve to avoid a head-on collision and run into pedestrians? Or hold my course into the head-on collision, sparing the pedestrians but possibly killing myself and the other driver? While these situations may be rare, they do occur, and autonomous vehicles will have to face the same predicaments. 

Crashes Are Unavoidable

As USA Today reports, the answer to the question of "who dies when the car is forced into a no-win situation?" is still out in the open. "There will be crashes," said Van Lindberg, an attorney in San Antonio who specializes in issues surrounding autonomous cars. "Unusual things will happen. Trees will fall. Animals, kids will dart out." Needless to say, "anyone who gets the short end of that stick is going to be pretty unhappy about it."

Automakers and tech companies may claim that driverless cars are safer than human-operated ones and are quick to point out all of the benefits to an autonomous future, but none have really tackled the thinking behind giving driverless cars the ability to choose who dies in an accident. 

While Congress has finally gotten involved in planning for an autonomous future, the most recent legislation offers no clear guidance on how driverless cars should make ethical decisions. As USA Today reports, the guidance set forth by the U.S. Department of Transportation for automakers includes a footnote that calls ethical considerations "important" and acknowledges that "no consensus around acceptable ethical decision-making" has been reached.


What Companies Think About Accidents In An Autonomous Future

That, obviously, won't help drivers fully embrace a driverless future anytime soon. As the outlet points out, a Daimler executive claimed that self-driving cars would prioritize the lives of passengers inside the vehicle over those of people outside of it. Daimler then backtracked, claiming that it would be illegal "to make a decision in favor of one person and against another," reports USA Today.

Some may think that the Daimler executive's thoughts on the matter are singular, but others have shared similar beliefs. According to the outlet, Sebastian Thrun, who is credited with starting Google's autonomous program, told Bloomberg that driverless cars would "go for the smaller thing" if put in a position where an accident couldn't be avoided. Autonomous cars, though, Thrun claimed, would be designed to avoid accidents. 

Unfortunately, there are numerous "small things" in today's society. What if that "smaller thing" was a child? This, as Azim Shariff, an assistant professor of psychology and social behavior at the University of California, Irvine, found, leads to mixed feelings among humans. USA Today claims that Shariff conducted a study which found that the majority of respondents agreed that autonomous vehicles should kill the fewest number of people possible in an accident, regardless of whether those people were passengers or others outside of the vehicle. Respondents, though, said they were less likely to purchase any vehicle "in which they and their family members would be sacrificed for the greater good," reports the outlet.  
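The utilitarian criterion the survey respondents endorsed for other people's cars, minimizing total deaths regardless of who is inside the vehicle, can be sketched as a toy decision rule. This is purely a hypothetical illustration: the function name, scenario, and numbers are invented for this sketch, and no real vehicle uses logic this simple.

```python
# Hypothetical illustration of the "kill the fewest people" criterion
# described in the survey. Everything here is invented for the sketch.

def choose_action(outcomes):
    """Pick the action whose predicted outcome harms the fewest people,
    counting passengers and bystanders equally (the utilitarian rule)."""
    return min(
        outcomes,
        key=lambda action: outcomes[action]["passengers"]
        + outcomes[action]["bystanders"],
    )

# Two stylized options from a no-win scenario:
scenario = {
    "swerve":      {"passengers": 1, "bystanders": 0},  # 1 death total
    "hold_course": {"passengers": 0, "bystanders": 3},  # 3 deaths total
}

print(choose_action(scenario))  # the purely utilitarian rule picks "swerve"
```

The sketch also shows exactly why respondents balked: the rule that minimizes total harm is the same rule that sacrifices the car's own occupant in the "swerve" branch, which is the vehicle buyers said they would not purchase.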

"These ethical problems are not just theoretical," said Patrick Lin, director of the Ethics and Emerging Sciences Group at California Polytechnic State University. The problems, as one could guess, will lead the majority of people to reject driverless cars, despite knowing that they would be safer in the end. 

How Do Companies Solve The Problem?

What's stopping companies from tackling the dilemma? According to Lin, automakers "simply deny that ethics is a real problem, without realizing that they're making ethical judgment calls all the time," reports USA Today.

The question of who dies in an accident isn't the only question automakers and tech companies have yet to answer. In fact, as the outlet points out, there are various ethical questions that remain open. And when it comes to other no-win hypothetical situations, which USA Today says are referred to as "the trolley problem," companies are simply downplaying the risks. 

Automakers believe that hypothetical situations, like the one above, won't be an issue with autonomous cars. "I don't remember when I took my driver's license test that this was one of the questions," said Manuela Papadopol, director of business development and communications for Elektrobit. "The cars will be smart – I don't think there's a problem there. There are just solutions." 

What's The Solution?

Automakers and tech companies will eventually have to admit that ethical dilemmas will be a part of an autonomous future. But people, understandably, don't want to wait until then. Some experts and analysts, as USA Today points out, believe that regulators and legislators can solve the issue. But putting a nationwide set of rules for autonomous vehicles in place would be nearly impossible, and enforcing them even more far-fetched. 

Waymo, which recently became the first company to begin ferrying humans around in autonomous vehicles, believes it has an answer to the problem, and it boils down to teaching vehicles to mimic the driving patterns of a human. While that might work, it still wouldn't produce a perfect outcome. 

"As human beings, we have hundreds of thousands of years of moral, ethical, religious and social behaviors programmed inside of us," said Chief Product Officer of the Society of Automotive Engineers, Frank Menchaca. "It's very hard to replicate that." 

via: USA Today
