Will Self-Driving Cars Be Able to Make Ethical Decisions?

Fri, 9/29/2017 - 10:01 pm by Kirsten Rincon

The claim that self-driving cars will be safer than conventional vehicles is no longer seriously disputed: automakers have demonstrated that by taking the driver out of the equation, autonomous driving technology can drastically reduce the risk of accidents. But one concern still hangs over the prospect of cars moving along our roads completely independently. How will they react when forced to make tough choices among road hazards, choices that could mean deciding between protecting their owners and protecting other road users?

This ability to make ethical decisions is the focus of a report recently published by researchers at the University of Alabama at Birmingham, who raised the question of how driverless cars will handle the life-and-death scenarios they may encounter on the road. In the report, they present a dangerous scenario in which a trolley carrying five passengers is headed directly toward a self-driving car, putting the autonomous vehicle in a situation where it must choose between two options, each likely to prove deadly. One option is to swerve to avoid a collision with the trolley, in which case the car hits and kills a pedestrian on the spot. The other is to continue along its path and crash into the trolley, killing the five passengers on board.

In effect, the car has to decide whether one person dies or five do. The question is whether it will choose the lesser of two evils, swerving and killing only the pedestrian rather than colliding with the trolley and causing more fatalities. Engineers working on autonomous driving technology will have to answer this question when they decide how to program driverless cars to make such difficult decisions.

“Ultimately, this problem devolves into a choice between utilitarianism and deontology,” said UAB alumnus Ameen Barghi, one of the researchers who compiled this report. “Rule utilitarianism says that we must always pick the most utilitarian action regardless of the circumstances — so this would make the choice easy for each version of the trolley problem,” Barghi said: Count up the individuals involved and go with the option that benefits the majority.
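To see why rule utilitarianism makes the choice "easy," note that it reduces to a simple count. The minimal Python sketch below illustrates that counting rule; the Outcome class, the choose_action function, and the fatality figures are hypothetical illustrations, not code from the UAB report or from any real vehicle:

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    """One available action and the number of lives it is expected to cost.

    Hypothetical illustration only; no real system reduces to this."""
    action: str
    expected_fatalities: int

def choose_action(outcomes: list[Outcome]) -> Outcome:
    # Rule utilitarianism as Barghi describes it: count the individuals
    # involved and pick the option that benefits the majority, i.e. the
    # one with the fewest expected fatalities.
    return min(outcomes, key=lambda o: o.expected_fatalities)

# The report's trolley scenario: swerve (one pedestrian dies) or stay
# on course (the five trolley passengers die).
scenario = [
    Outcome("swerve into the pedestrian", expected_fatalities=1),
    Outcome("continue into the trolley", expected_fatalities=5),
]
print(choose_action(scenario).action)  # prints "swerve into the pedestrian"
```

A deontological rule, by contrast, judges the action itself rather than the tally of outcomes, which is why the choice between the two frameworks that Barghi describes cannot be settled by a count alone.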

The researchers say that answering this and similar questions about the morals of driverless cars is a matter of philosophy as much as futurism. The entire auto industry will have to come up with a solution acceptable to all automakers, so that every autonomous vehicle reacts the same way when faced with an unavoidable accident in which each choice leads to a fatal outcome for one or more of the people involved.