Self-Driving Cars Still Subject To Moral Dilemmas

Self-driving cars in Germany will not discriminate when it comes to deciding between two possible crash victims. Newly published guidelines say the vehicles must simply aim to minimize harm.

What happens in a potential crash is a longstanding ethical issue that has returned to prominence with autonomous vehicles. Among human drivers (and hypothetical train operators) there has always been debate over questions such as whether it’s acceptable to save yourself or your family even if that means killing or hurting more people; whether it’s better to kill an older person than a child; and whether there’s an ethical difference between actively changing course to cause harm and passively continuing on an existing course.

While human drivers make such decisions in the moment, usually as gut choices, self-driving cars may need to be programmed to make them in advance. Earlier this year MIT researchers launched an online experiment to see what decisions people wanted the cars to make in various scenarios. That follows earlier surveys which found that most people thought cars should minimize harm even if that meant killing the driver rather than multiple pedestrians, but also that most people wouldn’t buy a car programmed that way.

Now Germany’s government has received a report from an ethics committee laying down principles for programming such cars. It’s opted for a straightforward principle of minimizing harm to humans. That means the car should damage property, or hurt or kill animals, rather than cause any harm to a person.

If human injury or death is unavoidable, the car must aim to cause as little overall harm as possible, even if that means sacrificing the driver. The programming must pay no heed to a person’s age or any other physical characteristics, in effect treating all lives as equal. The government will now explore how to incorporate these principles into specific laws.
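
To make that principle concrete, here is a minimal sketch of what such a harm-minimization rule could look like in code. It assumes a hypothetical planner that enumerates candidate maneuvers and estimates their outcomes; the names and fields are illustrative, not anything specified in the committee’s report.

    # Minimal sketch of the committee's harm-minimization principle (Python).
    # Assumes a hypothetical upstream planner that produces candidate maneuvers
    # with estimated outcomes; nothing here is a real automotive API.
    from dataclasses import dataclass

    @dataclass
    class Outcome:
        maneuver: str
        humans_harmed: int      # expected human casualties, all weighted equally
        animals_harmed: int     # animal harm counts only after human harm
        property_damage: float  # e.g. estimated repair cost

    def choose_maneuver(outcomes):
        # Per the guidelines: minimize human harm first, with no weighting by
        # age or other characteristics and no special protection for the
        # occupants; only then consider animals, and property last.
        return min(
            outcomes,
            key=lambda o: (o.humans_harmed, o.animals_harmed, o.property_damage),
        )

    options = [
        Outcome("swerve into barrier", humans_harmed=1, animals_harmed=0, property_damage=20000.0),
        Outcome("continue straight", humans_harmed=3, animals_harmed=0, property_damage=0.0),
    ]
    print(choose_maneuver(options).maneuver)  # -> "swerve into barrier"

Note that the driver is just another human in this tally: the rule sacrifices the single occupant to spare the three pedestrians, exactly the trade-off the report requires.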

The report also made a wider argument: that if it’s proven that automated cars cause fewer crashes than human drivers, society becomes ethically obliged to promote the use of the technology.

11 Responses to Self-Driving Cars Still Subject To Moral Dilemmas

    • So, if a car swerves into your path and is going to hit you, your car should plough through 20 pedestrians on the pavement/sidewalk to save you if necessary?

      That’s completely immoral.

      • Maybe (I say maybe because I’m no pro) they could design cars that only move in designated car areas (roads and parking lots), using some kind of sensors to make it physically impossible for them to enter sidewalks and walking/bike areas. Just like some robots will only “walk” on appropriately painted tracks.

        An additional random thought: would this kind of system reduce terrorism?

    • Apparently you have never studied moral dilemmas.

      From another point of view, the people in the car chose to enter it, knowing they were climbing into a fast-moving metal box that could ultimately be the demise of them or someone else. The pedestrians, on the other hand, had no part in that choice. So why should they suffer for it?

      Also, a pedestrian run over by a car at, say, 40 mph has a very high risk of dying. An occupant of that same car, running into a barrier at the same speed, has a much higher chance of surviving if s/he is wearing a seatbelt, and even more so with airbags. So again, how should it prioritize? The pedestrians are given no option of protection.

      So yes, there is a very big moral dilemma. If you cannot see it, that is itself cause for concern.

  1. Improve the brakes, and add a slow-down protocol when nearing potential crosswalks. But I guess this is just one example of many situations. (A rough sketch of such a protocol follows this thread.)

    I’ve seen videos on YouTube of trucks stopping in short distances from high speeds.

    • This, I agree with. You can also see in advertisements for new human-driven cars that they have automatic slow-down and braking systems meant to avoid hurting even small animals crossing the road, or at least that’s how they’re depicted. Hopefully it’s also safe for the drivers behind such cars, since a sudden stop can cause a chain reaction. But if slowing down beforehand is part of the system, there should be no concern.
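
    Purely as an illustration of this thread’s suggestion, here is a rough sketch of such a gradual slow-down protocol. The speed limit, deceleration rate, and function names are all made-up assumptions, not values from any real system.

        # Rough sketch of a gradual slow-down protocol near crosswalks (Python).
        # All thresholds are illustrative assumptions.
        CROSSWALK_SPEED_LIMIT = 8.0  # m/s (~18 mph) when passing a crosswalk
        COMFORT_DECEL = 2.0          # m/s^2: gentle, to avoid rear-end chain reactions

        def target_speed(current_speed, distance_to_crosswalk):
            """Speed (m/s) to aim for, given distance (m) to the next detected crosswalk."""
            if distance_to_crosswalk is None:
                return current_speed  # no crosswalk detected ahead
            # Highest speed from which the car can still ease down to the
            # crosswalk limit over the remaining distance: v^2 = v_lim^2 + 2*a*d
            reachable = (CROSSWALK_SPEED_LIMIT**2
                         + 2 * COMFORT_DECEL * distance_to_crosswalk) ** 0.5
            return min(current_speed, reachable)

        print(target_speed(20.0, 50.0))  # ~16.2 m/s: start easing off 50 m out

    Because the cap tightens smoothly as the crosswalk approaches, the car sheds speed early instead of braking suddenly, which addresses the chain-reaction worry above.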

  2. Sorry for spamming with so many separate comments, but I just needed to say one more thing linked to the article, about being ethically obligated to promote the use of technology. I think we already are, and I don’t see it as a problem. We already promote devices and machines powered by clean electricity over those running on pollutants and non-renewable energy sources. We’re also trying to get robots to do exploration and rescue jobs in places too dangerous for humans. In cases where it’s more moral to use technology there’s no problem, only improvement.

  3. The only moral thing to do is first kill the four people on the crosswalk, and then back up and drive into the wall, killing the four people in the car. That’s the only fair choice that can be made.
