Self-driving cars in Germany will not discriminate when it comes to deciding between two possible crash victims. Newly published guidelines say the vehicles must simply aim to minimize harm.
What a vehicle should do in an unavoidable crash is a longstanding ethical question that has returned to prominence with autonomous vehicles. Among human drivers (and hypothetical train operators) there has always been debate over questions such as whether it is acceptable to save yourself or your family even if that means killing or injuring more people; whether it is better to kill an older person than a child; and whether there is an ethical difference between actively changing course to cause harm and passively continuing on an existing course.
While human drivers usually make such decisions in the moment as gut choices, self-driving cars may need to be programmed to make them in advance. Earlier this year MIT researchers launched an online experiment to see which decisions people wanted the cars to make in various scenarios. That follows previous surveys which found that most people thought cars should minimize harm even if that meant killing the driver rather than multiple pedestrians, but also that most people wouldn't buy a car programmed that way.
Now Germany's government has received a report from an ethics committee laying down principles for programming such cars. It opts for a straightforward principle of minimizing harm to humans: damaging property and hurting or killing animals is always preferable to causing any harm to a person.
If human injury or death is inevitable, the car must aim to cause as little overall harm as possible, even if that means sacrificing the driver. The programming must pay no heed to a person's age or any other physical characteristics, in effect treating all lives as equal. The government will now explore how to incorporate these principles into specific laws.
The report also made a wider argument: that if it’s proven that automated cars cause fewer crashes than human drivers, society becomes ethically obliged to promote the use of the technology.