Hey car, will you save me or someone else?
New road rules in Germany rev the ethical engine of driverless cars
Germany has become the first country in the world to green-light regulations around the use of driverless cars. The moral dimension of autonomous vehicles is firmly in the sights of these groundbreaking road rules, particularly how such vehicles respond in situations harmful to humans.
Human lives have been prioritised over protecting animals or property.
Germany might have lost out at the 2018 World Cup, but it is winning when it comes to pioneering the ethical minefield of driverless cars. According to Reuters, “under new ethical guidelines – drawn up by a government-appointed committee comprising experts in ethics, law and technology – the software that controls such cars must be programmed to avoid injury or death of people at all cost.”
Human lives have been prioritised over protecting animals or property. If a driverless car faces a potential accident involving more than one person, its software cannot base its decision on the age, gender or physical condition of those who will be affected.
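To see what such a rule might mean in practice, here is a minimal sketch in Python. Everything in it – the names, the data structure, the priority values – is invented for illustration; it is not drawn from any real autonomous-vehicle software. It simply encodes the two ideas reported above: human life outranks animals and property, and personal attributes are deliberately excluded from the decision.

```python
# Hypothetical illustration of the German guidelines' priority rules.
# All names and structures here are invented for this sketch.

from dataclasses import dataclass

# Harm categories ranked per the guidelines: harming property is
# preferred to harming an animal, which is preferred to harming a human.
PRIORITY = {"human": 2, "animal": 1, "property": 0}

@dataclass(frozen=True)
class Outcome:
    """One possible result of an unavoidable collision."""
    harmed: str   # "human", "animal", or "property"
    count: int    # how many in that category are harmed
    # Deliberately NO fields for age, gender, or physical condition:
    # the guidelines forbid weighing those attributes.

def choose(outcomes: list[Outcome]) -> Outcome:
    """Pick the outcome harming the lowest-priority category,
    breaking ties by harming fewer of that category."""
    return min(outcomes, key=lambda o: (PRIORITY[o.harmed], o.count))

best = choose([Outcome("human", 1), Outcome("animal", 2), Outcome("property", 5)])
print(best.harmed)  # property
```

Note what the sketch cannot do: when every available outcome harms a person, the rule set reported here offers no tie-breaker beyond numbers, which is precisely where the ethical debate begins.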
During the past few years, momentum has been growing globally around driverless cars, including the NSW government trialling them on Sydney’s roads until October this year. As science fiction speedily becomes science fact, a recurring pothole is the moral framework of any artificially intelligent vehicle.
Like the famous “Trolley Problem” thought experiment, first posed in the 1960s, the conundrum of choice and responsibility is acute with driverless cars. When it comes to the inevitable situation of a potential car accident involving humans, what will be the basis for determining the driverless car’s course of action?
In a Science magazine study released in 2016, respondents liked the idea of driverless cars and also wanted them to serve the greater good via their programming.
However, the prospect of sacrificing their own life to save others was less appealing. Put another way: the idea of owning a driverless car that prioritised everyone else above the owner wasn’t met with ringing endorsement.
Working out whether you want your car to injure you or someone else is just one of the thorny, real dilemmas to be addressed within the rapid development of on-road parameters for driverless cars. Another is whether humans will want to hand over moral responsibility to their vehicle. Germany has already legislated that a human must always be behind the wheel, ready to take back control if prompted by the autonomous car, but such laws might not stop people blaming their car for accidents or deaths.
Think about the possible ramifications of the technology we have created being held accountable for its ethical decision making. Could humans in the driver’s seat be free from any responsibility?
A sociologist at George Washington University, Amitai Etzioni, believes we would only “teach the car ethics … if you assume that the purpose of A.I. is to replace people.” As he told The New York Times, Etzioni does not support such a replacement; instead, “it should rather be a partnership between the human and the tool, and the person should be the one who provides ethical guidance.”
So, what will we look to when determining the ethics of driverless cars? Do we forge new guidelines from existing ethical frameworks, try to conceive of a totally new set of rules, or follow the sort of overarching principle of conduct found in places such as Colossians 3:17? While that verse in the New Testament refers specifically to doing everything “in the name of the Lord Jesus, giving thanks to God the Father through him”, note the holistic nature of that underlying ethical principle for Christians.
All things are to be done in the light, leadership and pattern of living set out by Jesus. All things. Including, then, when we are behind the wheel of a driverless car.