As reported by Engadget: Germany is working on implementing a handful of new rules for autonomous cars that address ethical questions that come with the technology. In June, the ethics commission of the Federal Ministry of Transport and Digital Infrastructure -- made up of 14 scientists and legal experts -- released a report with guidelines it believed self-driving vehicles should be designed to follow. This week, the ministry said it would implement and enforce those guidelines.
One of the proposed rules says that human life should always have priority over property or animal life. Another stipulates that a surveillance system, like a black box, should record driving activity so that it can be determined later who was at fault in an accident -- the driver or the technology. Additionally, drivers should get to decide what personal information is collected from their vehicle, so that data can't be used to customize advertising, for example.
Another guideline takes on the classic ethics thought experiment known as the "trolley problem." One version asks what a trolley driver should do when heading toward five people who will surely die if hit. The driver can divert the trolley to another track where only one person would die. Should they actively choose to kill the one person instead of the five, or not intervene and let the trolley continue on its original path? What if they had information on the moral character of those individuals -- should that change anything? You can test yourself with various versions of this dilemma through MIT's Moral Machine.
This question has come up before with self-driving car makers. In 2015, the then-head of Google's self-driving car project said that Google's cars won't have the ability to decide who is the morally better person to lose in an unavoidable collision. Instead, the company works to protect the most vulnerable party, like a pedestrian over another vehicle. And in 2016, a Mercedes-Benz executive said that if given the choice between saving the person in the car or, say, a pedestrian outside of it, the car should protect its occupant, because the occupant is the only person whose survival it can reasonably ensure. But Germany's ministry says that in a situation where an accident can't be avoided, autonomous cars can't decide whom to save; all human lives matter equally.
In a statement, Germany's transport minister, Alexander Dobrindt, said, "The interaction between man and machine raises new ethical questions during this time of digitization and self-learning systems. The ethics commission has done pioneering work and has developed the world's first guidelines for automated driving. We are now implementing these guidelines."