Dobrindt wants autonomous vehicles to have “clear guidelines for algorithms”


Image: Dilemma/CC-BY-SA-2.0

How should algorithms decide moral dilemmas? So far, the authorities seem indifferent to, or ignorant of, the risks of algorithms

German Transport Minister Alexander Dobrindt wants to quickly test autonomous vehicles in cities as well. The motto: "Everything that can be digitized will be digitized. Everything that can be networked will be networked."

In a strategy paper for the cabinet retreat in Meseberg, his ministry proposes establishing a commission to draw up clear guidelines on the "ethical questions in the paradigm shift from driver to autopilot."

In the background are the "Strategy for Automated and Connected Driving" and the political will to somehow put Germany, as a car country, at the forefront of the "great mobility revolution" proclaimed by Dobrindt, namely "automated and connected driving." It has already been established that in semi-automated driving "the system" must be "permanently monitored" by the driver, who must "be able to fully take over the driving task at any time." With autonomous driving, that is no longer the case. Here "the system" has to decide on its own how to react in certain situations.

So this is about programmed ethics: the commission of representatives from science, the car industry and the digital industry is supposed to develop guidelines for the algorithms that determine how cars react in risk situations. Die Welt, which first reported on the plan, wrote that it is about "clear guidelines for algorithms that determine vehicle reactions in risk situations." The aim is to avoid personal injury as far as possible whenever causing property damage would be an alternative. Or something like that; nothing more precise is known about the "algorithms."

It will also be interesting to see who will be responsible for the ethics encoded in algorithms. And how, ultimately, a program is supposed to decide when resolving a conflict involves a moral dilemma, i.e., when it is not a matter of personal injury versus property damage, but of which people, or how many, are endangered. How should "the system" decide whether to risk the life of a passerby or that of the passenger, assuming, of course, that "the system" acts selflessly? Can moral dilemmas that philosophy, legal systems and religions have negotiated and debated for millennia without finding the one right solution be solved with algorithms? And which morality should be implemented: the utilitarian, the teleological, the deontological, the consequentialist, the ethics of conviction or the ethics of responsibility? Or would it have to be a consensus ethics that changes dynamically?

One may be curious how the commission Dobrindt wants will approach the problem. A standard example is the trolley dilemma, a thought experiment posing a moral problem in which every decision, including the decision not to act, leads to people being killed. One cannot remain innocent; there is no right action within the wrong, only a split-second weighing. In the thought experiment, a freight car rolls toward five people who cannot get out of the way and will be run over. But there is a switch: if it is thrown, the five are saved, but another person on the alternative track must die.

How should "the system" decide? Is it morally better to kill only one person, when killing one person is weighed against saving the lives of five? In experiments with VR simulations, most people do in fact choose the utilitarian option of saving the greater number. Others, however, freeze and cannot make up their minds (Does morality depend on language?). So should the algorithms be built to be utilitarian? And what if it is one life against another? Which person would be more important?
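To make concrete what "building the algorithms utilitarian" could even mean, here is a minimal, purely hypothetical sketch in Python; the Maneuver type, its fields and the counting rule are illustrative assumptions, not anything taken from real vehicle software or from the ministry's plans:

```python
# Illustrative sketch of a naively "utilitarian" chooser: it picks the maneuver
# whose predicted outcome endangers the fewest people, breaking ties by lower
# property damage. All names and fields are hypothetical.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    persons_at_risk: int      # predicted number of people endangered
    property_damage: float    # predicted property damage, e.g. in euros

def choose_utilitarian(options: list[Maneuver]) -> Maneuver:
    """Prefer fewer endangered people; break ties by lower property damage."""
    return min(options, key=lambda m: (m.persons_at_risk, m.property_damage))

# The classic trolley setup, reduced to numbers:
stay_on_track = Maneuver("stay", persons_at_risk=5, property_damage=0.0)
throw_switch = Maneuver("switch", persons_at_risk=1, property_damage=0.0)

print(choose_utilitarian([stay_on_track, throw_switch]).name)  # -> "switch"
```

Even this toy version smuggles in the contested premises the article points to: that all endangered lives count equally and that harm can be collapsed into a single number to be minimized.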

And one does not even have to go to that dramatic a level to find dilemmas. It might be enough if an autonomous vehicle stops for one person to let them get in rather than for another person standing right next to them who also wants a ride. Do the milliseconds of who ordered first count, regardless of whether one of the two is a child, a sick person or someone who needs to get somewhere more urgently? Oliver Bendel has played out such situations for combat robots, but also for care and service robots. Sometimes, he suggests, robots will not know what to do but will have to decide or behave anyway. Which ways out programmed ethics finds here, only practice will show (Buridan's Robot).
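For this less dramatic boarding dilemma, a hypothetical tie-break rule shows how "who ordered first" could be encoded, and what a Buridan-style fallback might look like when no criterion decides; all names and fields here are made up for illustration:

```python
# Hypothetical sketch: two waiting passengers, and the vehicle must pick one.
# A "who ordered first" rule decides on milliseconds; if even that is a tie,
# the robot, like Buridan's ass, must still act somehow - here, at random.
import random
from dataclasses import dataclass

@dataclass
class Passenger:
    name: str
    request_time_ms: int   # timestamp of the ride request
    urgency: int           # hypothetical 0-10 score; who would assign it, and how?

def pick_passenger(a: Passenger, b: Passenger) -> Passenger:
    if a.request_time_ms != b.request_time_ms:
        return a if a.request_time_ms < b.request_time_ms else b
    # No distinguishing criterion left: decide anyway, arbitrarily.
    return random.choice([a, b])
```

The sketch deliberately never looks at the urgency field: whether a child, a sick person or someone in a hurry should be preferred is exactly the question that the milliseconds rule sidesteps.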

Cluelessness about the consequences even of assistance systems

Constanze Kurz wondered on Netzpolitik.org, however, whether all this is for now just empty talk. Among other things, she had inquired in vain at Dobrindt's ministry. No one there was able to "provide some basic information on the computer systems already installed on a massive scale in vehicles today and answer related questions about IT security requirements, or name surveys and facts about what impact today's electronic assistants actually have on accident frequencies."

Kurz was interested "in what ways accident rates have changed since we started sitting in moving computers. Which systems are being used today? Have electronic assistance systems measurably changed road safety? Which software malfunctions have emerged?" So far she has only been able to establish that state authorities and ministries are apparently not interested in this at all. It is possible that the question simply has not come up yet, but it plays a major role in the run-up to the mass introduction of driver assistance systems and, later, autonomous vehicles.

So far, the claim is that these systems help prevent accidents; but if it is neither known nor recorded whether driver assistance systems already in use malfunction and thereby contribute to accidents, we are walking on thin ice. The only information she received from police authorities was that the relevant statistical surveys have not been carried out, or that no information can be provided on regulations and provisions for testing the IT security of vehicle electronics and software.
