One of the primary ethical concerns raised by the prospect of deploying autonomous military killer robots is the question of moral responsibility for their use. Robert Sparrow argues that military robots equipped with the ability to learn would be so independent and self-contained that they would allow human actors to deny responsibility for their use, creating what Andreas Matthias has called a “responsibility gap.” A responsibility gap arises in situations where no one is responsible for the actions of autonomous learning robots, because people are unable to fully control or predict the behavior of these technologies. In this article, I argue that this conclusion is incorrect, because autonomous technologies do not absolve people of responsibility for the consequences of using them. Control and predictability are not necessary preconditions for attributing responsibility. Given the risks that the use of such weapons presents, those who create or deploy these weapons are morally responsible for the weapons’ actions. Even though deploying lethal autonomous weapons might not be a good idea, the “responsibility gap” does not by itself make them immoral.