A call for a moratorium on the development and use of lethal autonomous robots

Lethal autonomous robots (LARs), once they have been activated, can operate without further human intervention. Unlike drones, which are operated by humans, these new weapons systems, not yet deployed, can make decisions about targets on their own.

In this chilling report from the Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns, the use of LARs is seen as adding an extra dimension to the age-old legal, moral and religious dilemmas confronted when one human being considers killing another.

Heyns argues that these robots have “the potential to pose new threats to the right to life…[to] create serious international division and weaken the role and rule of international law – and in the process undermine the international security system.”

The Special Rapporteur says his report supports “a call for pause, to allow serious and meaningful international engagement with this issue.” Once LARs have been deployed, he warns, it will be too late to try to formulate an appropriate response. “As with any technology that revolutionizes the use of lethal force, little may be known about the potential risks of the technology before it is developed, which makes formulating an appropriate response difficult; but afterwards the availability of its systems and power of vested interests may preclude efforts at appropriate control,” he says.

According to Heyns, military commanders may want to use robots because of the greater physical and psychological distance from those being killed. The “unique human traits such as our aversion to getting killed, losing loved ones, or having to kill other people” no longer serve as constraints when LARs are engaged to do the fighting, he says.

The Special Rapporteur says armed conflict may eventually no longer be a “measure of last resort”. The general public may feel so removed from combat that “the decision to use force [becomes] a largely financial or diplomatic question for the State, leading to the ‘normalization’ of armed conflict.”
 
Drawing attention to the global agreement to ban landmines, the Special Rapporteur poses the question: is it inherently wrong to let autonomous machines decide whom to kill, and when? “This is an overriding consideration: if the answer is negative, no other consideration can justify the deployment of LARs, no matter the level of technical competence at which they operate,” he says.

“This approach stems from the belief that a human being somewhere has to take the decision to initiate lethal force and as a result internalize (or assume responsibility for) the cost of each life lost in hostilities, as part of a deliberative process of human interaction,” Heyns says. “Machines lack morality and mortality, and many people believe they should as a result not have life and death powers over humans. This is among the reasons landmines were banned.”

Heyns recalls the experience of the two World Wars: “The United Nations was set up by people who had fought a terrible war. Nearly 70 years have passed without a global war. The commitment to achieve such an objective can be understood as a consequence of the long-term and indeed inter-generational effects of insisting on human responsibility for killing decisions.”

In conclusion, the Special Rapporteur says it is up to those who favour the development and use of LARs “to demonstrate that specific uses should in particular circumstances be permitted. Given the far-reaching implications for protection of life, considerable proof will be required.”

Heyns is calling on the Human Rights Council to press all States for moratoria on the testing, production, assembly, transfer, acquisition, deployment and use of LARs until a governing framework is agreed. The Special Rapporteur also wants a High Level Panel, made up of experts from fields including law, computer science, robotics, military operations and ethics, to be convened as a priority to review the issue and provide a basis for further international dialogue and the development of an international framework.

Heyns concludes: “In essence I am not proposing an ultimate solution. I am calling for an inclusive, open process – that is, for us as a global community, as humans – to come to a responsible decision on how we want to deal with this new technology.”

31 May 2013


Video: “Robots decide whom to kill” - UN Special Rapporteur on summary executions
