Killer robots programmed to open fire without human control are just a “small step” from the battlefield and military powers should agree to outlaw them, a top United Nations official has said.
Angela Kane, the UN’s high representative for disarmament, said governments should be more open about programmes to develop the technology and she favoured a pre-emptive ban before it was too late.
She said: “Any weapon of war is terrible, and if you can launch this without human intervention, I think it’s even worse. It compounds the problem and dehumanises it in a way.
“It becomes a faceless war and I think that’s really terrible and so to my mind I think it should be outlawed. The decision is really in the hands of the states who have the capability to develop them.”
Ms Kane said there was “a great deal of concern” about the prospect of killer robots being developed that would commit war crimes on the battlefield.
There had also been a great deal of reluctance from major military powers to discuss the issue.
Ms Kane said: “There’s a great deal of concern about the increasing automation that’s going on in general. Just think about these self-driving cars that we hear are being tested on the roads. So that is only just a small step to develop weapons that are going to be activated without human intervention. Warfare in general is becoming increasingly automated.”
She said: “The concern relates specifically to weapons that have the capability of selecting and also attacking targets without human intervention. Who has the responsibility and who has the liability? This is a really big issue in terms of how we are going about this.”
The United Nations held its first meeting on the threat from “lethal autonomous weapons” earlier this year. Another summit is planned for the autumn.
She said many developing countries were worried such weapons may be used on their territory, just as remotely piloted drones have been used in recent years.
"I personally believe that there cannot be a weapon that can be fired without human intervention. I do not believe that there should be a weapon, ever, that is not guided and where there is not the accountability clearly established by whoever takes that step to guide it or to launch it. I do not believe that we could have weapons that could be controlled by robots."
Militaries are making increasing use of robots, from bomb disposal machines to armed remotely piloted drones carrying out missile strikes. Israel has experimented with border control robots and America has trialled them in Iraq and Afghanistan. But an expert on military robots said the weapons on all current machines are controlled and directed by humans, who make the final decision whether to fire.
Huw Williams, Unmanned Systems Editor at IHS Jane’s International Defence Review, said he knew of no programmes to make killer robots and even the most advanced machines had little ability to act on their own.
He said: “Autonomy at the moment is quite limited. You set the task parameters, give it waypoints and tell it to go and do x, y and z. Then the machine decides: ‘How do I get from x to y to z in the most efficient manner and get things done?’ But in terms of real thinking, real autonomy, then no.”
While it is theoretically possible to build a robot that fires automatically when its sensors detect a target, he said removing human control would have significant disadvantages for commanders.
He said: “Firstly there’s a real loss of control over the battlefield for commanders. Secondly, if a mistake is made there’s a real problem with accountability.”