Do Killer Robots Already Exist?

This article contributes to a special symposium on science fiction and international law, examining the blurry lines between science and fiction in the policy discussions concerning the military use of lethal autonomous robots.
Not because they might develop the worst human characteristics, but because they’re nothing like humans at all. If we now reflect that human beings consist of useful resources (such as conveniently located atoms) and that we depend for our survival and flourishing on many more local resources, we can see that the outcome could easily be one in which humanity quickly becomes extinct.
“Firstly, there's a real loss of control over the battlefield for commanders. Secondly, if a mistake is made there's a real problem with accountability.”
Killer Robots: Keeping Control of Autonomous Weapons
The company says it will continue to work with its military clients, but has “vouched to not manufacture weaponized robots that remove humans from the loop” as it “has chosen to value our ethics over potential future revenue.” Stop Killer Robots Canada has welcomed the statement by Clearpath Robotics, which it said “has set the ethical standard for robotics companies around the world.”
"It's no longer a flight of fantasy, it's something that people should start taking seriously. But at the same time, it's not too late. Had we waited until we started seeing these enter battlefields worldwide, we think that it would be much harder to get this sort of dialogue going and realistically have a chance of stopping this technology from proliferating."
Self-driving cars sound great until you start to think about it. The question of whether robots will take over jobs has created a split in the scientific community: forty-eight percent of respondents say robots will destroy more jobs than they create, while fifty-two percent hold the opposing view.
Every time the robot kills civilians, we add to its guilt, like deposits into a bank account. As time passes, that guilt decays and reduces in value (especially if the robot kills legitimate targets). Now here's the governor bit: whenever guilt rises above a threshold (say, 100), the robot formulaically becomes less willing to shoot.
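The guilt-account mechanism described above can be sketched in a few lines of code. This is a minimal illustrative model, not any deployed system: the class name, the specific weights, the decay rate, and the threshold of 100 are all assumptions chosen for demonstration.

```python
# Hypothetical sketch of the guilt-based governor described above.
# All names and numeric parameters are illustrative assumptions.

GUILT_THRESHOLD = 100.0   # above this level, willingness to shoot drops
DECAY_RATE = 0.95         # fraction of guilt remaining after each time step
CIVILIAN_GUILT = 40.0     # guilt "deposited" per civilian casualty
COMBATANT_RELIEF = 10.0   # guilt reduced when a legitimate target is engaged


class GuiltGovernor:
    """Tracks accumulated guilt and scales willingness to fire."""

    def __init__(self):
        self.guilt = 0.0

    def record_civilian_casualty(self):
        # Each civilian casualty adds to the guilt account.
        self.guilt += CIVILIAN_GUILT

    def record_combatant_engagement(self):
        # Engaging a legitimate target draws the account down.
        self.guilt = max(0.0, self.guilt - COMBATANT_RELIEF)

    def step(self):
        # Guilt decays in value as time passes.
        self.guilt *= DECAY_RATE

    def willingness_to_shoot(self):
        # 1.0 = fully willing; falls formulaically once guilt
        # exceeds the threshold.
        if self.guilt <= GUILT_THRESHOLD:
            return 1.0
        return GUILT_THRESHOLD / self.guilt
```

For example, after three civilian casualties the account holds 120 units of guilt, so willingness drops from 1.0 to 100/120, roughly 0.83; subsequent decay steps then slowly restore it.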