Robots have been a staple of science fiction stories for decades. They allow us to imagine what the future might be like, explore ethical issues, and create fantastic stories. However, most people would agree that robots lack something that is at the very core of humanity: a moral compass.
Humans are held responsible for their choices because of their ability to make informed decisions based on their rationality and experience. Robots, however, do what their programs command them to do—no questions asked.
This is a large part of why we should be deeply worried about new technology that gives us the capability to develop killer robots: fully autonomous weapons that can select and fire on their targets without any human intervention. Popular science fiction has once again anticipated an ethical dilemma that we must now face.
These killer robots are distinct from drones. While drones are an issue in their own right, they still require a human pilot to control the machine remotely. Killer robots take the human element entirely out of the equation.
Imagine the problems that would arise from the use of these killer robots in war. Who would be blamed when one inevitably fails to distinguish between a soldier and a civilian? The programmer, the manufacturer, or someone else? We must do our best to ensure that accountability is possible for deeds done in battle, when lives are on the line.
It seems obvious what we must do. We need to stop killer robots before it’s too late.