Human Rights Watch has called for an international ban on ‘killer robots’, autonomous machines that independently decide their targets, Press TV reports.
“They’re weapons where there is no human intervention. That is, the armed robot itself makes the decision about what its target should be and when it should pull the trigger,” Stephen Goose, the arms division director of Human Rights Watch, told Press TV.
“Killer robots are the shorthand name for fully autonomous weapons. These are something we think of as being beyond drones,” Goose said, adding that “the farther down the road this gets, the harder it’s going to be to stop it.”
Goose said an international ban, along with prohibitions in individual countries, must be put in place before these robots become the future of war.
“The more money that’s poured into it, the more time passes, the more they’re going to get integrated into future war plans and into the doctrine of various militaries. We think the only way to approach this is to nip it in the bud and to have a prohibition now,” Goose added.
Goose criticized the role of the United States military in the “secretive and classified” development of these deadly weapons.
The Pentagon has launched a contest, the Defense Advanced Research Projects Agency Robotics Challenge, to advance its efforts to develop robotic soldiers to fight the wars of the future. The contest tests the robots’ ability to operate in difficult situations designed for humans, ones that “simulate conditions in a dangerous, degraded, human-engineered environment.”
Goose added that such weapons would violate the proportionality test, which international humanitarian law requires in order to weigh the military advantage of an attack against possible civilian casualties.
“These kinds of weapons would make it more likely that a state would go to war…shifting the burden of conflict away from the military, those who are trained to fight, to civilians who will bear the brunt of any mistakes that these killer robots make. And they will inevitably make mistakes,” Goose concluded.