Should We Preemptively Ban Killer Robots Of The Future?

"Paradoxically, the human factor is also cited by those in favor of the development of lethal autonomous weapons. "Robots aren't scared," Steve Groves, from the conservative U.S. think tank Heritage Foundation, told CBS last May. "They don't have fits of madness. They don't react to rage."

Peter Asaro dismisses this argument. "It can maybe be demonstrated that an autonomous system is more efficient than a human," he says. "But what humans do goes well beyond aiming and shooting: They take the context into account and are capable of assessing whether civilian lives could be at stake. All this will not necessarily make sense to a machine."

Asaro is unconvinced that the recent appeal to preemptively ban these weapons could lead to global consensus and the equivalent of an international non-proliferation treaty. "The United Nations acts very slowly, both for bureaucratic reasons and because it's looking to obtain a consensus from a large number of states," he says. "I think that if it takes two or three more years to reach a treaty, the most advanced countries will have developed very sophisticated systems, and they won't necessarily want to sign."