The Center for Internet and Society at Stanford Law School is a leader in the study of the law and policy around the Internet and other emerging technologies.
The use of robots inevitably changes the equation for how police apply "use of force," a term that is broadly defined by the International Association of Chiefs of Police as the "amount of effort required by police to compel compliance by an unwilling subject."
The robot used by the Dallas police department to kill Micah Johnson — the sniper who fired into a peaceful protest and killed five police officers, injuring others — was originally designed to defuse explosives. The police attached a pound of the explosive C4 to the robot, creating a makeshift weapon out of a design that was not intended to inflict harm on people. The robot was also remote-controlled, not autonomous.
Last week the Future of Life Institute released a letter signed by some 1,500 artificial intelligence (AI), robotics and technology researchers. Among them were celebrities of science and the technology industry—Stephen Hawking, Elon Musk and Steve Wozniak—along with public intellectuals such as Noam Chomsky and Daniel Dennett. The letter called for an international ban on offensive autonomous weapons, which could target and fire weapons without meaningful human control.
Patrick Lin made interesting observations on the ethical notion of human dignity in the context of LAWS. Even if LAWS could act in accordance with IHL, the taking of human life by machines would violate a right to dignity that may be even more fundamental than the right to life.
Download the attached PDF to read Patrick Lin's full testimony.
Paradoxically, the human factor is also cited by those in favor of the development of lethal autonomous weapons. "Robots aren't scared," Steve Groves, from the conservative U.S. think tank Heritage Foundation, told CBS last May. "They don't have fits of madness. They don't react to rage."
The Campaign to Stop Killer Robots is an international coalition of 59 groups, including Human Rights Watch and the Nobel Women’s Committee.
Spokesman Peter Asaro, an affiliate scholar at the Center for Internet and Society at Stanford Law School, said an international treaty to ban the weapons was urgent.
He told The New Daily killer robots would make it difficult to hold anyone accountable for war crimes and atrocities.
Among the signatories was Peter Asaro, an assistant professor at The New School for Public Engagement in New York and co-founder of the International Committee for Robot Arms Control.
The committee believes that making decisions “about the application of violent force must not be delegated to machines.”
Peter Asaro, a professor at The New School in New York, noted that without a human in control, machines fail to take in the unpredictable variables and context of war: “what’s the context, what’s the situation, is the use of force appropriate in this context and this target, and you can automate it but can you automate it well, and who’s responsible when it doesn’t operate correctly?”
No longer confined to science fiction, the use of autonomous robots in a military context has become a very real possibility in recent years. International Innovation caught up with Dr Peter Asaro, co-founder of the International Committee for Robot Arms Control, at the UN’s Convention on Certain Conventional Weapons (CCW), where steps were taken towards a treaty on the use of lethal autonomous weapons.
“These robotic weapons would be able to choose and fire on targets on their own, without human intervention,” says Dr Peter Asaro of the Campaign to Stop Killer Robots (stopkillerrobots.org). “Giving machines the power to decide who lives and dies on the battlefield is an unacceptable application of technology.”
When he was in fourth grade, Peter Asaro ’94 got the assignment of making a Valentine’s Day mailbox. But unlike most 9-year-olds, he didn’t dig around his mother’s closet for a shoebox. Instead, he cannibalized circuit boards, a clock-radio speaker and a remote-controlled tank to build a robotic mailbox that visited his classmates’ desks.
The story is a harbinger of what was to become a defining interest in Asaro’s life: how technology and people interact and influence one another.
“Sometimes, you can’t separate the technology from its use, and this can make a technology unethical,” he told io9. “For instance, nukes are inherently indiscriminate and inhumane, and there’s no morally defensible use of them. It’s not clear that this is the case with killer robots, but it’s possible—I think there needs to be more investigation.”
From a moral perspective, Lin says he’s sympathetic to the ban on killer robots. But like Ackerman, he says it’s hard to imagine how that can happen.
“One reacts to the other because it senses a threat and then the other reacts to that reaction because it senses a threat and then they start calling reinforcements and it could escalate very quickly, and no humans are actually involved in the decision to initiate or escalate force,” said Peter Asaro, an affiliate scholar at Stanford and professor of media studies at The New School.
Hours after gunman Micah Johnson ambushed police officers in downtown Dallas, he was killed by a bomb strapped on a police robot. Robots in the past have stopped a lot of dangerous situations, but using a robot to kill - that was a first for a domestic police force. Kris Van Cleave reports on the ethical questions about the use of robots to kill suspects.
Affiliate Scholar Peter Asaro is interviewed.
Robotic warfare is no longer the realm of science fiction. From drones to the development of lethal, autonomous robotic weaponry, we take a look at the ethics of the future of killer robots and where this will ultimately lead mankind.
Are you worried about killer robots? Last week, some of the most prominent thinkers in science and technology signed an open letter that warned of the coming arms race should militaries pursue the development and deployment of artificially intelligent weaponry. The letter was written by The Campaign to Stop Killer Robots, an international coalition of NGOs, and was signed by almost 14,000 people, including Stephen Hawking, Elon Musk, and Steve Wozniak.
It may be old news here in Hollywood -- but the world’s scientists are now warning us about killer robots. Hundreds of scientists, including Stephen Hawking and Elon Musk, wrote an open letter released this week, in which they warned of a global A.I. arms race.
CIS Affiliate Scholar David Levine interviews Peter Asaro of the School of Media Studies at The New School, on killer robots.
Killer robots — or lethal autonomous weapons systems — could be the future but should they have a mind of their own to decide who lives or dies?
It is a complex topic being debated at an expert convention in Geneva.
Beverley O'Connor speaks to Dr Peter Asaro, who is the co-founder and vice-chair of the International Committee for Robot Arms Control.