To ban or not to ban. Could a ban work? Should one even be proposed?
Robotics Business Review spoke to some of the world’s leading experts on LARs (lethal autonomous robots).
The genie is out of the bottle
The next generation of military robots, so-called "lethal autonomous robots" (LAR), are being designed with autonomy in mind, potentially enabling machines to independently identify targets and make the decision to strike.
The U.S. Department of Defense (DoD) document 'Unmanned Systems Integrated Roadmap FY2011-2036', for example, commits the DoD to the development of "technologies and policies that introduce a higher degree of autonomy."

Autonomous robots can reduce the military's manpower burden, the "decision loop cycle time," and its reliance on full-time, high-speed communication links, the roadmap enthuses.
The U.K. Ministry of Defence is also talking up the potential benefits of providing military robots with autonomous capabilities. Elsewhere, Russia and China are believed to be developing autonomous, weaponized robot systems.
Military robotics is big business, too. Defense market analyst the Teal Group predicts that the global market for weaponized drones will grow from $6.6 billion to $11.4 billion over the next decade.
Take the human out of the loop, however, and you are left with just a loop: a loop of encoded robotic autonomy.
It’s when that autonomy could potentially result in fatal consequences for civilians, as is the case with LAR systems, that serious legal and ethical concerns about the legitimacy of their use arise, according to a growing chorus of voices opposed to the development of weaponized robots.
On October 2nd, for example, Matthew Bolton, an assistant professor of political science at Pace University, New York, called for an international treaty governing the production, trade, transfer and use of all robotic weaponry, including autonomous systems.
And on October 8th, CNN national security analyst Peter Bergen called for an international convention on the legal frameworks governing drone warfare technology.

Their voices join that of the International Committee for Robot Arms Control (ICRAC), which first called for a ban on the "development, acquisition, deployment, and use" of LAR in 2010.
But what legal mechanisms could be used to establish and enforce a ban? And what are some of the key ethical and legal arguments for and against such a move?
Robotics Business Review put these questions to the experts:
Noel Sharkey, professor of Artificial Intelligence and Robotics at the University of Sheffield:
The use of LAR will breach at least three cornerstones of the international laws of war (the principles of distinction, proportionality, and accountability), says Noel Sharkey, professor of Artificial Intelligence and Robotics at the University of Sheffield and a founding member of ICRAC.

Described in Article 48 of the 1977 Additional Protocol I to the Geneva Conventions, the principle of distinction requires all parties to a military conflict to distinguish between the civilian population and combatants.
"No robot or AI system can distinguish between a combatant and a non-combatant. So, without that discrimination, how can a lethal, autonomous robot be ethical and comply with the laws of war?" asks Sharkey.
Meanwhile, the principle of proportionality prohibits attacks that may be expected to kill or injure civilians to a degree disproportionate to the military advantage gained by the strike.
Even if LAR had perfect discrimination, Sharkey argues, robots lack the "common sense reasoning and battlefield awareness" to make decisions that are both militarily effective and proportionate under international law.
"Soldiers don't shoot every time they meet an insurgent," says Sharkey. "Sometimes, it is more beneficial to their overall goals to let him pass."
This is the type of decision an experienced commander should make, says Sharkey. Robots lack the intelligence.
And since a robot does not have moral agency, it cannot be held accountable for its actions, making it difficult to decide who is responsible in cases of suspected breaches of the rules of war.
Further concerns include lowering the threshold for war and the impossibility of predicting how opposing LARs, each running its own unique set of top-secret algorithms, might interact on the battlefield.
"We need to move towards prohibition with urgency before development gets too advanced. It is already well underway," says Sharkey.
Kenneth Anderson, professor of law at American University Washington College of Law:
Kenneth Anderson, professor of law at American University Washington College of Law and a member of the Hoover Task Force on National Security and Law at Stanford, disagrees.
"I don't support ICRAC's call for a ban at all. And I don't see a role for the United Nations, or anyway a useful one. There are not enough shared interests or views on these questions for the long run," says Anderson.
Much of the argument over LAR centers on mistaken and "gigantic objections, such as 'AI will not deliver' and 'machines are not legally accountable'."
"While these arguments are academically interesting and important, in a sense they miss the boat, because they tend to assume a model of 'One day, not autonomous, the next day, autonomous.'
"It's going to be, and it already is, a very sliding technological process in which it will be very hard to establish the exact point at which there might be objectionable autonomy," says Anderson.
The main driver for drone autonomy is pressure from other systems, particularly the need to match the speed of your adversaries. This drives the development of drones that are "genuinely autonomous" in flight, which in turn creates a case for making the drone's weapons systems autonomous too.
"It's going to be hard not to eventually make the weapons autonomous, because they have to be integrated into the rest of the system at the speeds it operates," says Anderson.
Suits originating in other jurisdictions are always possible, "but won't be enforced in the U.S.," says Anderson. And if NATO allies, for example, were to start filing suits against the United States over LAR technology, tensions would grow within NATO over whether common weapons systems would be used across the organization.
A forthcoming paper for Policy Review, co-authored by Anderson and Columbia Law School professor Matthew Waxman, urges an incremental approach to regulation through the gradual development of internal state norms and best practices, and calls on the U.S. to develop its own set of principles for the regulation and governance of LAR.
"It is better," an online draft of the paper states, "that the United States work to set global standards by actively explaining its compliance with it than let other states or groups set it, whether those who would impose unrealistic, ineffective or dangerous prohibitions or those who would prefer few or no constraints at all."
Sarah Knuckey, an international human rights lawyer and director at the Center for Human Rights and Global Justice at NYU School of Law:
Now is the time to have a serious public debate about the need for domestic and international legal regulation of the development and deployment of LAR, says Sarah Knuckey, an international human rights lawyer and director of the Project on Extrajudicial Executions at the Center for Human Rights and Global Justice at NYU School of Law.
"With many other drone issues [such as targeted killings by drones and U.S. domestic police drone use], scrutiny, democratic accountability, and public debate trail behind actual use," says Knuckey, a former advisor to the United Nations' Special Rapporteur on extrajudicial executions.
"With LAR, we now have the opportunity to carefully think through the ethical, moral, political, and legal issues in advance of a new technology's use. This opportunity should be seized."
There are several mechanisms in international law that could be used to establish a ban, says Knuckey, from adding an additional protocol to an existing arms control treaty, to developing a framework agreement on the use of lethal robotics in general, with a binding protocol attached specifically prohibiting LAR.
Alternatively, a new, independent treaty or convention similar to the 2008 Convention on Cluster Munitions (CCM) could be developed.
Meanwhile, regional or country-specific agreements like the Strategic Arms Reduction Treaty (START) between the U.S. and Russia, and soft law options such as the creation of a ‘Code of Conduct’ for LAR research and deployment could also be explored, says Knuckey.
The "most promising" mechanism and strategy for creating a ban is probably an international treaty similar to the CCM, says ICRAC's Sharkey.
Adopted in Dublin, Ireland, in 2008, the CCM came into effect on August 1st, 2010, and has been ratified by 76 countries. A further 35 countries have signed, but not ratified, the Convention, which prohibits the use, transfer, and stockpiling of cluster bombs.
Controversially, some of the most militarily advanced nations in the world are yet to sign the CCM. But there is value in pursuing a similar strategy to create a ban on LAR, says Sharkey.
"Although a number of countries including the US, Russia and China did not sign these treaties, there has been no widespread use of these weapons since, and the treaty provisions should eventually become customary law."
One obstacle to be overcome, says Sharkey, is the "mythical cultural narrative" that greatly overestimates what robots can do.
"You can hear it in the language of proponents who talk about robots never getting hungry or seeking revenge or being more ethical and humane on the battlefield than humans. This fits into the trope that robots have some sort of moral agency," says Sharkey.
"In fact, robots are pretty dumb machines with no more intelligence than a washing machine. Such talk misleads and distracts from the real issues."
Ronald Arkin, Regent’s Professor at the School of Interactive Computing, Georgia Tech:
Rather than banning LAR technology outright, methods to ensure that it is used in a controlled and guided way should be developed, says Ronald Arkin, Regent’s Professor at the School of Interactive Computing, Georgia Tech and an expert on the ethics of military robotics.
"Some advocates for banning are arguing from pathos rather than from logos or ethos," says Arkin, who contends that the use of weaponized autonomous robots in war is inevitable.
"A more effective strategy is to get the players to the table to discuss these things effectively and intelligently and not from arguments of fear."
Central to Arkin’s position is his belief that LAR systems could eventually outperform human soldiers.

"I have the utmost respect for our young men and women in the battlefield, but the tempo of modern warfare makes it unreasonable to expect them to make truly effective decisions every time," says Arkin.
"If robots can do better than human soldiers, and I believe that they can, it will reduce the damage suffered by non-combatants and translate into saving lives. This is an important aspect of this work."
A 2006 U.S. Army survey of the mental health of soldiers and marines who had recently served in a war zone revealed that fewer than half would report a team member for unethical behavior.
One in three believed that torture should be allowed to save the life of a fellow soldier or marine. Ten percent reported mistreating non-combatants or damaging their property unnecessarily.
Human performance on the battlefield is an "extremely depressing and horrific" area of study, Arkin observes. But the data lends ethical support to arguments that favor continued development of LAR technologies.
Arkin has outlined a proof of concept for an "ethical governor": a software program that could, if successfully developed and deployed, be used to ensure LARs comply with international rules of war.
If used in strictly limited scenarios such as counter-sniper operations, room clearing, or perimeter protection, says Arkin, it becomes possible to encode a weaponized robot with the specific permissions, obligations, and prohibitions that apply under international law.
"You don't need to encode the entire Geneva Convention in there, just those sections that are particularly relevant," says Arkin.
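To make the concept concrete, here is a minimal, hypothetical sketch, in Python, of how such a governor might gate a weapon-release decision behind encoded constraints. Everything in it (the EngagementContext fields, the scenario names, the harm and necessity scores) is an illustrative assumption of ours, not Arkin's actual design, which is far more sophisticated.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class EngagementContext:
    # Hypothetical situation data supplied by upstream perception and mission systems.
    target_is_combatant: bool      # output of an assumed discrimination module
    scenario: str                  # e.g. "counter_sniper", "room_clearing"
    expected_civilian_harm: float  # estimated collateral damage, 0.0 to 1.0
    military_necessity: float      # estimated military value of the strike, 0.0 to 1.0

# A constraint inspects the context and returns True if the action remains permissible.
Constraint = Callable[[EngagementContext], bool]

CONSTRAINTS: List[Constraint] = [
    # Distinction: never authorize engagement of a non-combatant.
    lambda ctx: ctx.target_is_combatant,
    # Operate only in the narrow, pre-certified scenarios Arkin describes.
    lambda ctx: ctx.scenario in {"counter_sniper", "room_clearing", "perimeter_protection"},
    # Proportionality: expected civilian harm must not outweigh military necessity.
    lambda ctx: ctx.expected_civilian_harm <= ctx.military_necessity,
]

def governor_permits(ctx: EngagementContext) -> bool:
    # Weapon release is suppressed unless every encoded constraint passes.
    return all(rule(ctx) for rule in CONSTRAINTS)

# Example: a combatant target in a perimeter-protection scenario with low
# expected civilian harm passes all three checks.
ctx = EngagementContext(True, "perimeter_protection", 0.1, 0.7)
print(governor_permits(ctx))  # True

The design point worth noting is that, in this sketch, the governor acts purely as a suppressor: it can veto an engagement that violates any encoded constraint, but it cannot authorize one on its own.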
The debate continues. But given the rapid proliferation and development of existing military robots, the window for imposing a ban on the development of LAR appears small.
And with major players such as the U.S., U.K., China, Israel, and Russia unlikely to become signatories to any international treaty prohibiting LAR systems, autonomous, armed robots look set to advance and, indeed, to proliferate.
Perhaps only a sustained public campaign will persuade policy makers to get involved in regulating LAR development and deployment. Even at that, country-specific legislation and regulation seems more likely than international agreement.
Despite the odds, ICRAC’s Sharkey remains hopeful.
"It is always very difficult to put the genie back in the bottle once it is out, but this genie is just showing its face at the moment and there is still time to cork the bottle," he says.
What do you think?
Should LAR technology be banned or is the genie already out of the bottle? Do you think robots could outperform humans on the battlefield? Do you have a military robotics story to share? Tell us about it.