Dynamics of Debate at the Experts Meeting on Autonomous Weapons

15 May 2014, 1438 EDT

This will be nothing like a comprehensive overview of the topic (for one thing, I have been in and out of plenary; for another, I am filtering this event through the lens of my specific research agenda on framing and norm development). That said, here are a few notes and observations about the nature of the debate and the process here since I arrived yesterday – angles I am not likely to blog about in depth but which are worth noting in passing. I hope others will take them up and link to whatever others are writing – I am mostly following events here rather than online, with the exception of the #CCWUN Twitter feed.

Genuine Deliberation Seems To Be Occurring Here. To me this has been the most pleasant surprise of the conference. My prior experience of and book learning about multilateral conferences suggests that delegates and experts should generally present formulaic statements in plenary, aimed at occupying diplomatic ground in contested political terrain, speaking primarily to audiences outside the room, and connecting largely with those they agree with. Instead what is happening here seems to be genuine communicative action. The event is extremely interactive, with speakers presenting in plenary and then delegates and civil society representatives asking pointed questions respectfully and receiving careful, thoughtful answers. NGO side events organized by the Campaign to Stop Killer Robots were absolutely packed with delegates and experts in turn asking thoughtful questions both during and around the presentations. Veteran CCW participants have said to me this is an unprecedented level of deliberative dialogue in this setting that speaks to the gravity of the moral issues at stake and the genuine interest of international society in a meaningful discussion.

Gender Balance Is a Huge, Glaring Problem. The representative from Norway noted that of the 18 “experts” speaking in plenary, zero are women. This is particularly egregious since one of the male experts actually occupied more than one slot. It is disheartening more than a decade after states pledged, with Security Council Resolution 1325, to integrate women into peace and security policy-making at the global level. Women present among the humanitarian disarmament NGO community caucused yesterday evening to discuss the problem and strategies for changing it, including encouraging male experts in disarmament to actively “step aside” when invited to all-male panels in order to make room for and refer organizers to female colleagues (actually not a bad idea in academic settings as well). Sarah Knuckey has begun collecting a list of female disarmament experts at the Just Security blog, to which male experts may refer colleagues. UN Special Rapporteur Christof Heyns noted the absence of gender diversity in the plenary in his remarks this morning, where he called for more participation by women and by experts from the Global South. By contrast, women have been well represented at the NGO side events as well as in NGO statements in the plenary.

Terminologies Remain Contested: The media is of course using the term “killer robots.” The diplomatic term of art at the meeting is “lethal autonomous weapons systems.” Campaigners here are using this term as well, interchangeably with their original term “fully autonomous weapons” and the more popular shorthand “killer robots.” But some campaigners dislike the term “lethal autonomous weapons systems” for four reasons: a) it is unwieldy; b) the acronym LAWS unhelpfully conflates the systems with international law, whereas they may be unlawful; c) some campaigners are opposed to all targeting decisions without meaningful human control, not just those intentionally designed to be lethal to humans, since even the targeting of weapons systems could cause harm to humans – of course this criticism could also be applied to the term “killer robots”; and d) there is concern about systems that might be ‘less than lethal.’ Some campaigners have proposed “FLAWS” as an alternative to “LAWS.” I will continue to use these terms interchangeably as well, depending on my audience.

Composition of Specific Expert Panels Does Not Reflect Diversity of Opinion. Yesterday’s plenary sessions were divided between technical, ethical and legal panels. The ethics speakers primarily raised concerns about AWS. By contrast, the legal panel yesterday contained no perspectives from AWS critics. All speakers were AWS proponents, with the exception of Nils Melzer, whom I interpreted as occupying a middle ground, opposing a ban without explicitly arguing in favor of the systems. His middle-ground position would have made more sense if positions on both ends of the continuum had been represented on the panel. Legal arguments in favor of a strict legal requirement of meaningful human control were raised repeatedly in interventions by NGOs and delegates. I wonder if the uniform composition of the ethical and legal panels will give delegates (or the media) the impression that ethical arguments are all pro-ban and legal arguments are all pro-AWS. Actually, there are nuanced positions for and against in both communities that might have been included: Ron Arkin’s argument in favor of AWS is based on utilitarian ethics, and Bonnie Docherty’s arguments against are based on legal analysis. The second legal panel (today) seemed a bit more balanced.

‘Meaningful Human Control’ Is Emerging As a Theme But What It Means Is Contested. I am hearing a general consensus emerging that meaningful human control should be required for all weapons systems. Country delegates I have spoken to, as well as the NGOs, say they share this perception. Even proponents of such systems and states in favor of developing them have been arguing that humans should retain “meaningful control” over these weapons, and have generally been bending over backward to say they have no intention of ever deploying them in any other way. But there is wide debate over what constitutes meaningful human control and how we would know. A US delegate implied yesterday that autonomous weapons would of course be controlled by humans because, as the creators of these weapons, we are responsible for their programming. AWS proponents have said and reiterated on panels that as long as a human commander makes the decision to deploy the weapon and could be held responsible for errors, this satisfies the “meaningful control” condition. However, AWS critics argue that IHL requires situational awareness and proportionality judgments in the context of attacks per se, not simply deployment, and doubt machines can satisfy this condition.

Other Open Questions:

Is Human Emotion in Targeting Decisions Good or Bad on Balance? AWS proponents stress the advantages for IHL compliance if algorithms make targeting decisions, because they will not succumb to stress, exhaustion, anger, fear, revenge or other unsavory, irrational human impulses. Advocates of a ban emphasize that human moral judgment and reasoning are fundamentally based on emotions such as compassion, empathy, and courage. In an intervention this morning, the ICRAC delegation stated that the emphasis on techno-rationality risked removing moral reasoning from war.

Is Existing IHL Sufficient To Govern Developments in LAWS? Opponents of a new treaty ban say yes (I’m told this is typical practice in CCW settings): no new law is needed because existing law is sufficient to cover all bases. Some experts and delegates emphasized states’ responsibility to conduct Article 36 reviews, and to use any weapons consistently with the principles of distinction and proportionality. Proponents seemed dismissive of the idea that the principle of “humanity” in the law forbids these weapons, or that the distinction principle militates against putting weapons beyond human control: campaigners argue that if it doesn’t, then clearly new law is needed, since the dictates of the public conscience are in favor of meaningful human control.

Is the Martens Clause Relevant? The global coalition and the UN Human Rights Rapporteur strongly feel it is. The ICRC and several states also mentioned the importance of the clause, which enjoins states not to assume that what is not prohibited is allowable, but to look to custom as well as the “dictates of humanity and the public conscience” in evaluating practices not yet prohibited by the law. In his remarks in plenary, Columbia law professor Matthew Waxman disagreed, suggesting the Martens Clause is not binding on states. One speaker argued that public opinion is a poor basis for policy-making, as it can change over time. The ICRC considers the Martens Clause customary law. I have argued the clause requires states to consider public views, and I presented public opinion data on the ethical basis for US public opposition to autonomous weapons at an NGO side event.

What About Other International Law Beyond IHL?: Christof Heyns urged states this morning not to forget about human rights law. This has been echoed in Human Rights Watch’s new report on policing, as well as in Peter Asaro’s remarks on the ethical principles underpinning both human rights law and IHL. In his second appearance in plenary, Nils Melzer focused on the question of whether the use of AWS would threaten the law on the use of force. In reply, NGO campaigners for a ban agreed with him that it would not and pointed out that no one ever argued the contrary: rather, they have argued AWS may have operational effects on compliance with those laws, not on the laws themselves.* Finally, ethical questions beyond IHL are being consistently raised. My survey data demonstrate that it is ethical and moral concerns, rather than questions of legality, that are particularly salient in the eyes of the public. An open question is how the “principle of humanity” expressed in international law is to be understood as applied to this issue.

For official conference news and links to expert talks, go here. For more on what’s going on from the civil society perspective see the reports at Reaching Critical Will’s website here. For tweet coverage by friends of the global coalition go here.
________________
*I have personally been skeptical of the causal argument about such operational effects of LAWS on the incidence of war, though I must say I heard extremely compelling arguments by ICRAC’s Jürgen Altmann at an NGO side event yesterday, where he outlined causal mechanisms by which the sheer speed of interactions between defense computers could contribute to crisis instability. At any rate, it would have been nice to see an “expert” on the causes of war assess this argument since, as Altmann reminded us at the side event through reference to the CCW preamble, the CCW is a disarmament as well as a humanitarian law treaty.