By Wanda Muñoz, member of SEHLAC and the Global Partnership for Artificial Intelligence
After more than a year without an in-person meeting, the Convention on Conventional Weapons (CCW) Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems met again at the Palais des Nations in Geneva on August 3-13.
At this meeting, and at the informal consultation held by the Belgian Chairmanship of the GGE in June, we in civil society identified what has been called “emerging convergence” on some of the key aspects of the debate. These aspects include:
- The structure of a future instrument on autonomous weapons systems (AWS),
- The inclusion of prohibitions and positive obligations,
- The concept of “meaningful human control” as central to the debate, and
- The need to drop the qualifier of “lethal” to describe the weapons under discussion.
These elements have been presented, advocated for, and well-justified by countries including Argentina, Austria, Brazil, Chile, Costa Rica, Ecuador, El Salvador, Ireland, Mexico, Panama, Pakistan, Peru, the Philippines, Sierra Leone, the State of Palestine, Uruguay, and Venezuela on behalf of the Non-Aligned Movement.
The call for an international legally binding instrument as the best answer to the concerns raised by AWS has also been steadily growing. For more on the momentum toward a new treaty, see, for instance, recent analyses by Human Rights Watch, SEHLAC, and the Campaign to Stop Killer Robots.
CCW High Contracting Parties, the International Committee of the Red Cross (ICRC), the Campaign to Stop Killer Robots, its member organizations, and others are currently preparing for the intense discussions that will take place on the road to the CCW’s 2021 Review Conference. Although the discussions on AWS are certainly fundamental on their own merit, the stakes of what the CCW resolves in the coming months will reach far beyond this very specialized forum.
Indeed, these discussions and their outcomes are about more than autonomous weapons systems; they will certainly have wider legal, social, and political implications. Six of the themes that will carry over from the August meeting warrant particular attention because of their broader significance:
1. The links between international humanitarian law (IHL), international human rights law (IHRL), and ethics. We were surprised (or were we really?) to hear some countries advocate for deleting the concepts of international human rights law and ethics from the Chair’s paper, which presents draft elements on possible recommendations for the report that the GGE will submit to the Review Conference.
The relevance of IHRL in situations of conflict has been widely accepted and documented; as has the relevance of both ethics and human rights frameworks in the context of discussions on the risks of artificial intelligence (AI) and emerging technologies. Eliminating the reference to IHRL and ethics would set a negative precedent that disregards existing agreements and good practice both in disarmament and in artificial intelligence.
2. The use of the qualifier “lethal” to describe the weapons systems being addressed. The notion of lethality has been part of the discussions in the CCW since the creation of the GGE in 2014. However, as the debates have moved forward and the GGE experts have advanced in their own understanding of the issues, delegations including Brazil, Chile, Mexico, and many others have objected to lethality as a defining characteristic of autonomous weapons systems. As the ICRC has eloquently explained, “lethality” is not a criterion used in IHL to define a weapon. Lethality does not depend only on a weapon’s intrinsic characteristics; it also depends on the target and the context, and IHL equally requires avoiding unnecessary suffering and injury. Defense systems may also have lethal consequences. Accepting lethality as a qualifier of AWS would undermine previous understandings of IHL and set a negative precedent. Furthermore, we should not accept that “lethality” remains in the discussion simply because it was there before; yet this is the main argument some delegations are putting forward to keep it in.
3. The creation of synergies (or lack thereof) between different frameworks. Many delegations are fond of invoking synergies and even “cross-pollination” with other humanitarian disarmament frameworks, and we believe delegations in the CCW have important areas of opportunity in this regard. For those countries who are against a negotiating mandate:
- How does this position align with the Sustainable Development Goals (SDG), particularly SDG#16 which aims to “promote just, peaceful and inclusive societies”?
- How does it align with the OECD AI principles?
- How about those countries with feminist foreign policies, or members of forums such as the Global Partnership on AI, which aims to “promote trust in and the adoption of trustworthy AI”?
- And most of all, how do those countries’ positions align with the UNESCO Recommendation on the Ethics of AI?
It is not coherent for countries to endorse those frameworks and policies, and then oppose a negotiating mandate on autonomous weapons systems; they cannot have it both ways.
Furthermore, if the CCW fails to adopt a legally binding instrument on AWS, it would send the message that the right to life can be delegated to autonomous functions. Such an unacceptable precedent would negatively impact other frameworks related to the regulation of artificial intelligence; the standard of what we could demand from regulators and policy makers would be inevitably lowered. If we as an international community accept that the right to life—without which all other rights cannot be exercised—can be delegated to autonomy, then why shouldn’t we accept that the right to health, employment, education, or social housing can be delegated to autonomy as well? Let us not be mistaken; this is what is at stake.
4. Building on experience and expertise from relevant civilian sectors, for instance in the case of algorithmic bias. Some delegations are still questioning whether algorithmic bias would be a concern in autonomous weapons systems. According to Argentinian AI expert Dr. Maria Vanina Martínez, speaking at a recent UNIDIR event, bias would indeed be an issue, because autonomous systems are based on data that represents only a portion of the real world. Additionally, the negative social impact of algorithmic bias has been solidly demonstrated in sectors such as housing, justice, access to COVID-19 vaccination, and policing (see, for instance, Buolamwini and Gebru; and, on its relevance to AWS, Ramsay-Jones from Soka Gakkai and SEHLAC). It is difficult to see how such problems could be avoided in AWS, particularly those that would target humans. Recognizing these risks is not “demonizing technology”; rather, it is ensuring that problems already identified in the civilian world are not allowed to be reproduced in weaponry. Ignoring this evidence would show a disconnect between the discussions in the CCW and the evidence emerging from AI and related technologies.
5. The active participation and impact of delegations and civil society from the Global South, including from countries affected by conflict, in the CCW. More and more delegates from the Global South are raising their voices, and this highlights two key factors that are changing the power dynamics. First, those in the Global South should be heard: our countries are probably the ones where AWS would first be tested and used, and they are committed to IHL and IHRL. Second, international issues are relevant to all of us, and all countries have both the responsibility and the expertise to discuss issues that would impact humanity and human dignity, within or beyond their borders. This is certainly not an issue that should be left solely in the hands of countries that already have the biggest arms industries. Expect more of this in the CCW and other forums—we want a different future, and we are there to forge it.
6. Last but not least, the future of the CCW. We recognize the CCW is a valuable forum for addressing concerns raised by different weapons with a large number of countries, including the main users and producers of such weapons. But on the topic of AWS, discussions have now been going on for seven years. As Bonnie Docherty of Human Rights Watch and Harvard Law School’s International Human Rights Clinic has put it, “Over the years, states parties have added five protocols, amended one, and amended the framework convention. But they have accomplished little since adopting Protocol V on Explosive Remnants of War in 2003.” So GGE, are you ready to address the issue of autonomous weapons systems as it should be addressed—with a negotiating mandate for a legally binding instrument? Or are you happy to keep the discussion going on … and on … and on? It is time to live up to your responsibilities and demonstrate your relevance: agree on a negotiating mandate for a legally binding instrument on autonomous weapons systems.
* * * * *
A few countries seem intent on blocking any progress at all, using the consensus reached in previous years on certain concepts as a pretext. But as the Chair rightly pointed out, it is possible to improve on consensus, and if the GGE can agree on improvements, there is no reason why it should not do so. We would add that, in fact, the CCW has the responsibility to do so.
We all have a responsibility to act now. We have this duty as an international community with an interest in preserving international humanitarian law, international human rights law, and human dignity, irrespective of our affiliations, organizations, and nationalities. We have the precedents, the commitment, the evidence, the knowledge, the motivation, and the networks to make this happen. Let us continue working together, joining forces and encouraging others to raise their voices. Let us do what we need to do to ensure that we have a legally binding treaty on autonomous weapons—and let us do that building on the wealth of knowledge and expertise from other forums and sectors.