Maricela Muñoz, Government Fellow at the Geneva Centre for Security
Policy (GCSP)
Are states ready to prohibit autonomous weapons systems (AWS) that target and apply force to humans without meaningful human control?
Maybe not just yet. . . . Or so it might seem if one listened to the diverse arguments presented during the first informal consultations convened by the chair of the UN Group of Governmental Experts (GGE) for emerging technologies in the area of lethal autonomous weapons systems (LAWS) for 2021. These consultations, which took place virtually from June 28 to July 2, were largely based on the written submissions presented by stakeholders in the last few months.
The mandate of the GGE as prescribed by the states parties to the Convention on Certain Conventional Weapons (CCW) is to identify “consensus recommendations in relation to the clarification, consideration and development of aspects of the normative and operational framework” for these emerging technologies. CCW states parties initiated discussions of such technologies in 2014.
Although CCW states parties have not yet reached consensus, many states and international organizations, including the International Committee of the Red Cross (ICRC), have warned of possible gaps in the current international legal framework’s ability to address the challenges brought about by emerging technologies in the area of LAWS. In other words, these actors have highlighted a vacuum in international law with respect to the rapid development of weapons technology based on artificial intelligence and machine learning. Consequently, they would like to see a more ambitious outcome from the lengthy discussions.
These stakeholders are of the view that the time is ripe for the GGE on LAWS to present to the 2021 Review Conference of the CCW unequivocal recommendations for the development of a robust operational and normative framework, including a mandate to negotiate international legal norms.
Autonomous Weapons Systems
Today, the world bears witness to rapid advances in technology, including in the area of artificial intelligence. As it stands, dual-use technology may have different impacts depending on its applications. When technology is used in a positive manner, it can lead the world towards more inclusive, sustainable, and peaceful development, contribute to the achievement of the 2030 Agenda for Sustainable Development, and thereby improve the lives of many. At the same time, frontier technologies could also be used to develop autonomous weapons systems that threaten to displace humans from their unique role in decision-making with regard to the use of force.
As a multilateral practitioner in the disarmament, non-proliferation, and arms control context, one gets the sense that the architecture is lagging behind the challenges posed by new technological developments that may have military and law enforcement applications, with potentially serious humanitarian consequences and indiscriminate effects.
It is important to consider that the disarmament and arms control architecture ought to be future-proof, and a moral line must be drawn if the international community wants to prevent the dehumanizing use of technology and its potential applications.
In this regard, a common understanding has emerged that military technologies with certain characteristics in autonomy may present legal, ethical, moral, operational, and security challenges, including compliance with international law, international humanitarian law, international human rights law, international criminal law, and the dictates of human dignity.
Against this backdrop, the ICRC has recommended that the “use of autonomous weapon systems to target human beings should be ruled out . . . through a prohibition on autonomous weapon systems that are designed or used to apply force against persons.” It has also recommended prohibitions on unpredictable AWS and regulations on other AWS.
Prohibitions and Regulations
One could derive from this position, which has also been presented by many delegations in the context of the GGE on LAWS, that AWS could be divided into non-acceptable weapons systems, which are prohibited, and acceptable systems, which are regulated. It will be fundamental to take into account the weapon system’s life cycle, and the degree of human-machine interaction involved in each stage of that cycle.
AWS that select and apply force to targets without meaningful human control would fall under the non-acceptable category. In this case, we can picture an autonomous weapon system that self-initiates or triggers a strike in response to information received from its environment. It does so through sensors and on the basis of a generalized “target profile” (technical indicators that function as a generalized proxy for a target). The weapon system fires itself when triggered by an object or person, at a time and place that the user has not specifically known, chosen, or predicted.
The design and use of autonomous weapons systems that would not be prohibited would fall under the acceptable category, but would still be regulated. The regulations would include a combination of (a) limits on the types of targets, such as constraining them to objects that are military objectives by nature; (b) limits on the duration, geographical scope, and scale of use, including to ensure human judgment and control in relation to a specific attack; (c) limits on situations of use, such as confining them to situations where civilians or civilian objects are not present; and (d) requirements for human–machine interaction, notably to ensure effective human supervision and responsibility, that may translate into a timely intervention and deactivation.
Moreover, the use of target profiles to identify and use force against humans should also be prohibited because it is legally and morally unacceptable. Algorithm-based programming relies on data sets that can perpetuate or amplify social biases, including gender and racial bias, and thus has implications for compliance with international law. Furthermore, the debate should address in depth the collateral harm and disproportionate effects on women and children.
In terms of compliance with current international law, including international humanitarian law, important aspects such as accountability, transparency, safeguards, risk mitigation factors, and other criteria must also be applied to the design, development, production, deployment, and use of autonomous weapons systems.
Another aspect that is important to underline is that machines cannot be anthropomorphized, and therefore they cannot be held accountable or responsible for their acts. Only states and individuals can face consequences for any potential illegal or criminal behavior.
Furthermore, scientific, legal, industry, and military experts, as well as civil society advocates, have already stressed the need to ensure sufficient levels of predictability, foreseeability, reliability, oversight, and explainability of autonomous weapons systems. The establishment of spatial and temporal constraints enables commanders and operators to exercise meaningful human control, ensure legal compliance, avoid technical vulnerabilities, and mitigate risks.
Against this background, and taking into account that states in general have expressed that they have no intention of developing and using AWS that are fully autonomous, or that lack adequate and meaningful human control, it is only sensible that the current GGE on LAWS should recommend prohibitions and regulations in the context of its mandate. States should aim to establish international limits on autonomous weapons systems that effectively address humanitarian, legal, ethical, and security concerns raised by these weapons, and do so in a timely and effective manner.
Beyond new legal rules, these limits may also include common policy standards, preemptive national and other risk mitigation measures, and transparency in legal weapons reviews.
In the meantime, it is fundamental to acknowledge that some states have already endeavored to exercise responsible behavior in supporting the development of technologies, including advanced weapons systems, whilst this is not the case for armed non-state actors. Therefore, serious efforts must be undertaken to prevent the acquisition and proliferation of advanced weaponry by armed non-state actors.
An open-ended and inclusive process that addresses the humanitarian and security challenges posed by AWS should take place at the international level, and rightfully in the context of the CCW given its humanitarian purpose.
Looking Forward to the 2021 CCW Review Conference
In the context of the discussions within the GGE on LAWS, a strong point of convergence seems to be emerging, which is that those weapons systems with unlimited autonomy for targeting and applying force to humans should be prohibited. There is also convergence around the importance of retaining meaningful human control in weapons systems, and strong support for the establishment of prohibitions and regulations on AWS.
Formal meetings of the GGE are scheduled to take place in Geneva, as follows: August 3 to 13, September 24 to October 1, and October 18 to 22, 2021.
The question remains whether states will act with a due sense of urgency and make robust recommendations to the 2021 CCW Review Conference for the negotiation of a new normative and operational framework.
Personally, I remain optimistic that all interested parties will finally decide to stand on the right side of history, and address—constructively and without delay—the serious challenges posed by autonomous weapons systems.