Three Pathways to Secure Greater Respect for International Law concerning War Algorithms 

Dustin A. Lewis

2020

Legal Commentary

Harvard Law School Program on International Law and Armed Conflict


Executive Summary

Existing and emerging applications in armed conflicts of artificial intelligence and of other systems reliant upon war algorithms and data span diverse areas. Natural persons may increasingly depend upon these technologies in decisions and activities related to killing combatants, destroying enemy installations, detaining adversaries, protecting civilians, undertaking missions at sea, providing legal advice, and configuring logistics.

In intergovernmental debates on autonomous weapons, a normative impasse appears to have emerged. Some countries assert that existing law suffices, while several others call for new rules. Meanwhile, the vast majority of efforts by States to address relevant systems focus on weapons, means, and methods of warfare. Partly as a result, the broad spectrum of other far-reaching applications is rarely brought into view.

One normatively grounded way to help identify and address relevant issues is to elaborate pathways that States, international organizations, non-state parties to armed conflict, and others may pursue to help secure greater respect for international law. In this commentary, I elaborate on three such pathways: forming and publicly expressing positions on key legal issues, taking measures relative to their own conduct, and taking steps relative to the behavior of others. None of these pathways is sufficient in itself, and there are no doubt many others that ought to be pursued. But each of the identified tracks is arguably necessary to ensure that international law is — or becomes — fit for purpose.

By forming and publicly expressing positions on relevant legal issues, international actors may help clarify existing legal parameters, pinpoint salient enduring and emerging issues, and detect areas of convergence and divergence. Elaborating legal views may also help foster greater trust among current and potential adversaries. To be sure, in recent years, States have already fashioned hundreds of statements on autonomous weapons. Yet positions on other application areas are much more difficult to find. Further, forming and publicly expressing views on legal issues that span thematic and functional areas arguably may help States and others overcome the current normative stalemate on autonomous weapons. Doing so may also help identify — and allocate due attention and resources to — additional salient thematic and functional areas. Therefore, I raise a handful of cross-domain issues for consideration. These issues touch on things like exercising human agency, reposing legally mandated evaluative decisions in natural persons, and committing to engage only in scrutable conduct.

International actors may also take measures relative to their own conduct. To help illustrate this pathway, I outline several such existing measures. In doing so, I invite readers to inventory and peruse these types of steps in order to assess whether the nature or character of increasingly complex socio-technical systems reliant upon war algorithms and data may warrant revitalized commitments or adjustments to existing measures — or, perhaps, development of new ones. I outline things like enacting legislation necessary to prosecute alleged perpetrators of grave breaches, making legal advisers available to the armed forces, and taking steps to prevent abuses of the emblem.

Finally, international actors may take measures relative to the conduct of others. To help illustrate this pathway, I outline some of the existing steps that other States, international organizations, and non-state parties may take to help secure respect for the law by those undertaking the conduct. These measures may include things like addressing matters of legal compliance by exerting diplomatic pressure, resorting to penal sanctions to repress violations, conditioning or refusing arms transfers, and monitoring the fate of transferred detainees. Concerning military partnerships in particular, I highlight steps such as conditioning joint operations on a partner’s compliance with the law, planning operations jointly in order to prevent violations, and opting out of specific operations if there is an expectation that the operations would violate applicable law.

Some themes and commitments cut across these three pathways. Arguably, respect for the law turns in no small part on whether natural persons can and will foresee, understand, administer, and trace the components, behaviors, and effects of relevant systems. It may be advisable, moreover, to institute ongoing cross-disciplinary education and training as well as the provision of sufficient technical facilities for all relevant actors, from commanders to legal advisers to prosecutors to judges. Further, it may be prudent to establish ongoing monitoring of others’ technical capabilities. Finally, it may be warranted for relevant international actors to pledge to engage, and to call upon others to engage, only in armed-conflict-related conduct that is sufficiently attributable, discernible, and scrutable.


Front Matter

About HLS PILAC

The Harvard Law School Program on International Law and Armed Conflict (HLS PILAC) provides a space for research on critical challenges facing the various fields of public international law related to armed conflict. The Program’s mode is critical, independent, and rigorous. While its contributors may express a range of views on contentious legal and policy debates, HLS PILAC does not take institutional positions on such matters.

About the War-Algorithms Project

This commentary was written as part of HLS PILAC’s project on “International Legal and Policy Dimensions of War Algorithms: Enduring and Emerging Concerns.” In the current phase of the project, HLS PILAC set out to strengthen international debate and inform policymaking on how artificial intelligence and complex computer algorithms are transforming war. The Program also sought to inform the debate on how international legal and policy frameworks already govern, and might further regulate, the design, development, and use of those technologies. The project is financially supported by the Ethics and Governance of Artificial Intelligence Fund.

The original conception of the current phase of the project envisioned a multi-part process. A team of researchers would undertake independent analysis focusing on international law and policy coupled with research engagements in select countries. The provisional results would be presented for critical comment at a workshop at Harvard Law School for specialists from militaries and other government agencies, international policymakers, and academia. After reviewing the comments, the project team would consolidate their findings and give briefings on them to governments, international policymakers, industry, and civil society. Part of the project plan changed when, due to travel restrictions imposed in relation to the novel coronavirus and the disease that it causes (COVID-19), the workshop was initially postponed and later canceled. In lieu of the workshop, resources were reallocated to support more independent research combined with critical feedback on the analysis. Following the publication of this commentary, it is anticipated that the project team will give briefings on the various findings of the project to actors from governments, international bodies, industry, and civil society.

About the Author

Dustin A. Lewis is the Research Director of HLS PILAC.

Acknowledgments

The author gratefully acknowledges the following people: HLS PILAC Research Assistants Lindsay Anne Bailey, Sonia Chakrabarty, Minqian Lucy Chen, Edi Ebiefung, Zach Grouev, Rachael Hanna, Simon Gregory Jerome, Daniel Levine-Spound, Charles Orta, Will Ossoff, Binendri Perera, Vera Piovesan, Sam Rebo, Lilianna (Anna) Rembar, Ricardo Cruzat Reyes, Juan Rivera Rugeles, Delphine Rodrik, Carolina Silva-Portero, Katherine Shen, Delany Sisiruca, April Xiaoyi Xu, and Eun Sung Yang, for wide-ranging research and analysis over the course of the project; Jennifer Allison and Caroline Walters of the Harvard Law School (HLS) Library, as well as other members of the HLS Library staff, for research support; and Maya Brehm, Neil Davison, Netta Goussac, Martin Hagström, Naz K. Modirzadeh, Aaron Waldo, and Pauline Warnotte, for feedback on drafts of parts or all of the commentary.

Disclaimer

The views and opinions expressed in this commentary should not be taken, in any way, to reflect the official opinion of the Ethics and Governance of AI Fund. Rather, the views and opinions reflected in this commentary are solely those of the author, who alone is responsible for any errors.

License

This commentary is published under the following license: Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license (CC BY-NC-SA 4.0).

Web

This legal commentary is available online at https://pilac.law.harvard.edu/international-legal-and-policy-dimensions-of-war-algorithms.

Introduction

Existing and emerging applications of artificial intelligence and other complex socio-technical systems in armed conflicts span diverse areas.[1] Natural persons may increasingly depend upon such systems in activities and decisions related to:
  • Killing combatants;[2]
  • Destroying enemy installations;[3]
  • Detaining adversaries;[4]
  • Protecting civilians;[5]
  • Undertaking missions at sea;[6]
  • Providing legal advice;[7] and
  • Configuring war-related logistical arrangements.[8]
Concerning terminology, in this legal commentary, by socio-technical system, I mean the specific arrangements through which people and technology interact,[9] with a particular focus here on such systems that rely upon war algorithms and data. By war algorithm, I mean an algorithm expressed in computer code, effectuated through a constructed system, and capable of operating in relation to an armed conflict.[10]
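To make that definition concrete, consider a minimal, hypothetical sketch, written here in Python, of what an algorithm "expressed in computer code" might look like. Every name, attribute, and threshold in the sketch is invented for illustration and does not describe any fielded system; the point is only that even a few lines of code can encode a decision rule that, once effectuated through a constructed system, is capable of operating in relation to an armed conflict.

```python
# Hypothetical sketch only: a deliberately simplistic decision rule expressed
# in computer code. All names, attributes, and thresholds are invented for
# illustration and do not describe any fielded system.

def nominate_for_review(track: dict) -> bool:
    """Flag a sensor track as a candidate for further human review."""
    # The track dict is assumed to carry sensor-derived attributes.
    return (
        track.get("emitter_matches_military_radar", False)
        and track.get("speed_m_s", 0.0) > 50.0
        and not track.get("inside_protected_zone", True)
    )
```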

So far, the bulk of States’ efforts to address relevant systems focuses on the conduct of hostilities. In intergovernmental debates on autonomous weapons, a normative impasse appears to have emerged. In short, while all agree on the importance of the “human element” in the use of force, some countries assert that existing law in this area suffices, while several others call for new rules.[11] Meanwhile, States’ views on the spectrum of other applications — including concerning detention, humanitarian services, maritime systems, and legal advice — are rarely brought into view.

One normatively grounded way to help more fully identify and address issues concerning socio-technical systems reliant upon war algorithms and data is to elaborate pathways that relevant actors may pursue to secure greater respect for international law.[12] In this legal commentary, I set out three such courses that States, international organizations, non-state parties to armed conflict, and others may take: forming and publicly expressing positions on key legal issues; taking measures relative to their own conduct; and taking steps relative to the behavior of others. In doing so, I seek to consolidate some of the main findings that have emerged as part of the project of the Harvard Law School Program on International Law and Armed Conflict (HLS PILAC) on “International Legal and Policy Dimensions of War Algorithms: Enduring and Emerging Concerns.”[13] As part of that project, I have been a participant-observer in every meeting of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems (GGE on LAWS).[14] Further, I have taken part in additional engagements with a diverse array of States, international organizations, non-governmental organizations, and scholars in related areas. I have also participated in a handful of academic initiatives. Further, I served as the editor of A Compilation of Materials Reflective of States’ Views on International Legal Issues pertaining to the Use of Algorithmic and Data-reliant Socio-technical Systems in Armed Conflict, which was published by HLS PILAC in December of 2020.

Through these and other experiences, I have formed views relevant to the framing and approach set out in this legal commentary. Perhaps most fundamentally, I continue to approach the use of algorithmic and data-reliant socio-technical systems in armed conflicts through the lens of international law.[15] That is partly because States widely recognize that international law applies to all armed conflicts. Indeed, unlike domestic legislation, corporate codes of conduct, and ethics policies, international law is the only regime agreed by countries at the international level to impose binding rules on war.

Further, in drawing upon the concept of socio-technical systems reliant upon war algorithms and data, I aim partly to sidestep definitional debates around notions like “autonomous weapons” and “artificial intelligence.”[16] At the GGE on LAWS, there is currently a debate within a debate on definitions of relevant systems. This set of discussions concerns what exactly is and is not covered by the notion of “emerging technologies in the area of lethal autonomous weapons systems” and whether an agreed understanding of those characteristics is necessary before deciding to move to negotiate a legal instrument.[17] In addition to setting out vastly different views on those questions, States have also elaborated widely divergent positions on the (lack of) utility as well as the definitional parameters of notions like “meaningful human control” over the use of force and concepts such as “humans [inside, on, or out of] the loop.”[18]

All the while, the vast majority of contemporary intergovernmental debate in this area, especially in the GGE on LAWS, focuses on weapons, means, and methods of warfare. In particular, international actors have set out positions on weapons reviews,[19] selecting targets and applying force against them,[20] and, more broadly, the “targeting cycle.”[21] It is undoubtedly essential for those aspects of hostilities-related activities and decision-making to be comprehensively considered and addressed, not least given the life-and-death stakes.

Yet, in my view, the existing frame of inquiry is far too narrow. Instead, international actors ought to systematically identify and address the diverse array of all armed-conflict-related activities and decisions implicated by these systems, including but extending beyond weapons. Further, in doing so, international actors ought to assess whether existing measures aimed at securing respect for the law concerning this range of activities and decisions are up to the task.[22]

Therefore, in framing this legal commentary in terms of socio-technical systems reliant upon war algorithms and data, I mean to place focus on the increasingly complex assemblages of algorithms and data developed and employed by natural persons in relation to a diverse array of activities and decisions pertaining to armed conflicts.[23] I remain concerned that focus on weapons to the exclusion of other application areas may forestall the development of positions on the numerous additional activities and decisions that may entail significant implications for how wars are executed, experienced, and governed. Further, it is analytically valuable to foreground the interactions between people and technology instead of focusing primarily on technical means as such. Hence the focus here on socio-technical systems reliant upon war algorithms and data.

Against that backdrop, it is warranted, in my view, for States and other international actors to consider certain cross-thematic legal questions. At least some of these questions may hold part of the key to unlocking more precisely and comprehensively what is at stake concerning the employment of relevant systems.[24] In doing so, those actors ought to build on the important work already undertaken in the context of the GGE on LAWS and other forums. In section 1, I set out a handful of such questions for consideration. In section 2, I outline some of the existing measures that international actors may take to respect the law pertaining to their own conduct. These measures span diverse activities and decisions, from weapons to detention to humanitarian services. In doing so, I seek to invite international actors to inventory and peruse these types of steps in order to assess whether the nature or character of increasingly complex socio-technical systems may warrant revitalizing commitments or making adjustments to existing measures — or, perhaps, developing new measures relative to one’s conduct. Along similar lines, in section 3, I outline some of the steps that States, international organizations, non-state parties, and others may take to help secure respect for the law where others engage in the conduct of concern. Finally, in the annex, I provide a summary of these example courses.

To be sure, none of these pathways is sufficient in itself. And there are no doubt many others that ought to be pursued. Yet each of the identified tracks is arguably necessary to ensure that international law in this area is — or becomes — fit for purpose.

Some themes and commitments cut across the three pathways. I contend that respect for the law turns in no small part on whether natural persons can and will foresee, understand, administer, and trace the components, behaviors, and effects of relevant systems. It may be advisable, moreover, to institute ongoing cross-disciplinary education and training as well as the provision of sufficient technical facilities for all relevant actors, from commanders to legal advisers to prosecutors to judges. It may also be prudent to establish ongoing monitoring of others’ technical capabilities. Further, it may be warranted for relevant international actors to pledge to engage, and call upon others to engage, only in armed-conflict-related conduct that is sufficiently attributable, discernible, and scrutable.

Finally, a couple of caveats ought to be borne in mind. The bulk of the research underlying this legal commentary drew primarily on English-language materials. The absence of a broader examination of legal materials, scholarship, and other resources in other languages narrows the study’s scope.[25] Further, in this legal commentary, I seek to outline only some of the courses that international actors may take to secure greater respect for international law applicable in relation to armed conflict.[26] Therefore, the analysis and the identification of potential issues, questions, measures, and concerns are far from comprehensive.

1. Forming and publicly expressing positions on key legal issues

By forming and publicly expressing positions on relevant legal issues, international actors may help clarify the parameters of existing law. Doing so may also help detect specific areas of convergence and divergence on emerging issues. Furthermore, elaborating positions may help pinpoint what precisely is most concerning with respect to particular systems and to components adopted across systems. It may also help foster greater trust among current and potential adversaries.

A first set of issues concerns whether or not all armed-conflict-related conduct ought, as a legal matter, to be reflective of human agency.[27] Reflective of human agency, in this sense, might arguably be characterized in the following terms (among no doubt many other potential formulations): armed-conflict-related conduct — in the form of both acts and omissions — must be subject to the exercise by one or more natural persons of intent (or, at least, foreseeability), knowledge, and causal control in respect of the conduct.[28] A rationale for adopting such a position and instantiating it in law is premised on the notion that conduct that may implicate fundamental freedoms, rights, and protections should be undertaken only where natural persons exercise sufficient agency in relation to the conduct. For example, the potential or actual employment of certain A.I.-related techniques or tools has raised concerns that it may be difficult — and, perhaps, impossible — for natural persons: to reasonably foresee or (otherwise) exercise intent in respect of the system’s behavior and effects; to reliably understand, supervise, and administer the system’s components, performance, and effects during an operation; or to sufficiently trace and understand the system’s components, performance, and effects after the fact.[29] In light of these and other issues, international actors may decide to elaborate positions concerning whether or not armed-conflict-related conduct ought, as a legal matter, to be reflective of human agency — and, if so, what such agency entails.
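One way to render the notion of conduct "reflective of human agency" more tangible is a schematic sketch of a gating arrangement in which a system may only ever recommend and an identified natural person must record an informed decision before any effect occurs. The following Python sketch is purely illustrative; in particular, its mapping of intent, knowledge, and causal control onto simple data fields is a deliberate oversimplification made only to mark out the structure of the idea.

```python
# Hypothetical sketch of a "human agency" gate, on the assumption that the
# system may only ever recommend, never actuate. Mapping legal concepts
# (intent, knowledge, causal control) onto boolean fields is, of course, a
# gross simplification made purely for illustration.
from dataclasses import dataclass

@dataclass
class Recommendation:
    action: str          # e.g., "engage", "detain", "release"
    rationale: str       # machine-generated explanation shown to the operator
    confidence: float    # displayed to the operator; not dispositive

@dataclass
class HumanDecision:
    operator_id: str             # an identifiable natural person
    understood_rationale: bool   # crude stand-in for knowledge
    intends_outcome: bool        # crude stand-in for intent/foreseeability
    approved: bool               # the causally controlling choice

def actuate(rec: Recommendation, decision: HumanDecision) -> bool:
    """Permit conduct only where a natural person exercised agency."""
    if not (decision.understood_rationale and decision.intends_outcome):
        return False   # knowledge or intent not established: no conduct
    return decision.approved
```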

A second set of issues concerns whether natural persons are or are not the only valid agents for the exercise and implementation of international law.[30] One way to formulate this inquiry is to ask whether it is, or at least ought to be, presupposed that the (at least primary) exercise and implementation of international law — including by legal entities, such as States, international organizations, and courts — may be reposed only in natural persons or whether, alternatively, some or all of those responsibilities may be reposed partly or wholly in artificial agents.

A third set of issues concerns the diverse array of evaluative decisions and normative (or value) judgments mandated in international law applicable in relation to armed conflict.[31] Compliance with numerous principles, rules, and standards requires such assessments. Examples include provisions relating to:

  • The prohibition of the destruction or seizure of the enemy’s property unless such destruction or seizure is “imperatively demanded by the necessities of war”;[32]
  • The prohibition on attacks that may be expected to cause “incidental” loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof that would be “excessive” in relation to the “concrete and direct military advantage” anticipated;[33]
  • The obligation to assess during hostilities whether an injured fighter “refrains from any act of hostility” to become protected as “wounded and sick” in legal terms;[34] and
  • The authorization to order the internment or placing in assigned residence of protected persons only if the security of the Detaining Power “makes it absolutely necessary.”[35]
These and other[36] weighty and impactful assessments may be extremely difficult to make in specific instances.[37] Two questions may be distinguished. A first concerns whether, as a legal matter, it should be natural persons who make those decisions and judgments. Concerning that question, it may be warranted for international actors to elaborate positions on whether legally mandated evaluative decisions and normative (or value) judgments may be reposed only in one or more natural persons. A second question concerns establishing and validating the information on which these legally mandated decisions and judgments are based. With respect to that question, it may be warranted for international actors to elaborate positions on whether or not — and, if so, under what circumstances and subject to what conditions — reliance may be placed on relevant systems to partly or wholly establish or validate that information.

A fourth set of issues concerns the potential use of proxies for legally relevant characteristics.[38] The basic issue may be framed in terms of whether a legally relevant feature or quality may be ascribed to a person, place, or thing through the operation of a socio-technical system without (further) assessment by one or more natural persons. For example, under instruments governing military operations in armed conflicts, civilians shall not be the object of an attack unless and for such time as those civilians take a direct part in hostilities.[39] A legal question relates to whether or not a party may seek to formulate and adopt proxies — such as age, assigned gender, location, and presence of arms — as part of the process to detect which civilians are and are not taking a direct part in hostilities. Numerous other potential examples of using proxies for legally relevant characteristics may arise as well. In turn, it is possible to envisage the formulation and configuration — and, perhaps, even the evaluation — of those proxies through the use of a relevant socio-technical system. In light of these possibilities and the stakes for adopting such proxies, it may be warranted for international actors to elaborate positions concerning whether the use of proxies for legally relevant characteristics is permissible under international law applicable in relation to armed conflict. If the answer is yes, international actors may further decide to form views on the circumstances under which and the conditions subject to which a socio-technical system may or may not be involved in the formulation, configuration, validation, or evaluation of those proxies.
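To illustrate how readily such proxies might be operationalized, and thus why positions on their permissibility may be warranted, consider the following deliberately crude, hypothetical Python sketch. The attributes are invented and are emphatically not lawful targeting criteria; the point is that a rule of this kind ascribes a legally relevant quality without further assessment by a natural person.

```python
# Hypothetical sketch of the proxy problem. The attributes below track the
# proxies mentioned in the text (presence of arms, location, age); they are
# invented illustrations, not lawful criteria. The rule ascribes a legally
# relevant quality, direct participation in hostilities, without further
# assessment by a natural person.

def proxy_flags_direct_participation(person: dict) -> bool:
    return (
        person.get("carrying_arms", False)
        and person.get("within_contested_area", False)
        and 18 <= person.get("estimated_age", 0) <= 45
    )
```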

A fifth set of issues concerns the preconditions necessary to assess and implement international legal responsibility. The core issue may be expressed in terms of whether international actors ought to commit, with respect to acts and omissions relating to armed conflict, to engaging only in conduct that is susceptible to legal assessment and the implementation of responsibility in case of violations. International actors are already bound to comply with applicable law, and the capability to make a legal assessment regarding relevant conduct is arguably a precondition for compliance. A further commitment may arguably be compelled as well: namely, a pledge to engage only in armed-conflict-related conduct that is at least facilitative of attribution, discernibility, and scrutiny of relevant conduct, including by actors not involved in the conduct.[40] A rationale in support of that approach is that such a commitment may help to ensure that it is possible for relevant natural persons to sufficiently trace, understand, and review, in a legal sense, a relevant system’s components, performance, and effects after the fact.
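A hypothetical sketch may help illustrate what conduct that is "facilitative of attribution, discernibility, and scrutiny" could entail at a technical level. In the following illustration, each system decision is recorded together with the inputs, the model identity, the output, and the responsible operator, and the records are hash-chained so that after-the-fact reviewers can detect missing or altered entries. All field names are invented for this sketch.

```python
# Hypothetical sketch of record-keeping facilitative of attribution,
# discernibility, and scrutiny: every decision is logged with the inputs,
# model identity, output, and responsible operator, and records are
# hash-chained so tampering can be detected after the fact. Field names are
# invented for illustration.
import hashlib
import json
import time

def append_audit_record(log: list, model_version: str, inputs: dict,
                        output: str, operator_id: str) -> None:
    prev_hash = log[-1]["hash"] if log else "0" * 64
    record = {
        "timestamp": time.time(),
        "model_version": model_version,   # which algorithm acted
        "inputs": inputs,                 # what it acted on
        "output": output,                 # what it did
        "operator_id": operator_id,       # who is answerable
        "prev_hash": prev_hash,           # link to the prior record
    }
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    log.append(record)
```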

Finally, a sixth set of issues concerns what forms and manifestations of socio-technical systems may require additional legal review. Already, High Contracting Parties (HCPs) to Additional Protocol I of 1977 to the four Geneva Conventions of 1949 are obliged to conduct legal reviews with respect to weapons, means, and methods of warfare.[41] Certain socio-technical systems, once initiated, may engage in acts and omissions that the natural person responsible for the conduct of the system cannot reliably foresee, administer during the operation, or review after the fact (or some combination thereof). One concern is that a system may pass a legal review on the basis of certain assumptions, yet those assumptions may later change to a degree that warrants an additional legal review. It may therefore be useful for international actors to elaborate — with greater precision[42] — what forms and manifestations of socio-technical systems, adopted in relation to which circumstances of use and subject to what conditions, mandate additional legal review. Given the wide variety of possible application areas of socio-technical systems, it may be advisable to set out positions regarding triggers for additional legal review not only for weapons, means, and methods of warfare but for any relevant armed-conflict-related conduct involving such systems.
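To illustrate one way the trigger question might be operationalized, consider a hypothetical sketch in which a prior legal review records the exact model artifact and the operating assumptions under which the review was conducted, and any mismatch flags the system for an additional review before further use. All fields and thresholds below are invented for illustration.

```python
# Hypothetical sketch of a re-review trigger, assuming a prior legal review
# recorded the reviewed model artifact (by hash) and the operating
# assumptions under which the review was conducted. Any mismatch flags the
# system for an additional legal review. All fields are illustrative.
from dataclasses import dataclass

@dataclass
class ReviewedConfiguration:
    model_sha256: str                 # hash of the reviewed model artifact
    max_engagement_range_km: float    # reviewed operating envelope
    approved_environments: frozenset  # e.g., frozenset({"open_sea"})

def requires_new_legal_review(reviewed: ReviewedConfiguration,
                              deployed_sha256: str,
                              planned_range_km: float,
                              environment: str) -> bool:
    return (
        deployed_sha256 != reviewed.model_sha256          # model changed
        or planned_range_km > reviewed.max_engagement_range_km
        or environment not in reviewed.approved_environments
    )
```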

2. Taking measures relative to one’s own conduct

A second pathway to secure respect for the law is for international actors to take measures relative to their own armed-conflict-related conduct involving relevant systems. To help illustrate this pathway, in this section, I outline a handful of such existing measures.

HCPs to relevant instruments are obliged to take measures necessary to suppress all acts contrary to the provisions contained therein.[43] Where relevant socio-technical systems are involved in armed-conflict-related conduct, performing this obligation presupposes that the HCP will obtain sufficient means and know-how to prevent violations. In light of the wide variety of potential applications of relevant technologies, to perform this set of obligations, it seems that an HCP may need to form positions on relevant legal issues; institute and maintain sufficient education, training, monitoring, and review mechanisms; and forgo engaging in conduct the legality of which cannot be sufficiently assured.

HCPs to relevant instruments are obliged to enact — and, as warranted, adjust — any legislation necessary to provide effective penal sanctions for persons committing, or ordering to be committed, any grave breach.[44] Grave breaches may be committed through the use of relevant socio-technical systems. For example, suppose that a commander employs machine-learning techniques to help decide what to target, and the operation results in extensive destruction of property that is not justified by military necessity and that is carried out unlawfully and wantonly.[45] To perform the set of obligations concerning enactment of penal-sanctions legislation, an HCP arguably must (among other things) determine whether its existing legislation is or is not sufficiently facilitative of the search for, prosecution of, or extradition of alleged perpetrators of grave breaches where those alleged breaches were committed, at least in part, through the use of a relevant socio-technical system. Notably, it may be advisable for each HCP to assess whether or not its existing legislation sufficiently takes into account possible legal issues pertaining to (among other topics) the requisite mental elements[46] and modes of responsibility concerning grave breaches[47] allegedly committed through the use of relevant socio-technical systems.[48]

HCPs to relevant instruments are obliged to prosecute or extradite alleged perpetrators of grave breaches.[49] As noted above, grave breaches may be committed through the use of relevant socio-technical systems.[50] To perform the set of obligations related to prosecuting or extraditing alleged perpetrators, an HCP arguably must (among other measures) provide relevant personnel with the training, knowledge, and facilities necessary to undertake these tasks. In short, it may be warranted to ensure that natural persons tasked with instituting and adjudicating allegations of grave breaches have the technical training and means necessary to ascribe, discern, and scrutinize relevant conduct. Where relevant conduct was undertaken through a socio-technical system reliant upon war algorithms and data, it may be necessary to ensure that relevant natural persons have the knowledge and means required to piece together the inputs, functions, dependencies, and outputs of the computational components adopted, and by whom, in relation to each relevant circumstance of use. In other words, the people responsible for prosecuting or extraditing persons alleged to have committed grave breaches would arguably need to be in a position to undertake after-the-fact tracing and review of the relevant algorithmic and data-reliant elements adopted in relation to each relevant circumstance of use. Without such resources, an HCP may not be in a position to effectively prosecute or extradite alleged perpetrators of grave breaches committed through relevant socio-technical systems.
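By way of illustration of that after-the-fact tracing capability, and assuming records were kept in the hash-chained form sketched in section 1, an investigator might first verify that an audit trail is intact before relying on it to reconstruct which computational components were adopted, by whom, and with what inputs and outputs. The following sketch is again purely illustrative.

```python
# Hypothetical sketch of an after-the-fact integrity check over the
# hash-chained audit records sketched in section 1: before relying on a
# trail to reconstruct conduct, a reviewer verifies that no record is
# missing or altered. Purely illustrative.
import hashlib
import json

def audit_trail_intact(log: list) -> bool:
    prev_hash = "0" * 64
    for record in log:
        body = {k: v for k, v in record.items() if k != "hash"}
        if record["prev_hash"] != prev_hash:
            return False   # chain broken: records missing or reordered
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if recomputed != record["hash"]:
            return False   # record altered after the fact
        prev_hash = record["hash"]
    return True
```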

Notably, an HCP may need to obtain and maintain the necessary resources to prosecute or extradite alleged perpetrators even where the HCP has not itself adopted such socio-technical systems. That is because the obligation to prosecute or extradite alleged perpetrators of grave breaches extends to all relevant allegations, irrespective of the nationality of the person alleged to have committed the breach. Seen in this light, the obligation to prosecute or extradite alleged perpetrators of grave breaches may compel two commitments. A first is for the HCP to exercise ongoing vigilance regarding the adoption of new socio-technical systems by others. And a second is a corresponding commitment by the HCP to obtain and maintain sufficient capabilities — from investigators to prosecutors to adjudicators — to prosecute or extradite alleged perpetrators whose conduct was undertaken through a relevant system’s employment.

Several instruments lay down provisions on measures relative to Protecting Powers or substitutes.[51] At least in theory, the use of relevant socio-technical systems in armed conflict may implicate several of the tasks entrusted to Protecting Powers or substitutes. For example, the use of machine-learning techniques adopted in relation to penal proceedings against a protected person in occupied territory may raise issues concerning whether the receiving State is in a position to provide the relevant notifications in a manner that is susceptible to sufficient scrutiny by the Protecting Power and the Power of Origin.[52] As another example, Protecting Powers are obliged to lend good offices in certain circumstances to settle a disagreement on the application or interpretation of applicable law.[53] The use of a relevant system may give rise to such a dispute. Satisfactorily performing these and similar tasks may turn partly on fulfilling three sets of commitments by relevant HCPs. A first is to ensure that any of the HCP’s own conduct involving the use of a relevant socio-technical system is facilitative of the Protecting Power’s or substitute’s tasks. A second commitment is to ensure, where the HCP is acting as a Protecting Power (perhaps especially where it performs those tasks relative to another HCP that possesses advanced technical capabilities), that the HCP itself possesses the knowledge, facilities, and resources necessary to perform its tasks as a Protecting Power. And a third is to ensure that potential substitutes for Protecting Powers, such as the International Committee of the Red Cross, have the knowledge, facilities, and resources necessary to perform the substitute’s tasks.

At least HCPs to Additional Protocol I of 1977 are obliged to make legal advisers available to the armed forces when necessary.[54] The potential or actual use of relevant socio-technical systems in armed conflicts may warrant new or adjusted forms of education, training, and supervision of legal advisers spanning diverse thematic areas. Those areas may potentially include (among others) the conduct of hostilities, restrictions on liberty, maritime systems, and humanitarian services.[55]

HCPs to Additional Protocol I of 1977 are obliged to include the study of applicable international law, if possible, in civil-instruction programs, so that the principles thereof may become known to the entire population.[56] The use of algorithmic and data-reliant socio-technical systems in armed conflicts may entail significant implications concerning the formulation, interpretation, and application of those principles. To perform this set of obligations, an HCP may decide to (among other measures) inventory existing civil-instruction programs and, as warranted, adjust them to reflect developments arising in respect of the potential or actual use of relevant systems in armed conflicts.

HCPs to relevant instruments are obliged, if their legislation is not already adequate, to take measures necessary for the prevention and repression, at all times, of abuses of the emblem.[57] Abuses of the emblem may involve the use of socio-technical systems reliant upon war algorithms and data. For example, a machine-learning system might recommend that a commander engage in a perfidious use of the emblem to gain a military advantage.[58] It may be warranted for HCPs to take steps to comprehensively understand whether — and, if so, how — potential or actual employments of socio-technical systems in armed conflict may result in abuses of the emblem and take corresponding legislative measures to prevent and repress such abuses.

As a final example of measures relative to one’s own conduct, HCPs to relevant instruments are obliged to ensure respect for and protection of fixed establishments and mobile medical units of the medical service, medical and religious personnel, medical transports, and hospital ships and their crews.[59] The use of socio-technical systems reliant upon war algorithms and data may entail implications relative to ensuring respect for and protection of these people and objects. For example, in an international armed conflict, it is prohibited for a party to employ an uninhabited maritime system that attacks military hospital ships or hospital ships utilized by National Red Cross Societies, officially recognized relief societies, or private persons of neutral countries.[60] It may be warranted for HCPs to take steps to comprehensively understand whether — and, if so, how — potential or actual employments of socio-technical systems in armed conflicts may result in a lack of respect for or protection of fixed establishments and mobile medical units of the medical service, medical and religious personnel, medical transports, and hospital ships and their crews. As a corollary, it may be advisable for HCPs to formulate and adopt new or extended measures to ensure such respect and protection.

3. Taking measures relative to the conduct of others

In this section, I outline some of the existing measures that international actors may take to help secure respect for the law by others engaging in conduct of concern that involves relevant systems.[61] The actions — be they of a positive or negative character — may differ depending on the implicated entity or person and the circumstances.[62] In general, to satisfactorily take the steps set out below, it may be warranted for international actors, perhaps especially States and international organizations, to commit to instituting ongoing cross-disciplinary education and training as well as the provision of relevant technical facilities and resources. In particular, it may be prudent to ensure the knowledge, facilities, and resources necessary to reliably attribute, discern, and scrutinize conduct underlying alleged violations by others.

States and international organizations may use a diplomatic dialogue to address questions of compliance or exert diplomatic pressure through confidential protests or public denunciations.[63] To address matters of compliance with another international actor, a State or an international organization arguably needs to possess the knowledge, facilities, and resources necessary to reliably attribute, discern, and scrutinize the other actor’s conduct.

States and other international actors may offer legal assistance or support legal assistance provided by others, such as through instruction and training.[64] To provide such assistance, those offering the assistance presumably need to possess the knowledge, facilities, and resources necessary to discern the applicable law, anticipate relevant conduct, and set out the applicable law as it pertains to that conduct. For example, suppose a party to an armed conflict uses machine-learning techniques to nominate targets. Arguably, a precondition to effectively provide legal assistance to that party is a sufficient grasp of the relevant computational components, performance, and effects of the adopted socio-technical system.

As noted above, to repress violations, States may resort to and otherwise support penal measures, including with respect to grave breaches.[65] Arguably, a precondition to effectively resorting to or otherwise supporting criminal sanctions is that legislation is sufficiently facilitative of an investigation and prosecution of alleged perpetrators of violations, including where those alleged violations were committed, at least in part, through the use of a relevant socio-technical system.[66] Effective resort to penal measures is also arguably preconditioned on providing relevant organs with the training, knowledge, and facilities necessary to investigate and prosecute alleged perpetrators.[67] Notably, it may be prudent for a State to obtain and maintain the resources required to prosecute alleged perpetrators even where the State has not itself adopted such socio-technical systems. Seen in this light, the resort to penal measures against alleged perpetrators unconnected to the State may compel two commitments by the State seeking to resort to criminal sanctions. A first is for a State that may resort to penal measures to exercise ongoing vigilance regarding the adoption of new socio-technical systems by others, especially international actors that possess advanced technical capabilities. And a second is a corresponding commitment by the State to obtain and maintain sufficient capabilities — across all relevant organs and personnel — to prosecute alleged perpetrators whose conduct involves a relevant socio-technical system.

An HCP to a relevant instrument that is also party to the armed conflict may request that an inquiry be instituted concerning any alleged violation of the treaty.[68] To request and institute an inquiry, an HCP arguably needs to possess the knowledge, facilities, and resources necessary to (among other tasks) reliably attribute, discern, and scrutinize the conduct of the allegedly violative party.

At least HCPs to Additional Protocol I of 1977 may recognize the International Fact-Finding Commission’s competence to inquire into allegations by relevant other parties or to otherwise request such an inquiry.[69] Notably, the personnel of both the HCP requesting such an inquiry and the IFFC need access to sufficient education, training, facilities, and resources. For example, suppose that the IFFC is requested to inquire into an attack that involved targeting recommendations from a machine-learning system. To satisfactorily undertake that inquiry, the IFFC arguably must possess the technical knowledge, resources, and facilities necessary to (among other tasks) attribute the attack, discern relevant conduct, and scrutinize that conduct in light of the applicable law.

States and international organizations may condition, limit, or refuse arms transfers.[70] To do so with respect to arms that involve a socio-technical system reliant upon war algorithms and data, a State or an international organization arguably needs to possess the knowledge, facilities, and resources necessary to (among other tasks) discern whether the anticipated use of the arms by the other actor may constitute a violation.

States and international organizations may refer, where possible, a specific issue to a competent body, such as the International Court of Justice, for the settlement of disputes.[71] The use of a relevant socio-technical system in an armed conflict may give rise to such a dispute. A hypothetical example is a dispute among parties to relevant instruments as to whether or not the normative judgments mandated in those instruments may be reposed only in natural persons.[72] Notably, not only the State or international organization making the referral but also the (personnel of the) competent body may need access to relevant education, training, facilities, and resources.

Monitoring the fate of armed-conflict-related detainees transferred to another State might also be implicated by the use of a relevant socio-technical system.[73] For example, the receiving State might depend in part on the use of a machine-learning system to assess whether the person represents a sufficient security threat to warrant continued restrictions on liberty.[74] To monitor the fate of transferred detainees and, if necessary, to exercise its influence to help ensure observance of applicable law by the receiving State, an HCP arguably must possess the knowledge, facilities, and resources necessary to (among other tasks) reliably attribute, discern, and scrutinize the conduct of the receiving State pertaining to the detainee, including where such conduct implicates the use of a relevant socio-technical system.
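A deliberately crude, hypothetical sketch may illustrate the kind of data-reliant assessment at issue and, correspondingly, what the transferring State would need to be in a position to scrutinize. The feature, score, and threshold below are invented.

```python
# Hypothetical sketch of a data-reliant detention assessment: a statistical
# risk score is compared against a numeric cutoff to recommend continued
# internment. The score and threshold are invented; the sketch only marks
# out what a transferring State would need to be able to scrutinize
# (inputs, model, threshold, and override practice).

def recommend_continued_internment(risk_score: float,
                                   threshold: float = 0.7) -> bool:
    # Note that the evaluative judgment, whether security "makes it
    # absolutely necessary," is here collapsed into a single comparison.
    return risk_score >= threshold
```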

As a final set of example measures, States, international organizations, and non-state parties may take steps concerning armed-conflict-related partnerships involving relevant systems. Consider some examples. International actors may condition joint operations involving a relevant system on a partner’s compliance with applicable law.[75] Military partners may plan operations involving a relevant system jointly to prevent violations.[76] Further, military partners may intervene directly with commanders in case of violations involving a relevant system, such as an imminent unlawful attack against civilians by a coalition partner.[77] A military partner may also opt out of a specific operation involving a relevant system if there is an expectation — based on the facts or knowledge of past patterns — that the operation would violate applicable law.[78] These measures highlight how partnerships may implicate particular interests and concerns relative to attributing, discerning, and scrutinizing the relevant conduct of others as well as the relationship between one’s own conduct and that of others.

Conclusion

The already broad and rapidly growing set of applications for socio-technical systems reliant upon war algorithms and data spans the conduct of hostilities, restrictions on liberty, humanitarian services, maritime systems, legal advice, and logistics. In this commentary, I have set out three courses that international actors may pursue to secure greater respect for international law in this area: form and publicly express positions on critical legal issues, take measures relative to their own armed-conflict-related conduct, and take steps relative to such conduct of others.

In light of the issues and examples set out above, respect for the law in this area arguably turns in no small part on whether natural persons can and will sufficiently foresee, understand, administer, and trace the components, behaviors, and effects of relevant systems. It may be advisable to institute ongoing cross-disciplinary education and training as well as the provision of sufficient technical facilities for all relevant actors — from commanders to legal advisers to prosecutors to judges. It may also be prudent to establish ongoing monitoring of others’ technical capabilities. Finally, it may be warranted for relevant international actors to pledge to engage, and to call upon others to engage, only in armed-conflict-related conduct that is sufficiently attributable, discernible, and scrutable.

Annex: Summary of Example Measures

Pathway 1

An international actor may form and publicly express positions on key legal issues arising in respect of relevant systems, including:
  1. Whether, as a legal matter, armed-conflict-related conduct ought to be reflective of human agency;
  2. Whether, as a legal matter, armed-conflict-related conduct may be considered reflective of human agency only if the conduct is subject to the exercise by one or more natural persons of intent (or, at least, foreseeability), knowledge, and causal control in respect of the conduct;
  3. Whether it is, or at least ought to be, presupposed that the (primary) exercise and implementation of international law — including by legal entities, such as States, international organizations, and courts — may be reposed only in natural persons or whether, alternatively, those responsibilities may be reposed partly or wholly in artificial agents;
  4. Whether legally mandated evaluative decisions and normative (or value) judgments may be reposed only in one or more natural persons;
  5. Whether — and, if so, under what circumstances and subject to what conditions — reliance may be placed on relevant systems to partly or wholly establish or validate the information upon which legally mandated evaluative decisions and normative (or value) judgments are made;
  6. Whether or not the use of proxies for legally relevant characteristics is permissible under international law applicable in relation to armed conflict;
  7. If the use of proxies for legally relevant characteristics is permissible, under what circumstances and subject to what conditions may a relevant system be involved in the formulation, configuration, validation, or evaluation of such proxies;
  8. Whether, as a legal matter, international actors ought to pledge to engage, and call upon others to commit to engage, only in armed-conflict-related conduct that is at least facilitative of attribution, discernibility, and scrutiny of conduct involving a relevant system, including by actors not involved in the conduct; and
  9. What forms and manifestations of relevant systems, adopted in relation to which circumstances of use and subject to what conditions, mandate additional legal review.

Pathway 2

An international actor may take measures relative to its own conduct involving a relevant system, including the measures necessary:
  1. To suppress all acts contrary to relevant binding instruments;
  2. To enact — and, as warranted, adjust — any legislation necessary to provide effective penal sanctions for persons committing, or ordering to be committed, any grave breach;
  3. To prosecute or extradite alleged perpetrators of grave breaches;
  4. To comply with and facilitate respect for applicable law relative to Protecting Powers or substitutes;
  5. To make legal advisers available to the armed forces;
  6. To include the study of applicable international law in civil-instruction programs, so that the principles thereof may become known to the entire population;
  7. For the prevention and repression of abuses of the emblem; and
  8. To ensure respect for and protection of fixed establishments and mobile medical units of the medical service, medical and religious personnel, medical transports, and hospital ships and their crews.

Pathway 3

An international actor may take measures relative to the conduct of others involving a relevant system, including the measures necessary:
  1. To use a diplomatic dialogue to address questions of compliance or exert diplomatic pressure through confidential protests or public denunciations;
  2. To offer legal assistance or support legal assistance provided by others;
  3. To resort to and otherwise support penal measures;
  4. To request that an inquiry be instituted concerning any alleged violation of a relevant instrument;
  5. For the International Fact-Finding Commission to inquire into allegations;
  6. To condition, limit, or refuse arms transfers;
  7. To refer, where possible, a specific issue to a competent body for the settlement of disputes; and
  8. To monitor the fate of armed-conflict-related detainees transferred to another State.
Concerning armed-conflict-related partnerships in particular, international actors may take the measures necessary:
  1. To condition joint operations involving a relevant system on a partner’s compliance with applicable law;
  2. To plan operations involving a relevant system jointly to prevent violations;
  3. To intervene directly with commanders in case of violations involving a relevant system; and
  4. To opt out of a specific operation involving a relevant system if there is an expectation that the operation would violate applicable law.

[1] See International Committee of the Red Cross, Autonomy, artificial intelligence and robotics: Technical aspects of human control (Aug. 2019), https://www.icrc.org/en/download/file/102852/autonomy_artificial_intelligence_and_robotics.pdf [https://perma.cc/M5UY-REHF].

[2] See, e.g., Nathan Strout, Inside the Army’s futuristic test of its battlefield artificial intelligence in the desert, C4ISRNET (Sept. 25, 2020), https://www.c4isrnet.com/artificial-intelligence/2020/09/25/the-army-just-conducted-a-massive-test-of-its-battlefield-artificial-intelligence-in-the-desert/ [https://perma.cc/8DDP-KANG]; Elsa Kania, “AI Weapons” in China’s Military Innovation, Brookings Inst. (Apr. 2020), https://www.brookings.edu/wp-content/uploads/2020/04/FP_20200427_ai_weapons_kania_v2.pdf [https://perma.cc/VT47-RBB5]; Merel Ekelhof and Giacomo Persi Paoli, Swarm Robotics: Technical and Operational Overview of the Next Generation of Autonomous Systems, U.N. Inst. for Disarmament Res. (2020), https://unidir.org/sites/default/files/2020-04/UNIDIR_Swarms_SinglePages_web.pdf [https://perma.cc/MY9A-JLZ5]; Kelley M. Sayler, Artificial Intelligence and National Security, Congressional Res. Service, Report No. R45178 (Nov. 21, 2019), https://fas.org/sgp/crs/natsec/R45178.pdf [https://perma.cc/6KKC-KLSJ]; The Weaponization of Increasingly Autonomous Technologies: Artificial Intelligence – a primer for CCW delegates, U.N. Inst. for Disarmament Res., Paper No. 8 (2018), https://unidir.org/publication/weaponization-increasingly-autonomous-technologies-artificial-intelligence [https://perma.cc/85GZ-95D3]; Kelsey Reichmann, Can artificial intelligence improve aerial dogfighting?, C4ISRNET (June 7, 2019), https://www.c4isrnet.com/artificial-intelligence/2019/06/07/can-artificial-intelligence-improve-aerial-dogfighting/ [https://perma.cc/TDP8-PVB3]; Paul Scharre, Army of None 27–56 (2018); Vincent Boulanin and Maaike Verbruggen, Mapping the Development of Autonomy in Weapons Systems, Stockholm Int’l Peace Res. Inst. (2017), https://www.sipri.org/sites/default/files/2017-11/siprireport_mapping_the_development_of_autonomy_in_weapon_systems_1117_1.pdf [https://perma.cc/XUT7-27AT].

[3] See, e.g., Dan Gettinger and Arthur Holland Michel, Loitering Munitions, Ctr. for the Study of the Drone (Feb. 2, 2017), https://dronecenter.bard.edu/files/2017/02/CSD-Loitering-Munitions.pdf [https://perma.cc/PUK9-9LN6].

[4] Possible antecedent technologies include algorithmic filtering of data and statistically based risk assessments initially created for domestic policing and criminal-law settings. Potential applications in armed conflict might include such things as prioritizing military patrols, assessing levels and kinds of threats purportedly posed by individuals or groups, and determining who should be held and when someone should be released. For example, authorities in Israel have reportedly used algorithms as part of attempts to obviate anticipated attacks by Palestinians through a process that involves the filtering of social-media data, resulting in over 200 arrests. Israel claims 200 attacks predicted, prevented with data tech, CBS News (June 12, 2018), https://www.cbsnews.com/news/israel-data-algorithms-predict-terrorism-palestinians-privacy-civil-liberties/ [https://perma.cc/54MJ-PMX2]. See generally Dustin A. Lewis, AI and Machine Learning Symposium: Why Detention, Humanitarian Services, Maritime Systems, and Legal Advice Merit Greater Attention, Opinio Juris (Apr. 28, 2020), http://opiniojuris.org/2020/04/28/ai-and-machine-learning-symposium-ai-in-armed-conflict-why-detention-humanitarian-services-maritime-systems-and-legal-advice-merit-greater-attention/ [https://perma.cc/9TNW-M4FQ]; Tess Bridgeman, The viability of data-reliant predictive systems in armed conflict detention, ICRC Humanitarian L. & Policy Blog (Apr. 8, 2019), https://blogs.icrc.org/law-and-policy/2019/04/08/viability-data-reliant-predictive-systems-armed-conflict-detention/ [https://perma.cc/BG3Z-326P]; Ashley Deeks, Detaining by algorithm, ICRC Humanitarian L. & Policy Blog (Mar. 25, 2019), https://blogs.icrc.org/law-and-policy/2019/03/25/detaining-by-algorithm/ [https://perma.cc/G7BG-4D7D]; Ashley S. Deeks, Predicting Enemies, 104 Virginia L. Rev. 1529 (2018).

[5] For example, provision of humanitarian services in armed conflict may rely in some contexts on relevant socio-technical systems. Applications might include, for example, predictive-mapping technologies used to inform populations of outbreaks of violence, track movements of armed actors, predict population movements, and prioritize response resources. See Lewis, AI and Machine Learning, above note 4; U.N. High Commissioner for Refugees Innovation Services, The Jetson Story (undated), https://jetson.unhcr.org/story.html [https://perma.cc/6V4P-T3ZZ]; Nat Manning, Keeping the Peace - The UN Department of Field Service’s and Peacekeeping Operations use of Ushahidi, Ushahidi Blog (Aug. 8, 2018), https://www.ushahidi.com/blog/2018/08/08/keeping-the-peace-the-un-department-of-field-services-and-peacekeeping-operations-use-of-ushahidi [https://perma.cc/EE4V-HS3D]. See also Allard Duursma & John Karlsrud, Predictive Peacekeeping: Strengthening Predictive Analysis in UN Peace Operations, 8 Stability: Int’l J. Sec. & Dev. 1 (2019). In addition, unmanned aerial vehicles (UAVs, a.k.a. drones) have been used, or at least contemplated for use, in relation to increasing situational awareness through mapping and monitoring, to delivering supplies, and to search-and-rescue operations. Aaron Boyd, The Pentagon Wants AI-Driven Drone Swarms for Search and Rescue Ops, Nextgov (Dec. 26, 2019), https://www.nextgov.com/emerging-tech/2019/12/pentagon-wants-ai-driven-drone-swarms-search-and-rescue-ops/162113/ [https://perma.cc/4QKN-H5LW]; Denise Soesilo, Patrick Meier, Audrey Lessard-Fontaine, Jessica Du Plessis & Christina Stuhlberger, Drones in Humanitarian Action: A guide to the use of airborne systems in humanitarian crises (2016), https://irevolution.files.wordpress.com/2011/07/drones-in-humanitarian-actionemail.pdf [https://perma.cc/BU3N-9M8Q]. For a recently developed framework concerning ethical use of certain data-science methods, see Kate Dodgson, Prithvi Hirani, Rob Trigwell, and Gretchen Bueermann, A Framework for the Ethical Use of Advanced Data Science Methods in the Humanitarian Sector, Data Sci. & Ethics Group (Apr. 2020), https://www.hum-dseg.org/sites/default/files/2020-10/Framework%20for%20the%20ethical%20use.pdf [https://perma.cc/RZK7-2WDQ].

[6] See, e.g., Hitoshi Nasu and David Letts, The Legal Characterization of Lethal Autonomous Maritime Systems: Warship, Torpedo, or Naval Mine?, 96 Int’l L. Stud. 79 (2020); Harry Lye, Royal Navy to Begin Unmanned Minehunting Operations, Naval Technology (Jan. 14, 2020), https://www.naval-technology.com/news/royal-navy-to-begin-unmanned-minehunting-operations/ [https://perma.cc/8HPB-H5VH]; Natalie Klein, Maritime Autonomous Vehicles within the International Law Framework to Enhance Maritime Security, 95 Int’l L. Stud. 244 (2019); Robert Veal, Michael Tsimplis, and Andrew Serdy, The Legal Status and Operation of Unmanned Maritime Vehicles, 50 Ocean Dev. & Int’l Law 23 (2019); Liu Xuanzun, China launches world-leading unmanned warship, Global Times (Aug. 22, 2019), https://www.globaltimes.cn/content/1162320.shtml [https://perma.cc/L6F7-XKPS]; Kyle Mizokami, The U.S. Navy Just Got the World’s Largest Uncrewed Ship, Popular Mechanics (Feb. 5, 2018), https://www.popularmechanics.com/military/navy-ships/a16573306/navy-accept-delivery-actuv-sea-hunter/ [https://perma.cc/G2JG-UV5X]; Jeffrey Lin and P.W. Singer, With the D3000, China enters the robotic warship arms race, Popular Science (Sept. 25, 2017), https://www.popsci.com/robotic-warship-arms-china-d3000/ [https://perma.cc/UUN4-DDYX]; Michael N. Schmitt and David S. Goddard, International law and the military use of unmanned maritime systems, 98 Int’l Rev. Red Cross 567 (2016).

[7] See Ashley Deeks, Coding the Law of Armed Conflict: First Steps, in The Law of Armed Conflict in 2040 (Matthew C. Waxman ed., forthcoming); Ashley Deeks, High-Tech International Law, 88 Geo. Wash. L. Rev. 574 (2020); Merel A.C. Ekelhof, The Distributed Conduct of War: Reframing Debates on Autonomous Weapons, Human Control and Legal Compliance in Targeting, PhD dissertation, Vrije Universiteit Amsterdam (Dec. 2019); Ashley Deeks, Noam Lubell, and Daragh Murray, Machine Learning, Artificial Intelligence, and the Use of Force by States, 10 J. Nat’l Security L. & Pol’y 1 (2019). More generally, see Kevin D. Ashley, Artificial Intelligence and Legal Analytics (2017).

[8] For example, in relation to the Gulf War of 1990–91, the United States employed a program aimed at increasing efficiencies in scheduling and making logistical arrangements for the transportation of supplies and personnel. See Marie Bienkowski, Demonstrating the operational feasibility of new technologies: the ARPI IFDs, 10(1) IEEE Expert 27, 28–29 (Feb. 1995).

[9] See generally Lucy Suchman, Configuration, in Inventive Methods (Celia Lury and Nina Wakeford eds., 2012). For an analysis in terms of the “technical layer,” the “socio-technical layer,” and the “governance layer” pertaining to autonomous weapons systems, see Ilse Verdiesen, Filippo Santoni de Sio, and Virginia Dignum, Accountability and Control Over Autonomous Weapon Systems: A Framework for Comprehensive Human Oversight, Minds and Machines (Aug. 2020). For an analysis of U.S. “drone operations” informed in part by methods relevant to socio-technical configurations, see M. C. Elish, Remote Split: A History of US Drone Operations and the Distributed Labor of War, 42 Science, Technology, & Human Values 1100 (2017).

[10] See Dustin A. Lewis, Gabriella Blum, and Naz K. Modirzadeh, War-Algorithm Accountability, Harv. L. Sch. Program on Int’l L. & Armed Conflict (Aug. 2016), vii, https://dash.harvard.edu/handle/1/28265262 [https://perma.cc/YJM2-X36H].

[11] See, e.g., Dustin A. Lewis, An enduring impasse on autonomous weapons, Just Security (Sept. 28, 2020), https://www.justsecurity.org/72610/an-enduring-impasse-on-autonomous-weapons/ [https://perma.cc/84LW-9JUW]. As explained in that post, at least a handful of States point to purported “humanitarian” benefits and potential for greater legal compliance from limited autonomy, or at least automation, in certain weapon systems. A basic idea underlying these positions is that socio-technical configurations underlying autonomy in weapon systems can, if prudently adopted and appropriately constrained, lead in concrete cases to better-informed decision-making and better outcomes from a legal-compliance perspective. States on the other side of the stalemate often frame their arguments by asserting that it is unethical and should be illegal to deploy combinations of sensors, data, algorithms, and machines that can kill humans without sufficient human oversight, intervention, control, or judgment; they contend that a consensus can be reached on minimal definitional elements to move to negotiations for a new treaty; and they question whether proponents’ arguments in favor of limited autonomy in weapons can be satisfactorily verified. Buttressed by calls for new law from a growing array of advocacy organizations (especially the Campaign to Stop Killer Robots), Austria, Brazil, and Chile have issued an appeal to negotiate a legally binding instrument that would codify “meaningful human control over critical functions in lethal autonomous weapon systems.” Proposal for a Mandate to Negotiate a Legally-binding Instrument that Addresses the Legal, Humanitarian and Ethical Concerns Posed by Emerging Technologies in the Area of Lethal Autonomous Weapons Systems (LAWS), Working Paper by Austria, Brazil and Chile to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), U.N. Doc. CCW/GGE.2/2018/WP.7 (Aug. 30, 2018), https://undocs.org/CCW/GGE.2/2018/WP.7 [https://perma.cc/59VD-VNP9]. Meanwhile, in the middle sits a not-insignificant number of countries that favor more detailed discussions but do not press for new legal instruments — at least not yet. In practice, these States prioritize seeking incremental progress to close the normative chasm while keeping the process within the GGE on LAWS.

[12] See Dustin A. Lewis, On “Responsible A.I.” in War: Exploring Preconditions for Respecting International Law in Armed Conflict, in Responsible A.I. (Silja Vöneky et al. eds., Freiburg Institute for Advanced Studies, forthcoming 2021); Dustin A. Lewis, Preconditions, in Autonomous Cyber Capabilities under International Law (Rain Liivoja and Ann Väljataga eds., NATO Cooperative Cyber Defence Centre of Excellence, forthcoming 2021).

[13] See International Legal and Policy Dimensions of War Algorithms: Enduring and Emerging Concerns, Harv. L. Sch. Program on Int’l L. & Armed Conflict (Nov. 2019), https://pilac.law.harvard.edu/international-legal-and-policy-dimensions-of-war-algorithms [https://perma.cc/KT75-KVWT].

[14] See Draft Report of the 2019 Session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Group of Governmental Experts of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, U.N. Doc. CCW/GGE.1/2019/CRP.1/Rev.2 (Aug. 21, 2019), https://undocs.org/CCW/GGE.1/2019/CRP.1/REV.2 [https://perma.cc/5S49-83A2]; Report of the 2018 Session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Group of Governmental Experts of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, U.N. Doc. CCW/GGE.1/2018/3 (Oct. 23, 2018), https://undocs.org/en/CCW/GGE.1/2018/3 [https://perma.cc/UU2L-3LB6]; Report of the 2017 Session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems (LAWS), Group of Governmental Experts of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, U.N. Doc. CCW/GGE.1/2017/3 (Dec. 22, 2017), https://undocs.org/CCW/GGE.1/2017/3 [https://perma.cc/D6JL-FUE7].

[15] See War-Algorithm Accountability, above note 10, at iii.

[16] See id. at 17. See also, e.g., Matthijs M. Maas, Innovation-Proof Global Governance for Military Artificial Intelligence?: How I Learned to Stop Worrying, and Love the Bot, 10 J. Int’l Humanitarian Legal Stud. 129 (2019).

[17] With respect to definitions or characteristics of relevant systems, see, e.g., LAWS and Human Control: Brazilian Proposals for Working Definitions, Working Paper by Brazil to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), U.N. Doc. CCW/GGE.1/2020/WP.4 (Aug. 19, 2020), https://undocs.org/CCW/GGE.1/2020/WP.4 [https://perma.cc/Y577-7YF8]; Working Definition, Statement by Ireland to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Aug. 29, 2018), http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/documents/29August_Proposal_Definition_Ireland.pdf [https://perma.cc/F7GU-RA6M]; Charles Trumbull, Characteristics of the Systems Under Consideration, Statement by the United States to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Aug. 28, 2018), https://geneva.usmission.gov/2018/08/29/statement-of-the-u-s-delegation-on-characteristics-of-the-systems-under-consideration/ [https://perma.cc/K8SU-T5NN]; Elements - Agenda item 6a) Characterization of the Systems Under Consideration in Order to Promote a Common Understanding on Concepts and Characteristics Relevant to the Objectives and Purposes of the Convention, Statement by Switzerland at the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Aug. 27–31, 2018), https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/statements/27August_Switzerland.pdf [https://perma.cc/HR8E-LMYD]; Working Definition of LAWS/“Definition of Systems Under Consideration,” Statement by Germany to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Apr. 9–13, 2018), https://www.unog.ch/80256EDD006B8954/(httpAssets)/2440CD1922B86091C12582720057898F/$file/2018_LAWS6a_Germany.pdf [https://perma.cc/46WZ-SGHH]; Russia’s Approaches to the Elaboration of a Working Definition and Basic Functions of Lethal Autonomous Weapons Systems in the Context of the Purposes and Objectives of the Convention, Working Paper by the Russian Federation to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems, U.N. Doc. CCW/GGE.1/2018/WP.6 (Apr. 4, 2018), https://www.unog.ch/80256EDD006B8954/(httpAssets)/FC3CD73A32598111C1258266002F6172/$file/CCW_GGE.1_2018_WP.6_E.pdf [https://perma.cc/LE8H-V6Q2]; Towards a Definition of Lethal Autonomous Weapons Systems, Working Paper by Belgium to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), U.N. Doc. CCW/GGE.1/2017/WP.3 (Nov. 7, 2017), https://undocs.org/ccw/gge.1/2017/WP.3 [https://perma.cc/FHY9-EHNH]; Towards a working definition – or “common characterisation” – of LAWS, Statement by New Zealand at the Third Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS) (Apr. 11–15, 2016), https://www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2016/meeting-experts-laws/statements/12April_NewZealand.pdf [https://perma.cc/7SXX-FR7C]; Towards a Working Definition of LAWS, Statement by Italy at the Third Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS) (Apr.
11–15, 2016), https://www.unog.ch/80256EDD006B8954/(httpAssets)/06A06080E6633257C1257F9B002BA3B9/$file/2016_LAWS_MX_towardsaworkingdefinition_statements_Italy.pdf [https://perma.cc/UU6U-7AYC]; Towards an Operational Definition of LAWS, Statement by France at the Third Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS) (Apr. 11–15, 2016) (French), https://www.unog.ch/80256EDD006B8954/(httpAssets)/BEC4CD0DFE278031C1257F9300580565/$file/2016_LAWS+MX_Towardaworkingdefinition_Statements_France.pdf [https://perma.cc/PJ3L-AYWG]; Working Towards a Definition of LAWS, Statement by the United Kingdom at the Third Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS) (Apr. 11–15, 2016), https://www.unog.ch/80256EDD006B8954/(httpAssets)/44E4700A0A8CED0EC1257F940053FE3B/$file/2016_LAWS+MX_Towardaworkingdefinition_Statements_United+Kindgom.pdf [https://perma.cc/A4YL-AYBG]; Towards a Working Definition of LAWS, Statement by Poland at the Third Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS) (Apr. 11–15, 2016), https://www.unog.ch/80256EDD006B8954/(httpAssets)/400223F5850705E2C1257F9B002C008E/$file/2016_LAWS_MX_towardsaworkingdefinition_statements_Poland.pdf [https://perma.cc/YKQ9-2LZK]; Michael Siegrist, Legal Officer for International Humanitarian Law and International Criminal Justice, Directorate of International Law (DIL), A Purpose-Oriented Working Definition for Autonomous Weapons Systems, Statement by Switzerland at the Third Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS) (Apr. 13, 2016), https://www.unog.ch/80256EDD006B8954/(httpAssets)/558F0762F97E8064C1257F9B0051970A/$file/2016.04.13+LAWS+Legal+Session+(as+read).pdf [https://perma.cc/ZJ9V-R4FH]; Usman Jadoon, Counsellor, Towards a Working Definition of LAWS, Statement by Pakistan at the Third Informal Meeting of Experts on Lethal Autonomous Weapon Systems (LAWS) (Apr. 12, 2016), https://www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2016/meeting-experts-laws/statements/12April_Pakistan.pdf [https://perma.cc/8P43-Y5WK].

[18] See above note 11.

[19] See, e.g., Questionnaire on the Legal Review Mechanisms of New Weapons, Means and Methods of Warfare, Working Paper by Argentina to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), U.N. Doc. CCW/GGE.1/2019/WP.6 (Mar. 29, 2019), https://www.unog.ch/80256EDD006B8954/(httpAssets)/52C72D09DCA60B8BC125841E003579D8/$file/CCW_GGE.1_2019_WP.6.pdf [https://perma.cc/7UVP-9YFV]; The Australian Article 36 Review Process, Working Paper by Australia to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), U.N. Doc. CCW/GGE.2/2018/WP.6 (Aug. 30, 2018), https://www.unog.ch/80256EDD006B8954/(httpAssets)/46CA9DABE945FDF9C12582FE00380420/$file/2018_GGE+LAWS_August_Working+paper_Australia.pdf [https://perma.cc/3PGS-UFWG]; Strengthening of the Review Mechanisms of a New Weapon, Means or Methods of Warfare, Working Paper by Argentina to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), U.N. Doc. CCW/GGE.1/2018/WP.2 (Apr. 4, 2018), http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/documents/GGE.1-WP2-English.pdf [https://perma.cc/XYL2-V9H9]; Weapons Review Mechanisms, Working Paper by the Netherlands and Switzerland to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), U.N. Doc. CCW/GGE.1/2017/WP.5 (Nov. 7, 2017), https://undocs.org/ccw/gge.1/2017/WP.5 [https://perma.cc/8VVM-XS52]; U.S. Department of Defense Response to Stockholm International Peace Research Institute (SIPRI) “questionnaire on Article 36 review process” (Sept. 2017), https://ogc.osd.mil/LoW/practice/DoDDocuments/sipri_questionnaire_on_article_36_review_process_usa_response_final.pdf [https://perma.cc/5CHL-S9CQ]; Implementation of Weapons Reviews Under Article 36 Additional Protocol I, Statement by Germany at the Third Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS) (Apr. 11–15, 2016), https://www.unog.ch/80256EDD006B8954/(httpAssets)/56540402E64EC6BEC1257F9A00437856/$file/2016_LAWS+MX_ChallengestoIHL_Statements_Germany.pdf [https://perma.cc/4EFG-LCEM]; The Belgian Commission for the Legal Review on New Weapons, Statement at the Third Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS) (Apr. 13, 2016), https://www.unog.ch/80256EDD006B8954/(httpAssets)/E561B679C0CD4C0DC1257F940052EFBF/$file/2016_LAWS+MX_ChallengestoIHL_Presentations_Belgian+Commission.pdf [https://perma.cc/CD6A-TZA7]; Michael Meier, Weapon Reviews, Statement by the United States at the Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS) (Apr. 13, 2016), https://www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2016/meeting-experts-laws/statements/13April_US.pdf [https://perma.cc/HJT9-4JUM].

[20] In the GGE on LAWS, debates on the law most frequently fall under three general categories: international humanitarian law/law of armed conflict (IHL/LOAC) rules on the conduct of hostilities, especially on distinction, proportionality, and precautions in attacks; reviews of weapons, means, and methods of warfare; and individual and State responsibility. Perhaps the most pivotal axis of the current debate, which touches on all three categories, concerns the desirability, or not, of developing and instantiating a concept of “meaningful human control” (or a similar formulation) over the use of force, including autonomy in configuring, nominating, and prioritizing targets and applying force to them. Alongside States (see below note 27), several organizations have striven to provide more information and greater specificity around what the law does or should require with respect to the “human element” debate. These include detailed considerations put forward by the International Committee of the Red Cross and the Stockholm International Peace Research Institute as well as the International Panel on the Regulation of Autonomous Weapons. See Vincent Boulanin, Neil Davison, Netta Goussac and Moa Peldán Carlsson, Limits on Autonomy in Weapon Systems, Stockholm Int’l Peace Research Inst. and Int’l Comm. of the Red Cross (June 2020), https://www.sipri.org/sites/default/files/2020-06/2006_limits_of_autonomy_0.pdf [https://perma.cc/WYN3-F5NF]; International Panel on the Regulation of Autonomous Weapons, Focus on Human Control, “Focus on” Report No. 5 (Aug. 2019), https://www.ipraw.org/wp-content/uploads/2019/08/2019-08-09_iPRAW_HumanControl.pdf [https://perma.cc/D45R-YPM8]. For its part, the U.N. Institute for Disarmament Research stands out as a trusted partner for several States in producing digestible insights on intersections between law, technologies, and operations. Recent examples include UNIDIR publications on targeting practices and robotic swarms, as well as on predictability and understandability in military applications of artificial intelligence. See Arthur Holland Michel, The Black Box, Unlocked: Predictability and Understandability in Military AI, U.N. Inst. for Disarmament Res. (2020), https://unidir.org/publication/black-box-unlocked [https://perma.cc/ECE7-LVAL]; Merel Ekelhof and Giacomo Persi Paoli, The Human Element in Decisions About the Use of Force, U.N. Inst. for Disarmament Res. (2020), https://unidir.org/publication/human-element-decisions-about-use-force [https://perma.cc/W86T-4RB7]; Ekelhof and Persi Paoli, Swarm Robotics, above note 2.

[21] See, e.g., Human Machine Touchpoints: The United Kingdom’s Perspective On Human Control Over Weapon Development And Targeting Cycles, Working Paper by the United Kingdom to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), U.N. Doc. CCW/GGE.2/2018/WP.1 (Aug. 8, 2018), http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/documents/GGE.2-WP1.pdf [https://perma.cc/23YQ-VZL2]. See generally Ekelhof, The Distributed Conduct of War, above note 7; Merel A.C. Ekelhof, Lifting the Fog of Targeting: “Autonomous Weapons” and Human Control through the Lens of Military Targeting, 71 Nav. War Coll. Rev. 61 (2018).

[22] Given the relatively limited substantive focus of the GGE on LAWS, which takes place in the context of the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects (CCW), it may be warranted either to expand the focus of the GGE or to undertake debate on these additional areas in other forums.

[23] See, e.g., Sheila Jasanoff, Virtual, visible, and actionable: Data assemblages and the sightlines of justice, Big Data & Society 1 (July–Dec. 2017). Literature on human-machine interactions and on science, technology, and society more broadly calls for much greater attention to understanding and addressing how such things as political ideologies, ethical commitments, cognitive biases, cultural norms, and social prejudices may shape the design, use, and governance of algorithmic and data-reliant socio-technical systems. See, e.g., Ruha Benjamin, Race After Technology (2019); Safiya Umoja Noble, Algorithms of Oppression (2018); Brent Daniel Mittelstadt, Patrick Allo, Mariarosaria Taddeo, Sandra Wachter, & Luciano Floridi, The ethics of algorithms: Mapping the debate, Big Data & Society 1 (July–Dec. 2016); Cathy O’Neil, Weapons of Math Destruction (2016).

[24] More broadly, see Hin-Yan Liu, Matthijs Maas, John Danaher, Luisa Scarcella, Michaela Lexer, and Leonard Van Rompaey, Artificial intelligence and legal disruption: a new model for analysis, 12 L., Innov. & Tech. 205 (2020).

[25] Research underpinning a companion resource — A Compilation of Materials Reflective of States’ Views on International Legal Issues pertaining to the Use of Algorithmic and Data-reliant Socio-technical Systems in Armed Conflict, produced by the Harvard Law School Program on International Law and Armed Conflict (HLS PILAC) in December 2020, which compiles materials (at least apparently) reflecting the views of States — was conducted in several languages in addition to English, including Chinese, French, German, Portuguese, Russian, and Spanish.

[26] This commentary uses the term international law applicable in relation to armed conflict to encompass all potentially relevant fields of international law, including, as relevant, IHL/LOAC, the field of international law applicable in relation to the threat or use of force in international relations (the jus ad bellum or the jus contra bellum), international human rights law, international criminal law, international refugee law, and other potentially relevant fields.

[27] Some potentially relevant considerations regarding this issue may be detected in statements submitted by States in discussing aspects of the “human element” relative to the use of force. See, e.g., Sandra de Jongh, Policy Officer, Ministry of Foreign Affairs, Agenda item 5(b): Further Consideration of the Human Element in the Use of Lethal Force; Aspects of Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by the Netherlands to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Mar. 26, 2019), https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2019/gge/statements/26March_NL5b.pdf [https://perma.cc/2QG9-QH5T]; Agenda Item 5(e): “Human element”, Statement by Brazil to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Mar. 26, 2019), https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2019/gge/statements/26March_Brazil5c.pdf [https://perma.cc/X2XX-U6Z7]; Agenda Item 5(b): Further Consideration of the Human Element in the Use of Lethal Force; Aspects of Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by Germany to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Mar. 26, 2019), https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2019/gge/statements/26March_Germany5c.pdf [https://perma.cc/H7WT-AQTX]; Further Consideration of the Human element in the Use of lethal force; Aspects of Human Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by India to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems (LAWS) (Mar. 26, 2019), https://www.unog.ch/80256EDD006B8954/(httpAssets)/36D216B42ED8D7A7C12583D2003C8598/$file/5+b+26+Mar+2019+afternoon.pdf [https://perma.cc/6XXN-N9CC]; Further Consideration of the Human Element in the Use of Lethal Force; Aspects of Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by Canada to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Mar. 25–29, 2019), https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2019/gge/statements/26March_Canada5b.pdf [https://perma.cc/JZ46-AL6R]; Further Consideration of the Human Element in the Use of Lethal Force; Aspects of Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by the European Union to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Mar. 25–29, 2019), https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2019/gge/statements/26March_EU5c.pdf [https://perma.cc/ML9S-SD97]; Further Consideration of the Human Element in the Use of Lethal Force; Aspects of Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by Norway at the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Aug.
28, 2018), https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/statements/28August_Norway.pdf [https://perma.cc/2HL5-WUZ3]; Agenda Item: 6(b) Human Element Segment, Statement by Belgium to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Aug. 27–31, 2018), https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/statements/28August_Belgium.pdf [https://perma.cc/584Y-B5E6]; Further consideration of the human element in the use of lethal force, aspects of human machine interaction in the development, deployment and use of emerging technologies in the area of LAWS, Statement by Bulgaria to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Aug. 27–31, 2018), https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/statements/29August_Bulgaria.pdf [https://perma.cc/5PCZ-WW5V]; Agenda Item 6(b): Further Consideration of the Human Element in the Use of Lethal Force; Aspects of Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by Ireland to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Aug. 27–31, 2018), https://www.unog.ch/80256EDD006B8954/(httpAssets)/9A87BBDA75C3416BC12582F8005C624D/$file/2018_GGE+LAWS+2_6b_Ireland.pdf [https://perma.cc/D969-2ME9]; Agenda Item 6(b): Further Consideration of the Human Element in the Use of Lethal Force; Aspects of Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by Costa Rica to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Aug. 27–31, 2018) (Spanish), https://www.unog.ch/80256EDD006B8954/(httpAssets)/E293437E0A608B17C12583040030B1A6/$file/2018_GGE+LAWS+2_6b_Costa+Rica.pdf [https://perma.cc/X4U5-6LPU]; Agenda Item 6(b): Further Consideration of the Human Element in the Use of Lethal Force, Statement by Estonia to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Aug. 27–31, 2018), https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/statements/28August_Estonia.pdf [https://perma.cc/2DM4-CJMD]; Agenda Item 6b: Further Consideration of the Human Element in the Use of Lethal Force; Aspects of Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by Switzerland to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Aug. 27–31, 2018), https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/statements/28August_Switzerland.pdf [https://perma.cc/HQ2G-UV8L]; Further Consideration of the Human Element in the Use of Lethal Force; Aspects of Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by the European Union to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Aug. 
27–31, 2018), https://www.unog.ch/80256EDD006B8954/(httpAssets)/CCACABC4BCDB60B6C12582F8005C41D5/$file/2018_GGE+LAWS+2_6b_European+Union.pdf [https://perma.cc/C6AY-3K7D]; Usman Jadoon, Counsellor of Pakistan, Agenda item 6(b): Further Consideration of the Human Element in the Use of Lethal Force; Aspects of Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by Pakistan at the First Session of the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Apr. 11, 2018), https://www.unog.ch/80256EDD006B8954/(httpAssets)/8D3985AC41854006C1258272005837D9/$file/2018_LAWS6b_Pakistan.pdf [https://perma.cc/L5RK-RZPY]; Further Considerations of the Human Element in the Use of Lethal Force; Aspects of Human-machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by Austria at the Meeting of the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Apr. 9–13, 2018), https://www.unog.ch/80256EDD006B8954/(httpAssets)/9870C7B7FE556FFEC12582720057E67D/$file/2018_LAWS6b_Austria.pdf [https://perma.cc/9SSR-N5VZ]; Discussion on the Human Element in the Use of Lethal Force; Aspects of Human-machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by the United Kingdom to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Apr. 11, 2018), https://www.unog.ch/80256EDD006B8954/(httpAssets)/9CE24D9A02CCBAC5C125827A0034B4A5/$file/2018_LAWS6b_UK.pdf [https://perma.cc/7WWK-7JTV]; Katherine Baker, Further Consideration of the Human Element in the Use of Lethal Force; Aspects of Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Statement by the United States to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) (Apr. 9–13, 2018), https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/statements/11April_US.pdf [https://perma.cc/9YD5-X6UW].

[28] Several lines of debate have emerged regarding whether existing legal norms, principles, rules, and standards suffice to cover responsibility- and accountability-related concerns that may arise in relation to the potential or actual use of certain socio-technical systems in armed conflicts. For example, significant scholarly attention has been devoted to discerning whether existing concepts pertaining to individual criminal responsibility for war crimes under international criminal law — not least as set out in the ICC Statute (1998) — provide a sufficient framework to address individual responsibility in relation to the use of such systems. For a recent post examining several relevant aspects, see Marta Bo, Meaningful Human Control over Autonomous Weapon Systems: An (International) Criminal Law Account, Opinio Juris (Dec. 18, 2020), http://opiniojuris.org/2020/12/18/meaningful-human-control-over-autonomous-weapon-systems-an-international-criminal-law-account/ [https://perma.cc/L98Q-T4UX].

[29] For an argument that algorithmic forms of warfare cannot be subject to law, see Gregor Noll, War by Algorithm: The End of Law?, in War and Algorithm 75–103 (Max Liljefors, Gregor Noll & Daniel Steuer eds., 2019). On legal aspects of automatic target recognition systems involving “deep learning” methods, see Joshua G. Hughes, The Law of Armed Conflict Issues Created by Programming Automatic Target Recognition Systems Using Deep Learning Methods, 21 Y.B. Int’l Human. L. 99 (2018). On machine-learning systems more broadly, see, e.g., Jenna Burrell, How the machine “thinks”: Understanding opacity in machine learning algorithms, Big Data & Society 1 (Jan.–June 2016). On the role of explanation with respect to accountability of artificial intelligence under the law, see Finale Doshi-Velez and Mason Kortz, Accountability of AI Under the Law: The Role of Explanation, Working Group on Explanation and the Law Working Paper, Harv. Univ. Berkman Klein Ctr. for Internet & Soc. (2017), https://dash.harvard.edu/bitstream/handle/1/34372584/2017-11_aiexplainability-1.pdf?sequence=3&isAllowed=y [https://perma.cc/F44N-85RT].

[30] See, e.g., A “Compliance-Based” Approach to Autonomous Weapon Systems, Working Paper by Switzerland to the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), U.N. Doc. CCW/GGE.1/2017/WP.9 (Nov. 10, 2017), p. 3, https://www.unog.ch/80256EDD006B8954/(httpAssets)/6B80F9385F6B505FC12581D4006633F8/$file/2017_GGEonLAWS_WP9_Switzerland.pdf [https://perma.cc/XY56-63QT] (expressing the position that “[t]he Geneva Conventions of 1949 and the Additional Protocols of 1977 were undoubtedly conceived with States and individual humans as agents for the exercise and implementation of the resulting rights and obligations in mind.”); see also Office of the General Counsel of the Department of Defense, Department of Defense Law of War Manual (June 2015, updated Dec. 2016), s. 6.5.9.3, p. 354 (expressing the position that law-of-war obligations apply to persons rather than to weapons, including that “it is persons who must comply with the law of war”).

[31] See Dustin A. Lewis, A Key Set of IHL Questions concerning A.I.-supported decision-making, in 50 Proceedings of the Bruges Colloquium (forthcoming, 2021); Switzerland, A “Compliance-Based” Approach, above note 30, at 3.

[32] Regulations Respecting the Laws and Customs of War on Land, Annex to Convention (IV) Respecting the Laws and Customs of War on Land art. 23(g) (signed Oct. 18, 1907, entry into force Jan. 26, 1910) 36 Stat. 2295 (Hague Regulations IV); see also Geneva Convention relative to the Protection of Civilian Persons in Time of War art. 53 (signed Aug. 12, 1949, entry into force Oct. 21, 1950), 75 U.N.T.S. 287 (GC IV).

[33] Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I) arts. 51(5)(b), 57(2)(a)(iii), and 57(2)(b) (signed June 8, 1977) 1125 U.N.T.S. 3 (AP I).

[34] Id. art. 8(a).

[35] GC IV (1949) art. 42, first para. See Lewis, AI and Machine Learning Symposium, above note 4 (“[T]o raise two doctrinal examples, IHL envisages assessments as to whether a protected person may be interned or placed in assigned residence either in an international armed conflict because ‘the security of the Detaining Power makes it absolutely necessary’ or in a situation of occupation because the Occupying Power deemed that it was ‘for imperative reasons of security.’ Transposing those evaluative decisions and normative judgments partially or fully into algorithmically generated assessments by way of data-shaped probabilities in concrete cases presents a far-from-frictionless exercise, to say the least. Moreover, the rules for detention relating to non-international armed conflict are, in many respects, even less clear than their international-armed-conflict counterparts. That would appear to leave larger space for those employing AI technologies to rely upon potentially problematic domestic criminal-law examples without enough international law to guide them.”).

[36] Additional examples (among many others) include provisions relative to evaluative decisions and normative (or value) judgments concerning: the presumption of civilian status in case of “doubt” (AP I (1977) arts. 50(3) and 52(3)); the betrayal of “confidence” in relation to the prohibition of perfidy (Hague Regulations IV (1907) art. 23(b) and AP I (1977) art. 37(1); see also Rome Statute of the International Criminal Court art. 8(2)(b)(xi) (signed July 17, 1998) 2187 U.N.T.S. 3 (ICC Statute)); whether an object — by its nature, location, purpose or use — makes an “effective” contribution to military action and whose total or partial destruction, capture, or neutralization, in the circumstances ruling at the time, offers a “definite” military advantage (AP I (1977) art. 52(2), Protocol II to the CCW (1996 amend.) art. 2(6), and Second Protocol to the Hague Convention of 1954 for the Protection of Cultural Property in the Event of Armed Conflict art. 1(f) (signed Mar. 26, 1999) 2253 U.N.T.S. 172); and whether a civilian takes a “direct” part in hostilities (AP I (1977) art. 51(3) and Protocol Additional to the Geneva Conventions of 12 August 1949 and relating to the Protection of Victims of Non-international Armed Conflicts art. 13(3) (signed June 8, 1977) 1125 U.N.T.S. 609 (AP II)).

[37] See, e.g., International Committee of the Red Cross, Commentary on the First Geneva Convention: Convention (I) for the Amelioration of the Condition of the Wounded and Sick in Armed Forces in the Field para. 1348 (2d ed., 2016), https://ihl-databases.icrc.org/ihl/full/GCI-commentary [https://perma.cc/HXV2-YAEN] (“Under combat conditions, in the very moment that a person is injured, it may be extremely difficult to determine with any degree of certainty whether that person is wounded in the legal sense, and in particular whether he or she is refraining from any hostile act.”) (emphasis added).

[38] See, e.g., Dustin A. Lewis, Legal reviews of weapons, means and methods of warfare involving artificial intelligence: 16 elements to consider, ICRC Humanitarian L. & Policy Blog (Mar. 21, 2019), https://blogs.icrc.org/law-and-policy/2019/03/21/legal-reviews-weapons-means-methods-warfare-artificial-intelligence-16-elements-consider/ [https://perma.cc/8W5B-Y3BG].

[39] See, e.g., Dustin A. Lewis, AI and Machine Learning Symposium, above note 4.

[40] See Dustin A. Lewis, International Legal Regulation of the Employment of Artificial-Intelligence-related Technologies in Armed Conflict, Moscow J. Int’l L. No. 2, 53, 61–63 (2020). On certain issues related to predicting and understanding military applications of artificial intelligence, see Holland Michel, The Black Box, Unlocked, above note 20. One of many potential concerns in this area is that a “double black box” may emerge where human agents encase technical opacity in military secrecy.

[41] AP I (1977) art. 36.

[42] Some States have elaborated positions on what may require an additional review of a weapon, means, or method of warfare involving emerging technologies in the area of lethal autonomous weapons systems. See above note 19.

[43] Geneva Convention for the Amelioration of the Condition of the Wounded and Sick in Armed Forces in the Field art. 49, third para. (signed Aug. 12, 1949, entry into force Oct. 21, 1950) 75 U.N.T.S. 31 (GC I); Geneva Convention for the Amelioration of the Condition of Wounded, Sick and Shipwrecked Members of Armed Forces at Sea art. 50, third para. (signed Aug. 12, 1949, entry into force Oct. 21, 1950), 75 U.N.T.S. 85 (GC II); Geneva Convention relative to the Treatment of Prisoners of War art. 129 (signed Aug. 12, 1949, entry into force Oct. 21, 1950) 75 U.N.T.S. 135 (GC III); GC IV (1949) art. 146, third para.; see also AP I (1977) arts. 86(1) and 87(1)–(2).

[44] GC I (1949) art. 49, first para.; GC II (1949) art. 50, first para.; GC III (1949) art. 129, first para.; GC IV (1949) art. 146, first para.; see also AP I (1977) art. 86(1).

[45] See ICC Statute (1998) art. 8(2)(a)(iv).

[46] Namely, what it means to exercise sufficient intent and knowledge in relation to conduct involving a relevant socio-technical system.

[47] Other violations (that is, in addition to grave breaches) that give rise to individual criminal responsibility under international law applicable in relation to armed conflict ought to be assessed as well.

[48] See above note 28.

[49] GC I (1949) art. 49, second para.; GC II (1949) art. 50, second para.; GC III (1949) art. 129, second para.; GC IV (1949) art. 146, second para.; see also AP I (1977) art. 88(1).

[50] See, e.g., above note 45 and the accompanying text.

[51] GC I (1949) arts. 8 and 10; GC II (1949) arts. 8 and 10; GC III (1949) arts. 8 and 10; GC IV (1949) arts. 9 and 11; see also AP I (1977) art. 5. But see ICRC, Commentary on GC I, above note 37, at para. 1014 (stating that “[s]eemingly, practice since 1949 has evolved to the point of considering the appointment of Protecting Powers as optional in nature.”).

[52] See ICRC, Commentary on GC I, above note 37, at para. 1080 (“If prisoners of war are transferred from one Detaining Power to another, the Protecting Power must, if the Power to which they have been transferred fails to meet its obligations in any important respect, notify the Power which transferred the prisoners of war. This notification triggers that Power’s obligation to take effective measures to correct the situation or request the return of the prisoners of war. Such requests must be complied with (Article 12 [of GC III]).”) (emphasis added).

[53] GC I (1949) art. 11, first para.; GC II (1949) art. 11, first para.; GC III (1949) art. 11, first para.; GC IV (1949) art. 12, first para.

[54] AP I (1977) art. 82.

[55] It may be warranted to ensure that sufficiently knowledgeable and trained legal advisers are available at all relevant stages, including, as relevant, development, acquisition, review, testing and validation, fielding, employment, and post-conduct assessment. See, e.g., Annemarie Vazquez, Laws and Lawyers: Lethal Autonomous Weapons Bring LOAC Issues to the Design Table, and Judge Advocates Need to be There, 228 Mil. L. Rev. 89 (2020).

[56] GC I (1949) art. 47; GC II (1949) art. 48; GC III (1949) art. 127; GC IV (1949) art. 144, first para.; see also AP I (1977) art. 83(1).

[57] GC I (1949) art. 54; GC II (1949) art. 45; see also AP I (1977) art. 18 and Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Adoption of an Additional Distinctive Emblem art. 6 (signed Dec. 8, 2005, entry into force Jan. 14, 2007) 2404 U.N.T.S. 261.

[58] Under Article 85(3) of AP I, the perfidious use of the emblems constitutes a grave breach of that instrument. AP I (1977) art. 85(3).

[59] GC I (1949) arts. 19, first para., 24, 25, 26, and 35, first para.; GC II (1949) arts. 22, first para., 24, 25, 27, 36, and 37, first para.; see also AP I (1977) arts. 12 and 21–31 and AP II (1977) arts. 9(1) and 11(1).

[60] See GC II (1949) arts. 22, first para., 24, first para., and 25, first para.

[61] There is a contemporary debate in international legal circles around the nature of the obligation to “ensure respect” for the Geneva Conventions. Compare, e.g., Michael N. Schmitt and Sean Watts, Common Article 1 and the Duty to “Ensure Respect,” 96 Int’l L. Stud. 674 (2020) with ICRC, Commentary on GC III, below note 70, at paras. 186–206. I do not take a position here on that debate.

[62] The list in this section, which is far from comprehensive, is drawn in part from ICRC, Commentary on GC I, above note 37, at para. 214.

[63] ICRC, Commentary on GC I, above note 37, at para. 214.

[64] Id.

[65] See above notes 44–50 and the accompanying text.

[66] Id.

[67] Id.

[68] GC I (1949) art. 52; GC II (1949) art. 53; GC III (1949) art. 132; GC IV (1949) art. 149.

[69] AP I (1977) art. 90(2).

[70] See, e.g., International Committee of the Red Cross, Commentary on the Third Geneva Convention: Convention (III) relative to the Treatment of Prisoners of War para. 214, n. 122 (2d ed., 2020), https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/Treaty.xsp?documentId=77CB9983BE01D004C12563CD002D6B3E&action=openDocument [https://perma.cc/773X-NTXY]; see also Arms Trade Treaty, adopted Apr. 2, 2013, A/RES/67/234B, arts. 6–7.

[71] See, e.g., ICRC, Commentary on GC III, above note 70, at para. 214, n. 124 (citation omitted).

[72] See above notes 31–37 and the accompanying text.

[73] See ICRC, Commentary on GC III, above note 70, at para. 201 (citation omitted).

[74] See above note 4.

[75] ICRC, Commentary on GC III, above note 70, at para. 214 (citations omitted).

[76] Id.

[77] Id.

[78] Id. at para. 194.