International Legal and Policy Dimensions of War Algorithms: Enduring and Emerging Concerns
From September 2018 through December 2020, the Harvard Law School Program on International Law and Armed Conflict undertook a research and policy project titled “International Legal and Policy Dimensions of War Algorithms: Enduring and Emerging Concerns.” Through research, events, briefings, and convenings, the project team pursued two objectives. The first was to strengthen international debate and inform policy-making on the ways that artificial intelligence and complex computer algorithms are transforming war. The second was to help illuminate how international legal and policy frameworks already govern — and might further regulate — the design, development, and use of those technologies. The Ethics and Governance of Artificial Intelligence Fund provided financial support for the project.
In December 2020, HLS PILAC published two capstone documents for the project. Several additional writings were produced in connection with the project as well. HLS PILAC faculty and staff have briefed numerous governments, militaries, United Nations system actors, and humanitarian bodies on the project’s research and analysis.
Legal Commentary
In a legal commentary titled “Three Pathways to Secure Greater Respect for International Law concerning War Algorithms,” Dustin A. Lewis (HLS PILAC Research Director) consolidates the main findings of the project. The commentary argues that international actors ought to systematically identify and address the diverse array of armed-conflict-related activities and decisions implicated by socio-technical systems reliant upon war algorithms and data, including but extending beyond weapons. Further, in doing so, international actors ought to assess whether existing measures aimed at securing respect for the law concerning this range of activities and decisions are up to the task. To help inform those considerations, the commentary outlines three pathways that international actors — including States, international organizations, non-state parties to armed conflicts, and others — may take to secure greater respect for international law in this area.
Pathway 1
An international actor may form and publicly express positions on key legal issues arising in respect of relevant systems, including:
Whether, as a legal matter, armed-conflict-related conduct ought to be reflective of human agency;
Whether, as a legal matter, armed-conflict-related conduct may be considered reflective of human agency only if the conduct is subject to the exercise by one or more natural persons of intent (or, at least, foreseeability), knowledge, and causal control in respect of the conduct;
Whether it is, or at least ought to be, presupposed that the (primary) exercise and implementation of international law — including by legal entities, such as States, international organizations, and courts — may be reposed only in natural persons or whether, alternatively, those responsibilities may be reposed partly or wholly in artificial agents;
Whether legally mandated evaluative decisions and normative (or value) judgments may be reposed only in one or more natural persons;
Whether — and, if so, under what circumstances and subject to what conditions — reliance may be placed on relevant systems to partly or wholly establish or validate the information upon which legally mandated evaluative decisions and normative (or value) judgments are made;
Whether the use of proxies for legally relevant characteristics is permissible under international law applicable in relation to armed conflict;
If the use of proxies for legally relevant characteristics is permissible, under what circumstances and subject to what conditions a relevant system may be involved in the formulation, collection, validation, or evaluation of such proxies;
Whether, as a legal matter, international actors ought to pledge to engage, and call upon others to commit to engage, only in armed-conflict-related conduct that is at least facilitative of attribution, discernibility, and scrutiny of conduct involving a relevant system, including by actors not involved in the conduct; and
What forms and manifestations of relevant systems, adopted in relation to which circumstances of use and subject to what conditions, mandate additional legal review.
Pathway 2
An international actor may take measures relative to its own conduct involving a relevant system, including the measures necessary:
To suppress all acts contrary to relevant binding instruments;
To enact — and, as warranted, adjust — any legislation necessary to provide effective penal sanctions for persons committing, or ordering to be committed, any grave breach;
To prosecute or extradite alleged perpetrators of grave breaches;
To comply with and facilitate respect for applicable law relative to Protecting Powers or substitutes;
To make legal advisers available to the armed forces;
To include the study of applicable international law in civil-instruction programs, so that the principles thereof may become known to the entire population;
For the prevention and repression of abuses of the emblem; and
To ensure respect for and protection of fixed establishments and mobile medical units of the medical service, medical and religious personnel, medical transports, and hospital ships and their crews.
Pathway 3
An international actor may take measures relative to the conduct of others involving a relevant system, including the measures necessary:
To use diplomatic dialogue to address questions of compliance or exert diplomatic pressure through confidential protests or public denunciations;
To offer legal assistance or support legal assistance provided by others;
To resort to and otherwise support penal measures;
To request that an inquiry be instituted concerning any alleged violation of a relevant instrument;
For the International Fact-Finding Commission to inquire into allegations;
To condition, limit, or refuse arms transfers;
To refer, where possible, a specific issue to a competent body for the settlement of disputes; and
To monitor the fate of armed-conflict-related detainees transferred to another State.
Concerning armed-conflict-related partnerships in particular, international actors may take the measures necessary:
To condition joint operations involving a relevant system on a partner’s compliance with applicable law;
To plan operations involving a relevant system jointly to prevent violations;
To intervene directly with commanders in case of violations involving a relevant system; and
To opt out of a specific operation involving a relevant system if there is an expectation that the operation would violate applicable law.
Cross-cutting Themes and Commitments
Some themes and commitments cut across these three pathways. Arguably, respect for the law turns in no small part on whether natural persons can and will foresee, understand, administer, and trace the components, behaviors, and effects of relevant systems. It may be advisable, moreover, to institute ongoing cross-disciplinary education and training, as well as the provision of sufficient technical facilities, for all relevant actors, from commanders to legal advisers to prosecutors to judges. Further, it may be prudent to establish ongoing monitoring of others’ technical capabilities. Finally, it may be warranted for relevant international actors to pledge to engage, and to call upon others to engage, only in armed-conflict-related conduct that is sufficiently attributable, discernible, and scrutable.
Compilation
In “A Compilation of Materials Apparently Reflective of States’ Views on International Legal Issues pertaining to the Use of Algorithmic and Data-reliant Socio-technical Systems in Armed Conflicts,” HLS PILAC seeks to provide a resource through which the views of States holding divergent positions on certain matters potentially of international public concern can be identified. Legal aspects of war technologies are more complex than some governments, scholars, and advocates allow. In our view, knowledge of the legal issues requires awareness of the multiple standpoints from which these arguments are fashioned. An assumption underlying how we approach these inquiries is that an assessment concerning international law in this area ought to take into account the perspectives of as many States (in addition to other relevant actors) as possible.
Research Support
The following Research Assistants contributed to the project: Lindsay Anne Bailey, Sonia Chakrabarty, Minqian Lucy Chen, Edi Ebiefung, Zach Grouev, Rachael Hanna, Simon Gregory Jerome, Daniel Levine-Spound, Charles Orta, Will Ossoff, Binendri Perera, Vera Piovesan, Sam Rebo, Lilianna (Anna) Rembar, Ricardo Cruzat Reyes, Juan Rivera Rugeles, Delphine Rodrik, Carolina Silva-Portero, Katherine Shen, Delany Sisiruca, April Xiaoyi Xu, and Eun Sung Yang. In addition, Jennifer Allison and Caroline Walters of the HLS Library, as well as other members of the HLS Library staff, provided research support.
Contact
Professor of Practice Naz K. Modirzadeh (Founding Director, HLS PILAC): nmodirzadeh@law.harvard.edu.
Dustin A. Lewis (Research Director, HLS PILAC): dlewis@law.harvard.edu.
Research and Analysis
Publications
Dustin A. Lewis, On “Responsible A.I.” in War: Exploring Preconditions for Respecting International Law in Armed Conflict, in The Cambridge Handbook of Responsible Artificial Intelligence: Interdisciplinary Perspectives (Silja Voeneky, Philipp Kellmeyer, Oliver Mueller, and Wolfram Burgard eds., Cambridge University Press, 2022)
Dustin A. Lewis, A Key Set of IHL Questions concerning AI-supported Decision-making, 51 Collegium (Proceedings of the Bruges Colloquium) 80 (Autumn 2021)
Dustin A. Lewis, Preconditions for Applying International Law to Autonomous Cyber Capabilities, in Autonomous Cyber Capabilities under International Law (Rain Liivoja and Ann Väljataga eds., NATO Cooperative Cyber Defence Centre of Excellence, 2021)
Dustin A. Lewis, Three Pathways to Secure Greater Respect for International Law concerning War Algorithms, Legal Commentary, HLS PILAC (2020)
A Compilation of Materials Apparently Reflective of States’ Views on International Legal Issues pertaining to the Use of Algorithmic and Data-reliant Socio-technical Systems in Armed Conflict (Editor: Dustin A. Lewis; Contributors: Sonia Chakrabarty, Minqian Lucy Chen, Edi Ebiefung, Zach Grouev, Rachael Hanna, Simon Gregory Jerome, Charles Orta, Will Ossoff, Vera Piovesan, Sam Rebo, Lilianna (Anna) Rembar, Ricardo Cruzat Reyes, Juan Rivera Rugeles, Carolina Silva-Portero, and Delany Sisiruca), Harvard Law School Program on International Law and Armed Conflict (December 2020)
Dustin A. Lewis, International legal regulation of the employment of artificial-intelligence-related technologies in armed conflict, Moscow Journal of International Law No. 2, pp. 53–64 (2020)
Dustin A. Lewis, An Enduring Impasse on Autonomous Weapons, Just Security, Sept. 28, 2020
Dustin A. Lewis, AI and Machine Learning Symposium: Why Detention, Humanitarian Services, Maritime Systems, and Legal Advice Merit Greater Attention, Opinio Juris Blog, April 28, 2020
Dustin A. Lewis, Legal reviews of weapons, means and methods of warfare involving artificial intelligence: 16 elements to consider, ICRC Humanitarian Law and Policy Blog, March 21, 2019
Naz K. Modirzadeh and Dustin A. Lewis, Expert views on the frontiers of artificial intelligence and conflict, ICRC Humanitarian Law and Policy Blog, March 19, 2019
Recordings
Law and the Future of War, Episode 20: War Algorithms, University of Queensland’s School of Law (June 16, 2022), https://www.buzzsprout.com/1403818/10705281-war-algorithms-dustin-lewis (with Dustin Lewis).
2021 Stockholm Security Conference “Battlefields of the Future,” The AI Conundrum: Algorithms and Human Responsibility in Armed Conflicts, Stockholm International Peace Research Institute (Dec. 15, 2021), https://www.youtube.com/watch?v=LusTdJ6blWA (with Dustin Lewis).
Lethal Autonomous Weapons, Episode 6: Who Is Responsible for Lethal Autonomous Weapons?, Lethal Autonomous Weapons: 10 Things We Want to Know (Oct. 13, 2021), https://laws10.simplecast.com/episodes/episode-6-who-is-responsible-for-lethal-autonomous-weapons-b96wrAPW (with Dustin Lewis).
Annual Cyber Conference: Trends and Challenges in Regulation of Quasi-Sovereign Powers in the Face of New Technological and Cyber Developments, Panel on Challenges in the Regulation of Artificial Intelligence and Cyber Capabilities in the Battlefield, The Federmann Cyber Security Center Law Program (May 13, 2021), https://www.youtube.com/watch?v=I1Kl2tVdbmE (with Dustin Lewis).
CyFy 2018, Algorithms, AI and Armed Conflict, Observer Research Foundation (Oct. 15, 2018), https://www.youtube.com/watch?v=_8gLoIRvAfI (with Dustin Lewis).