Note: More information about this PILAC Project as well as the full version of the Briefing Report are available here [link].


Section 3: International Law pertaining to Armed Conflict

In this section, we outline key fields, concepts, and rules relating to international law pertaining to armed conflict. We do so to identify some of the fundamental substantive norms that may be relevant to war algorithms in general and to our three-part accountability approach in particular.[281] State responsibility entails, among other things, identifying the content of the underlying obligation. Individual responsibility entails, among other things, identifying the elements of the crime and the mode of responsibility under international law. Finally, scrutiny governance entails detecting—and potentially surpassing—a baseline of relevant normative regimes, and international law may provide a foundational normative framework concerning regulation of war algorithms.

This section is divided into two parts. We first set the stage with an introduction to state responsibility. Then, in the bulk of the section, we highlight relevant considerations in the substantive law of obligations. Part of the focus is on AWS, since that has been the main framing addressed by states to date. We examine whether a customary international law norm pertaining to AWS in particular has crystallized. We find that one has not, at least not yet. We then outline some of the main international law rules of a more general nature, focusing primarily on rules that may relate to AWS but also noting a number of rules that may (otherwise or also) implicate war algorithms.

With respect to AWS, most commentators and states focus primarily on international humanitarian law and international criminal law. In this section, we raise concerns not only in those fields but also in some of the other regimes of international law that might apply with respect to war algorithms. The section, however, is not meant to be exhaustive.[282] We note that some states—including Switzerland, the United States, and the United Kingdom—have articulated much more detailed analyses of how AWS might relate to a particular rule or field of international law; in light of our interest in discerning state practice, we focus, in part, on those states’ positions and practices.

State Responsibility

State responsibility underpins international law. To grasp the broader accountability architecture governing the design, development, or use (or a combination thereof) of war algorithms, therefore, it is necessary to have at least a basic understanding of the conceptual framework of state responsibility.

Underlying Concepts

The underlying concepts of state responsibility, which are general in character, are attribution, breach, excuses, and consequences.[283] Attribution concerns the circumstances under which an act may be attributed to a state.[284] Breach concerns the conditions under which an act (or omission) may qualify as an internationally wrongful act.[285] Excuses concern the general defenses that may be available to a state in relation to an internationally wrongful act.[286] And consequences concern the forms of liability that may arise in relation to an internationally wrongful act. As James Crawford explains, “[i]ndividual treaties or rules may vary these underlying concepts in some respect; otherwise they are assumed and apply unless excluded.”[287]

Conduct may be attributed to a state under a variety of circumstances. These circumstances include the conduct of any state organ, such as the armed forces.[288] They also include the conduct of a person or entity empowered by the law of the state to exercise elements of governmental authority (so long as the person or entity is acting in that capacity in a particular instance),[289] and the conduct of an organ placed at the disposal of a state by another state so long as that “organ is acting in the exercise of elements of the governmental authority of the State at whose disposal it is placed.”[290] The conduct of these organs, persons, and entities where acting in those capacities shall be considered an act of the state under international law even if that conduct exceeds its authority or contravenes instructions.[291]

Furthermore, “[t]he conduct of a person or group of persons shall be considered an act of a State under international law if the person or group of persons is in fact acting on the instructions of, or under the direction or control of, that State in carrying out the conduct.”[292] And “[t]he conduct of a person or group of persons shall be considered an act of a State under international law if the person or group of persons is in fact exercising elements of the governmental authority in the absence or default of the official authorities and in circumstances such as to call for the exercise of those elements of authority.”[293] Also, “[t]he conduct of an insurrectional movement which becomes the new Government of a State shall be considered an act of that State under international law.”[294] And, finally, “[c]onduct which is not attributable to a State under the preceding [circumstances] shall nevertheless be considered an act of that State under international law if and to the extent that the State acknowledges and adopts the conduct in question as its own.”[295]

In general, a consequence of state responsibility is the liability to make reparation.[296] As noted by Pietro Sullo and Julian Wyatt, “[t]he principle that States have to provide reparations to other States to redress wrongful acts they have committed is undisputed under international law and is confirmed by other instruments of international law.”[297] Those authors explain that “[t]he primary function of reparations in international law is the re-establishment of the situation that would have existed if an internationally wrongful act had not been committed and the forms that such reparation may take are various.”[298]

Substantive Law of Obligations

While state responsibility provides the basic framework, the substantive law of obligations fleshes out the relevant rules and procedures. The substantive law of obligations may be found in a relevant branch or branches of public international law. The operation of a specific branch may have implications for particular forms of attribution, breach, excuses, and consequences. IHL, for instance, contains specific provisions on what may constitute a “serious violation” and what consequences may arise with respect to certain rule breaches.

The two sources of the substantive law of obligations most relevant to war algorithms are treaties and customary international law. Treaties are often defined as international agreements between two or more states.[299] And customary international law is often defined as being made up of the “rules of international law that derive from and reflect a general practice accepted as law.”[300] Below, we first explore whether there is a specific customary rule pertaining to AWS in particular. (We focus on AWS here and not on war algorithms more broadly because, to date, the bulk of the state practice pertains to AWS.) Answering in the negative, we then highlight treaty provisions (and corresponding customary rules) of a more general character that may relate to AWS and war algorithms. These provisions stretch across an array of fields of international law—not only IHL and international criminal law, but also space law, telecommunications law, and others.

Customary International Law concerning AWS[301]

Customary international law has two constituent elements: state practice and opinio juris sive necessitatis (shorthand: opinio juris).[302] State practice has recently been formulated as the “conduct of the State, whether in the exercise of executive, legislative, judicial or any other functions of the State.”[303] And opinio juris has recently been formulated as “the belief that [a practice] is obligatory under a rule of law.”[304] In other words, when a state follows a particular practice merely as a matter of policy or out of habit, rather than out of a sense of legal obligation, the requirement of opinio juris is not satisfied.[305]

It seems fair to say that statements made by official state representatives at the 2015 and 2016 Convention on Certain Conventional Weapons (CCW) Informal Meetings of Experts on Lethal Autonomous Weapons Systems could qualify as state practice or opinio juris. (Though a given statement probably should not be counted as both.) Such gatherings are “informal implementation mechanism[s],”[306] not formal meetings of states parties. But these meetings nevertheless involved the sort of public pronouncements that, when made by state agents, are capable of constituting evidence of the elements of customary international law. In at least some cases, states’ presentations at meetings of experts have been considered as state practice for the purposes of assessing customary international law.[307] Whether a particular statement counts as evidence depends in part on its content. For example, a state merely implying or expressing a desire that something become illegal would not be evidence of state practice.[308]

So far, it appears that there is not enough consensus among these statements for a clear rule of customary international law to have emerged on the basis of state practice or opinio juris. Be that as it may, the 2016 meeting revealed relatively wide agreement on some important points. First, nearly all states that explicitly addressed the issue concurred that “fully” autonomous weapon systems do not yet exist (although some maintained that such systems will never exist, whereas others seemed to assume that they inevitably will). Second, there was wide agreement on the need for further discussion or monitoring (or both); nearly every state mentioned the importance of continuing the dialogue. Third, most states indicated their belief that the current definitions of “autonomous weapon systems” are inadequate, impeding the progress that international society can make in assessing legal concerns.

With respect to concrete positions on the legality of “lethal autonomous weapons systems,” the greatest agreement at the 2016 Meeting was on the importance or relevance of the review process under Article 36 of the first Additional Protocol to the Geneva Conventions (described in more detail below) and on the need for “meaningful human control” over AWS. In statements at the 2016 Meeting, thirteen states referenced the importance or relevance of Article 36—more than twice as many as at the 2015 Meeting. Also at the 2016 Meeting, thirteen states expressly referenced the need for “meaningful human control.” However, as in 2015, this agreement was undercut by the lack of clarity as to what “meaningful human control” means. (Some states seemed to think that something akin to a human override capability would be sufficient, while others disagreed.[309]) Given the disparities in how different states interpret the concept, some states expressed skepticism about the usefulness of the notion of “meaningful human control.”[310]

When comparing the 2015 and 2016 CCW Informal Meetings of Experts, it is important to bear in mind that the participating states are not identical. The differences between the meetings may simply reflect the altered composition of participating states, not necessarily a coherent shift in position among the same group of states. Nonetheless, the growing number of states that referenced Article 36 reviews might reflect a growing recognition that the category “autonomous weapon systems” involves a broad spectrum of weapons and may require review on a case-by-case basis.

Another consideration in the evaluation of customary international law that may be relevant to AWS concerns “specially affected” states. The basic idea is that the practice of “specially affected” states[311]—that is, states that are “affected or interested to a higher degree than other states with regard to the rule in question”—“should weigh heavily (to the extent that, in appropriate circumstances, it may prevent a rule from emerging).”[312] For example, with respect to the rights associated with a state’s territorial sea, the practices of states with a coastline have been considered as more significant than those of landlocked states.[313] There is some dispute over the determination and role of “specially affected” states in customary international humanitarian law.[314] Yet the position of the majority of commentators seems to be that “[i]f an emerging rule in respect to the use of sophisticated weaponry is considered then the practice of only a few states technically capable of production may suffice.”[315]

If this view is accurate, then the practice of states that are more technologically advanced in the weapons arena—such as the United States, Israel, and South Korea, which are reportedly among the states furthest along in the development of relevant technologies[316]—would be particularly important for any customary rules about AWS. So far, these and similarly situated states have largely favored continuing to monitor or discuss the development of such weapons. Indeed, these states have mostly refrained from deciding on the per se legality of such weapons, while hinting at apprehensions about bans that they view as potentially premature or as restricting civilian technological development.[317]

Yet another line of reasoning suggests that states on whose territory autonomous weapons might be deployed (regardless of whether the territorial state grants consent) may also be considered “specially affected.” Along these lines, Pakistan’s statements about the illegality of lethal autonomous weapons systems would also receive a privileged status.[318] This claim might have some value as lex ferenda (the law as it should be). But, as mentioned above, existing scholarly commentary tends to focus on the weapons-possessors, not on the places where the weapons may be used, as the “specially affected” states.

Summary of States’ Positions as Reflected by Their Statements at the 2015 and 2016 CCW Meetings of Experts

Charts containing the relevant quotations, caveats, and explanations are in Appendices I and II.

Position:[319] Currently unacceptable, unallowable, or unlawful

States reflecting this position: Austria,[320] Chile,[321] Costa Rica, Ecuador, Germany,[322] Mexico, Pakistan, Poland,[323] and Zambia

Position: Need to monitor or continue to discuss

States reflecting this position: Algeria, Austria, Australia, Canada, Chile, Colombia, Croatia, Costa Rica, Czech Republic, Ecuador, Finland, France, Germany, India, Ireland, Israel, Italy, Japan, Korea, Mexico, Morocco, Netherlands, New Zealand, Pakistan, Poland, Sierra Leone, South Africa, Spain, Sri Lanka, Sweden, Switzerland, Turkey, United Kingdom, United States of America, and Zambia

Position: Need to regulate[324]

States reflecting this position: Austria, Chile, Colombia, Czech Republic, Netherlands, Poland, Sri Lanka, Sweden, and Zambia

Position: Need to ban (or favorably disposed towards the idea)[325]

States reflecting this position: Algeria, Bolivia,[326] Chile, Costa Rica, Croatia,[327] Cuba, Ecuador, Egypt,[328] Ghana, Mexico,[329] Nicaragua,[330] Pakistan, Sierra Leone,[331] Palestine,[332] Zambia,[333] and Zimbabwe[334]

Position: Need for meaningful human control

States reflecting this position: Argentina, Austria, Australia, Canada, Chile, Colombia, Croatia, Czech Republic, Denmark, Ecuador, Germany, Greece, Ireland, Korea, Morocco, Netherlands, Pakistan, Poland, South Africa, Sweden, Switzerland, Turkey, United Kingdom, Zambia, and Zimbabwe

Position: AP I Article 36 weapons review (defined below) necessary[335]

States reflecting this position: Australia, Austria, Canada, Cuba, Finland, France, Germany, Netherlands, Sierra Leone, South Africa,[336] Sri Lanka, Sweden, Switzerland, United Kingdom, and Zambia

Position: Refers to legal principles while remaining undecided on per se legality of AWS

States reflecting this position: Algeria, Argentina, Australia, Austria, Canada, Chile, Czech Republic, Denmark, Ecuador, Finland, France, Germany, Greece, India, Ireland, Israel, Italy, Japan, New Zealand, Poland, Sierra Leone, South Africa, Spain, Sri Lanka, Sweden, Switzerland, Turkey, United Kingdom, United States of America, and Zambia

Treaty Provisions and Customary Rules Not Specific to AWS

Having established that a rule of customary international law specific to AWS has not crystallized (at least not yet),[337] we turn to treaty provisions and customary rules that might nonetheless govern the design, development, or use (or a combination thereof) of an AWS or, more generally, a war algorithm. The following section is not meant to be exhaustive but rather to highlight some of the main rules that might be implicated by AWS or war algorithms.

Jus ad Bellum

The jus ad bellum (also known as the jus contra bellum) is the field of public international law governing the threat of force or the use of force by a state in its international relations. Current international law establishes a general prohibition on such threats of force and such uses of force unless undertaken pursuant to a lawful exception to that prohibition. Recognized exceptions include an enforcement action pursuant to a mandate of the U.N. Security Council, an exercise of lawful self-defense conforming to the principles of necessity and proportionality, and lawful consent.[338]

At least two concerns arise with respect to war algorithms as a matter of the jus ad bellum. The first is whether the determination of a breach of a rule of the jus ad bellum is independent of the type of weapon used.[339] For instance, some commentators have debated the use of so-called “predecessors of AWS,” such as UAVs, in the context of obviating threats of terrorism as a matter of the jus ad bellum.[340] Others find those contributions “misguided,”[341] arguing instead that “[t]he use of AWS does not render an operation illegal under rules of ius ad bellum.”[342]

The second concern is whether a particular use of a war algorithm in relation to the use of force in international relations falls under the category of prohibited “force.” The most pertinent analogue might be a computer network attack. Oliver Dörr notes that, so far, such attacks against the information systems of another state have not been treated in practice under the principle of the non-use of force.[343] However, Dörr argues, “current and future State practice may, in this respect, lead to a different interpretation, given the weapon-like destructive potential which some attacks by means of information technology may develop: computer network attacks intended to directly cause physical damage to property or injury to human beings in another State may reasonably be considered armed force.”[344]

International Humanitarian Law

IHL is the primary field of international law governing armed conflict. It applies only in relation to armed conflict. Under international law, armed conflicts may be either international or non-international in character. IHL binds all of the parties to the armed conflict (whether states or non-state organized armed groups), as well as individuals.[345] And, where applicable, the law of neutrality also binds neutral states or other states not party to the armed conflict.[346]

The discussion on AWS and war algorithms enters into a number of preexisting debates in IHL. Those concern such issues as the contours of civilian “direct participation in hostilities,”[347] the geographic and temporal scope of armed conflict, and the relationship of IHL to international human rights law. The AWS discourse to date has largely revolved around IHL provisions concerning the conduct of hostilities, given the focus on autonomous weapon systems. Here we highlight the major considerations concerning AWS as weapons, though we note some other areas of IHL that might be relevant for war algorithms more broadly.

Suppression of Acts Contrary to the Geneva Conventions

As a framework matter, states parties to the Geneva Conventions of 1949 have a general obligation to “undertake to respect and to ensure respect for the ... Convention[s] in all circumstances.”[348] More broadly, each state party “shall take measures necessary for the suppression of all acts contrary to the provisions of the” Geneva Conventions of 1949 other than grave breaches.[349] (States are required to take certain other, more exacting measures with respect to grave breaches, as noted below.)

Classification: Weapons (or Weapon Systems) or Combatants?

An initial issue is whether, under IHL, the relevant AWS (however defined) should be classified as a weapon (or a weapon system) or as something else, such as a combatant. The bulk of states and commentators focus on AWS in the sense of weapons.[350] But others, such as Hin-Yan Liu, raise the prospect that an AWS may be considered a combatant where, for instance, the focus is on the system’s decision-making capability. Liu adopts the U.S. DoD Law of War Working Group’s approach to differentiating between the terms “weapon” and “weapon systems.”[351] The former refers to “all arms, munitions, materiel, instruments, mechanisms, or devices that have an intended effect of injuring, damaging, destroying or disabling personnel or property,” while the latter is more broadly conceived to include “the weapon itself and those components required for its operation, including new, advanced or emerging technologies.”[352]

For Liu, “the capacity for autonomous decision-making pushes these technologically advanced systems to the boundary of the notion of ‘combatant’.”[353] As an indicator of the “potential for the confusion between means and methods of warfare and combatants,” Liu points to the German military manual, which provides that “combatants are persons who may take a direct part in hostilities, i.e., participate in the use of a weapon or a weapon-system in an indispensable function.”[354] Liu notes that “this characterization was used in the context of differentiating categories of non-combatants who are members of the armed forces,” yet his broader point is that “the circularity of this definition illustrates precisely the difficulties associated with defining ‘weapon’ and ‘weapons system’.”[355]

Weapons: Reviews

As noted relatively frequently at the 2016 CCW Informal Expert Meeting on Lethal Autonomous Weapons Systems, Article 36 of Additional Protocol I imposes an obligation on states parties concerning “the study, development, acquisition or adoption of a new weapon, means or method of warfare.” In particular, states parties are obliged to determine “whether [the] employment [of a new weapon, means or method of warfare] would, in some or all circumstances, be prohibited by” AP I or by any other rule of international law applicable to the state party.

With respect to AWS, Christopher Ford argues that “[t]he complexity of the weapons review will be a function of the sophistication of the technology, the geographic and temporal scope of use, and the nature of the environment in which the system is expected to be used.”[356] He puts forward four “best practices” to consider in all such reviews. First, “[t]he weapons review should either be a multi-disciplinary process or include attorneys who have the technical expertise to understand the nature and results of the testing process.” Second, “[r]eviews should delineate the planned and normal circumstances of use for which the weapon was reviewed.” Third, “[t]he review should provide a clear delineation between expected human and system roles.” And fourth, “optimally, the review should occur at three points in time.” Those points are: “when the proposal is made to transition a weapon from research to development”; before the weapon is fielded; and, after fielding, “based upon feedback on how the weapon is functioning.” The latter “would necessitate the establishment of a clear feedback loop which provides information from the developer to the reviewer to the user, and back again.”

Weapons: Grounds for Unlawfulness

Under IHL, a weapon or its use may be considered unlawful under two sets of circumstances.[357] First, the weapon may be considered unlawful per se (in and of itself), either because the weapon has been expressly prohibited in applicable international law or because the weapon is not capable of being used in a manner that comports with IHL. Second, the weapon may be considered unlawful based on a particular use. In that case, only the unlawful use of the weapon, not the weapon itself, would be illegal.

Weapons: Unlawful Per Se Due to Applicable Prohibition

A number of IHL treaties prohibit or restrict the use of certain weapons. The prohibitions in IHL treaties concerning specific weapons that might be relevant to war algorithms or AWS (or both) include:

  • Pursuant to the Hague Convention on the Laying of Automatic Submarine Contact Mines (1907 Hague Convention VIII),[358] it is prohibited:
    • to lay unanchored automatic contact mines, except when they are so constructed as to become harmless one hour at most after the person who laid them ceases to control them;[359]
    • to lay anchored automatic contact mines that do not become harmless as soon as they have broken loose from their moorings, and to use torpedoes that do not become harmless when they have missed their mark;[360] and
    • to lay automatic contact mines off the coast and ports of the enemy with the sole object of intercepting commercial shipping.[361]
  • The Convention on the Prohibition of Military or any other Hostile Use of Environmental Modification Techniques (1977)[362] prohibits, among other things, military or other hostile use of environmental modification techniques if these would have widespread, long-lasting, or severe effects as the means of destruction, damage, or injury to another state party.[363]
  • The Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons which may be Deemed to be Excessively Injurious or to have Indiscriminate Effects (1980)[364] “facilitates the negotiation of protocols which can address particular weapons or types of weapon technology.”[365] Under the aegis of the CCW, the following weapons prohibitions, among others, have been adopted:
    • Pursuant to the Protocol on Non-Detectable Fragments (Protocol I, 1980),[366] it is prohibited to use any weapon “the primary effect of which is to injure by fragments which in the human body escape detection by x-rays”;[367]
    • Pursuant to the Protocol on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and other Devices (Protocol II, as amended, 1996),[368] it is prohibited to use booby-traps in the form of apparently harmless portable objects specifically designed and constructed to contain explosive material and to detonate when they are disturbed or approached[369] (note that the U.S. DoD Law of War Manual states that “to the extent a weapon system with autonomous functions falls within the definition of a ‘mine’ in the CCW Amended Mines Protocol, it would be regulated as such.”[370]);
    • Pursuant to the Protocol on Prohibitions or Restrictions on the Use of Incendiary Weapons (Protocol III, 1980),[371] it is prohibited to make any military objective located within a concentration of civilians the object of attack by air-delivered incendiary weapons;[372]
    • Pursuant to the Protocol on Blinding Laser Weapons (Protocol IV, 1995),[373] it is prohibited to employ laser weapons specifically designed, as their sole combat function or as one of their combat functions, to cause permanent blindness to unenhanced vision, that is, to the naked eye or to the eye with corrective eyesight devices.[374]
  • The Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on Their Destruction (1997)[375] prohibits the use, development, production, acquisition, stockpiling, retention, or transfer of anti-personnel landmines and provides for their destruction.[376]
  • The Biological Weapons Convention (1972)[377] prohibits the development, production, stockpiling, acquisition, or retention of microbial or other biological agents or toxins where the types or quantities are such that there is no justification for prophylactic, protective, or other peaceful purposes.
  • The Chemical Weapons Convention (1993)[378] prohibits the development, production, acquisition, stockpiling, retention, direct or indirect transfer, or use of chemical weapons, preparing for their use or assisting, encouraging, or inducing any person to do any of these things.
  • The Convention on Cluster Munitions (2008)[379] prohibits the use, development, production, acquisition, stockpiling, retention, and direct or indirect transfer of cluster munitions and forbids assistance, encouragement, or inducement of any of these activities.[380]

As noted above, whether AWS (however defined) should be the subject of a preemptive prohibition remains an area of discussion and debate. As of August 2016, sixteen states had stated that there is a need for a ban on fully autonomous weapons or had made statements indicating that they are favorably disposed toward the idea.[381]

Some advocates of a preemptive ban have pointed to the development of the Protocol on Blinding Laser Weapons (CCW Protocol IV) as a relevant precedent. However, commentators have noted a number of distinguishing factors between permanently blinding lasers and AWS. The combined analyses of two scholars suggest that, in general, a weapons ban is more likely to be successful where:

  • The weapon is ineffective;
  • Other means exist for accomplishing a similar military objective;
  • The weapon is not novel: it is easily analogized to other weapons, and its usages and effects are well understood;
  • The weapon or similar weapons have been previously regulated;
  • The weapon is unlikely to cause social or military disruption;
  • The weapon has not already been integrated into a state’s armed forces;
  • The weapon causes superfluous injury or suffering in relation to prevailing standards of medical care;
  • The weapon is inherently indiscriminate;
  • The weapon is or is perceived to be sufficiently notorious to galvanize public concern and spur civil society activism;
  • There is sufficient state commitment in enacting regulations;
  • The scope of the ban is clear and narrowly tailored; or
  • Violations can be identified.[382]

According to one of those scholars, “[o]f these, only a single factor – civil society engagement – supports the likelihood of a successful ban on autonomous weapon systems; the others are irrelevant, inconclusive, or imply that autonomous weapon systems will resist regulation.”[383] The extent to which states agree or disagree with these arguments seems likely to shape whether states will take more concrete steps towards a preemptive ban concerning AWS.

Weapons: Unlawful Per Se — Of a Nature to Cause Superfluous Injury or Unnecessary Suffering

Pursuant to Article 35(2) of AP I, “[i]t is prohibited to employ weapons, projectiles and material and methods of warfare of a nature to cause superfluous injury or unnecessary suffering.”[384] According to Bill Boothby, “[t]his is now a customary rule of law that binds all States in all types of armed conflict.”[385] Accordingly, to be lawful, a war algorithm must not be of a nature to cause superfluous injury or unnecessary suffering.

Weapons: Unlawful Per Se — Indiscriminate by Nature

In addition to the customary superfluous-injury principle, “[t]he second, equally important, customary weapons law principle holds that weapons that are indiscriminate by nature are prohibited.”[386] The principle is derived in part from Article 51(4) of AP I. That provision prohibits indiscriminate attacks, defined to include attacks “which employ a method or means of combat which cannot be directed at a specific military objective; or … which employ a method or means of combat the effects of which cannot be limited” as required by AP I and which consequently are of a nature to strike military objectives and civilians or civilian objects without distinction.[387] Thus, according to Switzerland, “in order for an AWS to be lawful under this rule [prohibiting indiscriminate-by-nature weapons], it must be possible to ensure that its operation will not result in unlawful outcomes with respect to the principle of distinction.”[388]

Weapons: Unlawful by Use — Failure to Conform to Principles Governing Conduct of Hostilities

As noted above, a weapon that is not unlawful per se may nonetheless be considered unlawful based on a particular use. In such a case, only the unlawful use of the weapon, not the weapon itself, would be illegal. To avoid contravening IHL, in an armed conflict a direct attack using a weapon that is not unlawful per se must comport with IHL principles governing the conduct of hostilities.

The three such principles most frequently cited in discussions of AWS are distinction, proportionality, and precautionary measures. Each of these principles has IHL treaty roots and customary cognates. According to Switzerland, the basic guidelines in relation to AWS are as follows:

Most notably, in order to lawfully use an AWS for the purpose of attack, belligerents must: (1 - Distinction) distinguish between military objectives and civilians or civilian objects and, in case of doubt, presume civilian status; (2 - Proportionality) evaluate whether the incidental harm likely to be inflicted on the civilian population or civilian objects would be excessive in relation to the concrete and direct military advantage anticipated from that particular attack; (3 - Precaution) take all feasible precautions to avoid, and in any event minimize, incidental harm to civilians and damage to civilian objects; and cancel or suspend the attack if it becomes apparent that the target is not a military objective, or that the attack may be expected to result in excessive incidental harm.[389]

With respect to the principle of proportionality and AWS, the U.S. DoD Law of War Manual states that “in the situation in which a person is using a weapon that selects and engages targets autonomously, that person must refrain from using that weapon where it is expected to result in incidental harm that is excessive in relation to the concrete and direct military advantage expected to be gained.”[390]

Regarding precautions in attack, the wording of Article 57(2) of AP I raises the question of whether some of the precautionary-measures obligations laid down therein may be carried out, as a matter of treaty law, only by humans (compared with other obligations therein, which are reposed in the party to the armed conflict). Consider how Article 57(2)(a) of AP I lays down obligations of “those who plan or decide upon an attack.”[391] But Article 57(2)(b)–(c) of AP I frames the obligations, respectively, as “an attack shall be cancelled or suspended”[392] and “effective advance warning shall be given.”[393]

For their part, the authors of the U.S. DoD Law of War Manual emphasize their view that “[t]he law of war rules on conducting attacks (such as the rules relating to discrimination and proportionality) impose obligations on persons. These rules do not impose obligations on the weapons themselves; of course, an inanimate object could not assume an ‘obligation’ in any event.”[394] According to this view, “the obligation on the person using the weapon to take feasible precautions in order to reduce the risk of civilian casualties may be more significant when the person uses weapon systems with more sophisticated autonomous functions.”[395] As an example, the Manual authors state that “such feasible precautions a person is obligated to take may include monitoring the operation of the weapon system or programming or building mechanisms for the weapon to deactivate automatically after a certain period of time.”[396]

The UK MoD Joint Doctrine Note on unmanned aircraft systems discusses the obligations laid down in Additional Protocol I on the constant care that must be “taken in the conduct of military operations to spare civilians and civilian objects. This means that any system, before an attack is made, must verify that targets are military entities, take all feasible precautions to minimise civilian losses and ensure that attacks do not cause disproportionate incidental losses.”[397] The Joint Doctrine Note authors state that “[f]or automated systems, operating in anything other than the simplest of scenarios, this process will provide a severe technological challenge for some years to come.”[398]

While not focusing on AWS in particular, the UK MoD Joint Doctrine Note also addresses a situation where “a mission may require an unmanned aircraft to carry out surveillance or monitoring of a given area, looking for a particular target type, before reporting contacts to a supervisor when found.”[399] According to the Joint Doctrine Note authors, “[a] human-authorised subsequent attack would be no different to that by a manned aircraft and would be fully compliant with the LOAC [law of armed conflict], provided the human believed that, based on the information available, the attack met LOAC requirements and extant ROE [rules of engagement].”[400] The Joint Doctrine Note authors elaborate this line of reasoning, noting that, “[f]rom this position, it would be only a small technical step to enable an unmanned aircraft to fire a weapon based solely on its own sensors, or shared information, and without recourse to higher, human authority.”[401] This would be entirely legal, the Joint Doctrine Note concludes, “[p]rovided it could be shown that the controlling system appropriately assessed the LOAC principles (military necessity; humanity; distinction and proportionality) and that ROE were satisfied….”[402] Yet the authors highlight a number of additional factors to consider:

In practice, such operations would present a considerable technological challenge and the software testing and certification for such a system would be extremely expensive as well as time consuming. Meeting the requirement for proportionality and distinction would be particularly problematic, as both of these areas are likely to contain elements of ambiguity requiring sophisticated judgement. Such problems are particularly difficult for a machine to solve and would likely require some form of artificial intelligence to be successful.[403]

Finally, in this connection, the Joint Doctrine Note states that “the MOD currently has no intention to develop systems that operate without human intervention in the weapon command and control chain, but it is looking to increase levels of automation where this will make systems more effective.”[404]

According to the U.S. DoD Law of War Manual, “in many cases, the use of autonomy could enhance the way law of war principles are implemented in military operations. For example, some munitions have homing functions that enable the user to strike military objectives with greater discrimination and less risk of incidental harm.”[405] The Manual authors also note that “some munitions have mechanisms to self-deactivate or to self-destruct, which helps reduce the risk they may pose generally to the civilian population or after the munitions have served their military purpose.”[406]

In a similar connection, the UK MoD Joint Doctrine Note on unmanned aircraft systems states that “[s]ome fully automated weapon systems have already entered service, following legal review, and contributing factors – such as required timeliness of response – can make compliance with LOAC easier to demonstrate.”[407] The authors give an example of “the Phalanx and Counter-Rocket, Artillery and Mortar (C-RAM) systems that are already employed in Afghanistan,” arguing that “it can be clearly shown that there is insufficient time for a human initiated response to counter incoming fire.”[408] According to this view, “[t]he potential damage caused by not using C-RAM in its automatic mode justifies the level of any anticipated collateral damage.”[409]

Other potentially relevant conduct-of-hostilities considerations raised in relation to AWS include the prohibition on the denial of quarter and the protection of persons hors de combat (such as the wounded and sick). For instance, in relation to denial of quarter, in the view of Switzerland, “[a]ny reliance on AWS would need to preserve a reasonable possibility for adversaries to surrender. A general denial of this possibility would violate the prohibition of ordering that there shall be no survivors or of conducting hostilities on this basis (denial of quarter).”[410]

Stepping back, we see that, where a war algorithm is capable of being used in relation to the conduct of hostilities in connection with an armed conflict, that possible use is already regulated by a number of IHL rules and principles. Few states, however, have offered detailed views on what implications may arise for such uses of war algorithms.

Other Functions in Relation to Armed Conflict

IHL governs far more than just weapons and the conduct of hostilities. As the primary normative framework regulating armed conflict, IHL also lays down rules concerning such activities as capture, detention, and transfer of enemies; medical care to the wounded and sick hors de combat; and humanitarian access and assistance to civilian populations in need. Switzerland has noted, for instance, that it is conceivable that AWS “could be used to perform other tasks governed by IHL, such as the guarding and transport of persons deprived of their liberty or tasks related to crowd control and public security in occupied territories.”[411]

Martens Clause

With respect to AWS, the IHL “Martens clause” would, according to Switzerland, afford “an important fallback protection in as much as the ‘laws of humanity and the requirements of the public conscience’ need to be referred to if IHL is not sufficiently precise or rigorous.”[412] Pursuant to this line of reasoning, “not everything that is not explicitly prohibited can be said to be legal if it would run counter [to] the principles put forward in the Martens clause. Indeed, the Martens clause may be said to imply positive obligations where contemplated military action would result in untenable humanitarian consequences.”[413]

Seizure of Private Property Susceptible of Direct Military Use

In a situation of belligerent occupation (a type of international armed conflict), the Occupying Power may seize, among other things, “all kinds of munitions of war … even if they belong to private persons.”[414] Items so seized “must be restored and compensation fixed when peace is made.”[415] With respect to AWS, this provision may implicate, for example, the private property—including the software and hardware components involved in developing AWS—of individuals or commercial entities subject to a belligerent occupation.[416]

International Criminal Law

International criminal law (ICL) is a framework through which individual responsibility arises for international crimes. Under certain circumstances, the design, development, or use (or a combination thereof) of a war algorithm may form part of the conduct underlying an international crime. Recognized categories of international crimes include war crimes, genocide, and crimes against humanity. Each international crime is made up of a prohibited act or acts (the actus reus or actus rei) and the prohibited mental state (the mens rea). War crimes may arise only in relation to armed conflict. Genocide and crimes against humanity may arise outside of situations of armed conflict (though they often do in fact arise in relation to armed conflict). Here, we focus on the Statute of the International Criminal Court (ICC),[417] though we note that other ICL rules—those derived from applicable treaties or customary international law—also may be relevant.

Various states and commentators disagree on whether ICL, especially in relation to war crimes, sufficiently addresses the design, development, and use of AWS. The discussion is hampered by lack of agreement on the definition of AWS, on the technological capabilities of AWS, and on the nature of the relationship between the various actors involved in the development and operation of AWS. These disagreements implicate underlying legal concepts of attribution, control, foreseeability, and reconstructability.

Much of the debate on AWS in relation to ICL revolves around modes of responsibility for international crimes and the mental element of international crimes.[418] Those arguing that ICL is sufficient to address AWS concerns typically emphasize that, ultimately, a single person—often, the commander or superior—may and should be held responsible where, in connection with an armed conflict, the design, development, or use of an AWS gives rise to an international crime.[419] Those arguing that ICL may not be sufficient typically emphasize that the ICL modes of command and superior responsibility are predicated on relationships between humans, not on relationships between humans and machines or constructed systems. (The ICC Statute establishes jurisdiction for individual responsibility only over natural persons, thereby excluding legal entities such as corporations.) They also note that it might not be possible, due to a lack of a temporal nexus to an armed conflict, to prosecute a developer who, before the war began, coded an AWS to function in a way that later gives rise to a war crime.[420] Critics also argue that due to the distributed nature of technical and physical control over an operation involving an AWS, it may not be possible to establish the relevant intent and knowledge of a particular perpetrator. Or, they assert, even if it is possible to establish the mental element, a perpetrator may argue to exclude criminal responsibility due to a mistake of fact, given how complex the operation of an AWS may be.

Arms-Transfer Law

The Arms Trade Treaty of 2013 (ATT)[421] may implicate war algorithms that form part of the conventional arms and certain other items covered by that instrument. It may do so not only with respect to exporting and importing states parties but also in connection with trans-shipment states parties.

The ATT regulates certain activities of the international trade in arms—in particular, “export, import, transit, trans-shipment and brokering,” all of which fall under the umbrella term of “transfer.”[422] Many of the arms and related items covered by the treaty already use war algorithms. In relation to states parties, the treaty applies in respect of all conventional arms within eight categories: battle tanks, armored combat vehicles, large-caliber artillery systems, combat aircraft, attack helicopters, warships, missiles and missile launchers, and small arms and light weapons.[423] The ATT also regulates the export of “ammunition/munitions fired, launched or delivered by”[424] such conventional weapons, as well as of “parts and components where the export is in a form that provides the capability to assemble the [relevant] conventional arms.”[425] (The ATT expressly does “not apply to the international movement of conventional arms by, or on behalf of, a State Party for its use provided that the conventional arms remain under that State Party’s ownership.”[426])

As part of the regulatory system established by the ATT, a state party is prohibited from authorizing any transfer of conventional arms or other covered items in three situations. First, the state party may not authorize such a transfer if it “would violate its obligations under measures adopted by the United Nations Security Council acting under Chapter VII of the Charter of the United Nations, in particular arms embargoes.”[427] Second, an authorization is prohibited if the transfer “would violate its relevant international obligations under international agreements to which it is a Party, in particular those relating to the transfer of, or illicit trafficking in, conventional arms.”[428] And third, an authorization is prohibited if the state party “has knowledge at the time of authorization that the arms or items would be used in the commission of … grave breaches of the Geneva Conventions of 1949, attacks directed against civilian objects or civilians protected as such, or other war crimes as defined by international agreements to which it is a Party.”[429]

Even if the export is not prohibited under one of those stipulations, the ATT imposes an obligation not to authorize the export where the state party determines “that there is an overriding risk of any of the negative consequences” identified in a provision of the treaty.[430] Those consequences include the potential that the conventional arms or other covered items:

(a) would contribute to or undermine peace and security;

(b) could be used to:

(i) commit or facilitate a serious violation of international humanitarian law;

(ii) commit or facilitate a serious violation of international human rights law;

(iii) commit or facilitate an act constituting an offence under international conventions or protocols relating to terrorism to which the exporting State is a Party; or

(iv) commit or facilitate an act constituting an offence under international conventions or protocols relating to transnational organized crime to which the exporting State is a Party.[431]

Also, pursuant to the ATT, each export state party “shall make available appropriate information about the authorization in question, upon request, to the importing State Party and to the transit or trans-shipment States Parties, subject to its national laws, practices or policies.”[432] Finally, each state party “involved in the transfer of conventional arms covered under Article 2 (1) [of the ATT] shall take measures to prevent their diversion.”[433]

The upshot is that, under the ATT, a detailed and somewhat expansive regime exists to regulate the transfer of war algorithms where those algorithms form part of certain conventional weapons and related items.

International Human Rights Law

While IHL traces its roots to the regulation of interstate wars, international human rights law (IHRL) arose out of an attempt to regulate, as a matter of international law and policy, the relationship between the state—through its governmental authority—and its population. Unlike the relatively narrow war-related field of IHL, IHRL spans a seemingly ever-growing range of dealings an individual, community, or nation may have with the state.

In recent decades, the connection between IHL and IHRL has been the subject of increased jurisprudential treatment and interpretation by states. The precise links between the two branches of public international law have also merited extensive academic commentary. The debate on this relationship largely concerns three issues: first, whether IHRL applies extraterritorially, such that states bring all, some, or none of their obligations with them when they fight wars under IHL outside of their territories; second, whether organized armed groups have IHRL obligations (or, at least, responsibilities); and third, what interpretive procedure or principle is apposite when discerning the content of a particular right under the relevant framework(s).

With these considerations in mind, IHRL may impose substantive obligations on a state party to an armed conflict concerning the design, development, or use of a war algorithm. These obligations may concern rights ranging, for instance, from the right to privacy to the right not to be arbitrarily deprived of life. That is, of course, not an exhaustive list, but it demonstrates the wide array of rights under IHRL that a war algorithm might implicate. IHRL might also implicate state obligations in relation to the design, development, and use of war algorithms during times of peace.

Law of the Sea

As illustrated in Section 2, many of the existing weapon systems with autonomous functions operate at sea.[434] A number of provisions of the 1982 U.N. Convention on the Law of the Sea (UNCLOS),[435] “many of which are recognised as stating customary international law, … apply to ships with mounted autonomous weapon systems and possibly to independent seafaring autonomous weapon systems.”[436] Among these are the UNCLOS articles outlining “state obligations to protect and preserve both the marine environment generally and specific areas, such as the seabed and ocean floor,”[437] as well as the general prohibition on the threat of force or the use of force.[438] Furthermore, “[i]n addition to providing that the high seas ‘shall be reserved for peaceful purposes’, UNCLOS sets forth a number of prohibitions applicable to ships equipped with autonomous weapon systems that wish to exercise rights to innocent and transit passage.”[439] Finally, “[w]hile automated and autonomous weapon systems have long been used on warships, future autonomous weapon systems may themselves be warships.” Accordingly, “[s]hould they be granted warship status, such systems would gain certain rights and associated obligations.”[440]

Space Law

Guidance concerning the design, use, and liability of war algorithms in outer space in relation to armed conflict may be found in the 1967 Outer Space Treaty,[441] other space-law treaties, and various U.N. General Assembly declarations.[442] Yet “aside from a few plain prohibitions,” “the ‘ceiling’ of space law regulation is sky high … it allows for a wide range of potential extraterrestrial autonomous weapon systems”[443] and of war algorithms more broadly.

One such prohibition—laid down in the Outer Space Treaty, which may be binding as a codification of customary international law[444]—is on the use of space for destructive purposes. In particular, states parties to the Outer Space Treaty “undertake not to place in orbit around the Earth any objects carrying nuclear weapons or any other kinds of weapons of mass destruction, install such weapons on celestial bodies, or station such weapons in outer space in any other manner.”[445] Other issues raised in this context include jurisdiction and control over objects launched into space, international responsibility for activities in space, and international liability for damage caused by space-based objects.[446]

International Telecommunications Law

Constructed systems that use the electromagnetic spectrum or international telecommunications networks in effectuating war algorithms may be governed in part by telecommunications law. That field is administered primarily through the International Telecommunication Union (ITU).[447] Scholars have already raised AWS in relation to telecommunications law,[448] including with respect to obligations to legislate against certain “harmful interference” and to preserve the secrecy of international correspondence, as well as exceptions concerning certain uses of military radio installations.[449]

[281].  See infra Section 4.

[282]. One field of international law that we do not address but that might merit attention is international trade law, perhaps especially to the extent that it is used as a framework for developing technology-related standards and procedures at the national and international levels.

[283].  See James R. Crawford, State Responsibility, in Max Planck Encyclopedia of Public International Law ¶ 3 (2006).

[284].  See Draft Articles on Responsibility of States for Internationally Wrongful Acts with Commentaries arts. 4–11, Report of the International Law Commission, 53d Sess., Apr. 23-June 1, July 2-Aug. 10, 2001, U.N. Doc. A/56/10, U.N. GAOR 56th Sess., Supp. No. 10 (2001), http://legal.un.org/ilc/texts/instruments/english/commentaries/9_6_2001.pdf [hereinafter Draft Articles].

[285].  Id. at arts. 12–15.

[286].  Id. at arts. 20–25.

[287].  Crawford, supra note 283, at ¶ 3.

[288].  Draft Articles, supra note 284, at art. 4(1).

[289].  Id. at art. 5.

[290].  Id. at art. 6.

[291].  Id. at art. 7.

[292].  Id. at art. 8.

[293].  Id. at art. 9.

[294].  Id. at art. 10(1); see also id. at art. 10(2)–(3).

[295].  Id. at art. 11.

[296].  See Rosalyn Higgins, Problems and Process: International Law and How We Use It 162 (1995).

[297].  Pietro Sullo & Julian Wyatt, War Reparations, in Max Planck Encyclopedia of Public International Law ¶ 5 (2015) (citing to the 2001 International Law Commission Draft Articles on Responsibility of States for Internationally Wrongful Acts (art. 31 and arts. 34–37)).

[298].  Sullo & Wyatt, supra note 297, at ¶ 5.

[299].  See, e.g., Vienna Convention on the Law of Treaties art. 2(1)(a), May 23, 1969, 1155 U.N.T.S. 133; Restatement (Third) of the Foreign Relations Law of the United States § 301(1) (1987).

[300].  Michael Wood (Special Rapporteur), Int’l Law Comm’n, Second Report on Identification of Customary International Law, at 20, U.N. Doc. A/CN.4/672 (2014), http://daccess-ods.un.org/access.nsf/Get?Open&DS=A/CN.4/672&Lang=E [hereinafter Wood, Second Report]. Though the International Law Commission (ILC) Drafting Committee ultimately did not include this definition in its subsequent report, this exclusion was related to concerns about redundancy, not objections to its content. See Gilberto Saboia (Chairman of the Drafting Committee), Int’l Law Comm’n, Identification of Customary International Law, at 4 (2014), http://legal.un.org/ilc/sessions/66/pdfs/english/dc_chairman_statement_identification_of_custom.pdf.

[301].  Katie King and Joshua Kestin provided extensive research assistance for this section.

[302].  See, e.g., Int’l Law Comm’n, Identification of Customary International Law: Text of the Draft Conclusions Provisionally Adopted by the Drafting Committee, draft conclusion 2, U.N. Doc. A/CN.4/L.869 (2015), https://documents-dds-ny.un.org/doc/UNDOC/LTD/G15/156/93/PDF/G1515693.pdf?OpenElement; Wood, Second Report, supra note 300, at 9, 21–27.

[303].  Int’l Law Comm’n, supra note 302, at draft conclusion 5.

[304].  Wood, Second Report, supra note 300, at 24 (quoting the explanation of various states). See also Michael Wood (Special Rapporteur), Int’l Law Comm’n, Third Report on Identification of Customary International Law, at 13, U.N. Doc. A/CN.4/682 (2015), https://documents-dds-ny.un.org/doc/UNDOC/GEN/N15/088/91/PDF/N1508891.pdf?OpenElement [hereinafter Wood, Third Report]; Int’l Law Comm’n, supra note 302, at draft conclusion 9 (“The requirement, as a constituent element of customary international law, that the general practice be accepted as law (opinio juris) means that the practice in question must be undertaken with a sense of legal right or obligation”).

[305].  See, e.g., id.

[306].  U.N. Office at Geneva, 2010 Meeting of Experts, Disarmament, http://www.unog.ch/80256EE600585943/(httpPages)/701141247B6C85E7C12576F200587847?OpenDocument (last visited March 12, 2016).

[307].  See, e.g., Customary International Humanitarian Law 1338, 3164 (Jean-Marie Henckaerts & Louise Doswald-Beck eds., 2005), https://www.icrc.org/eng/assets/files/other/customary-international-humanitarian-law-ii-icrc-eng.pdf (citing remarks at a meeting of experts as evidence related to state practice on deception and a Colombian Ministry of Foreign Affairs working paper presented at a meeting of experts as evidence of state practice). The same International Committee of the Red Cross (ICRC) study also took statements at CCW conferences as evidence of state practice, both at official States Parties conferences, see, e.g., id. at 1965 (citing China’s remarks about blinding lasers; however, since these remarks were made a year after China adopted the protocol banning blinding lasers and are generally an endorsement of that protocol, it is not clear what added value they have), and in preparatory or implementation gatherings, see, e.g., id. at 1966 (noting India’s statement at the Third Preparatory Committee for the Second Review Conference of States Parties to the CCW that it “fully supported the idea of expanding the scope of the CCW to cover armed internal conflicts”). Even if one is not willing to accept the ICRC’s assessment of what qualifies as state practice, see, e.g., John Bellinger & William Haynes, A U.S. Government Response to the International Committee of the Red Cross Study Customary International Humanitarian Law, 89 Int’l Rev. Red Cross 443, 444–46 (2007), https://www.icrc.org/eng/assets/files/other/irrc_866_bellinger.pdf, international tribunals like the International Criminal Tribunal for the Former Yugoslavia have accepted states’ remarks before the United Nations General Assembly as state practice, see Prosecutor v. Tadić, Case No. IT-94-1-I, Decision on Defence Motion for Interlocutory Appeal on Jurisdiction, para. 120 (Int’l Crim. Trib. for the Former Yugoslavia Oct. 2, 1995), as well as statements before national legislatures, see id. at para. 100. Statements at meetings of experts are similarly public, recorded, and made by state representatives in an official capacity. Further, at least one International Court of Justice judge has also declared that “the positions taken up by the delegates of States in international organizations and conferences…naturally form part of State practice.” Barcelona Traction, Light and Power Company Limited (Belgium v. Spain), Judgment, 3 I.C.J. Rep. 286, para. 302 (Feb. 5, 1970) (Ammoun, J., separate opinion), http://www.icj-cij.org/docket/index.php?p1=3&p2=3&case=50&p3=4. Statements at the Meetings of Experts would fulfill that description.

[308].  See Henrik Meijers, On International Customary Law in the Netherlands, in On the Foundations and Sources of International Law 77, 85 (Ige F. Dekker & Harry H.G. Post eds., 2003) (A “declaration by a state which implies no more than that it is in favor of a proposed rule becoming law, does not contribute to the formation of…custom” because “[i]f one declares to be in favour of something happening in [the] future, that ‘something’ has not yet taken place in the present, and no present practice (relating to that something) can have been formed yet”).

[309].  See, e.g., Statement of Israel, Characteristics of LAWS (Part II), http://www.unog.ch/80256EDD006B8954/(httpAssets)/AB30BF0E02AA39EAC1257E29004769F3/$file/2015_LAWS_MX_Israel_characteristics.pdf (“During the discussions, delegations have made use of various phrases referring to the appropriate degree of human involvement in respect to LAWS. Several States mentioned the phrase ‘meaningful human control’. Several other States did not express support for this phrase. Some of them thought that it was too vague, and the alternative phrasing ‘appropriate levels of human judgment’ was suggested. We have also noted, that even those who did choose to use the phrase ‘meaningful human control’, had different understandings of its meaning. Some of its proponents had in mind human control or oversight of each targeting action in real-time, while others thought that, at least from a perspective of ensuring compliance with IHL, the preset by a human of certain limitations on the way a lethal autonomous system would operate, may also amount to meaningful human control. In our view, it is safe to assume that human judgment will be an integral part of any process to introduce LAWS, and will be applied throughout the various phases of research, development, programming, testing, review, approval, and decision to employ them. LAWS will not actually be making decisions or exercising judgment by themselves, but will operate as designed and programmed by humans”).

[310].  See Appendices I and II.

[311].  North Sea Continental Shelf Cases (Germany v. Denmark; Germany v. Netherlands), Judgment, 1969 I.C.J. Rep. 3, para. 73 (Feb. 1969) (“State practice, including that of States whose interests are specially affected, should have been both extensive and virtually uniform in the sense of the provision invoked;—and should moreover have occurred in such a way as to show a general recognition that a rule of law or legal obligation is involved”).

[312].  Wood, Second Report, supra note 300, at 38–39 (internal citations omitted).

[313].  See, e.g., Yoram Dinstein, The Interaction between Customary International Law and Treaties 288–89 (2007).

[314].  See, e.g., Ward Ferdinandusse, Book Review, 53 Netherlands Int’l L. Rev. 502, 504 (2006) (“it may be asked whether there are specially affected states in IHL at all. It is easy to see how the concept of specially affected states is useful when discussing delimitation of the continental shelf: some states have a continental shelf to delimit while other states do not and, one may assume, never will. There is an aspect of permanency there which is lacking in IHL. Belligerent states, one may hope, are the peace makers of tomorrow. Occupied states may be the occupiers of tomorrow. Customary rules develop slowly and should be stable enough to withstand such changing of positions. Moreover, one would think that it is irreconcilable with the very character of IHL to grant specially affected status to the manufacturers of certain dubious weapons, just as it would have been problematic at least to grant South-Africa specially affected status with regard to the question of apartheid”). See also Richard Price, Emerging Customary Norms and Anti-Personnel Landmines, in The Politics of International Law 106, 120–21 (Christian Reus-Smit ed., 2004); Jean-Marie Henckaerts, Customary International Humanitarian Law: Taking Stock of the ICRC Study, 78 Nordic J. Int’l L. 435, 446 (2010).

[315].  Harry H.G. Post, The Role of State Practice in the Formation of Customary International Humanitarian Law, in On the Foundations and Sources of International Law 129, 142 (Ige F. Dekker & Harry H.G. Post eds., 2003). See also Dinstein, supra note 313, at 293; Customary International Humanitarian Law, supra note 307, at xliv–xlv (“Concerning the question of the legality of the use of blinding laser weapons, for example, ‘specially affected States’ include those identified as having been in the process of developing such weapons”). Cf. H.W.A. Thirlway, International Customary Law and Codification: An Examination of the Continuing Role of Custom in the Present Period of Codification of International Law 71–72 (stating that, in relation to laws for outer space, specially affected states would be those “actually or potentially in control of the economic and scientific assets necessary for the exploration of space,” and that it might even be unnecessary to look beyond those states to determine the relevant state practice).

[316].  See Kenneth Anderson & Matthew Waxman, Law and Ethics for Autonomous Weapon Systems: Why a Ban Won’t Work and How the Laws of War Can, American University Washington College of Law Research Paper No. 2013-11, at 1 (2013), http://ssrn.com/abstract=2250126.

[317].  At least in 2015, Germany did somewhat differentiate itself, drawing a “red line” about the need for meaningful human control and calling for states to “take care to closely monitor the development and introduction of any new weapon system to guarantee that there will be no transgression.”

[318].  This sort of argument would not be too far removed from some states’ claims before the International Court of Justice (ICJ) that the potentially world-affecting damage nuclear weapons could create should mean that all states qualify as specially affected. See Hugh Thirlway, The Sources of International Law, in International Law 91, 99 (Malcolm D. Evans ed., 2014). The ICJ did not weigh in on the validity of this claim. Still, if anything, the sort of argument outlined above would be less extreme than the nuclear-weapons claim, since, it seems, AWS might be capable of being more geographically limited than nuclear weapons. That argument would nevertheless rely on states believing that they could accurately predict where AWS would be used, if the customary law were to precede their development.

[319].  When states advocate the need to regulate AWS, the need for meaningful human control, or the need for an Article 36 review, they are not necessarily suggesting that any of these steps, on its own, would adequately address the issues presented by autonomous weapons. Rather, states have often presented these actions as necessary but not sufficient steps toward dealing effectively with AWS. Additionally, this table is not intended to represent, and does not necessarily represent, a comprehensive, accurate list of all states’ current positions on AWS. One reason is that the table reflects states’ positions as assessed through both the 2015 and 2016 meetings; a state’s position could have changed between 2015 and 2016, but both the 2015 and 2016 positions would be listed here. Also, the table generally excludes states’ remarks outside of the written statements they offered at these two meetings. There are several exceptions, which are noted in footnotes.

[320].  In this context, Austria concludes only that the technology as it currently stands is unlawful; though concerned about future versions also being unlawful, Austria does not categorically state that lawfulness would be impossible.

[321].  Chile’s position on this issue is slightly ambiguous. Some of its statements clearly indicate that it believes that fully autonomous weapons are unlawful, but others seem to suggest that those weapons should simply be regulated. (This raises the question whether Chile believes that AWS would become lawful if their use were simply regulated.)

[322].  In this context, Germany never explicitly uses the word “unlawful.” Nevertheless, Germany has given strong indications that it considers the use of lethal force by fully autonomous weapon systems to be illegitimate. Not only does Germany explicitly state that it is “not acceptable” for a weapon system to have control over life and death, but Germany also portrays its current stance as a repetition of the stance it took at last year’s meeting, where Germany stated that it considered AWS to be unlawful.

[323].  In this context, Poland indicated only that a fully autonomous weapon system would not be allowed, while taking care to note its belief that such weapon systems do not yet exist. Poland therefore does not consider any autonomous weapon system, as it currently exists, to be unlawful. But its Human Rights and Ethical Issues Statement does suggest that if a fully autonomous weapon system were developed in the future, it would “not be allowed.” (As with Germany, Poland does not explicitly use the word “unlawful,” though its statement that fully autonomous weapon systems would “not be allowed” seems to suggest that such systems would indeed be illegal.)

[324].  Scholarly debates about AWS are often framed as a choice between regulation and a ban. When states at the 2015 and 2016 CCW Informal Meetings of Experts discussed regulation, however, it is not clear that they were implying that regulation was preferable to a ban; often, those endorsing regulation seemed to conceive of it in contrast to doing nothing, not in contrast to a ban.

[325].  The Holy See has also spoken in favor of a ban (for example, in a written statement for the 2015 CCW Meeting of Experts). However, as it is not a state, see Gerd Westdickenberg, Holy See, in Max Planck Encyclopedia of Public International Law (James R. Crawford ed., 2006) (“The Holy See is neither a State nor only an abstract entity like an international organization….The international personality the Holy See enjoys as a unique entity and the sovereignty it exercises are different from those of other subjects of international law, be it States, international organizations like the International Committee of the Red Cross (ICRC), or [other] subject[s] of international law…[Its] international legal personality can best be defined as being ‘sui generis’”), the Holy See has not been included in this table or any of the ones that follow in Appendices I and II.

[326].  Bolivia did not express its desire for a ban via a written statement at the 2015 or 2016 CCW Meeting of Experts, but it did reportedly offer an oral statement favoring a ban at the 2015 CCW Meeting of Experts. See Campaign to Stop Killer Robots, Report on Activities: Convention on Conventional Weapons Second Informal Meeting of Experts on Lethal Autonomous Weapons Systems 25 (2015), http://www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_CCWx2015_Report_4June2015_uploaded.pdf (“Bolivia made a late statement—its first on the matter—that called for a ban on fully autonomous weapons systems, citing concerns that the right to life should not be delegated and doubts that international humanitarian and human rights law is sufficient to deal with the challenges posed”). Bolivia’s position has been included here to more fully represent states’ attitudes on an important issue.

[327].  In 2015, Croatia did not necessarily endorse a ban on all AWS but seemed to at least indicate it would be favorably inclined toward efforts to ban any AWS that did not involve “meaningful human control”; Croatia also repeatedly indicated that the option of a ban or moratorium should still be on the table. See Appendices I and II for more.

[328].  At the 2015 or 2016 CCW Meeting of Experts, Egypt did not express its desire for a ban via a written statement. It has, however, orally indicated a preference for a moratorium on the development of AWS until more debate has occurred. See Appendices I and II for more. Egypt’s position has been included here to more fully represent states’ attitudes on an important issue.

[329].  At the 2015 or 2016 CCW Meeting of Experts, Mexico did not express its desire for a ban via a written statement. It did, however, orally indicate a preference for a ban during the 2016 meeting. See Appendices I and II for more. Mexico’s position has been included here to more fully represent states’ attitudes on an important issue.

[330].  At the 2015 or 2016 CCW Meeting of Experts, Nicaragua did not express its desire for a ban via a written statement. It did, however, orally indicate a preference for a ban during the 2016 meeting. See Appendices I and II for more. Nicaragua’s position has been included here to more fully represent states’ attitudes on an important issue.

[331].  Sierra Leone did not explicitly call for a ban but is seemingly against any AWS not under human control. See Appendices I and II for more.

[332].  At the 2015 or 2016 CCW Meeting of Experts, Palestine did not express its desire for a ban via a written statement (it did offer a written statement for the 2015 meeting, but it is not available online, and no press reports cite that 2015 statement as announcing Palestine favored a ban). Palestine did, however, orally indicate a preference for a ban during the 2015 CCW meeting (not the Meeting of Experts). See Appendices I and II for more. Palestine’s position has been included here to more fully represent states’ attitudes on an important issue.

[333].  Zambia believes a prohibition on the use of AWS should be “on the CCW agenda.” See Appendices I and II for more.

[334].  At the 2015 or 2016 CCW Meeting of Experts, Zimbabwe did not express its desire for a ban via a written statement. It did, however, orally indicate a preference for a ban during the 2016 CCW meeting (not the Meeting of Experts). See Appendices I and II for more. Zimbabwe’s position has been included here to more fully represent states’ attitudes on an important issue.

[335].  Other states spoke about the importance of proper national review but did not necessarily frame it in terms of an international legal obligation or, more specifically, an obligation derived from Article 36 of AP I.

[336].  South Africa’s position on Article 36 is somewhat ambiguous. South Africa does not explicitly state that an Article 36 review is necessary, nor does South Africa discuss how it would plan to implement it. But South Africa’s General Statement directly quotes the language of Article 36 when discussing compliance with international law, strongly implying that an Article 36 review is important or relevant to assessing the legality of AWS.

[337].  This conclusion aligns with the statement in the U.S. DoD Law of War Manual that “[t]he law of war does not prohibit the use of autonomy in weapon systems.” Law of War Manual, supra note 110, at § 6.5.9; see also id. at § 6.9.5.2 (“The law of war does not specifically prohibit or restrict the use of autonomy to aid in the operation of weapons”).

[338].  On Security Council authorizations and self-defense, see, e.g., Oliver Dörr, Use of Force, Prohibition of, in Max Planck Encyclopedia of Public International Law ¶¶ 38, 40–42 (2015).

[339].  See Markus Wagner, Autonomous Weapon Systems, in Max Planck Encyclopedia of Public International Law ¶ 11 (2016) (arguing that “[w]hether a breach of a rule of ius ad bellum has occurred is a determination that is independent from the type of weapon that has been used….”).

[340].  Id.

[341].  Id.

[342].  Id.

[343].  See, e.g., Dörr, supra note 338, at ¶ 12.

[344].  Id. (citations omitted).

[345].  See, e.g., Jann Kleffner, supra note 17.

[346].  See, e.g., Michael Bothe, Law of Neutrality, in The Handbook of International Humanitarian Law (Dieter Fleck ed., 3rd ed. 2013).

[347].  See generally the Forum in 42 N.Y.U. J. Int’l Law & Pol. 3, 637 et seq. (2010).

[348].  See Geneva Convention for the Amelioration of the Condition of the Wounded and Sick in Armed Forces in the Field art. 1, Aug. 12, 1949, T.I.A.S. 3362 [hereinafter GC I]; Geneva Convention for the Amelioration of the Condition of Wounded, Sick and Shipwrecked Members of Armed Forces at Sea art. 1, Aug. 12, 1949, T.I.A.S. 3363 [hereinafter GC II]; Geneva Convention Relative to the Treatment of Prisoners of War art. 1, Aug. 12, 1949, T.I.A.S. 3364 [hereinafter GC III]; Geneva Convention Relative to the Protection of Civilian Persons in Time of War art. 1, Aug. 12, 1949, T.I.A.S. 3365 [hereinafter GC IV].

[349].  GC I, supra note 348, at art. 49; GC II, supra note 348, at art. 50; GC III, supra note 348, at art. 129; GC IV, supra note 348, at art. 146.

[350].  On the conflation between weapons and “means and methods of warfare,” at least in the context of Article 36 AP I weapons reviews, see generally Hin-Yan Liu, Categorization and Legality of Autonomous and Remote Weapons Systems, 94 Int’l Rev. Red Cross 627, 636 (2012).

[351].  Id. at 635.

[352].  Id. (citations omitted).

[353].  Id. at 636 (italics added).

[354].  Id. at 637.

[355].  Id.

[356].  Lt. Col. Christopher M. Ford, Stockton Center for the Study of International Law, Remarks at the 2016 Informal Meeting of Experts, at 4, UN Office in Geneva (Apr. 2016), http://www.unog.ch/80256EDD006B8954/(httpAssets)/D4FCD1D20DB21431C1257F9B0050B318/$file/2016_LAWS+MX_presentations_challengestoIHL_fordnotes.pdf; see also U.K. Ministry of Def., supra note 113, at 5-3 (discussing factors concerning legal review and situation awareness of manned vs. unmanned aircraft systems).

[357].  This sub-section on weapons and IHL draws extensively on William H. Boothby, Prohibited Weapons, in Max Planck Encyclopedia of Public International Law (2015).

[358].  Convention No. VIII Relative to the Laying of Automatic Submarine Contact Mines, Oct. 18, 1907, 36 Stat. 2332.

[359].  Id. at art. 1.

[360].  Id.

[361].  Id. at art. 2.

[362].  Convention on the Prohibition of Military or Any Other Hostile Use of Environmental Modification Techniques, May 18, 1977, 31 U.S.T. 333, 1108 U.N.T.S. 15.

[363].  See id. at art. 1. See also AP I, supra note 12, at arts. 35(3) and 55.

[364].  Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Oct. 10, 1980, 1342 U.N.T.S. 137, 19 I.L.M. 1523 [hereinafter CCW].

[365].  Boothby, supra note 357, at ¶ 16.

[366].  Protocol [I to the Convention on Prohibitions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects] on Non-Detectable Fragments, Oct. 10, 1980, 1342 U.N.T.S. 168.

[367].  Id.

[368].  Protocol [II to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects] on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and Other Devices, Oct. 10, 1980, 1342 U.N.T.S. 168.

[369].  Id. at arts. 2–3.

[370].  Law of War Manual, supra note 110, at § 6.5.9.2 (internal reference omitted).

[371].  Protocol [III to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects] on Prohibitions or Restrictions on the Use of Incendiary Weapons art. 2(2), Oct. 10, 1980, 1342 U.N.T.S. 171.

[372].  Id. at art. 2.

[373].  Protocol [IV to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects] on Blinding Laser Weapons art. 1, Oct. 13, 1995, 1380 U.N.T.S. 370.

[374].  Id. at art. 1.

[375].  Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on Their Destruction, Sept. 18, 1997, 2056 U.N.T.S. 211, 242.

[376].  Id. at art. 1.

[377].  Convention on the Prohibition of the Development, Production, and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on Their Destruction, art. 1, Apr. 10, 1972, 1015 U.N.T.S. 163.

[378].  Convention on the Prohibition of the Development, Production, Stockpiling and Use of Chemical Weapons and on their Destruction art. 1, Jan. 13, 1993, 1974 U.N.T.S. 317.

[379].  Convention on Cluster Munitions art. 1, May 30, 2008, 48 I.L.M. 357.

[380].  Id. at art. 1.

[381].  See supra Section 3: International Law pertaining to Armed Conflict — Customary International Law concerning AWS.

[382].  Rebecca Crootof, The Killer Robots Are Here: Legal and Policy Implications, 36 Cardozo L. Rev. 1837 (2014); Sean Watts, Regulation-Tolerant Weapons, Regulation-Resistant Weapons and the Law of War, 91 Int’l L. Stud. 541 (2015).

[383].  Rebecca Crootof, Why the Prohibition on Permanently Blinding Lasers is Poor Precedent for a Ban on Autonomous Weapon Systems, Lawfare (Nov. 24, 2015), https://www.lawfareblog.com/why-prohibition-permanently-blinding-lasers-poor-precedent-ban-autonomous-weapon-systems.

[384].  AP I, supra note 12, at art. 35(2) (emphasis added). See also Regulations Concerning the Laws and Customs of War on Land art. 23(e), annexed to Hague Convention (IV) Respecting the Laws and Customs of War on Land, Oct. 18, 1907, T.S. 539 [hereinafter 1907 Hague Regulations].

[385].  Boothby, supra note 357, at ¶ 10; see also, e.g., Law of War Manual, supra note 110, at § 6.5.9.2 (stating that “[i]n addition, the general rules applicable to all weapons would apply to weapons with autonomous functions. For example, autonomous weapon systems must not be calculated to cause superfluous injury ….”) (internal reference omitted).

[386].  Boothby, supra note 357, at ¶ 11; see also, e.g., Law of War Manual, supra note 110, at § 6.5.9.2 (stating that “[i]n addition, the general rules applicable to all weapons would apply to weapons with autonomous functions. For example, autonomous weapon systems must not … be inherently indiscriminate.”) (internal reference omitted).

[387].  AP I, supra note 12, at art. 51 (emphasis added).

[388].  Swiss, “Compliance-Based” Approach, supra note 74, at 3.

[389].  Id.

[390].  Law of War Manual, supra note 110, at § 6.5.9.3 (internal reference omitted).

[391].  AP I, supra note 12, art. 57(2)(a).

[392].  Id. at art. 57(2)(b).

[393].  Id. at art. 57(2)(c).

[394].  Law of War Manual, supra note 110, at § 6.5.9.3 (italics added).

[395].  Id.

[396].  Id.

[397].  U.K. Ministry of Def., supra note 113, at 5-2.

[398].  Id.

[399].  Id. at 5-4.

[400].  Id.

[401].  Id.

[402].  Id.

[403].  Id.

[404].  Id.

[405].  Law of War Manual, supra note 110, at § 6.5.9.2.

[406].  Id. (internal reference omitted).

[407].  U.K. Ministry of Def., supra note 113, at 5-2.

[408].  Id.

[409].  Id.

[410].  Swiss, “Compliance-Based” Approach, supra note 74, at 3 (citation omitted).

[411].  Id. (citation omitted) (noting that “[a]dditional specific rules need to be taken into consideration if AWS were to be relied for such activities”).

[412].  See, e.g., id. at 4 (citing to CCW, supra note 364, at preamble and AP I, supra note 12, at art. 1(2), and noting that “[i]n its 1996 Advisory Opinion on the legality of the threat or use of nuclear weapons, the International Court of Justice held that the clause ‘proved to be an effective means of addressing the rapid evolution of military technology’ (§78)”).

[413].  Id. at 3 (citing, respectively, AP I, supra note 12, at art. 57(2)(a); GCs I–IV, supra note 348, at arts. 49, 50, 129, 146; and AP I, supra note 12, at Section III).

[414].  1907 Hague Regulations, supra note 384, at art. 53(2).

[415].  Id.

[416].  According to the U.S. DoD Law of War Manual, “[p]rivate property susceptible of direct military use includes cables, telephone and telegraph facilities, radio, television, telecommunications and computer networks and equipment, motor vehicles, railways, railway plants, port facilities, ships in port, barges and other watercraft, airfields, aircraft, depots of arms (whether military or sporting), documents connected with the conflict, all varieties of military equipment (including that in the hands of manufacturers), component parts of, or material suitable only for use in, the foregoing, and, in general, all kinds of war material.” Law of War Manual, supra note 110, at § 11.18.6.2, citing to U.S. Dep’t of the Army, The Law of Land Warfare, 1956 FM 27-10 ¶410a (Change No. 1 1976).

[417].  Rome Statute of the International Criminal Court, July 17, 1998, 2187 U.N.T.S. 90 [hereinafter ICC Statute].

[418].  Id. at arts. 25(3) and 28.

[419].  See, e.g., Dutch Government, Response to AIV/CAVV Report, supra note 22.

[420].  See Tim McFarland & Tim McCormack, Mind the Gap: Can Developers of Autonomous Weapons Systems Be Liable for War Crimes?, 90 Int’l L. Stud. 361 (2014).

[421].  Arms Trade Treaty, Apr. 2, 2013, U.N. Doc. A/RES/67/234B [hereinafter ATT].

[422].  Id. at art. 2(2).

[423].  Id. at art. 1.

[424].  Id. at art. 3.

[425].  Id. at art. 4.

[426].  Id. at art. 2(3).

[427].  Id. at art. 6(a).

[428].  Id. at art. 6(b).

[429].  Id. at art. 6(c).

[430].  Id. at art. 7(3).

[431].  Id. at art. 7(1).

[432].  Id. at art. 7(6).

[433].  Id. at art. 11(1).

[434].  The vast majority of scholars and states addressing AWS in relation to international law focus only on IHL and ICL; Rebecca Crootof has provided one of the most expansive analyses of various fields of public international law that might be implicated by AWS. Rebecca Crootof, The Varied Law of Autonomous Weapon Systems, in NATO Allied Command Transformation, Autonomous Systems: Issues for Defence Policy Makers 98, 109 (Andrew P. Williams & Paul D. Scharre eds., 2015) [hereinafter Crootof, Varied]. With respect to the law of the sea, space law, and international telecommunications law, we draw in part on her analysis.

[435].  United Nations Convention on the Law of the Sea, Dec. 10, 1982, 1833 U.N.T.S. 397 [hereinafter UNCLOS].

[436].  Crootof, Varied, supra note 434, at 109 (citation omitted).

[437].  Id. (citing to UNCLOS, supra note 435, at arts. 192–196).

[438].  Id. (citing to UNCLOS, supra note 435, at art. 301).

[439].  Id. at 110 (citing to UNCLOS, supra note 435, at art. 88).

[440].  Id. at 110 n.41 (referring to the definition of “warship” in UNCLOS, supra note 435, at art. 29).

[441].  Treaty on Principles Governing the Activities of States in the Exploration and Use of Outer Space, Including the Moon and Other Celestial Bodies, Jan. 27, 1967, 18 U.S.T. 2410, 610 U.N.T.S. 205 [hereinafter OST].

[442].  Crootof, Varied, supra note 434, at 111.

[443].  Id.

[444].  Id. (citation omitted).

[445].  OST, supra note 441, at art. IV.

[446].  Crootof, Varied, supra note 434, at 112 (citations omitted).

[447].  See Dietrich Westphal, International Telecommunication Union, in Max Planck Encyclopedia of Public International Law (2014).

[448].  Crootof, Varied, supra note 434, at 113–114.

[449].  Id. at 114 (citation omitted).