SwissTech Convention Center, Lausanne, Switzerland

Applied Machine Learning Days at EPFL

Track: AI & Humanitarian Action

Session on “Weaponised AI: what are the implications?”

Panelists:

  • Dustin Lewis, Research Director, Harvard Law School Program on International Law and Armed Conflict

  • Helen Toner, Director of Strategy, Center for Security and Emerging Technology, Georgetown University

  • Nadia Marsan, Senior Assistant Legal Adviser, NATO Office of Legal Affairs

  • Neil Davison, Scientific & Policy Adviser, ICRC

Date and location

January 28, 2020, at the SwissTech Convention Center, Lausanne

AI & Humanitarian Action Track Overview

Digitalization and new technologies play a growing role in today’s humanitarian activities. Conflicts are increasingly fragmented and complex, making it difficult for humanitarian organizations to access conflict areas and the vulnerable people affected. Against this background, humanitarian organizations are looking with interest at the possibilities offered by AI and machine learning.

Meanwhile, AI and machine learning are set to change the way wars are fought. Parties to armed conflict are looking to these technologies to enable novel weapons and methods of warfare, such as: increasingly autonomous weapons; new forms of cyber and information warfare; and ‘decision-support’ systems for targeting. It is critical to understand the foreseeable consequences of these developments and to tackle the accompanying legal and ethical questions.

Given the ICRC’s mandate both to protect and assist victims of armed conflict and to promote and strengthen international humanitarian law (the law of war), this full-day track addresses these distinct dimensions in two parts:

  • In the morning session, we will explore a few common challenges through the lens of humanitarian action. The first panel will discuss how privacy challenges differ (or not) in war-torn contexts. The second will address the difficulty a humanitarian organization faces in deciding whether a commercially available solution is adequate for such unique settings. Finally, the third panel will explore how fairness can be addressed when AI-generated predictions are made about the lives of the most vulnerable people.

  • In the afternoon session, we will turn our attention to the use of machine learning in the conduct of warfare itself. The first panel will explore emerging military applications and consider potential implications for civilian protection, compliance with international humanitarian law, and ethical acceptability. Building on these discussions, the second panel will explore what human-centred AI means in practice for armed conflict: how can we preserve meaningful human control and judgment over tasks and decisions that have serious consequences for people’s lives and are governed by specific rules of international humanitarian law?