[Research publication] Retaining human responsibility in the development and use of autonomous weapon systems

Published 24 November 2022


In a report for the Stockholm International Peace Research Institute (SIPRI), Marta Bo, together with Laura Bruun and Vincent Boulanin, tackles how humans can be held responsible for violations of international humanitarian law involving autonomous weapons systems.

As autonomous weapons systems (AWS) become increasingly pervasive, ensuring that humans retain responsibility for their development and use becomes more urgent. Despite this urgency, the question of human responsibility for violations of international humanitarian law (IHL) involving AWS remains underexplored in the overall policy debate on autonomous weapons systems.

Holding states and individuals accountable
International law offers two main frameworks for attributing responsibility for autonomous weapons systems: state responsibility and individual criminal responsibility. The report explores both frameworks to help policymakers understand when to hold states and individuals accountable for IHL violations involving autonomous weapon systems.

It further highlights the specific issues that make it difficult to attribute responsibility when autonomous weapons systems are involved. Providing a comprehensive analysis of these issues helps inform policy measures to uphold respect for IHL and address some of the challenges connected to holding actors legally responsible.  

Key issues for autonomous weapons systems in IHL
The authors identify a few key issues where the policy debate in this area will need to provide answers:

  • The IHL rules and conditions under which the use of AWS would give rise to state responsibility for internationally wrongful acts and to individual criminal responsibility for war crimes
  • Schemes of responsibility among multiple human agents in the chain of development and use of AWS
  • The standards of intent, knowledge, behaviour, and care required of AWS users
  • The ability to trace IHL violations back to potential wrongdoers

As the report points out: ‘fundamental questions about how humans and machines could—and, importantly, should—interact in decisions to use force’ have gained significant attention within the Group of Governmental Experts (GGE) on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems.

While there is agreement that responsibility must remain with humans, since machines cannot be held accountable for violations of IHL, the key issues identified in this report will need to be addressed in order to fully understand how 'human responsibility' should be retained in practice.

Read the full report

[Spring academy] Artificial intelligence and international law | 27-31 March 2023
The Spring academy Artificial intelligence and international law is an annual interdisciplinary programme offering in-depth perspectives on AI and international law. It addresses fundamental issues at the intersection of theory and practice. The programme covers the technical aspects of AI, the philosophy and ethics of AI, human rights in relation to AI, AI in international humanitarian law, AI and international responsibility, and international governance. The spring academy provides an ideal venue to explore these aspects of AI through a short interactive course with plenty of room for discussion with your fellow multidisciplinary participants. Be the first to know when registrations open. Register your interest now.

Read more
Autonomous Weapons and The Responsibility Gap in light of the Mens Rea of the War Crime of Attacking Civilians in the ICC Statute. Marta Bo, Journal of International Criminal Justice, 2021.

In or out of control? Criminal responsibility of programmers of autonomous vehicles and autonomous weapon systems. In a new paper, Asser Institute researcher Marta Bo examines when programmers may be held criminally responsible for harms caused by self-driving cars and autonomous weapons.

Lethal Autonomous Weapons: 10 things we want to know

Listen to Marta on the Lethal Autonomous Weapons Podcast by the Graduate Institute Geneva.

Also, listen to Marta's interview on AWS and accountability for the LAW and the Future of War Podcast.


About Marta Bo
Dr. Marta Bo is a researcher at the Asser Institute, a research fellow at the Graduate Institute of International and Development Studies (Geneva) and an associate senior researcher at SIPRI. She is currently researching criminal responsibility for war crimes committed with autonomous weapon systems (LAWS and War Crimes Project); AI and criminal responsibility; automation biases and mens rea for crimes committed with autonomous or automated systems; and disarmament and criminalisation. Marta is also responsible for the Asser-Cassese Initiative, a long-term capacity-building training project for judiciaries in international and transnational criminal law (ICL and TCL), international humanitarian law (IHL) and human rights law (HRL).

Marta Bo is part of the Asser research strand In the public interest: accountability of the state and the prosecution of crimes, which examines i) the accountability of states, individually and collectively (for instance at the level of the United Nations or the European Union), in light of public interest standards in the context of counter-terrorism; and ii) the prosecution of individuals for international and transnational crimes in the public interest. Moreover, to ensure both the accountability of the state and the prosecution of individuals, this strand also investigates iii) the role of journalists, the (new) media, human rights NGOs and academics in protecting and promoting public interest standards.

Dr Marta Bo LL.M.