[Research paper] In or out of control? Criminal responsibility of programmers of autonomous vehicles and autonomous weapon systems
Published 23 September 2022
Partially automated driver-assist systems were involved in hundreds of crashes in the past year, according to U.S. safety regulators. On top of that, a UN report on the Libyan civil war suggests that autonomous weapons are already in use: a panel of experts found that the STM Kargu drone system may have been involved in an attack in which 32 students were killed.
As incidents with autonomous systems become more frequent, it becomes increasingly urgent to examine who may be held criminally responsible for such incidents.
The problem of many hands is a challenge for attributing criminal responsibility
Developing autonomous technology is a complex process that requires many individuals with specialised skills. For instance, when Google was working on autonomous driving technology, approximately 170 people worked as a team, with expertise ranging from exterior design to manufacturing. This complicates the attribution of criminal responsibility, since incidents can be caused by many different issues and failures along the chain of development. These are often interlinked and can include technological malfunctions, communication errors, or even end-user mistakes. In fact, some manufacturers tend to shift the onus of responsibility onto the end-user who ultimately operates the system.
However, is the end-user really the one solely in control? End-users may not be able to fully understand, foresee, and anticipate the risks and possible failures involved in the use of autonomous vehicles and autonomous weapons. They do not necessarily exercise meaningful human control (MHC).
As Marta Bo points out in her paper, the notion of ‘meaningful human control’ could be useful in these complex incidents to determine when individual criminal responsibility may be attributed to the programmers.
Are programmers in control?
Marta argues that certain crucial decisions about the behaviour and effects of autonomous vehicles and autonomous weapons are made at the programming phase and remain relevant throughout the use of these systems. After all, it is the programmers who partly decide how traffic laws are embedded in autonomous vehicles. Likewise, it is the programmers who partly decide how international humanitarian law (IHL) is implemented in autonomous weapons and how these systems might interact with their environment. The programmers, therefore, might exert a level of meaningful control over a system that persists even when it is operated by an end-user.
This form of control could make the programmers criminally responsible in cases where they could understand and foresee the risk of a crime committed with autonomous systems technology.
A last resort
Marta Bo cautions that ‘criminal responsibility is the last resort measure, and is triggered by serious harm and a culpable mental state’. While criminal responsibility has deterrent value, an over-emphasis on criminalisation could form an obstacle to a technology that has the potential to reduce traffic incidents on the road and (civilian) casualties on the battlefield. According to Bo, other approaches are often better suited to tackling the issue of responsibility for autonomous systems, such as civil liability, product liability, state responsibility for violations of international humanitarian law, and (in some legal systems) corporate criminal responsibility.
Read Marta Bo’s full paper ‘Are programmers in or “out of” control? The individual criminal responsibility of programmers of autonomous weapons and self-driving cars’, in Gless, S. & Whalen-Bridge, H. (eds.), Human-Robot Interaction in Law and its Narratives: Legal Blame, Criminal Law, and Procedure, Cambridge University Press, 2022.
‘Autonomous Weapons and the Responsibility Gap in light of the Mens Rea of the War Crime of Attacking Civilians in the ICC Statute’, Journal of International Criminal Justice, 2021.
‘Meaningful Human Control over Autonomous Weapons: An (International) Criminal Law Account’, Opinio Juris, 18 December 2020.
Lethal Autonomous Weapons: 10 things we want to know
Listen to Marta on the Lethal Autonomous Weapons Podcast by the Graduate Institute Geneva.
Also, listen to Marta’s interview on autonomous weapon systems and accountability on the LAW and the Future of War Podcast.