[New publication] The influence of AI decision-support systems on legal assessments in military targeting

Published 14 December 2023

Photo: U.S. Army photo by Dan Lafontaine, PEO C3T, public domain, via Wikimedia Commons. Software-based applications of AI also have a major impact on military operations.

Taylor Woodcock explores how algorithmic decision-support systems (DSS) in warfare impact legal reasoning in military targeting in a new article for Global Society’s special issue on algorithmic warfare. She concludes that these systems currently lack the capacity to make the contextual, qualitative and value-laden legal judgments required by the proportionality standard under international humanitarian law (IHL).

Reliance on decision-support systems 
Autonomous weapons systems are often the first thing people think of when it comes to military AI. However, software-based applications of AI related to surveillance and decision-making support also have a major impact on military operations. Since the release of OpenAI’s ChatGPT, generative artificial intelligence (AI) has captured the headlines and the imagination of the public and the military. The United States Department of Defense is already experimenting with large language models (LLMs) to use AI in decision-making, sensors, and weapons in military operations.
Decision-support systems would not autonomously engage in the use of force, but they could influence critical decision-making in warfare, such as target selection. In her article, Taylor points out that software-based applications likely face less scrutiny and far fewer obstacles to deployment than autonomous weapons systems. Nonetheless, target selection is a crucial aspect of warfare that can result in collateral damage to civilians and is regulated by IHL.

AI and the principle of proportionality 
IHL requires an assessment of the proportionality of attacks, weighing the expected collateral damage to civilians against the anticipated military advantage. Such an assessment is contextual, qualitative, and value-laden, and Taylor highlights that there are serious questions around how data-driven AI could perform this kind of assessment.

Generating recommendations on the proportionality of a proposed attack through an algorithm even risks compromising human decision-making. The autonomy of the human decision-maker can be influenced by the recommendations of software that is prone to data biases, a lack of explainability, and unreliability in complex environments like the battlefield. For example, AI outputs are based on correlation rather than causation, yet human users may nonetheless rely on them to draw causal conclusions. It is these features, and in particular the need to make value-judgments that weigh human lives against perceived military gain, that render AI unfit to make proportionality assessments in line with the rules of international humanitarian law, concludes Taylor.

Read the full article, available open access in Global Society.

Read more 
[Spring academy] Artificial intelligence and international law 
The Asser Institute’s Spring academy ‘Artificial intelligence and international law’ (22 - 26 April 2024) is an annual interdisciplinary programme offering in-depth perspectives on AI and international law. It addresses fundamental issues at the intersection of theory and practice. Read more.


Could individuals be held responsible if they fail to prevent autonomous weapon systems from carrying out illegal attacks in a conflict? Marta Bo explores the issue of holding people accountable for the actions of autonomous weapon systems during wartime. Read more. 

Berenice Boutin and Taylor Woodcock propose ways to operationalise ‘meaningful human control’ through a legal ‘compliance by design’ approach in ‘Aspects of Realizing (Meaningful) Human Control: A Legal Perspective’, published in R. Geiß and H. Lahmann’s Research Handbook on Warfare and Artificial Intelligence. Read more.

About Taylor Woodcock 
Taylor Woodcock is a researcher in public international law at the Asser Institute, where she is also pursuing her PhD at the University of Amsterdam. She works on the research strand ‘Regulation in the public interest: Disruptive technologies in peace and security’, which addresses regulation to safeguard and promote public interests. It focuses on the development of the international regulatory framework for military applications of disruptive technologies and the arms race in conventional and non-conventional weapons.


Taylor Woodcock LL.M.