Designing compliance

This PhD project examines the role of the private (technology) sector in shaping legal obligations in military targeting through the provision of military AI capabilities to states, particularly decision-support systems. In turn, it considers what the outsourcing of these capabilities means for states’ compliance with their legal obligations under international humanitarian law.

AI, the private sector and state compliance

Open-source intelligence and big data increasingly provide actionable insights for targeting operations by Western armed forces. To this end, the private (technology) sector is developing AI-powered software that collects, collates, and analyses data for military intelligence (AI decision-support systems). While opinions differ on whether such AI tools will improve compliance with international humanitarian law (IHL) or put it at risk, it is often claimed that IHL rules can be embedded into the software’s code, achieving compliance “by design”. At the same time, there are (presently) non-binding “responsible AI” requirements that set standards the technological system must attain, which in turn may affect what IHL rules mean and entail in practice.

This PhD project aims to analyse the private (technology) sector’s role in achieving compliance with this complex and often vague regulatory framework, and to consider it from the perspective of the blurring of the public and private domains in interpreting and implementing the state’s legal obligations in military targeting.

Responsible department

Institutionen för operativ juridik och folkrätt

Ongoing

2024-2029

Published 2025-04-28, updated 2025-04-28