Weapon systems that integrate a growing number of automated or autonomous features raise the question of whether humans will remain in direct, meaningful control of the use of force. While diverse, these systems fall under the catch-all category of autonomous weapons systems (AWS) because they weaponise Artificial Intelligence (AI).
Many states consider applying force without any human control to be unacceptable. But there is less consensus about the various complex forms of human-machine interaction along a spectrum of autonomy, and about the precise point(s) at which human control stops being meaningful. Faced with these questions in the transnational debate at the UN Convention on Certain Conventional Weapons (CCW), states reach different conclusions: some, supported by civil society organizations, advocate introducing new legal norms to prohibit so-called fully autonomous weapons, while others leave the field open in order to preserve their room for manoeuvre.
As discussions drag on with little substantial progress, the operational trend towards including automated and autonomous features in weapon systems continues. A majority of the top 10 arms exporters, including the USA, China, and Russia, are developing or planning to develop some form of AWS. The AutoNorms project (08/2020-07/2025) addresses these uncertainties by providing answers to its main research question: to what extent will AWS shape and transform international norms governing the use of violent force? Answering this question is crucial because norms, defined broadly as understandings of appropriateness that evolve in practices, sustain and shape the international security order. While the rules-based order has been remarkably resilient, it currently finds itself increasingly subject to internal and external challenges. Monitoring changing practices and norms on the use of force will allow us to understand their repercussions for the fundamental character of the international order.
Existing International Relations (IR) research on norms does not enable us to understand the dynamics of this vital process because it does not capture how norms emerge and develop procedurally. The state of the art conceptually connects norms predominantly to international law and limits attention to how norms emerge in deliberative international forums. The AutoNorms project will develop a new theoretical approach for studying the bottom-up process of how norms manifest and develop in practices, understood as patterned ways of doing things in different social contexts. This interdisciplinary approach combines sociology, constructivist IR, and critical legal scholarship to accentuate the constitutive quality of practices as sites of norm emergence and change.
The AutoNorms project pursues three research objectives:
1. To analyse how and under what conditions norms emerge and change in practices.
2. To analyse how understandings of appropriateness concerning the autonomisation of weapon systems' critical functions emerge and evolve across military, transnational political, dual-use, and popular imagination contexts in four countries (China, Japan, Russia, the USA).
3. To investigate how emerging norms on AWS will affect the make-up of the current international security order.