Article 36 of Additional Protocol I (1977) to the Geneva Conventions (1949) requires:
“In the study, development, acquisition or adoption of a new weapon, means or
method of warfare, a High Contracting Party is under an obligation to determine
whether its employment would, in some or all circumstances, be prohibited by this
Protocol or by any other rule of international law applicable to the High
Contracting Party.”
The rise of Robotic, Autonomous Systems and Artificial Intelligence (RAS-AI) requires new methods to ensure compliance with the requirements of an Article 36 review of all new weapons, means or methods of warfare.
On 17 May 2021, Trusted Autonomous Systems (TAS) hosted the International Weapons Review (IWR) ‘Law for AI Basics’ course for TAS participants and associated research personnel. IWR’s legal experts introduced international and domestic legal issues relevant to the design and acquisition of AI systems for use by Defence in Australia, and identified legal inputs to ethical AI design in Defence.
The workshop covered Australian legal and ethical compliance requirements for trusted autonomous systems: the Article 36 review process and issues relevant to autonomous systems, the five facets of Ethical AI in Defence (responsibility, governance, trust, law and traceability), and the requirements of the Legal and Ethical Assurance Program Plan (LEAPP). Workshops are available to Australian Defence stakeholders, including Defence industry, government, universities, the ADF and Defence.
Human-machine teaming with RAS-AI will be a key ADF capability in the future. RAS-AI may increase safety for personnel by removing them from high-threat environments; increase the fidelity and speed of human awareness and decision-making; and reduce the cost and risk to manned platforms.
The development of, and investment in, RAS-AI must be informed by ethical and legal considerations and constraints. To achieve this, in February 2021, TAS commenced the Ethics Uplift Program (EUP) to provide immediate and ongoing assistance to TAS participants through consultation, advice and policy development, supported by case analysis, education and enculturation.
The training is designed to enable participants to understand, analyse and evaluate, through case studies, the legal issues and risks relevant to the design and development of trusted autonomous systems. This introductory course is aimed at technical staff responsible for the design and development of AI systems and at managers responsible for oversight of technical staff. IWR, led by Dr Lauren Sanders and Mr Damian Copeland, offers unique expertise in international law relevant to the development of new weapons, means and methods of warfare, including Article 36 weapon review requirements.
Dr Lauren Sanders is a legal practitioner whose doctoral studies were in international criminal law accountability measures, and whose expertise is in the practice of international humanitarian law, including advising on the accreditation and use of new and novel weapons technology. She has over twenty years of military experience and has advised the ADF on the laws applicable to military operations in Iraq and Afghanistan and to domestic terrorism operations.
Damian Copeland is a legal practitioner whose expertise and doctoral studies are in the Article 36 legal review of weapons, specifically weapons and systems enhanced by Artificial Intelligence. He is a weapons law expert with over twenty-five years of military service, including multiple operational deployments during which he gained extensive experience in the application of operational law in support of ADF operations.
Learn more about the range of IWR services at https://internationalweaponsreview.com/