International Weapons Review delivering Law for AI Basics workshops, part of the TAS Ethics Uplift Program

Article 36 of Additional Protocol I (1977) to the Geneva Conventions of 1949 requires:

“In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party”.

The rise of Robotic, Autonomous Systems and Artificial Intelligence (RAS-AI) requires new methods to ensure compliance with the Article 36 review requirement for all new weapons, means or methods of warfare.

On 17 May 2021, Trusted Autonomous Systems (TAS) hosted the International Weapons Review (IWR) ‘Law for AI Basics’ course for TAS participants and associated research personnel. IWR’s legal experts introduced international and domestic legal issues relevant to the design and acquisition of AI systems for use by Defence in Australia and identified legal inputs to ethical AI design in Defence.

The workshop covered Australian legal and ethical compliance requirements for Trusted Autonomous Systems, including Article 36 review processes and issues relevant to autonomous systems, the five facets of ethical AI in Defence (responsibility, governance, trust, law and traceability), and the requirements of the Legal and Ethical Assurance Program Plan (LEAPP). Workshops are available to Australian Defence stakeholders, including defence industry, government, universities, the ADF and the Department of Defence.

Human-machine teaming with RAS-AI will be a key ADF capability in the future. RAS-AI may increase safety for personnel by removing them from high-threat environments; increase the fidelity and speed of human awareness and decision-making; and reduce the cost and risk to manned platforms.

The development of, and investment in, RAS-AI must be informed by ethical and legal considerations and constraints. To achieve this, in February 2021 TAS commenced the Ethics Uplift Program (EUP) to provide immediate and ongoing assistance to TAS participants through consultation, advice and policy development, supported by case analysis, education and enculturation.

The training is designed to enable participants to understand, analyse and evaluate legal issues and risks relevant to the design and development of trusted autonomous systems, using case studies. This introductory course is aimed at technical staff responsible for the design and development of AI systems and at managers responsible for oversight of technical staff. IWR, led by Dr Lauren Sanders and Mr Damian Copeland, offers unique expertise in international law relevant to the development of new weapons, means and methods of warfare, including Article 36 weapon review requirements.

Dr Lauren Sanders is a legal practitioner whose doctoral studies were in international criminal law accountability measures, and whose expertise is in the practice of international humanitarian law, including advising on the accreditation and use of new and novel weapons technology. She has over twenty years of military experience and has advised the ADF on the laws applicable to military operations in Iraq and Afghanistan and to domestic counter-terrorism operations.

Damian Copeland is a legal practitioner whose expertise and doctoral studies are in the Article 36 legal review of weapons, specifically weapons and systems enhanced by Artificial Intelligence. He is a weapons law expert with over twenty-five years’ military service, including multiple operational deployments, where he gained extensive experience in the application of operational law in support of ADF operations.

Learn more about the range of IWR services at https://internationalweaponsreview.com/

New TAS project to develop an Australian Code of Practice for the Design, Construction, Survey and Operation of Autonomous and Remotely Operated Vessels in 2021

By Rachel Horne – Assurance of Autonomy Lead/Director of Autonomy Accreditation – Maritime

Autonomous systems technology offers the ability to increase safety and efficiency while lowering economic and environmental costs. While some level of autonomy has been present in commercial products for a number of years (the basic thermostat or the Roomba, for example), the last five years have seen a rapid acceleration in the capacity and availability of unmanned aerial vehicles, known as drones, and of uncrewed surface and sub-surface vessels, also called autonomous vessels.

For this rapid acceleration to continue, and to ensure this technology can integrate into commercial and defence operations, autonomous systems need to be trusted by government, regulators, operators and the broader community. An integral part of gaining trust is having a clear, well-tailored regulatory framework; consistent assurance requirements and an agreed assurance methodology; and support from the regulator. These same factors also facilitate innovation and promote growth in industry by providing certainty.

Coral AUV. Image by AIMS

New project: Development of an Australian Code of Practice

The NASF-P (National Accreditation Support Facility Pathfinder) team have commenced a number of new projects to address the challenges outlined above. One of these projects addresses the lack of tailored standards for autonomous and remotely operated vessels by developing an Australian Code of Practice for the Design, Construction, Survey and Operation of Autonomous and Remotely Operated Vessels. The Code will represent best practice and is intended to give industry certainty by setting out regulator-acknowledged standards for the design, construction, survey and operation of autonomous and remotely operated vessels. The Code of Practice will be voluntary and will be updated periodically.

This project, led by Maaike Vanderkooi on behalf of TAS, will begin with a review of available codes of practice and standards, for example the Maritime Autonomous Surface Ships (MASS) UK Industry Conduct Principles and Code of Practice and Lloyd’s Register’s Unmanned Marine Systems Code. The project will then develop a draft Australian Code of Practice with input from key stakeholders, which will be released for broader public consultation. The intent is to release a draft Code of Practice by October 2021, available for use by industry and the regulator.

Maaike Vanderkooi has been chosen to lead the project because of her extensive experience in developing regulatory frameworks in the maritime, heavy vehicle and ports arenas, and her experience in developing, reviewing and assessing the impact of commercial vessel standards.

Maaike Vanderkooi

TAS will engage closely with key stakeholders, including the Australian Maritime Safety Authority (AMSA), the Australian Association for Unmanned Systems (AAUS) Maritime Working Group, the Marine Surveyors Association Inc, and the Australasian Institute of Marine Surveyors, throughout this project to ensure the Code of Practice is practical and appropriate for use by Australian industry and the regulator. There will also be opportunities for input by interested parties throughout the project.

Engagement opportunities

  • We are looking for people with direct experience applying current Codes of Practice or Standards to autonomous and remotely operated vessels, to discuss their experience and provide feedback to us in May 2021;
  • We will hold a series of workshops with key stakeholders between May and August 2021; and
  • We will release the draft Code of Practice for public consultation in August 2021, and welcome all thoughts and feedback.

If you would like to contact us in relation to this project, to offer feedback, suggestions, or your assistance, please email us at NASFP@tasdcrc.com.au.

QUT WAM-V in operation at AIMS. Image by AIMS

Other NASF-P projects underway

The NASF-P team have a number of projects underway, including:

  • Preparation of a Body of Knowledge on the assurance and accreditation of autonomous systems;
  • Air domain: development of an acceptable end-to-end process for the design, build, test and evaluation of autonomous detect-and-avoid (DAA) systems for certain types of airspace;
  • Maritime domain: development of a repeatable, regulator-accepted methodology to demonstrate compliance with COLREGS for autonomous and remotely operated vessels; and
  • Preparation of a business case for a new, independent, National Accreditation Support Facility, based in Queensland, that will better connect operators and regulators to facilitate more efficient assurance and accreditation.

The NASF-P team recently worked with Queensland AI Hub, the Australian Institute of Marine Science and AMC Search, supported by Advance Queensland, to deliver a world-first pilot course, ‘Autonomous Marine Systems Fundamentals for Marine Surveyors’. The course, which was created to address the gap in experience with autonomous marine systems amongst the accredited marine surveyor community, had nine participants from around Queensland.

Participants of the pilot course at AIMS, March 2021. Image by TAS

If you would like to find out more about our work, or provide feedback on where you see the key risks and opportunities for the autonomous systems industry in Australia, please contact us at NASFP@tasdcrc.com.au.

Three of the four TAS Research Fellows featured in a University of Queensland blog

Read about the work of three of the Trusted Autonomous Systems Research Fellows here.

Autonomous Marine Systems Fundamentals for Marine Surveyors

Trusted Autonomous Systems and the Queensland AI Hub supported the delivery of a world-first pilot course, ‘Autonomous Marine Systems Fundamentals for Marine Surveyors’, by Australian Maritime College Search (AMC Search). The course ran at the Australian Institute of Marine Science (AIMS) in Townsville on 29 and 30 March 2021.

The course introduced participants to autonomous and unmanned vessel technology, key terminology, operating concepts, and system capabilities, limitations and risks. Participants were qualified, accredited marine surveyors and other professionals, including representatives of the Australian Maritime Safety Authority (AMSA), the Port of Townsville and Maritime Safety Queensland (MSQ), who will benefit from an increased knowledge and understanding of autonomous marine systems.

The course enabled participants to more effectively undertake survey and other assurance and accreditation activities relating to autonomous and unmanned vessel technology. By upskilling ten accredited marine surveyors and professionals, this pilot course will address a gap in the current assurance and accreditation framework for autonomous marine systems.

This pilot course will make it more efficient and effective for Queensland businesses to build, test and certify their autonomous marine systems. It is a key opportunity to utilise the AIMS maritime facility and to draw on the AI expertise of TAS and AMC Search. These learnings may then lead to additional opportunities to accelerate autonomous maritime systems development in Queensland as part of our commitment to the Smart Drone State.

This world-first autonomous systems training for marine surveyors was developed in partnership between Australian Maritime College Search (AMC Search), Trusted Autonomous Systems (TAS), Queensland AI Hub and the Australian Institute of Marine Science (AIMS).

To view or download our media release, click here

A Method for Ethical AI in Defence

Today the Australian Department of Defence released ‘A Method for Ethical AI in Defence’, an outcome of a 2019 workshop attended by over 100 representatives from Defence, other Australian government agencies, industry, academia, international organisations and the media. The workshop was facilitated by the Defence Science & Technology Group, RAAF Plan Jericho and the Trusted Autonomous Systems Defence Cooperative Research Centre. Defence notes that the report outlines a pragmatic ethical methodology for communication between software engineers, integrators and operators during the development and operation of Artificial Intelligence (AI) projects in Defence.

Trusted Autonomous Systems CEO Professor Jason Scholz said, “Trusted Autonomous Systems are very pleased to partner with Defence on this critical issue of ethics in AI. Ethics is a fundamental consideration across the game-changing Projects that TAS are bringing together with Defence, Industry and Research Institutions.”

AI and human-machine teaming will be a key capability in the future of Australian Defence systems. Chief Defence Scientist Tanya Monro notes, “…AI technologies offer many benefits such as saving lives by removing humans from high-threat environments and improving Australian advantage by providing more in-depth and faster situational awareness”.

Air Vice-Marshal Cath Roberts, Head of Air Force Capability, said, “artificial intelligence and human-machine teaming will play a pivotal role for air and space power into the future… We need to ensure that ethical, moral and legal issues are resolved at the same pace as the technology is developed. This paper is useful in suggesting consideration of ethical issues that may arise to ensure responsibility for AI systems within traceable systems of control”. These comments are equally relevant to the other service arms.

In 2019, the Trusted Autonomous Systems Defence CRC (TASDCRC) commenced a six-year Programme on the Ethics and Law of Trusted Autonomous Systems valued at $9M. Over the past two years the activity has conducted workshops, engagements and consultation with participants and stakeholders of the Centre, contributing to ADF strategy, producing diverse publications and influencing the design of trusted autonomous systems such as the game-changing Athena AI ethical and legal decision support system.

From 2021 the Ethics Uplift Program (EUP) of the TASDCRC will offer ongoing assistance to Centre participants through consultation, advice and policy development, supported by case analysis, education and enculturation.

Trusted Autonomous Systems affiliate researchers and employees participate in a wide range of events considering the ethics and law of RAS-AI, including those convened by the ICRC, UNIDIR, SIPRI and NATO.

TASDCRC is a non-government participant in the United Nations (UN) Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), working to ensure that the development of autonomous systems accords with ethical principles and the laws of armed conflict (LOAC), and complies with Article 36 weapons review requirements.

The Defence Media Release reinforced that “The ethics of AI and autonomous systems is an ongoing priority and Defence is committed to developing, communicating, applying and evolving ethical AI frameworks”. Trusted Autonomous Systems are a partner to Defence on that journey. More details at https://www.dst.defence.gov.au/publication/method-ethical-ai-defence

Book release – Lethal Autonomous Weapons: Re-Examining the Law and Ethics of Robotic Warfare

New Oxford University Press volume released: Lethal Autonomous Weapons: Re-Examining the Law and Ethics of Robotic Warfare 

The question of whether new rules or regulations are required to govern, restrict, or even prohibit the use of autonomous weapon systems has been the subject of debate for the better part of a decade. Despite the claims of advocacy groups, the way ahead remains unclear since the international community has yet to agree on a specific definition of Lethal Autonomous Weapon Systems and the great powers have largely refused to support an effective ban.  Lethal Autonomous Weapons focuses on exploring the moral and legal issues associated with the design, development and deployment of lethal autonomous weapons.  

The book features chapters by current and former TAS collaborators including CEO Prof. Jason Scholz, Chief Scientist Kate Devitt, Prof Rain Liivoja, Dr Tim McFarland, Dr Jai Galliott and Dr Bianca Baggiarini. 

Available in hard and soft copy; more details at the publisher’s site:

https://global.oup.com/academic/product/lethal-autonomous-weapons-9780197546048?cc=au&lang=en& 

Also available in soft copy on a number of platforms.  

TASDCRC Activity on Ethics and Law of Trusted Autonomous Systems

Human-machine teaming with Robotic, Autonomous Systems and Artificial Intelligence (RAS-AI) will be a key capability in the future of Australian Defence systems. RAS-AI may increase safety for personnel by removing them from high-threat environments; increase the fidelity and speed of human awareness and decision-making; and reduce the cost and risk to manned platforms. This RAS-AI investment must be informed by ethical and legal considerations and constraints.

Figure 1. Engineers Australia has recognised Athena AI, a funded TASDCRC project led by Cyborg Dynamics Engineering and Skyborne Technologies, as an engineering breakthrough: using computer vision, it can identify protected objects, people and symbols, such as hospitals, in near real time for military operations with very high probability.

In 2019, the Trusted Autonomous Systems Defence CRC (TASDCRC) commenced a six-year Programme on the Ethics and Law of Trusted Autonomous Systems valued at $9M. Over the past two years the activity has conducted workshops, engagements and consultation with participants and stakeholders of the Centre, contributing to ADF strategy, producing diverse publications and influencing the design of trusted autonomous systems such as the game-changing Athena AI ethical and legal decision support system—see Figure 1.

Trusted Autonomous Systems affiliate researchers and employees participate in a wide range of events considering the ethics and law of RAS-AI, including those convened by the ICRC, UNIDIR, SIPRI and NATO.

TASDCRC is a non-government participant in the United Nations (UN) Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), working to ensure that the development of autonomous systems accords with ethical principles and the laws of armed conflict (LOAC), and complies with Article 36 weapons review requirements.

Law and the Future of War, University of Queensland

The University of Queensland Law and the Future of War Research Group continues to lead research to develop and promote a better understanding of international law that governs the use of trusted autonomous systems (TAS) by the Australian Defence Organisation. It further aims to contribute to the development of law, policy and doctrine to ensure that Australia’s reliance on trusted autonomous systems satisfies both humanitarian imperatives and national security interests and is consistent with Australia’s commitment to upholding international law.

Ethics Uplift Program

From February 2021, the TASDCRC Ethics Uplift Program (EUP) will offer immediate and ongoing assistance to Centre participants through consultation, advice and policy development, supported by case analysis, education and enculturation.

The objectives of the program are to:

  • Raise the level of knowledge, skills and application of ethics;
  • Build enduring ethical capacity in Australian industry and universities to service Australian RAS-AI;
  • Educate in how to build ethical and legal autonomous systems;
  • Achieve ethical RAS-AI for TASDCRC Projects; and
  • Support and contribute to the development of national policy.

The program will provide Australian industry access to the best of Australian theoretical and pragmatic expertise via universities and consultancies grounded in Defence-suitable methodologies and frameworks. The continued investment by TASDCRC with Defence and other participants is intended to accelerate and foster a sustainable capability for ethical and legal sovereign RAS-AI in Australia.

To express interest in provision of services to the EUP, contact Chief Scientist Dr Kate Devitt via info@tasdcrc.com.au

The Trusted Autonomous Systems submission to the Australian AI Action Plan

Trusted Autonomous Systems Directors of Autonomy Accreditation, (L-R) Mark Brady (Land), Rachel Horne (Maritime) and Tom Putland (Air) 

 

The Australian Government recognises that artificial intelligence (AI) will have enormous social and economic benefits for all Australians. The Department of Industry, Science, Energy and Resources (DISER) is consulting on the development of an Artificial Intelligence Action Plan to help maximise the benefits of AI for all Australians and manage the potential challenges. To that end DISER has developed a discussion paper and invited submissions to the AI Action Plan.

AI is the key underpinning of autonomous systems. The Trusted Autonomous Systems Defence CRC (TAS-DCRC) is uniquely placed to provide a submission, noting our depth of experience and leadership in this field, and our multi-disciplinary team including scientists, engineers, ethicists and philosophers, lawyers, and academics. We have a broad focus on many facets of Artificial Intelligence (AI), and our work covers ethical, assurance, technical, and practical perspectives.

Through our common-good activities, specifically Activity 2: Assurance of Autonomy, we are actively working to give Australians a better understanding of the assurance and accreditation pathways for autonomous systems in the maritime, land and air domains, and are working with regulators and key stakeholders to improve those pathways.

Our goal is to improve the innovation pipeline, making it faster to design, build, test, assure and certify autonomous systems, while maintaining warranted trust and safety. This work will unlock the many safety, environmental and efficiency benefits autonomous systems can bring, boost Australian jobs, and cement Queensland’s status as the ‘Smart Drone State’ of Australia.

We aim to attract more international participants to Australia to use our country’s world-class large-scale test ranges, and to generate business for the numerous AI entrepreneurs in Queensland and Australia more broadly.

Download the TAS-DCRC submission to the AI Action Plan here

CEO Jason Scholz awarded 2020 McNeil Prize

Trusted Autonomous Systems CEO, Professor Jason Scholz was awarded the 2020 McNeil Prize today by Chief of Navy, Vice Admiral Michael Noonan, AO in a virtual ceremony.

In 2016, the Australian Naval Institute (ANI) created an award to honour an individual or individuals from Australian defence industry who have made an outstanding contribution to the capabilities and sustainment of the Royal Australian Navy (RAN). This award was named the McNeil Prize in honour of Rear Admiral Percival McNeil CB RAN (1883-1951).

The contributions of Prof. Scholz to RAN capability are articulated in the ANI Media Release and the ceremony underscored the importance of this field of research and contribution to future capability. Read the ANI Media Release here.

Congratulations Jason!

Introducing Tom Putland, Director of Autonomy Accreditation – Air

With the surging use of highly automated remotely piloted aircraft systems (RPAS) and the prospect of ubiquitous drone-based delivery from the likes of Wing, Matternet, Flirtey and others, the question of how to perform air traffic management for drones, preventing both unmanned-on-unmanned and unmanned-on-manned conflicts, is a complicated one.

There are clearly different societal expectations for the safety of two large wide-body aircraft, each carrying hundreds of fare-paying passengers, colliding with one another compared with two small unmanned aircraft colliding with one another. Society may be willing to invest significant resources to ensure two commercial public transport aircraft do not collide, but it would not be willing to expend the same resources to prevent two drones from colliding.

To complicate this further, there are likely to be orders of magnitude more drones than manned aircraft, operating in close proximity and undertaking a range of different operations that may require approval at a moment’s notice. Without the ability to rely on a human eye on board to perform see-and-avoid functions, this problem lends itself to an autonomous, system-of-systems solution.

As the demand for such an Unmanned Aircraft System Traffic Management (UTM) system increases, the highly intertwined technical, legal and societal issues associated with a UTM need to be solved. The regulation and governance related to design, manufacture, certification and the continued operational safety of these autonomous systems requires a collaborative approach from society, regulators, academia and the aviation industry to ensure that trusted, safe, equitable and efficient UTM systems are developed for all parties.

It is with great pleasure that the Centre can announce the appointment of Tom Putland as Director of Autonomy Accreditation – Air, effective Monday 2 November.

Tom has worked at the Civil Aviation Safety Authority (CASA) for the past seven years, five of which were spent in the realm of RPAS focusing on RPAS airworthiness and overarching safety and risk management policy for CASA. Tom has also played a crucial role in the assessment and approval of complex RPAS operations.

Tom has been an Australian representative at the Joint Authorities for Rulemaking on Unmanned Systems (JARUS) for the last three years and has actively contributed to the development of the JARUS Specific Operations Risk Assessment (SORA), a globally recognised risk assessment tool for RPAS operations.

In these times of rapid technology development with respect to RPAS, UTM and automation, Tom is ideally placed to bridge the gap between regulators, the industry, society and academia to create a harmonised body of knowledge to facilitate faster, more efficient and safer certification of autonomous aircraft in Australia and around the world.

Tom becomes our second Director of Autonomy Accreditation, joining Rachel Horne (Maritime), to develop a national body of knowledge including methods, policies and practices to support accreditation. The Directors address issues experienced by regulators, insurers and autonomous technology developers by producing consistent yet flexible parameters for safe and trusted operations, and by improving agility to meet fast-changing technical and social licence needs.

Autonomy Accreditation forms a significant part of the Centre’s Assurance of Autonomy Activity, which aims to create a trusted environment for test, risk analysis and regulatory certification support of autonomous systems, and to establish an independent, world-class assurance service for global industry, based in Queensland.