Collaborating to build a better domestic and international regulatory environment for innovative autonomous vessels

By Rachel Horne, Assurance of Autonomy Activity Lead, Trusted Autonomous Systems (TAS)

Domestic and international engagement is key for understanding different regulatory, environmental, and operational contexts; providing lessons in how we might approach some of our own challenges; facilitating future collaboration; and accelerating the operationalisation of autonomous vessels nationally and internationally.

In June 2022, TAS Assurance of Autonomy Activity Lead Rachel Horne presented at the Autonomous Ship Technology Symposium 2022 in Amsterdam and visited several UK organisations pioneering autonomous vessel development, assurance, and use. The Symposium brought together industry, defence, and government stakeholders to discuss topics including progressing towards Maritime Autonomous Surface Ships (MASS) adoption; legal, liability, and regulatory frameworks; vision technologies and collision avoidance; novel concepts and case studies; inland and port operations; and solving the challenges of unmanned vessels. The Symposium demonstrated that industry is driving rapid technological development, but the regulatory environment and the ability to integrate new and evolving technology into commercial use are not advancing at the same pace.

 

Rachel Horne presenting at the Autonomous Ship Expo, Amsterdam, June 2022. Image provided by Rachel Horne and Eva Szewczyk.

___________________________________

Industry is driving rapid technological development, but the regulatory environment and the ability to integrate new and evolving technology into commercial use are not advancing at the same pace.
___________________________________

 

A panel discussion on “Legal, liability, and regulatory frameworks” highlighted this point, raising fundamental questions such as how COLREGS applies to autonomous vessels and whether the convention needs to be amended, how liability will work, and how cybersecurity should be integrated into regulatory frameworks. Combining the experience and expertise of a broad range of parties will help to answer these questions in time, highlighting the need for ongoing domestic and international collaboration. By continuing to talk to each other, organisations lower the risk of reinventing the wheel on some issues while leaving others in the “too hard” basket. How this collaboration should happen, and how to make it as productive as possible, is something to puzzle out together.

TAS visited leading UK institutions that are pioneering autonomous vessel development, assurance, and use. These included the Assuring Autonomy International Programme (AAIP) and the new Institute for Safe Autonomy at the University of York, the National Physical Laboratory and Smart Sound Plymouth, the University of Southampton, and CEbotiX at the National Oceanography Centre. These initiatives show that research is well advanced: frameworks and resources are being developed, investment is being made in infrastructure and facilities that foster collaborative research and development, and different types of autonomous vessels and supporting technology are in use. Australia does not yet have the same level of facilities and infrastructure, or the same depth of experts and vessels located in close proximity, to foster collaboration. Continued discussion and collaboration are vital for the future design, assurance, regulation, and use of autonomous vessels.

There are similarities with the ambitions of Australian organisations, many of which are expanding their autonomous vessel-related offerings alongside their traditional focus areas. For example, the Australian Institute of Marine Science is pioneering the use of bespoke autonomous vessels for reef monitoring while also building a tropical marine test range offering. AMC Search is another good example, with increasing service offerings focussed on autonomous vessel operator and technical training while building a cold-water test range offering. The Defence and Maritime Innovation and Design Precinct being established at the University of Tasmania also shares the ambition of fostering increased science and technology capabilities by facilitating collaboration between Defence Science & Technology (DST) and academics from across the country.

In the coming months, TAS will release a series of case studies that highlight the use and regulation of autonomous vessels in Australia and compare Australia’s experience with that of other countries such as the UK. Preparing these comparative studies, and engaging with a range of UK-based stakeholders, highlighted the difficult regulatory environment and lack of pathways for autonomous vessels in the UK, together with the barriers they create. While frustration also exists in Australia, we do have plausible pathways, and regulatory initiatives are underway to smooth those pathways and enable operators to get their vessels in the water. For example, TAS’s publication of the Australian Code of Practice and COLREGS Operator Guidance Framework stamps Australia as a world leader in progressing the regulation of autonomous vessels.

 

Cover pages of the Australian Code of Practice and COLREGS Operator Guidance Framework

 

Continued collaboration is the best way to ensure continued progress in both domestic and international regulatory frameworks. By pooling the experience, expertise, and ideas of multi-disciplinary groups of stakeholders, we give ourselves the best chance to mould our current regulatory frameworks to better facilitate autonomy and to build new, improved frameworks. Continuing to build the literature exploring both the specific and the more general legal issues associated with autonomy is also crucial and will support needed policy, legislative, and convention change. Organisations such as the University of Queensland’s Law and the Future of War team, part of TAS’s Ethics and Law Activity, are leading research in this area and making it accessible via initiatives such as the Law and the Future of War podcast.

There are opportunities for the Australian autonomous systems ecosystem to advocate for the change it needs through both domestic and international processes. For example, domestically, the Independent Review of Domestic Commercial Vessel Safety Legislation and Costs and Charging Arrangements welcomed submissions. Both TAS and the Australian Association for Uncrewed Systems (AAUS) provided submissions to that Review focussed on changes needed to better support autonomy. The Independent Review Panel is due to provide its report to government in November 2022. At an international level, the April meeting of the International Maritime Organization Maritime Safety Committee (MSC 105) highlighted that the MSC is working to develop a goal-based instrument regulating the operation of Maritime Autonomous Surface Ships (MASS). This instrument is intended to be adopted as a non-mandatory MASS Code by 2024, with a mandatory version to be in force in 2028. AMSA represents Australia at the MSC and will need to continue to engage with the Australian autonomous systems ecosystem to ensure relevant interests are represented. You can contact AMSA through AMSAConnect (AMSAConnect@amsa.gov.au) for more information.

TAS recently facilitated a Regulatory Workshop as part of a broader Defence exercise, which brought together a broad range of Defence, Government, industry, and academic participants. Reflecting on the broad-ranging and insightful discussion during the event, I put forward the following key points, which I think will resonate with most people working in this space:

  • We must collaborate and communicate.
  • For regulation to be an enabler, not an inhibitor, we need change – at a legislative, regulatory, policy, and cultural level.
  • We do not need perfect; we need to iterate as we go.
  • Trust is critical. We need transparency, explainability and certainty to build that trust.
  • The future is now. We need to accelerate, together, to succeed.

TAS will continue to work to improve the regulatory approach to autonomous vessels and welcomes ongoing collaboration with our domestic and international stakeholders.

For those looking to connect with industry conferences and events on autonomous vessels, several remain in 2022.

These events offer a unique opportunity to find out what domestic and international counterparts are focussing on, to build relationships, and to set up future collaborations to ensure a thriving and trusted autonomous systems industry in the maritime domain.

 

If you would like to contact us to offer feedback, suggestions, or request more information on our projects, please email us at info@tasdcrc.com.au.

 

High altitude technologies taking off in Queensland!

On Friday 22 July, in a large exhibition hall in Brisbane, an innovative Queensland company, DanField Stratoship, undertook a demonstration inflation of a stratospheric airship.

The demonstration was organised by Trusted Autonomous Systems, a Defence Cooperative Research Centre, in conjunction with the Royal Australian Air Force (RAAF) Air Warfare Centre (AWC), as part of the group’s involvement in the High-Altitude Pseudo Satellite (HAPS) Challenge.

The HAPS Challenge is exploring high altitude technologies, including balloons, that provide a range of lower-cost mechanisms to deploy payloads to areas of interest. Functions can include pseudo-satellite and persistent surveillance roles where reliable station-keeping and path-prediction functions are established. These technologies can provide lower-cost, rapidly deployable capabilities for communications and surveillance tasks, including bushfire early warning. The HAPS Challenge is managed by the Sir Lawrence Wackett Defence & Aerospace Centre (RMIT), SmartSat CRC, Trusted Autonomous Systems, and the RAAF Air Warfare Centre, supported by the Bureau of Meteorology.

The inflation demonstrates the technical status of Stream 1 of the Challenge (Automated or Autonomous HAPS Platform Station Keeping & Constellation Maintenance) under the Power and Control in the Stratosphere (PACITS) project, which is currently in Phase 3, Prototype Development & Demonstration. PACITS is led by Danfield Stratoship and supported by a team, The Stratoship Group, which includes its sibling company Skysite, SuperSky Engineering, SmartSat Services, and the Australian National University.

Ultimately, the conclusion of the Challenge will see a practical demonstration of the technology deployed to the stratosphere, with regulatory approval processes already underway.

Stratoship Group members at the remote pilot station.

TAS awards Defence contributors to autonomy projects at ADSTAR 2022

Outstanding contributions to industry-led innovation and trust frameworks by Defence personnel have been recognised by Trusted Autonomous Systems (TAS) with six recipients receiving a 2022 Autonomy Accelerator Award.

The six recipients were individuals and teams who were nominated by TAS industry participants and TAS Project and Activity leads.

The Autonomy Accelerator Award recognises the significant contribution made by the recipient to the advancement of autonomous systems in Australia. Recipients have gone above and beyond to ensure delivery, displayed integrity, and been strongly committed to achieving shared goals.

Award recipients delivered on the TAS commitment to game-changing capability impact through industry projects and policy progress, which will ease pathways to deployment of trusted autonomous systems.

The recipients of the 2022 Autonomy Accelerator Awards are listed below.

Robert Bolia

Program Leader LVC, Air & Space Program, DSTG

Robert serves as Program Leader LVC at DSTG, the latest in a series of leadership positions he has held there. Before migrating to Australia, he spent his career working for the US DoD, culminating in eight years at the Office of Naval Research Global, based in Japan and Chile. Before that, he led the Asia-Pacific section at AFRL HQ, where he was charged with building and executing AFRL’s international engagement strategy for the Asia-Pacific region and with developing collaborative research programs between AFRL and defence laboratories in the region. Prior to that, he was a human factors scientist in the AFRL Human Effectiveness Directorate.

Robert was nominated for his science leadership and vision for building industry-led trusted autonomous systems, as well as his commitment to building ethical frameworks suitable for Defence contexts of use. Robert has been a champion of TAS since its inception in 2018, working collaboratively across organisations and stakeholder groups to increase sovereign capability for Australia. Robert’s commitment is evidenced in his co-authorship of the ‘A Method for Ethical AI in Defence’ report by DSTG, RAAF and TAS, which has led to national and international policy impact, including with NATO and the TTCP AI Strategic Challenge.

Nominated for his science leadership and vision for building industry-led trusted autonomous systems, Robert Bolia co-authored the ‘A Method for Ethical AI in Defence’ report by DSTG, RAAF and TAS.

Jared Freundt, George Katselis, Kris Allpress, Samuel Weckert, and Ian Lochert

Advanced Warhead Technologies Group, Weapons and Combat Systems Division, DSTG

Jared Freundt has over 20 years’ Defence Science and Technology experience focussed on characterising the performance of weapon systems in support of Australian Defence Force requirements. Jared is a subject matter expert in high-speed photo instrumentation capability and has developed novel imaging techniques and analysis methods. In addition, he has developed a wide range of skills including finite element analysis, explosive ordnance handling and safety, and trial management. Due to his significant experience, he is currently working with the DSTG Trials Authority to ensure DSTG is well positioned to deliver more complex trials into the future and to ensure they are managed consistently, safely, and efficiently.

Jared Freundt and his team (George Katselis, Benjamin Hall, Kris Allpress, Samuel Weckert, Ian Lochert, and others) in WSD were nominated for their exemplary collaboration with Skyborne Technologies on the Gannet Glide Drone program.

They produced a number of experimental warheads, which were tested successfully both in a standalone and integrated capacity. The collaboration resulted in an impressive demonstration at Port Wakefield in South Australia. Their efforts enabled a very successful outcome on all fronts.


Jared Freundt and his team were nominated for their exemplary collaboration with Skyborne Technologies.

Michael Gan

Royal Australian Air Force; Artificial Intelligence Lead, Jericho Disruptive Innovation

Wing Commander Michael Gan commenced in his role as Deputy-Director Artificial Intelligence at Air Force’s Jericho Disruptive Innovation in September 2018. His previous experience in air mobility operations (particularly in humanitarian aid and disaster relief), as well as in critical thinking and ethics in professional military education, has strongly influenced his approach to the development of Artificial Intelligence in the Royal Australian Air Force. His main focus has been on developing and exploring the AI foundations of education, ethics, and assurance, while exploring Defence applications for computer vision and imagery analysis, natural language processing, and AI/data analytic decision support tools.

Michael Gan was nominated as a champion of industry-led innovation and for his sustained commitment to building trusted sovereign capability in robotics, autonomous systems, and artificial intelligence. Michael ensures alignment of industry-led projects with Defence values through his ethical, legal, and assurance of autonomy frameworks and tools, exemplified in his co-authorship of the ‘A Method for Ethical AI in Defence’ DSTG Technical Report by DSTG, RAAF and TAS, which has led to national and international policy impact, including with NATO and the TTCP AI Strategic Challenge.

A champion of industry-led innovation and of building trusted sovereign capability in robotics, autonomous systems, and artificial intelligence, Michael Gan was a co-author of the ‘A Method for Ethical AI in Defence’ DSTG Technical Report by DSTG, RAAF and TAS.

Robert Morris

Aerospace Division, DSTG

SQNLDR Morris has extensive RAAF aircrew experience, primarily as an Airborne Electronics Officer and sensor employment specialist on P-3 Orions, amassing around 7,500 hours. He also has UAS expertise from his involvement in the 2006 DSTG NW Shelf UAS trial. In 2010, he was selected as the Commanding Officer for the Heron UAS deployment to Afghanistan (logging 400 hours). He has a wealth of experience in LOAC, Joint Fires, and Targeting from his time at the Air and Space Operations Centre and HQJOC. SQNLDR Morris has been an Air Liaison Officer at DSTG since 2019, and with the Human Factors group in Aerospace Division since 2021.

The nomination recognised Robert (Bob) Morris for his excellent work in leading the human factors assessments of Athena AI. From the time Bob came onboard, he worked directly with the engineering team to identify improvements to the capability, the training program, and how evaluations would be conducted. Bob brought operational skills from decades of UAS experience in both theatre and T&E. He also supported liaison with the RAAF and the preparation of initial datasets for evaluation and integration, and worked directly with the engineering team to address roadblocks as they came up.

Upon conclusion of the project, Bob Morris had conducted training and evaluation of Australia’s first AI-enabled sensor-to-effector capability with two military units, 2SECFOR and 20STA, both of which were able to use the software proficiently within one day of training.

Robert Morris contributed operational skills from decades of UAS experience in both theatre and T&E to the Athena AI project.

Karl Sammut

Professor, College of Science and Engineering, Flinders University

Professor Karl Sammut is a Co-Director of the Centre for Defence Engineering at Flinders University. Karl received his PhD from Nottingham University in 1992 before taking up positions at the Politecnico di Milano, followed by Loughborough University and then Flinders University. Between 2019 and 2022, he held a part-time position as a Senior Principal Scientist with DSTG. Karl has over twenty years’ experience in maritime autonomy research for the development of uncrewed surface and underwater vehicles and has worked in collaboration with Trusted Autonomous Systems, DSTG, Thales, Boeing Australia, Lockheed Martin Australia, Naval Group, and Fincantieri.

Karl was a key founding member of the team and, in concert with Thales and USYD, helped put together the original proposal for the ‘MCM in a Day’ project. Karl and the Flinders team have been instrumental in driving the project forward from a systems, software, and hardware development perspective.

Karl is a driven and highly dependable member of the ‘MCM in a Day’ project. He has consistently supported and motivated the development of key software and hardware resources, despite a number of challenges. In doing so, he has enabled the progression of many key integration activities. He has regularly contributed solutions to numerous technical and administrative challenges throughout the duration of the project and has offered important insights and ideas that have notably furthered the project’s development. Karl’s efforts have demonstrably progressed project development, both within his immediate team at Flinders University and the project team as a whole.

Karl and the Flinders team’s attendance at PAC was a great example of the commitment to project success that Karl fosters in the Flinders team: they drove the Crawler and associated kit across from SA for the event in Sydney to ensure the teams could maximise their time working on the equipment. Karl’s attitude and commitment influence the whole TAS ‘MCM in a Day’ project team in a hugely positive way.

Karl Sammut is a driven and highly dependable member of the ‘MCM in a Day’ project. He has influenced the whole team in a hugely positive way.

Rafał Sienicki, Research Specialist

Information Sciences, Defence Science and Technology Group

Rafał (Ralph) joined DSTG in 2005. He has provided research, systems engineering and project management contributions to Defence projects in the areas of electronic warfare, radar and communication systems, distributed systems, and modelling, simulation, and experimentation. He is currently pursuing a PhD with the University of Sydney on deep learning approaches for spatial-temporal characterisation and prediction of the electromagnetic environment.

Ralph Sienicki has provided invaluable insights on the electromagnetic operating environment, its characterisation and effects, identified relevant research problems, and applied creative solutions. Ralph has corralled DSTG effort toward Distributed aUtonomous Spectrum Management (DUST) milestones and goals, written high-quality technical investigation reports, contributed constructively to DUST project reports, and, as the DSTG partner point of contact, provided an exemplar of PM partner reports.

Ralph has facilitated productive exchanges and negotiations between DUST partners to deliver project outcomes. Most recently, Ralph has commenced a PhD aligned to the DUST project under the supervision of the DUST UoS partner.

Ralph Sienicki has provided invaluable insights on the electromagnetic operating environment, its characterisation and effects, identified relevant research problems, and applied creative solutions.

 

About Trusted Autonomous Systems

Trusted Autonomous Systems (TAS) is Australia’s first Defence Cooperative Research Centre uniquely equipped to deliver research into world-leading autonomous and robotic technologies to enable trusted and effective cooperation between humans and machines. Funded by the Commonwealth via the Next Generation Technology Fund (NGTF) and the Queensland State Government, TAS aims to improve the competitiveness, productivity, and sustainability of Australian industry through industry-led projects with real translation opportunities to move technology rapidly from universities into industry and ultimately into leading edge capability for the Australian Defence Force. Projects are supported by ‘common-good’ activities in ethics, law and assurance of autonomy accelerating the operationalisation of capabilities. TAS is developing the capacity of Australia’s defence industry to acquire, deploy and sustain the most advanced autonomous and robotic technologies.

Who is liable when an autonomous military drone causes unintended harms?

By Dr Brendan Walker-Munro, Law and Future of War Research Group, The University of Queensland

Suppose that the ADF procures an autonomous airborne drone (‘drone’) for the delivery of goods while deployed during humanitarian operations overseas. The drone is unarmed but carries sensors capable of determining if it is under attack and can take evasive action. During one such deployment, the drone is delivering medical supplies on a humanitarian assistance mission after a cyclone when a megabat collides with it, damaging its GPS. With its location confused, the drone crashes into a home, harming several people.

Who is responsible for the harms to property and people in this instance? What should the ADF know about domestic and international law ahead of procuring drones like these from manufacturers? What questions should they ask and what kinds of precautions should they take before deploying assets operationally?

In the Australian context, there are two sources of law that impose obligations on the manufacturers of military equipment, including this drone flown in international territory. The first is the common law, made up of judgments and decisions by Australian and international courts. The second is statutory law, the Acts passed by the Australian Parliament and enforced by the courts.

For manufacturers of autonomous military systems (AMS), the provisions of statutory law (including the Australian Consumer Law or ACL) will have the most impact on their operations. Whilst the common law will still apply, it has a much narrower application because the common law:

  • Creates a duty of care generally limited to the end users of the equipment (usually the ADF and/or its soldiers, sailors, airmen and officers);
  • Is limited in its application to third parties outside of the ADF, such as civilians; and
  • Is subject to wider defences based on a test of reasonableness, such as showing the manufacturer took “reasonable” steps to protect persons from harm, and/or that the harm that resulted was not “reasonably foreseeable”.

Otherwise, it is important to recognise that under either law:

  • Operations against the enemy by the ADF attract combatant immunity or combatant privilege, and so are not actionable; and
  • Australian courts may still decline jurisdiction in cases where it would be ‘clearly inappropriate’ to do so.

The ACL was originally passed in 2010 to create a binding framework to protect consumer safety and punish unconscionable seller behaviours. To sue under the ACL, a plaintiff need only prove the existence of a “safety defect”, meaning that the goods’ ‘safety is not such as persons generally are entitled to expect’, and that the defect caused harm, loss, or injury.

The ACL will apply to military contractors, as:

  • Use of the word “goods” in the ACL explicitly includes references to ships, aircraft, vehicles, components, software and subassemblies; and
  • The ACL applies to the conduct of any corporation (whether incorporated domestically or internationally)

Companies that fail to prevent safety defects in their AMS could face awards of damages, compensation orders, or orders to remediate or redress any harm or loss suffered by the plaintiff.

In the case of the crashed drone, although it landed overseas, Australia is an appropriate forum to hear any claims, and the action was not taken in the course of “actual operations against the enemy”.

The drone was provided in trade or commerce and requisite proof of harm — either consisting of death or personal injury, or destruction/damage of a residence — is also uncontroversial.

A manufacturer will be able to resist liability if they can demonstrate that the defect was not present at the time of delivery to the military, or alternately that the defect was only detectable by some process or technology that was not available at the time of manufacture.

In this case, the megabat collided with the drone. It might be argued that the GPS unit was not sufficiently robust to withstand a direct physical impact (see the arguments regarding the fragility of the angle-of-attack (AOA) sensor feeding data to the MCAS system on the 737 MAX). The manufacturer should have anticipated collisions with objects, including flying animals, and ensured that safety-critical systems such as GPS were resilient to such events.

The ‘safety defect’ will have different ramifications depending on whether it was, for example, the result of a carelessly wired sensor (a manufacturing defect) or a poorly chosen sensor (a design defect).

Manufacturers of drones will continue to attract liability where a drone is produced negligently or contains safety defects, irrespective of whether the manufacturer is ordinarily registered in Australia or whether the products are purely military in nature.

ADF personnel in charge of procuring or deploying drones need to be aware of the legal frameworks that govern their use and potential liabilities when incidents occur.

 

This blog condenses the content in Walker-Munro, B (2022), ‘Exploring manufacturer strict liability as regulation for autonomous military systems. Who is liable when an autonomous military drone causes unintended harms?’, Torts Law Journal, Vol 27, Part 3 (available on subscription via LexisNexis).

New videos released in ‘A Method for Ethical AI in Defence’ series

Interviewees from the TAS CDLE video series

In a newly released series of videos on ethical AI, Trusted Autonomous Systems explores responsibility, governance, trust, law, and traceability for robotics, autonomous systems, and artificial intelligence.

These videos were produced by Trusted Autonomous Systems for the Centre for Defence Leadership & Ethics (CDLE) at the Australian Defence College.

The videos feature Chief Defence Scientist Professor Tanya Monro, ADF personnel, and representatives from Defence industries.

They explore topics of responsibility, governance, trust, law, and traceability using a hypothetical science fiction scenario, Striking Blind, written by Australian Defence College Perry Group students in 2021. In the Striking Blind story, an Australian autonomous platform is deployed in a future operation with a fictional AI called ‘Mandela’.

The videos explore ethical and legal factors associated with this scenario. They highlight the need for maintaining robust oversight so that the ADF can benefit from AI and autonomous systems while addressing their complex challenges.

Designed for the full Defence learning continuum, the videos are based on the framework and pragmatic tools described in the Defence Science technical report A Method for Ethical AI in Defence (2021).

Screenshots from the animations produced in the video series

Using the framework, the videos cover:

While the method does not represent the views of the Australian Government, it provides an evidence-based collaborative framework relevant to Australian Defence contexts of use as well as ethical and legal considerations aligned with international best practice.

These videos provide both an overview and an in-depth exploration of A Method for Ethical AI in Defence and can be used within professional military education and by external stakeholders of Defence, including academia.

How to use the videos
  1. Use animations as thought prompts within presentations and workshops on robotics, autonomous systems, and artificial intelligence amongst Defence stakeholders.
  2. Use longer videos in a ‘flipped classroom’ model of learning for professional education and training.
  3. Use videos in multi-stakeholder meetings to establish a shared framework within which to identify ethical and legal risks for robotics, autonomous systems, and artificial intelligence projects for Defence.
  4. Use pragmatic tools videos to establish processes for the identification and management of ethical and legal risks on RAS-AI projects.
How to cite the videos

Producer Tara Roberson (Trusted Autonomous Systems)

Creative Director Kate Devitt (Trusted Autonomous Systems)

Publisher Trusted Autonomous Systems

Production Company Explanimate

Sponsor Centre for Defence Leadership & Ethics, Australian Defence College

With thanks to all interviewees who appeared in the videos: Stephen Bornstein, Damian Copeland, Kate Devitt, Michael Gan, Chris Hall, Sean Hamilton, Lachlan Jones, Rebecca Marlow, Tanya Monro AC, Mick Ryan AM, Lauren Sanders, Jason Scholz & Dominic Tracey.

Cite as: Roberson, T. & Devitt, S.K. (2022). Ethics of Robotics, Autonomous Systems and Artificial Intelligence Videos for Defence. [14 Videos] Trusted Autonomous Systems. https://tasdcrc.com.au/ethical-ai-defence-videos/

The transcribed video series is available on the TAS website.