Introducing Mark Brady, Director of Autonomy Accreditation – Land

Society is increasingly reliant on robotics, autonomous systems and artificial intelligence (RAS-AI). Along with changes to communication, mobility, and technology, RAS-AI in the land domain will bring many changes to the physical landscape and architecture of cities. Despite the physical limitations of current transport infrastructure, such as carriageway width and lane markings, there will be capacity to vastly increase vehicular throughput, with longer, narrower and faster vehicles travelling over a given area of roadway without changing the size of the road. Vehicles may not even need windows except for viewing scenery, and traffic jams may be a historical memory in the fully integrated smart cities of the future. Private ownership of vehicles may eventually give way to mobility as a service, with houses no longer needing to waste space on a ‘bedroom for the car’, and autonomous vehicle fleet service companies becoming the new industrial giants of the 21st Century. Trust will be a crucial factor in the development and sustainability of these systems.

In human-machine interaction there is always a point at which a human decision-maker can be called to account for their action or inaction. This point is not always as clear-cut for RAS-AI, and the ability of society to examine a decision after the fact is particularly relevant in situations where harm has occurred. Autonomous vehicles have the potential to make life and death decisions in place of human beings. Unlike traditional robotic systems, which were largely static, fixed to the ground and limited to a given operational design domain, autonomous land vehicles may be highly mobile, heavy, and capable of inflicting harm throughout their operation or deployment. As the potential for harm rises, so does the need to assure that they operate, and fail, in predictable ways, so that humans can make allowances for such behaviour during their operation.

Accordingly, the ability to accredit the operational domain or domains of autonomous land vehicles is necessary to foster and maintain trust in these systems. Establishing trust in RAS-AI requires these systems to be predictable, explainable, and ultimately, accreditable. Predictability is therefore the first step towards building trust: the ability to understand the ultimate outcome of an open-ended RAS-AI decision-making process is vital. However, prediction may not always be possible, and in such cases explainability allows society to understand why the RAS-AI followed a particular behavioural pathway. It will also be necessary to accredit the operational capabilities of RAS-AI to foster and maintain trustworthiness; RAS-AI might be accredited for a specific operational domain, for a level of safe operation, or against a combination of other factors. As the body of knowledge surrounding trust in autonomous systems is only beginning to develop, there is now a significant need to clarify the parameters of trust in RAS-AI.

It is with this in mind that the TASDCRC now introduces our third Director of Autonomy Accreditation, Mark Brady. Mark is an expert in the regulation of autonomous land vehicles, with a focus on establishing a roadmap for the assurance and accreditation of autonomous land-based technology. Mark’s research into the regulation of disruptive technology used autonomous land vehicles as a case study, examining the regulatory impact these technologies have on the law. Mark brings a wealth of experience as a researcher and academic at the University of Adelaide and as a solicitor working in Queensland. These skills will help Mark foster cooperation between researchers, regulators and stakeholders to encourage confidence and investment in the development of automated land vehicle technology in Queensland and throughout Australia, as the nation looks to become a world leader in many areas of autonomous technology.

Mark joins Rachel Horne (Maritime) and Tom Putland (Air) to develop a national body of knowledge, including methods, policies, and practices, to support accreditation. The Directors address issues experienced by regulators, insurers, and autonomous technology developers by producing consistent yet flexible parameters for safe and trusted operations, and by improving agility to meet fast-changing technical and social licence needs. Autonomy Accreditation forms a significant part of the Centre’s Assurance of Autonomy Activity, which aims to create a trusted environment for testing, risk analysis and regulatory certification support of autonomous systems, and to establish an independent, world-class assurance service for global industry, based in Queensland.

L-R: Mark, Rachel & Tom