James Allsop: AI ‘must not be left to the private sphere’, should be embraced by courts
While the implementation of legal tech is important for law and the administration of justice, former Federal Court chief justice James Allsop has warned against society becoming “mechanised and sterile” in the face of artificial intelligence – and emphasised the importance of the spirit of the law remaining “human”.
Last week (21 November), the Honourable James Allsop AC delivered the Sir Francis Burt Oration 2023, titled The Legal System and the Administration of Justice in a Time of Technological Change: Machines Becoming Humans, or Humans Becoming Machines?
“The COVID pandemic put great strain upon the administration of justice. The physical places of courts and courtrooms were closed and inaccessible. This was a deep impediment to the hearing of cases which required physical presence of the actors in one place – in particular, criminal jury cases. But technology (albeit well-known technology available for decades), properly organised and utilised, allowed courts to continue to hear almost all other cases, with the participants dispersed and located sometimes in their homes and offices,” he said.
“The ability to put in place and maintain this dispersed system of justice gave many insights: that a court could, systemically, continue to operate using digital technology for communication; that access to justice could be given to people in remote locations; that disadvantage of those without resources could be overcome by the court itself providing communication facilities in its premises; and many others.
“The insights were always born of the lived experience of the use of the technology and of the thoughtful contemplation of that use and experience: the central importance of the state taking all litigants seriously in the way the technology was made available; the necessity for acceptance by the users – the litigants – that this was the proper and respectful manifestation of just state power; that the judge and counsel were who they and the circumstances represented them to be – the essential actors in a real performance of the execution of protective state power by a human, but detached (abstracted in that sense), delegate of the state.”
As a result of these lived experiences, physical presence is not “necessarily universally demanded” for the administration of justice post-pandemic, according to Mr Allsop, who added that digital technology and artificial intelligence (AI) have also driven “transformative changes”.
“Digital and AI technologies and capacities are, of course, not matters that are limited to the law; they will change the whole of society, human and social relationships, and perhaps even our consciousness and sense of humanity. None of these things is definable or predictable. We need to be careful of definition and prescription, and of (over) confident prediction in this field,” he continued.
“It is helpful first to consider the vastness of digital information in its electronic form that is publicly available and that can be reached by computers. Even in a restricted field such as the legal system, the available digitally recorded information is vast: written in different languages, but nevertheless written.
“Take our domestic legal environment. The now vast availability of legal precedent, the size of statutes and regulations, and the digital multiplication of communication and records, almost demand assistance with optical character recognition and analysis by AI. We have, through digital technology, already created a mass of information, which is sometimes incapable of human analysis (even reading) in a short period of time.”
The multitude of possibilities comes with “excitement and hope” for the legal system, Mr Allsop mused, as well as a “potential radical change” in tech-driven legal products and solutions, tools that he said “must not be left to the private sphere”.
“If public justice is to remain relevant, it must incorporate and give confidence and validity to these tools to improve access to justice. These kinds of technologies permit or facilitate the growth of alternative legal service providers, helping parties create documents and examine and answer questions about large volumes of data and information,” he added.
“The speed, growing accuracy and lowering cost of large data analysis will have an effect on all aspects of legal practice: a redefinition of what is skilled human legal work, what legal work can be commoditised by automation, how legal work can be re-organised leading to different roles. The first and most obvious of such new roles is the coming together of computer engineering and legal skills into the same people in their education and in their practice.”
These new developments “undoubtedly” pose challenges for the courts, which must engage with legal tech and AI tools to help facilitate justice for litigants.
“In a world of vast bodies of data, sometimes expressed in language of impenetrable prolixity, ascertaining the barest outline of the framework of one’s rights, let alone an assessment of their likely character, can be, and often is, time-consuming and prohibitively expensive. If the courts, in partnership with respected AI providers, could provide information and analytical apps to litigants, this would open up to large numbers of people a low-cost way of ascertaining, at least to some measure of satisfaction, what their rights are and how they might deal with them,” Mr Allsop continued.
“This may and perhaps will involve a broadening of the role of the courts from decision-maker to facilitator of access to justice. It may involve more resources. It may involve a reconfiguration or reconsideration of how to engage with the public and the profession. However, if it is not done, the role will be played by external actors and the courts will lose a measure of control over, and relevance for, a growing sector of the administration of justice.”
Lastly, Mr Allsop emphasised the importance of law and justice remaining “human” in nature as the legal profession continues to embrace emerging tech.
“In many ways, we have allowed our society to become mechanised and sterile with a drained sense of spirituality. There is no reason why science should necessarily have done this. To mistake the machine for the master will lead to the end of the spirit of the law as human and free. The danger is not the machine becoming human; it is the human becoming the machine,” he concluded.
“How we embrace and usefully use machine-learned capacities to reduce the cost of justice and increase the penetration of distilled appreciation into almost impenetrable thickets of available data that we have created should be guided by an appreciation of what the law is and what the law is not.”
Lauren Croft