
Will AI be the ‘great equaliser’ for access to justice, or will it inhibit it?

Here, several firm leaders discuss how artificial intelligence may impact access to justice, outlining the significant benefits AI could bring as well as the potential dangers and downfalls.

Jess Feyder | 05 April 2023 | Big Law

Robyn Chatwood, intellectual property and technology partner at Dentons, spoke to Lawyers Weekly about the potential of AI programs like ChatGPT to aid access to justice.

“AI might allow more people to access legal advice if it is reliable and accessible,” Ms Chatwood outlined. 

“There is potential for AI programs like ChatGPT to facilitate access to justice by undertaking legal analysis — or even to have AI programs conduct and manage evidence cost-effectively or to draft legal documents.”

Ms Chatwood highlighted that this could aid access to justice since the cost of lawyers is sometimes the factor that prohibits people from accessing legal services. 

For such people, it would be a welcome development for bots to replace lawyers in areas where the complexity of legal issues or the need for bespoke lawyering is low, Ms Chatwood noted.

“AI could help, for instance, to draft simple wills — democratising a process that may only be afforded by some,” she outlined. 

Ms Chatwood added: “It is easy to foreshadow a world where judges could be aided by legal robots that provide the precedents for a decision.”

“This is, however, a different thing from using an AI judge to entirely resolve a dispute.”

Ms Chatwood outlined an area where AI aiding access to justice is already a reality: “AI use cases have included using programs to process speech into text — so enabling faster access to court transcripts — which is good for justice.”

Chris McLean, chief operating officer at Piper Alderman, also spoke to Lawyers Weekly on the topic. 

“AI has incredible potential to assist lawyers in providing legal advice and providing greater access to justice for everyone,” he said.  

“AI should give people access to better, more affordable legal advice, at least for simple matters.”

“For access to justice, AI will be a boon for pro bono legal centres in a number of ways.” 

“It will allow legal centres to take instructions from more clients, as large language models (like ChatGPT) allow people to ask questions in normal sentences and can serve as pretty effective front-end interfaces to take initial instructions.”

“It will allow clients to ask questions and give instructions in their native language.”

“It will provide a far more effective interface to already established legal help resources.”

David Fischl, partner at Hicksons Lawyers, commented: “There is significant opportunity for AI programs like ChatGPT to aid in the access to justice — but not in the way most people think.”

“The real opportunity is for law firms, government, and community service providers to provide client-facing applications powered by the technology underlying ChatGPT.”

“The client-facing applications will provide advice and guidance directly to the people who need it, in a place and at a pace they’re used to consuming it, like with social media platforms.” 

“The large language model technology behind ChatGPT is the technology we have been waiting for.”

“It will be the ‘great equaliser’,” Mr Fischl asserted.

“Lawyers now have the technology to, at scale, understand people’s issues and then provide customised, practical and easy to follow advice and guidance.”

“There has never been a better time to be a lawyer.”

Dangers and downfalls?

Ms Chatwood mused: “There is a potential downfall where AI could exacerbate social injustice or access to justice issues.”

“AI is a creature of instructions,” she highlighted. “It relies on data input — and so, if there is structural inequality in the data that is input, then the interplay of that data with the technology will not address that.”

“Predictive results reflect the biases of the input,” she noted.

Ms Chatwood continued: “These realities limit the potential for AI to solve the access to justice problem.”

“Machines making decisions may, in fact, crowd out the role of human expertise, which could be crucial to justice.”

Ms Chatwood gave an example: “Algorithms, such as sentencing algorithms, have been known to be biased against minority groups.”

“Inappropriate over-reliance on AI in law could jeopardise the part of justice that relies on discretion and human judgement.” 

Ms Chatwood noted that increased reliance on AI might lead to gaps in justice in unintended ways.

“Socioeconomic groups with greater wealth have greater potential to harness AI for legal work,” she explained. 

“But less advantaged groups may fare badly in the AI arms race and so be unable to truly deliver on the benefits of AI to provide access to legal information for those who can least afford it,” she said.

Ms Chatwood noted that it is important to ask questions about the nature of the AI tools being deployed: while they might improve efficiency and access to justice, the complex and confidential nature of many AI systems can mask how their decisions are made.

Mr McLean commented: “Right now, you can ask ChatGPT how to challenge a parking fine in NSW or how to draft a valid Australian will and you will receive a pretty good response, although you do need to consider the big dangers in this advice.”

“The big danger with AI is that it lies, and lies convincingly,” he said. AI has a tendency to produce content that is nonsensical or untruthful in relation to certain sources.

“This tendency can be particularly harmful as models become increasingly convincing and believable, leading to over-reliance on them by users,” he outlined. 

“The recent jumps in the ability of AI to understand requests and respond in natural language are a potentially huge disruptor for the legal industry.”

“AI has the potential to become a powerful tool to enable lawyers to provide advice more efficiently and more effectively, provided some of the challenges can be overcome.”

Mr Fischl reflected on the risks: “The biggest risk is that the technology is made available to those in need without proper design.”

“People will accidentally go down a path where they interpret material incorrectly and run into greater trouble.”

“Contrary to what you read, this technology will not, out of the box, assist those in our society to access justice.”

“It will only achieve our ambition to provide equal access to justice when it is designed by legal experts.”

“When the technology is implemented with a proper design, created by lawyers, we will overcome the barriers, access to justice for all will be greatly increased, and we will move into a new era of legal advice and representation, underpinned by the expanded use of AI.” 
