Why shouldn’t lawyers necessarily trust ChatGPT or similar technologies?
As with any new technology, there can be a certain amount of scepticism or hesitation about relying on it for critical tasks, such as providing legal advice to clients, writes Nick Mann.
As artificial intelligence (AI) language models, ChatGPT and its rivals can provide useful information and suggestions on a wide range of legal topics based on their training data. However, there are several reasons why lawyers should not necessarily trust ChatGPT or similar technologies without exercising critical thinking and professional judgement:
- Lack of context: ChatGPT cannot fully understand the context of a legal issue or case. It operates solely on the basis of the words it receives as input and the patterns it recognises in its training data. As a result, it may provide incomplete or inaccurate information that could lead lawyers to draw incorrect conclusions or make faulty decisions.
- Limited scope: While ChatGPT has been trained on vast amounts of data, it may not have been trained on specific cases or areas of law that are relevant to a particular lawyer’s practice. Consequently, its responses may not be comprehensive or tailored to a particular legal issue or question.
- Ethical considerations: Lawyers have ethical obligations to provide competent representation to their clients, and relying solely on an AI system could potentially violate those obligations. A lawyer must exercise independent professional judgement when providing legal advice or representation and cannot merely rely on a machine’s output.
- Liability: If a lawyer relies on ChatGPT or similar technologies for legal advice and the advice turns out to be incorrect or incomplete, the lawyer could face liability for professional negligence or misconduct. It is essential for lawyers to exercise their professional judgement and carefully consider the advice given by any AI technology.
- Cost: While ChatGPT is available for free to the public, other AI technologies or applications may come at a cost. Lawyers must consider the cost of such tools and whether they represent good value for their clients.
- Reliance on technology: Over-reliance on ChatGPT or similar technologies could lead to a loss of critical thinking skills and could result in the erosion of traditional legal skills, potentially leading to a reduction in the quality of legal advice provided.
Is it surprising that so few lawyers have expressed confidence in using ChatGPT?
It is not necessarily surprising that only a small percentage of lawyers trust ChatGPT or similar technologies. As with any new technology, there can be a certain amount of scepticism or hesitation about relying on it for critical tasks, such as providing legal advice to clients.
Further, the legal profession has traditionally been slow to adopt new technologies, in part due to concerns about accuracy, confidentiality, and ethical considerations. Lawyers are trained to exercise independent professional judgement and to carefully consider the specific circumstances of each case or issue, and there may be concerns that ChatGPT or other AI technologies cannot fully replicate that level of judgement.
In addition, some lawyers may not fully understand how ChatGPT or other AI technologies work or may be concerned about the potential risks associated with their use, such as inaccuracies, confidentiality breaches, or legal liability.
That being said, there are also lawyers who are enthusiastic about the potential of AI technologies to assist them in their work, such as in conducting legal research, drafting documents, or managing caseloads. As the technology continues to develop and become more widely accepted, it is likely that more lawyers will begin to trust and rely on AI technologies like ChatGPT, provided they are used appropriately and with safeguards in place to ensure accuracy, confidentiality, and ethical compliance.
Do you use chatbots in your practice?
Polaris has used chatbots in the provision of legal services for the last five years and was the first law firm to sign on to and build a bot with Josef. Josef and ChatGPT are two different types of tools with different purposes and functions, so it’s not necessarily accurate to say that one is better than the other.
ChatGPT has a wide scope and is trained on vast amounts of data of varying quality, so detecting and correcting its errors requires significant expertise. And, at present, ChatGPT draws on information only up to 2021, meaning that if a recent court decision has altered the legal landscape, advice or information from ChatGPT could be badly out of date.
The use of chatbots has augmented and enhanced the work of our lawyers at Polaris, and anyone predicting the end of the legal profession ignores the irreplaceable value of empathy and human connection for a client in need of legal services.
Nick Mann is the founder and principal at Polaris Lawyers.