What impact will the University of Sydney’s new AI policy have on law students?
Following the University of Sydney’s announcement permitting students to use AI in their academic assignments, its interim deputy dean, a recent law graduate and a current law student explore the potential implications of this shift for law students’ education and professional development.
Under its new “sector-leading” assignment policy, the University of Sydney will allow students to use generative AI tools throughout their coursework, marking a significant shift from its previous stance on academic integrity, which banned technologies like ChatGPT.
Beginning in Semester 1 of 2025, students will be permitted to use AI for assignments, with exceptions for exams, in-semester tests, and instances where teaching staff opt to exclude its use.
In Semester 2, the university will implement a “two-lane approach”, where take-home and open assignments will permit the use of AI and other available tools, while in-person assessments will prohibit using the technology.
Professor Joanne Wright, deputy vice-chancellor of education at the University of Sydney, emphasised that integrating modern AI technologies into education is crucial for equipping graduates with the skills necessary to thrive in today’s evolving workforce.
“This is a substantial change to our teaching, assessment and program design, and it is absolutely necessary to ensure our graduates are equipped with the tools they need for the modern workforce.
“Generative AI has already had a profound impact on workplaces, and our graduates are expected to demonstrate skilled use of the relevant tools in job interviews. The changes we’re making to teaching and assessments ensure we are preparing students for their careers without compromising their learning or the integrity of our world-class education,” Wright said.
Following this decision, Lawyers Weekly spoke with Professor Jason Harris, interim deputy dean and interim deputy head of school; Piadora Rahme, a law graduate from the University of Sydney; and Jeremy Short, a current law student and paralegal at the university. Together, they discussed the potential impact of the new policy on law students, offering a balanced perspective on its possible advantages and challenges.
The advantages
AI has already significantly impacted legal studies, with law students increasingly experimenting with and leveraging generative AI for various tasks, including processing vast amounts of case law and statutory data and efficiently summarising complex legal documents.
Harris highlighted the potential benefits of allowing the university’s law students to use AI, noting that it can serve as a “powerful” learning tool capable of enhancing learning across a range of areas.
“As in other disciplines, generative AI can be a powerful assistant for learning in law, including where it is used to explore new concepts, schedule and plan work, and refine search strategies,” Harris said.
Rahme explained that allowing law students to leverage AI goes beyond enhancing efficiency and saving time; it can also foster a more inclusive and accessible learning environment, creating greater opportunities to address diverse learning needs and offer personalised support.
“AI has the potential to allow legal studies to be more inclusive. For example, international students who learn the law in a second language that is not their native tongue may struggle to follow class discussions.
“AI can help translate difficult concepts and offer an interactive learning experience that is more specific and refined than a simple translation service,” Rahme said.
She also noted that the University of Sydney’s decision to allow students to use AI will be advantageous, as it will provide opportunities for them to hone critical thinking and problem-solving skills.
“AI also has the potential to refine the skill of giving instructions. In order to get a desired output that is specific to the user’s needs, the user will often need to refine their prompt through various iterations.
“This is advantageous to students who learn through practice to discern what inputs and outputs are useful for instruction, which indirectly can develop the student’s ability to discern what is salient or irrelevant,” Rahme said.
Short emphasised that integrating AI into legal education would significantly enhance students’ efficiency in conducting legal research – a task that is often time-consuming and detracts from focusing on other critical aspects of their studies.
“AI has already transformed the way that law students approach their study. This is evident through the increased efficiency when conducting legal research, which has enabled students to spend more time focusing on the application of legal concepts and principles,” Short said.
“It has also been beneficial through its role in simplifying legal concepts and ideas, which is reflective of what has begun to emerge in legal practice – where firms have become more receptive to its role in increasing efficiency and allowing lawyers to focus on more complex tasks.”
Potential pitfalls
Despite its promising potential, there are considerable concerns surrounding the decision to allow law students to use AI in their assignments and throughout their degrees.
Harris highlighted that one of the key drawbacks of allowing AI is that “inappropriate use of automated writing tools or generative AI” could “impede skill development”.
Rahme raised concerns about using AI in legal education, explaining that, because it relies on existing data and patterns, AI may struggle to capture the subtleties and nuances inherent in legal practice.
“The ‘answer’ presented by AI may also fail to accommodate the subtleties and nuances of a question that are specifically designed to test a student’s knowledge of the course curriculum.
“As many students would know, practice papers in law often contain several red herrings or facts analogous to cases studied throughout the course that are intentionally designed to be interrogated by students in their answers,” Rahme said.
“AI, because of the limitations of its training data, would likely fail to recognise these subtleties, risking crucial marks allocated in a set marking criteria.”
Will law students’ development be impacted?
Harris emphasised the importance of law students recognising AI as a tool to enhance specific work processes rather than as a substitute for fundamental skills and critical thinking.
“As we see in guidance from courts and professional bodies, academic publishers, government agencies and so forth, it is essential to approach AI as a tool that can assist some work processes but cannot be relied on to substitute for the exercise of personalised skills and disciplinary understanding,” Harris said.
“It is important that we continue to operate on the basis that our students want to learn and continue iterating our assessment design and teaching to support this.”
Rahme expressed concerns about the potential impact of AI on students’ critical thinking, highlighting that excessive reliance on AI could hinder their ability to engage critically with legal material and process information in depth.
“In the context of law students’ development, AI can seriously impact a student’s critical thinking process, which requires them to sit with the facts of a particular case or scenario, and systematically apply the law from first principles,” Rahme said.
“With time, overreliance on AI robs the student of the chance to develop such essential skills independently and organically, skills which can only be developed with practice. This has the effect of jeopardising a student’s professional competence.”
On the other hand, Short expressed optimism about AI’s potential in legal education at the University of Sydney, saying: “I am hopeful that AI will become a beneficial tool that law students use to assist (and not replace) the way they study and complete assignments.”