
What role should lawyers play in advising on the creation of new tech?

With novel technology set to be developed for the 2032 Olympic Games, a special counsel specialising in ethics and a technology partner discussed the role lawyers must play in advising on its creation.

Jess Feyder | 09 March 2023 | Big Law

The Brisbane 2032 Olympic Games are taking shape, with new technology at the fore of how the Games will unfold. 

Dentons partner and head of the Brisbane office Craig Chapman attended one of the first entrepreneurial conferences preparing for the Games last week. He commented: “It was clear from the presentations and panel discussion that these Games will be different.” 

“New technology and its development and implementation were a significant part of the discussion.”

“The 2032 Olympics are expected to feature futuristic technologies ranging from personal drones and crowd emotion monitoring, to flying cars, even hologram broadcasts.”

“Many of the businesses that will deliver this technology are currently start-ups and will need advice across finance, liability, intellectual property, and many areas such as AI that are yet to be settled,” explained Mr Chapman. 

“These Games will also be more global than ever before with international live venues where live hologram events are broadcast being seriously considered.”

Shane Budden, special counsel for ethics at the Queensland Law Society Ethics and Practice Centre, spoke with Lawyers Weekly about why lawyers are integral to the formulation of such technologies, given the deeply considered approach their creation demands. 

“The constant problem with this sort of technology is that historically, developers have been far more concerned with what they could do than what they should do,” Mr Budden commented, “for example, with the launching of social media”.

“Tech developers were so busy making it possible for people to upload every aspect of their lives in real time that they never stopped to think of the downside. 

“Social media now facilitates bullying and creates mental health issues for teenagers, and developers are struggling to retrofit safety devices and protocols that should have been there in the first place,” said Mr Budden. 

“If we want to avoid the mistakes of the past, we need to explore all the ways the technology could be abused and build in fail-safes, and that doesn’t really appear to be happening.”

“Technologies such as facial recognition and emotion-reading AI are certain to be rolled out at the Olympics, with the understandable goal of preventing terrorist attacks and crowd violence, or predicting crowd surges that could injure people,” Mr Budden noted. “The question is, what else is being done with this technology?”

“Can debt recovery firms purchase access to it in an attempt to serve documents on people? Will legislation be amended to allow police to remove people who are read by the AI as being angry or volatile, but who have not done anything wrong?” he speculated. 

Mr Budden spoke about the role lawyers should play in forecasting the potential pitfalls and ethical dilemmas that might arise in the creation of new technologies. 

“A lawyer’s first duty is to the courts and the administration of justice, and that means we need to stand up for the rule of law and fundamental issues such as the presumption of innocence and the right to be afforded natural justice.

“That means it is our duty to identify potential pitfalls in the roll-out of new technology, and to alert the government to them,” he explained. 

“We also need to look at reform across the board on these issues,” Mr Budden continued. “Most of the government legislation for technology — like the Electronic Transactions (Queensland) Act 2001 — is 20 or more years old; that predates iPhones and Facebook, let alone cutting-edge technology like ChatGPT.

“Lawyers are best placed to provide advice on how new technologies might be properly regulated.”

Mr Budden discussed a key difficulty with new technology: the harm it poses is unknown until it materialises. 

“We really don’t know what technologies we will be dealing with in 2032,” he highlighted.

“Two years ago, few people outside the world of tech had ever heard about transformer algorithms or ChatGPT.

“Kids are now using that technology to cheat on assignments, and some law firms are no doubt using it to create draft advice and documents; what will we be doing with it in five years’ time, let alone 10?” he asked. 

“Unfortunately, once Pandora’s box is opened, it is too late, and we live with the consequences.

“One thing is certain — there is no stopping the march of this technology. 

“Google, Microsoft and Meta — just to name the big ones — are pouring money into open AI, and speed appears to be winning out over safety,” he stated. 

“It will be up to lawyers to highlight the flaws in regulation and the ways in which this technology can be abused, to inform the government how best to respond to these challenges. 

“It promises to be a busy time.”

For lawyers, the capacity to forecast the potential pitfalls of new technologies is embedded in their skill set, said Dentons intellectual property and technology partner Robyn Chatwood.

“It is something that we do every day as part of the legal skill set — we are anticipating risks and developing mitigation strategies for our clients,” she explained.

“Lawyers bridge legal, technology and commercial issues, and so lawyers are in a unique position to participate in or lead the debate about the interaction between legal and human rights issues arising from new forms of technologies,” explained Ms Chatwood. 

“Lawyers often act as a talisman in these sorts of discussions about the challenges and principles of new tech — whether it be relating to human rights impacts, cyber security vulnerabilities or algorithmic transparency, unfairness, discrimination, bias or other issues.”

“Lawyers will continue to play a critical role in bringing to light the areas of concern and guiding decisions about the risks and ethical considerations of all forms of tech.”
