
New deepfake porn laws to ‘address a major legal gap’ in Australia

The Albanese government has revealed plans to bring new legislation to Parliament this week, aimed at prohibiting the distribution and creation of deepfake pornography.

Grace Robbie 05 June 2024 Big Law

Possession or distribution of digitally altered deepfake pornographic materials could result in individuals facing severe criminal penalties under upcoming national legislation, set to be introduced in the Federal Parliament this week.

Under the proposed legislation, individuals who distribute such material will be subject to a six-year prison sentence, while those who both create and distribute it will face a more severe penalty of seven years’ imprisonment.

A notable surge in the production of AI-generated pornographic deepfake images has prompted the federal government to implement legislation prohibiting the sharing of sexually explicit, non-consensual material.


Attorney-General Mark Dreyfus is scheduled to introduce this new legislation on Wednesday (5 June) in Parliament.

Dreyfus emphasised the importance of criminalising the distribution and sharing of such material, stating that “digitally created and altered sexually explicit material is a deeply distressing form of abuse against women and girls and can cause long-lasting harm”.

“These reforms will make clear that those who seek to abuse or degrade women through doxxing, deep fakes, or by abusing their privacy online, will be subject to serious criminal penalties,” Dreyfus said.

Prime Minister Anthony Albanese also said that harmful content glorifying violence against Australian women would not be tolerated.

“There should be zero tolerance for harmful content that glorifies violence against Australian women. Young adults should not be coached in disrespect or misogyny by online influencers,” he said.

As reported by the ABC, new offences will apply exclusively to sexual material involving adults, while child abuse material will continue to be addressed under distinct, dedicated offences.

Dr Asher Flynn, chief investigator and deputy lead at the Australian Research Council Centre and associate professor of criminology at Monash University, expressed support for the Albanese government’s legislation, which she said will “address a major legal gap” in Australia.

“The Australian government’s decision to introduce laws criminalising the non-consensual production or creation of a sexualised deepfake of an adult will address a major legal gap in Australia, in which the creation of sexualised deepfakes was not illegal anywhere in Australia, except in the state of Victoria. The new laws will see Australia once again leading the way in legal responses to online sexual harm,” Flynn said.

“The laws may also go some way towards curbing the accessibility of sexualised deepfake technologies. If it is illegal to create or produce non-consensual deepfake imagery, then it would likely reduce the capacity for the technologies, like free nudity apps, to be advertised.”

She also stressed that the government should implement additional measures alongside the new legislation to ensure it is effectively enforced and adhered to.

“It is important that these proposed laws are introduced alongside other responses which incorporate regulatory and corporate responsibility, education and prevention campaigns, training for those tasked with investigating and responding to sexualised deepfake abuse, and technical solutions that seek to disrupt and prevent the abuse,” Flynn said.

To underscore the importance of criminalising the production of explicit non-consensual material in Australia, Flynn cited a study she recently conducted.

“Through a survey conducted in 2019, we found that, across Australia, the United Kingdom and New Zealand, 14.1 per cent of respondents aged between 16 and 84 had experienced someone creating, distributing or threatening to distribute a digitally altered image representing them in a sexualised way,” she said.

“People with disabilities, Indigenous Australians and LGBTIQA+ respondents, as well as younger people between 16 and 29, were among the most victimised.”

Flynn also stressed the pressing need to hold accountable the developers and originators of the technology that facilitates the creation of deepfake content.

“Responsibility should also be placed onto technology developers, digital platforms and websites [that] host and/or create the tools to develop deepfake content to ensure safety by design and to put people above profit,” Flynn said.