
How should Australia regulate tech platforms?

One legal body discusses how Australia should regulate to combat the harms posed by digital platforms, and the special counsel of an ethics centre discusses the need for a targeted approach.

Jess Feyder 20 March 2023 Big Law

Australia is lagging behind in legislating on the harms created by big tech, as disinformation spreads like wildfire in our democracy, the Human Rights Law Centre (HRLC) said in a statement.

Millions of Australians rely on social media networks like Facebook, Twitter, Instagram and TikTok for news, both national and international; yet these platforms have come under scrutiny for their role in the spread of disinformation and hate speech.

“While Australia has been an early mover on reform for online safety and digital media, it lags on key aspects of regulating digital platforms,” the HRLC said in a statement. 

The HRLC recently made a submission to the parliamentary inquiry into the influence of international digital platforms and has called for three key recommendations:

  1. For the federal government to move away from self-regulatory and co-regulatory models for digital platforms, and ensure new regulations are written by legislators or regulators;
  2. For the federal government to introduce comprehensive digital regulatory frameworks for Australia, focused on transparency and risks arising from platforms’ systems and processes, including:
    • for major platforms to undertake risk assessments that identify human rights risks and other forms of harm, with obligations to develop and implement mitigation measures; 
    • a data access regime to support civil society research and enhance transparency; 
    • measures to give users greater control over the collection and use of their personal data, including by making recommender systems opt-in; 
    • broad information-gathering and enforcement powers for the establishment of an independent and well-resourced regulator.

  3. For Parliament to consider mechanisms for enhancing parliamentary scrutiny of digital regulation, such as the establishment of a committee for digital affairs, and for improving the coordination of tech policy across government.

Scott Cosgriff, senior lawyer at the HRLC, commented: “Technology should serve communities, not put people at risk.

“Australia’s regulatory framework for digital platforms should be centred around the protection of human rights and community interests.

“Big tech digital platforms are allowing disinformation to spread like wildfire in our democracy.

“Instead of the lax, voluntary and ineffective self-regulation measures currently in place, we need laws to make digital platforms more transparent and accountable.”

“As long as business as usual continues for big tech, Australian people and our human rights will be under threat,” stated Mr Cosgriff. “But disinformation online is a democratic problem with a democratic solution.

“We should have greater control over our own data, and we need greater transparency around why certain content floods our feeds.

“We need comprehensive laws and checks in place to limit the amplification of disinformation that causes harm and undermines trust in institutions.” 

Shane Budden, special counsel with the Queensland Law Society Ethics Centre, spoke to Lawyers Weekly about key considerations needed when creating legislation to regulate platform intermediaries. 

Mr Budden said that while there was a need to review the legislation regulating social media and the digital space in general, caution was warranted.

“While it is important to take action to prevent the use of social media and digital platforms for spreading hate speech and disinformation, it is vital that basic rights — such as freedom of speech — are also protected,” he stated. 

Mr Budden noted that these platforms, like any media, can be used for good, and that regulating them is a complex issue.

“Various platforms have been used to vilify certain groups and spread dangerous conspiracy theories, but they have also been used by people fighting against oppressive regimes — for example, the protests in Iran that are often organised via social media,” he explained. 

“In addition, these platforms are often the only way people suffering abuse can get the word out to the rest of the world.”

It is essential, he said, for governments to consult broadly and meaningfully with stakeholders and the community, and to ensure that any regulations are proportionate and calibrated to the targeted behaviour rather than applied with a broad brush.

“We need to be careful in this area, as what offends one person might seem like common sense to another, and censorship is rarely a good option,” Mr Budden highlighted. “This is particularly the case when dealing in the world of conspiracy and online extremism — those who subscribe to these views often think the government is trying to silence them, and heavy-handed regulation might well reinforce their views.”

He noted: “At the end of the day, the most effective way to combat hate speech, disinformation and similar things is to prevent it from happening. That means engaging with the perpetrators and helping them to see reason.

“The best answer to a bad argument is a good argument.”
