What tech companies must do to keep children safe
Australia is at a critical moment in deciding what level of protection and accountability we expect from the technology industry — and the world is watching, writes Steve Baird.
That’s because whatever action we take, or fail to take, here won’t just affect Australians; it will also affect children around the world.
The Philippines is a global hotspot for the online sexual exploitation of children (OSEC). While it is also home to a ground-breaking global law enforcement collaboration that combats this heinous transnational crime, it is imperative that Australia and other countries adopt laws that enable offenders to be detected, prosecuted, and appropriately sentenced in their home countries.
More than 60 per cent of the OSEC cases that Philippine law enforcement has worked on arose from referrals by authorities in other countries. Since February 2019, 35 per cent of these foreign referrals have come from Australia.
The live sex shows at the centre of these cases are often arranged in forums and on social media platforms, and paid for using online payment platforms and intermediary banks, all of which must bear some responsibility for the children being harmed.
As criminals harness technological innovations for malicious ends, regulators are left struggling to keep up. That’s why regulators and companies overseas are watching closely as Australia decides how much further it is prepared to go on tech regulation and responsibility.
In many ways, Australia has set the standard for online safety. We have a world-leading anti-child exploitation unit, Task Force Argos, which was profiled in the 2021 documentary The Children in the Pictures. We appointed the world’s first eSafety commissioner, who recently exercised her powers to require five major tech companies to report on the measures they are taking to tackle the proliferation of child sexual exploitation material on their platforms and services.
Tech firms have taken some positive steps to mitigate the risks of the crimes they inadvertently facilitate. But further moves towards accountability are not guaranteed.
The Commonwealth Online Safety Act 2021 requires industry associations to develop codes of practice to protect Australians from harmful content and behaviour, including OSEC. The draft codes developed by the industry can and should be stronger and more proactive. For example:
- The draft codes would have tech companies detect known child sexual abuse material (CSAM) on social media, as they have been doing for several years, but this would not apply to direct-messaging platforms;
- The draft codes would not require tech companies to implement measures to detect live-streamed child sexual abuse;
- There is little in the draft codes that would require platforms with end-to-end encryption to detect or block CSAM and live-streamed child sexual abuse; and
- The measures in the draft codes focus on online safety for Australian end users, and do not extend protection to the many children around the world who are harmed by Australians on Australian platforms.
To close these gaps, we at International Justice Mission Australia recommend:
- That providers of online platforms and services should be required to use technological tools to detect not only known CSAM but also first-generation CSAM and live-streamed CSAM;
- That providers of encrypted electronic services should be required to use technological tools and behavioural indicators to detect CSAM before it enters the encrypted space; and
- That the digital industry should use its policies, tools, and rules to tangibly support the privacy and security of victims and survivors to create a safer online environment for all.
But the scale of the crime is, without doubt, much greater than what is currently detected, and it is time for tech companies to play a proactive part in protecting vulnerable children from offenders who use their platforms. There is an opportunity here for Australia to take the lead.
Steve Baird is the chief executive of International Justice Mission Australia.