Privacy Act reforms must uphold human rights, say advocates
Several prominent legal organisations have released their submissions in response to the Privacy Act Review Report, discussing the proposals relating to geolocation tracking data, facial recognition technology and other biometric data, and how those proposals can be strengthened to uphold human rights.
The Attorney-General’s Department’s Privacy Act Review Report (Report) contains 116 proposals for reforming the Privacy Act 1988 (Cth).
Key aspects of the Report include the collection and use of personal information, consent to the collection of information, the definition of personal information, time frames for data breach notification, the regulation of targeted advertising, and increased protections for children and vulnerable people, among other proposals.
Feedback was sought on the proposals, and several key legal bodies lodged detailed submissions.
Three human rights bodies — the Australian Human Rights Commission (AHRC), the Human Rights Law Centre (HRLC), and the Law Council of Australia (LCA) — focused on key proposals in the Report concerning the protection of personal information.
“The legal framework for the protection of personal information is critical to the broader right to privacy under these instruments,” the HRLC said in its submission. “The significance of this area of human rights law is growing in the digital age.”
The HRLC broadly supported amendments to the Privacy Act that give people greater choice and control over the collection and use of their personal information.
The HRLC noted several ways the proposals could be enhanced to ensure such protections.
“There is currently no right under Australian law to opt out of the collection, use or disclosure of personal information for the purpose of direct marketing or targeted advertising,” the HRLC submitted.
“Because many forms of collected information do not meet the existing definition of personal information, many instances of targeting are currently outside the scope of the Privacy Act.
“The proposed expansion of the definition of personal information would be an important limit on excessive data collection and profiling.”
This could be further strengthened by placing limitations on the collection of personal information (instead of permitting its collection and then providing options around its use), the HRLC highlighted.
It recommended that default settings require an “opt-in” rather than an “opt-out” for direct marketing and personalised advertising.
The AHRC commented at length on two key features of the Report’s proposals: geolocation tracking data and facial recognition technology (FRT).
The AHRC noted that it was essential for the Privacy Act to consider the sensitivity of location-tracking data.
“Location tracking data has the potential to be misused in a way that reveals other sensitive information about individuals to third parties without their knowledge or consent,” it maintained.
The AHRC recommended that geolocation tracking data should be redefined as sensitive data, rather than personal data, to make it subject to more stringent requirements for its use and disclosure.
The AHRC also considered the proposals relating to consumer consent to the collection of geolocation data.
It recommended that the definition of consent should be amended to state that it must be voluntary, informed, current, specific and unambiguous.
The Report highlighted concerns around FRT and three key risks associated with it:
- The contribution of FRT to the growth in surveillance;
- The use of data derived from FRT to engage in profiling; and
- The risk that errors connected to facial recognition disproportionately affect certain groups.
“In respect of the growing use of FRT-enabled surveillance, the commission found that this would lead to an inevitable reduction of personal privacy, and that the threat of closer scrutiny by police and government agencies can impede participation in lawful democratic processes.”
It also noted the heightened risk of profiling, and the implications for the rights to freedom of association and assembly, freedom of expression and opinion, and freedom from unlawful and arbitrary arrest.
The AHRC recommended that enhanced risk assessment requirements be adopted regarding the use of FRT and other biometric information.
It also recommended that a parliamentary inquiry be conducted into the risks of facial recognition technologies, with specific consideration of human rights risks.
The AHRC recommended that governments across Australia introduce legislation that specifically regulates the use of facial recognition and other biometric technologies.
It also suggested a provision requiring FRT developers and deployers to complete a facial recognition impact assessment of the potential harms, including potential human rights risks.
“Until the legislation recommended … comes into effect, Australia’s federal, state and territory governments should introduce a moratorium on the use of facial recognition and other biometric technology in decision making that has a legal, or similarly significant, effect for individuals, or where there is a high risk to human rights, such as in policing and law enforcement,” stated the AHRC.
The LCA echoed the recommendations made by the AHRC and the HRLC: “In addition to enhanced consent requirements, greater emphasis could be placed on organisational accountability in the collection and handling of sensitive information, including geolocation and other data.”
“The Law Council recognises that the collection of biometric information, including facial recognition, without prior notice and without consent represents a serious risk to the right to privacy.
“Consideration should therefore be given to tight regulation of the collection and protection of this information, including strict regulation around the use of this data by law enforcement agencies.”
The LCA continued: “In relation to the capture and use of facial recognition technology and biometric data more broadly, it is essential to adopt a technology-neutral, risk-based approach to regulation.
“It is crucial to be mindful of the potential downstream privacy risks caused by the amalgamation of data and interrelatedness of various technologies, particularly in the context of AI.”