1. Age assurance technology
In its announcements on men’s violence on Wednesday, the government said it would provide budget funding for an age assurance technology pilot “to protect children from harmful content, like pornography and other age-restricted online services”.
“The pilot will identify available age assurance products to protect children from online harm, and test their efficacy, including in relation to privacy and security,” it said.
The idea has been mooted for some time, but previously rebuffed by the government. In a March 2023 report, the eSafety commissioner suggested a “roadmap on age verification”, including measures to prevent harm from pornography to children. In an August 2023 response, the government did not take up the call, saying age verification technology was “immature, but developing” at the time.
The response said any such program must work reliably without being circumvented, be able to be implemented comprehensively, including where such content is hosted outside Australia, and balance privacy and security considerations without introducing further privacy risks.
“Age assurance technologies cannot yet meet all these requirements,” the government response read, noting that a decision to mandate age assurance “is not ready to be taken”.
Wednesday’s pilot study announcement commits to “identify available products” and test whether such a scheme can now be implemented.
A key sticking point would be how much personal data, such as identity documents, a person would have to supply – either to online platforms, third-party platforms or a centralised government database – to verify their age.
2. Misinformation code
The communications minister, Michelle Rowland, on Tuesday said the government would introduce its contentious bill on online misinformation “later this year”. The bill – released as a draft last year but withdrawn after a backlash over free speech – was the first remedy raised by the Albanese government after harmful misinformation spread online following the Bondi Junction and Wakeley stabbings.
Rowland said the government was having “constructive consultations with a number of parties” ahead of releasing the updated draft bill.
“I think more than ever, the events that have gone on in terms of the stabbings in Bondi but also in western Sydney have highlighted how important it is to hold the platforms to account for their systems and processes that should address the spread of harmful misinformation,” she said.
Soon after the Wakeley and Bondi Junction stabbings, Rowland told the Nine newspapers “doing nothing is not an option for any responsible government”.
3. News media bargaining code
The government is under pressure to designate Meta’s Facebook and Instagram, TikTok and X under the news media bargaining code, which would compel those social media companies to negotiate deals with mainstream media outlets for the benefit they derive from news content on their platforms.
The Greens’ communications spokesperson, Sarah Hanson-Young, says the government should designate the platforms. The assistant treasurer, Stephen Jones, is awaiting advice from Treasury and the Australian Competition and Consumer Commission (ACCC) about the effect of changes on news outlets and the social platforms.
Jones said social media companies had a “social responsibility”, including carrying news on their platforms. He added it would be “anti-democratic” if Facebook, for instance, repeated its 2021 behaviour of entirely removing news content during negotiations.
4. Regulate algorithms and recommender engines
Several government processes looking at online reform have raised concerns about how social media algorithms serve harmful or violent content to users. A meeting of the online harm ministers criticised “algorithmic recommender systems that push content from ‘influencers’ who perpetuate harmful gender stereotypes and condone violence against women”, while the terms of reference for the Online Safety Act review list recommendation engines among the “harms raised by a range of emerging technologies”, alongside artificial intelligence and end-to-end encryption. Recommendation engines are central to the personalisation of online content.
Guardian Australia understands the government is considering whether reforms to the Online Safety Act and the eSafety commissioner’s powers could compel social media platforms to block young people from seeing harmful or violent content.
5. Further limits on online abuse
The eSafety review also raised the prospect of amending regulations around cyberbullying of children, non-consensual sharing of intimate images, cyber abuse of adults, or changing rules around material that depicts abhorrent violent conduct.
After the Wakeley church stabbing, and the eSafety commissioner’s federal court action against X for hosting footage of the attack, the government is expected to closely consider whether those rules are fit for purpose.
6. A digital ‘coordination body’
Stepping into more novel territory, a 2023 report of the Senate economics committee examining the influence of global digital platforms received submissions proposing a kind of overarching regulator for the tech space.
The committee, chaired by the Liberal senator Andrew Bragg, recommended the federal government “establish a digital platforms coordination body”, raising concerns about “fragmentation” and overlap between current regulatory schemes, some with “competing priorities”.
The report noted Greens senator David Shoebridge’s complaint that there was “no lead agency” in many areas. Some submissions called for greater resourcing of existing regulators such as the ACCC or the Office of the Australian Information Commissioner. Others called for a new regulator with specific digital expertise, or a new parliamentary committee dedicated to online issues.
The government’s response to the committee’s view noted ongoing work to strengthen existing processes.
7. A ‘tech tax’
To further regulate the online platforms, Hanson-Young has also called on the government to “tax them properly”, as well as to set up new rules for owners.
“Comprehensive media reform is needed to ensure we have media regulation that is fit for purpose and covers both the tech giants and modern media corporations. A ‘fit and proper person test’ should be enforced for large media proprietors and social media giants,” she said recently.
There have been suggestions higher taxes could be used to directly fund public interest journalism.
The Asia-Pacific Development, Diplomacy & Defence Dialogue (AP4D) thinktank recently put forward a new digital platform tax, to fund news media to confront the “rising tide of misinformation and disinformation”.