Social Media Platforms Under Pressure to Curb Misinformation


Amid growing public concern, parliamentary scrutiny, and the rise of AI-generated disinformation, social media platforms operating in Australia are facing unprecedented pressure to tackle the spread of false and harmful content. With new legislation on the table and high-profile platform boycotts making headlines, the era of the digital free-for-all may be ending.


A Worsening Problem with Real-World Impacts

Misinformation about elections, vaccines, the Indigenous Voice referendum, and climate disasters has been flooding Australian timelines. A 2024 report from the eSafety Commissioner found that one in three Australians encountered “potentially harmful misinformation” on a weekly basis.

“We’ve seen conspiracies about bushfire causes, vaccine lies, and even deepfake political speeches circulating unchecked,” says Julie Inman Grant, the eSafety Commissioner. “This isn’t abstract — it affects trust, safety, and public health.”

"We are in an information war, and algorithms are the frontline generals." — Prof. Declan Hughes, Digital Ethics Researcher

The Role of Algorithms and Profit Models

Critics argue that misinformation thrives because it fuels engagement — and engagement drives ad revenue. The business model of major platforms like Facebook, X (formerly Twitter), YouTube, and TikTok is increasingly under fire.

“The algorithm doesn’t care what’s true,” says Dr. Ayesha Doyle from the University of Sydney. “It cares what keeps people scrolling, clicking, and reacting.”

A 2024 ACCC investigation into Meta and Google’s ad practices found that viral misinformation was 6 times more likely to be promoted than factual content due to engagement metrics.
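To make the critics’ argument concrete, the toy sketch below shows what a purely engagement-weighted ranker looks like in Python. The posts, signal weights, and scoring formula are invented for illustration only; they are not drawn from any platform’s actual recommendation system, the ACCC’s findings, or the ACMA bill. The structural point is simply that when a feed is ordered by shares, comments, and likes, accuracy never enters the calculation.

```python
# Illustrative sketch only: a toy engagement-weighted ranker.
# The posts, weights, and scoring formula are hypothetical and are not
# taken from any real platform, the ACCC report, or the ACMA bill.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    shares: int
    comments: int
    likes: int
    accurate: bool  # known to a fact-checker, invisible to the ranker

def engagement_score(post: Post) -> float:
    # Weighted sum of engagement signals; accuracy is not a factor.
    return 3.0 * post.shares + 2.0 * post.comments + 1.0 * post.likes

feed = [
    Post("Calm, sourced explainer on bushfire causes",
         shares=40, comments=25, likes=300, accurate=True),
    Post("Outrage-bait conspiracy about a bushfire 'arson wave'",
         shares=220, comments=180, likes=150, accurate=False),
]

# Rank purely by engagement, as a system optimising for time-on-platform
# might; the false but provocative post rises to the top.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):7.1f}  accurate={post.accurate}  {post.text}")
```

Running the sketch prints the conspiracy post first: with these hypothetical weights, the more provocative item simply accumulates a higher score, regardless of whether it is true.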

Legislative Response: The ACMA Bill

In response, the Australian government introduced a draft bill empowering the Australian Communications and Media Authority (ACMA) to compel tech platforms to disclose how they moderate misinformation.

Key elements include:

  • Mandatory quarterly transparency reports
  • Penalties of up to $7.8 million for repeated failures to address harmful content
  • Independent audits of algorithmic content delivery systems

While some civil liberties groups caution against overreach, public support for the bill remains high. A 2025 Essential poll showed 72% of Australians favour stronger regulation of social media misinformation.

Tech Platforms Push Back

Platforms are resisting what they call government overreach. In early 2025, Meta threatened to limit news sharing on Facebook and Instagram again if the legislation passed — echoing its 2021 standoff with the Morrison Government.

X (formerly Twitter) also warned it may reduce service availability in Australia if content moderation rules become too strict.

“Heavy-handed laws risk free speech and global access,” said a Meta Australia spokesperson. “We believe in a collaborative, not punitive, approach.”

Election Misinformation and Deepfakes

One of the sharpest flashpoints has been election-related misinformation. The lead-up to the 2025 federal election has already seen deepfake videos purporting to show senior politicians, as well as coordinated foreign influence campaigns.

The Australian Electoral Commission (AEC) has been working with tech firms to rapidly take down false content — but response times vary widely.

“Some platforms act within hours. Others drag it out for days — long enough to cause damage,” says AEC Digital Integrity Lead Samantha Leung.

Local and Grassroots Solutions

Beyond government action, local solutions are emerging:

  • RMIT FactLab and AAP FactCheck are partnering with community media for rapid myth-busting
  • Digital literacy programs in schools and TAFEs are growing
  • Aboriginal media organisations are tackling culturally targeted misinformation in remote communities

“Our mob needs tools to sort truth from garbage — especially when it’s disguised in slick videos,” says Marlee Bundjalung, a journalist with NITV.

Global Context: Australia Not Alone

Australia’s move follows trends seen worldwide. The European Union’s Digital Services Act (DSA) enforces strict rules for digital platforms, while the US is facing similar debates about regulation vs. censorship.

Experts say Australia could play a leading role in balancing digital rights and responsibilities in the Asia-Pacific.

“We have an opportunity to create a world-first model — if we get the balance right,” says Prof. Linda Rouse from the ANU Tech Policy Centre.

Conclusion: Accountability in the Age of Virality

The stakes couldn’t be higher. As generative AI tools lower the barrier for creating hyper-realistic fake content, and trust in institutions wanes, Australians are demanding clearer boundaries — and consequences — for online misinformation.

Whether through smarter regulation, community resilience, or platform accountability, one thing is clear: the hands-off era of digital platforms is ending, and the time for responsibility has arrived.