24 March 2026

Fossil fuel dependence and climate disinformation are now Australia’s biggest threats. Power must be wrestled back from big tech, say former defence leaders

Download the report

A former Chief of the Australian Defence Force, Admiral Chris Barrie Retd, has called for legislation to regulate, and if necessary break up, the digital-tech monopolies that are now enabling an information war threatening Australia’s future and undermining its ability to respond to escalating climate disruption.

Monopolies and duopolies can exercise dangerous market power and political power, including in technology, in the media and social media, and in communications. Anti-trust laws were first enacted in the USA in 1890 and ultimately resulted in companies such as Standard Oil and AT&T being broken up.

Adm Barrie, a member of the Australian Security Leaders Climate Group, was launching a new report, “The climate disinformation war: How to fight back for Australia’s democracy and security”.

This comes as the Federal Parliament’s Select Committee on Information Integrity on Climate Change and Energy is due to release its final report this week, reflecting increasing attention on the role of disinformation in shaping Australia’s climate and energy policy.

“There has been a failure to understand how energy dependence on fossil fuels will cause both economic disruption and more perilous physical conditions for Australians”, says Admiral Barrie. “Now the two issues are colliding”. 

“We are facing an unprecedented energy crisis made much worse by the world’s failure to face its fossil fuel addiction. Layered on top is a climate disinformation war, globally and in Australia, that is actively undermining the capacity to build a renewable, clean-energy future and curb coal and gas exports.”

There is great concern amongst Australians about the issue. A new report by the National Security College on how Australians think about security issues says that: 

“Of 15 issues we nominated in July 2025, people rated a range of non-military issues as the most serious threats for Australia over the next ten years. These were AI-enabled attacks (77% ‘major’/’moderate’ threat), severe economic crisis (75%), critical supply disruption (74%), disinformation (73%), and foreign interference in Australia’s politics, government, economy or society (72%)” (emphasis added). 

The disinformation report’s author, intelligence analyst Anastasia Kapetas, says the disinformation war poses a direct threat to democracy: 

“In the digital age, power comes from dominance in the information space, and our world is increasingly shaped by propaganda and disinformation rather than facts. We are losing this battle and AI will make it much worse.”

Big tech platforms facilitate the spread of false information, aided by inadequate content moderation and the prioritisation of engagement over accuracy. Their algorithms often amplify misleading content, making it more visible to users and exacerbating the problem. Automated moderation tools can misinterpret context, removing critical information while failing to adequately address harmful content.

Kapetas says that: 

“This is no longer just a communications issue. It is a national security threat with consequences for Australia’s sovereignty, economic resilience, disaster preparedness, institutional trust and strategic autonomy. We are already seeing a drift toward authoritarian politics linked to climate denial.”

“Fighting this disinformation war will require political courage and decisive policy action,” says Adm Barrie. “If these threats are not checked, accelerating climate change will crash society as we know it. This is not speculation—it reflects the warnings of the world’s leading climate scientists.”

The report outlines a set of actions to counter the disinformation war, including: 
  • Comprehensive anti-trust architecture such as the current European Union Digital Markets Act, to prevent tech platforms that amplify disinformation from becoming too powerful to regulate.
  • Digital platform, social media and AI regulation, making companies take responsibility for online disinformation and other harms, organised around principles of liability, user control, transparency and systemic risk.
  • Urgent, enforceable regulation of generative AI, as chatbots and image generators rapidly scale disinformation.

Ms Kapetas says many of these measures already have strong public support, as concern grows about the unchecked power of big tech, artificial intelligence and disinformation.