PauseAI in Australia
A message from PauseAI volunteers in Australia:
By 2030, artificial intelligence could be fully autonomous, self-improving, and smarter than humans at almost everything. This isn't science fiction: it's the assessment of leading AI companies and researchers. When this happens, every aspect of life will change forever.
Join our community | Email us | Connect on Facebook | YouTube channel | LinkedIn | Events
What risks are we facing?
Artificial Intelligence is advancing at an astonishing rate. Experts like Sam Altman, Dario Amodei, and Geoffrey Hinton warn that AI could surpass human intelligence within the next five years. Without international cooperation, this could result in economic chaos, war, and even human extinction.
“As general-purpose AI becomes more capable, evidence of additional risks is gradually emerging. These include risks such as large-scale labour market impacts, AI-enabled hacking or biological attacks, and society losing control over general-purpose AI.”
– International AI Safety Report (2025), co-authored by 96 experts from 30 countries, including Australia.
Don’t we want AI’s benefits?
Sure. Artificial Intelligence already has the potential to be a powerful tool. If AI remains under control, it could be used to cure diseases, drive scientific breakthroughs, and spread opportunity and wellbeing. But it would be tragic to achieve these advances only to then lose control and suffer catastrophic losses.
“We seem to be assuming AI will neatly fit into a benign pattern. That assumption only holds to the extent AI is analogous with most of what has come before. And in the circumstances, we’d be wise to examine it far more rigorously before settling on it because there are good reasons to suppose it is a different species altogether, for which history is a poor guide.”
– Waleed Aly
New technologies have always brought change, but humans need time to adjust, safeguard, and plan for the future. For any other technology—whether aeroplanes, skyscrapers, or new medications—we insist on expertly designed safety measures before exposing the public to risks. This is not happening with AI.
AI companies are in a race, fuelled by billions of dollars of investment, to be the first to build superhuman AI. When one company succeeds, your life and the lives of your loved ones will change radically, and you won't have any say in what that future holds. This isn't just a tech issue; it will affect everyone.
What can be done?
PauseAI proposes an international treaty to pause the development of smarter-than-human general AI until there is a credible plan to ensure it is safe. It is in Australia’s interest to advocate for this.
“Who will show leadership on negotiating an AI non-proliferation treaty? It is a collective responsibility and certainly one to which Australia could contribute.”
– Alan Finkel, Australia’s Chief Scientist (2016–2020)
History shows that smaller countries can make a big difference in solving global problems. Take the 1982 ban on whale hunting and the 1987 agreement to protect the ozone layer. Australia, which used to hunt whales itself, became a leader in protecting ocean life by supporting the ban and even taking Japan to court over its whaling. Australia also helped protect the environment by quickly joining the agreement to stop using chemicals that were damaging the ozone layer. These stories show that countries like Australia can make real change happen worldwide by taking action and working with other nations.
Aren’t there more important issues?
We agree that there are many important issues facing Australia, but we won’t be able to solve them in a world with uncontrolled AI. Australia should be advocating for an international treaty at the same time as it works on other issues.
Why isn’t anything being done already?
Australian politicians have looked at some of the smaller risks of AI, but not the big ones. As of the last federal election, the major parties had no clear plan.
We acknowledge that not everyone agrees about the risk of an AI catastrophe. We address some of the common objections here. We don’t claim to be 100% certain, but we think the probability of very bad outcomes is more than high enough to justify a pause.
It is psychologically difficult to think about potential catastrophes. Many people assume that the risks are out of their control and therefore not worth worrying about. Yet, anyone can take action right now by speaking up. We think it’s better to act than to simply worry.
How can I help in Australia?
You can make a difference. Volunteers in Australia raise awareness, protest, lobby, and support the global PauseAI movement.
- Join our community
- Attend our next Australian online or in-person event
- Contact Australian politicians (using this easy tool)
- Talk to your friends and family about AI risk
Campaigns
Petition to the House of Representatives
In September 2025, e-petition EN7777 to the Australian House of Representatives was open for 30 days and collected 168 signatures. The petition asked the House to legislate that all future frontier artificial intelligence systems must pass rigorous independent safety evaluations, and further asked the House to advocate proactively for an international treaty to pause frontier AI development until global safety mechanisms are in place. We are awaiting an official response from the relevant government minister.
Productivity Commission submission
In September 2025, PauseAI Australia responded to the Productivity Commission's interim report, Harnessing Data and Digital Technology, with this submission. Volunteers also made individual submissions (David, Peter, Michael).
Investigate OpenAI
In July 2025, volunteer Mark Brown brought OpenAI to the attention of the Australian Federal Police and the Attorney-General of Australia, alleging potential breaches of the Crimes (Biological Weapons) Act 1976. It was discussed in a news story and on a video podcast. We are still waiting for a response from the AFP and the Attorney-General.
Melbourne protest
In February 2025, volunteers in Melbourne protested the missed opportunity of the Paris AI Action Summit. The protest received coverage in the Nine newspapers.