
PauseAI protest @ Melbourne - June 16th

Join #PauseAI for an upcoming peaceful protest at the Melbourne Convention and Exhibition Centre (MCEC), where Sam Altman will be giving a talk.

  • Date & Time: Friday, June 16, 2 pm AEST
  • Venue: Main entrance of MCEC, 1 Convention Centre Place, South Wharf, VIC 3006, Australia
  • Protest Times: 1:30 pm to 3 pm (arrival time) & 4:30 pm onwards (departure time)
  • Logistics: Bring signs and flyers. No fee is required to participate, and a Startup Victoria membership ticket is currently free.

Join us to raise your voice for AI safety and make a difference. Please join #PauseAI’s Discord server, the #australia channel, and AGI Moratorium’s Slack, #λ-australia, for more discussion.

Press Release

On Friday, June 16th, volunteers from the new PauseAI movement will gather at the Melbourne Convention and Exhibition Centre to urge the Australian government to take the lead on pausing the development of more powerful and dangerous AI systems.

A rapidly increasing number of AI experts signed a statement last week that reads:

“Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”

The statement has been signed by virtually all major AI labs (OpenAI, Google DeepMind, Anthropic) and hundreds of AI scientists, including Geoffrey Hinton, the “Godfather of AI”.

AI safety researchers have not reached a consensus on how large the risk of human extinction is. Results from the “Existential risk from AI survey” show that estimates range from 2% to 98%, with an average of 30%.

The protesters are urging the Australian government to take the lead on global AI safety and pause the development of more dangerous AI systems. They are also asking the government to make the pause a priority at the AI Safety Summit, which is being organised by the UK and will be held later in 2023.

Pausing AI development is a radically different approach to safety from what AI lab CEOs like Sam Altman are proposing. OpenAI believes that “it would be unintuitively risky and difficult to stop the creation of superintelligence”, so they are pursuing further development toward superintelligence.

“We have a choice: do we risk everything to build a superintelligence that the public was never consulted on, or do we stop while we still can?” - PauseAI protesters

“AI companies are putting everything at risk; we’re already seeing the damage, and it will get far worse. Technology development is not inevitable, and pausing should be considered a feasible option. We can’t cede the future to a few CEOs who acknowledge they are willing to risk humanity for their dreams. We all deserve a say on our future, and a global pause gives us that chance.”

“Despite acknowledging the dangers of continued AI development, these companies are merely using it as an excuse to carry on, and seem to refuse to voluntarily give up this dangerous power. In such situations, global collaboration in reining in this dangerous development is key so that we make sure technology development works for all.”

“We may not have the luxury of time. AI developments are happening at a frantic pace, and we need to act now to prevent the worst-case scenarios. The summit in autumn could even be too late to prevent the worst. We need governments to pause AI development right now.”

The PauseAI protesters have concrete agenda suggestions and policy proposals for the summit.

For more information, please visit PauseAI.info.

Contact