
PauseAI candlelit vigil @ UN HQ NYC, 3rd of June

  • Candlelit vigil to raise awareness about the existential risk of AI.
  • 3rd of June, 7:30 PM to 9 PM. The sun sets at 8:15 PM there.
  • United Nations Headquarters in New York City.
  • Sign up

Press Release

On Saturday, June 3rd, at sunset, a candlelit vigil will take place in front of the United Nations. The vigil is one of hope, so humans can come together to act in the face of a growing existential threat. Volunteers from the new PauseAI movement will gather there to urge governments to organize a summit to halt the development of this dangerous technology.

Half of AI researchers believe that there is a 10% or greater chance that the invention of superhuman AI will mean the end of humanity. Would you board an airplane if half of the aircraft engineers thought there was a 10% chance of it crashing?

Prominent examples of people warning about the dangers of AI include Prof. Geoffrey Hinton and Prof. Yoshua Bengio, both Turing Award winners and pioneers of the most successful AI methods today. Not only scientists but also leaders of AI companies themselves are concerned about this danger:

The advancements in the AI landscape have exceeded expectations. In 2020, it was estimated that an AI system would pass university entrance exams by 2050. This goal was achieved in March 2023 by OpenAI’s GPT-4. This AI has a verbal IQ of 155, speaks 23 languages, can program, and can deceive people. Fortunately, GPT-4 still has limitations. For example, it cannot effectively hack or write computer viruses, but it’s possible that these skills are only a few innovations away. Given the current pace of AI investment, this point is rapidly approaching.

These massive and unexpected leaps in capabilities have prompted many experts to request a pause in the development of AI through an open letter addressed to major AI companies. The letter has been signed over 27,000 times, mostly by AI researchers and tech luminaries. A pause is needed to work on AI legislation, to work on the AI alignment problem, and to give society time to adjust to this new technology. A recent survey in the United States shows significant support for a pause, with more than 60% of the public in favor. Unfortunately, it appears that companies are not willing to voluntarily jeopardize their competitive positions by stopping. These AI companies are locked in a race to the bottom, where safety increasingly takes a back seat to improving capabilities. Therefore, the pause must be imposed by governments. Implementing a pause at the national level is also challenging, as countries have reasons not to be the first to pause. Therefore, an international solution is needed: a summit. PauseAI is calling on our governments to organize that summit.

For more information, please visit PauseAI.info.