
Take action

AI won’t get safer unless we act decisively to push for safety. Choose an activity below that matches your interests and skills, and help save the future!

For everyone

1. Encourage governments to act

2. Inform others

  • Post about AI risk on your social media. Tag @pauseai (Twitter) or @pause_ai (Instagram) in your posts.
  • Talk to people in your life about AI safety. Answer their questions, and encourage them to act too. Use our counterarguments to help you be more persuasive.

3. Join / support PauseAI

  • Join the Discord, where most of the collaboration happens.
  • Find or create a local PauseAI community.
  • Participate in a protest, either on your own or with your local group. Find upcoming protests on the events page.
  • Discuss how you can best contribute: fill out this form.
  • Donate to PauseAI or buy some merchandise in our store. The revenue goes directly toward PauseAI’s initiatives to reduce AI risks.
  • Follow our social media channels and stay updated. Your local PauseAI region may also have social media pages.

For specific people

If you work in AI

  • DON’T WORK TOWARD STRONGER OR FASTER AI! If you have a cool idea for making AI systems 10x faster, please don’t build it, spread it, or talk about it. We need to slow down AI development, not speed it up.
  • Talk to your management and colleagues about the risks. Encourage them to take an institutional position that prioritizes risk mitigation over profit, and push for standard risk mitigation procedures and anonymous reporting channels.
  • Hold a seminar on AI safety at your workplace. Check out these slides and these talks and videos for inspiration.
  • Sign the Statement on AI Risk.
  • Reach out by email or on Signal (username: pauseainyc.11) if you have anything critical to share.

If you are a politician or work in government

  • Prepare for the next AI safety summit. Form coalitions with other countries. Work towards a global treaty.
  • Invite (or subpoena) AI lab leaders to parliamentary/congressional hearings to give their predictions and timelines of AI disasters.
  • Establish a committee to investigate the risks of AI.
  • Work with opposition politicians to demonstrate that AI safety affects us all, regardless of political beliefs.

If you have experience with (international) law

If you work as a journalist or have a social media following

  • Create content about the risks of AI to society, or cover one of our events. For more information, reach out to us on the Discord, by Signal, or through any of our other communication channels.

If you have skills in managing / posting to social media, videography, photography, or journalism

  • Want to assist PauseAI with social media strategy? Reach out to terry@pauseai.info for more information.