Take action
AI won’t get safer unless we act decisively to push for safety. Choose an activity below based on your interests and skills.
For everyone
Demand government action
- Write to your politicians: We’ve found emails are surprisingly effective and take relatively little effort. If you don’t feel confident about what to write, start with our email builder. When you get a meeting, check out our lobby tips.
- Call your politicians: Try calling legislators’ offices while having a set of talking points in view so you stay on topic.
- Protest: Join one of the protests or organize one yourself.
- Sign petitions: International AI Treaty, Ban Superintelligence, Demand responsible AI, or one of the national petitions: UK, AUS, NL.
Inform people around you
- Post about AI risk on your social media. One of these videos or this website is a good place to start. And don’t forget to tag us in your posts.
- Talk to people in your life about AI safety. Answer their questions, and encourage them to act too. Use our counterarguments to help you be more persuasive.
- Tabling and flyering are great ways to reach many people in a short amount of time.
- Attend local events: Many cities have free or low-cost events about AI and technology policy. Attending these events is a great way to network and share your concerns. If you want AI safety marketing materials, reach out to us on Discord so we can send you some.
Support PauseAI
- Join or create a local PauseAI community.
- Join the Discord, where most of the collaboration happens.
- Protest or participate in events. If no protest is near you, consider starting one.
- Look over our vacancies to see if any of your skills match our organizational needs. We’re often looking for people with experience in social media, communications, organizing, outreach, and software. Some positions are compensated.
- Sign up as a volunteer so we can match you with projects in your areas of interest.
- Donate to PauseAI or buy some merchandise in our store.
- Follow our social media channels to stay updated. Your local PauseAI chapter may also have dedicated social media pages.
For specific people
If you work in AI
- Don’t work towards better AI: Do not work for AI companies or on capabilities research, and do not spread ideas on how to make AI systems faster or smarter.
- Talk to your management and colleagues about the risks. Push them to take an institutional position that prioritizes risk mitigation over profit. Encourage the adoption of standard risk mitigation procedures and anonymous reporting channels.
- Hold a seminar on AI safety at your workplace. Check out these slides, talks, and videos for inspiration.
- Sign the Statement on AI Risk.
If you are a politician or work in government
- Prepare for the next AI safety summit. Form coalitions with other countries to share safety information and act quickly when harms arise. Work towards a global treaty.
- Invite (or subpoena) AI lab leaders to parliamentary/congressional hearings to share their predictions and timelines for AI disasters.
- Establish a committee to investigate the risks of AI. Publish the findings, if feasible.
- Make AI safety a priority in your party’s platform or your government’s policy, or at least make sure it’s on the agenda.
- Work with opposition politicians to demonstrate that AI safety affects us all, regardless of political beliefs.
If you have experience with (international) law
- Help draft policy. Start with our draft examples and some frameworks.
- Make submissions to government requests for comment on AI policy (example).
If you work as a journalist or have a social media following
- Create content about AI dangers or PauseAI. For more information, reach out to us through any of our communication channels.