
DON'T LET AI COMPANIES GAMBLE WITH OUR FUTURE

  • If you take the existential risk seriously, as I now do, it might be quite sensible to just stop developing these things any further.
Geoffrey Hinton

    Nobel Prize winner & "Godfather of AI"

  • The development of full artificial intelligence could spell the end of the human race.
Stephen Hawking

    Theoretical physicist and cosmologist

  • ... we should have to expect the machines to take control.
Alan Turing

    Inventor of the modern computer

If we pursue [our current approach], then we will eventually lose control over the machines.
Stuart Russell

Co-author of the standard AI textbook

  • Rogue AI may be dangerous for the whole of humanity. Banning powerful AI systems (say beyond the abilities of GPT-4) that are given autonomy and agency would be a good start.
Yoshua Bengio

Turing Award winner

of AI scientists believe the alignment problem is real & important

of citizens want AI to be slowed down by our governments

chance we'll reach AGI in 2025

We risk losing control

AI can have amazing benefits, but it could also erode our democracy, destabilize our economy, and be used to create powerful cyberweapons.

Read about the risks >

We risk human extinction

Many AI labs and experts agree: AI could end humanity.

How and why AI could kill us >

We need a pause

Stop the development of AI systems more powerful than GPT-4 until we know how to make them safe. This needs to happen on an international level, and it needs to happen soon.

Read the proposal >

WE NEED TO ACT RIGHT NOW

In 2020, experts thought we had more than 35 years until AGI. Recent breakthroughs show we might be almost there. Superintelligence could be one innovation away, so we should tread carefully.

How long do we have? >

YOU CAN HELP

Too few people are well-informed about the potential risks of AI. Inform others, and help stop this race to the bottom.

Take action >