
DON'T LET AI COMPANIES GAMBLE WITH OUR FUTURE

  • Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.
    Statement on AI Risk

    Signed by hundreds of experts, including leaders of the top AI labs and leading scientists

  • If you take the existential risk seriously, as I now do, it might be quite sensible to just stop developing these things any further.
    Geoffrey Hinton

    Nobel Prize winner & "Godfather of AI"

  • The development of full artificial intelligence could spell the end of the human race.
    Stephen Hawking

    Theoretical physicist and cosmologist

  • ... we should have to expect the machines to take control.
    Alan Turing

    Inventor of the modern computer

  • If we pursue [our current approach], then we will eventually lose control over the machines.
    Stuart Russell

    Author of the standard AI textbook

  • Rogue AI may be dangerous for the whole of humanity. Banning powerful AI systems (say beyond the abilities of GPT-4) that are given autonomy and agency would be a good start.
    Yoshua Bengio

    AI Turing Award winner

  • [—]% of AI scientists believe the alignment problem is real & important

  • [—]% of citizens want AI to be slowed down by our governments

  • [—]% chance we'll reach AGI in 2025