
Implementing a Pause internationally - addressing the hard questions

If we allow the creation of a superintelligent AI, we are risking every single life on Earth. When we talk about a Pause, we are talking about an international ban on the creation of a superintelligent AI. But making it illegal is not enough: we need to actually prevent it from being built. To keep us safe, we need to make building a superintelligence even more difficult than it already is. And that means we need to address some hard questions.

How do we regulate hardware used for AI?

To train a model like GPT-4, you need a lot of highly specialized and costly hardware: around 25,000 Nvidia A100 GPUs at roughly $10,000 each, which comes to about $250 million in GPUs alone. At this point, effectively one company makes these AI-grade graphics cards: Nvidia. Only one company can manufacture the underlying chips: TSMC. And only one company builds the lithography machines TSMC needs: ASML.

This all means that the supply chain for creating these AI models is very centralized, which makes it relatively easy to control. Implementing GPU export controls and keeping track of sales would be a good first step.
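
To make this concrete, here is a minimal sketch of what sale tracking could look like in software. The threshold, class, and function names are hypothetical illustrations for this article, not any real regulator's system; the flagging level simply reuses the GPT-4-scale figure from above.

```python
from dataclasses import dataclass, field

# Hypothetical threshold: flag any buyer who accumulates enough GPUs
# to train a GPT-4-scale model (~25,000 A100-class accelerators).
FLAG_THRESHOLD_GPUS = 25_000

@dataclass
class GpuRegistry:
    """Toy registry tracking cumulative GPU sales per buyer."""
    sales: dict[str, int] = field(default_factory=dict)

    def record_sale(self, buyer: str, num_gpus: int) -> bool:
        """Record a sale; return True if the buyer now exceeds the threshold."""
        self.sales[buyer] = self.sales.get(buyer, 0) + num_gpus
        return self.sales[buyer] >= FLAG_THRESHOLD_GPUS

registry = GpuRegistry()
registry.record_sale("lab-a", 10_000)            # below threshold
flagged = registry.record_sale("lab-a", 16_000)  # 26,000 total -> flagged
print(flagged)  # True
```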

But as Moore's Law continues, the bar for creating these models will keep dropping. If it turns out to be hard to make AI provably safe, the Pause may need to last many years. Not only will the hardware become cheaper and more powerful, but we can also expect more efficient algorithms to be developed (we'll talk about this later). This means we'll need to tighten regulation as time goes on.
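
As a rough illustration of why regulation would need to tighten, here is a back-of-the-envelope extrapolation. It assumes the cost of a fixed amount of compute halves every two years (the classic Moore's Law rate — an assumption for illustration, not a prediction):

```python
# Back-of-the-envelope: hardware cost of GPT-4-scale training over time,
# assuming cost per unit of compute halves every 2 years (an assumption).
initial_cost = 250_000_000  # ~25,000 GPUs x ~$10,000 each (see above)
halving_period_years = 2

for years in range(0, 11, 2):
    cost = initial_cost / 2 ** (years / halving_period_years)
    print(f"after {years:2d} years: ~${cost:,.0f}")

# after  0 years: ~$250,000,000
# after  2 years: ~$125,000,000
# after  4 years: ~$62,500,000
# after  6 years: ~$31,250,000
# after  8 years: ~$15,625,000
# after 10 years: ~$7,812,500
```

Under this assumption, hardware that costs a quarter of a billion dollars today would be within reach of far smaller actors within a decade.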

At some point, we may have to stop allowing hardware to become more capable at all, because the risk that it could be used to create a superintelligent AI would simply be too high.

How do we regulate software used for training AI?

The Transformer architecture revolutionized the field of AI. Its parallelized design allowed models to be scaled up far further, at much lower cost, while producing better results. The software side of things, however, is a lot more difficult to control than the hardware side: software is just information, and it can be copied and distributed very easily.
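
To illustrate the parallelism, here is a minimal sketch of the self-attention operation at the heart of the Transformer, in plain NumPy (single head, no masking or positional encoding; the dimensions are arbitrary toy values). The key point is that every token's output is computed in one batch of matrix multiplications, rather than step by step as in earlier recurrent designs:

```python
import numpy as np

def self_attention(x: np.ndarray, wq, wk, wv) -> np.ndarray:
    """Single-head self-attention over a whole sequence at once.

    x: (seq_len, d_model). All tokens are processed in parallel; there is
    no sequential loop over positions, which is what makes training so
    easy to scale across many GPUs.
    """
    q, k, v = x @ wq, x @ wk, x @ wv             # project all tokens at once
    scores = q @ k.T / np.sqrt(k.shape[-1])      # all pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v                           # weighted mix of values

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8                          # toy sizes
x = rng.normal(size=(seq_len, d_model))
wq, wk, wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
print(self_attention(x, wq, wk, wv).shape)       # (4, 8)
```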

Fir