One of the lesser-discussed impacts of the AI push is the sheer amount of energy required to run the massive computing infrastructure behind these expansive systems.
According to reports, the training process for OpenAI’s GPT-4, which ran on around 25,000 NVIDIA A100 GPUs, consumed up to 62,000 megawatt-hours. That’s equivalent to the energy needs of 1,000 U.S. households for over five years.
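The household comparison can be sanity-checked with a quick calculation. This is a rough sketch, and the average household consumption figure of roughly 10,500 kWh per year is an assumption (a commonly cited U.S. average), not a number from the article:

```python
# Rough sanity check of the household-energy comparison above.
# Assumption (not from the article): an average U.S. household
# uses about 10,500 kWh of electricity per year.
TRAINING_MWH = 62_000            # reported GPT-4 training energy, in MWh
HOUSEHOLD_KWH_PER_YEAR = 10_500  # assumed average household use, in kWh

household_mwh_per_year = HOUSEHOLD_KWH_PER_YEAR / 1_000   # 10.5 MWh/year
years_for_1000_households = TRAINING_MWH / (1_000 * household_mwh_per_year)
print(f"{years_for_1000_households:.1f} years")  # about 5.9 years
```

Under that assumed average, 62,000 MWh works out to just under six years of electricity for 1,000 households, consistent with the "over five years" figure.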
And that’s just one project. Meta’s new AI supercluster will include 350,000 NVIDIA H100 GPUs, while X and Google, among others, are also building massive hardware projects to power their own models.
It’s a huge resource burden that will require significant investment to sustain.
And it’ll also have an environmental impact.
To provide some perspective on this, the team at Visual Capitalist has put together an overview of Microsoft’s rising electricity needs as it continues to work with OpenAI on its projects.