NVIDIA consensus implies substantial near-term AI energy demand: Barclays
2024.07.14 05:01
In a recent thematic investing report, Barclays analysts discussed the energy demands poised to accompany the rise of artificial intelligence (AI) technologies, with a particular focus on NVIDIA’s (NASDAQ:NVDA) role in this landscape.
According to analysts, the projected energy needs tied to AI advancements underscore a crucial aspect of NVIDIA’s market outlook.
Barclays’ analysis indicates that data centers could consume more than 9% of current U.S. electricity demand by 2030, driven largely by AI power requirements. The “AI power baked into NVIDIA consensus” is one of the key factors behind this substantial energy forecast, analysts noted.
The report also points out that while AI efficiency continues to improve with each new generation of GPUs, the size and complexity of AI models are growing at a rapid pace. For instance, the size of major large language models (LLMs) has been increasing approximately 3.5 times per year.
Despite these improvements, overall energy demand is set to rise as the scope of AI applications expands. Each new GPU generation, such as NVIDIA’s Hopper and Blackwell series, is more energy-efficient than the last, yet ever-larger and more complex AI models still require substantial computational power.
“Large language models (LLMs) require immense computational power for real-time performance,” the report states. “The computational demands of LLMs also translate into higher energy consumption as more and more memory, accelerators, and servers are required to fit, train, and infer from these models.”
“Organizations aiming to deploy LLMs for real-time inference must grapple with these challenges,” Barclays added.
To illustrate the scale of this energy demand, Barclays projects that powering approximately 8 million GPUs will require around 14.5 gigawatts of power, translating to roughly 110 terawatt-hours (TWh) of energy per year. This forecast assumes an 85% average load factor.
With about 70% of these GPUs expected to be deployed in the U.S. by the end of 2027, that equates to more than 10 gigawatts of power and roughly 75 TWh of energy demand from AI in the U.S. alone within the next three years.
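These figures follow from a simple power-to-energy conversion: installed capacity, times hours per year, times the load factor. Below is a minimal back-of-envelope sketch in Python using only the aggregate numbers quoted above (the report excerpt does not break out per-GPU wattage, so none is assumed here):

```python
# Back-of-envelope check of Barclays' AI power and energy figures.
HOURS_PER_YEAR = 8_760
LOAD_FACTOR = 0.85      # average load factor assumed in the report
TOTAL_POWER_GW = 14.5   # power for ~8 million GPUs, per the report
US_SHARE = 0.70         # share of GPUs expected in the U.S. by end of 2027

# Energy (TWh) = power (GW) * hours * load factor / 1000 (GWh -> TWh)
total_energy_twh = TOTAL_POWER_GW * HOURS_PER_YEAR * LOAD_FACTOR / 1_000
us_power_gw = TOTAL_POWER_GW * US_SHARE
us_energy_twh = total_energy_twh * US_SHARE

print(f"Global: {TOTAL_POWER_GW} GW -> {total_energy_twh:.0f} TWh/year")  # ~108 TWh
print(f"U.S.:   {us_power_gw:.1f} GW -> {us_energy_twh:.0f} TWh/year")    # ~10.1 GW, ~76 TWh
```

The arithmetic lands within rounding of the report’s figures: roughly 110 TWh globally, and just over 10 GW and about 75 TWh for the U.S. share.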
“NVIDIA’s market cap suggests this is just the start of AI power demand deployment,” analysts said. The chipmaker’s ongoing development and deployment of GPUs are poised to drive significant increases in energy consumption across data centers.
Moreover, data centers’ reliance on grid electricity underscores the importance of addressing peak power demands. Data centers operate continuously, necessitating a balanced power supply.
The report cites a notable statement from Sam Altman, CEO of OpenAI, at the World Economic Forum in Davos: “We do need way more energy in the world than I think we thought we needed before… I think we still don’t appreciate the energy needs of this technology.”