Can AI escape the "energy crisis"? GPT-6 may be held back by power

36kr · 03-27

An engineer on Microsoft's GPT-6 training cluster project said that deploying more than 100,000 H100 GPUs in a single state would collapse the power grid. Bill Gates has said that electricity is the key to whether data centers can be profitable, and that over the next few years AI development may be constrained by chip design and power supply. Due to delays in power supply, construction of some data centers in the United States has already been extended by 2 to 6 years.

GPT-5 has not yet arrived, and OpenAI appears to have already begun training GPT-6, but electricity may have become a chokepoint.

Kyle Corbitt, co-founder and CEO of AI startup OpenPipe, revealed that he recently spoke with a Microsoft engineer responsible for the GPT-6 training cluster project, who complained that deploying InfiniBand-class links between GPUs across regions is a painful undertaking.

Corbitt asked why the training clusters were not concentrated in a single region. The Microsoft engineer responded: "Well, we've tried that, but if you put more than 100,000 H100 GPUs in one state, the grid will collapse."

How much is 100,000 H100s? For reference, a report from market research firm Factorial Funds estimates that OpenAI's text-to-video model Sora requires as many as 720,000 H100s at peak, an amount that, by the engineer's math, would be enough to collapse the power grids of seven states.
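To get a feel for the scale, here is a back-of-envelope estimate of the grid load such a cluster implies. The per-GPU wattage and the overhead multiplier are assumptions on my part (the H100 SXM's thermal design power is around 700 W, and data centers typically add an overhead factor for host servers, networking, and cooling); neither figure comes from the article.

```python
# Back-of-envelope: sustained grid load of an H100 training cluster.
# ASSUMPTIONS (not from the article): 700 W TDP per H100 SXM GPU,
# ~1.5x facility overhead for servers, networking, and cooling.

H100_TDP_W = 700   # approximate per-GPU thermal design power
OVERHEAD = 1.5     # assumed facility-level multiplier (PUE-style)

def cluster_power_mw(gpus: int) -> float:
    """Approximate sustained facility draw in megawatts."""
    return gpus * H100_TDP_W * OVERHEAD / 1e6

print(cluster_power_mw(100_000))  # 105.0 -> roughly 100 MW per state-sized cluster
print(cluster_power_mw(720_000))  # 756.0 -> Sora's reported peak demand
```

Roughly 100 MW of continuous draw is on the order of a small power plant, which makes the engineer's grid-collapse worry concrete.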

Whether data centers can turn a profit, and how long they take to build, now hinges on electricity

At the just-concluded S&P Global CERAWeek 2024 energy conference, energy industry executives from around the world discussed both the industry's adoption of AI and AI's enormous demand for energy.

“By 2030, AI will consume more electricity than households.” Toby Rice, CEO of EQT, the largest natural gas producer in the United States, cited such a prediction in his speech.

Bill Vass, vice president of engineering at Amazon Web Services, pointed out that the world adds a new data center every three days.

Bill Gates said that electricity is the key to whether data centers can be profitable, and that the amount of electricity AI consumes is staggering: its use will drive up energy demand, and over the next few years AI's development may be constrained by chip design and power supply.

This is not unfounded: because new data centers are going up faster than new power plants, a gap between supply and demand is already emerging. CBRE Group, an American commercial real estate services firm, says that delays in power supply have stretched data center construction timelines by 2 to 6 years.

"Energy Behemoth"

AI's reputation as an "energy behemoth" is well deserved.

OpenAI's Sam Altman has "complained" about AI's energy needs, especially for electricity. At the Davos Forum early this year, he said that AI's development requires an energy breakthrough, and that AI will bring electricity demand far beyond expectations.

Data shows that ChatGPT consumes more than 500,000 kilowatt-hours of electricity per day to handle about 200 million user requests, more than 17,000 times the daily electricity consumption of an average American household. As for search giant Google, if AIGC were invoked in every user search, its annual electricity consumption would rise to about 29 billion kilowatt-hours, more than the annual consumption of countries such as Kenya and Guatemala.
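The ChatGPT figures above are internally consistent, which a quick calculation shows: dividing daily energy by daily requests gives a per-request cost, and dividing by the "17,000x" multiple recovers an implied household figure close to the commonly cited US average of roughly 29 kWh/day (that average is my reference point, not the article's).

```python
# Sanity-check the article's ChatGPT figures:
# 500,000 kWh/day over ~200 million requests, and "17,000x" a
# US household's daily consumption.

DAILY_KWH = 500_000
DAILY_REQUESTS = 200_000_000

per_request_wh = DAILY_KWH * 1000 / DAILY_REQUESTS
print(per_request_wh)  # 2.5 Wh per request

implied_household_kwh = DAILY_KWH / 17_000
print(round(implied_household_kwh, 1))  # 29.4 kWh/day implied per household
```

At about 2.5 Wh per request, a single ChatGPT query costs several times more energy than a conventional web search is usually estimated to.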

Looking back at 2022, before AI had set off today's large-scale craze, data centers in China and the United States already accounted for about 3% and 4%, respectively, of total electricity consumption in their societies.

As global computing power continues to grow, a Huatai Securities research report dated March 24 predicts that by 2030, the total electricity consumption of data centers in China and the United States will reach 0.65-0.95 and 1.2-1.7 trillion kilowatt-hours respectively, about 3.5 times and more than 6 times their 2022 levels. Under an optimistic scenario, AI's electricity consumption in China and the US in 2030 would reach 20% and 31%, respectively, of 2022 society-wide consumption.
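The "3.5 times" and "more than 6 times" multiples can be roughly cross-checked against the 2022 shares quoted earlier (3% of China's and 4% of the US's society-wide consumption). The 2022 society-wide totals below are approximate public figures I am supplying as assumptions; they are not stated in the article.

```python
# Rough consistency check on the Huatai projection.
# ASSUMED 2022 society-wide electricity totals (approximate public
# figures, not from the article): China ~8.6 trillion kWh, US ~4.0.

CN_TOTAL_2022, US_TOTAL_2022 = 8.6, 4.0   # trillion kWh (assumed)
CN_DC_SHARE, US_DC_SHARE = 0.03, 0.04     # data-center shares per the article

cn_dc_2022 = CN_TOTAL_2022 * CN_DC_SHARE  # ~0.26 trillion kWh
us_dc_2022 = US_TOTAL_2022 * US_DC_SHARE  # ~0.16 trillion kWh

# Projected 2030 ranges from the report: China 0.65-0.95, US 1.2-1.7.
print(round(0.95 / cn_dc_2022, 1))  # ~3.7x, in line with "3.5 times"
print(round(1.2 / us_dc_2022, 1))   # ~7.5x, in line with "more than 6 times"
```

Under these assumed baselines the projected multiples land close to the report's stated ratios, so the figures hang together.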

Analysts further note that because data centers are unevenly distributed, regional power shortages will appear first (for example, in Virginia in the United States). Given that US electricity demand has been essentially flat for years, AI could become a key driver returning developed overseas markets to positive electricity-demand growth.

Where does the increase in power come from?

A shortage of electricity naturally calls for "new electricity", but where will it come from? Amid the global push for carbon neutrality, clean energy such as photovoltaics and wind seems the obvious choice, but it is only the "ideal" one.

"It's impossible for us to build 100 gigawatts of new renewable energy (power plants) in a few years. It's a bit difficult." Former US Secretary of Energy Ernest Moniz admitted.

EQT CEO Toby Rice added that technology companies need sufficient reliable power, which renewable sources such as wind and solar cannot provide. As for large nuclear facilities (only one is currently under construction in the United States), they have historically been expensive and slow to build. "Technology companies are not going to wait 7-10 years for this infrastructure, and then you have to use natural gas."

Rice said that technology companies building data centers have already inquired about purchasing natural gas from EQT, asking, "How fast can you deliver it?" and "How much natural gas can we get?"

In U.S. equities, power is "no longer a hidden corner"

First came the "GPU shortage", now the "electricity shortage"; it is hard to say AI's development will be smooth sailing.

It is worth noting that U.S. equity investors hoping to catch the AI wave have already set their sights on this once-overlooked corner.

Vistra Energy, one of the largest power producers and retail electricity suppliers in the United States; Constellation Energy, the largest U.S. energy company; and NRG Energy, the largest U.S. green-power company, have all more than doubled in share price over the past year, and all hit record highs this week.

Measured by gains over the past year and year to date, these three companies have not matched Nvidia, the "strongest stock on Earth", but they have left Microsoft, the "company behind OpenAI", well behind.

This article comes from the WeChat public account "Science and Technology Innovation Board Daily" (ID: chinastarmarket), author Zheng Yuanfang, published by 36kr with authorization.
