The Energy Challenge: How High Computational Costs Put AI Companies at a Tough Crossroads

Current revenues of AI companies may not justify the immense costs of computing, according to HSBC CEO Georges Elhedery, who spoke at the Global Investment Summit for financial leaders in Hong Kong, as reported by CNBC.

In July, analysts at Morgan Stanley indicated that the capacity of global data centers is expected to grow sixfold over the next five years, with the valuation of data centers and their equipment projected to reach $3 trillion by the end of 2028.

A McKinsey report from April provides an even larger figure, predicting that by 2030, meeting the demand for AI infrastructure will require capital expenditures of $5.2 trillion. The costs associated with data centers for traditional IT applications are estimated to be around $1.5 trillion.

Elhedery stated that consumers are not ready to pay for this, and companies will be cautious as performance benefits may not materialize within a year or two.

“It looks like a five-year trend, so the pace of growth suggests we will start seeing tangible benefits in terms of revenue and willingness to pay probably later than investors anticipate,” he noted.

General Atlantic’s Chairman and CEO William Ford concurred: “In the long run, you will create a whole range of new industries and applications, and that will lead to productivity gains, but it will take 10 to 20 years.”

Major tech firms such as Alphabet, Meta, Microsoft, and Amazon have raised their combined capital expenditure forecasts for 2025 to $380 billion. OpenAI has entered into various infrastructure agreements worth around $1 trillion.

Ford commented that the vast expenditures in the AI sector indicate a recognition of the long-term impact of the technology. However, it requires “upfront payment for future opportunities.” He acknowledged that in the initial stages, there may be “misallocation of capital, overvaluation, and irrational exuberance.”

“You are essentially betting on the idea that this will be a large-scale technology similar to railroads or electricity, which had profound impacts over time and transformed the economy. But in the first few years, it’s very challenging to accurately predict how,” concluded the CEO of General Atlantic.

Recently, investors have been actively discussing whether the markets might be overestimating artificial intelligence.

Last week, investor Ray Dalio stated that his personal “bubble indicator” is at a relatively high level. Concurrently, Federal Reserve Chairman Jerome Powell characterized the AI boom as “distinct” from the dot-com era.

Magnus Grimeland, founder of the Singapore-based venture firm Antler, believes the industry is “definitely” not in a bubble. He noted that businesses are adopting neural networks faster than they did during other technological shifts, such as the transition from physical servers to cloud computing.

Furthermore, artificial intelligence has become a “priority” for thought leaders, whether they are heads of medical institutions in India or executives at Fortune 500 companies in America.

“What sets this situation apart from a bubble and makes it fundamentally different from the dot-com era is that much of the growth is driven by real revenues,” Grimeland stated.

Another distinction between the popularity of AI and the dot-com bubble is the consumer adoption rate.

“Think about how quickly our online behavior has changed, right? A year ago, 100% of my search queries were on Google, and now it’s probably 20%,” Grimeland remarked.

AI projects are increasingly being integrated with familiar online systems. In October, OpenAI unveiled its Atlas browser with an integrated chatbot and AI assistants.

OpenAI’s annual revenue exceeds $13 billion, as the company’s CEO Sam Altman mentioned in a podcast. Although this figure is substantial, it pales in comparison to the $1 trillion the startup intends to invest in computational infrastructure over the next decade.
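To see just how stark that gap is, here is a back-of-envelope comparison using the article’s own figures ($13 billion in annual revenue versus $1 trillion planned over a decade); the even split of spending across ten years is a simplifying assumption, not something OpenAI has disclosed.

```python
# Rough comparison of OpenAI's reported annual revenue with its
# planned infrastructure spend, using figures cited in the article.
# Assumption: the $1T commitment is spread evenly over 10 years.
annual_revenue_b = 13      # reported annual revenue, $ billions
planned_spend_b = 1_000    # planned infrastructure spend, $ billions
years = 10

avg_annual_spend_b = planned_spend_b / years
spend_to_revenue = avg_annual_spend_b / annual_revenue_b

print(f"Average annual spend: ${avg_annual_spend_b:.0f}B")
print(f"Spend-to-revenue ratio: {spend_to_revenue:.1f}x")
```

On these assumptions, the implied average spend is roughly $100 billion a year, several times current revenue, which is the tension Gerstner's question pressed Altman on.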

Host Brad Gerstner posed a question to Altman on this topic, to which Altman responded: “Firstly, we are generating much more. Secondly, Brad, if you want to sell your shares, I can find a buyer for you.”

He added that there are critics who “enthusiastically discuss computing systems or anything else and would be eager to buy shares.”

Altman acknowledged that there are circumstances that might lead to issues, such as a lack of access to sufficient computational resources. However, he added, “revenues are growing rapidly.”

“We’re betting on continued growth, which applies not just to ChatGPT. We hope to become one of the key AI services, our consumer device business will be significant, and AI capable of automating science will create tremendous value,” the entrepreneur said.

Microsoft CEO Satya Nadella highlighted that OpenAI has “exceeded” all business plans presented to his company as an investor.

Large corporations continue to invest tens of billions of dollars in AI initiatives despite discussions about a potential bubble in the sector.

In April, OpenAI closed a $40 billion funding round at a $300 billion valuation. In October, the company allowed current and former employees to sell shares worth $6.6 billion; that transaction valued the startup at $500 billion, a record for a private company.

On November 3, the cloud computing startup Lambda announced a multibillion-dollar agreement with Microsoft, which involves creating AI infrastructure based on tens of thousands of Nvidia chips.

“We are in the midst of what is arguably the largest technology build-out we have ever seen. The industry is performing very well right now, and a lot of people are using ChatGPT, Claude, and other accessible AI services,” commented Lambda’s CEO, Stephen Balaban.

On October 3, Microsoft announced a $15.2 billion investment in the UAE over four years, which includes supplying advanced Nvidia graphics chips.

As part of the agreement, the U.S. granted the corporation a license to export the chips.

Microsoft has been deploying investment funds in the region since 2023. The new agreement calls for $7.9 billion in investments from the beginning of 2026 through the end of 2029, including $5.5 billion for capital expenditures and expanding AI infrastructure.

Microsoft also signed a $9.7 billion deal with the Australian company IREN to provide cloud computing capabilities for artificial intelligence. This agreement will give the corporation access to infrastructure built on Nvidia GB300 GPUs.

Grimeland acknowledged that the “vast” sums flowing into AI-related companies involve some “mispricing,” yet he believes the opportunities in the field are far greater.

Energy is a major source of the tremendous costs associated with artificial intelligence. The operation of hundreds of thousands of graphics cards requires a constant power supply. This creates burdens on the grid and leads to rising utility costs, which consumers find frustrating.

Even Altman and Nadella do not know how much energy will be sufficient for AI.

“In this specific case, demand and supply cycles are indeed unpredictable. The biggest challenge we face right now is not an excess of computing power but the ability to rapidly build [data centers] near power sources,” the Microsoft CEO noted on the BG2 podcast.

Otherwise, the company could end up with too many chips in storage and nowhere to connect them, he added.

The increasing demand for electricity has outpaced utility companies’ plans to create new generating capacity. Consequently, data center developers have begun acquiring energy directly, bypassing the grid through special agreements.

“If a very cheap type of energy becomes available on a large scale soon, many people will be extremely disappointed with the existing contracts they signed,” Altman observed.

Recall that in July 2024, Bernstein Research warned of a potential electricity shortfall in the U.S. if demand growth from AI data centers continues at its current pace.