Nvidia and OpenAI: Transforming AI with a $100 Billion Investment

Nvidia Corporation and OpenAI have signed a memorandum on a strategic partnership, with total investment potentially reaching $100 billion.

According to Bloomberg, the funding will be allocated in phases, with an initial $10 billion being made available upon signing the agreement. As part of this arrangement, Nvidia will acquire an equity stake in OpenAI.

The partners plan to deploy data centers with a capacity of 10 GW. Their objective is to address a fundamental limitation of the industry: the scarcity of computational power needed to train complex models.

As OpenAI expands its infrastructure, it aims to enhance advanced capabilities, such as sophisticated logical reasoning, multimodal data processing, and systems for in-depth document analysis. The partners believe this will not only reduce the costs of AI solutions but also expedite their transition from labs to real-world applications.

In an interview with CNBC, Nvidia CEO Jensen Huang described the deal as a monumental breakthrough in the field of artificial intelligence.

“We are talking about the beginning of an industrial revolution in AI,” he stated.

Sam Altman, co-founder and CEO of OpenAI, emphasized that the new computing infrastructure will form the foundation of the economy of the future.

“It all begins with computation. Our collaboration with Nvidia is aimed at achieving new breakthroughs in artificial intelligence and their widespread use for people and businesses,” he added.

Greg Brockman, the company’s president, confirmed plans to scale the benefits of the technology for a broader audience.

The first phase of the project is expected to become operational in the second half of 2026 on Nvidia’s Vera Rubin platform.

Deploying 10 GW of computing capacity is a complex undertaking, and far from an eco-friendly one.

According to energy consultancy 174 Power Global, cooling systems for such facilities could account for up to 40% of their total energy consumption.
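As a rough illustration (assuming the 40% figure applies to total facility power draw, which the consultancy's estimate frames only as an upper bound), the cooling load implied for a 10 GW deployment can be sketched as:

```python
# Back-of-the-envelope estimate: cooling share of a 10 GW data-center build-out.
# Assumption: the "up to 40%" figure cited by 174 Power Global applies to
# total facility power, not just the IT load.
TOTAL_CAPACITY_GW = 10      # planned Nvidia/OpenAI deployment
COOLING_SHARE = 0.40        # upper-bound cooling fraction from the article

cooling_gw = TOTAL_CAPACITY_GW * COOLING_SHARE
it_load_gw = TOTAL_CAPACITY_GW - cooling_gw
print(f"Cooling: up to {cooling_gw:.1f} GW; remaining for IT load: {it_load_gw:.1f} GW")
```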

Deloitte experts warned that by the end of 2025, data centers are projected to constitute around 2% of global electricity consumption (536 TWh). Demand from energy-intensive AI could escalate this figure to over 1000 TWh by 2030.
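The Deloitte figures above are internally consistent and can be sanity-checked: 536 TWh at roughly 2% of the global total implies worldwide electricity consumption of about 26,800 TWh, and the 2030 projection of over 1,000 TWh would be roughly a doubling. A minimal check, using only the numbers cited in the article:

```python
# Sanity-check the Deloitte projections cited above.
DATACENTER_TWH_2025 = 536      # projected data-center consumption, end of 2025
SHARE_OF_GLOBAL = 0.02         # ~2% of global electricity consumption
DATACENTER_TWH_2030 = 1000     # projected lower bound for 2030

implied_global_twh = DATACENTER_TWH_2025 / SHARE_OF_GLOBAL   # ~26,800 TWh
growth_factor = DATACENTER_TWH_2030 / DATACENTER_TWH_2025    # ~1.9x

print(f"Implied global consumption: ~{implied_global_twh:,.0f} TWh")
print(f"Projected growth by 2030: ~{growth_factor:.1f}x")
```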

According to United Nations estimates, a ChatGPT query consumes ten times more energy than a Google search. Meanwhile, cooling data centers requires roughly six times as much water as Denmark's total consumption.

Research from the Institute for Energy and Environmental Research indicates that in 2018, the United States had 1,000 data centers with a combined power draw of 11 GW (1.9% of total U.S. electricity consumption and 31.5 million tons of greenhouse gas emissions). By 2025, the number of such facilities had exceeded 5,000.
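To relate the 11 GW power figure to the TWh figures used elsewhere in the article, it can be converted to annual energy. The sketch below assumes the 11 GW represents an average (continuous) load, which the article does not state explicitly:

```python
# Convert the 2018 U.S. data-center power draw (11 GW) to annual energy.
# Assumption: 11 GW is an average continuous load, not peak capacity --
# the article does not specify which.
HOURS_PER_YEAR = 8760
AVG_POWER_GW = 11

annual_energy_twh = AVG_POWER_GW * HOURS_PER_YEAR / 1000   # GWh -> TWh
print(f"~{annual_energy_twh:.0f} TWh/year")
```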

“With the proliferation of data centers, their contribution to carbon emissions is steadily increasing. According to a 2024 study, the carbon footprint of these facilities reached 105 million metric tons—approximately 2% of total emissions in the U.S., compared to 31.5 million tons in 2018,” experts noted.

Notably, in July, Meta announced plans to build a 5 GW data center.