Nvidia and OpenAI Join Forces: $100 Billion Investment for a Revolution in Artificial Intelligence

Nvidia Corporation and OpenAI have signed a memorandum of understanding on a strategic partnership, with total investments potentially reaching $100 billion.

According to Bloomberg, the funding will be allocated in phases, with the initial $10 billion being disbursed upon the agreement’s signing. As part of this arrangement, Nvidia will acquire an equity stake in OpenAI.

The two parties plan to establish data centers with a capacity of 10 GW, aiming to address the critical industry challenge of insufficient computational resources for training complex models.

In expanding its infrastructure, OpenAI intends to enhance advanced functionalities—ranging from sophisticated logical reasoning and multimodal data processing to systems designed for in-depth document analysis. The partners believe this initiative will not only lower the cost of AI-driven solutions but also expedite their transition from labs to real-world applications.

In an interview with CNBC, Nvidia CEO Jensen Huang described the deal as a landmark moment in artificial intelligence.

"We are witnessing the dawn of an industrial revolution in AI," he stated.

Sam Altman, co-founder and CEO of OpenAI, emphasized that the new computational infrastructure will serve as the foundation for the economy of the future.

"Everything begins with computation. Our collaboration with Nvidia is focused on achieving new breakthroughs in artificial intelligence and ensuring their widespread application for individuals and businesses," he added.

Greg Brockman, OpenAI's president, confirmed plans to bring the technology's benefits to a broader audience.

The first phase of the project is expected to become operational in the second half of 2026 on Nvidia’s Vera Rubin platform.

Building out 10 GW of computing capacity is a complex undertaking, and far from an environmentally friendly one.

According to energy consultancy 174 Power Global, cooling systems for such facilities could account for up to 40% of their total energy consumption.

Deloitte experts warned that by the end of 2025, data centers will account for approximately 2% of global electricity consumption (536 TWh). Demand from energy-intensive AI could push this figure above 1,000 TWh by 2030.
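For readers who want to sanity-check these projections, a quick back-of-the-envelope calculation (using only the round numbers cited above; the variable names are illustrative, not from Deloitte) shows what they imply:

```python
# Back-of-the-envelope check of the Deloitte figures cited above.
# All inputs are the round numbers from the article; names are illustrative.

dc_2025_twh = 536      # projected data-center electricity use in 2025, TWh
share_2025 = 0.02      # ~2% of global electricity consumption
dc_2030_twh = 1000     # projected data-center electricity use by 2030, TWh

# If 536 TWh is ~2% of the global total, the implied global figure is:
implied_global_twh = dc_2025_twh / share_2025

# Growth factor of data-center demand between 2025 and 2030:
growth_factor = dc_2030_twh / dc_2025_twh

print(f"Implied global electricity use: {implied_global_twh:,.0f} TWh")
print(f"Data-center demand growth 2025 -> 2030: {growth_factor:.1f}x")
```

The implied global total of roughly 26,800 TWh is consistent with commonly cited estimates of world electricity consumption, and the projected demand nearly doubles over five years.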

The United Nations estimates that a single request to ChatGPT consumes ten times more energy than a Google search. Meanwhile, cooling data centers requires roughly six times as much water as Denmark consumes in total.

Research from the Institute for Energy and Environmental Research indicates that in 2018, there were 1,000 data centers in the United States drawing a combined 11 GW of power (1.9% of the country's total electricity consumption, producing 31.5 million tons of greenhouse gas emissions). By 2025, their number had grown to over 5,000.

"As data centers proliferate, their contribution to carbon emissions is steadily increasing. According to a 2024 study, the carbon footprint of these facilities has reached 105 million metric tons—approximately 2% of total emissions in the U.S., up from 31.5 million tons in 2018," experts noted.
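The emissions figures quoted above imply a striking growth rate over six years, which a one-line calculation makes explicit (figures taken directly from the article; variable names are illustrative):

```python
# Growth of U.S. data-center carbon emissions, per the figures cited above.

emissions_2018_mt = 31.5   # million metric tons of CO2, 2018
emissions_2024_mt = 105.0  # million metric tons of CO2, per the 2024 study

# Factor by which emissions grew between 2018 and 2024:
growth = emissions_2024_mt / emissions_2018_mt

print(f"U.S. data-center emissions grew ~{growth:.1f}x between 2018 and 2024")
```

That is roughly a 3.3-fold increase, in line with the five-fold growth in the number of facilities over a similar period.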

It is worth mentioning that in July, Meta announced plans to establish a 5 GW data center.