
Nvidia’s ‘customer lock-in’ strategy: $2 billion investment in CoreWeave

Published: 2026-01-27 00:28:00
Updated: 2026-01-27 00:28:00
(Source: Yonhap News)
The Financial News, New York City – Reporter Lee Byung-chul
Nvidia announced on the 26th (local time) that it will invest an additional $2 billion (about 2.9 trillion won) in data center operator CoreWeave. Having already taken control of the AI chip market, Nvidia is now moving beyond simply selling chips to directly backing the expansion of its customers’ data centers.
Nvidia said it purchased $2 billion worth of CoreWeave shares at $87.20 per share. Jensen Huang, Nvidia’s chief executive officer, stated, "CoreWeave’s expertise in AI factories and its execution speed are recognized across the industry," adding, "Together, we are racing to meet the growing demand for Nvidia AI factories."
Nvidia refers to data centers purpose-built for AI computing as "AI factories." CoreWeave plans to accelerate the construction of these AI factories through 2030.
Nvidia is not just making a simple equity investment in CoreWeave. It has also agreed to purchase up to $6.3 billion worth of CoreWeave’s cloud services under a contract that runs through 2032. Nvidia will effectively act as a backstop, absorbing CoreWeave’s remaining unsold capacity.
For CoreWeave, the deal secures both capital and demand at the same time.
This transaction extends Nvidia’s recent trend of "alliance investments" with key customers. Nvidia has been injecting capital directly into major clients that purchase its AI chips in large volumes. The strategy is to further expand demand while blocking rivals from entering these accounts.
Nvidia has also pledged to invest up to $100 billion in OpenAI over several years. On the same day, Nvidia’s venture investment arm joined a $200 million funding round for AI video startup Synthesia.
There is a clear reason Nvidia is accelerating these moves. Big Tech companies are increasingly trying to reduce their dependence on Nvidia by developing their own chips.
A prime example is Google’s Tensor Processing Unit (TPU), which is gaining traction as companies such as Anthropic adopt it. OpenAI has also begun developing its own AI accelerators with Broadcom. At the same time, it has signed a GPU purchase agreement with Nvidia’s biggest rival, Advanced Micro Devices (AMD).
AI developers are clearly diversifying their supply of Graphics Processing Units (GPUs), which are essential for training AI models.


Reporter Lee Byung-chul (pride@fnnews.com)