Why Nvidia Is Worth $5 Trillion: Inside a $35 Billion AI Data Center

If you want to know why Nvidia is valued at $5 trillion, take a look at the data and chart below. They show how this tech giant is scooping up a huge portion of the AI spending boom.

As AI enters its industrial phase, the world’s most advanced data centers now measure their scale not in square footage or servers, but in gigawatts of computing capacity. Wall Street has begun to measure the cost of those gigawatts and to predict which companies might benefit from the spending spree.

TD Cowen analysts put this in context, writing in a research note this week that 1 gigawatt is roughly the output of a nuclear reactor. That’s the new baseline for next-generation AI data centers, such as xAI’s Colossus 2 in Memphis, Meta’s Prometheus in Ohio and Hyperion in Louisiana, OpenAI’s Stargate, and Amazon’s Mount Rainier project in Indiana.

These sprawling structures consume huge amounts of electricity, combining it with capital and silicon to churn out intelligence. It’s an expensive process.

According to new analysis from Bernstein Research, 1 gigawatt of AI data center capacity costs about $35 billion. That may sound extreme, but it represents the new economic foundation of AI. Each gigawatt of data center capacity is not just a measure of power, but a proxy for an emerging industrial ecosystem spanning semiconductors, networking gear, power systems, construction, and energy generation.

Here’s what makes up the $35 billion gigawatt (GW), and which companies stand to gain, according to Bernstein and TD Cowen estimates this week.

GPUs

The single biggest cost driver in an AI data center is the compute itself. Bernstein estimates that roughly 39% of total spending is devoted to GPUs, dominated by Nvidia’s GB200 and upcoming AI chips such as the Rubin series.

With Nvidia’s 70% gross profit margins, Bernstein calculates that the company captures nearly 30% of total AI data center spending as profit. No wonder this company is worth almost $5 trillion.
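The profit math above can be sketched in a few lines. The $35 billion total, 39% GPU share, and 70% gross margin come from the figures Bernstein cites; applying that margin uniformly to all GPU spending is a simplifying assumption.

```python
# Back-of-the-envelope check of Nvidia's profit share per 1 GW AI data center,
# using the Bernstein figures cited above. Assumes the ~70% gross margin
# applies uniformly to all GPU spending (a simplification).
TOTAL_COST_PER_GW = 35e9    # ~$35B per gigawatt (Bernstein)
GPU_SHARE = 0.39            # ~39% of total spend goes to GPUs
NVIDIA_GROSS_MARGIN = 0.70  # ~70% gross margin

gpu_spend = TOTAL_COST_PER_GW * GPU_SHARE         # ~$13.65B per GW
nvidia_profit = gpu_spend * NVIDIA_GROSS_MARGIN   # ~$9.6B per GW
profit_share = nvidia_profit / TOTAL_COST_PER_GW  # ~27% of total spend

print(f"GPU spend per GW: ${gpu_spend / 1e9:.2f}B")
print(f"Nvidia gross profit per GW: ${nvidia_profit / 1e9:.2f}B")
print(f"Share of total data center spend: {profit_share:.1%}")
```

The 39% GPU share times the 70% margin works out to about 27 cents of every data center dollar landing as Nvidia gross profit, which is the "nearly 30%" figure Bernstein describes.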

TD Cowen’s data shows that each gigawatt translates to more than 1 million GPU dies, the core brain of these AI chips. Nvidia’s foundry partner, TSMC, earns $1.3 billion per GW from manufacturing many of these components, these analysts estimated.

Other chipmakers such as AMD and Intel are trying to catch up, while hyperscalers including Google, Amazon, and Microsoft are investing in AI ASICs, custom accelerators that could reduce total system costs. Even so, GPUs remain the economic center of gravity, according to Bernstein and TD Cowen analysts.

Networking

Next in line are the arteries connecting those GPUs together. Bernstein estimates 13% of data center costs go to networking equipment such as high-speed switches and optical interconnects.

Arista Networks, Broadcom, and Marvell are positioned to benefit as switch vendors and chip designers. Arista’s high margins mean its profit share is proportionally greater than its revenue share.

Meanwhile, component makers including Amphenol and Luxshare gain from cabling and connectors, while optical transceiver makers such as InnoLight, Eoptolink, and Coherent stand to profit, too, according to Bernstein analysts.

Power and Cooling Infrastructure

The physical infrastructure around the compute, including racks, generators, transformers, and uninterruptible power supplies, accounts for another big part of the cost of a 1 GW AI data center. Power distribution alone takes up nearly 10% of spending, according to Bernstein.

Eaton, Schneider Electric, ABB, and Vertiv are major players here. Vertiv also has an opportunity in thermal management, which makes up about 4% of total spend, split between air and liquid cooling systems, Bernstein estimates.

Real Estate, Electricity, and Labor

Land and buildings make up about 10% of upfront costs. But once the lights go on, operational costs are surprisingly small. It costs about $1.3 billion in electricity to run a 1 GW AI data center for a year. Personnel costs are also negligible, with huge data centers reportedly operating with 8 to 10 people who get paid $30,000 to $80,000 per year each, according to Bernstein.
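The $1.3 billion electricity figure implies a power price that can be backed out with simple arithmetic. The sketch below assumes the facility draws its full 1 GW around the clock all year, which is an idealization; real-world utilization and cooling overhead vary.

```python
# Back out the implied electricity price from the ~$1.3B/year figure above.
# Assumes the facility draws its full 1 GW continuously (an idealization;
# actual utilization and power-usage effectiveness vary).
HOURS_PER_YEAR = 24 * 365             # 8,760 hours
annual_kwh = 1e6 * HOURS_PER_YEAR     # 1 GW = 1,000,000 kW -> ~8.76 billion kWh
annual_cost = 1.3e9                   # ~$1.3B per year (Bernstein)

implied_price = annual_cost / annual_kwh  # ~$0.15 per kWh

print(f"Annual consumption: {annual_kwh / 1e9:.2f} billion kWh")
print(f"Implied electricity price: ${implied_price:.3f}/kWh")
```

Running a gigawatt flat-out consumes roughly 8.76 billion kilowatt-hours a year, so the $1.3 billion figure corresponds to an average price of roughly 15 cents per kWh.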

The bottleneck, however, is shifting toward power availability. Siemens Energy, GE Vernova, and Mitsubishi Heavy now report surging orders for turbines and grid infrastructure as hyperscalers fight to secure reliable electricity at scale.

Reach out to me via email at abarr@businessinsider.com.
