Is the AI Bubble Bursting?

Travis Hoium
01-31

On Monday, $NVIDIA(NVDA)$ 's shares fell 17% and the company lost $593 billion in market value, the largest single-day loss in history.

The cause was DeepSeek, an open-source AI model out of China that competes head-to-head with OpenAI’s o1 model and similar models from Anthropic and Google.

The debates over whether DeepSeek’s training cost more than the company claims, or which chips it used, aren’t what matters for the industry long-term. What matters is that model training is getting cheaper and models are getting smaller.

Both trends are a big deal for the AI bubble inflating the market.

What is going on? Everyone is reevaluating how much spending will go into AI this year and beyond. And the answer may not be as bullish as it seemed a few weeks ago.

Is Capex Growing?

We are still early in the earnings cycle, but two notes from yesterday’s calls with $Microsoft(MSFT)$ and $Meta Platforms, Inc.(META)$ interested me.

While we expect to be AI capacity constrained in Q3, by the end of FY '25, we should be roughly in line with near-term demand given our significant capital investments.

Amy Hood, Microsoft CFO

What Hood is saying is that Microsoft has been spending ahead of demand, buying land, building data centers, and buying GPUs. But it will soon “catch up” with demand, and as a result, spending will slow.

  • 2024 Capex: $55.7 billion

  • Q1 2025 Capex: $20.0 billion or $80 billion run rate

  • Q2 2025 Capex: $22.6 billion or $90.4 billion run rate

  • 2025 Capex Guidance: $80 billion, implying a $74.8 billion run rate in calendar Q1 and Q2 of 2025

This implies a reduction in the capex run rate from the first half of fiscal 2025 to the second half of fiscal 2025.
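The run-rate arithmetic above is simple multiply-by-four annualization, but it’s worth checking. A quick sketch, using only the figures from the bullets (all in billions):

```python
# Annualize Microsoft's quarterly capex and back out the implied
# second-half run rate from full-year guidance (all figures in $B).
q1_fy25 = 20.0          # fiscal Q1 2025 capex
q2_fy25 = 22.6          # fiscal Q2 2025 capex
fy25_guidance = 80.0    # full-year fiscal 2025 guidance

# A quarter's "run rate" is just the quarterly figure annualized.
q1_run_rate = q1_fy25 * 4   # $80.0B
q2_run_rate = q2_fy25 * 4   # $90.4B

# Guidance minus the first half leaves the second-half budget;
# doubling it gives the implied run rate for calendar Q1-Q2 2025.
h2_run_rate = (fy25_guidance - (q1_fy25 + q2_fy25)) * 2   # $74.8B

print(q1_run_rate, round(q2_run_rate, 1), round(h2_run_rate, 1))
# prints: 80.0 90.4 74.8
```

The implied second-half run rate ($74.8 billion) sitting below both Q1’s ($80 billion) and Q2’s ($90.4 billion) is the slowdown described here.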

Over at Meta Platforms, the story was more bullish, but the numbers are similar.

  • 2023 Capex: $28.1 billion for the full year

  • Q1 2024 Capex: $6.72 billion or $26.9 billion run rate

  • Q2 2024 Capex: $8.47 billion or $33.9 billion run rate

  • Q3 2024 Capex: $9.2 billion or $36.8 billion run rate

  • Q4 2024 Capex: $14.84 billion or $59.4 billion run rate

  • 2025 Capex Guidance: $60 billion to $65 billion

Measured against the Q4 2024 exit run rate of $59.4 billion, capex growth at the high end of guidance will be 9.4%. At the low end, it will be just 1.0%.
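Those percentages come from comparing the guidance range to Meta’s Q4 2024 run rate. A quick sketch of the arithmetic, using the rounded $59.4 billion figure:

```python
# Meta's implied 2025 capex growth versus the Q4 2024 exit run rate
# (all figures in $B; $59.4B is Q4's $14.84B annualized, rounded).
exit_run_rate = 59.4
low_guidance, high_guidance = 60.0, 65.0

low_growth = low_guidance / exit_run_rate - 1    # ~1.0%
high_growth = high_guidance / exit_run_rate - 1  # ~9.4%

print(f"{low_growth:.1%} to {high_growth:.1%}")  # prints: 1.0% to 9.4%
```

In other words, even the high end of guidance barely grows spending from where Meta exited 2024.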

This is an about-face from the massive jump in capex that began when ChatGPT was released in Q4 2022. Note: this is capex from the cash flow statement and doesn’t include finance leases, which Microsoft does include in the capex numbers mentioned above.

For perspective, Microsoft and Meta are NVIDIA’s two biggest customers, and they’re essentially flattening their spending. Yet analysts expect 52% revenue growth for NVIDIA in calendar 2025.

Something seems amiss.

Models Are Commodities

Why the shift in how big tech is thinking about capex?

The biggest change is the thinking about how much capacity is needed to train models. DeepSeek is the one that’s gotten the attention for its reported $5.6 million training cost. I’m not here to debate the exact figure, but it’s clear the model was MUCH cheaper to train than OpenAI’s similarly performant model.

Now, companies are shifting their compute buildout from model training to inference, and incremental compute will be bought for inference workloads. This will cause a couple of changes.

  1. Capex will scale with demand and may decline as chips and systems get more efficient.

  2. Spending may shift away from NVIDIA chips, which, outside of Google’s TPUs, are the only game in town for training, toward ASICs and custom silicon like $Amazon.com(AMZN)$ ’s Inferentia.

Microsoft has already backed away from OpenAI, declining to participate in its last funding round. Changes are happening, whether we want to see them or not.

Inference Will Move On-Device

What would give me the most heartburn as the CFO of a hyperscaler is the likelihood that inference will simply move on-device.

One of the implications of more efficient models like o1-mini, DeepSeek’s R1, or Gemini Nano is that they (or their distilled versions) can fit on devices like mobile phones and computers. No need for expensive GPUs in data centers.

This was always the next real step change in AI, but the market seemed to treat long-term data center growth as inevitable. On-device inference doesn’t mean data centers are dead, but is the data center growth phase of this story over?

What Does This Mean For Investors?

Here’s the fundamental question I’ve been grappling with for months: Where is there durable value generation in AI?

I don’t think we’ve found the answer, and that has me wondering why so many “AI stocks” trade at nosebleed valuations.

Every stock below has a price-to-sales multiple over 10x, indicating high expectations for future growth. And those expectations keep rising: NVIDIA’s P/S multiple is 10x higher than it was a decade ago.

We are seeing signs the “buy all the GPUs at any price” phase of AI investment is over. Models are being commoditized, inference is getting cheaper and moving on-device, and even the hyperscalers are slowing their spending growth.

And despite signs the market is overvalued, the most popular AI stocks continue to trade at crazy multiples.

Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation on acquiring or disposing of any financial products, any associated discussions, comments, or posts by author or other users should not be considered as such either. It is solely for general information purpose only, which does not consider your own investment objectives, financial situations or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information, investors should do their own research and may seek professional advice before investing.
