Odd Lots

Ray Wang on How AI Is Causing DRAM Prices to Surge

45 min episode · 2 min read
Topics: Artificial Intelligence

AI-Generated Summary

Key Takeaways

  • HBM Production Trade-offs: High Bandwidth Memory requires three times more wafer capacity than commodity DRAM to produce the same number of bits. When manufacturers shift production to HBM for AI accelerators, they crowd out supply for PCs, smartphones, and gaming consoles. This ratio worsens with HBM4 and HBM5 generations, intensifying the shortage for consumer electronics despite HBM representing over half the DRAM market.
  • Memory Demand Across AI Pipeline: AI applications consume massive memory at every stage—training requires HBM and CPU DRAM, inference needs HBM for memory-bound decode operations, and agentic AI demands extensive CPU-based servers with commodity DRAM. Long context windows amplify this demand as models process increasingly complex prompts and generate lengthy outputs, with ChatGPT usage growing from novelty queries to 800 million users requesting detailed multi-page reports.
  • Supply Constraints Through 2026: Memory manufacturers face clean room space limitations preventing rapid capacity expansion. The primary solution involves node migration to 1b and 1c processes using EUV machines, producing more bits per wafer. However, this migration takes time and competes with HBM production needs. Major producers increased capital expenditure significantly in 2025, but meaningful new fab capacity won't come online until 2028.
  • Demand Destruction Patterns: PC makers including Dell, Lenovo, and Asus implemented price hikes across product lines. MediaTek cut its 2026 mobile chip outlook by 10 to 15 percent, representing 10 million fewer units. Companies face a choice between downgrading product specifications, delaying launches of higher-margin new products, or accepting margin compression. Apple reports managing the impact better than competitors but expects meaningful effects in the second half of 2026.
  • Pricing Dynamics Reversal: Commodity DRAM spot prices remained stable during early AI hype from 2023-2024, then surged dramatically starting late 2024. Current commodity DRAM margins actually exceed HBM margins despite HBM's reputation for higher profitability. Memory suppliers still prioritize HBM production for strategic positioning in the growing AI market, viewing it as a critical growth driver for the next several years despite lower current margins.
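The wafer trade-off in the first takeaway can be made concrete with a little arithmetic. The sketch below is illustrative only: the 3:1 bit ratio comes from the episode, but the wafer counts, the 30% allocation, and the normalized bits-per-wafer figure are assumed round numbers, not data from the discussion.

```python
# Illustrative model of the HBM wafer trade-off: HBM needs roughly
# 3x the wafer capacity of commodity DRAM per bit (ratio from the
# episode), so shifting wafers to HBM shrinks total bit output.

HBM_WAFER_PENALTY = 3  # wafers per bit, relative to commodity DRAM

def total_bits(total_wafers: float, hbm_wafers: float,
               bits_per_commodity_wafer: float = 1.0) -> float:
    """Total DRAM bits produced when `hbm_wafers` out of
    `total_wafers` are allocated to HBM instead of commodity DRAM."""
    commodity_wafers = total_wafers - hbm_wafers
    commodity_bits = commodity_wafers * bits_per_commodity_wafer
    hbm_bits = hbm_wafers * bits_per_commodity_wafer / HBM_WAFER_PENALTY
    return commodity_bits + hbm_bits

baseline = total_bits(100, 0)   # every wafer on commodity DRAM
shifted = total_bits(100, 30)   # 30% of wafers moved to HBM
print(f"bit output falls {1 - shifted / baseline:.0%}")  # prints "bit output falls 20%"
```

Moving 30% of wafers to HBM cuts industry-wide bit output by 20% in this toy model, which is the mechanism behind the consumer-electronics shortage: the bits never reappear elsewhere, and the penalty grows if HBM4/HBM5 worsen the ratio.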

What It Covers

Ray Wang of SemiAnalysis explains how AI demand is creating a severe DRAM shortage, causing spot prices to surge dramatically since late 2024. Memory chip makers face clean room constraints while balancing production between high-margin HBM chips for AI accelerators and commodity DRAM for consumer electronics, creating supply shortages expected to persist through 2027.


Notable Moment

The analyst revealed that commodity DRAM margins currently exceed HBM margins, contrary to conventional wisdom, creating a strategic dilemma for manufacturers. Memory suppliers continue prioritizing HBM production anyway because it represents a new growth driver in a market historically dependent on flat-growth sectors like PCs and mobile phones, making technological leadership in HBM critical for long-term competitive positioning.
