
At roughly $2,900, HBM accounts for nearly half of the B200's production cost. This explains why SK Hynix and Samsung, the leading HBM suppliers, capture significant value from the AI boom. Nvidia's margins depend heavily on memory pricing dynamics it doesn't control.
Why HBM Costs So Much
The B200 integrates 192 GB of HBM3E memory, priced at approximately $14-$17 per gigabyte. But HBM isn't ordinary memory. It stacks DRAM dies vertically, connected by through-silicon vias (TSVs), and that stacking is what enables its bandwidth: 8 TB/s on the B200, double the previous generation.
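The bill-of-materials math is easy to check. Here is a minimal sketch in Python using only the figures above; the implied total production cost is an inference from the "nearly half" claim in the opening, not a number stated in the article.

```python
# Back-of-the-envelope check of the article's HBM figures.
# All inputs are the article's estimates, not confirmed Nvidia numbers.

HBM_CAPACITY_GB = 192      # B200 HBM3E capacity
PRICE_PER_GB_LOW = 14.0    # $/GB, low end of the article's range
PRICE_PER_GB_HIGH = 17.0   # $/GB, high end

hbm_cost_low = HBM_CAPACITY_GB * PRICE_PER_GB_LOW    # $2,688
hbm_cost_high = HBM_CAPACITY_GB * PRICE_PER_GB_HIGH  # $3,264

print(f"HBM cost range: ${hbm_cost_low:,.0f} - ${hbm_cost_high:,.0f}")
# Midpoint ~$2,976, consistent with the ~$2,900 figure above.

# If ~$2,900 is "nearly half" of production cost, the implied total
# B200 production cost is on the order of $6,000 (an inference, not
# a figure stated in the article).
implied_total = 2_900 / 0.5
print(f"Implied total production cost: ~${implied_total:,.0f}")
```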
This architectural complexity means HBM consumes approximately three times the wafer capacity of standard DRAM to produce equivalent bits, so every gigabyte of HBM shipped displaces roughly three gigabytes of potential commodity DRAM supply. It's memory, but it's extraordinarily demanding memory.
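To see why that ~3x multiplier matters for the broader market (the theme of the next section), consider an illustrative model: a fixed pool of DRAM wafer starts, where each wafer produces either one unit of standard DRAM bits or one-third of a unit of HBM bits. The wafer-share values below are hypothetical; only the ~3x area multiplier comes from the article.

```python
# Illustrative model of the supply squeeze implied by the ~3x figure.

HBM_AREA_MULTIPLIER = 3.0  # article's estimate: ~3x wafer area per bit

def relative_bit_output(hbm_wafer_share: float) -> float:
    """Total bit output (vs. an all-standard-DRAM baseline) when a
    given share of wafer starts is diverted to HBM."""
    standard = 1.0 - hbm_wafer_share
    hbm = hbm_wafer_share / HBM_AREA_MULTIPLIER
    return standard + hbm

for share in (0.0, 0.10, 0.25, 0.40):
    out = relative_bit_output(share)
    print(f"{share:>4.0%} of wafers on HBM -> "
          f"{out:.1%} of baseline bits ({1 - out:.1%} less supply)")
```

Under these assumptions, shifting just a quarter of wafer starts to HBM removes roughly 17% of total industry bit supply, which is the mechanism behind the price dynamics below.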
The Memory Supercycle
The AI boom has triggered a structural transformation in memory markets:
– DRAM contract prices surged 420% in 2024, from $3.75 in January to $19.50 in November (arithmetic checked in the sketch after this list)
– SK Hynix announced all its chips are sold out through 2026
– Samsung and SK Hynix raised prices 30% for Q4 2025
– Inventory levels collapsed from 17 weeks of supply in 2024 to 2 weeks in late 2025
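The headline 420% figure from the first bullet checks out. A quick verification, using the prices as quoted (the article does not specify the unit of quotation, so only the ratio matters here):

```python
# Checking the contract-price arithmetic from the list above.

jan_price = 3.75
nov_price = 19.50

pct_increase = (nov_price - jan_price) / jan_price * 100
print(f"Increase: {pct_increase:.0f}%")    # 420%, matching the article

multiple = nov_price / jan_price
print(f"Price multiple: {multiple:.1f}x")  # prices ended 5.2x where they started
```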
The Triopoly
The HBM market is controlled by three players: SK Hynix (~50% share), Samsung (~40%), and Micron (~10%). All three have their 2025-2026 HBM production sold out.
New capacity takes years to build, and memory makers are cautious about overbuilding, fearing a glut if AI demand cools. The shortage may persist for 3-4 years.
Key Takeaway
As this memory-chokepoint analysis shows, SK Hynix and Samsung are becoming critical AI infrastructure providers, not just commodity suppliers. Their production decisions determine the pace of the AI buildout.
Source: The Economics of the GPU on The Business Engineer