MU
Micron Technology
Summary
What they do:
Manufacture DRAM, NAND flash, and High Bandwidth Memory (HBM): the memory chips that sit inside every AI GPU package, on every server motherboard, and in every storage device. As the only Western HBM producer and #3 globally behind SK Hynix and Samsung, Micron sits at Layer 08 as a critical-path supplier for AI compute.
Why they matter:
Every NVIDIA Blackwell GPU physically requires 8 HBM3E stacks bonded inside the package — no HBM, no GPU. Micron's HBM market share has grown to ~24%, its Q2 FY2026 revenue nearly tripled to $23.9B, and FY2026 is tracking to ~$109B. The company is the sole American HBM manufacturer — a strategic asset in an era of semiconductor supply chain sovereignty.
Recent performance:
Q2 FY2026 revenue $23.9B (up ~196% YoY), non-GAAP EPS $12.20 (beat $9.31 consensus). Q3 FY2026 guidance: revenue $33.5B, gross margin ~81%, EPS ~$19.15. Stock at ~$460, market cap ~$474B. HBM4 volume production underway for NVIDIA Vera Rubin platform.
Our Verdict
The definitive AI memory play — revenue nearly tripling as HBM demand overwhelms supply, with the only Western HBM manufacturing base and 81% guided gross margins — but at ~12x forward P/E the stock has already re-rated 7x from $66, and memory cyclicality means the peak is always closer than it looks.
Structural Trends: 82/100
Moat: 7/10 (HBM oligopoly with NVIDIA validation lock-in and sole-Western-producer geopolitical moat)
AI Exposure: Pure Play, ~65% AI
Play Type: Established
AI Growth: ~196%
Rel. Value: 73 (Compelling)
Price: $465.66 (+9.17%, live via Yahoo Finance)
Market Cap: $525.1B
P/E Ratio: 20.1
P/S Ratio: 9.0x
52W High: $471.34
52W Low: $65.65
52W Change: 609.3%
Beta: 1.61
Micron Technology manufactures memory — the silicon chips that store data for processors to work with. The company operates massive fabrication plants in Boise, Idaho; Manassas, Virginia; Hiroshima, Japan; and Singapore. These are Class 10 cleanrooms the size of multiple football fields, where billions of transistors are etched onto silicon wafers.
The product line spans three categories. HBM (High Bandwidth Memory) is the crown jewel — vertically stacked DRAM dies bonded with through-silicon vias, delivering 10-20x the bandwidth of standard DRAM. Each HBM3E stack is about 1cm tall and contains 24-36GB of memory. An NVIDIA Blackwell B200 GPU requires 8 HBM3E stacks; the next-generation Vera Rubin platform will require 12 HBM4 stacks. System DRAM (DDR5) sits on server motherboards, providing working memory for CPUs — each AI server contains 512GB-2TB. NAND flash is the persistent storage medium that feeds into SSDs and storage devices (L07).
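The stack arithmetic above can be made concrete with a quick sketch (Python here, using only the figures quoted in this section; HBM4 per-stack capacity is not given, so only the stack-count multiplier is computed for Vera Rubin):

```python
# Back-of-envelope HBM content per GPU package, from the figures cited
# above: a Blackwell B200 takes 8 HBM3E stacks of 24-36 GB each, and
# Vera Rubin moves to 12 stacks.

def hbm_capacity_gb(stacks: int, gb_per_stack: int) -> int:
    """Total HBM capacity in GB for one GPU package."""
    return stacks * gb_per_stack

b200_low = hbm_capacity_gb(8, 24)    # 192 GB
b200_high = hbm_capacity_gb(8, 36)   # 288 GB
print(f"Blackwell B200: {b200_low}-{b200_high} GB of HBM3E")

# Stack count alone grows 8 -> 12, a 1.5x content-per-GPU multiplier
print(f"Vera Rubin stack multiplier: {12 / 8:.1f}x")
```

The 1.5x stack multiplier understates the content growth if HBM4 stacks also carry more gigabytes per stack than HBM3E.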
The financial transformation is staggering. Q2 FY2026 revenue hit $23.9B — nearly tripling from $8.05B a year earlier. DRAM revenue was $18.8B (79% of total), up 207% YoY. Cloud memory business revenue rose over 160% to $7.75B. Full-year FY2026 is tracking to approximately $109B — nearly doubling from prior year. Gross margins have expanded to the ~80% range as HBM and data center DRAM command premium pricing. The company guided Q3 FY2026 at $33.5B revenue with ~81% gross margin and $19.15 EPS.
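As a sanity check on the growth figures, a few lines of arithmetic (the reported revenues are rounded, so the computed percentage lands within a point of the ~196% cited):

```python
# Sanity-check the growth arithmetic quoted above, using the reported
# (rounded) revenue figures from the text.
q2_fy26_rev = 23.9   # $B, Q2 FY2026
q2_fy25_rev = 8.05   # $B, a year earlier
yoy_growth = (q2_fy26_rev / q2_fy25_rev - 1) * 100
print(f"Revenue YoY growth: {yoy_growth:.0f}%")                # ~197%

dram_rev = 18.8      # $B, Q2 FY2026 DRAM revenue
print(f"DRAM share of revenue: {dram_rev / q2_fy26_rev:.0%}")  # 79%
```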
Micron's HBM market share has grown from ~13% to ~24%, with SK Hynix at ~43% and Samsung at ~33%. Micron started volume production of HBM4 for NVIDIA's Vera Rubin platform in Q1 FY2026, with next-generation HBM4e scheduled for 2027. The company has sold out its entire 2026 HBM supply. Micron is a CHIPS Act beneficiary, with its Idaho megafab expansion co-funded by the US government.
The Catch
Micron has risen 7x from $66 to $460 in twelve months. That move was earned — revenue nearly tripled, HBM demand is real, and the AI buildout is structural. But memory is the most cyclical business in semiconductors. Every memory supercycle in the past four decades has ended with a crash — not a soft landing. The HBM content-per-GPU multiplier (8→12 stacks) could extend this cycle longer than previous ones, but it does not eliminate cyclicality. When the cycle turns — whether from AI capex slowdown, inventory builds, or HBM supply finally catching demand — Micron's margins compress from 80% to 50% within 2-3 quarters, earnings collapse, and the stock gives back 40-60% of its gains. The 7x move means the downside from here is larger in absolute dollars than the upside for most reasonable scenarios. Position sizing is everything.
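The giveback scenario above can be put in dollar terms with a quick sketch (illustrative arithmetic on the figures quoted in this section, not a forecast):

```python
# Illustrative arithmetic only: where a 40-60% giveback of the $66 -> $460
# run-up (figures from the text) would leave the share price.
low, recent = 66.0, 460.0
run_up = recent - low                      # ~$394 of gains
for giveback in (0.40, 0.60):
    landing = recent - run_up * giveback
    print(f"{giveback:.0%} giveback -> ~${landing:.0f}")
# 40% -> ~$302, 60% -> ~$224
```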
If They Win
If HBM demand proves genuinely structural rather than cyclical — if the AI buildout requires ever-increasing memory bandwidth for a decade and content-per-GPU keeps multiplying — Micron becomes America's strategic memory champion. Revenue stabilizes at $120-150B. HBM market share grows to 30%+ as CHIPS Act investments and geopolitical policy route American data center demand through the sole Western producer. Margins sustain 65-75% as HBM's structural supply shortage prevents commodity pricing. Market cap reaches $700B-1T. The geopolitical moat — the only Western company that can supply the HBM chips AI demands — becomes as valuable as TSMC's foundry monopoly. US government policy cements Micron as the default HBM source for American data centers, and the memory cyclicality that has defined the industry for decades is partially broken by the structural nature of AI compute demand.
Not financial advice. All scores generated via AI algorithms using public data.