How Micron Technology Makes its Money: Revenue Breakdown
How does Micron Technology (MU) make money? Full FY2024 revenue breakdown — DRAM, NAND, HBM3E for AI. Memory cycle mechanics, HBM vs SK Hynix competition, CHIPS Act fabs, and why memory margins are so volatile explained.
How Does Micron Technology Make its Money?
Micron Technology (NASDAQ: MU) is the only major US-headquartered memory semiconductor manufacturer — and one of only three globally significant producers of DRAM and NAND flash, alongside Samsung and SK Hynix. Micron generated $25.1 billion in total revenue for fiscal year 2024 (ending August 2024), up 61.9% year-over-year, with net income of $778 million — a dramatic recovery from a $5.8 billion net loss in FY2023 during one of the worst memory downturns in industry history.
Memory semiconductors — primarily DRAM (Dynamic Random-Access Memory) and NAND flash — are the components that let computing devices store and access data. DRAM provides fast working memory that CPUs and GPUs read and write billions of times per second; NAND flash provides slower but persistent storage. Both technologies are in essentially every electronic device: smartphones, PCs, data centre servers, automobiles, and industrial equipment.
The single most important development in Micron’s recent history is its production and qualification of High Bandwidth Memory (HBM3E) — a specialised DRAM product stacked in 3D and placed directly adjacent to (or on top of) AI processors like Nvidia’s H200 and B200 GPUs. HBM is the memory backbone of modern AI training and inference, and demand for it has surged in lockstep with the AI infrastructure buildout. HBM represents a rare product where Micron holds significant pricing power — supply is tight, demand is urgent, and only three companies on Earth can make it.
Key Takeaways
- Micron generated $25.1B in FY2024 revenue (+61.9% YoY) with $778M net income — a massive swing from a $5.8B net loss in FY2023, illustrating the violent cyclicality of the memory industry
- DRAM accounts for ~70% of revenue ($17.6B) and is the dominant and more profitable product; NAND flash (~28%) has lagged in margin recovery
- High Bandwidth Memory (HBM3E) is the most strategically important product in Micron’s portfolio — the only memory chips that can satisfy the enormous bandwidth requirements of AI GPU training and inference; Micron qualified HBM3E for Nvidia’s H200 and B200, enabling it to compete directly with SK Hynix (the HBM market leader)
- The memory industry is a commodity oligopoly: three companies (Samsung, SK Hynix, Micron) control essentially 100% of global DRAM supply, making the industry’s supply/demand balance — and therefore pricing — extremely sensitive to each player’s capacity decisions
- Gross margin swings are extreme: FY2023 gross margin was negative (-15.5%); FY2024 recovered to 30.7% — a 46-percentage-point swing in a single year driven purely by memory pricing; understanding this cyclicality is the most important analytical framework for Micron investors
- CHIPS Act grants of up to $6.1 billion will fund construction of new DRAM fabs in Idaho (IMFS) and New York (Clay, NY) — the US government’s investment in domestic memory manufacturing as a strategic security priority
- Micron’s Compute & Networking business unit (CNBU, 46% of revenue) is growing 158% YoY and contains both the AI data centre DRAM and HBM products driving the current supercycle
- SK Hynix leads HBM market share (supplying the majority of Nvidia’s HBM allocation); Micron is in second place and gaining; Samsung has struggled with HBM yields — the competitive dynamics within HBM directly determine Micron’s AI revenue share
Micron Technology (MU) Business Model
Micron’s business model is that of an integrated memory manufacturer — it both designs and fabricates its own memory chips, rather than outsourcing fabrication to a foundry like TSMC. This IDM model (analogous to Intel’s position in CPUs) creates the highest capital intensity of any semiconductor business model, but also provides the most control over process technology — critical in a commodity business where cost per bit is the primary competitive variable.
What Micron Actually Makes: DRAM vs. NAND
DRAM (Dynamic Random-Access Memory): The working memory of computing systems. DRAM stores data in tiny capacitors that leak charge and must be periodically refreshed — each cell typically within roughly 64 milliseconds (hence “dynamic”). It is fast and directly addressable by processors, but volatile (data is lost when power is removed). A modern server for AI workloads might have 1–4 terabytes of DRAM across multiple modules. DRAM production is dominated by three companies globally; Micron holds approximately 22–24% of global DRAM market share.
Key DRAM product generations: DDR4 (mainstream server/PC), DDR5 (next-generation, faster, lower power), LPDDR5 (low-power mobile), and HBM (high bandwidth memory for AI accelerators).
NAND Flash Memory: Non-volatile memory that retains data without power — the technology inside solid-state drives (SSDs), USB drives, and mobile phone storage. NAND is slower than DRAM but far cheaper per gigabyte, making it suitable for storage rather than active computation. NAND production is more fragmented than DRAM (Samsung, SK Hynix, Micron, Kioxia/WD, and Yangtze Memory Technologies all produce significant volume). Micron holds approximately 11–13% of global NAND market share.
Key NAND product generations measured by number of layers stacked: 232-layer, 276-layer (current leading edge). Higher layer counts increase storage density per chip, reducing cost per gigabyte.
The Memory Cycle: Why Micron’s Financials Are So Volatile
The memory industry operates on a commodity cycle more extreme than almost any other technology sector. The cycle mechanism:
1. Demand surge: A new technology wave (smartphones, cloud computing, AI) creates rapid demand growth for memory.
2. Supply response lag: Memory fabs take 2–3 years and $10–20B to build. When demand surges, supply cannot keep up immediately, and memory prices rise, often sharply.
3. Profit bonanza: All three major DRAM producers (Samsung, SK Hynix, Micron) simultaneously earn high margins — 50%+ gross margins in good upcycles. Each company uses the profits to fund new fab construction.
4. Supply overshoot: New capacity comes online across all three producers simultaneously, often overshooting demand growth. Memory prices fall, sometimes catastrophically (-50% to -70% in severe downturns).
5. Industry losses and production cuts: All three producers lose money. They reduce production growth rates (capex cuts, wafer start reductions) to rebalance supply. Samsung historically has been slower to cut production due to its size and diversification, which prolongs downturns.
6. Rebalancing and next upcycle: Supply growth slows, demand recovers, inventory digests, and prices recover.
The FY2022–FY2024 Micron story is a complete cycle: the post-COVID smartphone/PC demand collapse (revenue declines began in late FY2022) → deep downturn (FY2023: $5.8B net loss, negative gross margin) → AI-driven demand surge creating a new DRAM demand catalyst → rapid price recovery (FY2024: $778M net profit, 30.7% gross margin) → continued AI-driven upcycle with HBM as the premier product.
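The six-step cycle above can be sketched as a toy simulation. Every parameter here is invented for illustration (none of it is Micron data); the point is only that a multi-year fab construction lag combined with price-chasing investment mechanically produces boom/bust pricing:

```python
# Toy cobweb-style model of the memory cycle (illustrative only; all
# parameters are invented). Producers invest based on today's price,
# but new supply arrives with a fab-construction lag, so price
# overshoots in both directions.

def simulate_cycle(periods=12, fab_lag=2):
    demand = 100.0               # bits demanded per period
    supply = 100.0               # bits shipped per period
    pipeline = [0.0] * fab_lag   # capacity under construction
    price = 1.0
    prices = []
    for _ in range(periods):
        demand *= 1.05                     # steady 5% bit-demand growth
        supply += pipeline.pop(0)          # fabs started fab_lag ago come online
        # price reacts sharply to the supply/demand balance
        price = max(0.1, price * (demand / supply) ** 2)
        # producers invest aggressively when price is high, cut when low
        pipeline.append(max(0.0, 8.0 * (price - 1.0)))
        prices.append(round(price, 2))
    return prices

print(simulate_cycle())  # price rises for several periods, then crashes
```

Running it shows price climbing while the supply pipeline fills, then collapsing once all the lagged capacity lands at the same time, which is the Samsung/SK Hynix/Micron simultaneous-overshoot dynamic described in steps 4 and 5.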
HBM: The Product That Changes Micron’s Competitive Position
High Bandwidth Memory (HBM) is architecturally different from standard DRAM and represents the most important structural change in Micron’s competitive position in years:
What HBM is physically: Standard DRAM chips are mounted on PCB modules (DIMMs) connected to a processor via a memory bus. HBM is manufactured by stacking multiple DRAM dies vertically (3D stacking), then connecting the stack to an AI processor (GPU, TPU, custom accelerator) via thousands of tiny connections on an interposer substrate — an architecture called 2.5D packaging. The result: memory bandwidth 10–20x higher than standard DRAM modules.
Why AI needs HBM: Large neural networks running AI inference or training move enormous amounts of data between the processor and memory — transformer models in particular are “memory bandwidth bound,” meaning the GPU’s compute units are often waiting for data to arrive from memory. Standard DRAM (even DDR5) provides insufficient bandwidth for large AI models. HBM3E, with ~1.2TB/s of bandwidth per stack, provides the memory throughput that modern AI workloads require.
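The "10–20x" bandwidth claim above survives a quick back-of-envelope check. Both figures below are approximate public specs used as assumptions, not Micron-provided numbers:

```python
# Rough bandwidth arithmetic behind the "10-20x" claim.
# Both figures are approximate public specs (assumptions, not sourced here).
ddr5_module_gbs = 64    # ~64 GB/s for a DDR5-8000 DIMM (8000 MT/s * 8 bytes)
hbm3e_stack_gbs = 1200  # ~1.2 TB/s per HBM3E stack

ratio = hbm3e_stack_gbs / ddr5_module_gbs
print(ratio)  # 18.75, i.e. roughly 19x per stack vs. per module
```

A GPU carries several HBM stacks in parallel, so the aggregate gap versus a conventional DIMM-based memory system is wider still.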
The Nvidia relationship: Nvidia’s H100 GPU ships with 80GB of HBM3 (HBM2e in some variants). The H200 ships with 141GB of HBM3E. The B200 (Blackwell) ships with 192GB of HBM3E — from both SK Hynix and Micron. Micron’s qualification of HBM3E for Nvidia’s GPUs was a commercial breakthrough — it gave Micron access to the highest-value, highest-volume HBM customer in existence.
Competitive landscape within HBM:
- SK Hynix is the undisputed market leader in HBM — it was first to volume production of HBM2E, HBM3, and HBM3E, and supplies the majority of Nvidia’s HBM allocation
- Micron qualified HBM3E in 2024 and is ramping production; positioned as the number-two HBM supplier with a competitive product
- Samsung has struggled with HBM3E yield issues; though it is the largest memory company globally by revenue, it has lagged both SK Hynix and Micron in capturing the AI HBM opportunity — a significant strategic setback for Samsung
HBM commands dramatically higher ASPs (average selling prices) than standard DRAM — roughly 5–8x the price per bit of equivalent DDR5. With AI accelerator demand growing rapidly, HBM revenue contribution to Micron’s overall revenue and margin is increasing each quarter.
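To see why a 5–8x price-per-bit premium matters so much for revenue mix, here is a hypothetical worked example. The only sourced figure is the premium itself; the prices and shipment volumes are invented for illustration:

```python
# Illustrative mix arithmetic (hypothetical prices and volumes; the only
# sourced input is the ~5-8x HBM price-per-bit premium over DDR5).
ddr5_price_per_gb = 3.0                    # hypothetical $/GB for standard DDR5
hbm_price_per_gb = 6 * ddr5_price_per_gb   # midpoint of the 5-8x premium

ddr5_ship_gb = 1_000   # hypothetical standard-DRAM shipments (GB)
hbm_ship_gb = 100      # HBM at just 10% of the bit volume...

ddr5_rev = ddr5_ship_gb * ddr5_price_per_gb
hbm_rev = hbm_ship_gb * hbm_price_per_gb
print(hbm_rev / (ddr5_rev + hbm_rev))  # ...captures 37.5% of the revenue
```

Under these assumptions, a product line carrying a tenth of the bits carries well over a third of the revenue, which is why HBM's margin contribution is disproportionate to its unit count.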
Manufacturing: Fabs, CHIPS Act, and the US Supply Chain Rationale
Micron operates fabs in Boise (Idaho), Manassas (Virginia), Hiroshima (Japan), Singapore, and Taichung (Taiwan). The geographic diversity provides some protection against single-location disruptions, though a significant portion of DRAM production is concentrated in Asia.
CHIPS Act investment: The US government designated Micron as a priority recipient of CHIPS Act funding — announcing up to $6.1 billion in grants (with additional loan support) for construction of new leading-edge DRAM fabs in Idaho (Micron’s existing Boise campus, expanding with a new fab building) and Clay, New York (a greenfield site that will be the largest semiconductor fab ever built in the United States when complete). The New York fab is projected to create 9,000 direct Micron jobs and $100B+ in total investment over the programme lifetime.
The US government’s rationale: Effectively 100% of global DRAM is produced by just three companies — Micron (headquartered in the US) and Samsung and SK Hynix (South Korea) — with physical production concentrated in South Korea, Taiwan, Japan, and Singapore. A geopolitical disruption, particularly a conflict involving Taiwan, would eliminate a critical portion of global memory supply. Domestic US DRAM manufacturing is viewed as a national security priority, making Micron uniquely important to US semiconductor policy.
Micron Technology Competitors
Samsung Semiconductor — the world’s largest memory company
Samsung’s semiconductor division is the largest memory manufacturer globally, with approximately 40–45% of DRAM market share and the largest NAND share (roughly one-third of the market). Samsung’s scale advantages are formidable — it can spread R&D and process development costs across enormous volumes. However, Samsung’s HBM3E struggles (yield issues that led to delayed qualification at Nvidia) created a market opening for Micron that was historically unusual — Micron is not accustomed to having a technical lead over Samsung in any product category.
SK Hynix — the HBM leader and Micron’s most relevant competitor for AI
SK Hynix is the company Micron must most directly measure itself against in the current AI memory supercycle. SK Hynix was first to HBM3 and HBM3E production, secured the dominant position in Nvidia GPU memory supply, and commands premium pricing for its HBM products. Micron’s competitive position vs. SK Hynix in HBM — market share, ASP parity, customer qualification breadth — is the most important competitive dynamic to track. SK Hynix is listed on the Korea Stock Exchange (KRX).
Intel — adjacent competitor in data centre, collaborator in packaging
Intel’s relationship with Micron is more complex than pure competition. Intel competed in data centre memory through Optane persistent memory (now discontinued, and originally co-developed with Micron as 3D XPoint), while also consuming Micron DRAM/HBM in its Xeon platforms and exploring advanced packaging partnerships. Intel’s foundry services (IFS) potentially compete with TSMC for HBM packaging work, which is tangentially related to Micron’s HBM assembly supply chain.
Broadcom and custom AI silicon
Broadcom designs custom AI accelerators (TPUs, network ASICs) for hyperscalers including Google and Meta. These custom chips require HBM — making Broadcom’s silicon roadmap an indirect driver of Micron’s HBM demand. As hyperscalers build more custom silicon (rather than buying Nvidia GPUs), their HBM procurement shifts from Nvidia-led to hyperscaler-led, potentially changing the customer relationship dynamics for HBM suppliers including Micron.
Revenue Breakdown
| Business Unit | FY2024 (Aug) | FY2023 (Aug) | YoY Growth |
|---|---|---|---|
| Compute & Networking (CNBU) | $11.6B | $4.5B | +157.8% |
| Mobile (MBU) | $6.8B | $3.2B | +112.5% |
| Embedded (EBU) | $3.9B | $3.6B | +8.3% |
| Storage (SBU) | $3.2B | $4.0B | -20.0% |
| Total Revenue | $25.1B | $15.5B | +61.9% |
| Technology | FY2024 | % of Revenue |
|---|---|---|
| DRAM | $17.6B | ~70% |
| NAND | $7.1B | ~28% |
| Other | $0.4B | ~2% |
Micron’s fiscal year ends in late August. Financial data sourced from Micron SEC Filings.
Compute & Networking (CNBU) — 46% of Revenue ($11.6B, +158%)
The fastest-growing and strategically most important business unit. CNBU encompasses DRAM and HBM for data centre servers, AI accelerators, and networking infrastructure. The 158% growth rate is almost entirely attributable to two factors: (1) AI-driven DRAM demand from hyperscalers and cloud providers building GPU training clusters and inference infrastructure, and (2) HBM3E ramp as Micron qualified and began delivering HBM to Nvidia and other AI chip customers.
Server DRAM pricing recovered sharply in FY2024 after the FY2023 price collapse — every major cloud provider (AWS, Azure, Google) was simultaneously restocking depleted inventory and procuring memory for new AI data centre buildouts. The combination of restocking demand and structural AI demand growth created a powerful demand surge that overwhelmed available supply.
HBM within CNBU is a relatively small but extremely high-ASP product line that carries disproportionate margin contribution. Micron’s stated target is to grow HBM revenue to “multiple billions” of dollars — a trajectory that would make HBM one of the company’s most profitable products despite a small unit count relative to standard DRAM.
Mobile (MBU) — 27% of Revenue ($6.8B, +113%)
DRAM and NAND for smartphones and tablets. Mobile recovered dramatically from the FY2023 lows driven by two factors: (1) the global smartphone market recovering from its deepest inventory correction since 2008, as manufacturers who had over-ordered components in 2021–2022 digested excess stock; (2) growing memory content per device — flagship smartphones now routinely ship with 12–16GB of LPDDR5X DRAM and 256–512GB of UFS NAND, up from 6–8GB DRAM and 128GB NAND just a few years ago.
The AI smartphone narrative (on-device AI requiring local model inference, which demands more DRAM) is an incremental long-term driver. Each generation of smartphone AI features adds memory content requirements at the margins.
Embedded (EBU) — 16% of Revenue ($3.9B, +8%)
Memory for automotive, industrial, and consumer electronics applications. The most stable of Micron’s four business units — automotive memory demand is relatively insensitive to short-term economic cycles (car production volumes are smoother than consumer electronics) and automotive memory content per vehicle is growing with every new ADAS feature, infotainment system, and connectivity upgrade.
Automotive memory has strict quality requirements (operating temperature ranges, reliability specifications, longevity guarantees) that differentiate it from consumer memory products and command premium pricing. Micron’s automotive-grade DRAM and NAND products are qualification-intensive but provide stable, margin-accretive revenue.
Storage (SBU) — 13% of Revenue ($3.2B, -20%)
NAND-based solid-state drives (SSDs) for data centres (enterprise SSDs, QLC NAND for high-capacity hyperscale storage) and consumer PCs. SBU was the laggard in FY2024 — NAND pricing recovery lagged DRAM recovery, enterprise customers were slower to restock, and competitive pricing from Chinese NAND producers (particularly YMTC — Yangtze Memory Technologies) in the consumer segment provided headwinds.
Enterprise SSDs are the higher-margin and more strategically important SBU product — data centres require massive storage capacity and pay premium pricing for high-reliability, high-endurance enterprise SSDs. Consumer SSDs are more commoditised with tighter pricing. Improving the NAND margin profile — through higher-layer-count product transitions (232-layer to 276-layer+), better enterprise SSD mix, and rationalisation of commodity NAND volume — is a key operational priority.
Revenue Trend (3-Year)
| Fiscal Year | Total Revenue | YoY Growth | Gross Margin | Net Income |
|---|---|---|---|---|
| FY2024 (Aug 2024) | $25.1B | +61.9% | 30.7% | $0.8B |
| FY2023 (Aug 2023) | $15.5B | -49.4% | -15.5% | -$5.8B |
| FY2022 (Aug 2022) | $30.8B | +11.2% | 47.0% | $8.7B |
The three-year table illustrates memory cycle volatility better than any description could. FY2022 was peak-cycle ($30.8B revenue, 47% gross margin, $8.7B net income); FY2023 was the trough (-50% revenue, negative gross margin, $5.8B loss); FY2024 is the recovery (+62% revenue, 30.7% gross margin, $0.8B income). Note that FY2024 revenue ($25.1B) is still below the FY2022 peak ($30.8B), and gross margin (30.7%) is still well below the peak (47.0%) — the upcycle was still in progress at FY2024 year-end.
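The margin mechanics behind this swing can be reproduced directly from the sourced revenue and COGS figures in the income statement: COGS barely moved year over year ($17.9B → $17.4B), so the price-driven revenue swing flowed almost entirely through to gross profit. A minimal check:

```python
# Reproducing the gross-margin swing from the sourced figures (USD billions).
# With bit costs roughly fixed, margin is extremely sensitive to price per bit.

def gross_margin(revenue_b, cogs_b):
    return (revenue_b - cogs_b) / revenue_b

fy2023 = gross_margin(15.5, 17.9)  # revenue collapsed below cost
fy2024 = gross_margin(25.1, 17.4)  # prices recovered, cost base unchanged

print(round(fy2023, 3), round(fy2024, 3))  # -0.155 0.307
```

A ~$10B revenue swing against an essentially flat cost base is the whole 46-point margin story, which is why memory-price direction, not unit volume, is the first thing to check each quarter.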
Micron Technology (MU) Income Statement
| Metric | FY2024 | FY2023 |
|---|---|---|
| Total Revenue | $25.1B | $15.5B |
| Cost of Goods Sold | $17.4B | $17.9B |
| Gross Profit | $7.7B | -$2.4B |
| Gross Margin | 30.7% | -15.5% |
| R&D Expense | $3.1B | $3.1B |
| Selling, General & Administrative | $1.0B | $0.9B |
| Operating Income | $3.6B | -$6.4B |
| Operating Margin | 14.3% | -41.3% |
| Net Income | $778M | -$5.8B |
Financial data sourced from Micron SEC Filings.
Micron Technology (MU) Key Financial Metrics
Gross Margin: 30.7% — Recovery from negative to 30.7% in a single year is extraordinary but entirely consistent with how memory cycle recoveries work: as memory prices recover, cost of goods sold is relatively fixed (depreciation, labour, materials have limited short-term variability), so most incremental revenue flows directly to gross profit. Peak-cycle gross margins for Micron can reach 45–55%; the 30.7% in FY2024 suggests the upcycle had room to run further
Operating Margin: 14.3% — Still well below peak-cycle levels. The operating margin is constrained by R&D ($3.1B, consistent year-over-year even during the loss year — Micron does not cut process R&D during downturns because falling behind in process technology would be catastrophic for a commodity manufacturer). As gross margin expands with memory price recovery, operating margin follows directly
Revenue Growth: +61.9% — Almost entirely driven by memory price recovery and HBM ramp, not unit volume growth. Memory revenue is the product of (units shipped × price per unit); in FY2024, price recovery was the dominant driver while bit shipment growth was moderate. This is both reassuring (shows pricing power, not just volume growth) and a warning — future revenue requires sustained or improving price levels
R&D Spending: $3.1B (12.4% of revenue) — Consistent through the cycle. Micron’s R&D is primarily process technology development — advancing to the next DRAM and NAND process node (smaller features = more bits per wafer = lower cost per bit). Maintaining process technology competitiveness vs. Samsung and SK Hynix requires sustained R&D regardless of the revenue environment. The consistency of R&D spend through the FY2023 loss year is one of Micron’s more important financial decisions
Capital Expenditures: ~$8B — High, reflecting both ongoing fab maintenance and the early-stage investment in CHIPS Act-supported US capacity expansion. Micron’s capex cycle is critical to watch: excessive capex drives industry oversupply (as in 2021–2022), while capex restraint supports pricing (as in 2023–2024). Micron has been disciplined on capex during the recovery
Free Cash Flow: FY2024 free cash flow was modestly positive — operating cash flow recovering while capex remained elevated. As the upcycle continues and revenue/margins expand, free cash flow should grow substantially. Micron’s historical peak-cycle FCF approached $8–10B; returning to that range would enable significant shareholder returns
HBM pricing power: Unlike standard DRAM, which is priced on near-commodity spot and contract markets, HBM pricing is negotiated directly between Micron, SK Hynix, and Samsung with customers (primarily Nvidia, AMD, hyperscalers). The limited supply of HBM-capable production capacity and the urgency of AI GPU demand have given HBM suppliers significant pricing leverage — HBM ASPs are multiples of equivalent DDR5 DRAM. This pricing dynamic is unlike anything Micron has experienced in standard memory markets
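The revenue-growth decomposition noted above (revenue = bits shipped × price per bit) can be made concrete. The +61.9% growth figure is sourced; the bit-growth number below is an invented assumption to illustrate how much price recovery it implies:

```python
# Decomposing revenue growth into bit growth vs. ASP growth.
# rev_growth is sourced (+61.9% YoY); bit_growth is an assumed,
# invented figure for illustration only.
rev_growth = 0.619
bit_growth = 0.15  # assumed "moderate" bit shipment growth

# revenue = bits * ASP, so (1 + rev) = (1 + bits) * (1 + asp)
implied_asp_growth = (1 + rev_growth) / (1 + bit_growth) - 1
print(round(implied_asp_growth, 3))  # 0.408 -> ~41% implied price recovery
```

Under this assumption, roughly two-thirds of the revenue growth comes from price, which matches the text's point that FY2024 was a pricing recovery rather than a volume story.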
Is Micron Technology Profitable?
Yes, on a net income basis in FY2024 — but the more important observation is the trajectory. Micron swung from a $5.8 billion net loss in FY2023 to a $778 million net profit in FY2024 on a 61.9% revenue increase, with a gross margin recovery from -15.5% to +30.7%.
The modest FY2024 net income ($778M) relative to the operating income ($3.6B) reflects the company catching up on deferred tax items, interest expense on debt used to fund fab construction, and other below-the-line items. The operating income trajectory ($-6.4B in FY2023 → $3.6B in FY2024) is a better measure of how quickly the underlying business profitability has recovered.
Micron’s profitability is highly cyclical by nature — the same business that earned $8.7B in net income in FY2022 lost $5.8B in FY2023. Evaluating Micron as “profitable” or “not profitable” at any given moment is less useful than understanding where it sits in the memory cycle and what the trajectory implies for margins over the next 2–4 quarters.
Micron Technology (MU): What to Watch
HBM market share vs. SK Hynix — The most important competitive variable for Micron’s AI revenue. Track quarterly management commentary on HBM revenue growth, customer qualification wins (Nvidia, AMD, hyperscaler custom silicon), and any indication of market share shifts between Micron and SK Hynix. Micron gaining HBM share at SK Hynix’s expense — or vice versa — directly impacts the AI revenue trajectory and margin mix
DRAM pricing cycle — Standard DRAM pricing (DDR5 contract and spot prices) is the primary driver of Micron’s quarterly gross margin. Watch industry supply/demand indicators: monthly DRAM bit shipment data from research firms (TrendForce, IDC), Samsung and SK Hynix quarterly earnings guidance on pricing direction, and any announcements of capex expansion (which signals future supply additions that could suppress pricing)
NAND margin recovery — NAND has lagged DRAM in margin recovery. Improved NAND gross margins would boost Micron’s overall profitability and represent upside from current levels. Key factors: enterprise SSD demand from hyperscalers (restocking cycle), pricing recovery in QLC NAND for data centre applications, and the competitive impact of YMTC (China’s largest NAND producer) on global NAND pricing
CHIPS Act fab construction execution — Micron’s Idaho fab expansion and New York greenfield are among the largest semiconductor investments in US history. Construction timelines, cost overruns, and equipment procurement timelines all affect when new capacity comes online. New domestic capacity is essential for Micron’s long-term competitive position and its access to US government contract customers, but construction complexity is high. Watch quarterly updates on IMFS (Idaho) and Clay (New York) construction milestones
Samsung HBM qualification at Nvidia — Samsung’s ongoing struggle to qualify its HBM3E for Nvidia’s GPU platforms has been a windfall for Micron and SK Hynix, who have split the Nvidia allocation. If Samsung resolves its HBM yield issues and qualifies at Nvidia, it would add supply to a tight market and potentially compress HBM pricing and Micron’s allocation share. Conversely, continued Samsung HBM underperformance extends Micron’s pricing power window
AI demand durability beyond hyperscalers — The current HBM demand supercycle is driven primarily by a small number of hyperscalers and AI companies building GPU infrastructure. If this AI infrastructure buildout moderates (spending plateau, ROI questions, compute efficiency improvements reducing memory requirements per model), HBM demand growth could slow. Watch for any signals in hyperscaler capex guidance — Azure, AWS, and Google Cloud’s data centre investment plans are leading indicators for Micron’s AI memory demand
LPDDR5X and AI mobile memory — The next memory cycle catalyst may be smartphones and edge devices running AI models locally. LPDDR5X (the fastest mobile DRAM generation) and future LPDDR6 will be required for on-device AI inference at scale. Watch smartphone flagship launch announcements for memory spec upgrades as an indicator of mobile DRAM content growth
Micron Technology (MU) Financial Summary
Micron Technology (MU) is the only major US-headquartered memory semiconductor manufacturer, generating $25.1 billion in total revenue in fiscal year 2024 (+61.9% YoY) with $778 million in net income and a 30.7% gross margin — a dramatic recovery from a $5.8 billion net loss in FY2023. The business is structurally a commodity oligopoly: three companies (Samsung, SK Hynix, Micron) control essentially 100% of global DRAM production, making memory pricing highly sensitive to each player’s capacity decisions and creating the violent margin swings that define Micron’s financial profile.
The AI infrastructure buildout has structurally shifted Micron’s demand profile — High Bandwidth Memory (HBM3E) for Nvidia GPUs and other AI accelerators is the highest-value product Micron has ever produced, commands premium pricing, and grows with every new AI GPU generation. Whether the AI memory supercycle sustains, and whether Micron can maintain its HBM qualification position vs. SK Hynix, are the central questions for the stock’s next phase. For broader semiconductor context, see How Nvidia Makes its Money, How Intel Makes its Money, and How Broadcom Makes its Money.