Micron vaults to No. 2 in the HBM race as Samsung loses ground

Micron Technology has crossed a major milestone in the high-bandwidth memory market, surpassing Samsung Electronics to take the second spot in global HBM shipments during the second quarter of 2025. The shift, highlighted by Counterpoint Research, marks the first time Micron has moved ahead of Samsung in this fast-growing segment, underscoring how quickly the HBM landscape is evolving as artificial intelligence accelerates demand.

Why this matters
HBM is the performance backbone for modern AI and high-performance computing. It enables the massive memory bandwidth required by leading AI accelerators and data center GPUs. With AI training and inference workloads surging, every percentage point of HBM share has outsized implications for server buildouts, product roadmaps, and chip vendor competitiveness.
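The bandwidth claim above can be made concrete with back-of-the-envelope arithmetic. The figures below are a sketch based on commonly cited HBM3E-class specifications (a 1024-bit interface per stack at roughly 9.2 Gb/s per pin), not numbers from this report:

```python
# Back-of-the-envelope HBM bandwidth arithmetic (illustrative figures).
# An HBM stack exposes a very wide interface (1024 data pins), so even
# moderate per-pin data rates multiply into enormous aggregate bandwidth.

def stack_bandwidth_gbs(pins: int, gbit_per_pin: float) -> float:
    """Aggregate bandwidth of one HBM stack in GB/s."""
    return pins * gbit_per_pin / 8  # convert bits/s to bytes/s

# Assumed HBM3E-class figures: 1024 pins at ~9.2 Gb/s each.
per_stack = stack_bandwidth_gbs(1024, 9.2)  # ~1177.6 GB/s per stack
# A high-end accelerator pairing eight such stacks:
total = 8 * per_stack                       # ~9.4 TB/s aggregate
print(f"{per_stack:.1f} GB/s per stack, {total / 1000:.2f} TB/s for 8 stacks")
```

This wide-but-slow-per-pin design is exactly why HBM, rather than conventional DIMM-based memory, feeds leading AI accelerators.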

What drove Micron’s rise
– Rapid HBM3E ramp: Micron has been scaling production of its latest HBM generation, aligning supply with a wave of AI platforms coming to market. Swift execution on newer, higher-capacity stacks helped the company win more sockets and grow shipments.
– Key customer qualifications: As top accelerator and GPU makers expanded their supplier rosters to secure memory availability, Micron’s timely qualifications translated into real volume in Q2.
– Tight supply, diversified sourcing: With HBM supply still constrained industrywide, hyperscalers and chip designers are diversifying procurement. Micron’s momentum reflects that strategy, reducing reliance on a single dominant supplier.

The competitive picture
While Micron’s ascent to number two is the headline, the broader story is a three-way race. SK hynix remains the category leader, buoyed by early bets on advanced HBM and strong traction across flagship AI platforms. Samsung, meanwhile, ceded ground this quarter as it works to accelerate production and qualifications for its newest HBM offerings. The gap between vendors can shift quickly as yields improve, capacities scale, and next-generation products roll out.

Market dynamics to watch
– AI demand remains relentless: Training larger models, powering inference at scale, and deploying on-prem AI clusters continue to stretch HBM availability through 2025.
– Transition to denser stacks: The industry is moving swiftly from earlier HBM iterations to higher-capacity stacks, pairing more memory with each accelerator to boost performance per watt.
– Advanced packaging bottlenecks: 2.5D and 3D packaging capacity is a critical gating factor. Even when DRAM output is sufficient, packaging throughput can dictate how fast finished HBM stacks reach customers.
– Pricing and margins: Tight supply and premium specifications support robust pricing, which can meaningfully impact profitability for memory suppliers that execute well on yield and volume.
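The "denser stacks" point above comes down to simple arithmetic. The stack capacities below are commonly cited figures for 8-high and 12-high HBM3E stacks, used here as assumptions rather than numbers from this report:

```python
# Illustrative arithmetic for the shift to denser HBM stacks.
# Assumed capacities: an 8-high HBM3E stack at 24 GB vs. a 12-high stack
# at 36 GB, with eight stacks placed around one accelerator.

def accelerator_memory_gb(stacks: int, gb_per_stack: int) -> int:
    """Total on-package HBM capacity for one accelerator."""
    return stacks * gb_per_stack

eight_high = accelerator_memory_gb(8, 24)   # 192 GB with 8-high stacks
twelve_high = accelerator_memory_gb(8, 36)  # 288 GB with 12-high stacks
print(eight_high, twelve_high)
```

Taller stacks raise per-accelerator capacity without consuming more package perimeter, which is why the industry prioritizes them despite the added yield and packaging difficulty.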

What it means for buyers and builders
– More supplier choice: Micron’s gains give AI chip vendors and cloud providers an additional high-volume option, lowering risk and improving negotiating leverage.
– Faster platform rollouts: Additional HBM supply should help shorten lead times for new AI servers and accelerators, although the market remains supply-constrained overall.
– A path to HBM4: As the industry prepares for the next HBM generation, vendors with proven HBM3E execution are better positioned to secure early design wins.

Outlook for the second half of 2025
Expect elevated demand to continue as new AI GPUs and accelerators ramp into volume. Vendors will push to expand both DRAM output and advanced packaging capacity to capture share. The competitive gap could narrow again as Samsung accelerates its ramp, while Micron will aim to consolidate its new position through continued qualifications, yield improvements, and capacity adds. SK hynix is likely to defend leadership with deep customer relationships and a broad HBM portfolio.

Bottom line
Micron moving into the number-two spot in Q2 2025 is a notable turning point in the HBM market. It reflects the reshaping of AI supply chains, the premium placed on execution in leading-edge memory, and the industry’s drive to secure enough bandwidth for the next generation of AI workloads. With three major players pushing hard and HBM demand still outpacing supply, the rest of 2025 is set to be a pivotal period for market share, product innovation, and the rollout of ever more powerful AI systems.