Micron SOCAMM

NVIDIA Plans Major Rollout: 800,000 SOCAMM Modules for AI Products in 2025

NVIDIA is gearing up to significantly boost production of SOCAMM, a memory solution designed to bring easier upgradeability and stronger performance to its AI products. The ramp is expected to drive demand, especially once the next-generation SOCAMM 2 rolls out.

Recently showcased at NVIDIA's GTC event, SOCAMM memory is set for a major inventory expansion this year as NVIDIA pursues better performance and energy efficiency in its AI offerings. At the event, the company's GB300 platform highlighted this cutting-edge memory, developed by Micron, which differs from the HBM and LPDDR5X memories typically used in AI servers and mobile systems.

Unlike conventional LPDDR DRAM, which is typically soldered down in mobile and low-power devices, SOCAMM is upgradeable: the module is secured with just three screws rather than soldered to the board. According to ETNews, NVIDIA plans to produce between 600,000 and 800,000 units this year to integrate into its AI product lineup.

The latest GB300 Blackwell platform is among the first to feature SOCAMM memory, signaling NVIDIA's shift toward this new modular form factor. Although the initial production target is modest compared to the HBM shipment volumes expected in 2025, production is anticipated to ramp up as SOCAMM 2 becomes available.

SOCAMM offers a compact, modular, custom form factor that is more power-efficient than RDIMM. While exact figures remain undisclosed, reports suggest SOCAMM will deliver higher efficiency and bandwidth than RDIMM, LPDDR5X, and the LPCAMM modules popular in mobile platforms.

With an expected 150-250 GB/s of memory bandwidth and easy upgradability, SOCAMM presents a versatile solution for AI PCs and servers, and as it gains traction it is poised to become a standard for low-power AI devices. While Micron currently manufactures SOCAMM for NVIDIA, discussions with Samsung and SK hynix about future production are reportedly underway.