Nvidia Partners with Nanya to Power AI Racks Using LPDDR Memory Equal to 4,500 Smartphones Each

The artificial intelligence boom is no longer straining only the high-end memory used in data centers. It’s now putting serious pressure on low-power DRAM as well, setting the stage for a new kind of shortage that could ripple across the entire semiconductor supply chain.

For years, most of the attention around AI hardware has focused on high-bandwidth memory (HBM), the expensive, ultra-fast memory stacked next to GPUs in many AI accelerators. But as AI processors evolve and workloads scale, chip developers are increasingly turning to LPDDR-style memory (low-power DRAM commonly associated with smartphones and thin laptops) for new designs that need high capacity, good power efficiency, and manageable thermals.

That shift is becoming a big deal. Companies developing next-generation processors and AI systems are adopting LPDDR in more places than before, including advanced compute modules and rack-scale hardware concepts where power efficiency and heat management are critical. As a result, LPDDR supply is tightening, and early signs of shortages are emerging.
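The headline's "4,500 smartphones" comparison gives a sense of scale. A rough back-of-envelope sketch, assuming a typical flagship phone carries about 12 GB of LPDDR (the per-phone figure is an illustrative assumption, not from the article):

```python
# Back-of-envelope: LPDDR capacity implied by the "4,500 smartphones per rack" comparison.
# ASSUMPTION: 12 GB of LPDDR per smartphone (typical flagship spec, not stated in the article).
GB_PER_PHONE = 12
PHONES_PER_RACK = 4_500  # from the headline comparison

rack_lpddr_gb = GB_PER_PHONE * PHONES_PER_RACK
rack_lpddr_tb = rack_lpddr_gb / 1_000

print(f"{rack_lpddr_gb:,} GB (~{rack_lpddr_tb:.0f} TB) of LPDDR per rack")
# prints "54,000 GB (~54 TB) of LPDDR per rack"
```

Even under this conservative assumption, a single rack absorbs tens of terabytes of low-power DRAM, which helps explain why a handful of rack-scale AI deployments can move the needle on overall LPDDR supply.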

What’s driving the squeeze is simple: demand is growing faster than the industry can reallocate capacity. LPDDR production has historically been balanced around mobile devices and consumer electronics. AI is changing that equation by adding a fast-growing, premium demand stream that competes for the same wafers and packaging resources. Even if LPDDR isn’t as headline-grabbing as HBM, it still requires leading-edge manufacturing and careful capacity planning, especially when customers want large volumes and consistent long-term supply.

This is also why memory sourcing decisions are becoming more strategic. AI hardware vendors are looking to broaden their supplier ecosystems to ensure stable access to LPDDR and similar low-power DRAM technologies, particularly as new platforms ramp. When large chipmakers start pulling more low-power memory into AI-focused designs, it can tighten availability for other markets, potentially affecting pricing and lead times for devices that rely on the same category of DRAM.

In the near term, the key takeaway is that AI’s memory appetite is expanding in multiple directions. HBM remains crucial, but it’s no longer the only pressure point. LPDDR and LPDDR-equivalent solutions are now part of the AI infrastructure conversation—and if demand continues to accelerate, low-power DRAM could become the next battleground for supply, capacity, and cost across the tech industry.