Dell CEO Warns AI Memory Needs Will Soar to “Unimaginable” Heights by 2028, Forcing Buyers to Pay the Price

Dell CEO Michael Dell believes the AI memory “supercycle” is far from over, arguing that today’s explosive demand for DRAM and other memory technologies could keep accelerating for several more years. Speaking at a recent event, Dell suggested the industry’s current supply-demand imbalance isn’t a short-term spike, but a multi-year shift powered by hyperscalers racing to build and expand AI infrastructure.

A big driver behind this confidence is the competitive pressure facing hyperscalers. In simple terms, cloud giants can’t afford to slow their spending on memory without risking that rivals will outpace them in AI performance, capacity, and cost efficiency. That fear of falling behind keeps budgets flowing into accelerators, servers, and the increasingly massive memory footprints needed to run modern AI.

Dell pointed to a key dynamic reshaping the market: memory requirements are rising in two directions at once. First, each new generation of AI accelerator tends to demand more memory per chip. Second, overall deployment scales up as companies buy and install far more accelerators across data centers. Dell's estimate combines both trends, projecting that total memory demand could expand by as much as 625 times by 2028, the product of a roughly 25x increase in memory per accelerator and a 25x increase in accelerator deployments.
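The 625x figure follows directly from compounding the two trends. A minimal sketch of the arithmetic, using the roughly 25x factors cited above (the variable names are illustrative, not Dell's own terms):

```python
# Both growth factors are rough projections cited in Dell's estimate.
memory_per_accelerator_growth = 25   # ~25x more memory per accelerator chip
accelerator_deployment_growth = 25   # ~25x more accelerators deployed overall

# Total demand scales with the product of the two factors,
# since every additional accelerator carries the larger memory footprint.
total_memory_demand_growth = memory_per_accelerator_growth * accelerator_deployment_growth
print(total_memory_demand_growth)  # 625
```

The point is that the two trends multiply rather than add: even modest growth on both axes compounds into a far larger total.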

While those figures are estimates rather than confirmed forecasts, the underlying idea matches what many in the supply chain have been observing: AI workloads are pushing memory consumption higher faster than traditional data center growth ever did. It’s not only about high-bandwidth memory (HBM) attached to GPUs and other accelerators, either. The industry is also adopting new memory approaches to support specific AI tasks and system designs, adding even more pressure on standard DRAM supply.

Another reason the market may stay tight is how long it takes to expand manufacturing capacity. Building out memory supply isn’t something that happens in a quarter or two—it can take years to add meaningful new output. At the same time, demand for AI infrastructure has not shown signs of easing, especially as AI moves beyond training into large-scale inference, where deployment volumes can be enormous and memory needs remain steep.

Hyperscalers are also increasingly securing supply through long-term agreements, in some cases spanning as long as five years. That kind of commitment signals buyers expect memory demand to remain intense—and they’re willing to lock in availability to protect their expansion plans. For suppliers, these multi-year deals reinforce the expectation that strong demand will persist rather than fade quickly.

Taken together, Dell’s view suggests the DRAM and AI memory market could remain strained for several more years, with shortages and elevated demand potentially lasting until new capacity meaningfully comes online—likely not until the second half of 2027. If Dell’s timeline holds, the industry may be looking at an extended period where AI growth continues to reshape memory pricing, availability, and long-term supply strategy through 2028.