Huawei’s AI SSD Push Seeks to Reduce HBM Reliance, Rattling Korea’s Chip Giants

Huawei is gearing up to commercialize a new class of solid-state drives built specifically for artificial intelligence workloads—an approach that industry watchers say could reduce the sector’s dependence on high-bandwidth memory. According to reports from South Korea citing industry sources, these AI-focused SSDs are aimed at data center deployments and integrate on-drive computing capabilities, enabling them to handle large-scale storage and data-movement tasks that fall outside what HBM is designed to do.

Why this matters comes down to scale, cost, and availability. HBM delivers extraordinary bandwidth for AI accelerators, but it’s expensive, supply-constrained, and limited in capacity per package. By moving certain data processing functions onto the storage device itself—often called computational storage—AI-specific SSDs can offload routine, data-heavy operations from CPUs and GPUs. That can help streamline pipelines for large-scale model training, retrieval-augmented generation, vector search, and inference serving, where shuttling terabytes of data becomes the bottleneck.

In practice, AI SSDs could pre-process, filter, compress, or index data closer to where it lives, reducing the need to pull everything into expensive HBM. They also offer far greater capacity per dollar than memory stacks, which is attractive for workloads that mix hot and warm data or require high-throughput streaming rather than peak on-chip bandwidth. For data centers, the potential payoff includes better resource utilization, lower total cost of ownership, and the ability to scale storage and compute more flexibly.
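The data-movement savings behind this idea can be illustrated with a toy sketch. The code below is purely conceptual—it does not reflect Huawei's design or any real computational-storage API—and simply contrasts a conventional scan, where every record crosses the interconnect to the host before filtering, with a drive-side scan, where a hypothetical on-drive filter sends back only the matching records:

```python
# Conceptual sketch only: illustrates why pushing a filter down to storage
# reduces data movement. Record size and predicate are arbitrary assumptions;
# no real drive API is modeled here.

RECORD_SIZE = 4096  # assumed bytes per record

def host_side_scan(records, predicate):
    """Conventional path: every record is transferred to the host, then filtered."""
    bytes_moved = len(records) * RECORD_SIZE
    matches = [r for r in records if predicate(r)]
    return matches, bytes_moved

def on_drive_scan(records, predicate):
    """Computational-storage path: the drive applies the predicate itself,
    so only matching records cross the interconnect."""
    matches = [r for r in records if predicate(r)]
    bytes_moved = len(matches) * RECORD_SIZE
    return matches, bytes_moved

if __name__ == "__main__":
    data = list(range(1_000_000))        # stand-in for records resident on the drive
    is_hot = lambda r: r % 100 == 0      # predicate selecting roughly 1% of records

    m_host, moved_host = host_side_scan(data, is_hot)
    m_drive, moved_drive = on_drive_scan(data, is_hot)

    assert m_host == m_drive             # same query result either way
    print(f"host-side scan moved {moved_host:,} bytes")
    print(f"on-drive scan moved  {moved_drive:,} bytes")
```

Under these assumptions the drive-side path moves about 1% of the bytes, which is the essence of the computational-storage argument: identical results, far less traffic competing with the accelerators for bandwidth.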

This strategy doesn’t replace HBM for latency-critical matrix math inside accelerators, but it can complement it by relieving pressure on memory bandwidth and interconnects. As AI models grow, pairing accelerators with intelligent storage could become a pragmatic way to keep training and inference pipelines fed without overbuilding HBM capacity.

Industry sources indicate Huawei’s AI SSDs are being prepared for commercial rollout, targeting large-scale storage and transmission workloads. If the company executes, it could give hyperscalers and enterprises a new lever to balance performance, capacity, and cost—while chipping away at HBM’s current centrality in AI infrastructure.