SK Hynix To Develop 96 GB & 128 GB DDR5-Based CXL 2.0 Memory Solutions Next Year

SK Hynix is advancing the development of DDR5-based CXL 2.0 memory solutions aimed at the AI sector, particularly at large language models (LLMs), which require substantial memory.

Compute Express Link, or CXL, is an open interconnect standard built on the PCIe physical layer that provides cache-coherent links between CPUs and devices such as accelerators and memory expanders. Unlike plain PCIe, it lets a host and its attached devices access shared memory coherently and allows system memory capacity to be expanded with add-in modules. This capability has attracted significant market interest, as conventional memory configurations are increasingly insufficient for large-scale AI models.
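A headline feature that CXL 2.0 adds over earlier revisions is switching and memory pooling: several hosts can dynamically borrow capacity from a shared pool of expander devices and return it when done. The toy Python sketch below illustrates only the pooling concept; it is not an SK Hynix or CXL API, and all names and numbers in it are invented for illustration.

```python
# Toy illustration of CXL 2.0-style memory pooling (not a real API):
# several hosts dynamically borrow capacity from a shared expander pool.

class MemoryPool:
    def __init__(self, capacity_gb: int):
        self.capacity_gb = capacity_gb
        self.allocations = {}  # host name -> GB currently borrowed

    def free_gb(self) -> int:
        return self.capacity_gb - sum(self.allocations.values())

    def allocate(self, host: str, gb: int) -> bool:
        """Grant `gb` of pooled memory to `host` if the pool has room."""
        if gb > self.free_gb():
            return False
        self.allocations[host] = self.allocations.get(host, 0) + gb
        return True

    def release(self, host: str) -> None:
        """Return all of `host`'s borrowed memory to the pool."""
        self.allocations.pop(host, None)

pool = MemoryPool(capacity_gb=256)    # e.g. two hypothetical 128 GB modules
print(pool.allocate("host-a", 96))    # True
print(pool.allocate("host-b", 128))   # True
print(pool.allocate("host-c", 64))    # False: only 32 GB left in the pool
pool.release("host-a")
print(pool.allocate("host-c", 64))    # True once host-a's share is returned
```

The point of the sketch is the dynamic give-back: capacity stranded on one host in a fixed-DIMM design can instead be reassigned to whichever host needs it.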

At a recent event in Korea, SK Hynix’s Vice President of System Architecture elaborated on the company’s efforts to develop CXL memory modules. He highlighted the limitations of current AI memory systems, which rely predominantly on the Graphics Processing Unit (GPU) and High Bandwidth Memory (HBM). Because HBM capacity is limited, demand for additional memory remains high, and tuning CXL memory for AI workloads is actively in progress.

Kyoung Park, VP of Research at SK Hynix, stated that the company is working on DDR5-based 96 GB and 128 GB CXL 2.0 memory products, with plans to release them by the second half of 2025. The interconnect lets other components on the board access this memory directly and provides headroom for further capacity expansion.
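To put those module capacities in context, a rough back-of-envelope calculation is shown below. It assumes 16-bit (2 bytes per parameter) model weights and uses decimal gigabytes; none of these figures come from SK Hynix, and real deployments also need memory for activations, KV caches, and overhead.

```python
# Rough sketch (assumed figures): how many expansion modules would it
# take just to hold an LLM's weights?

def modules_needed(params_billions: float, bytes_per_param: int, module_gb: int) -> int:
    """Modules required to hold the raw model weights (ceiling division)."""
    # billions of params * bytes/param = decimal GB (the 1e9 factors cancel)
    weights_gb = params_billions * bytes_per_param
    return int(-(-weights_gb // module_gb))

# A 70B-parameter model in FP16 is ~140 GB of weights alone:
print(modules_needed(70, 2, 128))  # → 2 of the 128 GB modules
print(modules_needed(70, 2, 96))   # → 2 of the 96 GB modules
```

Even this simplified arithmetic shows why per-module capacity in the 96-128 GB range is attractive for LLM serving compared with conventional DIMM sizes.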

In related developments, a Korean startup named Panmnesia has introduced a pioneering CXL IP that enables GPUs to draw on external DRAM or even SSDs to supplement their built-in HBM.

Since SK Hynix has not yet showcased its CXL 2.0 memory solutions, details about their effectiveness in AI applications remain pending. Meanwhile, Samsung is expected to launch its 256 GB CXL 2.0 memory module this year, which might provide further insight into the technology’s potential.