Revolutionizing Data Transfer: Unleashing 36 GB Capacity & Over 2 TB/s Bandwidth

Micron has taken a significant step forward by sampling its next-generation HBM4 memory to select customers, setting the stage for enhanced performance and larger capacities in AI platforms. The HBM4 device is a 12-high stack offering a hefty 36 GB of capacity and bandwidth exceeding 2 TB/s per stack.
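Those two headline figures imply a straightforward per-die density. The sketch below is a back-of-the-envelope check, assuming capacity is split evenly across the stacked DRAM dies, as is typical for HBM:

```python
# Per-die capacity implied by a 36 GB, 12-high HBM4 stack.
# Assumption: capacity divides evenly across the stacked DRAM dies.
stack_capacity_gb = 36   # GB per stack, from Micron's announcement
dies_per_stack = 12      # 12-high stack

per_die_gb = stack_capacity_gb / dies_per_stack
per_die_gbit = per_die_gb * 8

print(f"{per_die_gb:.0f} GB ({per_die_gbit:.0f} Gb) per DRAM die")
# 3 GB (24 Gb) per DRAM die
```

That works out to 24 Gb per die, consistent with current high-density DRAM die capacities.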

This milestone extends Micron’s leadership in memory performance and power efficiency for AI applications. Built on Micron’s established 1-beta DRAM process and advanced packaging technology, HBM4 is designed to integrate seamlessly into next-generation AI platforms, and its built-in self-test capability supports reliability and robust performance.

As generative AI continues to expand, managing inference efficiently becomes vital. Micron HBM4 features a 2048-bit interface delivering more than 2.0 TB/s per stack, over a 60% improvement compared to its predecessor. The wider, faster interface speeds communication between memory and processor, enabling high-throughput designs that boost the performance of large language models and reasoning systems. In short, HBM4 enables AI accelerators to respond faster and reason more effectively.
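The interface width and per-stack bandwidth together imply a per-pin data rate, and the claimed generational gain can be sanity-checked against a typical HBM3E baseline. A minimal sketch, assuming decimal terabytes (1 TB = 10^12 bytes) and a roughly 1.2 TB/s HBM3E per-stack baseline (an assumption for illustration, not a figure from this announcement):

```python
# Per-pin data rate implied by the headline numbers.
interface_bits = 2048            # bits, HBM4 interface width
stack_bandwidth_tbps = 2.0       # TB/s per stack (decimal TB)

stack_bandwidth_bits = stack_bandwidth_tbps * 1e12 * 8   # bits/s
per_pin_gbps = stack_bandwidth_bits / interface_bits / 1e9
print(f"~{per_pin_gbps:.2f} Gb/s per pin")   # ~7.81 Gb/s per pin

# Generational gain vs. an assumed ~1.2 TB/s HBM3E stack.
hbm3e_baseline_tbps = 1.2        # assumed baseline, for illustration
gain = stack_bandwidth_tbps / hbm3e_baseline_tbps - 1
print(f"~{gain:.0%} improvement")            # ~67% improvement
```

Under those assumptions the per-pin rate lands near 8 Gb/s, and the generational gain comfortably clears the quoted 60%.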

Moreover, HBM4 delivers over 20% better power efficiency than Micron’s previous-generation HBM3E products, which themselves set the industry benchmark. The improvement maximizes throughput per watt, helping optimize data center operations.
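What a ">20% better power efficiency" claim means in practice can be illustrated for a fixed transfer rate. The baseline energy-per-bit value below is hypothetical, chosen only to show the arithmetic, and is not a published Micron figure:

```python
# Illustrative effect of 20% better power efficiency at a fixed 2 TB/s.
# The baseline energy/bit is hypothetical, not a published figure.
baseline_pj_per_bit = 5.0                      # hypothetical HBM3E energy/bit
hbm4_pj_per_bit = baseline_pj_per_bit * (1 - 0.20)

bits_per_second = 2.0e12 * 8                   # 2 TB/s expressed in bits/s
baseline_watts = baseline_pj_per_bit * 1e-12 * bits_per_second
hbm4_watts = hbm4_pj_per_bit * 1e-12 * bits_per_second

print(f"baseline: {baseline_watts:.0f} W, HBM4: {hbm4_watts:.0f} W")
# baseline: 80 W, HBM4: 64 W
```

At data-center scale, shaving a fifth of the memory-interface power at the same throughput compounds across thousands of accelerators.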

The surge in generative AI applications shows how transformative this technology can be, offering significant societal benefits. HBM4 plays a vital role in driving faster insights and discoveries, fostering innovation across various sectors such as healthcare, finance, and transportation.

Micron’s legacy of nearly five decades in memory and storage innovation underpins its role in accelerating AI growth. The company’s broad portfolio turns data into intelligence, fueling advancements from data centers to the edge. With HBM4, Micron solidifies its position as a pivotal force in AI innovation and promises to be a trusted partner in future solutions. The company plans to fully ramp up HBM4 by 2026, in alignment with the next generation of AI platform launches.