In a move set to redefine the high-performance memory landscape, Micron Technology has unveiled its 12-high HBM3E memory solution. The new stack offers 36 GB of capacity and more than 1.2 TB/s of bandwidth, promising a new level of performance for applications that demand the absolute best.
Micron HBM3E 12-high delivers 36 GB of capacity, a 50% increase over current HBM3E 8-high offerings, allowing larger AI models such as Llama 2 with 70 billion parameters to run on a single processor. Keeping the model in local memory avoids CPU offload and GPU-to-GPU communication delays, shortening time to insight. The 12-high 36 GB stack also consumes significantly less power than competitors' HBM3E 8-high 24 GB solutions, while delivering more than 1.2 terabytes per second (TB/s) of memory bandwidth at a pin speed greater than 9.2 gigabits per second (Gb/s). Together, these advantages provide maximum throughput at the lowest power consumption, a combination that matters for power-hungry data centers. Additionally, HBM3E 12-high incorporates fully programmable MBIST (memory built-in self-test) that can run system-representative traffic at full spec speed, providing improved test coverage for expedited validation, enabling faster time to market and enhancing system reliability.
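The quoted bandwidth follows directly from the pin speed. As a back-of-envelope sketch, assuming the standard 1024-bit HBM stack interface (the per-stack width defined by the JEDEC HBM3 family; not stated explicitly by Micron here):

```python
# Back-of-envelope check of HBM3E stack bandwidth.
# Assumption: a 1024-bit (1024-pin) data interface per stack,
# as in the JEDEC HBM3 family of standards.
PINS = 1024            # data pins per HBM stack
pin_speed_gbps = 9.2   # per-pin data rate in Gb/s (Micron quotes "greater than 9.2")

# Convert aggregate bits/s to bytes/s: pins * Gb/s per pin / 8 bits per byte
bandwidth_gb_s = PINS * pin_speed_gbps / 8
print(f"{bandwidth_gb_s:.1f} GB/s")  # prints "1177.6 GB/s"
```

At exactly 9.2 Gb/s this works out to about 1.18 TB/s per stack, so the "more than 1.2 TB/s" figure implies pin speeds modestly above 9.2 Gb/s.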
Micron is now shipping production-capable HBM3E 12-high units to key industry partners for qualification across the AI ecosystem. This HBM3E 12-high milestone demonstrates Micron’s innovations to meet the data-intensive demands of the evolving AI infrastructure.
Micron is also a proud partner in TSMC’s 3DFabric Alliance, which helps shape the future of semiconductor and system innovations. AI system manufacturing is complex, and HBM3E integration requires close collaboration between memory suppliers, customers and outsourced semiconductor assembly and test (OSAT) players.
In a recent exchange, Dan Kochpatcharin, head of the Ecosystem and Alliance Management Division at TSMC, commented, “TSMC and Micron have enjoyed a long-term strategic partnership. As part of the OIP ecosystem, we have worked closely to enable Micron’s HBM3E-based system and chip-on-wafer-on-substrate (CoWoS) packaging design to support our customer’s AI innovation.”
In summary, here are the Micron HBM3E 12-high 36 GB highlights:

- 36 GB capacity, a 50% increase over HBM3E 8-high 24 GB stacks
- More than 1.2 TB/s of memory bandwidth at a pin speed greater than 9.2 Gb/s
- Significantly lower power consumption than competitors' HBM3E 8-high 24 GB solutions
- Fully programmable MBIST for improved test coverage and expedited validation
Micron’s leading-edge data center memory and storage portfolio is designed to meet the evolving demands of generative AI workloads. From near memory (HBM) and main memory (high-capacity server RDIMMs) to Gen 5 PCIe NVMe SSDs and data lake SSDs, Micron offers market-leading products that scale AI workloads efficiently and effectively.
As Micron continues to focus on extending its industry leadership, the company is already looking toward the future with its HBM4 and HBM4E roadmap. This forward-thinking approach ensures that Micron remains at the forefront of memory and storage development, driving the next wave of advancements in data center technology.