Samsung says it has developed the industry’s first 12-stack HBM3E 12H DRAM, outpacing Micron Technology and potentially setting the stage for the next generation of Nvidia’s AI cards.
The South Korean tech giant’s HBM3E 12H offers bandwidth of up to 1,280GB/s and an industry-leading capacity of 36GB, a more than 50% improvement in both capacity and bandwidth over the 8-stack HBM3 8H.
The 12-stack HBM3E 12H uses advanced thermal compression non-conductive film (TC NCF), which allows the 12-layer stack to meet current HBM package height requirements, matching the height of 8-layer products. These advances also yield a 20% increase in vertical density compared to Samsung’s HBM3 8H product.
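The “more than 50%” figure can be sanity-checked with quick back-of-the-envelope arithmetic. The baseline numbers below are assumptions, since the article does not state them: a typical HBM3 8H stack offers 24GB of capacity and roughly 819.2GB/s of bandwidth (6.4Gbps per pin across a 1,024-bit interface).

```python
# Rough check of the ">50% improvement" claim.
# Baseline HBM3 8H figures (24 GB, 819.2 GB/s) are assumptions,
# not stated in the article.
hbm3_8h = {"capacity_gb": 24, "bandwidth_gbps": 819.2}
hbm3e_12h = {"capacity_gb": 36, "bandwidth_gbps": 1280.0}

# Relative gain of the 12H part over the assumed 8H baseline
capacity_gain = hbm3e_12h["capacity_gb"] / hbm3_8h["capacity_gb"] - 1
bandwidth_gain = hbm3e_12h["bandwidth_gbps"] / hbm3_8h["bandwidth_gbps"] - 1

print(f"Capacity:  +{capacity_gain:.0%}")   # +50%
print(f"Bandwidth: +{bandwidth_gain:.0%}")  # +56%
```

Under those assumed baselines, both gains clear the 50% mark, consistent with Samsung’s claim.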
The battle heats up
“The industry’s AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need,” said Yongcheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics. “This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market.”
Meanwhile, Micron has started mass production of its 24GB 8H HBM3E, which will be used in Nvidia’s latest H200 Tensor Core GPUs. Micron claims its HBM3E consumes 30% less power than competing parts, which it says makes it well suited to generative AI workloads.
Despite missing out on Nvidia’s most expensive AI card, Samsung’s 36GB HBM3E 12H outperforms Micron’s 24GB 8H HBM3E in both capacity and bandwidth. As AI applications continue to grow, that headroom makes Samsung’s 12H HBM3E a strong candidate for future systems requiring more memory, such as Nvidia’s B100 Blackwell AI powerhouse, which is expected to arrive by the end of this year.
Samsung has already begun sampling its 36GB HBM3E 12H to customers, with mass production expected to start in the first half of this year. Micron is set to begin shipping its 24GB 8H HBM3E in the second quarter of 2024. The competition between the two tech giants in the HBM market is expected to intensify as the demand for high-capacity memory solutions continues to surge in the AI era.