Samsung missed out on Nvidia’s most expensive AI card but beats Micron to 36GB HBM3E memory — could this new tech power the B100, the successor of the H200?

Samsung says it has developed HBM3E 12H, the industry’s first 12-stack HBM3E DRAM, outpacing Micron Technology and potentially setting the stage for the next generation of Nvidia’s AI cards.

The South Korean tech giant’s HBM3E 12H offers bandwidth of up to 1,280GB/s and an industry-leading capacity of 36GB, both more than 50% improvements over the 8-stack HBM3 8H.
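As a rough sanity check on that claim, the sketch below compares the announced figures against a commonly cited HBM3 8H baseline of roughly 819.2GB/s of bandwidth and 16GB of capacity per stack; those baseline numbers are an assumption on our part, not figures stated in the announcement.

```python
# Rough per-stack comparison of Samsung's HBM3E 12H against an assumed HBM3 8H baseline.
# The HBM3E 12H figures come from Samsung's announcement; the HBM3 8H baseline
# (819.2 GB/s, 16 GB) is a commonly cited spec and is assumed here, not sourced from the article.

hbm3e_12h = {"bandwidth_gb_s": 1280.0, "capacity_gb": 36.0}  # announced figures
hbm3_8h = {"bandwidth_gb_s": 819.2, "capacity_gb": 16.0}     # assumed baseline

for key, new_value in hbm3e_12h.items():
    gain = (new_value - hbm3_8h[key]) / hbm3_8h[key] * 100
    print(f"{key}: +{gain:.1f}% vs HBM3 8H")
# Prints roughly +56% for bandwidth and +125% for capacity,
# consistent with the "more than 50%" claim.
```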
