Samsung’s arch-rival unveils HBM3E memory chip that could power Nvidia’s Blackwell B100 AI GPU — with 16 layers, 48GB and a 10.24Gbps transfer rate, it may well be the key to making ChatGPT 6 live

Samsung is set to unveil numerous new products at the forthcoming International Solid-State Circuits Conference (ISSCC), including a superfast DDR5 chip, 280-layer QLC NAND flash memory, and the world’s fastest GDDR7.

But while Samsung will certainly draw a lot of attention, it’s not the only game in town, as its South Korean rival SK hynix is also set to reveal its new HBM3E DRAM straight after Samsung finishes talking about its 3D-NAND flash memory at the High-Density Memory and Interfaces session.

HBM3E (High Bandwidth Memory gen 3 Extended) is a groundbreaking memory that offers a significant leap in performance and power efficiency, and is designed to meet the escalating demands of high-performance computing, AI, and graphics applications.

Nvidia has a choice

HBM3E is the fifth generation of HBM, and interconnects multiple DRAM dies vertically, significantly increasing data speed and capacity while improving heat dissipation.

According to SK hynix, its new memory chip can process data at up to 1.15 terabytes per second, equivalent to processing more than 230 Full-HD movies of 5GB each in a single second. It also boasts a 10% improvement in heat dissipation, thanks to the implementation of the cutting-edge Advanced Mass Reflow Molded Underfill (MR-MUF) technology.
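As a quick back-of-the-envelope check of that comparison (assuming 5GB per Full-HD movie and decimal units, so 1TB equals 1,000GB — figures taken from the claim itself rather than an official spec sheet), the arithmetic holds up:

BANDWIDTH_TB_PER_S = 1.15   # claimed peak throughput
MOVIE_SIZE_GB = 5           # assumed size of one Full-HD movie

# 1.15 TB/s x 1,000 GB/TB / 5 GB per movie = 230 movies per second
movies_per_second = BANDWIDTH_TB_PER_S * 1000 / MOVIE_SIZE_GB
print(f"{movies_per_second:.0f} Full-HD movies per second")  # prints 230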

SK hynix sees its HBM3E as the driving force behind AI tech innovation, and it could also power Nvidia’s most powerful GPU ever — the B100 Blackwell AI GPU. Micron has stated it won’t release the next generation of its high-bandwidth memory, HBM4, until 2025. This has led to speculation that Nvidia may seek an alternative supplier for the B100 Blackwell.

Although Samsung, with its new ‘Shinebolt’ HBM3E memory, was considered the most likely contender for this, SK hynix is well-positioned to step in with its own HBM3E.

There’s no official word on this yet, but it likely won’t be long until we find out which of the Korean companies Nvidia chooses.
