Micron begins volume production of HBM3e memory

The US's biggest memory firm, Micron, has joined the race to compete in the next-gen HBM3e space, and says its new products will be used by Nvidia.

The race is on to meet the voracious demand from data centres for memory that combines high performance with low energy consumption. HBM2 and HBM3 represent the cutting edge right now, and Nvidia has already released its H100 AI GPU with HBM3 memory. AMD's new Instinct MI300X AI GPU also uses HBM3.

But HBM3e is imminent, and is attracting a lot of attention from advanced chip design firms. Nvidia says its upgraded H200 AI GPU and upcoming Blackwell B100 AI GPU will both feature HBM3e.

Micron claims its new HBM3e offering will consume 30% less power than competing products, with pin speeds of 9.2 Gbit/s and overall bandwidth of 1.2 Tbyte/s. It says its 24 Gbyte 8-high HBM3e components will be part of the H200 Tensor Core GPUs that Nvidia will begin shipping in 2Q24.
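The quoted figures are self-consistent: assuming the standard 1,024-bit HBM stack interface (an assumption here, not stated in the article), the ~1.2 Tbyte/s bandwidth follows directly from the 9.2 Gbit/s pin speed. A back-of-the-envelope sketch:

```python
# Sanity check of Micron's quoted HBM3e figures.
# Assumes the standard 1,024-bit-wide HBM stack interface
# (not stated in the article itself).

PIN_SPEED_GBPS = 9.2    # Gbit/s per pin, as quoted by Micron
BUS_WIDTH_BITS = 1024   # bits, standard HBM stack interface width

# Per-stack bandwidth: pin speed x bus width, converted from bits to bytes.
bandwidth_gbyte_s = PIN_SPEED_GBPS * BUS_WIDTH_BITS / 8

print(f"{bandwidth_gbyte_s:.1f} GB/s")  # ~1177.6 GB/s, i.e. roughly 1.2 TB/s
```

The ~1,178 Gbyte/s result rounds to the 1.2 Tbyte/s Micron quotes.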

“Micron is delivering a trifecta with this HBM3e milestone: time-to-market leadership, best-in-class industry performance, and a differentiated power efficiency profile,” said Sumit Sadana, EVP at Micron. “AI workloads are heavily reliant on memory bandwidth and capacity, and Micron is very well-positioned to support the significant AI growth ahead through our industry-leading HBM3E and HBM4 roadmap, as well as our full portfolio of DRAM and NAND solutions for AI applications.”
 


April 26 2024 9:38 am