Micron was the first to announce mass production of its new ultra-fast HBM3E memory in February 2024, putting the company ahead of HBM rivals SK hynix and Samsung. The American memory maker said it would supply HBM3E chips for NVIDIA's upcoming H200 AI GPU, which moves to HBM3E from the HBM3 memory used in its predecessor, the H100 AI GPU.
Micron will manufacture its new HBM3E memory on its 1b-nanometer DRAM node, comparable to the 12nm-class node that HBM leader SK hynix uses for its HBM. According to Korea JoongAng Daily, Micron is "technically ahead" of HBM competitor Samsung, which still uses 1a-nanometer technology, equivalent to a 14nm-class process.
Micron's new 24GB 8-Hi HBM3E memory will be at the heart of NVIDIA's upcoming H200 AI GPU, and Micron's lead in the HBM process is an essential part of its deal with NVIDIA. A chip industry expert weighed in on Micron's position:
Chip expert Jeon In-seong, author of "The Future of the Semiconductor Empire," said: "It is proven that Micron's manufacturing method is more advanced than Samsung Electronics' because their HBM3E will be made with 1b nanometer technology. Micron will need some more work on packaging, but it should be easier than what they've already achieved with 1b nanometer technology."