SK Hynix accelerates HBM plan

SK Hynix has laid out a roadmap indicating that the company intends to keep dominating production of the high-bandwidth memory (HBM) that is indispensable for AI.

However, industry experts say the lead the company has gained over rivals Samsung and Micron will face tougher competition.

SK Hynix said at an industry event last month that it may be the first to introduce next-generation HBM4, in 2025. At the event, the company showed a presentation slide of two HBM3E modules packaged with Nvidia's Grace Hopper GH200 superchip.
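For context, HBM's headline bandwidth comes from a very wide interface rather than an unusually high per-pin rate. A rough back-of-the-envelope sketch (assuming a 1024-bit interface per stack and a per-pin rate of about 9.2 Gb/s, figures commonly cited for HBM3E; neither number is from this article) illustrates why these modules matter for AI accelerators:

```python
# Illustrative HBM3E bandwidth estimate (assumed figures, not from the article).
PINS_PER_STACK = 1024  # assumed interface width in bits per stack
GBPS_PER_PIN = 9.2     # assumed per-pin data rate in Gb/s

# Divide by 8 to convert gigabits/s to gigabytes/s.
bandwidth_gbytes = PINS_PER_STACK * GBPS_PER_PIN / 8
print(f"~{bandwidth_gbytes:.0f} GB/s per stack")  # ~1178 GB/s
```

On these assumptions, a single stack approaches 1.2 TB/s, which is why a GPU package carrying several HBM stacks can feed the memory-bound workloads typical of large AI models.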


SVP Ilsup Jin, who heads the company's DRAM and NAND technology development, said at ITF World in Antwerp last month that the company's next-generation HBM4 may be available sooner than expected. "HBM4 is coming pretty quickly," Jin said. "It's coming next year."

SK Hynix has the leading market share in HBM, with over 85% in HBM3 and over 70% in total HBM, SemiAnalysis Chief Analyst Dylan Patel told EE Times earlier this year. Competition is expected to get stronger, according to Sri Samavedam, SVP of CMOS technologies at global R&D organization imec.


"SK Hynix were out early and they got ahead," Samavedam told EE Times. "Micron is not far behind. They came out with some really competitive HBM offerings last year and an HBM3E offering this year, too."

In February, Micron announced commercial production of its HBM3E, which will be part of Nvidia's H200 Tensor Core GPUs shipping in the second quarter of 2024. Advanced packaging is critical to the wide adoption of HBM. According to an article in The Korea Economic Daily, Samsung will offer a 3-dimensional (3D) packaging service for HBM this year, followed by its own HBM4 in 2025.


Samsung has announced plans to introduce 12-layer HBM3E products by the second quarter of 2024, and has promised to strengthen its HBM delivery capabilities and technological competitiveness. HBM supply is a potential bottleneck for the expansion of AI models and services.
