High bandwidth memory (HBM) is rapidly emerging as a critical component in the ongoing race for AI market supremacy. SK hynix, the world’s second-largest memory chip manufacturer, has announced an ambitious plan to invest ₩103 trillion (approximately $74.5 billion) between now and 2028 to boost its semiconductor operations, with 80% of the sum earmarked for AI-related business areas including HBM.
SK hynix has already sold out its HBM production capacity for this year and most of 2025, driven by surging demand from the AI sector; its chips have been certified for use with Nvidia’s high-end GPU accelerators. The new investment is intended to maintain the company’s leadership in the HBM segment by expanding production capacity and strengthening its R&D capabilities.
HBM technology enhances memory bandwidth by integrating memory chips within the same package as the CPU or GPU, sometimes stacking them directly on top of each other. This design minimizes connection lengths, resulting in faster data transfer rates and improved performance for AI workloads.
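As a rough illustration of why those wide, in-package interfaces matter, the sketch below computes peak per-stack bandwidth from interface width and per-pin data rate. The figures used are assumptions drawn from commonly quoted HBM3 and GDDR6 specifications, not vendor data, and the comparison is only indicative.

```python
# Illustrative sketch: peak memory bandwidth = interface width x per-pin data rate.
# The figures below are assumed, commonly quoted spec values, not vendor data.

def peak_bandwidth_gbps(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Return peak bandwidth in GB/s for a memory interface."""
    return bus_width_bits * pin_rate_gbps / 8  # convert bits to bytes

# One HBM3 stack: 1024-bit interface at roughly 6.4 Gb/s per pin.
hbm3_stack = peak_bandwidth_gbps(1024, 6.4)   # ~819 GB/s per stack

# A conventional off-package GDDR6 chip: 32-bit interface at roughly 16 Gb/s per pin.
gddr6_chip = peak_bandwidth_gbps(32, 16.0)    # ~64 GB/s per chip

print(f"HBM3 stack: {hbm3_stack:.0f} GB/s")
print(f"GDDR6 chip: {gddr6_chip:.0f} GB/s")
```

The takeaway is that HBM’s advantage comes from the very wide interface made possible by in-package integration, rather than from higher per-pin speeds.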
However, the growing enthusiasm for HBM is raising concerns about a potential DRAM supply shortage unless additional manufacturing lines are established. Demand for HBM is projected to increase by 200 percent this year and double again by 2025, highlighting the urgent need for expanded production capacity.
Samsung, another memory giant, is also looking to capitalize on the growth of the HBM market, but it is currently awaiting certification of its chips for Nvidia’s GPUs. Although rumors suggest that Samsung’s chips are struggling to meet Nvidia’s power consumption and heat requirements, the company has denied these claims.
Meanwhile, Micron, the third-largest memory maker, has reported that its HBM production capacity is sold out through 2025 and anticipates a substantial revenue boost because of the AI sector’s insatiable need for the memory chips. However, Micron’s new fabrication plants in the US, located in Boise and New York, will not contribute to memory supply until fiscal 2027 and 2028, respectively.
In China, tech giant Huawei is working to secure its own HBM chips in response to US sanctions that have restricted its access to components made outside the country. The company is reported to be partnering with Wuhan Xinxin Semiconductor Manufacturing and other local firms to develop an equivalent to the Chip on Wafer on Substrate (CoWoS) packaging technology needed to combine GPUs with HBM chips, an initiative intended to blunt the impact of US sanctions and bolster China’s self-sufficiency in high-tech components. How soon Huawei will be able to achieve this objective is open to question.