Driven by the wave of artificial intelligence, High-Bandwidth Memory (HBM) has moved from behind the scenes to the forefront, with market demand growing rapidly. According to TrendForce, a global market research firm, demand for HBM was projected to increase by 58% year on year in 2023, and by roughly another 30% in 2024. Compared with traditional DRAM, HBM offers high bandwidth, large capacity, low latency, and low power consumption, making it markedly better suited to AI data processing. It is therefore widely used in high-performance computing workloads such as ChatGPT. This is why HBM is so highly favored, and why memory manufacturers are actively pushing the iterative upgrade of HBM technology.
Since the introduction of the first through-silicon via (TSV) based HBM product in 2014, HBM technology has gone through multiple generations. Several HBM products are now on the market, including HBM, HBM2, HBM2E, HBM3, and HBM3e. According to TrendForce's survey, the two major Korean manufacturers, SK Hynix and Samsung, were the first to supply HBM3 for products such as NVIDIA's H100/H800 and AMD's MI300 series, and both are expected to release HBM3e samples in the first quarter of 2024. Meanwhile, the US memory maker Micron has chosen to skip HBM3 and develop HBM3e directly.
It is understood that HBM3e will adopt 24Gb single-die density, so that an 8-layer (8Hi) stack significantly raises capacity to 24GB per stack. This will be applied to NVIDIA's GB100 chip, planned for launch in 2025. Accordingly, the three major memory makers are expected to release HBM3e samples in the first quarter of 2024 and strive for mass production in the second half of that year.
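The capacity figure above follows directly from the stack arithmetic; a minimal sketch (dividing by 8 to convert gigabits to gigabytes):

```python
# Sketch: how eight stacked 24 Gb dies yield a 24 GB HBM3e stack.
GBIT_PER_DIE = 24      # per-die density in gigabits (Gb), per the report
DIES_PER_STACK = 8     # 8Hi stack

total_gbit = GBIT_PER_DIE * DIES_PER_STACK  # 192 Gb in the stack
total_gbyte = total_gbit / 8                # 8 bits per byte
print(total_gbyte)  # 24.0 GB
```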
In addition to HBM3 and HBM3e, the latest news shows that the giants of the memory industry are also planning the next generation of HBM, namely HBM4. Recently, Hwang Sang-joon, Vice President of Samsung Electronics and Head of its DRAM Product and Technology team, revealed that Samsung has successfully developed 9.8Gbps HBM3E and plans to begin providing samples to customers. At the same time, Samsung is developing HBM4 and plans to supply it in 2025. It is reported that, to cope with high-temperature thermal characteristics, Samsung Electronics is developing non-conductive film (NCF) assembly technology and hybrid copper bonding (HCB) technology for use in HBM4 products.
According to media reports, in order to seize the opportunity in the rapidly growing HBM market, Samsung plans to overhaul its next-generation process technology and introduce a brand-new HBM4 product. The HBM4 memory stack is reported to adopt a 2048-bit memory interface, whereas all previous HBM stacks have used a 1024-bit interface. Doubling the interface width from 1024 bits per stack to 2048 bits per stack alone underscores how transformative HBM4 is expected to be.
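To see what the wider interface buys, peak stack bandwidth scales as interface width times per-pin data rate. A minimal sketch, reusing the 9.8Gbps per-pin figure quoted for Samsung's HBM3E above purely as an assumption (HBM4's actual per-pin rate has not been announced), to isolate the effect of the width alone:

```python
# Sketch: peak bandwidth of one HBM stack.
# bandwidth (GB/s) = interface width (bits) x per-pin rate (Gbps) / 8 bits-per-byte
def stack_bandwidth_gbs(width_bits: int, pin_rate_gbps: float) -> float:
    """Peak per-stack bandwidth in GB/s."""
    return width_bits * pin_rate_gbps / 8

# Assumed 9.8 Gbps per pin (the HBM3E figure cited above, not an HBM4 spec):
print(stack_bandwidth_gbs(1024, 9.8))  # 1254.4 GB/s at a 1024-bit interface
print(stack_bandwidth_gbs(2048, 9.8))  # 2508.8 GB/s at a 2048-bit interface
```

At the same per-pin rate, doubling the interface width doubles per-stack bandwidth, which is why the 2048-bit interface is regarded as the defining change in HBM4.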