Driven by demand from artificial intelligence, High Bandwidth Memory (HBM) has become a focal point of the memory market. Recent reports indicate that the next generation, HBM4, will undergo a major transformation, moving to a 2048-bit memory interface per HBM stack.
Every HBM stack since 2015 has used a 1024-bit interface, so doubling that to 2048 bits per stack is a significant change and underscores how transformative HBM4 is expected to be.
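To see why the wider interface matters, note that a stack's peak bandwidth is simply interface width times per-pin data rate. The sketch below assumes a 6.4 Gb/s pin rate (the HBM3 figure; HBM4's actual pin speed has not been finalized) purely to illustrate the doubling effect:

```python
# Illustrative only: 6.4 Gb/s per pin is the HBM3 rate, assumed here
# for both widths; HBM4's real pin speed is not yet defined.
def stack_bandwidth_gbs(interface_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in GB/s (bits * Gb/s per pin / 8)."""
    return interface_bits * pin_rate_gbps / 8

hbm3_style = stack_bandwidth_gbs(1024, 6.4)  # 1024-bit interface
hbm4_style = stack_bandwidth_gbs(2048, 6.4)  # 2048-bit interface
print(hbm3_style, hbm4_style)  # 819.2 1638.4
```

At the same pin rate, doubling the interface width doubles per-stack bandwidth, which is why the 2048-bit interface is seen as a generational break rather than an incremental bump.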
Furthermore, Korean media report that, to capture the rapidly growing HBM market, Samsung plans significant innovations in the manufacturing process technology for its next-generation products, with mass production of HBM4 targeted for 2026.
However, even after its release, HBM4 will not immediately displace today's offerings: HBM2e currently dominates the market, and HBM3 is expected to become the mainstream product next.
According to a survey by the global market research firm TrendForce, HBM2e is the dominant specification in the current HBM market, used in products such as the NVIDIA A100/A800, AMD MI200, and most CSPs' self-developed accelerator chips. To meet the evolving demands of AI accelerator chips, memory manufacturers plan to introduce HBM3e in 2024, and HBM3 and HBM3e are expected to become the market's mainstream products next year.
TrendForce's data on demand by generation shows mainstream demand shifting from HBM2e to HBM3 in 2023, with estimated shares of 50% and 39%, respectively. As more HBM3-based accelerator chips enter mass production, demand will shift decisively toward HBM3 in 2024, and it is expected to surpass HBM2e as the mainstream specification with an estimated 60% share. Benefiting from HBM3's higher average selling price (ASP), HBM revenue is expected to grow significantly next year.