Recently, the Korean media reported that Samsung Electronics' HBM3 and packaging services have passed AMD's quality tests.
AMD reportedly plans to launch a series of AI chips called Instinct MI300, which will use Samsung's HBM3 and packaging services and is expected to launch in the fourth quarter of this year. The chip combines central processing units (CPUs), graphics processing units (GPUs), and HBM3 memory, and is expected to deliver outstanding performance.
With TSMC's production capacity stretched thin, Samsung may provide both packaging and HBM services for AMD; it is currently the only company able to supply advanced packaging solutions and HBM products simultaneously. AMD originally planned to use TSMC's advanced packaging services, but changed course when TSMC could not meet its demand.
Driven by AI technology, the demand for high-performance GPUs is continuously increasing. This not only benefits GPU manufacturers such as Nvidia and AMD, but also provides greater impetus for the development of HBM and advanced packaging technologies.
According to industry data, generative AI (AIGC) models rely on AI servers for training and inference. High-end GPUs are essential components of training-side AI servers, and HBM adoption on these GPUs is approaching 100%.
According to the latest survey by TrendForce, a global market research firm, the main HBM suppliers are currently Samsung, SK Hynix, and Micron. With these memory makers actively expanding capacity, the annual growth rate of HBM bit supply is expected to reach 105% by 2024.
On the competitive front, TrendForce notes that SK Hynix's HBM3 products currently lead those of the other memory makers, making it the main HBM supplier for Nvidia's server GPUs, while Samsung focuses on orders from other cloud service providers. As Samsung's customer orders grow, the market-share gap between the two is expected to narrow significantly this year. Across 2023-2024 the two companies are estimated to hold similar shares of the HBM market, together accounting for roughly 95% of the total. However, because their customer mixes differ slightly, their shipment performance may vary from quarter to quarter.
Micron will focus on developing HBM3e products this year. Its expansion plans are smaller than those of the two Korean manufacturers, so its market share may be squeezed somewhat over the next two years.
As for advanced packaging capacity, TSMC's CoWoS packaging technology is currently the mainstream choice for AI server chips. TrendForce estimates that, driven by strong demand for high-end AI chips and HBM, TSMC's monthly CoWoS capacity will reach 12K by the end of 2023. Nvidia's demand for CoWoS capacity, driven by products such as the A100 and H100, has grown by nearly 50% since the beginning of the year; with additional demand from AMD, Google, and other high-end AI chip customers, CoWoS capacity is expected to become even tighter in the second half of the year. This strong demand will continue into 2024, and if the relevant equipment arrives on schedule, advanced packaging capacity could grow by a further 30-40%.