Wallstreetcn
2024.03.22 08:17

Micron Technology supplies NVIDIA: Production capacity is expected to be fully sold out this year and next

This year's production capacity has been sold out, and most of next year's production capacity has already been booked

Author: Zhou Yuan / Wall Street News

On March 20, North American time, Micron Technology CEO Sanjay Mehrotra announced on the company's earnings call that Micron's HBM capacity for this year has sold out, and that the majority of its 2025 capacity has already been booked.

On February 26, Micron officially announced the start of mass production of HBM3E high-bandwidth memory. Its 24GB 8-Hi HBM3E will be supplied to NVIDIA for use in NVIDIA products, with shipments beginning in the second quarter.

Micron will also supply corresponding HBM3E products for NVIDIA's next-generation AI accelerator, the B100.

Micron Tech Officially Enters NVIDIA's Supply Chain

Mehrotra said that demand from AI servers is driving rapid growth in HBM, DDR5, and data center SSDs, tightening the supply of high-end DRAM and NAND and, in turn, pushing up prices across the memory end market.

For Micron, the part going to NVIDIA is its 24GB 8-Hi HBM3E.

This HBM product has a per-pin data rate of 9.2 Gb/s and peak memory bandwidth above 1.2 TB/s. Compared with HBM3, HBM3E raises both the data rate and the peak bandwidth by about 44%, which is particularly important for bandwidth-hungry processors such as NVIDIA's H200.
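The relationship between pin speed and the headline bandwidth figure is simple arithmetic. A quick sanity check in Python (the 1024-bit per-stack interface and the 6.4 Gb/s HBM3 baseline are standard JEDEC figures, not stated in the article):

```python
def peak_bandwidth_gb_s(pin_speed_gbit_s: float, bus_width_bits: int = 1024) -> float:
    """Peak per-stack bandwidth in GB/s: pin rate (Gb/s) x bus width (bits) / 8 bits per byte."""
    return pin_speed_gbit_s * bus_width_bits / 8

# Micron HBM3E: 9.2 Gb/s pins on a 1024-bit bus
print(peak_bandwidth_gb_s(9.2))   # 1177.6 GB/s, i.e. just under 1.2 TB/s
# HBM3 baseline at 6.4 Gb/s; 9.2/6.4 - 1 is the ~44% uplift cited above
print(peak_bandwidth_gb_s(6.4))   # 819.2 GB/s
```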

HBM3E is an extended version of HBM3 with a memory capacity of 144GB, delivering bandwidth of about 1.15TB per second, the equivalent of processing 230 5GB full-HD movies in one second.

As faster, higher-capacity memory, HBM3E can accelerate generative AI and large language models while also advancing scientific computing for HPC workloads. HBM stacks multiple DRAM dies connected by through-silicon vias (TSVs), dramatically increasing data throughput.

On August 9, 2023, Jensen Huang unveiled the GH200 Grace Hopper superchip, which also marked the first appearance of HBM3E. With the 36GB 12-Hi HBM3E product due in March 2024, Micron's AI memory roadmap is further solidified.

Micron's HBM3E memory is built on its 1β (1-beta) process, uses TSV packaging and 2.5D/3D stacking, and delivers 1.2TB/s or higher bandwidth. Deploying its latest production node in data center-class products is a significant achievement for Micron and a proof point for its manufacturing capabilities.

According to official information from Micron, the new HBM3E product now entering NVIDIA's supply chain has three advantages over competing parts. First, outstanding performance: pin speeds above 9.2Gb/s and memory bandwidth above 1.2TB/s meet the extreme performance requirements of AI accelerators, supercomputers, and data centers.

Second, superior energy efficiency: Micron's HBM3E consumes about 30% less power than competing products. Cutting power draw while delivering maximum throughput (detailed figures have not been disclosed) improves data centers' operating-cost metrics.

Third, seamless scalability: with 24GB of capacity per stack today, the product makes it easy to scale AI applications in the data center, from training large-scale neural networks to providing the bandwidth needed to accelerate inference.

Sumit Sadana, Executive Vice President and Chief Business Officer of Micron Technology, said: "With this HBM3E milestone, Micron has delivered a trifecta: time-to-market leadership, strong industry performance, and differentiated energy efficiency."

AI workloads heavily rely on memory bandwidth and capacity, and Sadana believes that Micron Technology is in a favorable position to support the significant growth of AI in the future through their industry-leading HBM3E and HBM4 roadmap, as well as their complete DRAM and NAND solution portfolio for AI applications.

Micron's biggest competitors in the HBM industry are SK Hynix and Samsung; within NVIDIA's AI accelerator supply chain specifically, its main rival is SK Hynix.

Although Samsung has provided test samples to NVIDIA, as of February 29 the results of NVIDIA's qualification testing of Samsung's HBM3E had not been disclosed. Industry sources indicate results are expected in March. Notably, if Samsung's product passes NVIDIA's qualification testing, it would supply NVIDIA's B100 GPU (Blackwell architecture, which NVIDIA plans to launch in late Q2 or early Q3 this year).

On February 20, it was reported that SK Hynix had completed development of HBM3E in mid-January and passed NVIDIA's six-month product performance evaluation. SK Hynix plans to begin mass production of the fifth-generation HBM3E in March and ship the first batch to NVIDIA in April.

Semiconductor product development is divided into nine phases (Phases 1-9); SK Hynix has completed all of them for HBM3E and entered the final ramp-up stage.

A brief review of the NVIDIA H200's specifications: based on the Hopper architecture, it delivers the same compute performance as the H100, but is equipped with 141GB of HBM3E memory with up to 4.8TB/s of bandwidth, a significant upgrade over the H100's 80GB of HBM3 (3.35TB/s).
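The scale of that H200 upgrade can be quantified directly from the figures above (a quick check using only numbers from this article):

```python
# Memory specs as cited in the article
h100 = {"memory_gb": 80, "bandwidth_tb_s": 3.35}   # HBM3
h200 = {"memory_gb": 141, "bandwidth_tb_s": 4.8}   # HBM3E

capacity_gain = h200["memory_gb"] / h100["memory_gb"] - 1
bandwidth_gain = h200["bandwidth_tb_s"] / h100["bandwidth_tb_s"] - 1
print(f"capacity +{capacity_gain:.0%}, bandwidth +{bandwidth_gain:.0%}")
# capacity +76%, bandwidth +43%
```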

Two Korean Giants Sell Out HBM3E Capacity

On February 21, Kim Ki-tae, Vice President of SK Hynix, pointed out that the demand for HBM as an AI memory solution is experiencing explosive growth as generative AI services become more diverse and continue to develop.

Public information shows that SK Hynix's HBM capacity for 2024 has also been completely sold out.

Kim Ki-tae stated that HBM has high efficiency and high capacity characteristics. From both a technical and business perspective, HBM is considered a milestone. "Although external instability factors still exist, the memory market is expected to gradually heat up in 2024," Kim Ki-tae noted, "due to the recovery of product demand from global tech giants."

On February 23, SK Hynix's management issued a statement on HBM memory sales: even though planned 2024 capacity is being expanded ahead of schedule, current production and sales volumes have already reached saturation.

Kim Ki-tae said that as the HBM industry leader, SK Hynix has insight into the scale of market demand for HBM and has already adjusted production in advance to meet it. He added that "besides HBM3E, DDR5 and LPDDR5T memory will also be highly sought after in the market this year."

SK Hynix expects to maintain its market-leading position in 2025. Some institutions predict that SK Hynix's HBM revenue will reach $7.5 billion in 2024. In 2023, revenue from SK Hynix's flagship products, DDR5 DRAM and HBM3, grew more than 4x and 5x respectively over 2022 (source: SK Hynix's 2023 financial report).

At the end of December 2023, on Micron's earnings call, CEO Sanjay Mehrotra revealed that strong cloud demand for HBM in high-performance AI chips, fueled by generative AI, had already sold out Micron's 2024 HBM capacity. HBM3E, which entered mass production at the start of 2024, is expected to generate hundreds of millions of dollars in revenue in fiscal 2024.

Currently, HBM has developed to the fifth generation (HBM3E is an extension of HBM3). Previous generations of products include the first generation (HBM), the second generation (HBM2), the third generation (HBM2E), and the fourth generation (HBM3).

The sixth-generation product, HBM4, will use a 2048-bit interface, raising theoretical peak bandwidth per stack above 1.5TB/s. To reach that target, HBM4 needs a data transfer rate of only about 6GT/s per pin, which will help keep next-generation DRAM power consumption under control.

Micron's HBM products are expected to generate hundreds of millions of dollars in revenue in fiscal 2024, with HBM contributing positively to Micron's DRAM business and overall gross margin beginning in the third quarter of fiscal 2024.
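The HBM4 figures above are self-consistent: the per-pin rate required for a given bandwidth falls as the interface widens, which is how a 2048-bit bus reaches 1.5TB/s at only ~6GT/s. A sketch of that arithmetic (the 1024-bit comparison width is the standard HBM3E figure, not stated in the article):

```python
def required_pin_rate_gb_s(target_tb_s: float, bus_width_bits: int) -> float:
    """Per-pin data rate (Gb/s) needed to hit a target per-stack bandwidth (TB/s)."""
    return target_tb_s * 1000 * 8 / bus_width_bits  # TB/s -> total Gb/s, spread across pins

print(required_pin_rate_gb_s(1.5, 2048))   # ~5.86 Gb/s on HBM4's 2048-bit interface
print(required_pin_rate_gb_s(1.5, 1024))   # ~11.7 Gb/s a 1024-bit bus would need
```

Halving the required per-pin signaling rate is what lets HBM4 raise total bandwidth while keeping I/O power in check.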

It is worth noting that Micron expects industry supply of DRAM and NAND in 2024 to fall short of demand. Micron's own DRAM and NAND bit supply growth in fiscal 2024 is likewise expected to trail demand growth, reducing its days of inventory over 2024.