The AI wave is sweeping the globe and driving a surge in memory demand. Micron's revenue soared 82%, but its guidance failed to impress Wall Street
Micron Technology, the largest computer memory chip maker in the United States, has released its latest financial report, showing strong fundamentals that nevertheless fell short of Wall Street's loftiest expectations. Quarterly revenue rose 82% to $6.81 billion, beating analysts' estimates, and Micron expects memory demand to keep growing, driven by artificial intelligence. However, the company's guidance fell short of some analysts' hopes of more than $8 billion in revenue for the next fiscal quarter.
According to Zhitong Finance APP, shares of Micron Technology (MU.US), the largest memory chip maker in the United States, fell more than 9% in after-hours US trading after the company released its latest financial report. Both the quarterly results and the outlook point to an extremely strong financial foundation: as enterprises worldwide pour money into AI, memory demand has entered a phase of rapid growth. But Wall Street's expectations for this report were very high, and although Micron's core metrics beat estimates across the board, its guidance for the next quarter failed to satisfy the loftiest forecasts of some investment institutions.
The report shows that in the fiscal third quarter ended May 30, Micron's total revenue grew 82% to $6.81 billion. The Boise, Idaho-based memory chip giant reported non-GAAP earnings of $0.62 per share, excluding certain items, versus a loss of $1.43 per share a year earlier and a profit of $0.42 per share in the prior quarter. Wall Street analysts had on average expected revenue of about $6.67 billion and earnings of $0.50 per share, so Micron's actual results comfortably beat expectations.
In a slide presentation, the company said it expects PC sales to keep recovering with low single-digit growth in 2024, while smartphone sales are expected to recover with low- to mid-single-digit growth. It also forecasts that by 2025, AI features will help spur a broad upgrade cycle in smartphones and personal computers, pushing demand for both DRAM and NAND into a new growth phase. As a company focused squarely on memory, Micron stands to benefit across the board from this trend.
As for the next quarter, the outlook section of the announcement calls for fourth-quarter revenue of $7.4 billion to $7.8 billion. Analysts' average estimate is around $7.58 billion, so the guidance is broadly in line with consensus. Some analysts, however, had expected revenue above $8 billion, which is an important reason the stock fell sharply after the outlook was announced. Citigroup, for example, had named Micron a "top pick" and forecast fourth-quarter revenue above $8 billion. Excluding certain items, Micron expects fourth-quarter non-GAAP earnings of about $1.08 per share, plus or minus $0.08, above the Wall Street average estimate of $1.02.
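As a quick sanity check of the guidance arithmetic above, the midpoint of the revenue range and the EPS band work out as follows (a minimal sketch using only the dollar figures quoted in the announcement):

```python
# Midpoint of Micron's Q4 revenue guidance range (figures as reported above).
low, high = 7.4, 7.8                  # guidance range, billions of USD
midpoint = (low + high) / 2           # 7.6, vs. the ~7.58 analyst consensus

# Non-GAAP EPS guidance of $1.08 plus or minus $0.08.
eps_mid, eps_band = 1.08, 0.08
eps_low, eps_high = eps_mid - eps_band, eps_mid + eps_band

print(f"revenue midpoint: ${midpoint:.2f}B")            # $7.60B
print(f"EPS range: ${eps_low:.2f} to ${eps_high:.2f}")  # $1.00 to $1.16
```

The midpoint of $7.6 billion sits right at the consensus of about $7.58 billion, which is why the guidance reads as "in line" even though it disappointed the most bullish forecasts.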
Despite benefiting significantly from enterprises' global push to deploy AI, Micron has yet to see strong growth in traditional markets such as personal computers and smartphones, which have only just begun to recover from last year's historic slump in demand. The "year of the AI PC and AI smartphone" beginning in 2024 is expected to push memory demand in these two traditional markets into a phase of explosive growth.
After the quarterly results and outlook were announced, the stock fell more than 9% in after-hours US trading, though the decline later narrowed to around 7%. Before the report, Micron's stock, a major beneficiary of the AI boom, had surged 67% so far this year. Investors expect the company to be one of the chief beneficiaries of AI spending, and top Wall Street institutions have sharply raised their 12-month price targets on Micron, the most optimistic reaching $225 (Micron closed at $142.36 on Wednesday).
Micron CEO Sanjay Mehrotra reiterated his optimistic outlook for the memory industry, saying 2024 will mark a significant rebound for memory chips and that sales in 2025 are expected to reach a record high.
The AI boom is driving demand for expensive HBM memory systems, highly complex 3D-stacked chip assemblies that absorb a significant share of chipmakers' production capacity. That markedly reduces the risk of future inventory gluts, a long-standing problem in the memory industry. Mehrotra stressed that the surge in HBM demand will accelerate the expansion of HBM capacity, with profound implications for broader DRAM and NAND capacity and pricing: supply of DRAM and NAND is gradually falling behind demand, and prices are expected to rise steadily.
Micron sold roughly $100 million of HBM3E memory in the just-ended quarter and expects total HBM sales to rise to "hundreds of millions of dollars" this quarter. By fiscal 2025 (ending in August of that year), sales of HBM, a sub-segment of DRAM, are projected to reach several billion dollars.
Because it sells HBM memory systems along with the broad range of DRAM and NAND products that AI hardware infrastructure requires, Micron is positioned to benefit fully from this unprecedented surge in AI spending. HBM is a crucial component of AI infrastructure: alongside the H100/H200/GB200 AI GPUs from chip leader NVIDIA (NVDA.US), it is indispensable core hardware for heavyweight AI applications such as ChatGPT and Sora. On the back of nearly insatiable demand for its full line of AI GPUs, NVIDIA has become the world's most valuable chip company. HBM delivers data faster, helping the computing systems that develop and run large AI models.
Large AI models are typically created with data-intensive software and dense matrix operations involving tens of trillions of parameters, and they rely heavily on HBM. AI inference workloads involve massively parallel computing patterns and likewise depend on HBM for high-bandwidth, low-latency, energy-efficient memory. To avoid computational bottlenecks and keep expensive processors running at full speed, Micron and its competitors developed HBM, which communicates with other components far faster than conventional memory.
South Korea is home to two of the world's largest memory chip makers, SK Hynix and Samsung; global HBM leader SK Hynix has become NVIDIA's core HBM supplier. The HBM in NVIDIA's H100 AI GPU is produced by SK Hynix, and the H200 and the latest Blackwell-architecture B200/GB200 AI GPUs will also carry SK Hynix's latest-generation HBM3E. The other major HBM3E supplier is US memory giant Micron, whose HBM3E is highly likely to be used in NVIDIA's H200 and the powerful new B200/GB200 AI GPUs.
In the fiercely competitive HBM market, as of 2022 the three major HBM makers held roughly 50% (SK Hynix), 40% (Samsung Electronics), and 10% (Micron) market share. Having entered the HBM field as early as 2016, SK Hynix holds the bulk of the market, and industry insiders suggest its share may have risen to around 55% by the end of 2023, cementing its dominant position.
It isn't only HBM demand that is soaring! DRAM and NAND demand is also on the rise
HBM is a high-bandwidth, low-power memory technology designed specifically for high-performance computing and graphics processing. Using 3D stacking, HBM connects multiple DRAM dies with fine through-silicon vias (TSVs) for high-speed, high-bandwidth data transfer. Stacking the dies sharply reduces the memory system's footprint, lowers the energy cost of moving data, and raises transfer efficiency, allowing large AI models to run around the clock more efficiently.
HBM also offers very low latency, responding quickly to data-access requests. Generative AI models such as GPT-4 frequently access large datasets and run extremely heavy inference workloads, so low latency greatly improves overall system efficiency and responsiveness. In AI infrastructure, HBM is fully integrated into NVIDIA H100/H200 AI GPU server systems, as well as the upcoming NVIDIA B200 and GB200 systems.
Data released by Statistics Korea show that Korean chip inventories plunged 33.7% year-on-year in April, the largest decline since the end of 2014. This largely reflects surging Korean memory chip exports, especially demand for chips from Samsung and SK Hynix, the two memory giants that contribute nearly 15% of Korean GDP. Demand for HBM in particular is growing far faster than supply.
Goldman Sachs recently published a research report stating that extraordinarily strong enterprise demand for generative AI is driving higher AI server shipments and greater HBM density per AI GPU. The bank has sharply raised its estimate of the total HBM market, now projecting it to grow tenfold from $2.3 billion in 2022 to $23 billion in 2026, a staggering 77% compound annual growth rate over four years. Goldman Sachs expects the HBM market to remain undersupplied for the next few years, benefiting major players such as SK Hynix, Samsung, and Micron.
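The tenfold-growth figure and the quoted compound annual growth rate are consistent with each other; a quick check, using only the dollar figures cited above:

```python
# Implied CAGR of the HBM market from the Goldman Sachs estimates cited above:
# $2.3 billion in 2022 growing to $23 billion in 2026 (a 4-year span).
start, end, years = 2.3, 23.0, 4
cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")   # ~77.8%, matching the ~77% rate quoted
```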
Samsung, the other Korean heavyweight, is the world's largest supplier of DRAM and NAND memory chips and is also striving to become one of NVIDIA's suppliers of HBM and the next-generation HBM3E. Samsung leads by a wide margin in DDR-series memory (DDR4, DDR5), one of the mainstream applications of DRAM, and in SSDs, one of the mainstream applications of NAND, with Micron likely ranking second to Samsung in both DRAM and NAND. Unlike HBM, which is widely used in AI data centers, DDR memory mainly serves as main memory in PC systems, providing the capacity and bandwidth for multitasking and dataset processing on consumer devices, while the LPDDR (Low Power DDR) series is used in smartphones.
Korean chip export data make the strength of memory demand even clearer. Early trade figures show that in the first 20 days of June, chip product sales rose 50.2% year-on-year, again leading export growth, driven mainly by demand from smartphone makers, data-center operators, and AI developers, which has lifted both memory chip volumes and selling prices.

Since 2023, the global AI boom has driven a surge in demand for AI servers. Top data-center server makers such as Dell Technologies (DELL.US) and Super Micro Computer (SMCI.US) typically use Samsung and Micron DDR-series products in their AI servers, along with Samsung and Micron SSDs, used extensively in server main storage, while SK Hynix's HBM is fully integrated into NVIDIA's AI GPUs. This is the key logic behind the surge in demand for HBM, and for DRAM and NAND as a whole.
DRAM mainly serves as a computing system's main memory, holding temporary data and intermediate results for CPUs and GPUs and handling data loading and preprocessing. NAND, though slower to read and write than DRAM and HBM, offers large capacity at low cost, making it the ideal choice for long-term storage. In generative AI systems, NAND typically stores large training and inference datasets and trained models; when a training or inference workload runs, data is rapidly loaded into DRAM or HBM for processing.
The trend of large AI models, led by Apple Intelligence, moving into consumer electronics is expected to drive a surge in DRAM and NAND demand, which is why Micron's stock has kept rising since Apple's WWDC.
According to Morgan Stanley, if the base model stays at around 3 billion parameters, the base iPhone 16 is expected to upgrade from 6GB of DRAM in the iPhone 15 to 8GB (likely the minimum configuration needed to run Apple's on-device AI models); and given the limited memory capacity supported by the M2 chip (192GB), Apple's growing fleet of AI servers will consume large amounts of LPDDR5.
The AI roadmap Apple showed at WWDC implies that from 2024, on-device AI models will gradually move into consumer terminals such as PCs, smartphones, and smartwatches, and may soon even reach humanoid robots, ushering in the era of embodied AI. Memory capacity demand across major terminals may grow exponentially. This is the core logic behind research institutions repeatedly raising their memory chip demand forecasts for the coming years since Apple's WWDC.
In a recent research report, Morgan Stanley stressed that surging AI demand, combined with two years of severely depressed capital spending by major memory makers during the demand winter, is pushing the memory market into an unprecedented "super cycle." Morgan Stanley expects that from 2025, the AI upgrade cycle in smartphones and personal computers may require additional memory capacity, producing a severe supply shortage: it projects a shortage rate of 11% for HBM and 23% for the DRAM market as a whole.

The latest industry outlook from World Semiconductor Trade Statistics (WSTS) shows the global semiconductor market on track for a very strong recovery in 2024. WSTS has raised its 2024 sales forecast significantly from its end-2023 estimate, now predicting a market of $611 billion, a substantial 16% increase over the prior year.
WSTS said the revised 2024 outlook reflects strong performance over the past two quarters, especially in the computing end market. After a significant contraction in 2023, WSTS expects two core product categories to drive double-digit growth in 2024: logic chips, including CPUs and GPUs, up 10.7%, and memory chips, dominated by DRAM and NAND, surging 76.8%.
Looking to 2025, WSTS predicts global semiconductor sales could reach $687 billion, roughly 12.5% growth on top of 2024's already strong recovery. It still expects the growth to be driven mainly by the memory and logic categories, each of which could exceed $200 billion in 2025 with the help of the AI boom. WSTS expects memory chip sales, dominated by DRAM and NAND, to grow more than 25% in 2025, and logic chips, including CPUs and GPUs, more than 10%, with discrete devices, optoelectronics, sensors, analog, and all other sub-markets growing in the single digits.
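The WSTS growth rates quoted above can be cross-checked from the market-size forecasts themselves (a minimal sketch using only the figures cited):

```python
# Cross-check of the WSTS forecast arithmetic cited above.
size_2024 = 611.0   # forecast global semiconductor market, billions of USD
size_2025 = 687.0   # forecast for 2025
growth_2025 = size_2025 / size_2024 - 1
print(f"implied 2025 growth: {growth_2025:.1%}")  # ~12.4%, roughly the 12.5% cited
```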
Micron Technology, one of the "AI Three Knights," may still have plenty of room to run in its stock rally!
International brokerage Mizuho Securities recently published a research report saying that Nvidia is undoubtedly the leader in generative AI, but Ethernet chip giant Broadcom and memory giant Micron have also delivered outstanding results and stock performance recently. The memory chips Micron supplies are important enough to AI training and inference systems to rival Nvidia's AI GPUs, and together the three form the "AI Three Knights."
Before Micron announced its results and outlook, Wall Street institutions were broadly bullish that its stock would keep setting record highs over the next 12 months. Among them, Rosenblatt, the well-known Wall Street firm whose Nvidia price target runs as high as $200, reiterated a buy rating on Micron and gave it the highest target on Wall Street at $225.
Wolfe Research and Citigroup were both bullish on Micron ahead of the report. Wolfe Research maintained an "outperform" rating on Micron Technology and raised its price target sharply from $150 to $200. The firm's analysts said they lifted their estimates because the memory industry is in good shape and they are optimistic about the company's HBM sales.
Wolfe Research's analysts added, however, that Micron's near-term situation is not the core reason for their long-standing bullishness. They note that all memory chip suppliers have worked in recent years to limit supply, even as customers build inventory ahead of expected price increases. Wolfe Research expects Micron's earnings per share to be pushed toward $20 in fiscal 2025/2026, with HBM contributing roughly $3.
Citigroup, meanwhile, reiterated a buy rating on Micron and raised its price target from $150 to $175. Its analysts expect that, given the company's broad improvement in DRAM, its growing exposure to HBM, and its core position in artificial intelligence, the stock should continue to trade above its historical range. Bank of America also reiterated a buy rating, raising its target from $144 to $170 and noting that Micron will be the biggest beneficiary of significant HBM market-share gains.