Will 2024 be the year of the "liquid cooling" boom? Led by NVIDIA, demand for liquid cooling enters a rapid growth phase

Zhitong
2024.05.10 14:12

In 2024, under NVIDIA's leadership, demand for liquid cooling has surged, and Vertiv's stock price has soared by over 600% since 2023. Wall Street analysts believe that, driven by NVIDIA, liquid cooling will become a necessary solution in the high-performance AI server field. Vertiv has delivered strong results, with a 60% year-on-year increase in total orders in Q1, and expects year-on-year sales growth of about 12% in 2024. In the liquid cooling technology sector, Asetek also posted strong first-quarter results.

Amid the frenzy of global enterprises deploying AI technology since 2023, Vertiv (VRT.US), one of the liquid cooling solution providers for NVIDIA's (NVDA.US) most powerful AI GPU server, the GB200, has seen its stock price surge by over 600% since 2023, including a 103% gain so far in 2024. According to Wall Street analysts, under strong promotion by NVIDIA, the absolute leader in the AI chip field, liquid cooling is expected to shift from "optional" to "mandatory" in the ultra-high-performance AI server sector. This implies an immense future market for liquid cooling solutions, and leaders in the sector such as Vertiv may still have considerable room for further stock price appreciation.

Judging by its latest results and guidance, Vertiv has also delivered a very satisfying performance, pointing to a sharp increase in demand for liquid cooling technology in global AI data centers. This also indirectly shows that global enterprises' demand for NVIDIA's AI GPUs remains extremely strong. Recently, Vertiv, the supplier of NVIDIA's GB200 liquid cooling solution, reported a 60% year-on-year increase in total orders in the first quarter, with a quarter-end backlog at a record high of $6.3 billion. Q1 net sales were $1.639 billion, up 8% year-on-year, with adjusted operating profit of $249 million, up 42% year-on-year.

Not only were first-quarter orders and sales strong, but Vertiv also raised its full-year 2024 guidance above market expectations. The midpoint of its sales guidance implies growth of about 12% over the strong 2023 base, with adjusted operating profit of $1.325 billion to $1.375 billion, a midpoint increase of about 28% on top of 2023's strong growth.

In China's A-share market, liquid cooling technology leader Shenzhen Envicool (002837.SZ) also reported an exceptionally strong first quarter. During the reporting period, Envicool achieved operating income of RMB 746 million, up 41.36% year-on-year. Net profit attributable to shareholders of the listed company was RMB 61.98 million, up 146.93% year-on-year; excluding non-recurring gains and losses, net profit was RMB 54.31 million, up 169.65% year-on-year.

Looking ahead, starting in 2024 the penetration of liquid cooling solutions is expected to enter an "explosive growth" mode. According to Dell'Oro Group's February 2024 forecast, the data center thermal management market (air cooling + liquid cooling) will reach $12 billion by 2028, with liquid cooling accounting for $3.5 billion of that, nearly one third of total thermal management expenditure, up from less than one tenth today. Internationally renowned research firm IDC recently released a report stating that China's liquid-cooled server market continued to grow rapidly in 2023, reaching $1.55 billion, a 52.6% increase over 2022, with over 95% of deployments adopting cold plate liquid cooling solutions. IDC predicts that the compound annual growth rate of China's liquid-cooled server market from 2023 to 2028 will reach 45.8%, with the market reaching $10.2 billion in 2028.
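The IDC figures cited above are internally consistent; a quick sketch compounds the 2023 base at the stated CAGR (the figures come from the article, the compounding formula is standard):

```python
# Sanity check: growing the 2023 Chinese liquid-cooled server market
# ($1.55B) at a 45.8% CAGR for five years should land near the
# $10.2B that IDC projects for 2028.

def project_market_size(base: float, cagr: float, years: int) -> float:
    """Compound a base market size forward at a constant annual growth rate."""
    return base * (1 + cagr) ** years

size_2028 = project_market_size(1.55, 0.458, 5)  # billions of USD
print(f"Implied 2028 market size: ${size_2028:.1f}B")  # → $10.2B
```

The same check applied to Dell'Oro's numbers (liquid cooling growing from under one tenth to nearly one third of a $12 billion market) shows liquid cooling growing several times faster than thermal management spending overall.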

Liquid Cooling - Gradually transitioning from an "optional" to a "mandatory" choice for AI server cooling

Currently, AI servers worldwide that use NVIDIA's H100 AI GPU employ a variety of cooling solutions, but air cooling remains the mainstream. Although liquid cooling is gradually gaining ground thanks to its advantages in high-performance computing, such as more effective heat management and better energy efficiency, liquid-cooled servers have not yet been deployed across all systems using the H100 GPU.

In the era of NVIDIA's new Blackwell architecture GPUs (B100/B200/GB200), with the significant jump in AI GPU performance, air cooling has, from a theoretical and technical perspective, nearly reached its limit, ushering in the era of liquid cooling. As liquid cooling transitions from "optional" to "mandatory" in the AI server field, it will greatly expand the market and become one of the important sub-sectors of AI computing power. Overall, liquid cooling not only keeps AI GPU servers running efficiently 24/7 at optimal performance but also helps extend hardware lifespan.

The performance of NVIDIA's GB200 supercomputing server is in a class of its own among global computing systems. Built from two B200 AI GPUs and NVIDIA's self-developed Grace CPU, the GB200 can boost inference performance on large language model (LLM) workloads by up to 30 times. Compared with the previous-generation Hopper architecture, the GB200's cost and energy consumption are reduced by about 25 times. On the GPT-3 LLM benchmark with 175 billion parameters, the GB200's inference performance is 7 times that of an H100 system, and it trains 4 times faster than an H100 system.

Such a powerful performance leap means air cooling modules are no longer sufficient to dissipate the heat of these computing systems, which is an important factor in NVIDIA's decision to adopt liquid cooling at scale for the GB200 AI GPU servers entering production in September.

As AI and machine learning algorithms become more complex, demand for AI computing power is also growing rapidly. Especially when training large AI models or running large-scale AI inference, AI servers require high-performance GPUs to handle these computationally intensive tasks. These high-performance AI GPUs (such as NVIDIA's GB200) generate a large amount of heat during operation and require effective cooling to maintain operating efficiency and hardware lifespan. Liquid cooling systems transfer heat from GPUs and other heat sources to heat sinks more quickly and efficiently, reducing heat accumulation and significantly lowering the risk of transistor burnout, thereby keeping GPUs running at high performance over the long term.
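A first-order energy balance illustrates why liquid cooling scales where air cooling struggles. This sketch is not from the article: the fluid properties are standard textbook values, while the 1,000 W chip power and 10 K coolant temperature rise are assumed example figures.

```python
# The heat a coolant stream removes is Q = m_dot * c_p * dT, so the mass
# flow needed for a given chip power is m_dot = Q / (c_p * dT).

def required_flow_kg_s(power_w: float, cp_j_per_kg_k: float, delta_t_k: float) -> float:
    """Mass flow rate needed to absorb power_w with a delta_t_k coolant temperature rise."""
    return power_w / (cp_j_per_kg_k * delta_t_k)

CP_WATER = 4186.0  # specific heat of water, J/(kg*K)
CP_AIR = 1005.0    # specific heat of air, J/(kg*K)

# Assumed example: a 1000 W chip, 10 K allowable coolant temperature rise.
water = required_flow_kg_s(1000.0, CP_WATER, 10.0)  # ~0.024 kg/s
air = required_flow_kg_s(1000.0, CP_AIR, 10.0)      # ~0.1 kg/s

# Water needs ~4x less mass flow; since water is also ~800x denser than
# air, the required volumetric flow is smaller by orders of magnitude.
print(f"water: {water * 1000:.1f} g/s, air: {air * 1000:.0f} g/s")
```

The same arithmetic explains the article's point about Blackwell-class parts: as per-GPU power rises, the airflow needed to hold temperatures becomes impractical, while a modest liquid loop still suffices.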

From a technical perspective, the mainstream view in the industry is that cold plate (indirect) liquid cooling is expected to achieve broad penetration ahead of direct liquid cooling. Liquid cooling systems are classified by how the liquid contacts the hardware: direct liquid cooling includes immersion and spray cooling, while indirect liquid cooling mainly refers to cold plate solutions. Cold plate technology is mature, requires no changes to existing server form factors, has low processing complexity and cost, and its cooling capability and power consumption meet the requirements of AI servers, so it is likely to be rolled out first.

According to a research report by the renowned institution Markets And Markets, the global data center liquid cooling market is expected to grow from $2.6 billion in 2023 to at least $7.8 billion in 2028, a compound annual growth rate of 24.4% over the forecast period. Markets And Markets noted that the rise of AI servers, edge computing, and Internet of Things (IoT) devices creates a need for compact and efficient cooling solutions; liquid cooling's advantage is that, by cooling small devices and servers, it can handle large volumes of data effectively even in challenging conditions. Overall, driven by modern data centers' need to process massive amounts of data, the data center liquid cooling market is propelled primarily by demands for better cooling efficiency, energy efficiency, scalability, sustainability, and higher-performance hardware such as GPUs.

Wall Street analysts are generally optimistic that massive investments by global enterprises in AI technology will support the continuous expansion of data center capacity. This is a significant positive for Vertiv, as most of the company's revenue comes from sales of data center power management, IT liquid cooling, and hybrid cooling systems used in data centers; its core business is providing power management and various cooling technologies for data centers worldwide.

Vertiv is currently dedicated to developing advanced liquid cooling solutions for AI data centers. Public information shows that Vertiv is collaborating with AI chip giant NVIDIA (NVDA.US) on advanced liquid cooling solutions for next-generation NVIDIA AI GPU-accelerated data centers, expected to be suitable for the GB200. Vertiv's high-energy-density power and cooling solutions are designed to let NVIDIA's next-generation GPUs run the most computationally intensive AI workloads with optimal performance and high availability.

According to institutional data, Wall Street analysts have given Vertiv 8 "buy" ratings, 1 "hold" rating, and no "sell" ratings, for a consensus rating of "strong buy". The most optimistic price target is as high as $102 (the stock closed at a record high of $97.94 on Thursday). Analyst Noah Kaye of Oppenheimer & Co. emphasized that the "mega-trend of artificial intelligence" is expanding the potential market for AI data center capacity, and expects that by 2026 Vertiv's addressable high-density computing market alone will reach $25 billion.

This Chinese liquid cooling technology leader is favored by Wall Street giant Goldman Sachs

Goldman Sachs believes that artificial intelligence, the fuel powering the global stock market, is far from exhausted. In its latest forecast report, the firm stated that the global stock market is currently only in the first stage of the AI-led investment frenzy, and that this frenzy will continue to expand into the second, third, and fourth stages, boosting more and more industries worldwide.

"If NVIDIA represents the first stage of the artificial intelligence trading frenzy, the AI chip stage that benefits most directly, then the second stage will be other global companies helping to build AI-related infrastructure," the institution wrote. "The third stage is expected to be companies incorporating artificial intelligence into their products to increase revenue, while the fourth stage will see a comprehensive AI-driven improvement in production efficiency, a prospect achievable in many companies worldwide."

In the second stage of the artificial intelligence investment frenzy, the focus is on companies involved in AI infrastructure construction other than NVIDIA, including ASML, Applied Materials, semiconductor equipment manufacturers, chip manufacturers, cloud service providers, data center REITs, data center hardware and equipment companies, software security stocks, and utility companies. In this stage, Goldman Sachs specifically mentioned a Chinese listed company, Shenzhen Envicool, which focuses on precision liquid cooling technology for servers, data centers, and energy storage systems.

Renowned research firm IDC predicts that China's liquid-cooled server market will grow at a 45.8% compound annual rate from 2023 to 2028, reaching $10.2 billion in 2028. IDC's data show that, driven by industry demand and policy support, China's liquid-cooled server market expanded further in 2023, and the partners in the liquid cooling ecosystem are becoming more diverse, reflecting a very positive market attitude toward data center liquid cooling. As Chinese AI enterprises and organizations demand ever more from intelligent computing centers in construction and computing power supply, the energy consumption of such data centers has risen significantly. This calls for more efficient liquid cooling systems to maintain suitable operating temperatures; otherwise, lifecycle management and operation of large-scale deployments will face significant challenges.

Liquid cooling technology is popular worldwide, indicating strong demand for NVIDIA AI GPUs

Leaders in the liquid cooling field such as Vertiv and Asetek have delivered exceptionally strong performance data, and analysts' bullish expectations for Vertiv have increased, indicating a surge in demand for liquid cooling heat dissipation technology in global data centers, especially AI data centers. This also indirectly indicates the extremely strong demand from global enterprises for NVIDIA's AI GPUs based on the Hopper architecture and the latest Blackwell architecture.

Goldman Sachs predicts that capital investment in cloud computing this year by four major tech companies, Microsoft, Google, Amazon's AWS, and Facebook parent Meta, will reach a staggering $177 billion, far exceeding last year's $119 billion, and will climb further to an astonishing $195 billion by 2025.

According to media reports, Microsoft and OpenAI are in detailed negotiations on a mega-scale global data center project costing up to $100 billion. This project will include a supercomputer temporarily named "Stargate," which will be the largest-scale supercomputing facility planned by the two leaders in the AI field over the next six years.

Undoubtedly, this behemoth-level AI supercomputer will be equipped with millions of units of core hardware: NVIDIA's continuously upgraded AI GPUs, providing the computing power for OpenAI's future, more powerful GPT large models and more disruptive AI applications such as ChatGPT and Sora. Although supply bottlenecks are gradually easing and incremental demand for AI GPUs may stabilize, the underlying hardware market is still expanding, and the shortage of NVIDIA's high-performance AI GPUs may be difficult to fully resolve in the next few years. This is also a key part of the logic behind major Wall Street firms such as Goldman Sachs seeing NVIDIA breaking through the $1,100 mark in the coming year (NVIDIA closed at $887.47 on Thursday).

In particular, the rapid iteration of large AI models and AI software means that software developers will inevitably continue to procure or upgrade AI GPU systems, so the AI hardware market is expected to remain extremely large in the coming years. According to the latest forecast from market research firm Gartner, the AI chip market is expected to grow 25.6% year-on-year to $67.1 billion in 2024, and by 2027 to more than double its 2023 size, reaching $119.4 billion.