Zhitong
2023.11.22 00:07

Performance continues to explode! The AI boom is unstoppable, and "shovel seller" Nvidia is raking in the cash.

NVIDIA, known as the "undisputed champion" in the AI chip field, has once again announced a strong quarterly performance and a significantly better-than-expected outlook.

Zhitong App has learned that Nvidia (NVDA.US), the undisputed leader in the AI chip field, has once again delivered strong quarterly results and a performance outlook that far exceeded market expectations. The emergence of the groundbreaking generative AI model ChatGPT signals the world's gradual entry into a new era of AI. Not only the technology industry but industries around the world have seen demand surge for Nvidia's GPUs, specifically the A100/H100 chips used for AI training and inference, leading the tech giant to post astonishingly strong results for the third quarter in a row.

The results show that for the third quarter ended October 29, Nvidia's revenue more than tripled year-on-year to $18.1 billion, while non-GAAP earnings per share, which exclude certain items, came in at $4.02. Both figures significantly exceeded analyst expectations of roughly $16 billion in revenue and $3.36 in earnings per share.
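For a rough sense of the size of the beat, based only on the rounded figures quoted above (the exact consensus values may differ slightly):

$$
\frac{18.1}{16.0} - 1 \approx 13\% \ \text{(revenue)}, \qquad \frac{4.02}{3.36} - 1 \approx 20\% \ \text{(EPS)}
$$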

As the world enters the AI era, the data center business has become Nvidia's core business, displacing the gaming graphics card business the company long depended on. Nvidia's data center segment, which supplies A100/H100 chips to data centers worldwide, was once regarded as a "side business" (gaming had been Nvidia's most important business since its founding), but it has now become the most powerful contributor to the company's overall revenue.

The data center business is the standout among all of Nvidia's segments: its Q3 revenue reached $14.5 billion, up approximately 279% from the same period last year. Meanwhile, the gaming business benefited from the recovery in the chip industry, with revenue growing 81% year-on-year to $2.86 billion.

Nvidia, the world's most valuable chip company, forecast total revenue of approximately $20 billion for the fourth quarter of fiscal 2024 (the quarter ending in January). That figure exceeds the Wall Street consensus of $17.9 billion, although some analysts' estimates ran as high as $21 billion.

This incredibly strong outlook underscores NVIDIA's position as the biggest beneficiary of the global AI boom, making it the "strongest shovel seller" of core AI infrastructure. Faced with surging demand for generative AI products such as ChatGPT and Google Bard, as well as other AI software and AI-assisted tools, data center operators around the world are doing their utmost to stock up on the company's GPU accelerators, which excel at the heavy workloads artificial intelligence requires.

NVIDIA's AI chips drive revenue growth - analysts expect the company's high revenue growth to continue into next year.

In the secondary market, however, NVIDIA investors gave a lukewarm response to the latest quarterly data and outlook. Although the quarterly report beat analysts' average expectations, it failed to meet the loftier hopes of analysts and investors betting on the AI boom. Some investors may also be following the classic "buy the rumor, sell the news" playbook now that actual results are out. After the report was released, NVIDIA's shares fell as much as 6.3% in after-hours trading, later trading down about 1%.

Although NVIDIA delivered another impressive quarterly report and outlook, some investors clearly expected even stronger numbers. These investors, betting heavily on seemingly insatiable demand for AI chips, have poured capital into the stock this year, driving an astonishing gain of 242%, and they are counting on the global AI boom to keep delivering explosive revenue growth. Some analysts argue this leaves investors with extremely high expectations for NVIDIA, with the stock essentially priced at a demanding level that requires "absolutely perfect results."

"Setting aside high expectations, NVIDIA's performance is still shocking." said Chris Caso, an analyst at Wolfe Research. "Considering the impact of US restrictions on China, these numbers are particularly impressive." In addition, he pointed out that NVIDIA is about to launch chips designed specifically for the Chinese market, which may continue to boost its performance in the Chinese market.

Before the report's release, NVIDIA's stock closed at $499.44 in New York on Tuesday. The company is the benchmark of the global chip industry: the best-performing stock in the Philadelphia Semiconductor Index, with a market value exceeding $1.2 trillion. In fact, Nvidia's current market value is more than $1 trillion higher than that of its competitor Intel Corp., long the world's largest chipmaker.

It is worth noting that another threat to Nvidia's business comes from US restrictions on chip exports to China. China is the world's largest chip market; however, the Biden administration has restricted exports of Nvidia's highest-performance AI chips to China, citing national security.

The US government updated the export regulations in October, aiming to make the restrictions harder to circumvent. Nvidia said the changes have not yet affected its sales, since demand for its products from other regions remains more than enough to absorb supply. They are, however, forcing the company to adjust its business, for example by developing lower-performance AI chips specifically for the Chinese market.

Nvidia reiterated on Tuesday that these requirements did not have a "meaningful impact" on the previous quarter. However, China and other restricted regions account for about a quarter of its data center revenue. The company stated, "We expect sales to these destinations to decline significantly in the fourth quarter of fiscal year 2024, but we believe that strong growth in other regions will offset this downward trend."

Nvidia's CFO, Colette Kress, said the US rules require licenses for certain exported products. The company is working with customers in the affected regions to obtain licenses for some products and to find "solutions" that do not trigger the US restrictions. She added that without the new rules on AI chip shipments to China, Nvidia's fourth-quarter outlook would have been even higher.

Analysts Kunjan Sobhani and Oscar Hernandez Tejada from Bloomberg Intelligence stated in a report that although the decline in the Chinese market in the fourth quarter is not a concern for Nvidia in the short term, it may become an area of close attention for investors.

The GPU: one of the most critical pieces of infrastructure in the AI era

"A new era of computing has begun," said Nvidia CEO Jensen Huang at an earnings conference. He emphasized that companies from around the world are shifting from general data processing methods such as CPUs to GPU systems that can handle large-scale accelerated computing tasks, as well as developing revolutionary solutions based on ChatGPT-style generative AI.

Nvidia, the undisputed leader in the AI chip field, has now delivered incredibly strong results for three consecutive quarters along with an extremely optimistic outlook. This strongly suggests that this year marks the opening stage of a comprehensive build-out of AI technology worldwide, rather than a speculative frenzy centered on tech stocks. More importantly, Nvidia's results announce to the world that we have officially entered the AI era, a new era in which computing power is king.

AI technology keeps breaking new ground, and its integration into applications keeps maturing. Companies around the world are racing to adopt advanced technologies centered on artificial intelligence to unlock new business value, optimize decision-making, and improve operational efficiency, giving rise to more diverse and customized demand for AI. According to the latest IDC data, global AI-related IT investment reached approximately $128.8 billion in 2022 and is expected to climb to $423.6 billion by 2027, a five-year compound annual growth rate (CAGR) of roughly 26.9%.
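As a quick sanity check, the quoted CAGR follows directly from the two IDC endpoint figures:

$$
\text{CAGR} = \left(\frac{423.6}{128.8}\right)^{1/5} - 1 \approx 26.9\%
$$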

As the world enters the AI era and the "Internet of Everything" accelerates, global demand for computing power is growing explosively. AI training and inference workloads in particular involve enormous numbers of matrix operations, forward and backward passes through neural networks, and other compute-intensive operations that demand high hardware performance. CPUs, which rode the benefits of Moore's Law for many years, cannot solve this problem, and neither can simply adding more of them: CPUs were designed for general-purpose computing across conventional tasks, not for parallel workloads with massive data volumes and high computational density.
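To make the contrast concrete, here is a minimal, hedged sketch (my example, not from the article) that times a dense matrix multiplication, the core operation in AI training and inference, on a CPU and, if one is present, on a GPU. It assumes PyTorch is installed; the matrix size and repeat count are arbitrary illustration choices.

```python
# Minimal sketch: dense matrix multiplication on CPU vs. GPU.
# Assumes PyTorch is installed; the GPU path runs only if CUDA is available.
import time

import torch


def time_matmul(device: str, n: int = 4096, repeats: int = 10) -> float:
    """Average seconds per n x n matrix multiplication on the given device."""
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    torch.matmul(a, b)  # warm-up so one-off setup costs are not measured
    if device == "cuda":
        torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(repeats):
        torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()  # wait for asynchronous GPU kernels to finish
    return (time.perf_counter() - start) / repeats


if __name__ == "__main__":
    cpu_time = time_matmul("cpu")
    print(f"CPU: {cpu_time * 1e3:.1f} ms per matmul")
    if torch.cuda.is_available():
        gpu_time = time_matmul("cuda")
        print(f"GPU: {gpu_time * 1e3:.1f} ms per matmul "
              f"(~{cpu_time / gpu_time:.0f}x faster on this hardware)")
```

The many-fold gap that typically shows up in such a comparison is exactly the parallelism advantage described above.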

More importantly, as global chip innovation enters the "post-Moore era," CPUs, long a driving force behind the development of human society, can no longer deliver rapid process-node leaps such as the jump from 22nm to 10nm in under five years. Further shrinks at the nanometer scale face obstacles such as quantum tunneling, which greatly limits how much more CPU performance can be upgraded and optimized.

Therefore GPUs, with their large numbers of compute cores, their ability to execute many high-intensity AI tasks simultaneously, and their strength in parallel computing, have become the most critical hardware in the chip field in recent years. GPUs hold advantages in high-performance computing workloads such as AI training and inference that other chip types cannot match, which matters most for extremely complex AI tasks such as image recognition, natural language processing, and large-scale matrix operations. Modern GPU architectures have also been optimized specifically for AI and deep learning; for example, NVIDIA's Tensor Cores accelerate critical high-intensity operations such as matrix multiplication and convolution, improving computational efficiency.
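As an illustration of the Tensor Core point, the hedged PyTorch sketch below (my example, not NVIDIA's or the article's) routes the same matrix multiplication through mixed precision, which is the usual way these units are engaged; the sizes are arbitrary.

```python
# Minimal sketch: the same matrix multiply in FP32 and in mixed precision (FP16),
# the path that engages Tensor Cores on Volta-and-later NVIDIA GPUs.
# Assumes PyTorch and a CUDA-capable GPU; does nothing otherwise.
import torch

if torch.cuda.is_available():
    a = torch.randn(8192, 8192, device="cuda")
    b = torch.randn(8192, 8192, device="cuda")

    # Plain FP32 path.
    c_fp32 = a @ b

    # Mixed-precision path: autocast casts the matmul inputs to FP16, which maps
    # the operation onto Tensor Cores and typically runs much faster.
    with torch.autocast(device_type="cuda", dtype=torch.float16):
        c_fp16 = a @ b

    print(c_fp16.dtype)  # torch.float16
    # The result matches the FP32 path to within low-precision rounding error.
    print((c_fp32 - c_fp16.float()).abs().max().item())
```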

The rise of GPUs reflects a complete reversal, amid the AI boom, in the relative status of GPUs and the general-purpose CPUs that enjoyed the benefits of Moore's Law for so many years. Since the advent of ChatGPT, as AI's influence over the global high-tech industry has grown, CPUs focused on single-thread performance and general-purpose computing have remained indispensable, but their position and importance in the chip field now fall far short of GPUs'.

From a theoretical perspective, the exponential growth trend predicted by Moore's Law has not disappeared in recent years; it has shifted from CPUs to many-core GPUs. GPU performance has continued to grow exponentially, doubling roughly every 2.2 years, whereas Intel CPU GFLOPS are still rising but look almost flat when plotted against GPU GFLOPS.
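Written as a formula, a doubling period of roughly 2.2 years means GPU performance grows as

$$
P(t) \approx P_0 \cdot 2^{\,t/2.2},
$$

so over a decade it compounds to roughly $2^{10/2.2} \approx 23\times$ the starting level, which is why anything growing much more slowly looks like a flat line on the same chart.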

GPUs have been able to sustain this exponential growth largely because artificial intelligence and deep learning demand massive parallel computation: training and inference of deep learning models consist mostly of large matrix operations, which is exactly where GPUs excel and where CPUs are limited. GPU designs devote their silicon to large numbers of compute cores that handle many operations at once, making them extremely efficient at parallel work, whereas general-purpose CPU designs prioritize the performance of individual tasks, which constrains their ability to handle highly parallel workloads.

Jensen Huang emphasized that the global shift toward artificial intelligence is only beginning, and that accelerated computing, which speeds up specific tasks through decomposition and parallel processing, is becoming dominant. On market size, the latest research from Mordor Intelligence projects that the GPU market (spanning PC, server, high-performance computing, autonomous driving, and other applications) will expand from $41.82 billion in 2023 to $172.08 billion in 2028, a compound annual growth rate (CAGR) of 32.70% over the forecast period (2023-2028).

NVIDIA has had great success selling A100/H100 GPUs to tech giants such as Microsoft and Google, but those same giants are also accelerating the development of dedicated AI chips. Google's next-generation TPU (a type of ASIC), Cloud TPU v5e, is designed to deliver cost-effective performance for medium- and large-scale training and inference. Compared with NVIDIA GPUs, Google's TPUs rely on low-precision computing, which significantly reduces power consumption and speeds up computation without significantly degrading deep-learning results. For mid-sized LLM developers in particular, that may be entirely sufficient, so they may not need to rely on NVIDIA's high-end A100/H100.
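For a concrete sense of what "low-precision computing" trades away, here is a small hedged sketch (illustrated with PyTorch rather than TPU tooling, purely for convenience) comparing float32 values with their bfloat16 counterparts, the 16-bit format TPUs are built around:

```python
# Minimal sketch: bfloat16 keeps float32's exponent range but carries far fewer
# mantissa bits, so each value needs half the memory/bandwidth at the cost of
# some rounding error. Assumes PyTorch is installed; sizes are arbitrary.
import torch

x = torch.randn(1024, 1024)          # float32 baseline
x_bf16 = x.to(torch.bfloat16)        # same values, rounded to 16 bits

print(x.element_size(), x_bf16.element_size())   # 4 bytes vs. 2 bytes per value
rel_err = ((x - x_bf16.float()).abs() / x.abs().clamp_min(1e-12)).max()
print(f"worst-case relative rounding error: {rel_err.item():.4f}")  # on the order of ~0.4%
```

Errors at that scale are usually tolerable for deep learning workloads, which is the design bet behind low-precision accelerators.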

Microsoft, for its part, unveiled its own in-house AI chip last week, following similar efforts by Amazon's AWS. This quarter, NVIDIA competitor AMD is set to launch its MI300 AI accelerator to take on the A100/H100. NVIDIA is not standing still either: it recently announced the H200, the next-generation successor to its highly sought-after H100, which is slated to launch early next year.

According to a recent research report released by market research firm Technavio, the market size of AI chips is expected to explode with a staggering compound annual growth rate of 61.51% between 2022 and 2027. The report covers chip categories including customized ASICs, GPUs, CPUs, and FPGAs, among other underlying chips.

The demand for AI chips is leading the chip industry into an upward cycle.

From the stronger-than-expected financial reports of TSMC, Samsung Electronics, and SK Hynix to the incredibly strong outlooks from Intel and NVIDIA, the chip industry's giants are unmistakably signaling a wave of recovery.

Looking at the revenue mix of TSMC, the global leader in chip manufacturing, the HPC business, which centers on high-performance server and PC chips, together with the smartphone business still accounts for the majority of revenue. That distribution means strong demand for these three types of chips (server, PC, and smartphone) would be enough to drive a recovery cycle across the entire chip industry.

AI chips are undoubtedly the main force behind the recovery in chip inventories and demand, and their extremely strong demand is expected to continue through this year and into next, as NVIDIA's orders at TSMC suggest. NVIDIA's incredibly strong outlook is the clearest reflection of global enterprises' appetite for AI chips.

In addition, because demand for AI chips has been far stronger than expected, NVIDIA has placed a large volume of advance orders with TSMC, concentrated mainly in 2024, to accelerate production of its full range of server AI GPUs. The bottleneck for the NVIDIA H100 lies in advanced CoWoS packaging capacity: research firm TrendForce expects CoWoS capacity to remain tight and strong H100-driven demand for CoWoS packaging to persist into 2024.

The latest data also show PC shipments picking up and demand continuing to recover. According to Counterpoint Research, global PC shipments grew quarter-on-quarter in the third quarter of 2023. Counterpoint reiterated that the PC market has bottomed out and should gradually recover in the coming months, especially as multiple new products launch, in particular models with AI features. Counterpoint predicts that global PC shipments will return to pre-pandemic levels next year, helped by Windows 11 replacement demand, the next wave of Arm-based PCs, and AI PCs.

In smartphones, Canalys data show the global market declined only 1% in the third quarter of 2023, indicating the downturn is easing, and leading manufacturers are expected to gradually return to growth. Canalys said that, driven by regional recoveries and upgrade demand for new products, the global smartphone market posted double-digit quarter-on-quarter growth in the third quarter ahead of the peak season. Canalys expects manufacturers to end 2023 with relatively healthy inventory levels and enough room to rebuild inventory in anticipation of a demand recovery, and it forecasts modest, cautious growth for the global smartphone market in 2024.

Numerous signs suggest the chip industry is reviving: inventories across the industry have passed the most difficult stretch and are entering a recovery phase. Demand for AI chips this year has been nothing short of extraordinary, and for NVIDIA the recovery arguably began as early as the end of last year. Artificial intelligence has been the hottest theme in technology investing this year, and major companies all talk up their ambitions in the field, yet NVIDIA is one of the few to have reaped enormous profits from the trend. Since OpenAI's ChatGPT debuted publicly in November 2022, human society has been gradually entering a new AI era, and ChatGPT has shown a much wider audience the revolutionary change AI can bring.

With the industry recovery driven by AI and PC chips, memory giants such as Micron and SK Hynix saw the cycle bottom in the second quarter of this year and conditions improve in the third quarter. For the chip industry as a whole, spanning PC CPUs, GPUs, smartphone SoCs, and the many application chips in consumer electronics, the fourth quarter is likely to be the important turning point in the industry-wide inventory and demand cycle.

For TSMC, the world's largest chip foundry, the turning point for the chip market may come in the fourth quarter, with demand from artificial intelligence acting as the long-term growth driver. More importantly, TSMC's key customers, especially its AI, PC, and smartphone chip customers, have accepted its proposed price increases for next year, which to some extent suggests that the chip industry's upcycle for next year is essentially locked in.

Silicon wafer shipments, a proxy for overall chip demand, are also expected to rebound significantly next year. SEMI (Semiconductor Equipment and Materials International) said in its latest annual silicon shipment forecast that global silicon wafer shipments are expected to fall 14% in 2023, from a record 14,565 million square inches (MSI) in 2022 to 12,512 MSI, but that shipments will rebound strongly in 2024 as wafer and semiconductor demand recovers and inventory levels normalize.

The World Semiconductor Trade Statistics (WSTS) has released its outlook for the semiconductor industry in 2024, which indicates that the global semiconductor market is expected to decline by 10.3% in 2023. However, WSTS predicts a strong recovery thereafter, with a projected growth of 11.8% in 2024. Almost all key categories, including discrete devices, sensors, analog chips, logic chips, and MCUs, are expected to show single-digit growth.