Wallstreetcn
2023.12.21 09:10

Morgan Stanley's view on the US stock market in 2024: Will NVIDIA be ambushed on the chip industry's "inference side"?

According to Morgan Stanley, NVIDIA's revenue growth may be the biggest surprise of 2024. For the foreseeable future, NVIDIA should face no significant competitive pressure in the performance-driven market; the main thing to watch is cost competition on the inference side.

Will NVIDIA's share of the "large-model training" market be gradually eroded, putting it under pressure?

A report Morgan Stanley released last week argues that NVIDIA will not face significant competitive pressure in the performance-oriented market in the near future:

" The market for model training in AI and machine learning is not only rapidly evolving but also highly competitive, with "performance" being the key factor. Companies that can provide high-performance solutions will be more competitive. We expect NVIDIA to continue to maintain its leading position in product performance, and high-performance products will continue to maintain their premium pricing power."

NVIDIA's revenue growth may be the biggest surprise of the 2024 fiscal year:

NVIDIA guided fourth-quarter fiscal 2024 revenue (the quarter ending in January) to $20 billion. That figure not only exceeds market expectations but also surpasses the fourth-quarter revenue forecast of TSMC, the industry-leading semiconductor giant. In short, NVIDIA's guidance is not just high, but unexpectedly high.

In addition, NVIDIA's expected gross margin is 20 percentage points above market expectations. It is rare in the semiconductor industry to generate so much profit so quickly, which has drawn intense investor attention to the competitive landscape: whether, when, and how other companies might compete away this excess profit.

NVIDIA's B100 holds a clear performance lead, while Amazon's and Microsoft's custom ASIC chips will not reach mass production until February next year. In a model-training market that values technical capability over cost, falling behind means getting beaten:

"We expect NVIDIA to not face significant competitive pressure in the performance-oriented market (such as the AI model training market) in the near future. The newly launched B100 product by NVIDIA will have a significant improvement in performance compared to its already leading H100 product, which will dampen the competitive enthusiasm of other companies.

Amazon and Microsoft recently announced custom ASIC chips that will not reach mass production until February 2024. By then, those products will be far behind the state of the art and unable to compete with NVIDIA's technology. In a model-training market that values technical capability over cost, falling that far behind is of no value."

NVIDIA's DNA is fundamentally different from that of other companies, and every new NVIDIA launch directly undercuts the appeal of rival products:

"History is the prologue to the future, and NVIDIA's continuous efforts have repeatedly raised the standards of its competitors. In the 1990s and early 2000s, there were dozens of GPU suppliers. With industry consolidation, many of NVIDIA's early competitors either ceased operations or were acquired by companies such as AMD and Intel." During the period from 2010 to 2014, although NVIDIA's annual revenue reached approximately $4 billion, which was considerable, compared to Intel's market value at that time (about ten times that of NVIDIA, exceeding $100 billion), this market size was not enough to prompt Intel to invest heavily in competition. Before AI became a mainstream trend, other large semiconductor companies, such as Intel, invested less in the GPU market.

By 2014 and 2015, when AI was becoming a hot topic in technology (AI's share of venture-capital funding doubled between 2012 and 2015), NVIDIA had already built a near-monopoly business around GPUs. Since introducing CUDA in 2006 and effectively creating the GPU computing market, the company's DNA has been fundamentally different from that of its peers.

Around 2015 it became widely understood that the applications of graphics processors would rapidly expand beyond gaming. We saw a resurgence of startups and renewed attention to the data-center GPU market from Intel, AMD, and the cloud service providers. Yet apart from Google's TPU, competing products have struggled to challenge NVIDIA in training. Our long-running research into NVIDIA-like solutions finds that other companies have difficulty catching up and that the gap is hard to bridge. Moreover, we repeatedly see that whenever NVIDIA releases a more powerful product and raises the industry bar, it directly undercuts the appeal of competitors' products."

In the inference market, NVIDIA's performance speaks for itself, but cost may be the key factor in fending off competition:

In inference, performance is important but cost control is equally crucial. Competition in the inference market is fierce mainly because customers care deeply about cost. NVIDIA is expected to do well here, yet customers' primary goal is to minimize cost while maintaining acceptable processing speed (latency). For large inference workloads, major customers can anticipate future demand, which gives them both the time and the incentive to do deep software optimization and drive down serving costs.
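To make that cost logic concrete, here is a minimal sketch in Python using entirely hypothetical figures (the GPU hourly cost and per-GPU throughput below are illustrative assumptions, not numbers from the Morgan Stanley report). It shows how software optimization that raises throughput within the same latency budget translates directly into lower cost per token served.

```python
# Minimal sketch with made-up numbers: serving cost falls in proportion
# to the throughput gained from software optimization at a fixed latency budget.

def cost_per_million_tokens(gpu_hourly_cost: float, tokens_per_second: float) -> float:
    """Dollars spent per one million generated tokens on a single GPU."""
    tokens_per_hour = tokens_per_second * 3600
    return gpu_hourly_cost * 1_000_000 / tokens_per_hour

# Illustrative assumptions only -- not vendor pricing or published benchmarks.
GPU_HOURLY_COST = 2.50       # $/GPU-hour, assumed
BASELINE_THROUGHPUT = 500    # tokens/s per GPU before optimization, assumed
OPTIMIZED_THROUGHPUT = 1500  # tokens/s after batching/quantization work, assumed

baseline = cost_per_million_tokens(GPU_HOURLY_COST, BASELINE_THROUGHPUT)
optimized = cost_per_million_tokens(GPU_HOURLY_COST, OPTIMIZED_THROUGHPUT)

print(f"baseline:  ${baseline:.2f} per 1M tokens")        # ~$1.39
print(f"optimized: ${optimized:.2f} per 1M tokens")       # ~$0.46
print(f"cost reduction: {1 - optimized / baseline:.0%}")  # ~67%
```

Under these assumptions, tripling per-GPU throughput cuts serving cost by roughly two-thirds, which is why customers with large, predictable workloads invest so heavily in software optimization.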

The key to success in the inference market: a deep understanding of the workload and targeted hardware/software optimization. In this fiercely competitive market, the main success story is Google, and Google succeeds because it understands its own workloads better than anyone else. Google invented the Transformer model (in 2017), and its fifth-generation TPU (Tensor Processing Unit) silicon is optimized for it. With the Transformer now so important, Google's TPU has been very successful in the market, a success that other companies find difficult to replicate.

In addition, the report mentioned two potential tailwinds:

  1. In the first half of 2024, NVIDIA plans to hold a launch event for the B100 GPU. The introduction of this new product is expected to showcase NVIDIA's competitiveness and may lead to an increase in the average selling price of its products.

  2. In the second quarter of 2024, NVIDIA will begin selling the H200 GPU, equipped with 141GB of HBM3e memory. This new GPU is expected to reaffirm NVIDIA's leadership in large language model inference.