What did NVIDIA's closed-door meeting reveal? Morgan Stanley: Inference computing will boost long-term demand for AI chips

Wallstreetcn
2024.10.10 12:13

NVIDIA stated in a closed-door meeting that with the release of the OpenAI o1 model, a new AI narrative is unfolding, and rising demand for inference computing power is bringing new growth opportunities to NVIDIA. Morgan Stanley predicts that NVIDIA's market share in AI processors will continue to increase in 2024 and 2025, with shipments trending upward.

Recently, the AI boom has reignited: NVIDIA's stock has risen for five consecutive sessions and sits just a step away from a new high. The "new AI narrative", the growth of inference computing power, is opening up fresh demand for NVIDIA's chips.

On Tuesday this week, NVIDIA, the leading AI computing company, began a three-day AI roadshow in New York, with CEO Jensen Huang, CFO Colette Kress, and other members of the management team in attendance. As expected, management was enthusiastic about both the short-term and long-term prospects of AI, emphasizing continued innovation and expansion in AI computing and substantial growth ahead.

According to NVIDIA's management, we are still in the early stages of the AI cycle. With the release of OpenAI's o1 model, a new AI narrative is unfolding, shifting toward solving more complex reasoning problems. That shift will raise hardware requirements, and NVIDIA's upcoming rack-scale products are positioned as the best-suited solution.

The long-term vision for AI is that deep thinking will allow every company in the world to hire a large number of "digital AI employees" capable of performing challenging tasks.

In response, Morgan Stanley pointed out in a report:

The complexity of and demand for inference computing are growing exponentially, especially for task-oriented inference, bringing new growth opportunities for NVIDIA. NVIDIA's full-stack solution has a significant advantage in addressing such complex problems.

In the short term, the advancement of the Blackwell product line is proceeding as planned, with products sold out for the next 12 months, indicating strong market demand. NVIDIA expects strong performance in 2025 and views 2026 as the early stage of a long-term investment cycle.

Inference computing will grow exponentially with "deep thinking"

Morgan Stanley stated that management mentioned the new OpenAI o1 model multiple times; the model requires more "thinking" time during inference:

During the meeting, Jensen Huang repeatedly referred to OpenAI's recently released o1 model, which can generate a chain of thoughts before responding to a query. Its output is not bound by latency constraints, so the model can "think" for as long as needed before answering. OpenAI has not explicitly stated the cost difference for o1 inference, but some sources suggest it may be around 10 times that of GPT-4.
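The rough 10x figure is easy to reproduce with a simple token-count model: if inference cost scales with the number of tokens generated, a model that emits long hidden "thinking" traces multiplies cost even when the visible answer stays the same length. The sketch below is purely illustrative; the token counts and per-token price are assumptions, not figures from NVIDIA, OpenAI, or Morgan Stanley.

```python
# Back-of-the-envelope sketch (illustrative only, all numbers assumed):
# every generated token, visible or hidden, contributes to inference cost.

def inference_cost(visible_tokens: int, reasoning_tokens: int,
                   cost_per_1k_tokens: float) -> float:
    """Rough cost estimate: bill every generated token, visible or hidden."""
    total_tokens = visible_tokens + reasoning_tokens
    return total_tokens / 1000 * cost_per_1k_tokens

# Hypothetical query: a 500-token answer, priced at $0.03 per 1k output tokens.
baseline = inference_cost(visible_tokens=500, reasoning_tokens=0,
                          cost_per_1k_tokens=0.03)
# Same answer, but the model first "thinks" through ~4,500 hidden tokens.
reasoning = inference_cost(visible_tokens=500, reasoning_tokens=4500,
                           cost_per_1k_tokens=0.03)

print(f"baseline:  ${baseline:.3f}")              # $0.015
print(f"reasoning: ${reasoning:.3f}")             # $0.150
print(f"ratio:     {reasoning / baseline:.0f}x")  # ~10x under these assumptions
```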

The bigger picture for NVIDIA's growth story is that the company is excited about the capabilities that models trained on Blackwell or Rubin systems will achieve over the next two to three generations. On the inference side, serving a GPT-6-level model with these advanced features at low latency may require an order of magnitude more computing power than today's leading levels of computational intensity.

Furthermore, Morgan Stanley stated that NVIDIA's long-term vision is that, over the next decade, companies will employ thousands of "digital employees" to perform complex tasks, such as programmers, circuit designers, marketing project managers, and legal assistants. Advances in inference computing will require more sophisticated hardware, and NVIDIA's Blackwell system, especially at rack scale, is viewed as groundbreaking technology.

NVIDIA's market share is expected to keep growing through 2025

The Morgan Stanley report also points out that inference computing will grow exponentially, implying a significant increase in investment in inference hardware, which bodes well for NVIDIA's business over the long term:

NVIDIA positions Blackwell, especially its rack-scale systems, as a breakthrough technology for addressing these problems. Blackwell brings a more powerful processor to the AI market, but the most important innovation may be the GB200 system, which introduces the Grace CPU and a more sophisticated NVLink interconnect. Every GPU in a 36- or 72-GPU rack sits in the same NVLink domain and can work with every other GPU simultaneously, greatly strengthening the ability to treat the entire rack as one giant GPU.
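To make the "one giant GPU" framing concrete, the sketch below simply aggregates per-GPU memory and compute across a 36- or 72-GPU NVLink domain. The per-GPU figures are placeholders chosen only to show the aggregation, not official GB200 specifications.

```python
# Illustrative sketch: if every GPU in the rack can reach every other GPU at
# NVLink speed, software can treat the rack's pooled memory and compute as one
# logical device. Per-GPU numbers below are assumed placeholders.

from dataclasses import dataclass

@dataclass
class GpuSpec:
    hbm_gb: float          # high-bandwidth memory per GPU (assumed value)
    fp8_pflops: float      # peak FP8 throughput per GPU (assumed value)

def rack_as_one_gpu(gpu: GpuSpec, gpus_per_rack: int) -> dict:
    """Aggregate per-GPU resources across a rack sharing one NVLink domain."""
    return {
        "gpus": gpus_per_rack,
        "pooled_hbm_tb": gpu.hbm_gb * gpus_per_rack / 1000,
        "aggregate_fp8_pflops": gpu.fp8_pflops * gpus_per_rack,
    }

# Hypothetical per-GPU numbers; 36 and 72 are the rack sizes cited in the report.
gpu = GpuSpec(hbm_gb=192, fp8_pflops=10)
for size in (36, 72):
    print(size, "GPUs ->", rack_as_one_gpu(gpu, size))
```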

In the short term, the Blackwell product line is advancing as planned, with the next 12 months of production already sold out, indicating strong market demand and pointing to continued high shipment growth through the year. NVIDIA's share of the AI processor market is expected to increase in 2024 and 2025, with shipments continuing to rise.

Regarding NVIDIA's recent stock performance, Morgan Stanley remains optimistic about the long-term outlook, maintaining an "overweight" rating and a $150 price target. However, it also acknowledges that, with the rebound in the stock price, short-term upside hinges to some extent on higher profit expectations.

As consensus has shifted to very high expectations for fiscal year 2025, the debate from here increasingly turns to fiscal year 2026 and beyond. While we are optimistic about the long-term outlook, those debates are harder to resolve.

The company has significantly exceeded its guidance every quarter, with gross margins beating guidance by a percentage point or more, and such beats have become the baseline expectation. At some point the magnitude of those beats could grow, and there are signs that this quarter may offer more upside.