NVIDIA Invests Heavily in AI Upstarts as the "NVIDIA AI Empire" Grows Ever Stronger

Zhitong
2024.09.12 12:16

NVIDIA's leadership in the AI chip field is becoming ever more solid. CEO Jensen Huang said at the Goldman Sachs Technology Conference that demand for the new-generation Blackwell AI GPU is so high that some large customers have been unable to obtain the product in a timely manner. Since 2023, NVIDIA has further entrenched its dominance in generative AI infrastructure through investments in AI startups worldwide. The company's stock has risen more than 700% since early 2023, and its investments in AI startups exceeded $1.5 billion in early 2024.

According to the financial news app Zhitong Finance, NVIDIA (NVDA.US) CEO Jensen Huang said at a Goldman Sachs technology conference on Wednesday that the company's new-generation Blackwell architecture AI GPU is so sought-after that some large customers, unable to obtain the product in time, have grown frustrated. Huang's humble-brag underscores NVIDIA's near-monopoly in AI infrastructure and the market's continued fervent demand for its AI GPUs. In addition, statistics show that since 2023 NVIDIA has been aggressively investing in AI startups around the world to consolidate its dominant position in generative AI infrastructure.

Undoubtedly, NVIDIA is one of the biggest winners of the global artificial intelligence investment frenzy to date. On the strength of demand for its high-performance AI GPUs from enterprises and government agencies worldwide, NVIDIA's stock price has surged more than 700% since early 2023, and more than 900% since its low in October 2022. As its market value and revenue soar, management continues to push the "NVIDIA hardware and software ecosystem" deeper into the AI industry, and the pace of investment in AI startups is accelerating: of all the startup investments NVIDIA has made since 2005, more than half have occurred in the past two years.

Statistics show that the AI chip giant, which has at times worn the crown of the world's most valuable listed company, invested more than $1.5 billion in AI startups in the early part of 2024, up sharply from roughly $300 million a year earlier. According to Crunchbase data, in 2024 alone NVIDIA has participated in more than ten financing rounds of $100 million or more for AI startups. Since 2023, it has invested in more than 50 AI startups, including important AI companies such as Perplexity AI and Hugging Face.

Furthermore, NVIDIA is also considering investing in ChatGPT developer OpenAI's upcoming financing round. All of these developments indicate that the undisputed leader in AI chips, with an "NVIDIA AI Empire" built on powerful AI GPUs plus the CUDA ecosystem, is becoming ever more powerful.

AI startups are crucial to the development of the global AI industry, especially in the booming enterprise AI application market. Unlike global cloud computing giants such as Amazon AWS, Microsoft Azure, and Google Cloud Platform, which focus on building AI application development ecosystems or underlying AI infrastructure, these startups target specific AI application scenarios that are vital for improving enterprise operational efficiency or the productivity of consumers' work and study.

For example, the US AI startup Perplexity AI focuses on the cutting-edge field of "AI search"; the French startup Bioptimus works on fully integrating advanced AI technology with medical science and biotechnology; and Cognition has launched what is billed as the world's first "fully autonomous virtual AI software engineer," a virtual engineer with strong programming and software development capabilities that can assist programmers across multiple cutting-edge technologies or independently complete large software development projects.

Here are some of the AI startups NVIDIA has invested in that hold significant positions in the AI field:

Perplexity AI

Jensen Huang is openly passionate about Perplexity AI, dubbed the "Google killer," which has unexpectedly become his favorite AI tool. In an interview this year, when asked, "Do you use ChatGPT or Google AI chatbots frequently, or do you use other products?" Huang replied, "I generally use Perplexity, and I use it almost every day. For example, when I recently wanted to understand how AI assists drug development, I used Perplexity for related searches."

He has backed that enthusiasm with action: in April, NVIDIA participated in a funding round for Perplexity AI that raised approximately $62.7 million and valued the startup at around $1 billion. The round was led by investor Daniel Gross, with participation from Amazon founder Jeff Bezos, among others. Nor was it NVIDIA's first show of support for the company: the chip giant also took part in a January round in which the startup raised as much as $73.6 million.

Hugging Face

Hugging Face is an AI startup that provides an open-source platform for AI large models and application development, and it has a long-standing close relationship with NVIDIA. The chip giant participated in an August 2023 funding round that raised as much as $235 million and valued Hugging Face at approximately $4.5 billion post-money. Other corporate investors in that round included Google, Amazon, Intel, AMD, and Salesforce.

Hugging Face has long incorporated NVIDIA's hardware systems and CUDA software tools and libraries into its shared resources. In May, the startup launched a new program providing up to $10 million worth of free shared NVIDIA GPUs for AI developers to use.

Adept AI

Unlike the well-known generative AI chatbots from startups such as OpenAI and Anthropic, Adept AI's main product is not centered on text or image generation. Instead, the startup focuses on building a software assistant that can perform tasks on a computer, such as generating reports or browsing the web, and that can operate software tools. NVIDIA participated in Adept AI's March 2023 funding round, which raised as much as $350 million.

Databricks

Last fall, Databricks received a whopping $43 billion valuation, making it one of the most valuable AI startups in the world. Unsurprisingly, the data analytics software provider makes extensive use of NVIDIA AI GPUs and has won the chip giant's backing, alongside investors such as Andreessen Horowitz and Capital One Ventures, all of whom participated in a $500 million funding round in September 2023. "Databricks is doing incredible work leveraging NVIDIA's software and hardware technology to accelerate data processing and large-scale AI model generation," Jensen Huang said in a press release at the time.

Cohere

Renowned Canadian AI startup Cohere is a strong competitor to OpenAI and Anthropic, specializing in proprietary AI models for enterprises. The company's growth over the past five years has attracted support from major tech players such as NVIDIA, Salesforce, and Cisco, all of which took part in a July funding round. NVIDIA also participated in a May 2023 round that brought the startup approximately $270 million.

The Strengthening of the "NVIDIA AI Empire"

When NVIDIA invests in AI startups across different application areas, these companies typically allocate most of the investment funds to purchasing NVIDIA AI GPUs to establish or expand their AI training and inference infrastructure. AI startups need enormous computing power to train their deep learning models, and NVIDIA's GPUs (such as the H100, H200, and the upcoming Blackwell GPU) set the industry standard for performance, making NVIDIA the natural choice.

NVIDIA CUDA is a highly optimized parallel computing platform and programming model deeply integrated with NVIDIA's GPU hardware. Startups that take NVIDIA's money also effectively invest heavily in the CUDA software stack and its enterprise-grade acceleration tools, further deepening their reliance on the NVIDIA ecosystem. This "lock-in effect" ensures that when these startups develop AI applications or iterate on large models, they almost inevitably continue to use NVIDIA's hardware and software tools.

In the future, when enterprises use AI large models or applications developed by these AI startups, due to their construction and iterative optimization on the NVIDIA ecosystem, these enterprises will have to continue to rely on NVIDIA's full-stack ecosystem of hardware and software in the inference and deployment stages, allowing NVIDIA to further expand its market share through startups.

Furthermore, when enterprises that use the AI models or applications developed by these startups choose cloud platforms for training and inference deployment, they in turn push AWS and other cloud giants such as Microsoft and Oracle to keep stepping up purchases of NVIDIA's continually evolving AI GPUs to build out AI infrastructure. Many enterprises favor cloud platforms such as AWS, Microsoft Azure, and Oracle OCI when deploying AI applications. If those cloud providers' customers run AI models and applications built on the NVIDIA hardware and software ecosystem, the providers must likewise keep buying NVIDIA's latest AI GPUs and provisioning CUDA's advanced acceleration tools and libraries to meet the massive computing demand. Together, this full-stack hardware and software ecosystem drives the growth of the "NVIDIA AI Empire."

Among these advantages, the CUDA ecosystem barrier can be described as NVIDIA's "strongest moat." NVIDIA has cultivated the global high-performance computing field for many years, and the CUDA computing platform it single-handedly created is popular worldwide; it is arguably the preferred hardware-software system for high-performance computing in areas such as AI training and inference. CUDA is a proprietary parallel computing acceleration platform and programming toolkit developed by NVIDIA that lets software developers and engineers use NVIDIA GPUs for accelerated general-purpose parallel computing (it is compatible only with NVIDIA GPUs, not with GPUs from vendors such as AMD and Intel).

CUDA underpins ChatGPT and other generative AI applications; its importance is on par with the hardware itself, and it is crucial to the development and deployment of large AI models. Thanks to its technological maturity, performance optimization advantages, and extensive ecosystem support, CUDA has become the most widely used collaborative platform in AI research and commercial deployment.

According to the NVIDIA official website, using CUDA for ordinary accelerated-computing programming on NVIDIA GPUs, along with some basic tools, is free. Enterprise-scale applications and support (such as NVIDIA AI Enterprise), or advanced CUDA microservices development for AI systems on NVIDIA computing power rented from cloud platforms (such as Amazon AWS, Google Cloud, and Microsoft Azure), may require additional fees.

Building on the powerful, deeply penetrated CUDA platform and its AI GPUs, NVIDIA has kept strengthening its full-stack hardware and software layout. At the March GTC, NVIDIA officially launched a microservice called NVIDIA NIM, billed by GPU usage time. NIM is an optimization-focused cloud-native microservice designed to shorten the time-to-market for AI large models and simplify their deployment on clouds, in data centers, and on GPU-accelerated workstations, enabling enterprises to deploy AI applications on NVIDIA GPU cloud inference capacity and accelerate development within the CUDA-based NVIDIA GPU ecosystem.

Wall Street Says NVIDIA Has Been Oversold and Calls It a "Buy the Dip" Opportunity

This is also why Rosenblatt, a well-known Wall Street investment firm, is even more optimistic about NVIDIA's CUDA-centered software revenue growth than about its AI GPU revenue. In a research report, Rosenblatt semiconductor analyst Hans Mosesmann raised the firm's 12-month target price for NVIDIA from $140 to an astonishing $200 per share, the highest target on Wall Street. Mosesmann said the potential boom in NVIDIA's CUDA-centered software business is likely to continue, even though the AI chip leader's stock has already skyrocketed over the past year. In other words, beyond the huge GPU revenue driven by CUDA's tight coupling with NVIDIA's AI GPUs, and the revenue from large-scale enterprise use of CUDA, the software business derived from CUDA is itself an engine of substantial revenue for NVIDIA.

Regarding the recent sharp drop in NVIDIA's stock price, which erased about $400 billion of market value in the past week, top Wall Street firms such as Goldman Sachs have argued that cautious investors have oversold NVIDIA. Goldman Sachs analyst Toshiya Hari recently maintained a "buy" rating, stating, "The recent (NVIDIA) stock performance has not been very good, but we still like this stock, and the recent selling is clearly excessive. First of all, global demand for accelerated computing remains very strong. We tend to focus more attention on the hyperscalers, such as Amazon, Google, Microsoft, and other large global enterprises, but what you will see is that demand is expanding to enterprises, and even sovereign nations."

As competition in the AI field intensifies among large tech companies such as Microsoft and Amazon, the international bank UBS recently estimated that these tech giants' overall AI capital expenditure may grow by 47% in 2024 and 16.5% in 2025, reaching $218 billion and $254 billion, respectively. Even so, UBS noted that large tech companies' overall capital expenditure intensity (capital expenditure divided by sales) remains below historical peaks. UBS predicts that as generative AI monetization accelerates, these tech giants appear poised to achieve profit growth of close to 15-20% over the next few quarters, and forecasts that their total free cash flow may rise from $413 billion in 2024 to $522 billion in 2025.
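Those UBS figures are internally consistent, as a quick back-of-the-envelope check shows. Note that the implied prior-year capex base below is derived from the stated growth rate for illustration, not a number UBS published:

```python
# Sanity check on UBS's projected big-tech AI capital-expenditure figures.
# All inputs are the article's figures (in $ billions) and growth rates;
# the implied prior-year base is a derived assumption.

capex_2024 = 218.0   # projected 2024 AI capex, $bn
capex_2025 = 254.0   # projected 2025 AI capex, $bn
growth_2024 = 0.47   # 47% growth into 2024
growth_2025 = 0.165  # 16.5% growth into 2025

implied_prior_base = capex_2024 / (1 + growth_2024)  # base year implied by 47% growth
implied_2025 = capex_2024 * (1 + growth_2025)        # should land near $254bn

fcf_2024, fcf_2025 = 413.0, 522.0
fcf_growth = fcf_2025 / fcf_2024 - 1                 # implied free-cash-flow growth

print(f"implied prior-year capex base: ${implied_prior_base:.0f}bn")   # ≈ $148bn
print(f"implied 2025 capex:            ${implied_2025:.0f}bn")         # ≈ $254bn
print(f"implied FCF growth:            {fcf_growth:.1%}")              # ≈ 26.4%
```

Growing $218 billion by 16.5% gives roughly $254 billion, matching UBS's 2025 figure, so the two growth rates and the two dollar totals hang together.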

On Wall Street, "buy the dip" sentiment is exceptionally strong. Bulls firmly believe this round of correction has squeezed out most of the "AI bubble," and that tech companies able to keep profiting from the AI wave, including popular chip stocks such as NVIDIA, AMD, TSMC, Intel, and Broadcom, could enter a new leg of a major uptrend. Chips are the indispensable core infrastructure behind popular generative AI tools like ChatGPT, and these chip stocks can be considered the biggest winners of the AI boom. For NVIDIA in particular, the combination of the CUDA ecosystem and high-performance AI GPUs forms an incredibly strong moat.

In addition to Goldman Sachs, analysts at major banks such as Bank of America and Morgan Stanley are also optimistic about NVIDIA's stock and are declaring that the chance to "buy the dip" has arrived. Bank of America analyst Vivek Arya recently reiterated his "buy" rating on NVIDIA, calling it the "best industry choice," stating that the drop in its stock price provides a good entry point, and raised his target price from $150 to $165, versus NVIDIA's Wednesday close of $116.91. Arya emphasized that market doubts about the potential of artificial intelligence are unwarranted at least until 2026. Current demand for AI chips is incredibly strong and is likely to persist for a long time. TSMC's management recently said at an earnings conference that the supply-demand imbalance in CoWoS, the advanced packaging technology needed for AI chips, is expected to last into 2025, with possible slight easing in 2026. Industry insiders say that, given strong global demand for the upcoming Blackwell architecture AI GPU, NVIDIA has raised its AI GPU foundry orders with chip giant TSMC by at least 25%.

At Wednesday's conference, Jensen Huang said: "The demand for AI GPUs can only be described as very strong; everyone wants to be first to receive them, and everyone wants to receive the most." "We may have more emotional customers now, and that's only natural. The situation is a bit tense, and we are doing our best." He added that demand for the latest-generation Blackwell GPU is strong and that suppliers are catching up.

Furthermore, when asked whether the massive AI investments are delivering returns for customers, a key concern during this AI boom, Jensen Huang said that companies have no choice but to embrace "accelerated computing." He pointed out that NVIDIA's technology speeds up traditional workloads and can handle AI tasks that older technologies cannot.