Wallstreetcn
2024.06.29 10:27

After CoreWeave Comes Lambda: "AI Cloud Upstart" Raises a Hefty Round as NVIDIA's GPU Ammunition Keeps Flowing

NVIDIA-backed compute-rental company Lambda Labs is expected to complete a new $800 million financing. Compute-rental startups are leveraging their flexibility and cost advantages to stand out in competition with cloud computing giants such as Amazon AWS, Google Cloud, and Microsoft Azure.

As generative AI develops at full speed, demand for the computing infrastructure that supports it is soaring. This trend is driving rapid growth in the cloud computing market while also intensifying competition, with traditional cloud giants and emerging specialist compute-rental providers locked in a fierce contest.

Rapid Expansion of Compute-Rental Providers

According to media reports on Saturday, Lambda Labs, an NVIDIA-backed compute-rental provider, is in talks to raise $800 million, potentially placing it among the most heavily funded Silicon Valley startups of recent years.

In February of this year, the company raised $320 million at a $1.5 billion valuation. Then in April, Lambda Labs secured a $500 million loan, using NVIDIA chips as collateral, to further expand its cloud services business. These rapid, back-to-back financings reflect not only the company's ambition but also the market's urgent demand for GPU resources.

Meanwhile, another compute-rental provider, CoreWeave, is also expanding rapidly. It recently completed $7.5 billion in debt financing and $1.1 billion in equity financing, at a valuation of $19 billion. CoreWeave has an even closer relationship with NVIDIA, having previously received direct investment from the chipmaker and holding priority purchase rights for its GPUs.

These emerging compute-rental providers are leveraging their flexibility and cost advantages to stand out in competition with cloud computing giants such as Amazon AWS, Google Cloud, and Microsoft Azure.

Forrester principal analyst Lee Sustar believes cloud providers like CoreWeave can succeed in part because they do not carry the legacy infrastructure "burden" of traditional vendors, which lets them focus on high-end AI services without the massive investments required of hyperscale cloud providers.

Sid Nag, Vice President of Cloud Services and Technologies at Gartner, likewise notes that companies such as CoreWeave operate in what he calls a specialized "GPU as a service" cloud provider market. Given the high demand for GPUs, these companies offer customers an alternative to the large cloud providers, injecting new vitality into the market.

NVIDIA's Calculus

At the heart of this market frenzy are the scarcity and raw performance of NVIDIA's GPUs. The new industrial wave set off by the "ChatGPT moment" has pushed star AI startups such as OpenAI and Anthropic to the forefront, and the enormous demand for computing power has driven Microsoft, Amazon, Google, and other cloud vendors to keep building and upgrading data centers. NVIDIA's GPUs have become strategic resources comparable to gold and oil.

Faced with this situation, NVIDIA CEO Jensen Huang has taken a strategic approach: by supplying GPUs to emerging compute providers such as Lambda Labs and CoreWeave, NVIDIA has not only created a larger customer base but also nurtured new competitors to Microsoft, Amazon, and Google, balancing their influence.

Behind this strategy lies NVIDIA's deep read of the market landscape. These tech giants are not only important NVIDIA customers; they are also actively developing their own AI-specific chips, posing a potential threat to NVIDIA. By backing emerging GPU cloud providers, NVIDIA is building a more diversified and resilient ecosystem.

NVIDIA's GPU Throne Is Not Secure

By nurturing companies like CoreWeave to challenge the giants, NVIDIA has driven its results to repeated record highs and kept its moat firmly in place. However, NVIDIA's seat on the throne is not without worries.

First, the traditional cloud giants are stepping up investment in proprietary AI chips. Google's TPUs, Microsoft's recently released Azure Maia and Azure Cobalt chips, and Amazon's Trainium, Inferentia, and Graviton chips all aim to reduce dependence on NVIDIA GPUs. If these in-house chips catch on, they could erode the price advantage of specialist GPU cloud providers.

Second, although many generative AI workloads run best on GPUs, not every AI task requires that much horsepower. Less time-sensitive workloads can still run on conventional CPUs, albeit more slowly, which means the growth of the GPU cloud services market may be somewhat constrained.

Finally, the market faces a broader macro risk: if the generative AI craze suddenly cools, companies that have invested heavily in GPUs could be left with excess capacity. Large numbers of idle high-performance GPUs would not only put enormous financial pressure on these companies but could also trigger a reshuffle across the entire industry.