NVIDIA's competitors have really emerged!

Wallstreetcn
2024.12.07 04:36

Inference computing has become the breakthrough point, and cost-performance is the key advantage

NVIDIA's monopoly position in the AI chip market is facing unprecedented challenges.

Tech giants like AMD and Amazon, as well as startups such as SambaNova Systems and Groq, are all targeting large-model inference, launching custom chips that keep improving in reliability and cost-effectiveness.

These new players not only bring more cost-effective options but also gain recognition from important customers.

For example, Meta has begun using AMD's MI300 chip to support inference calculations for its new AI model Llama 3.1 405B; Amazon's Trainium 2 chip has received positive feedback from potential users, including Apple.

Market research firm Omdia predicts that spending on non-NVIDIA chips in data centers will grow by 49% in 2024, reaching $126 billion.

It seems that the market landscape is already changing, and competition in the AI chip field is entering a new phase of greater diversification.

Tech Giants Enter the Fray, Inference Computing Becomes a Breakthrough

AMD's MI300 chip is expected to generate over $5 billion in sales in its first year of release. Meanwhile, Amazon is developing its next-generation Trainium AI chips in North Austin.

In addition to traditional tech giants, startups like SambaNova Systems, Groq, and Cerebras Systems are also actively positioning themselves. These companies claim to offer faster speeds, lower power consumption, and cheaper prices in inference compared to NVIDIA.

The "inference" phase is becoming the key for competitors to break NVIDIA's monopoly.

"Real commercial value comes from inference, and inference is beginning to scale," said Qualcomm CEO Cristiano Amon, adding that Qualcomm plans to use Amazon's new chip to run AI tasks: "We are starting to see the beginning of transformation."

In terms of cost, NVIDIA's existing chips are priced as high as $15,000, and its new Blackwell chip is expected to reach tens of thousands of dollars.

In contrast, competitors offer more cost-effective options. Dan Stanzione, executive director of the Texas Advanced Computing Center, said that because of power-consumption and price advantages, the center plans to use SambaNova's chips for inference tasks alongside its NVIDIA Blackwell-based supercomputer.

Amazon Shows Ambition

Amazon has demonstrated strong ambition in the AI chip field, having invested $75 billion this year in AI chips and other computing hardware. Its new Trainium 2 chip delivers four times the performance of its predecessor, and the company has announced plans for a more powerful Trainium 3 chip.

Eiso Kant, CTO of San Francisco AI startup Poolside, estimates that Trainium 2 improves performance per dollar by 40% compared with NVIDIA's hardware. More importantly, Amazon plans to offer Trainium-based services in its data centers worldwide, which is particularly significant for inference tasks.

It is worth noting that Amazon has also partnered with AI startup Anthropic to build a giant AI factory containing hundreds of thousands of new Trainium chips, which will have five times the computing power of any system previously used by Anthropic.

Nevertheless, the intensifying competition does not mean that NVIDIA will lose its leading position in the short term. NVIDIA CEO Jensen Huang still emphasizes that the company has significant advantages in AI software and inference capabilities, and demand for the new Blackwell chips remains strong.

In a speech at Stanford University, Huang stated:

"The total cost of ownership of our products is very reasonable; even if competitors' chips are free, they are still not cheap enough."