Wallstreetcn
2024.03.23 07:20

Is NVIDIA's biggest risk NVIDIA itself?

NVIDIA has benefited from strong market demand for GPUs, but its high selling prices have also driven up costs for AI companies, slowing the rollout of downstream applications. How much longer can NVIDIA's growth story run?

Soaring costs are becoming the "top concern" for AI companies.

Citibank analysts, including Yitchuin Wong, recently attended the Gartner Data & Analytics Summit in Orlando. In a summary report after the conference, Citi pointed out that AI still faces many challenges, above all high costs, which are prompting many enterprise clients to refocus on ROI (return on investment).

Why are costs soaring? That brings us to the "AI chip king": NVIDIA.

As AI technology has taken off, NVIDIA's graphics processing units (GPUs) have become the workhorse for the massive compute requirements of AI training, keeping market demand persistently high. Chip prices have surged as a result, repeatedly pushing the company's stock price to record highs.

The frenzied bets on the "AI faith" have also lifted NVIDIA's upstream hardware suppliers. According to media reports, one of NVIDIA's main suppliers, high-bandwidth memory (HBM) maker Micron Technology, has sold out its products through 2025.

In a media interview, Micron put it bluntly:

"I have never seen memory allocated 18 months in advance in history. This is completely driven by NVIDIA and many other companies in this AI game."

Soaring costs are slowing downstream deployment

As Wallstreetcn has previously reported, the price of an H100 GPU already exceeded $30,000 in 2022. By last year, each H100 was selling for more than $40,000 on eBay, roughly four times the average selling price of AMD's competing MI300X.

And the GPU build-out is far from over. Zuckerberg said earlier this year that Meta will invest heavily in GPUs in 2024, including up to 350,000 NVIDIA "Hopper" H100s, and that with other hardware its total compute will be equivalent to "nearly 600,000 H100s" by year-end.

With the "chip shortage" driven by rapid technological development on one side and GPU prices staying high on the other, companies have had to start weighing whether AI can actually be commercialized, which raises the question: can the market's strong demand for GPUs last?

Citibank said enthusiasm for generative AI is cooling, and that large-scale project implementation remains "premature."

The report notes that while generative AI remained the focus for most executives in attendance, actual projects and use cases are still relatively small in scale (text and image generation, for example) rather than large-scale, transformative deployments.

Meanwhile, media reports say Amazon Web Services and other AI providers have lowered sales expectations, and Cohere, one of OpenAI's main competitors, generated minimal revenue last year. During its fundraising, the company said customers are cautious about costs and are still evaluating what the technology can do.

Citi noted that research and advisory firm Gartner estimates roughly one-third of projects will fail because proofs of concept (POCs) are launched prematurely.

Micron executive Narasimhan and Mukesh Khare, general manager of IBM's semiconductor research division, both said that AI costs far exceed those of traditional computing, and that only when costs come down will more enterprise customers be drawn in.

Micron stated:

"Currently, the cost of large language models is quite high, which may be acceptable for large enterprises, but not for the general public."

"Frankly, today I think the investment is becoming... I don't want to use the word hype... too many people are excited about it. If you have a budget, you might prioritize investing in generative AI servers over any other area."

Asked when costs will come down, Micron called it a "billion-dollar question," but said that "it will decrease; the competition makes that inevitable."

In addition, Citi pointed out that enterprise customers are increasingly concerned about data governance: as companies work to raise data and AI literacy, data governance and quality will become a higher priority for ongoing enterprise investment.