Wallstreetcn
2023.10.07 04:43

Another major NVIDIA customer wants to go it alone: Microsoft will launch its own AI chip next month.

Microsoft hopes its in-house chip can rival NVIDIA's in-demand H100 GPU, cutting costs and reducing its dependence on NVIDIA. But developers are already deeply familiar with NVIDIA's proprietary CUDA programming platform. If they switch to Microsoft's custom chips, they will have to learn an entirely new software stack. Will they be willing to?

Is the era of NVIDIA's dominance coming to an end? After years of preparation, Microsoft's AI chip may be unveiled next month.

On October 6, media reports cited insiders as saying that Microsoft plans to unveil its first self-designed AI chip at its annual developer conference next month, in a bid to cut costs and reduce its reliance on NVIDIA.

According to the report, the Microsoft chip is designed for data center servers and can be used to train large language models (LLMs) and other software. It also supports inference and could power all of the AI software behind ChatGPT.

Insiders said there is still internal debate at Microsoft over whether to offer the chip to Azure cloud customers. But if the self-developed chip is unveiled at the developer conference, that would signal Microsoft is trying to gauge interest from future cloud customers.

Currently, the Microsoft data center servers that run ChatGPT use tens of thousands of NVIDIA A100 GPUs to provide advanced LLMs to cloud customers, including OpenAI and Intuit, and to support a range of AI features in Microsoft's own applications.

Microsoft hopes its Athena chip can rival NVIDIA's in-demand H100 GPU. Earlier reports said a secret Microsoft team of roughly 300 people began developing the custom chip, codenamed "Athena," in 2019. This year, Microsoft accelerated its timeline for launching AI chips designed specifically for LLMs.

Media analysis points out that in the chip race among Google, Microsoft, and Amazon, Microsoft has long lagged behind. With the launch of Athena, Microsoft would catch up with Amazon and Google.

Tracy Woo, a senior cloud computing analyst at research firm Forrester Research, said the AI boom is putting growing pressure on cloud providers to develop their own chips:

"You can buy from NVIDIA, but when you see giants like Google and Amazon, you will find that they have the capital to design their own chips."

Is Microsoft trying to gradually reduce its reliance on NVIDIA?

To develop ChatGPT, Microsoft has bought up large numbers of GPUs, and as demand for computing power grows, it will need even more chips. Continuing to buy NVIDIA GPUs at the current pace would be enormously expensive, which is why talk of Microsoft's self-developed AI chips has been building in the market. Under the original plan, "Athena" would be built on TSMC's 5nm process and is expected to cut the cost of each chip by one-third.

If it can be rolled out widely next year, teams inside Microsoft and at OpenAI could use "Athena" for both training and inference, which would greatly ease the shortage of dedicated AI compute.

According to the reports, Microsoft does not expect its AI chip to directly replace NVIDIA's, but as it keeps rolling out AI-driven features in Bing, Office, GitHub, and elsewhere, an in-house chip could significantly reduce costs.

Dylan Patel, an analyst at research firm SemiAnalysis, pointed out that if Athena is competitive, it could cut the cost per chip by one-third compared with NVIDIA's products.

Shaking NVIDIA's dominance won't be easy

NVIDIA clearly dominates global AI computing power: so far, most AI workloads still run on GPUs, and NVIDIA produces most of those chips.

Wallstreetcn previously noted that NVIDIA holds about 80% of the discrete GPU market and as much as 90% of the high-end GPU market. In 2020, 80.6% of the world's AI cloud computing and data centers ran on NVIDIA GPUs. In 2021, NVIDIA said about 70% of the world's top 500 supercomputers were powered by its chips.

According to industry practitioners, the application-specific integrated circuits (ASICs) that Amazon, Google, and Microsoft have been developing run machine learning tasks faster and with lower power consumption than general-purpose chips.

Comparing GPUs and ASICs, O'Donnell, a director, offered this analogy: "You can use a Prius for everyday driving, but if you have to drive in the mountains, a Jeep Wrangler would be more suitable."

However, despite their efforts, Amazon, Google, and Microsoft all face the same challenge: convincing developers to actually use these AI chips.

For now, NVIDIA's GPUs dominate the market, and developers are already fluent in its proprietary CUDA programming platform for building GPU-accelerated applications. If they switch to custom chips from Amazon, Google, or Microsoft, they will have to learn an entirely new software stack. Will they be willing to do so?
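To illustrate the kind of lock-in at stake, here is a minimal, generic CUDA sketch (not taken from the article or tied to any particular vendor roadmap): the __global__ qualifier, the <<<...>>> launch syntax, and the cudaMalloc/cudaMemcpy calls are all NVIDIA-specific, so even simple GPU code like this would have to be ported to whatever software stack a custom chip ships with.

```cuda
// Minimal CUDA vector-add example for illustration only.
// The kernel qualifier, launch syntax, and runtime API below are
// NVIDIA-specific and would need rewriting for a non-CUDA accelerator.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void vector_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host buffers
    float* ha = (float*)malloc(bytes);
    float* hb = (float*)malloc(bytes);
    float* hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers (CUDA-specific allocation and copies)
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // NVIDIA-specific kernel launch syntax
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(da, db, dc, n);
    cudaDeviceSynchronize();

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);  // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```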