"Big Card Buyer" Microsoft: Bought nearly 500,000 GPUs this year! More than twice that of Meta

Wallstreetcn
2024.12.18 07:33

According to estimates from the research firm Omdia, Microsoft purchased 485,000 Nvidia Hopper chips this year, far ahead of Meta (224,000), ByteDance (230,000), Tencent (230,000), Amazon (196,000), Google (169,000), and other companies.

Microsoft is frantically buying chips...

On December 17, the Financial Times reported that Microsoft has purchased far more Nvidia AI chips this year than any of its competitors as it accelerates its investment in artificial intelligence infrastructure. Microsoft's Nvidia chip orders this year are more than three times the number of Nvidia AI processors it purchased in 2023.

By analyzing companies' publicly disclosed capital expenditures, server shipments, and supply chain information, analysts at the technology consultancy Omdia estimate that Microsoft purchased 485,000 Nvidia Hopper chips this year, far ahead of Meta (224,000), ByteDance (230,000), Tencent (230,000), Amazon (196,000), Google (169,000), and others.

Analysts believe that, with Nvidia's GPUs in short supply over the past two years, Microsoft's chip stockpile has given it a competitive edge in building next-generation AI systems.

Tech giants have spent hundreds of billions of dollars on data centers this year. Omdia estimates that global tech companies will spend about $229 billion on servers in 2024, led by Microsoft's $31 billion in capital expenditure and Amazon's $26 billion; the top ten buyers of data center infrastructure, a group that also includes xAI and CoreWeave, account for 60% of global investment in computing capacity.

As OpenAI's largest investor, Microsoft has been the most aggressive builder of data center infrastructure, which it uses both to run its own AI services, such as the Copilot assistant, and to lease computing capacity to customers through its Azure division.

Microsoft's Azure cloud infrastructure is currently being used to train OpenAI's latest models, as the company competes with Google, with startups such as xAI and Anthropic, and with rivals outside the United States for dominance in next-generation computing.

Alistair Speirs, Senior Director of Global Infrastructure at Microsoft Azure, stated in an interview with the Financial Times:

"Good data center infrastructure is a very complex and capital-intensive project that requires years of planning, so it's important to forecast our growth and leave some room."

Vlad Galabov, Director of Cloud and Data Center Research at Omdia, also pointed out that Nvidia chips accounted for about 43% of server spending in 2024, but tech giants have stepped up deployment of their own AI chips this year to reduce reliance on Nvidia. Google and Meta, for example, each deployed about 1.5 million of their own chips this year.

However, Microsoft is still in the early stages in this regard, having installed only about 200,000 of its self-developed Maia chips this year.

Speirs stated that Microsoft currently relies primarily on Nvidia chips, but that the company needs to invest heavily in its own technology to provide "unique" services to customers:

"When building AI infrastructure, based on our experience, it's not just about having the best chips; you also need to have the right storage components, the right infrastructure, the right software layer, the right host management layer, error correction, and all the other components to build this system