Wallstreetcn
2023.11.15 16:10

Microsoft, both partner and new rival: Microsoft launches its first AI chip and announces new partnerships with AMD and NVIDIA.

Microsoft has launched its first AI chip, Maia 100, seen as a competitor to NVIDIA's accelerators, alongside Cobalt 100, a cloud-native chip based on the Arm architecture that rivals Intel CPUs. In addition to custom chips, Microsoft is expanding its collaboration with chip manufacturers: adding AMD MI300X accelerated virtual machines to Azure, opening a preview of NC H100 v5 virtual machines built on the NVIDIA H100, and planning to release AI-optimized virtual machines equipped with the H200.

Microsoft is actively expanding its presence in artificial intelligence (AI) and forging new partnerships, while continuing to work closely with chip giants AMD and NVIDIA.

On Wednesday, November 15, Microsoft announced on its official website that it has established new partnerships with AMD and NVIDIA. AMD will bring new AI and computing capabilities to Microsoft customers, while NVIDIA will launch generative AI foundry services on Microsoft's intelligent cloud platform, Azure, for global enterprises and startups.

Alongside the announcement of these new partnerships with the two chip giants, Microsoft unveiled two custom-designed chips at the Ignite 2023 conference, both of which are designed for Azure services.

One of these chips is the Azure Maia 100, an AI accelerator chip and Microsoft's first AI chip. It is used for cloud-based training and inference of AI workloads such as OpenAI models, Bing, GitHub Copilot, and ChatGPT. Manufactured using TSMC's 5-nanometer process, it contains 105 billion transistors, which is about 30% fewer than AMD's challenger to NVIDIA's AI chip, the MI300X.

Media reports suggest that the Maia 100 may directly compete with NVIDIA's chips and serve as an alternative to them.

Rani Borkar, Corporate Vice President of Azure Hardware Systems and Infrastructure at Microsoft, stated that she does not have detailed information on how the Maia chip's performance compares to alternatives such as NVIDIA's H100. However, she pointed out that the Maia chip supports Microsoft's first implementation of sub-8-bit data types, the MX data types, enabling joint hardware and software design. This helps Microsoft achieve faster model training and inference times.
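The MX (microscaling) formats mentioned above pair narrow, sub-8-bit element encodings with a scale shared across a small block of values. As a rough illustration only, and not Microsoft's actual implementation or the full MX specification, a minimal sketch of block quantization with a shared power-of-two scale might look like this:

```python
import math

def mx_quantize(block, bits=8):
    """Quantize a block of floats to signed integers sharing one
    power-of-two scale, in the spirit of microscaling (MX) formats.
    Illustrative sketch only; real MX formats define specific
    element encodings (e.g. FP8/FP6/FP4) and block sizes."""
    qmax = 2 ** (bits - 1) - 1                        # e.g. 127 for 8 bits
    amax = max(abs(x) for x in block) or 1.0          # avoid log2(0)
    scale = 2.0 ** math.ceil(math.log2(amax / qmax))  # shared block scale
    q = [max(-qmax, min(qmax, round(x / scale))) for x in block]
    return q, scale

def mx_dequantize(q, scale):
    """Recover approximate float values from the shared-scale block."""
    return [v * scale for v in q]

block = [0.12, -1.5, 3.25, 0.0]
q, s = mx_quantize(block, bits=8)
approx = mx_dequantize(q, s)
```

The idea is that storing one scale per block, rather than per value, keeps the per-element storage narrow while bounding quantization error by the shared scale, which is what makes sub-8-bit training and inference arithmetic practical.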

The release of the Maia 100 confirms reports from earlier this year about Microsoft's in-house development of AI chips. At the time, it was reported that the chips were designed for software such as large language models (LLMs) and would also support inference, powering all the AI software behind ChatGPT. In recent years, Microsoft has been working to develop custom chips for its servers to support Azure cloud computing services.

The other chip, Azure Cobalt 100, is a 128-core cloud-native chip based on the Arm architecture. Designed for general computing tasks and optimized for performance, power, and cost-effectiveness across general workloads, it competes with Intel processors. Microsoft also announced the general availability of Azure Boost, another product built for Microsoft's data centers. The system offloads storage and networking processes from host servers onto dedicated hardware and software, thereby improving storage and networking speeds.

Microsoft said that, complementing its custom chips, the company is expanding its partnerships with chip suppliers to give customers more infrastructure options, including the following collaborations with AMD and NVIDIA respectively:

  • Microsoft will add AMD MI300X accelerated virtual machines (VMs) in Azure. The AMD MI300 VMs will adopt the latest GPU from AMD, the AMD Instinct MI300X, which aims to accelerate the processing of AI workloads to achieve high-scale AI model training and generative inference.
  • Microsoft is opening a preview of the new NC H100 v5 VM series built for the NVIDIA H100 chip, which will improve the performance, reliability, and efficiency of large and medium-sized AI training and generative inference. Microsoft also announced plans for the ND H200 v5 virtual machine series, an AI-optimized virtual machine equipped with the upcoming NVIDIA H200 chip.

Borkar, the Microsoft vice president in charge of Azure hardware systems and infrastructure, stated that virtual machine instances running on Cobalt 100 will become commercially available through Azure in 2024. She did not disclose a release timetable for the Maia 100.

Borkar said that Microsoft is building chips for AI computing based on customer feedback. Microsoft is testing how Maia 100 meets the needs of its Bing AI chatbot Copilot, GitHub Copilot coding assistant, and OpenAI model GPT-3.5 Turbo.

On the Azure AI side, Microsoft announced at this Ignite conference that the GPT-3.5 Turbo model supporting a 16K-token prompt length will become generally available, and that GPT-4 Turbo will enter public preview in Azure OpenAI Service at the end of this month. GPT-4 Turbo will let customers use longer prompts, bringing more control and efficiency to generative AI applications.