Wallstreetcn
2023.12.06 21:11

AMD has launched the MI300X accelerator, which it says delivers up to 60% higher performance than Nvidia's H100, and has sharply raised its forecast for the AI accelerator market.

AMD has released its new MI300 series of AI chips, including the MI300A and MI300X, aiming to challenge Nvidia's position in the AI accelerator market. AMD says the MI300X delivers up to 60% higher performance than Nvidia's H100, with 2.4 times the H100's memory and 1.6 times its memory bandwidth. AMD now expects the AI accelerator market to exceed $400 billion by 2027. This release is one of the most important in AMD's history.

On Wednesday, AMD released its highly anticipated MI300 series of AI chips, including the MI300A and MI300X, targeting a market dominated by Nvidia. Such accelerators handle the large datasets involved in AI training far better than traditional computer processors.

This product launch is one of the most important in AMD's 50-year history and is expected to challenge Nvidia's position in the booming AI accelerator market.

The new chips released by AMD have over 150 billion transistors. The MI300X accelerator supports up to 192GB of HBM3 memory.

That gives the MI300X 2.4 times the memory of Nvidia's H100 and 1.6 times the H100's memory bandwidth, further boosting performance.
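As a quick sanity check, those multiples line up with the commonly cited specifications (assuming the H100 SXM's 80GB of HBM3 at roughly 3.35 TB/s, and the MI300X's stated 5.3 TB/s peak bandwidth; these figures come from the vendors' published spec sheets, not this article):

192 GB ÷ 80 GB = 2.4×
5.3 TB/s ÷ 3.35 TB/s ≈ 1.6×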

AMD says the new MI300X can outperform Nvidia's H100 by up to 60%, depending on the workload:

- One-on-one against the H100 running Llama 2 70B: up to 20% faster
- One-on-one against the H100 running FlashAttention 2: up to 20% faster
- Eight-GPU server versus eight H100s running Llama 2 70B: up to 40% faster
- Eight-GPU server versus eight H100s running Bloom 176B: up to 60% faster

AMD CEO Lisa Su said the new chips match the H100 in AI training capability and significantly outperform competitors in inference, the process of running AI software once it has been deployed.

The AI boom has created huge demand for high-end chips, prompting chipmakers to chase this lucrative market and accelerate the rollout of capable AI chips.

Although competition in the AI chip market is fierce, AMD issued a bold forecast on Wednesday, saying the market will expand rapidly: it now expects the AI chip market to exceed $400 billion by 2027, more than two and a half times the $150 billion it projected in August, underscoring how quickly expectations for AI hardware are shifting.

AMD is increasingly confident that its MI300 series will win the favor of some tech giants, which could lead these companies to spend billions of dollars on AMD's products. AMD has stated that Microsoft, Oracle, and Meta are among its customers.

The same day, it emerged that Microsoft will evaluate demand for AMD's AI accelerators and assess the feasibility of adopting them. Meta will use the newly launched MI300X in its data centers, and Oracle has said it intends to adopt AMD's new chips in its cloud services.

The market had previously expected AMD to ship roughly 300,000 to 400,000 MI300-series units in 2024, with Microsoft and Google as the largest customers. Were it not for TSMC's limited CoWoS packaging capacity, more than 40% of which Nvidia has reserved, AMD's shipments would likely be even higher.

After AMD announced the MI300X accelerator, Nvidia's stock fell 1.5%. Nvidia's shares have skyrocketed this year, pushing its market value above $1 trillion, but the biggest question is how long it can dominate the accelerator market. AMD sees its opening there: large language models require enormous amounts of memory, which is exactly where AMD believes its advantage lies.
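A rough back-of-the-envelope calculation illustrates the point (assuming, as is typical, 16-bit weights at 2 bytes per parameter; the assumption is ours): a 70-billion-parameter model such as Llama 2 70B needs about

70 billion parameters × 2 bytes ≈ 140 GB

just to hold its weights, before counting the KV cache and activations. That overflows a single H100's 80GB but fits within one MI300X's 192GB, so fewer accelerators are needed to serve a model of that size.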

To defend its market dominance, Nvidia is also developing its next-generation chip. The H200, unveiled last month, will replace the H100 in the first half of next year; it offers new high-speed memory and roughly twice the H100's inference speed on Llama 2. Nvidia also plans to launch a brand-new processor architecture next year.