Wallstreetcn
2024.06.04 06:26

JPMorgan Chase evaluates Computex highlights: GPUs updated annually, Blackwell already in production, next trend "Physical AI"

JPMorgan Chase predicts that the memory capacity of NVIDIA AI chips will increase by at least 7 times from Hopper H100 to Rubin Ultra, benefiting SK Hynix

Author: Li Xiaoyin

Source: Hard AI

On Sunday, June 2nd, NVIDIA founder and CEO Jensen Huang kicked off the COMPUTEX conference with a keynote speech.

On June 3rd, JPMorgan Chase released a research report summarizing the key highlights of Jensen Huang's keynote speech.

1. Strong AI Chip Roadmap Unveiled, With a New Chip Every Year Through 2027

NVIDIA laid out its robust roadmap, planning to release a new AI chip every year through 2027 (Blackwell in 2024, Blackwell Ultra with more HBM in 2025, Rubin in 2026, and Rubin Ultra with upgraded HBM in 2027). The roadmap is strong and broadly in line with market expectations.

In addition to GPUs, NVIDIA is also advancing its Arm CPU cores (the new Vera version in 2026) and its networking (NVLink Switch 6 for the Rubin platform and a migration to 1.6T networking). We believe logic chip upgrades may still occur every two years (Rubin will transition to TSMC's N3P in 2026), with HBM upgrades filling the years in between.

2. Supply Chain Beneficiaries - HBM Iterating Fastest, With Strong Trends in Advanced Packaging, Networking, and Leading-Edge Processes

From a supply chain perspective, HBM appears to be iterating the fastest, with memory capacity increasing every year. We estimate that from the Hopper H100 to Rubin Ultra, NVIDIA's AI chip memory capacity will increase by at least 7 times (we expect it to grow from 80GB to 576GB), a very rapid iteration pace that benefits SK Hynix.
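As a quick check on that multiple, using only the two capacity figures quoted above (80GB for the Hopper H100 and the estimated 576GB for Rubin Ultra):

$$
\frac{576\ \text{GB (Rubin Ultra, est.)}}{80\ \text{GB (H100)}} = 7.2\times
$$

which is consistent with the report's "at least 7 times" estimate.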

Advanced packaging supply chain players (ASE, ASMPT) should also benefit from increased packaging complexity and higher HBM capacity. On our current view, CoWoS may remain tight into 2025.

For TSMC, AI chips typically adopt leading-edge technology nodes 1-2 years after those nodes are introduced, with most AI chips moving to N3 starting in the second half of this year.

Networking and optical supply chains will also benefit from an iteration cycle accelerated to two years (versus the usual four to five). Given the high content of BMCs (Baseboard Management Controllers, microcontrollers that operate independently of the main processor) and the high adoption rate of network cards, this is positive for SINETEK.

3. Blackwell in Production, Focus Primarily on GB200 Configuration

NVIDIA confirmed that Blackwell is already in production, which came as a surprise to the market, as investors had previously been concerned about supply chain bottlenecks for Blackwell. As in its GTC keynote, NVIDIA again emphasized GB200, which should increase investor interest in GB200-related names - Hon Hai, Xilinx, Wistron, Auras, AVC, ASE, ASMPT, and SK Hynix.

4. The Accelerated Computing Revolution is Still in its Early Stages

NVIDIA emphasized that the accelerated computing revolution is still in its early stages, as is the build-out of AI infrastructure. The accelerated product roadmap may lower the cost of generative AI, driving demand for large models and expanding generative AI use cases.

NVIDIA did not specifically comment on the NPUs or CPUs of Arm PCs, but it hopes that its PC OEM partners will use discrete GPUs (an installed base of over 100 million GPUs with Tensor Cores) to run AI tasks and deliver Windows PCs that support Copilot+. AI PCs may become a major driver for PC OEM stocks such as ASUS and Lenovo, while MSI is also a key partner for NVIDIA's AI PCs.

5. The Next Stage of AI Applications Will Be Physical AI (Robotics)

NVIDIA expects the next stage of AI applications to be physical AI, largely involving industrial and humanoid robots. From a supply chain perspective, this seems premature, but it may give some momentum to Asian industrial automation stocks.

The main points of this article are drawn from the research report "Asian Tech Computex takeaways," released by JPMorgan analysts Gokul Hariharan, Albert Hung, and Jay Kwon on June 3rd.