AI CPUs Drive DDR5 Premiums; Memory Chip "Super Cycle" May Extend to 2027

Wallstreetcn
2026.05.02 05:17

AI inference architectures are reshaping the landscape of memory demand: CPUs are evolving from supporting roles into "AI coordinators." As manufacturers such as Intel raise DRAM configurations to 400GB, server-side demand for DDR5 is surging, with spot prices rising 2.8% month-on-month in April. The current DRAM supply gap stands at approximately 10% of demand, suggesting the end of the memory "super cycle" may be delayed from 2026 to 2027.

The transformation of AI inference architectures is reshaping the demand landscape for memory chips, and the duration of this supply-demand imbalance may exceed previous market expectations.

According to a Saturday report by The Seoul Economic Daily, server-side demand for DDR5 is expanding rapidly as manufacturers such as Intel launch AI CPUs equipped with up to 400GB of memory. Analysts point out that the existing production capacity of Samsung Electronics and SK Hynix struggles to keep pace with the combined pull from both GPU and CPU demand, and the DRAM supply shortage is expected to persist until 2027.

Market signals are already reflected in spot prices. According to Korean securities data, the spot price for DDR5 (16GB) rose 2.8% month-on-month in April, while traditional DDR4 fell by 16% over the same period, widening the price gap between the two.

Industry insiders state that the current supply gap in the DRAM market amounts to approximately 10% of demand. As general-purpose DRAM demand rises in tandem with HBM demand, the end of the memory chip "super cycle" may be delayed from the previously expected 2026 to 2027.

CPUs Leap to Become "AI Coordinators," Doubling Memory Demand

The core driver of this demand expansion lies in the AI industry's strategic shift from training to inference.

In the past, AI data centers built their computing infrastructure around GPUs, with server configurations typically featuring eight GPUs paired with one CPU, focusing on large-scale parallel training tasks. However, as inference scenarios become increasingly complex, the role of the CPU is being upgraded from an auxiliary processor to an "AI coordinator"—responsible for scheduling multiple AI agent systems, managing outputs from various modules, and orchestrating overall workflows.

The key to this role transition is "contextual memory." CPUs must save and reference the outputs of various AI agents in real time to coordinate the complete inference process, making large-capacity memory a hard requirement. Intel executives stated in a recent earnings conference call that the computing power ratio between CPUs and GPUs in AI inference infrastructure has evolved from the previous 1:8 to 1:4, and is narrowing further toward 1:1.

In this context, CPU manufacturers are increasing the DRAM configuration of AI CPUs to 300–400GB, up to roughly four times the 96–256GB configurations of traditional CPU products.

Combined GPU and CPU Demand Widens DDR5 Supply-Demand Gap

The competition for memory capacity is spreading from the GPU side to the CPU side, leading to snowballing demand growth.

On the GPU side, NVIDIA's next-generation AI chip, "Vera Rubin," features 288GB of memory through eight stacks of HBM, while AMD's next-generation GPU, the MI400, reaches a memory capacity of 432GB. Google's newly released eighth-generation Tensor Processing Unit, TPU 8i, is also equipped with 288GB of HBM.

On the CPU side, once Intel's "Xeon" and AMD's "Epyc" series AI CPUs begin widespread adoption of DDR5 configurations as high as 400GB, the supply-demand imbalance in general-purpose DRAM will intensify further. Unlike HBM, which is primarily supplied by a few manufacturers such as SK Hynix, the expansion of DDR5 demand will directly impact the supply balance of the entire general-purpose DRAM market.

Price divergence in the spot market clearly reflects this structural change: DDR5 prices are strengthening against the trend, while DDR4 prices remain under pressure. The contrasting market performance of these two product categories highlights the accelerating trend of demand shifting toward the new generation standard.

Samsung and SK Hynix Face Capacity Pressure; Super Cycle Expectations Revised Upward

Constraints on the supply side make it difficult to alleviate this memory shortage in the short term.

As major global DRAM suppliers, Samsung Electronics and SK Hynix have historically seen their capacity expansion rhythms constrained by fab construction cycles and yield ramp-ups for advanced processes. With HBM capacity largely locked in, the effective production space available for general-purpose DDR5 at these two companies is relatively limited, making it difficult to quickly respond to the incremental demand brought by AI CPUs.

Industry insiders point out that overall supply in the DRAM market currently falls short of demand by roughly 10%. Prices for commodity DRAM have already more than doubled from their lows, driving record profits for memory manufacturers such as Samsung and SK Hynix. As demand from both the GPU and CPU sides continues to stack, market expectations for the duration of the super cycle are being revised upward—extending from the originally projected 2026 to 2027, suggesting that the boom cycle for the memory chip industry may last longer than previously anticipated.