Cars also need to compete for memory chips

Wallstreetcn
2025.12.09 00:00

On October 30, 2025, NVIDIA CEO Jensen Huang met with the chairmen of Samsung and Hyundai Motor in Seoul, a gathering widely read as a signal about memory chip supply. As the global automotive industry transforms vehicles into smart computing platforms, demand for memory chips is surging. Li Auto supply chain vice president Meng Qingpeng has warned that the automotive industry will face a memory chip supply crisis in 2026, with the fulfillment rate possibly falling below 50%. The storage capacity of smart vehicles is advancing toward the TB level, and market demand is upgrading rapidly.

On October 30, 2025, at 7:30 PM, at the "Kkanbu Fried Chicken" restaurant in Samseong-dong, Gangnam District, Seoul, South Korea, Jensen Huang, co-founder, president, and CEO of NVIDIA, had fried chicken and beer with Samsung Chairman Lee Jae-Yong and Hyundai Motor Chairman Chung Eui-sun.

The sight of these three tycoons in crew-neck shirts, sampling ordinary life for an evening, spread wildly across global social media.

When the gathering ended and they were leaving, Jensen Huang told the two chairmen, "Today is the best day of my life." Lee Jae-Yong jokingly said, "Finding happiness is actually very simple—eating good food and having a drink with good people is happiness."

In retrospect, Lee Jae-Yong's remark may have been a touch of chicken soup for the soul, but AI mogul Jensen Huang's joy was likely sincere: he had probably just secured another batch of memory chips from Samsung. Chung Eui-sun apparently offered no parting words, presumably because Hyundai Motor Group's memory chip supply faces no such worries.

The power chip crisis triggered by Nexperia (Anshi Semiconductor) has yet to fully subside, and the spotlight has already shifted to memory chips. Behind the light-hearted banter lies a global scramble for memory chips driven by explosive demand for AI computing power.

"In 2026, the automotive industry will face an unprecedented supply crisis for memory chips, with a fulfillment rate possibly below 50%." On December 6, Meng Qingpeng, vice president of the supply chain at Li Auto, raised concerns and worries about the supply of memory chips in the industry during a speech at the 2025 New Automotive Technology Cooperation Ecosystem Exchange Conference.

Why do cars need memory chips?

Today, the global automotive industry is undergoing a historic transition from traditional mechanical vehicles to "mobile intelligent computing platforms." This transformation has led to an explosive growth in the demand for data generation, processing, storage, and transmission in vehicles.

As BEV + Transformer large models become standard for advanced driver assistance systems and VLA end-to-end large models penetrate complex scenarios, the shortcomings of traditional in-vehicle storage in terms of bandwidth, latency, and power consumption are gradually being exposed.

Current data shows that the flash memory capacity of a single high-end intelligent connected vehicle generally ranges from 64GB to 256GB. With the improvement of in-vehicle sensor accuracy, the promotion of edge large models, and the enrichment of in-vehicle entertainment functions, the storage capacity of future intelligent vehicles will move towards the TB level.

Under this trend, market demand for in-vehicle storage products is rapidly upgrading toward higher performance, larger capacity, and lower cost. Driven by these combined factors, the automotive industry's demand for memory chips has reached an unprecedented height.

But why does the surge in demand from AI servers leave the automotive industry short of memory? Does a fulfillment rate below 50% mean carmakers' deliveries will be halved next year? How will per-vehicle costs and order deliveries be affected? A series of questions is emerging: how should Chinese carmakers face the unfolding "chip war"?

AI Triggers a New Cycle in Memory Chips

According to Automotive Business Review, the memory chip supply crisis now facing the automotive industry largely follows the pronounced 3-5 year cyclical pattern of the memory chip industry, which has completed three full cycles since 2012:

From 2012 to 2015, the industry benefited from the rise in demand driven by the popularity of smartphones, followed by an oversupply due to concentrated production expansion, completing the first cycle;

From 2016 to 2019, the promotion of 3D NAND technology and the widespread adoption of DDR4 memory drove demand, but later, due to capacity release and a slowdown in consumer electronics demand, prices fell;

From 2020 to 2024, the pandemic-era "stay-at-home economy" spurred demand, but once the pandemic eased, the industry slid into a downturn on overcapacity and weak demand.

Entering 2025, with OpenAI locking in a monthly supply of up to 900,000 DRAM wafers for its "Stargate" project (approximately 40% of global production), and with SK Hynix and Samsung Electronics posting record profits on their HBM business, a new cycle in the memory chip industry has clearly arrived.

As one of the largest segments of the semiconductor industry, memory chips generated estimated global sales of approximately $165.5 billion in 2024, accounting for over a quarter of the total semiconductor market.

By application type, memory chips are divided into volatile memory (DRAM, SRAM) and non-volatile flash memory (NAND, NOR, etc.), with DRAM and NAND together accounting for over 99% of the memory market.

Currently, the global DRAM market is dominated by three companies: South Korea's Samsung and SK Hynix, and the United States' Micron (a combined market share of over 95%), while the NAND market is dominated by five: Samsung, SK Hynix, Kioxia, Micron, and SanDisk (a combined market share of over 92%).

The memory chip industry exhibits a typical oligopolistic structure, both technology- and capital-intensive: the giants maintain their lead through process shrinks, architectural innovation, and annual R&D outlays of tens or even hundreds of billions of dollars, rely on massive production capacity to dilute costs, and use "counter-cyclical investment" to squeeze out competitors, ultimately cementing their dominance.

For a long time, memory chip giants like Samsung and SK Hynix forecast demand across different scenarios and balanced the allocation of DRAM and NAND capacity among consumer electronics, data centers, industrial control, and automotive.

Automotive electronics, an important but only steadily growing sub-sector, has received a relatively limited share of that capacity. Under this model, automakers typically obtained stable memory supply and pricing through traditional channels such as Tier 1 suppliers.

In the past, the strongest demand for memory chips was concentrated in PCs, smartphones, and traditional data centers; now, the AI infrastructure frenzy launched by Silicon Valley giants has completely disrupted that market structure and is rewriting the supply-demand logic of memory chips.

AI servers, especially those used for large model training, have created explosive demand for high-bandwidth, large-capacity memory. The DRAM capacity of a single AI server is 8-10 times that of a regular server, and the market size for high-bandwidth memory (HBM) necessary for training large models is expanding at an astonishing rate.

Data from TrendForce show total HBM consumption reaching 6.47 billion Gb in 2024, a year-on-year increase of 237.2%, and an estimated 16.97 billion Gb in 2025, a further year-on-year increase of 162.2%.
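For readers who want to verify those growth rates, a quick arithmetic sketch in Python follows (the 2023 baseline is not a TrendForce figure; it is merely implied by the reported 2024 volume and growth rate):

# Sanity check of the HBM consumption figures quoted above.
# Assumption: the growth rates are simple year-on-year percentages of total Gb consumed.

hbm_2024_gb = 6.47e9   # reported total HBM consumption in 2024, in Gb
hbm_2025_gb = 16.97e9  # estimated total HBM consumption in 2025, in Gb

# 2023 baseline implied by the reported +237.2% growth in 2024
implied_2023_gb = hbm_2024_gb / (1 + 2.372)

# 2025 growth implied by the two absolute figures
implied_2025_growth = hbm_2025_gb / hbm_2024_gb - 1

print(f"Implied 2023 consumption: {implied_2023_gb / 1e9:.2f} billion Gb")  # roughly 1.92 billion Gb
print(f"Implied 2025 growth: {implied_2025_growth:.1%}")  # roughly 162.3%, consistent with the reported 162.2%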

More critically, HBM's complex stacking structure requires roughly three times the wafer capacity of DDR5, so every new unit of memory demand from the AI field absorbs even more of the capacity that could otherwise serve other sectors.

"We are witnessing an unprecedented market structure." Morgan Stanley stated in a research report that the core of this round of memory demand revolves around the "arms race" of AI data centers and cloud service providers, whose sensitivity to price is far lower than that of traditional consumers, and they are more concerned about whether they can obtain sufficient computing power infrastructure to support the development of large language models and AI applications.

This means that the AI field demonstrates a strong price tolerance in the procurement of memory chips.

Take NVIDIA as an example: its Rubin-architecture data center GPUs, set to launch in 2026, have already locked in a large supply of HBM4, and its agreement with SK Hynix puts the unit price of HBM4 at $560, an increase of over 50% compared with the previous-generation HBM3E.

NVIDIA has also begun to procure Samsung's fifth-generation high-bandwidth memory HBM3E for its latest AI acceleration chip GB300. Additionally, it is reported that about half of the supply orders for NVIDIA's 2026 SOCAMM2 memory module come from Samsung.

How much NVIDIA is spending with Samsung is unclear. Morgan Stanley's latest research report indicates that server DRAM prices surged nearly 70% in the fourth quarter of 2025, while contract prices for NAND (flash memory) rose 20%-30%.

SK Hynix drove its gross margin to 57% in Q3 2025 with high-end products such as 12-layer stacked HBM3E and server DDR5, with operating profit exceeding 10 trillion won for the first time; thanks to a similar high-end portfolio, Samsung's operating profit jumped 158.55% quarter on quarter.

Overlooked "Early Warning"

It is not hard to see that this new, AI-driven memory chip cycle is essentially high value-added applications squeezing out and replacing traditional capacity.

The production capacity of Samsung Electronics, SK Hynix, and Micron Technology was previously divided into four categories: servers, PCs, mobile phones, and other fields such as automotive, with servers accounting for 55%-60% and the other three categories combined accounting for 30%-35%.

By 2025, the three manufacturers' capacity allocation has tilted sharply toward servers, rising to 70%, while the other categories have shrunk by a combined 10%-15%, directly tightening memory chip supply for mobile phones, PCs, consumer electronics, and automotive.

For Samsung Electronics, SK Hynix, and Micron Technology, the choice is not complicated: AI-related HBM and DDR5 products carry profits several times those of traditional memory, naturally making them the priority for capital expenditure and advanced-process capacity.

In the highly watched HBM field, the expansion actions of leading manufacturers are particularly aggressive. SK Hynix, leveraging its first-mover advantage, has doubled its HBM capacity by 2025, capturing over 60% of the global market share and almost monopolizing NVIDIA's HBM orders.

To maintain this advantage, its capacity expansion for NVIDIA's HBM4 demand is proceeding on schedule: the M15X fab in Cheongju, North Chungcheong Province, South Korea, is slated to begin production in the fourth quarter of 2025.

Samsung Electronics follows closely, shifting its 1y-nanometer 16Gb DDR4 capacity to DDR5 and lifting gross margin by 12 percentage points over the DDR4 era. On December 2, Samsung announced it had completed production readiness approval (PRA) for HBM4 and is pushing hard to enter NVIDIA's supply chain.

Micron Technology plans to invest 1.5 trillion yen (approximately 9.6 billion USD) to start building a new factory dedicated to HBM production at its existing site in Hiroshima, Japan, in May 2026, with mass shipments expected by 2028, focusing on supplying top global cloud service providers and AI infrastructure companies such as OpenAI, Google Cloud, and Microsoft Azure.

Earlier this year, Samsung was the first to announce DDR4 production cuts and asked customers to confirm orders by June. SK Hynix likewise plans to compress its remaining DDR4 capacity to below 20% of the total and to cease production in April 2026. In the NAND market, manufacturers are also cutting consumer-grade capacity and shifting toward enterprise-grade 3D QLC products.

A procurement manager at a leading automaker told Automotive Business Review: "In fact, there were signs six months ago, around May and June this year. It took some courage to anticipate this and position in advance. Looking for solutions now is closing the stable door after the horse has bolted; the effect is limited."

He said, "Companies like NVIDIA need to build many large servers, which require computing power support, and computing power must be paired with storage; these two are bound together. Therefore, players in the AI industry started early and seized a large amount of capacity. After all, the core of the AI competition is computing power and electricity; they must firmly grasp these two aspects first."

These capacity adjustments by the major manufacturers have left high-end HBM and DDR5 still unable to meet the huge incremental demand from AI servers, while mid-to-low-end DDR4 has fallen into severe shortage because production has been cut faster than demand has declined.

"Automotive Business Review" learned that this has caused the already tight overall DRAM capacity to completely collapse, even leading to a "price inversion" phenomenon where the price of the previous generation product (DDR4) surpasses that of the new generation product (DDR5) by a factor of two.

DDR4 happens to match the actual data-processing and storage needs of today's automotive systems, and its mature technology and supply chain give it more than 60% of the current automotive memory market.

This means that in high-end automotive-grade chips, carmakers must compete head-on with the "AI infrastructure maniacs" to secure a future for high-end intelligent connected vehicles, while for "entry-level" capacity such as DDR4 they face the three major manufacturers' strategic retreat.

The outcome of this head-on contest is all but certain. The automotive industry has long been unable to match technology companies on either bid prices or shipment volumes, so the three major manufacturers will inevitably favor the tech companies.

Regarding this strategic contraction, Automotive Business Review believes it cannot be ruled out that memory chip manufacturers are acting in concert to raise prices on products such as DDR4; after all, they suffered heavy losses in the past.

This is also partly why many carmakers overlooked the "early warning" from the supply side: the enormous demand from AI servers had not yet fully materialized, so many assumed the memory makers were simply "talking up prices" and chose to keep watching.

At the 2025 WNAT-CES New Automotive Technology Cooperation Ecosystem Exchange Conference, many domestic OEMs and Tier 1 suppliers said the memory chip shortage will become a definite risk in 2026, one that will remain unsolved throughout the year and will hit high-end and flagship models hardest.

It Seems There Are Thousands of Roads

"Automotive Business Review" learned that, apart from the capacity locked in advance with a few companies like Lenovo and Xiaomi, almost all of the three major manufacturers' capacity for 2026 has already been claimed.

Since the shortfall is a given, Li Jiaqiu, product manager at Beidou Zhiliang Technology, estimates that at current memory chip prices, per-vehicle cost will rise by about 500 yuan for mid-to-low-end models and by around 1,000 yuan for high-end intelligent connected vehicles, which need more memory.

Well-known procurement managers in the industry say this wave is unlikely to cool before Q3 next year unless the AI bubble bursts. Procurement leads at some automakers, however, state plainly that AI demand will certainly stay strong over the next two years and that the memory chip capacity allocated to the automotive industry next year will certainly fall short.

So, will the shortage of memory chips lead to widespread and long-cycle delivery delays across the entire industry? The industry generally believes this is unlikely, but it does not rule out the possibility of some automotive companies experiencing delayed capacity delivery.

The grounds for this optimism, however, do not mainly come from domestic substitution.

Song Yang, chairman of Zhixing Technology, stated that some companies, including theirs, are already experimenting with domestic storage chips, involving chips from Changxin, Jiangbolong, ISSI, Jingcun, and others.

Automotive Business Review consulted several industry insiders, who generally believe that domestic memory chip companies still lag the international giants significantly in high-end products. In mid-to-low-end lines such as DDR4, domestic manufacturers led by Changxin Memory have relatively mature technology, but they have been conservative about DDR4 capacity because the upgrade to DDR5 is imminent.

That leaves buying at high prices as the main remaining option; "there is nothing Chinese buyers cannot buy," these insiders said. As for the causes, Wang Shuyi, chief analyst at Techsugar, believes this round of memory chip shortage stems both from the systemic pull of new AI demand and, to a large extent, from market manipulation by the three major manufacturers.

He said, "As long as the profits are large enough, all industrial products may face oversupply. However, because the bulk storage market is currently very monopolized, with only a handful of players, their ability to manipulate prices is relatively strong, and it won't experience long-term, widespread deep losses like our photovoltaic industry."

After all, the past two years coincided with a down cycle, and as late as the second quarter of this year the three manufacturers were still carrying significant losses. By ordinary business logic, in the relatively closed world of memory chips, what was lost there must eventually be recovered there. Starting in Q4 2024, the three began, in step, to delay shipments and raise prices; DDR4 and DDR5 module prices are now 5-6 times what they were at the beginning of the year, setting off a buying frenzy in the market.

Industry insiders point out that the three major manufacturers usually publish order quotes at the start of each quarter, yet this year's Q4 quotes were not released until early December. It is hard to argue that this was not a case of holding out for a better price. And if they are holding out for a better price, chips can still be bought, just later and at higher prices, all of which will feed directly into automakers' costs.

To cope with the memory chip problem, Automotive Business Review believes manufacturers may trim configurations, for example lowering compute and storage specs. High-end models could be adjusted so that the memory chips originally planned for one vehicle might cover two. This is an adjustment automakers can make on their own.

Moreover, for carmakers the impact of this memory crisis is not uniform but clearly differentiated by model: traditional vehicles, with relatively low memory requirements, are less affected, while smart flagship models equipped with advanced driving features are hit much harder.

A procurement manager at one carmaker said many models previously used DDR4 and the user experience was very good; the highest-end parts are not strictly necessary. High-end memory chips let models be sold at a premium as high-end configurations, but for ordinary consumers the current assisted-driving functions already meet daily needs.

Over the longer term, carmakers can also use their own scale and influence to build solid partnerships directly with international giants or domestic suppliers. The procurement manager noted that the memory chip industry has its own "rules of the game": price matters, but relationships and long-term, stable cooperation models also determine how chips are allocated.

A typical case is the $16.5 billion, eight-year chip foundry agreement Tesla signed with Samsung. Long-term contracts may mean higher costs, but for Tesla a stable supply is what keeps its push into intelligence from being interrupted.

Another approach is for car companies to secure product orders early through more influential Tier 1 suppliers, allowing them to expand their supplier base and avoid excessive reliance on a single source, thereby effectively reducing the risk of supply interruptions.

However, distant water cannot quench an immediate thirst. Automotive Business Review has learned that procurement managers from some carmakers have already flown to South Korea to meet with Samsung, even dining in the same restaurant, in the same spot, where Jensen Huang, Lee Jae-Yong, and Chung Eui-sun had their fried chicken, hoping to secure more orders.

"Going is better than not going," he said.

Risk Warning and Disclaimer

The market has risks, and investment requires caution. This article does not constitute personal investment advice and does not take into account individual users' specific investment goals, financial situations, or needs. Users should consider whether any opinions, views, or conclusions in this article fit their specific circumstances. Invest accordingly at your own risk.