
"It Used to Be Energy, Now It's Memory!" – OpenAI COO on "AI Bottlenecks"
Memory chips have officially replaced electricity as the most pressing bottleneck for AI expansion. OpenAI COO Brad Lightcap's remark that "now the bottleneck is memory" reveals a structural shift in the core conflict of the AI arms race. SK Hynix Chairman Chey Tae-won has further warned that the shortage will persist until around 2030, with a wafer supply gap exceeding 20%, putting simultaneous pressure on traditional DRAM and HBM.
Memory chip shortages are replacing power supply as the primary constraint on AI infrastructure expansion.
OpenAI Chief Operating Officer Brad Lightcap stated on Tuesday at the Hill and Valley Forum in Washington that memory chip shortages and tight U.S. energy supply are the two potential bottlenecks currently facing AI infrastructure expansion. "The bottleneck now is memory, whereas before it was power," he stated bluntly on stage.
This statement aligns closely with an earlier assessment by SK Hynix Chairman Chey Tae-won, who predicted that the global memory chip shortage will last until around 2030, with an industry wafer supply gap of over 20%.
For the market, this means the core conflict of the AI computing power arms race is undergoing a structural shift: from data center site selection and grid capacity to memory chip supply chain security and production capacity bottlenecks. This further elevates the strategic importance of AI accelerator suppliers such as Nvidia and High Bandwidth Memory (HBM) manufacturers.
Memory Replaces Power as the New Bottleneck for AI Expansion
Lightcap's remarks mark a clear shift in the AI industry's perception of infrastructure constraints. For the past two years, insufficient power supply for data centers was the industry's most discussed concern; now, the memory chip shortage has overtaken it as the more urgent practical hurdle.
The root of this shortage lies in explosive demand growth. AI companies such as OpenAI are purchasing Nvidia AI accelerators on a large scale, and each accelerator carries a substantial amount of memory, consuming a considerable share of global memory capacity. Lightcap noted that OpenAI is actively broadening its supplier base and expanding the geographic footprint of its data centers to ensure that infrastructure expansion plans are not constrained by any single supply chain.
According to Bloomberg, OpenAI has committed to investing $1.4 trillion over the coming years in data center construction and chip procurement to support the development of more advanced AI systems and wider adoption of the technology.
Shortage May Persist Until 2030, Traditional DRAM Also Faces Pressure
Previously, SK Hynix Chairman Chey Tae-won stated at Nvidia's GTC conference that the global memory chip shortage is expected to continue for another four to five years, as expanding wafer production capacity requires at least the same amount of time, making it difficult for major memory manufacturers to fully meet market demand before 2030.
Chey also warned that the industry's excessive focus on High Bandwidth Memory (HBM) could trigger a shortage of traditional DRAM, which in turn could affect the smartphone and PC markets. In recent years, SK Hynix, Samsung, and Micron have all shifted a considerable portion of their capacity to HBM for AI accelerators, leading to a decline in traditional DRAM output and driving up prices for consumer electronics.
SK Hynix currently holds approximately 57% of the global HBM market and about 32% of the overall DRAM market. The company is building a $13 billion HBM packaging and testing plant at its Cheongju campus in South Korea, with construction set to begin next month and completion targeted for the end of 2027.
Energy Issues Unresolved, Nuclear Power and Government Support Highly Anticipated
Although memory has become the most pressing current bottleneck, the pressure on energy supply has not diminished. Lightcap said OpenAI is weighing a more diversified mix of power sources, including nuclear energy, to meet continuously rising energy demand, and revealed that the company is in talks with fusion startup Helion Energy about a partnership.
Notably, OpenAI CEO Sam Altman was previously a supporter of Helion and announced on Monday that he was resigning from Helion's board of directors while recusing himself from negotiations between the two companies.
Lightcap emphasized that government investment in energy supply is "crucial" for the success of the AI industry and highly praised the Trump administration's initiatives to promote AI infrastructure construction and accelerate government adoption of AI technologies.
On the government business front, Lightcap disclosed that OpenAI currently serves over one million employees across U.S. local, state, and federal governments, and said the company has made delivering products to federal agencies a strategic priority. Late last month, OpenAI reached an agreement with the U.S. Department of Defense to deploy its AI models on the Pentagon's classified networks; the Pentagon had previously announced the termination of its partnership with competitor Anthropic PBC. Lightcap described the capability to serve the government as a "critical" direction for the company.
