
AI "Shovel Seller"! Global chip companies' sales will exceed $400 billion in 2025, Goldman Sachs predicts NVIDIA's revenue will reach $383 billion next year

Analysts surveyed by FactSet expect that the combined sales of five companies—NVIDIA, Intel, Broadcom, AMD, and Qualcomm—will exceed $538 billion next year, not including sales from Google's TPU business and Amazon's custom chips. However, industry growth also faces significant constraints, including shortages of key data center components, difficulties in power supply, and questions about the sustainability of financing.
Driven by the explosive growth of artificial intelligence, global chip companies are expected to surpass $400 billion in combined sales in 2025, a record for the chip industry, and that figure is likely to climb further in 2026.
According to Goldman Sachs, NVIDIA alone is projected to achieve GPU and other hardware sales of $383 billion in 2026, a 78% increase from the previous year.
Analysts surveyed by FactSet expect the combined sales of NVIDIA, Intel, Broadcom, AMD, and Qualcomm to exceed $538 billion next year, not including sales from Google's TPU business and Amazon's custom chips.
Last week, NVIDIA signed a $20 billion licensing agreement with chip startup Groq, which focuses on AI inference acceleration. This marks a shift in the AI race from the training phase to the inference phase, with tech giants now competing to provide the fastest and most cost-effective inference capabilities.
These hardware designers are playing the role of "shovel sellers" in this digital gold rush; however, industry growth also faces significant constraints, including shortages of data center components such as transformers and gas turbines, difficulties in power supply, and questions about whether AI companies can continue to secure sufficient financing to maintain the pace of chip procurement.
Intensifying Competitive Landscape
NVIDIA achieved more than double year-on-year revenue growth in 2025, but the competitive landscape is changing.
Data center operators, AI labs, and enterprise customers have strong demand for NVIDIA's advanced H200 and B200 graphics processors, but Google's increasingly sophisticated custom TPU chips, along with Amazon's Trainium and Inferentia chips, are also vying for customers.
The $20 billion licensing agreement that NVIDIA reached with Groq last week reflects the industry's shift from AI training to inference. Inference refers to the process by which trained AI models respond to prompts and provide answers.
AI developers such as OpenAI are collaborating with custom chip designers such as Broadcom to design their own chips. AMD, a chipmaker with a half-century history, plans to launch its first GPU that can truly challenge NVIDIA's AI processors in 2026.
In October, Microsoft announced plans to double the footprint of its data centers over the next two years, which means chip manufacturers may see increased revenue in 2026.
Supply Chain Bottlenecks Highlighted
2026 may bring unprecedented challenges. Shortages of components such as transformers and gas turbines are hindering data center construction, and operators are also struggling to obtain the massive power needed to run computing clusters.
Another major challenge is the global shortage of components required for AI data center servers.
Products in short supply include ultra-thin silicon substrate layers needed for certain chips, as well as memory chips that deliver data to AI processors and help store computational results.
As data center construction accelerates and inference demand rises, the demand for high-bandwidth memory chips is surging. Micron Technology's Chief Business Officer Sumit Sadana stated:
"We are far from meeting customer demand, and this situation will persist for some time."
Micron is one of the largest manufacturers of the high-bandwidth memory chips used in the AI field, and its stock price has risen 229% so far this year.
Micron, along with competitors such as Samsung and SK Hynix, has become a major beneficiary of supply shortages, allowing them to raise product prices and increase capital expenditures to expand manufacturing operations. However, building large-scale clean rooms and manufacturing plants to meet the demands of major chip companies takes time.
Doubts About Financing Sustainability
There are serious doubts about the sustainability of financing behind data center construction, and it remains unclear whether major customers like OpenAI can quickly raise enough funds to maintain a rapid pace of chip procurement.
Investors have become accustomed to extraordinary revenue growth every quarter, and any sign of a slowdown can easily trigger panic. This fall, investors heavily sold off AI stocks, including those of major chip designers, worried that the financing driving purchases of AI infrastructure products may not be as solid as previously thought.
A significant portion of large-scale data center construction is driven by OpenAI, which has signed billion-dollar computing power agreements with Amazon, Microsoft, Oracle, and others. Companies like Microsoft have committed to ramping up data center construction by 2026, but some analysts believe this boom may slow down in 2027.
DA Davidson analyst Gil Luria stated:
"2026 could be the peak. If we haven't heard about OpenAI raising $100 billion by the end of March, the market may start to hit the brakes."
Concerns are also growing about profit margins coming under pressure as more chip companies launch AI products.
Broadcom's stock price still fell after announcing record quarterly revenue in December, partly due to investor concerns that sales growth in its high-margin product lines will slow. However, some industry insiders hold a more optimistic view, believing that demand will remain strong.
Brad Gastwirth, global research director at Massachusetts-based computing hardware distributor Circular Technologies, said:
"I don't think this will be the peak. The race for artificial general intelligence is still driving enormous demand for computing power from a wide range of customers."
