Broadcom may be the first "disruptor" to break NVIDIA's AI monopoly! The stock price is poised to challenge $200

Zhitong
2024.12.02 08:01

Broadcom is seen as a potential "disruptor" that could break NVIDIA's AI monopoly, with Wall Street analysts bullish on its stock reaching $200. Although Broadcom's shares have fallen more than 15% since October 9, its forward PEG ratio of 1.69 is relatively attractive. The market has not fully priced in Broadcom's potential, with its customized AI chip market expected to grow at a rate of up to 50%.

Broadcom (AVGO.US), the core supplier of Ethernet switch chips for large global AI data centers as well as of customized AI chips, has been a central focus of the artificial intelligence investment wave since 2023, second in investor enthusiasm only to AI chip giant NVIDIA (NVDA.US). Recently, however, its stock has underperformed both the XLK and the Philadelphia Semiconductor Index, dropping more than 15% from its October 9 all-time high. Is this a trap, or an opportunity?

Since October 2024, Broadcom, one of the biggest winners of the global AI boom, has seen its stock lag behind the Technology Select Sector SPDR Fund (XLK.US), the ETF tracking popular U.S. tech stocks, and the Philadelphia Semiconductor Index (SOX), the benchmark for U.S. chip stocks.

Wall Street analysts conclude that the logic behind this is mainly rotation: investors are reallocating funds from popular AI hardware semiconductor stocks into relatively under-hyped SaaS software stocks that also benefit from the unprecedented AI frenzy, especially software names that have already achieved actual profits, such as AppLovin (APP.US), Snowflake (SNOW.US), and ServiceNow (NOW.US).

In terms of PEG valuation, Broadcom's forward adjusted PEG ratio of 1.69 looks relatively attractive against the median of 1.89 for the entire U.S. tech sector. Furthermore, although hyperscale data centers are expected to pour significant capital expenditure into AI hardware, including AI GPUs and customized AI chips, the market still appears to price in NVIDIA retaining over 90% dominance, and has not fully priced in Broadcom's potential to challenge that leading position, with Broadcom's customized AI chip market expected to grow at a rate of up to 50%.
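The PEG comparison above can be sketched as simple arithmetic. This is a back-of-the-envelope illustration only: the PEG figures (1.69 and 1.89) come from the article, but the 20% growth-rate input used to back out an implied P/E is a hypothetical assumption, not a reported number.

```python
# Back-of-the-envelope PEG comparison. The PEG values are from the article;
# the 20% EPS growth rate below is a purely illustrative assumption.

def peg_ratio(forward_pe: float, eps_growth_pct: float) -> float:
    """PEG = forward P/E divided by expected EPS growth rate (in %)."""
    return forward_pe / eps_growth_pct

# If Broadcom's forward PEG is 1.69 at a hypothetical 20% growth rate,
# the implied forward P/E would be 1.69 * 20 = 33.8.
implied_pe = 1.69 * 20

# Broadcom's 1.69 PEG vs. the 1.89 sector median is roughly a 10.6% discount.
discount = 1 - 1.69 / 1.89

print(f"implied forward P/E at 20% growth: {implied_pe:.1f}")
print(f"PEG discount to sector median: {discount:.1%}")
```

A lower PEG at comparable quality is the article's basis for calling the valuation "relatively attractive": the same multiple of expected growth costs less.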

Broadcom—The strongest force in the customized AI chip field, Wall Street strongly bullish to $200

After Broadcom's earnings report in September, some Wall Street investment firms urged investors not to heed market concerns about its forward guidance: unlike NVIDIA or TSMC, most of Broadcom's business exposure is not closely tied to artificial intelligence. Broadcom's fiscal third-quarter results for the period ending August 4, announced in early September, showed booming demand for AI-related chips, while non-AI areas such as analog chips and the industrial sector remain in their "darkest hour"; without the AI contribution, the company's performance would have been very bleak. Even so, Broadcom's stock has roughly kept pace with the S&P 500 Index (SPX)(SPY), and Wall Street's average target price for Broadcom has reached $200, implying potential upside of up to 23% over the next 12 months.

With strong demand for its Ethernet switch chips from major data centers worldwide, and absolute technological leadership in inter-chip communication and high-speed data transmission, Broadcom has in recent years been the most important player in the customized AI chip field. For Google's self-developed server AI chip, the TPU AI acceleration chip, Broadcom plays a core role, collaborating with the Google team to develop both the TPU and its AI training/inference acceleration libraries.

Broadcom's Ethernet switch chips are primarily used in data centers and server clusters, where they process and route data streams efficiently and at high speed. These chips are essential for building AI hardware infrastructure, ensuring high-speed data transfer between GPU processors, storage systems, and networks. That capability is crucial for generative AI applications like ChatGPT, especially workloads that ingest large volumes of data and demand real-time processing, such as the DALL-E text-to-image and Sora text-to-video large models. More importantly, Broadcom has become the most significant player in customized ASIC chips for AI: Google chose Broadcom to design and develop its customized AI chips, and tech giants like Microsoft and Meta, along with more data center operators, are expected to partner with Broadcom over the long term to create high-performance ASICs.

In addition to chip design, Broadcom also provides Google with critical inter-chip communication intellectual property and is responsible for manufacturing, testing, and packaging new chips, thereby safeguarding Google's expansion into new AI data centers.

The strong demand for AI-related Ethernet switch chips and customized ASIC AI chips is evident in Broadcom's robust revenue data, which has consistently exceeded expectations in fiscal year 2023 and thus far in fiscal year 2024. In particular, customized AI chips have become an increasingly important revenue source for Broadcom, with market news indicating that U.S. tech giants Microsoft and Meta (Facebook's parent company) will choose Broadcom as the core partner for their self-developed chip programs. Meta previously collaborated with Broadcom on the first and second generations of its AI training acceleration processors, and Broadcom is expected to accelerate development of Meta's next-generation AI chip, MTIA 3, in the second half of 2024 and into 2025.

Therefore, despite the relatively weak stock performance recently, most analysts believe investors need not be pessimistic about Broadcom's share price and performance outlook. Analyst ratings and target prices compiled by TipRanks show Broadcom's average 12-month target price is as high as $200, with a consensus rating of "Buy" and no "Sell" ratings issued. Among them, Cantor Fitzgerald recently raised its Broadcom target price from $200 to $225, while Mizuho significantly raised its target from $190 to $220. In comparison, Broadcom's stock closed at $162.08 last Friday.
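The implied upside figures follow directly from the target prices and the $162.08 close quoted above. A minimal sketch of that arithmetic, using only numbers from the article:

```python
# Implied upside from Wall Street target prices vs. the quoted last close.
# All figures are taken from the article; nothing here is a forecast.
last_close = 162.08

def implied_upside(target: float, price: float) -> float:
    """Fractional gain if the stock rises from `price` to `target`."""
    return target / price - 1

for firm, target in [("Consensus average", 200.0),
                     ("Mizuho", 220.0),
                     ("Cantor Fitzgerald", 225.0)]:
    print(f"{firm}: {implied_upside(target, last_close):.1%} upside")
```

The $200 consensus target works out to roughly 23.4% above the $162.08 close, matching the "up to 23%" upside cited earlier in the article.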

Customized AI Chips - NVIDIA's Strongest Competitor! The market expects stronger AI performance guidance

Broadcom will announce its fourth-quarter earnings on December 12, Eastern Time. Given the performance guidance that significantly disappointed the market last quarter, investors are expected to scrutinize the semiconductor leader's AI-related revenue prospects closely. Broadcom CEO Hock Tan expressed optimism during the September 2024 conference call, highlighting surging AI infrastructure capital expenditure and demand for customized AI chips from leading hyperscale cloud computing companies such as Microsoft, Amazon AWS, and Google.

The data center AI GPUs sold by NVIDIA are the core infrastructure hardware driving globally popular generative AI tools like ChatGPT, but Broadcom also benefits broadly from the AI boom by supplying related AI hardware components and software. Broadcom's Ethernet switch chips, along with the customized AI ASIC chips it develops with giants like Google, are particularly crucial both for companies seeking to build generative AI applications similar to ChatGPT and for tech giants like Google and Microsoft expanding AI data center computing power while pursuing cost-effectiveness and urgently scaling AI computing resources.

As observed during Broadcom's recent third-quarter earnings season, this momentum continues. Therefore, hyperscale cloud computing giants are continuously increasing their investments in AI chips to maintain their competitive edge in the rapidly evolving AI race. Morgan Stanley, a major Wall Street firm, believes that the AI capital expenditure of hyperscale cloud computing giants could soar to $300 billion by 2025. NVIDIA's recent earnings meeting also emphasized the "crazy" demand from large data center operators transitioning from the Hopper architecture to the Blackwell architecture AI GPUs.

Thus, can Broadcom's latest guidance beat the $12 billion in fiscal-2024 AI revenue it projected in September? Wall Street analysts believe there are sufficient reasons to trust CEO Hock Tan and his team. As large-scale AI clusters (more than 100,000 AI GPUs) become the focus, AI hardware infrastructure such as AI GPUs and customized AI ASIC chips is increasingly the central bet in the race for AI supremacy, and the construction of clusters with hundreds of thousands of AI GPUs is already visible; that assessment is not misleading. However, AI leaders and hyperscale cloud computing giants face growing pressure over the availability and spiraling cost of AI computing resources.

Recently, multiple media outlets have reported that although companies focused on large artificial intelligence models are eager to improve their cutting-edge large language models, the performance scaling of AI is slowing. As a result, companies may pay more attention to AI inference and cost-effectiveness. Custom AI chips, which excel at data-intensive matrix calculations and sustained high-load inference tasks and are tailored to their owners' large models and cloud architectures, offer cost and energy-efficiency advantages on the one hand, and on the other relieve the severe shortage of AI computing resources caused by the high demand for NVIDIA's Blackwell and Hopper architecture AI GPUs (GB200/B200/H100/H200).

As artificial intelligence companies and hyperscale cloud computing giants reassess their financial capabilities for investing in emerging AI clusters, Broadcom's expertise in customized AI chips is expected to lead the next phase of explosive growth in performance. Given that designing and producing customized AI chips requires achieving a certain scale to realize cost-effectiveness, Broadcom's expertise in chip interconnects and its long-term focus on customized chip design are expected to be key advantages, as the demand for customized AI chips is anticipated to increase significantly.

Broadcom is also collaborating with partners like Google to break NVIDIA's absolute monopoly on the AI training/inference programming acceleration library ecosystem based on CUDA. Broadcom typically works with partners to create software development kits (SDKs) and hardware-software co-acceleration libraries specifically tailored for its AI chips. These SDKs and acceleration libraries are usually designed to optimize the performance of Broadcom's self-developed AI hardware, similar to NVIDIA's CUDA-based libraries like cuDNN and TensorRT, which are used for AI training and inference acceleration.

Broadcom CEO Hock Tan shared in September that customized AI chips account for two-thirds of its AI revenue. Additionally, Broadcom has established solid partnerships for customized AI chips and Ethernet switch chips with three hyperscale data center operators. The geopolitical headwinds faced by technology giants from China are expected to drive some of the most valuable tech companies in the country to turn to Broadcom for developing customized AI chips, as acquiring NVIDIA's AI chips becomes increasingly difficult and the pathways to obtain them become more complex.

OpenAI's expansion makes collaborating with Broadcom on customized AI chips a viable business that could improve operational efficiency and save costs. Some analysts therefore argue that Broadcom's assessment, that customized AI chips may occupy a more prominent position in hyperscale data center AI workloads within the next five years, is not unreasonable. Although NVIDIA is still expected to hold over 60% market share, the market for Broadcom's customized AI chips is projected to grow at a compound annual growth rate of up to 50%. Amazon AWS's experience deploying customized AI chips with its cloud computing customers shows that, as hyperscale data center operators compete to enhance cost competitiveness and massively expand computing resources, the customized AI chip business has high growth feasibility and long-term sustainability.

Although Broadcom is the market leader in the Ethernet chip sector, it may face tougher challenges as NVIDIA develops more cost-effective "AI hardware system bundles" to drive adoption of its own Ethernet and InfiniBand network products. Nevertheless, Broadcom's executives believe the semiconductor giant can continue to set the "industry standard," as its high-market-share Ethernet products demonstrate scalability and energy efficiency. Competition between NVIDIA and Broadcom is therefore expected to intensify, but Broadcom's advantage lies in its diversified business segments across different endpoints, which can cushion unexpected weakness in any single business.
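To make the 50% compound annual growth rate concrete: growth compounds multiplicatively, so the market would roughly multiply 7.6x over five years. The starting market size below is a purely hypothetical placeholder; the article gives only the growth rate, not a base figure.

```python
# What a 50% CAGR implies, compounded from a hypothetical base market size.
# The $10B base is illustrative only; the article does not state a base figure.

def compound(base: float, cagr: float, years: int) -> float:
    """Size after `years` of growth at annual rate `cagr` (0.5 = 50%)."""
    return base * (1 + cagr) ** years

base = 10.0  # hypothetical starting size, in $ billions
for y in (1, 3, 5):
    print(f"year {y}: ${compound(base, 0.50, y):.1f}B "
          f"({1.5 ** y:.2f}x the base)")
```

At 50% CAGR the multiplier after five years is 1.5^5 ≈ 7.59, which is why analysts treat a sustained 50% growth rate as transformative for Broadcom's customized AI chip business even from a smaller base than NVIDIA's.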