AI PC, a new wave of productivity (AMD 23Q3 conference call)

AMD (AMD.O) released its third-quarter 2023 earnings report (for the quarter ending September 2023) after US market hours on the morning of November 1, 2023. A summary of the conference call follows:

1. Incremental information from the $AMD.US conference call:

1) Expectations for MI300: In the first quarter of next year, we actually expect revenue to be roughly in the range of $400 million. Most of it will come from artificial intelligence, with only a small portion from high-performance computing.

2) Data center business: In the second half of the year, we expect the data center business to grow by about 50% compared to the first half.

3) PC market: Inventory levels are relatively normalized. The holiday season is a strong quarter for us. When considering the market size, I believe that from a consumer perspective, this year's PC shipments may be around 250 million to 255 million units.

4) Gaming and embedded: On the gaming side, we do expect a larger decline, and there are a number of moving pieces as we enter the first quarter. We expect the gaming and embedded businesses to decline in the first quarter, mainly due to seasonality.

2. Original text of the AMD conference call:

2.1 Management Overview

We performed well in the third quarter, achieving strong revenue and profit growth, reaching multiple milestones on our AI hardware and software roadmap, and significantly accelerating our momentum in collaborating with customers to develop AI solutions. In the PC space, there are currently over 50 notebook designs supported by Ryzen AI in the market, and we are working closely with Microsoft to develop the next generation of Windows, which will leverage our on-chip AI engine to deliver the biggest leap in Windows user experience in over 20 years.

In the data center, multiple large hyperscale customers are committed to deploying the Instinct MI300 accelerator and are benefiting from our latest ROCm software suite and the growing support of an open hardware-agnostic software ecosystem.

In terms of financial performance in the third quarter, driven by record server CPU revenue and strong Ryzen processor sales, revenue increased 4% YoY and 8% QoQ to $5.8 billion.

Data Center

Data Center segment revenue was $1.6 billion, flat YoY but up 21% QoQ, driven by strong demand for the third- and fourth-generation EPYC processor families, resulting in record quarterly server processor revenue. We gained server CPU revenue share this quarter, with fourth-generation EPYC CPU revenue growing more than 50% QoQ and accounting for the majority of our server processor revenue and unit shipments.

In the cloud, although the demand environment remained mixed this quarter, hyperscalers deployed EPYC processors to support their internal workloads and public instances while optimizing their infrastructure spending, driving double-digit percentage sequential growth in EPYC CPU revenue. Amazon, Google, Microsoft, Oracle, Tencent, and others launched nearly 100 new AMD-powered cloud instances this quarter, including multiple instances that deliver leading performance for general-purpose, HPC, bare-metal, and memory-optimized workloads.

In the enterprise, although overall demand remains soft, we see strong indications that Genoa's significant performance and total-cost-of-ownership advantages, along with our expanded go-to-market investments, are paying off, with double-digit revenue growth in the enterprise segment. We secured multiple new wins with leading automotive, aerospace, financial services, pharmaceutical, and technology customers, and the number of enterprise customers actively testing our on-premises EPYC platforms increased significantly QoQ.

We also launched the Siena processor, expanding our fourth-generation EPYC portfolio with leading energy efficiency and performance for intelligent edge and telecom applications. Dell, Lenovo, Supermicro, and others introduced new platforms, expanding our EPYC CPU TAM into telecom, retail, and manufacturing applications.

With the introduction of Siena, we now offer the industry's highest-performing and most energy-efficient server processor product portfolio, covering cloud, enterprise, technology, HPC, and edge computing.

I am very pleased with the momentum we have built for our EPYC CPU product portfolio.

We plan to extend this momentum with our next-generation Turin server processors, based on the new Zen 5 core, which deliver significant improvements in performance and efficiency. Turin is now in the labs of our top customers and partners, and customer feedback has been very strong. We expect to launch it in 2024.

On the hardware front, the launch and validation of our MI300A and MI300X accelerators are progressing as planned, with performance meeting or exceeding our expectations. Production shipments of the Instinct MI300A APU began earlier this month to support the El Capitan exascale supercomputer, and we expect to begin production shipments of the Instinct MI300X GPU accelerator to leading cloud and OEM customers in the coming weeks. On software, this quarter we further expanded our AI software ecosystem and made significant progress in enhancing the performance and functionality of the ROCm software stack.

In addition to ROCm being fully integrated into the mainstream PyTorch and TensorFlow ecosystems, Hugging Face models are now regularly updated and validated to run on Instinct accelerators and other supported AMD AI hardware. AI startup Lamini announced that its LLM platform running on Instinct MI250 GPUs has achieved software parity with CUDA, enabling enterprise customers to easily deploy production-ready LLMs fine-tuned on their own data on Instinct MI250 GPUs with minimal code changes.

We have also strengthened our artificial intelligence software capabilities through strategic acquisitions of Mipsology and Nod.ai. Mipsology, a long-term partner, brings mature expertise in providing artificial intelligence software and solutions running on our adaptive SoCs for data center, edge, and embedded markets. Nod.ai adds an experienced team that has made significant contributions to open-source AI compilers and industry-leading software, which have been used by many of the largest cloud companies and AI companies. Nod's compiler-based automation software can significantly accelerate the deployment of high-performance AI models optimized for Instinct, Ryzen, EPYC, Versal, and Radeon processors.

Based on our rapid progress in executing the AI roadmap and purchase commitments from cloud customers, we now expect fourth-quarter data center GPU revenue to be approximately $400 million and, as revenue continues to ramp, to exceed $2 billion in 2024. This growth would make MI300 the fastest product in AMD's history to reach $1 billion in sales. I look forward to sharing more details about our progress at our AI event in December.

Client

Revenue increased 42% YoY and 46% QoQ to $1.5 billion. With PC market inventory levels normalizing and demand returning to seasonal patterns, sales of Ryzen 7000 processors featuring the industry-leading Ryzen AI on-chip accelerator increased significantly this quarter. Driven by strong demand for Ryzen 7000 series laptop and desktop processors, revenue from our latest-generation client CPUs based on Zen 4 cores more than doubled; these processors deliver leading energy efficiency and performance across a wide range of workloads.

In the commercial segment, we launched the first Threadripper PRO workstation CPUs based on Zen 4 cores, delivering unmatched performance for multi-threaded professional design, rendering, and simulation applications. Dell, HPE, and Lenovo announced an expanded set of workstations powered by the new Threadripper PRO processors, part of our focus on the profitable growth areas of our client business.

Looking ahead, we are executing our long-term Ryzen AI roadmap to deliver leading computing capabilities built on the Microsoft Windows software ecosystem, supporting the next generation of AI PCs and fundamentally redefining the computing experience in the coming years.

Gaming

Revenue decreased 8% YoY and 5% QoQ to $1.5 billion, as the decline in semi-custom revenue was partially offset by growth in Radeon GPU sales.

Although the decline in semi-custom SoC sales aligns with our expectations for the gaming console cycle at this point, overall revenue for this generation of gaming consoles continues to significantly outperform the previous generation due to strong demand from Microsoft and Sony.

In gaming graphics, revenue grew both YoY and QoQ on increased channel demand. We expanded the Radeon 7000 series with new RX 7000 series high-end desktop GPUs, delivering leading price-performance for 1440p gamers.

Embedded

As expected, revenue decreased 5% YoY to $1.2 billion. Sequentially, revenue declined 15% as lead times normalized and customers focused on reducing inventory levels. We launched the first adaptive SoC with on-chip HBM memory this quarter, expanding our leading Versal SoC portfolio to deliver exceptional performance and efficiency for memory-constrained data center, networking, test, and aerospace applications.

We also released the next-generation aerospace-grade Versal SoC, which integrates an enhanced AI engine and is the only solution in the industry that supports unlimited reprogramming during development and after deployment.

For the fintech market, we introduced the latest Alveo accelerator card, which delivers a 7x improvement in latency over the previous generation and has been deployed by multiple trading firms in their ultra-low-latency trading platforms. Since completing the acquisition of Xilinx 1.5 years ago, our embedded business has achieved significant growth driven by our leading products.

Looking ahead, based on our current visibility, we expect embedded segment revenue to continue to decline as customers work down elevated inventory levels in the first half of 2024.

In the medium term, we see strong growth opportunities in the embedded business as our designs gain significant traction and our extensive and differentiated portfolio of embedded FPGA, CPU, GPU, and adaptive SoC products can meet most of our customers' computing needs.

In conclusion, I am pleased with our financial performance in the third quarter, driven by significant growth in Zen 4 server and client processor sales.

3Q Performance Review & 4Q Performance Outlook

Our third-quarter performance exceeded expectations, with revenue of $5.8 billion and diluted earnings per share of $0.70. Compared with the same period last year, revenue increased 4%, as growth in the client and data center segments was partially offset by declines in gaming and embedded segment revenue.

Driven by growth in the client and data center segments, revenue increased 8% QoQ. Gross margin was 51%, up approximately 1 percentage point YoY, primarily due to higher client segment revenue and a richer product mix. Operating expenses were $1.7 billion, up 12% YoY, mainly due to increased R&D investment to support our significant AI growth opportunities. Operating income was $1.3 billion, for an operating margin of 22%. Taxes, interest expense, and other items amounted to $141 million.

Diluted earnings per share for the third quarter were $0.70, compared to $0.67 in the same period last year.

Now turning to our reportable segments. Starting with the data center segment, revenue was $1.6 billion, flat compared to the same period last year, as the growth in EPYC processor sales was offset by a decline in sales of adaptive SoC products. Data center revenue increased by 21% QoQ, primarily due to strong sales of our fourth-generation EPYC processors to cloud and enterprise customers. The data center segment's operating income was $306 million, accounting for 19% of revenue, compared to $505 million, or 31% of revenue, a year ago. The decrease in operating income was mainly due to increased research and development investment to support future growth in artificial intelligence revenue and product portfolio.

Client segment revenue was $1.5 billion, up 42% YoY, primarily driven by growth in Ryzen mobile processor sales. As the PC market continued to improve, we ramped the Ryzen 7000 series to meet strong demand, resulting in a 46% QoQ revenue increase. On higher revenue and disciplined operating expense management, client segment operating income was $140 million, or 10% of revenue, compared with an operating loss of $26 million a year ago.

Gaming segment revenue was $1.5 billion, down 8% YoY, mainly due to lower semi-custom revenue, partially offset by growth in Radeon GPU sales. Gaming segment revenue decreased 5% QoQ, in line with our expectations as we are now in the fourth year of the gaming console cycle.

Gaming segment operating income was $208 million, or 14% of revenue, compared to $142 million, or 9% of revenue, a year ago, primarily due to higher Radeon GPU revenue.

Embedded segment revenue was $1.2 billion, down 5% YoY, mainly due to lower sales in the communications market. Embedded segment revenue decreased 15% QoQ, primarily due to inventory corrections at several end-market customers. Embedded segment operating income was $612 million, or 49% of revenue, compared to $635 million, also 49% of revenue, a year ago.

Turning to the balance sheet and cash flow. In the quarter, we generated $421 million in cash from operations, with free cash flow of $297 million. In the fourth quarter, we expect to pay approximately $550 million in cash taxes, primarily taxes deferred earlier in the year under the IRS California disaster relief provisions. Inventory decreased $122 million QoQ to $4.4 billion. At the end of the quarter, cash, cash equivalents, and short-term investments remained strong at $5.8 billion. We returned $511 million to shareholders through the repurchase of 4.8 million shares, and we have $5.8 billion remaining in our stock repurchase authorization.

Now turning to our outlook for the fourth quarter of 2023. We expect revenue to be approximately $6.1 billion, plus or minus $300 million, up approximately 9% YoY and 5% QoQ. Year over year, we expect strong double-digit growth in the data center and client segments, a decline in the gaming segment given where we are in the gaming console cycle, and a decline in the embedded segment due to further softness in embedded markets. Sequentially, we expect strong double-digit growth in the data center segment, higher client segment revenue, and double-digit percentage declines in the gaming and embedded segments.

We expect the non-GAAP gross margin to be approximately 51.5%, non-GAAP operating expenses to be approximately $1.74 billion, non-GAAP effective tax rate to be 13%, and diluted shares outstanding to be approximately 1.63 billion.

Finally, I am pleased with our third-quarter performance, with YoY growth in revenue, gross margin, and earnings per share. In the fourth quarter, we expect strong momentum in the data center and client segments, driven by the ramp of our MI300 AI accelerators and the strength of our high-performance Zen 4 product portfolio, despite lower gaming segment sales and further softness in the embedded market.

Looking ahead, our investments in artificial intelligence in the data center, client, gaming, and embedded fields enable us to offer one of the best and most comprehensive product portfolios in the industry, targeting the most compelling opportunities and driving long-term profitable growth.

2.2 Q&A

Q1: What proportion of MI300 comes from artificial intelligence versus supercomputing or other applications? On artificial intelligence, perhaps you can talk about the breadth of your customer lineup. And how should we think about the workloads you are addressing: mainly training, mainly inference, or both?

A1: We have made significant progress across the entire MI300 program, and we have made significant progress on the customer front as well. As we enter the first quarter, we expect revenue to be roughly in the range of $400 million, the majority from artificial intelligence and only a small portion from high-performance computing. As we move through 2024, we expect revenue to continue to grow quarter by quarter, primarily from artificial intelligence. In artificial intelligence, we have a high level of engagement with a broad set of customers, from hyperscalers to OEMs, enterprise customers, and some new AI startups.

From a workload perspective, we expect MI300 to handle both training and inference workloads. We are very pleased with MI300's inference performance, especially for large language model inference, given our memory bandwidth and capacity. We believe this will be a significant workload for us, but we expect to see a wide range of workloads and broad customer adoption.

Q2: Regarding server CPUs: the growth you saw in the third quarter and are guiding for the fourth quarter, is it primarily driven by market share gains or by a broader market recovery? There has been a significant shift from traditional computing to accelerated computing this year, but are you actually starting to see signs of stabilization or even improvement in traditional computing?

A2: We are very pleased with our performance in the third quarter as it relates to EPYC overall.

I think fourth-generation EPYC (Genoa plus Bergamo) is performing very well. It crossed over to become the majority of our server business in the third quarter, slightly earlier than our previous forecast.

Digging into that, I would say both cloud and enterprise grew. Cloud grew strong double digits, and this adoption is quite broad across both first-party and third-party workloads as well as new instances. On the enterprise side, we also saw some nice growth through our OEMs.

From a macroeconomic standpoint, I think the enterprise picture is still somewhat mixed, depending on the region, and the cloud depends to some extent on the customer. But overall, we are pleased with the progress we are making, and EPYC's leadership is driving significant growth for us in the third and fourth quarters.

Q3: How has the situation changed over the last quarter relative to your expectations for the data center business?

A3: We had expected our data center business to grow by about 50% in the second half of the year compared to the first half, and based on what we are seeing now, we continue to see a similar range of around 50%. We are very pleased with and excited about the strong momentum in our data center business. On GPUs, it is about $400 million. Throughout the quarter we have had close interactions with our customers, so we do see progress continuing and we see customers placing orders. That is why, as we have moved through this quarter, we have gained more confidence in the revenue outlook in our fourth-quarter guidance.

Q4: Following up on data center GPUs, can you talk about the range of customers you might see there? Can you give us an idea of the concentration?

A4: We will start with a greater focus on the cloud, somewhat like a few large hyperscalers. But our entire enterprise is also very actively involved and interested in this. Q8:在 MI300 上,您的许多超大规模客户都已准备好或正在准备内部 ASIC 解决方案。因此,如果推理是 MI300 的主要工作负载,您认为随着时间的推移,它是否会被内部 ASIC 取代?或者您认为 MI300 和 ASIC 可以与现有的 GPU 解决方案共存吗?

A8:对于 MI300 和内部 ASIC 的关系,我们认为它们可以共存。虽然一些超大规模客户可能会考虑内部 ASIC 解决方案,但我们相信 MI300 在推理工作负载方面的性能和效率仍然具有竞争力。此外,我们的 GPU 解决方案在其他工作负载和应用程序中仍然具有广泛的适用性和优势。因此,我们预计 MI300 和 ASIC 可以在市场上共同存在,并满足不同客户的需求。我们将继续与客户合作,提供最佳的解决方案。 A8: When we look at the future workload of artificial intelligence, we actually believe that it is very diverse. You train and infer with some kind of large language model, and then you can fine-tune the base model, and then you can directly infer what you can do there.

Therefore, within this framework, we absolutely believe MI300 has a strong position in the market; that is what our customers tell us, and we are working closely with them.

Q9: A question about the interaction between artificial intelligence and traditional computing. Could you give us some color: first, what is happening with traditional computing deployments? Second, how do units and ASPs interact on the server CPU side?

A9: I think as we move forward, we have seen a recovery in the server CPU market. In this space, for example, fourth-generation EPYC (Genoa and Bergamo) moves up to 96 and 128 cores, which represents a lot of compute.

So, I do think there is a framework where unit growth may be more moderate, but given the higher core counts and compute, growth in average selling prices will contribute to overall growth.

Therefore, from the perspective of traditional server CPUs, I do think we have seen these trends. 2023 is a mixed environment, and I think things will improve as we enter 2024.

Q10: I want to ask about the embedded side. Last quarter you talked about headwinds primarily in the communications end market. With your guidance for the fourth quarter, I'm curious whether this weakness is spreading. Your competitors have talked about a kind of reset back to pre-pandemic levels; just curious how you are thinking about that reset. You mentioned the first half will be weak.

A10: Yes, absolutely. Looking at the end markets, communications was weak last quarter and will certainly continue to be weak; we see an overall decline in 5G capital expenditure.

The other end market where we see soft demand is industrial, and there the regional differences are larger; Europe is weaker than other regions.

Other end markets are actually doing quite well. What we are seeing is elevated inventory, given our long lead times during the pandemic and the strong demand in the market.

As lead times normalize, customers are reducing inventory, and that normalization gives them the opportunity to do so.

So from an overall perspective, we believe demand is stable. We believe we have a very strong product portfolio in the embedded space, and we like how the classic Xilinx portfolio combines with the embedded processing capabilities we have added. Customers see that combination, and we have won some good designs as a result.

So, we have to work through inventory corrections over the next few quarters, and then we believe we will return to growth in the second half of 2024.

Q11: I want to ask about the PC market. I think both you and Intel saw low shipment volumes in the first half of the year, and perhaps there is now some restocking. I'm curious about your view of the normalized run rate of the PC market and whether inventory levels are starting to recover.

A11: Looking at the third quarter and the environment we are in now, I believe inventory levels are largely normalized, so sell-in and consumption are fairly well matched. We are heading into the holiday season, which is overall a strong quarter for us. In terms of market size, from a consumer perspective I think this year will be around 250 million to 255 million units.

We expect some growth in 2024 as we take into account the AI PC cycle and some existing Windows update cycles. I believe the personal computer market is returning to what we call a typical seasonal pattern, and underneath that, we have a strong product portfolio. We are very focused on the development of high-end gaming, ultrathin, high-end consumer, and commercial areas. That's our view on the personal computer market.

Q12: AI question. As you look to further advance your AI product portfolio, how do you see the future roadmap for CPU, GPU, and networking, especially the networking part?

A12: What we see in these AI systems is that they are indeed very complex when you consider putting all these components together.

Of course, we are working closely with partners to integrate complete systems, CPUs, GPUs, and networking capabilities.

The acquisition of Pensando has actually been very helpful in this area.

I believe we have a world-class team of experts in this field, and we also have partnerships with some of the ecosystem partners in the entire networking ecosystem.

So, looking ahead, I don't think we will be selling complete systems, what you might call AMD-branded systems; we believe others are better positioned for that. But as we do our development, we absolutely think in terms of the overall system, and we work closely with partners to make sure it is well defined so that customers can easily adopt our solutions.

Q13: I would like to dig into gross margin, which has held up and improved quarter by quarter through the year. Perhaps you can give us some insight into the puts and takes on gross margin and into how the individual segments are progressing, as I think some of the progress there is very positive.

A13: We are very pleased with the 140 basis point sequential increase in gross margin in the third quarter, even though embedded segment revenue declined by double digits. There are two main drivers. The first is the 21% QoQ growth in the data center, which provides momentum for our gross margin. Second, while we encountered some gross margin headwinds in the client business during the PC inventory correction, in the third quarter we saw a significant improvement in client segment gross margin.

Q14: I have a question about MI300. Can you talk about some of the competitive advantages customers cite that you find particularly compelling versus your main competitors?

A14: To begin with, it is simply a very capable product. Its design gives us strong compute along with memory capacity and bandwidth, which is especially helpful for inference. The way to think about it is that you cannot fit these larger language models on one GPU; you actually need multiple GPUs. With more memory per GPU, you can use fewer GPUs to run inference on these models, which is very advantageous from a total-cost perspective. From a software perspective, this is probably an area where we have had to invest more and do more work.
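To make the memory arithmetic behind that answer concrete, here is a minimal sketch of why higher per-GPU memory reduces the number of GPUs needed just to hold a model for inference. The model sizes, the 2-byte (FP16) weight assumption, the 1.2x overhead allowance, and the 80 GB comparison point are illustrative assumptions, not figures from the call; 192 GB is MI300X's stated HBM capacity.

```python
import math

def min_gpus_for_weights(params_billion: float, gpu_mem_gb: float,
                         bytes_per_param: int = 2, overhead: float = 1.2) -> int:
    """Rough lower bound on GPUs needed to hold FP16 weights plus a crude
    allowance for KV cache and activations (illustrative assumption)."""
    total_gb = params_billion * bytes_per_param * overhead  # GB, using 1 GB = 1e9 bytes
    return math.ceil(total_gb / gpu_mem_gb)

if __name__ == "__main__":
    for size in (70, 140):          # hypothetical model sizes, billions of parameters
        for mem in (80, 192):       # per-GPU HBM capacity in GB
            n = min_gpus_for_weights(size, mem)
            print(f"{size}B params on {mem} GB GPUs: at least {n} GPUs")
```

Under these assumptions, a 140B-parameter model needs roughly five 80 GB GPUs but only two 192 GB GPUs, which is the total-cost argument the answer is making.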

Our customers and partners are moving toward being able to move across different hardware by optimizing at the higher framework level, which lowers the barrier to adopting new solutions. We also have extensive discussions about future roadmaps. It is very similar to how EPYC evolved: with our closest cloud partners, we work closely together to make each generation better. So I think MI300 is an excellent product, and we will continue to develop it over the next few generations.

Q15: I would like to focus on operating expenses. I just want to know what might happen in 2024.

A15: Our team has done an excellent job in reallocating resources within the budget to truly invest in the most important areas of artificial intelligence and data centers.

We are actually planning for 2024. I can comment at a very high level that given the tremendous opportunities we have in artificial intelligence and data centers, we will definitely increase our R&D investment and capital investment to seize these opportunities.

The way to think about this is that our goal is to drive revenue growth far faster than the growth in operating expenses, so our investments can drive long-term growth. We can also leverage our operating model to truly expand profits at a faster rate than revenue. This is indeed how we operate the company and drive the expansion of operating profit margins.

A: The first thing I want to say is that if you look at 2023, it is a very unusual year for the entire industry, especially the PC market, which is in one of the most severe down cycles of the past three years.

Therefore, in this down cycle our client business has certainly faced gross margin headwinds, but we made significant progress in the second half, in the third and fourth quarters.

Entering next year, mix will be the main driver of our gross margin. The way to think about it is that the data center will be the largest incremental revenue contributor next year, while gaming and embedded continue to decline, so it really comes down to mix. We do expect gross margin to improve next year compared to 2023, especially in the second half.

Q20: I would really like to hear your thoughts on PC architecture. Qualcomm has a new X Elite chip. There are rumors that NVIDIA may do the same. What does this mean for AMD's TAM development?

A20: Our view on ARM is that ARM is a partner in many ways, so we use ARM in various parts of our product portfolio.

I think, for PCs, x86 still accounts for the majority of PC sales, and if you consider the ecosystem around x86 and Windows, it is a very powerful ecosystem. What I'm most excited about in the PC space is actually AI PCs.

I think the AI PC opportunity is a chance to redefine the PC as a productivity tool and truly put users' own data to work.

So I think we are at the beginning of this wave. We are heavily investing in Ryzen AI and have the opportunity to truly expand the AI capabilities of future PCs. I think that's the theme of the conversation. The focus is no longer on what instruction set you are using, but more on what experience you are providing to customers. From that perspective, I think we have a very exciting product portfolio and I feel good about the next few years.

Q21: Can you talk about your position in data center FPGAs and the outlook for FPGAs in AI? Could FPGAs at some point be integrated into the MI300 module, or is there really no FPGA market for AI at the moment?

A21: My view on data center FPGAs is that they are another computing element. We do use FPGAs, and they are present in many systems, but from a revenue contribution perspective they are still relatively small in the near term.

We have won some designs for the future and will see some growth in content, but not so much in 2024; it is more beyond that timeframe. I think part of our value proposition for our data center partners is that whatever computing element you need, whether CPU, GPU, FPGA, or DPU, we have the capability to combine these components together. That is a key focus when we consider the heterogeneity of future data centers.

Risk Disclosure and Statement of this Article: Disclaimer and General Disclosure of Dolphin Research