Microsoft (Minutes): Current Investment vs. Future Prospects Is a Question All AI Players Need to Consider

The following is a summary of Microsoft's Q2 FY2025 earnings conference call. For an interpretation of the financial report, please see Microsoft: The "Dislocation" Between Harsh Reality and Grand Vision.

1. Core Information Review of the Financial Report:

2. Detailed Content of the Financial Report Conference Call

2.1. Key Information Presented by Executives:

In this quarter, Microsoft's cloud business continued to perform strongly, with revenue exceeding $40 billion for the first time, a year-on-year increase of 21%. Enterprises are beginning to shift from the proof-of-concept phase to enterprise-level deployment to unlock the full return on investment in artificial intelligence. The annual recurring revenue (ARR) of the AI business has now surpassed $13 billion, a year-on-year increase of 175%.

The scaling laws of artificial intelligence continue to hold across both pre-training and inference-time compute. We have achieved significant efficiency improvements in both training and inference. In inference, thanks to software optimization, each generation of hardware typically brings more than a 2x improvement in price-performance, while each generation of models can bring more than a 10x improvement. As artificial intelligence becomes more efficient and widespread, we expect demand to grow exponentially.
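To make the compounding concrete, here is a minimal back-of-the-envelope sketch (not from the call) that treats the stated per-generation gains as multiplicative; the baseline cost figure is an illustrative assumption.

```python
# Back-of-the-envelope sketch: compound the stated >=2x per hardware generation
# and >=10x per model generation gains into a combined drop in cost per token.
# The baseline cost below is an illustrative assumption, not a Microsoft figure.

hw_gain_per_generation = 2.0       # stated: >2x price-performance per hardware generation
model_gain_per_generation = 10.0   # stated: >10x per model generation

baseline_cost_per_million_tokens = 10.0  # assumed starting cost, in dollars

combined_gain = hw_gain_per_generation * model_gain_per_generation
new_cost = baseline_cost_per_million_tokens / combined_gain

print(f"Combined gain across one hardware + one model generation: {combined_gain:.0f}x")
print(f"Illustrative cost per million tokens: ${baseline_cost_per_million_tokens:.2f} -> ${new_cost:.2f}")
```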

Azure is the infrastructure layer for artificial intelligence. We continue to expand data center capacity based on short-term and long-term demand signals. In the past three years, our total data center capacity has more than doubled, with last year's added capacity being more than any previous year. Our data centers, networks, racks, and chips together form a complete system that provides new efficiencies for today's cloud workloads and next-generation AI workloads.

We continue to refresh our compute fleet, with support for the latest products from AMD, Intel, and NVIDIA, as well as innovations in our first-party silicon: Maia, Cobalt, Azure Boost, and our hardware security module (HSM).

At the data layer, Microsoft Fabric stands out. We now have over 19,000 paying customers, including Hitachi, Johnson Controls, and Schaeffler. Power BI is also deeply integrated with Fabric, with over 30 million monthly active users, a 40% increase since last year. In addition to Fabric, we are also seeing new AI-driven data patterns emerging. The number of Azure OpenAI applications running on Azure databases and Azure App Services has more than doubled year-on-year, driving widespread adoption of SQL Hyperscale and Cosmos DB.

We are pleased that OpenAI has made significant new commitments to Azure. Through our strategic partnership, we continue to benefit mutually from each other's growth, as OpenAI's application programming interface (API) runs exclusively on Azure. We are well-positioned to support OpenAI's leading models as well as selected open-source models and small language models (SLMs). Today, DeepSeek's R1 model is available through the model catalog on Foundry and GitHub, featuring automated red-team testing, content safety integration, and security scanning capabilities.
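To illustrate what consuming a catalog model typically looks like in practice, here is a hypothetical sketch using a chat-completions-style REST call; the endpoint URL, deployment name, environment variables, and response shape are all assumptions for illustration, not the actual Foundry API.

```python
# Hypothetical sketch of calling a model deployed from a model catalog via a
# chat-completions-style REST endpoint. The endpoint URL, deployment name,
# environment variable names, and response shape are placeholders, not the
# actual Azure AI Foundry API.
import os
import requests

ENDPOINT = os.environ.get("MODEL_ENDPOINT", "https://<your-endpoint>/chat/completions")
API_KEY = os.environ.get("MODEL_API_KEY", "<your-key>")

payload = {
    "model": "DeepSeek-R1",  # assumed deployment name, for illustration only
    "messages": [
        {"role": "user", "content": "Summarize the main drivers of cloud gross margin."}
    ],
    "max_tokens": 512,
}

response = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
# Assumes an OpenAI-compatible response body.
print(response.json()["choices"][0]["message"]["content"])
```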

Microsoft 365 Copilot is the user interface for artificial intelligence. It significantly enhances employee productivity and provides access to a large number of intelligent agents to streamline employee workflows. We are seeing accelerated adoption across deal sizes. Customers who purchased Copilot have increased their total seat count by more than 10x over the past 18 months. For example, Novartis has added thousands of seats each quarter over the past year and now has 40,000 seats. Barclays, Carrier Global, Pearson, and the University of Miami all purchased 10,000 or more seats this quarter. Overall, the number of daily users of Copilot has more than doubled quarter-over-quarter.

In the past three months, over 160,000 organizations have used Copilot Studio, creating more than 400,000 custom intelligent agents, a more than 2x increase quarter-over-quarter. We have also seen partners like Adobe, SAP, ServiceNow, and Workday build their own third-party intelligent agents and integrate them with Copilot.

As the end of support for Windows 10 approaches, we are seeing increased momentum for Windows 11. Customers are choosing the latest Windows 11 devices for enhanced security and advanced AI capabilities. During this holiday season, Copilot+ PCs accounted for 15% of premium laptops sold in the U.S., and we expect that in the coming years the majority of PCs sold will be Copilot+ PCs.

Turning to our consumer businesses, starting with LinkedIn. The number of comments on LinkedIn has increased by 37% year-over-year. Short videos continue to grow on the platform, with video creation growing at twice the rate of other post formats. We are also innovating with intelligent agents to help recruiters and small businesses find qualified candidates faster, and our hiring business has once again gained market share. This quarter, LinkedIn's premium subscription business surpassed $2 billion in annual revenue for the first time. Subscriber growth has increased by nearly 50% over the past two years, with nearly 40% of subscribers using our AI features to optimize their profiles. LinkedIn Marketing Solutions remains a leader in business-to-business (B2B) advertising.

Next is the search advertising and news business. We have regained market share on Bing and the Edge browser. The Edge browser has over 30% market share on Windows systems in the U.S. and has gained market share for 15 consecutive quarters. Our investment in increasing advertising rates is paying off, as advertisers increasingly view our network as an important platform for optimizing return on investment. Our Copilot consumer application has seen improved user engagement and retention due to its enhanced speed, unique personality, and pioneering features like Copilot Vision. Today, we are making the "Think Deeper" feature powered by o1 available for free to all Copilot users worldwide.

Productivity and Business Processes segment: Despite the adverse impact of foreign exchange rates, performance exceeded expectations, mainly due to better-than-expected results from E5 and Microsoft 365 Copilot. For Microsoft 365 Copilot, we continue to see growth in adoption, expansion, and usage. Average revenue per user (ARPU) growth was again driven by E5 and Microsoft 365 Copilot. Paid Microsoft 365 commercial seats grew by 7% year-over-year, with installed-base expansion across all customer segments, primarily in small and medium-sized businesses and frontline worker offerings.

Intelligent Cloud segment: Excluding the adverse impact of foreign exchange rates, the performance of Azure non-AI services, on-premises servers, and enterprise and partner services was slightly below expectations, although the outstanding performance of Azure AI services partially offset the gap. Thirteen percentage points of Azure's growth came from AI services, which grew by 157% year-over-year, exceeding expectations even though demand still exceeds our available capacity. Non-AI services growth was slightly below expectations due to go-to-market execution challenges, particularly with customers we reach primarily through scale motions, as we balance non-AI consumption with the growth of the AI business.

More Personal Computing segment: Results were better than expected, mainly due to the Windows OEM pre-installation business, traffic from third-party search partners, and the performance of the game "Call of Duty." Windows OEM and device revenue grew by 4% year-over-year, exceeding expectations, due to commercial inventory build ahead of the end of Windows 10 support and uncertainty regarding tariffs.

Search and news advertising revenue (after deducting traffic acquisition costs) grew by 21%, with a 20% increase at constant currency, exceeding expectations, driven by traffic from third-party partners. Growth continues to benefit from increased advertising rates for Edge and Bing, as well as healthy traffic growth.

Capital expenditures (including finance leases) were $22.6 billion, in line with expectations, with cash spent on property, plant, and equipment (PP&E) amounting to $15.8 billion. More than half of our cloud and AI-related spending is for long-lived assets that will support monetization over the next 15 years and beyond. The remaining cloud and AI spending is primarily for servers (both CPUs and GPUs) to serve customers based on demand signals, including our customer contract backlog.

Operating cash flow was $22.3 billion, an increase of 18%, driven by strong cloud service billing and collections, although partially offset by increased vendor payments, employee compensation, and tax payments. Free cash flow was $6.5 billion, a year-on-year decrease of 29%, reflecting the capital expenditure situation mentioned earlier.
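As a quick arithmetic check on the cash-flow figures above (a sketch using only the numbers stated in the call):

```python
# Reconciliation of the quarter's reported cash-flow figures (in billions of USD).
operating_cash_flow = 22.3   # reported operating cash flow
ppe_purchases = 15.8         # reported cash paid for property, plant, and equipment

free_cash_flow = operating_cash_flow - ppe_purchases
print(f"Free cash flow: ${free_cash_flow:.1f}B")  # ~$6.5B, matching the reported figure
```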

Outlook for the third quarter:

In terms of commercial orders, we expect new order amounts to be roughly flat year-on-year. We anticipate that core annuity sales will remain stable, and customers will continue to make long-term commitments to our platform. It is important to note that the timing of large long-term Azure contracts can be difficult to predict, which may lead to increased quarterly fluctuations in our booking growth rate. Microsoft's cloud gross margin is expected to be around 69%, a year-on-year decline due to the impact of expanding artificial intelligence infrastructure.

In the Productivity and Business Processes segment, we expect average revenue per user (ARPU) to continue to grow through E5 and Microsoft 365 Copilot, and, given the scale of the existing installed base, we expect seat growth to moderate. For Microsoft 365 commercial products, we expect revenue to be roughly flat compared to the same period last year. As a reminder, Microsoft 365 commercial products include the Windows commercial on-premises components of the Microsoft 365 suites.

The revenue growth rate for Microsoft 365 consumer cloud business is expected to be in the mid-to-high single digits, driven by Microsoft 365 subscription business.

For LinkedIn, while we expect all businesses to achieve growth, the trend in the talent solutions business will still pose a headwind to growth in the third quarter.

In terms of Azure, we expect revenue growth in the third quarter to be 31%-32% at constant currency. As we shared in October, with more AI capacity coming online, the contribution from our AI services will increase. Non-AI services are expected to remain affected in the second half of the year. Although we expect AI capacity to remain constrained in the third quarter, by the end of fiscal year 2025, given our significant capital investments, we should be roughly able to meet demand.

Search and news advertising revenue growth (after deducting traffic acquisition costs) is expected to be around 15%, a slowdown compared to the previous quarter, mainly due to additional foreign exchange impacts and the return of third-party partner traffic to more normal levels, as mentioned earlier. Growth after deducting traffic acquisition costs will be higher than overall search and news advertising revenue growth, which we expect to be in the mid-to-high single-digit range.

In terms of the gaming business, we expect revenue growth to be in the low single digits. We anticipate that Xbox content and services revenue growth will be in the low to mid-single digits, driven by first-party content and Xbox Game Pass. Hardware revenue is expected to decline year-on-year.

Regarding capital expenditures, we expect the quarterly spending in the third and fourth quarters to be similar to the spending level in the second quarter. In fiscal year 2026, we expect to continue investing based on strong demand signals, including the backlog of customer contracts we need to fulfill, with investments covering the entire Microsoft cloud business. However, the growth rate will be lower than in fiscal year 2025, and capital expenditures will begin to shift towards short-term assets, which are more closely related to revenue growth.

We now expect the operating profit margin for fiscal year 2025 to increase slightly year-on-year. We anticipate that the effective tax rate for the entire fiscal year 2025 will be between 18% and 19%.

2.2. Analyst Q&A

Q: Azure's performance is at the lower end of the guidance range, which is a bit disappointing. We hope to analyze what execution issues may exist and what measures can be taken to address these issues. After the performance in the second quarter and the previous quarter, can the expectation of accelerated growth in the second half of the year still be achieved?

Amy Hood: The issues in the second quarter were in the non-AI Azure compute segment. Our Azure AI business performed better than expected, thanks to the excellent work of the operations team. In the non-AI business, the main challenge lies in what we call our scale motions, where customers are reached primarily through partners and other indirect sales channels. In this case, the difficulty lies in those customers trying to balance AI workloads with foundational work such as migration. Since adjustments to the scale motion take time, we expect to still be affected in the second half of the year. However, I am confident in the team's ability to respond; they have fully recognized the issues and are working hard to resolve them.

We announced this quarter's Azure growth rate of 31%, and we expect the growth rate for the next quarter to be between 31% and 32%. We are confident in the performance of the AI business and in our ability to realize the corresponding revenue. However, our capacity remains constrained, which aligns with the expectations I communicated last October. The capacity constraints involve two factors: the first is space, which I typically refer to as long-lived assets, namely infrastructure and facilities; the second is the equipment, namely the servers themselves. Our expenditure mix has changed, with continuous investment in long-lived assets. We previously had shortfalls in power and space. With the investments of the past three years gradually coming online, we will be closer to a state of capacity balance by the end of this year.

Q: Regarding the revenue from the artificial intelligence business exceeding expectations, can you elaborate on the driving factors behind this? We have already discussed the Azure AI business segment; are there other reasons? According to our estimates, the scale and growth rate of the Copilot business far exceed expectations. Can you break down the parts of Microsoft's AI business that exceeded expectations?

Amy Hood: This result is indeed better than expected. There are several reasons for this, first is the Azure business segment we just discussed. Secondly, the Microsoft Copilot business has performed exceptionally well. The key is that both the number of new and additional seats are increasing. While usage does not directly impact revenue, as people derive more value from Copilot, it will indirectly have a positive impact on revenue. Additionally, the pricing for each seat is quite reasonable, reflecting the value of the product. These are the main factors contributing to the performance exceeding expectations.

Q: You mentioned DeepSeek multiple times in your speech. Everyone is eager to know your thoughts on it. Are we now seeing AI being able to scale at a lower cost? Have we reached such a stage, or is there still time needed?

Satya Nadella: In a sense, the development of AI is no different from the development of conventional computing cycles; both are continuously breaking through in pursuit of higher levels of development. Moore's Law is operating at an accelerated pace, and on this basis, the laws of AI scaling, including pre-training and inference computing, are also being continuously reinforced, all of which rely on software support. As I mentioned in my speech, based on software optimization, inference computing can achieve a tenfold performance improvement with each iteration, which is a conclusion we have observed over the long term.

DeepSeek has indeed brought some real innovations, some of which are similar to the discoveries made by OpenAI in the o1 model. Clearly, these innovations will gradually become widespread and widely applied. The biggest beneficiaries of any software cycle development are the customers; after all, from the client-server model to the development of cloud computing, an important insight we gained is that more people are purchasing servers, only these servers exist in the form of cloud computing. Therefore, when token prices drop and inference computing costs decrease, it means that people can consume more AI services, and more applications will be developed.

Interestingly, today, at the beginning of 2025, we can run models on personal computers that once required powerful cloud computing infrastructure support, which was unimaginable in the past. Such optimization means that AI will become more widespread. Therefore, for large-scale cloud service providers like us and personal computer platform suppliers, this is undoubtedly good news.

Q: I would like to ask about the news regarding "Stargate" and the changes announced last week concerning the relationship with OpenAI. Most investors believe this indicates that Microsoft remains firmly supportive of OpenAI's success, but has chosen to take a backseat in funding OpenAI's future capital expenditure needs. I hope you can elaborate on the strategic decisions made around "Stargate." From an investor's perspective, what insights does this decision provide for understanding capital expenditure needs in the coming years?

Satya Nadella: We are very pleased with our partnership with OpenAI. As everyone can see, they have made significant commitments to Azure, and the bookings we have recognized so far are just a portion of them. Given our right of first refusal, we will gain more benefit from this in the future. Clearly, their success is also our success, as can be seen from the detailed commercial arrangements outlined in the blogs we have published.

Returning to your question, we are building a highly flexible computing resource system. We ensure a reasonable balance between training and inference, and we are deploying globally. We have made significant efforts in software optimization, which is reflected not only in the results brought by DeepSeek but also in our collaboration with OpenAI, where we have worked for years to reduce the costs of GPT models. In fact, we have invested a lot of energy in inference optimization, which is one of the key factors driving AI development. While launching cutting-edge models in the AI field is important, if the service costs are too high to generate actual demand, it is of no use. Therefore, optimizing inference costs to make it widely applicable is crucial.

This is the computing resource system we are managing. At the same time, it is important to remember not to over-procure any resources at once, as Moore's Law brings a 2x performance improvement every year, and software optimization can bring a 10x improvement. We hope to continuously upgrade and refresh computing resources, allowing for ongoing iteration, ultimately achieving a reasonable balance between monetization driven by demand and training costs. I am confident in our current investment strategy, and this flexibility will help us achieve longer-term business growth.
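A minimal sketch of the procurement logic described here, under purely illustrative assumptions (the capacity target, unit cost, and annual improvement rate are placeholders): buying everything up front forgoes the cost-performance gains that arrive with each hardware and software generation.

```python
# Illustrative sketch (assumed numbers, not company data): compare buying all
# capacity up front versus spreading purchases across years while hardware
# cost-performance improves roughly 2x per year, as described above.
target_capacity_units = 100.0   # abstract capacity units needed over three years
cost_per_unit_year0 = 1.0       # normalized unit cost today
annual_improvement = 2.0        # assumed: ~2x better cost-performance each year

# Option A: buy everything in year 0.
upfront_cost = target_capacity_units * cost_per_unit_year0

# Option B: buy a third of the capacity in each of years 0, 1, and 2.
staggered_cost = sum(
    (target_capacity_units / 3) * cost_per_unit_year0 / annual_improvement**year
    for year in range(3)
)

print(f"Up-front cost:  {upfront_cost:.1f}")   # 100.0
print(f"Staggered cost: {staggered_cost:.1f}") # ~58.3
```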

Amy Hood: I would like to add some thoughts on capital expenditure to help everyone better understand the flexible computing resource system that Satya mentioned. We have nearly $300 billion in remaining performance obligations (RPO), which are customer contracts that need to be delivered. The faster and more efficiently we deliver, the better it is for the company. This involves not only our collaboration with OpenAI but also the service of the entire platform to all customers.

Sometimes people overlook one point: when we talk about the flexibility of computing resources, it refers not only to resources used primarily for inference but also to training, post-training, and the other key stages that are essential for building modern AI applications. Additionally, running commercial cloud services also requires these resources, and they need to be distributed and global in nature. All of this is very important, because only in this way can we achieve the highest efficiency.

What everyone sees in capital expenditure is primarily early-stage infrastructure construction. This not only helps us meet the demand for artificial intelligence infrastructure but also covers previously unmet needs in commercial cloud services. Subsequently, capital expenditure will shift more towards CPUs and GPUs. That shift correlates more closely with revenue; whether for OpenAI or other customers, it will be grounded in contracts.

Q: Microsoft has a rich Copilot product portfolio, which has been on the market for over a year, with increasingly refined capabilities and continuously declining inference costs. How do you view the direction of development from here? And how are you thinking about adjusting product packaging and go-to-market strategies to meet broader customer needs?

Satya Nadella: We recently had two important releases. One is the launch of Copilot Chat in Microsoft 365 Copilot. This feature will be widely available to all existing users, and the IT departments of enterprises can enable this feature, allowing employees to immediately use a web chat service with enterprise control features. Copilot Chat comes with Copilot Studio built-in, which means users can start building intelligent agents. We believe that the combination of Copilot Chat and the full Copilot functionality will effectively promote user engagement with seats and drive the development of intelligent agents.

Additionally, on the consumer side, we launched the "Think Deeper" feature powered by o1 yesterday, available for free to users worldwide. This fully demonstrates the advantages of inference optimization: cost reductions allow previously high-end features to be applied much more broadly. The same trend is present in products like GitHub Copilot and Security Copilot. Our entire product portfolio will develop in this direction.

Q: Looking ahead to the next few years, what do you think the ratio of proprietary models to open-source models will be in inference computing on Azure? Is this important for Microsoft?

Satya Nadella: That's a good question. In fact, multiple models will be used in any application. Take Copilot or GitHub Copilot as an example; a large number of different models are already used under the hood. During development, we build models, fine-tune them, and distill them; some models are distilled into open-source models. Therefore, the future will inevitably be a mixed-use scenario with many models. We have always believed that having cutting-edge models is crucial. When building applications, one should fully leverage the best models, pursue the most ambitious goals, and then optimize.

Moreover, there is a time factor involved. The initial cost structure does not represent the final result, as we continuously optimize latency and cost and switch between different models based on demand. In fact, to manage this complexity, we need new application services and servers. This is why we are heavily investing in Azure AI Foundry.

From the perspective of application developers, they need to keep pace with the constantly emerging new models, hoping that applications can continuously benefit from these innovations while avoiding excessive development costs, operational costs, or what people often refer to as AI ops costs.

Therefore, we have invested significant resources in application servers to ensure that any workload can benefit from various different models (open-source, closed-source, different weight levels). From an operational perspective, this also improves efficiency and makes work more convenient.
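As a concrete, hypothetical illustration of the mixed-model pattern described here, the sketch below routes requests between an assumed frontier model and an assumed distilled model based on a simple latency and reasoning budget; the model names, prices, latencies, and the call_model() stub are placeholders, not a real Foundry API.

```python
# Hypothetical sketch of routing between a frontier model and a distilled model
# based on a per-request cost/latency budget. Model names, prices, and the
# call_model() stub are illustrative placeholders only.
from dataclasses import dataclass

@dataclass
class ModelProfile:
    name: str
    cost_per_1k_tokens: float  # assumed prices, for illustration only
    typical_latency_s: float

FRONTIER = ModelProfile("frontier-model", cost_per_1k_tokens=0.03, typical_latency_s=4.0)
DISTILLED = ModelProfile("distilled-model", cost_per_1k_tokens=0.002, typical_latency_s=0.8)

def choose_model(needs_deep_reasoning: bool, max_latency_s: float) -> ModelProfile:
    """Prefer the cheaper distilled model unless the request needs deep reasoning
    and the caller can tolerate the frontier model's latency."""
    if needs_deep_reasoning and max_latency_s >= FRONTIER.typical_latency_s:
        return FRONTIER
    return DISTILLED

def call_model(model: ModelProfile, prompt: str) -> str:
    # Placeholder for an actual inference call (e.g. the REST sketch shown earlier).
    return f"[{model.name}] response to: {prompt!r}"

if __name__ == "__main__":
    print(call_model(choose_model(needs_deep_reasoning=True, max_latency_s=5.0), "Plan a migration"))
    print(call_model(choose_model(needs_deep_reasoning=False, max_latency_s=1.0), "Classify this ticket"))
```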

Q: I'm glad to hear that the Copilot business is developing strongly. In what aspects is this strong growth reflected? Is it an increase in departmental-level transactions? Or are customers transitioning from the proof-of-concept stage to departmental-level procurement, or even multiple departments within the enterprise are procuring? You mentioned earlier that usage trends have increased; could you elaborate on common usage scenarios and how these scenarios provide confidence for future monetization growth?

Satya Nadella: Initially, in some departments with an urgent need to improve productivity, such as sales teams, finance departments, or supply chain departments, the application of Copilot is quite widespread. These departments have a large amount of SharePoint-based data, and they hope to combine this data with web data to obtain more valuable results. However, over time, as we have seen in the development of previous productivity tools, people will collaborate across departments and roles.

For example, in my daily work, I use the chat function to obtain results on the work tab and then immediately share them with colleagues through the page-sharing feature. I refer to this approach as "thinking with AI and collaborating with people." This collaborative pattern makes the widespread adoption of Copilot within enterprises inevitable. Initially, Copilot may only be used within a department, but the network effects generated by collaboration will spread it throughout the entire enterprise. Enterprises can roll it out gradually, group by group.

The Copilot Chat we launched allows enterprise customers to apply Copilot more flexibly across the enterprise. This not only facilitates employee usage but also provides more convenience for enterprises to promote the adoption of Copilot.

Q: I want to talk more about commercial bookings and remaining performance obligations (RPO). Commercial RPO increased by $39 billion quarter-over-quarter, which is the largest quarter-over-quarter increase on record. Commercial bookings grew by 75% at constant currency, which is twice the average growth rate over the past decade. I know this metric has some volatility, but this quarter, there does seem to be some significant changes in backlog orders and bookings. Can you talk about the sources of this growth? Is it broad-based growth, or driven by a few large orders?

Amy Hood: As we mentioned before, OpenAI's commitment to Azure is one of the main factors driving growth. It should be noted that this is not a one-time collaboration; as OpenAI develops, they will continue to make commitments, and our partnership will deepen over time. In addition to this, there are other factors that have jointly driven growth.

First of all, our core business has performed exceptionally well. Core business includes the renewal of existing contracts and upselling, such as the promotion of products like Copilot and GitHub Copilot, which are important growth drivers. This quarter, the performance of the E5 product has also been outstanding. When discussing Microsoft 365 Copilot, I sometimes overlook the momentum of the entire suite, but the strong performance of the E5 product this quarter has been very encouraging.

The last factor is large Azure contracts. The situation with these contracts is showing good growth trends, just as we expected. There are two types of large Azure contracts: one is where existing customers have increased their investment after fulfilling their previous contractual obligations, which fully reflects their recognition and confidence in the platform; the other is contracts signed by new customers, which also performed well this quarter.

So, you are right, seeing such high growth data might make one feel that only one factor is at play. But in reality, in addition to the OpenAI factor, the stable execution of various business lines is also an important reason for achieving growth.

Risk disclosure and statement of this article: Dolphin Investment Research Disclaimer and General Disclosure