AI boosts eCPM and user engagement (Baidu 4Q23 earnings call summary)

Below is a summary of Baidu's (BIDU.US) earnings call for the fourth quarter of 2023. For an interpretation of the earnings report itself, please refer to "BIDU-SW: AI is the Key to Turnaround".

1. Review of Key Financial Information:

2. Detailed Content of the Earnings Call:

2.1. Key Points from Executive Statements:

  1. On the Operational Front:

Ⅰ Performance Growth: a) Revenue and Profit Growth: BIDU-SW's core business delivered 8% YoY revenue growth in 2023, with the non-GAAP operating profit margin expanding from 22% to 24%; b) Incremental Revenue Sources: through improvements in advertising technology and model-building services for enterprises, BIDU-SW generated over RMB 600 million in incremental revenue in the fourth quarter of 2023.

Ⅱ Technological Innovation: BIDU-SW made significant progress in Gen AI and foundation model development. Market adoption of ERNIE and ERNIE Bot continued to increase, with ERNIE positioned as the foundational system for millions of future AI-native applications.

Ⅲ Cost Optimization: The inference cost of ERNIE has been reduced significantly; the inference cost of EB3.5 is now only about 1% of the March 2023 version, mainly thanks to BIDU-SW's unique four-layer AI architecture (chip, framework, model, and application layers) and end-to-end optimization capabilities.

Ⅳ AI-Native Applications: Leveraging ERNIE to create AI-native experiences across multiple products and services, such as the AI copilot in Baidu Wenku (BIDU-SW's document library) and the BIDU-SW search engine rebuilt on ERNIE. In addition, building an ecosystem around ERNIE to explore multiple revenue sources.

Ⅴ AI Chatbots: BIDU-SW actively promotes AI chatbot technology, especially in the service industry, encouraging small and medium-sized enterprises to build AI chatbots as new marketing and service channels.

Ⅵ AI Cloud: BIDU-SW's AI Cloud business grew 11% YoY in the fourth quarter of 2023 to RMB 5.7 billion while continuing to improve profitability; the growth of non-online marketing revenue was mainly driven by the AI Cloud business.

Ⅶ Intelligent Driving: Apollo Go made significant progress in autonomous ride-hailing services, with ride volume growing 49% YoY, and continues to work toward unit-economics (UE) breakeven.

Short-term Outlook: a) Incremental revenue is expected to grow to several billion RMB in 2024, mainly from advertising and the AI Cloud business; b) BIDU-SW will continue to invest in opportunities in Gen AI and foundation models; c) AI Cloud should maintain strong revenue growth and achieve profitability on a non-GAAP basis.

2. On the Financial Front:

Revenue Overview: Total revenue for the full year of 2023 was RMB 134.6 billion, up 9% YoY. a) Online marketing revenue: RMB 19.2 billion in the fourth quarter, roughly 70% of BIDU-SW core revenue, up 6% YoY and up 8% YoY for the full year of 2023. b) Non-online marketing revenue: RMB 8.3 billion in the fourth quarter, up 9% YoY and up 9% YoY for the full year of 2023, mainly driven by the AI Cloud business.

Costs and Expenses: a) Costs: RMB 17.4 billion in the fourth quarter, up 3% YoY; RMB 65 billion for the full year of 2023, up 2% YoY. b) Operating expenses: RMB 12.1 billion in the fourth quarter, up 5% YoY; RMB 47.7 billion for the full year of 2023, up 9% YoY.

Operating Income: Operating income was RMB 5.4 billion in the fourth quarter and RMB 21.9 billion for the full year of 2023. Core operating income was RMB 4.7 billion in the fourth quarter, with an operating profit margin of 17%.

Cash Position: As of December 31, 2023, cash and cash equivalents, restricted cash, and short-term investments totaled RMB 205.4 billion. Free cash flow in 2023 was RMB 25.4 billion.

2.2. Analysts' Questions and Answers (Q&A):

Q: How does the management view the macroeconomic landscape in China in 2024? What is the management's outlook for the growth prospects of the entire BIDU-SW company in 2024? Additionally, what percentage of AI-related revenue is expected in BIDU-SW's total revenue in 2024?

A: Despite the extremely challenging macroeconomic environment last year, our business performance remained robust. We made significant investments in the Gen AI field, and yet our non-GAAP operating profit margin still expanded YoY alongside substantial revenue growth. It is worth mentioning that we have started to realize incremental revenue from Gen AI and foundation models.

For this year, we have noticed efforts from both central and local governments to promote economic growth. During the eight-day Spring Festival holiday, we saw growth in consumption, especially in the tourism industry. However, we are still in a macro environment full of uncertainties. We are closely monitoring significant economic stimulus plans, which we believe are crucial to achieving this year's goals.

Nevertheless, BIDU-SW faces many opportunities. Our core business remains robust, and incremental revenue from Gen AI and foundation models is expected to reach several billion RMB in 2024, which will contribute to the growth of our total revenue. More specifically, thanks to our leading position in LLMs and Gen AI, more and more enterprises are building models and developing applications on the BIDU-SW cloud.

For our mobile ecosystem, we have accumulated a large user base and continuously improved our products through AI innovation to enhance our monetization capabilities. Therefore, when we integrate cloud and mobile together, I believe we will be able to sustain our long-term growth, which will outpace China's GDP growth rate.

Q: How do you view the potential for cost reduction and optimization? What attitude should we take towards investments related to artificial intelligence? In the past, you have discussed the possible lag between upfront AI investments and AI revenue contributions. How should we view the profit trend in 2024 if you plan to expand the business?

A: In addition to investing in our Gen AI business, we still have room to manage the costs and expenses of our traditional business. Looking ahead to 2024, we will continue to focus on our core business and reduce resource allocation to non-strategic businesses. Furthermore, we will keep enhancing overall organizational efficiency by streamlining execution processes and flattening organizational layers.

Therefore, this year we are very committed to continuously optimizing our operations to ensure we have a more efficient workforce. Through all these measures, our goal is to maintain the robust profitability of BIDU-SW's core business: our mobile ecosystem will continue to explore new monetization opportunities and generate steady cash flow, and AI Cloud services will continue to deliver sustained profits.

Despite our investments in artificial intelligence, we have successfully maintained a stable operating profit margin. Since 2023, when we started investing in Gen AI and large models, these investments have mainly been reflected in our capital expenditures, primarily the purchase of chips and servers for AI model training.

As capital expenditures are amortized over several years, our non-GAAP operating profit margin still expanded by about 2 percentage points YoY despite a 68% YoY increase in capital expenditures in 2023.

Looking ahead, new investments are inevitable in the development of our new AI business. However, these investments are not expected to have a significant impact on our profit margin or profit.

In the early stages of market development, we will not overly prioritize the profit margin of the AI business, because we believe that in the long run this business can generate better margins. In addition, we may carry out some promotional activities for AI-native 2C products, and we will carefully manage and closely monitor the return on investment to balance investment and growth.

Incremental advertising revenue from improvements to our advertising technology reached hundreds of millions of RMB in Q4, and incremental AI Cloud revenue from Gen AI and foundation models contributed 4.8% of total AI Cloud revenue. Going forward, we will continue to focus steadfastly on the development of Gen AI and large models.

Q: Can you quantify or prove that the advertising revenue generated by BIDU-SW is a purely incremental contribution from AIGC, rather than cannibalization of the existing search business? If artificial intelligence does indeed provide a purely incremental contribution, can we expect its growth rate to be higher than the average level, and what would growth look like after excluding the artificial intelligence factor? How should we view the growth rate of core search in 2024?

A: As the largest search engine in China, we have nearly 700 million monthly active users. We have established a very strong brand presence among Chinese internet and mobile users who rely on us to obtain comprehensive and reliable information. Therefore, we have a strong and stable base of income and traffic.

However, we are also very sensitive to macroeconomic conditions because our advertising business covers a wide range of verticals. As I mentioned before, macro factors still pose uncertainties. But Gen AI and LLMs are opening up new opportunities for us in terms of both monetization and user engagement, and the incremental revenue on the monetization side is easier to quantify.

As I mentioned earlier, Gen AI has already had a positive impact on the eCPM of advertisements, and our improved monetization system enhances our targeting capabilities, thereby generating and displaying more relevant ads. We have generated hundreds of millions of RMB in revenue from these initiatives in the fourth quarter, and we expect incremental revenue to grow to billions of RMB this year.

However, quantifying the impact on user engagement is relatively challenging. Gen AI is helping us improve user experience. In the future, we will continue to introduce new features to further increase user engagement and usage time, bringing us greater potential.

Therefore, I believe that the main source of purely incremental revenue will come from both monetization and user engagement.

Q: How should we view the incremental revenue growth driven by Gen AI? What is the product portfolio of Gen AI Cloud? What are the main growth drivers? In 2024, what should we expect in terms of overall AI Cloud revenue growth and profit margin trends for this year?

A: The total revenue of Gen AI and foundation-model-related businesses, including internal and external revenue, reached RMB 656 million in the fourth quarter, and this figure is expected to grow to several billion RMB for the full year of 2024. We see growing interest from enterprises in using Gen AI and LLMs to develop new applications and features. To achieve this, enterprises are actively building models to support their products and solutions, which is the primary way we generate revenue from external customers. We also note significant growth in model inference revenue from external customers, although inference revenue is currently still relatively small.

In the long run, this will become an important and sustainable revenue driver. Revenue generated from internal customers is also significant, as a considerable portion of it is used for model inference. BIDU-SW is the first company to rebuild all of its businesses and products using Gen AI and LLMs. As the number of products and features driven by Gen AI and LLMs continues to increase, the volume of ERNIE API calls from internal customers is growing rapidly and has reached a considerable scale. This demonstrates that ERNIE and ERNIE Bot can effectively enhance productivity and efficiency in real-world applications. In the future, more external clients will use ERNIE to develop their own applications, driving our external revenue growth.

Regarding your question about our products: we have the most robust artificial intelligence infrastructure in China for model training and inference, and it helps clients build and run models in a cost-effective and efficient manner. In addition, our MaaS (model-as-a-service) platform provides a variety of models and a complete toolkit for model building and application development, including model builders and application builders. Furthermore, we have developed our own AI-native solutions, such as GBI (Generative Business Intelligence), to improve enterprise productivity and efficiency.

Overall, we anticipate accelerated growth in AI Cloud revenue in 2024, surpassing last year's growth, and we are confident in the profitability of AI Cloud. For the traditional enterprise cloud business, we should be able to continuously improve gross margins; for the Gen AI and large-model business, the market is still in the early stages of development.

Therefore, we will adopt a fairly dynamic pricing strategy to quickly educate the market and expand our penetration among more enterprise clients. In the long run, the normalized profit margin of the new business should be higher than that of the traditional cloud business.

Q: Can I get an update on the development progress of our artificial intelligence products? How is the traffic growth? Are there any key metrics to share about new generative search? How does artificial intelligence increase search traffic? How long does this growth take to manifest? When can we expect to see rapid traffic growth or the emergence of super applications?

A: We are using generative artificial intelligence to rebuild all of our 2C products, and I believe Gen AI and foundation models are making all of our products more powerful. For search, the introduction of Gen AI allows BIDU-SW to answer a much wider range of questions, including more complex, open-ended, and comparative queries. Going a step further, we can provide direct and clear answers in a more interactive manner.

In recent months, more and more search results are no longer just content and links but answers generated by ERNIE Bot. As a result, users interact with BIDU-SW more frequently and pose new kinds of questions.

For example, an increasing number of users are using BIDU-SW for content creation, whether it's text or images. During the Chinese New Year holiday, BIDU-SW helped users create New Year greetings and generate personalized e-cards for their loved ones. This is not a typical use case for a search engine, but we see many users relying on BIDU-SW for this type of usage.

In the future, we will increasingly use ERNIE Bot to generate answers for search queries and then clarify user intent through multi-round conversations to address complex user needs through natural language. While this initiative has already enhanced the search experience, we are still in the early stages of leveraging ERNIE Bot to revamp critical search aspects. We will continue to test and iterate the Gen AI features based on user feedback. Before the large-scale launch, we will follow our typical process to test and optimize the new experience.

Overall, we believe that Gen AI will complement traditional search, ultimately improving user retention, engagement, and time spent on BIDU-SW. In addition to search, ERNIE Bot acts as a copilot in Baidu Wenku, transforming the library from a platform where users find templates and documents into a one-stop platform for creating content in various formats.

So far, I believe about 18% of the library's new paying users have been attracted by its Gen AI features. At the same time, we are attracting and helping businesses to build applications in the learning field. We believe ERNIE's success depends on its widespread and active adoption, whether through BIDU-SW's applications or third-party applications.

Q: Can you talk about Ernie's technical roadmap for 2024? Does it include multimodal features, similar to Sora, or perhaps opening an artificial intelligence store? Or maybe launching an AI agent? Can you discuss milestones or key metrics? Regarding the cost of running Gen AI, how should we consider future investments and revenue in managing inference costs? Obviously, you mentioned some ways to improve efficiency. Are there any additional levers to optimize this process?

A: Our current chip reserves should be able to advance EB4 to the next level. We will take an application-driven approach to developing ERNIE further, letting our users and customers tell us where we should improve and adjust our models. This may involve building multimodal models, AI agents, improving reliability, and so on. We focus on leveraging ERNIE to bring real value to users and customers, not just on achieving high rankings in research papers.

Price will also be very important: making high-performance foundation models affordable is key to large-scale adoption. We have been continuously reducing model inference costs; currently, the inference cost of EB3.5 is about 1% of the March 2023 version. As a result, more and more companies are willing to test, develop, and iterate their applications.

We understand that many customers need to strike a balance between efficiency, cost, and speed, so we also offer smaller language models and help customers leverage MoE (mixture of experts) to achieve optimal performance. Through our end-to-end approach, we believe there is still ample room to reduce the costs of our most powerful models and make them increasingly affordable for our customers, which will further drive adoption of our models.
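The call does not describe ERNIE's internals, so the following is only a rough, hypothetical illustration of the mixture-of-experts (MoE) idea mentioned above: a router sends each token to a small subset of expert networks, which is how an MoE model trades a large total parameter count for a lower per-token compute cost. All names, dimensions, and expert counts below are invented for illustration.

```python
# Minimal, self-contained sketch of mixture-of-experts (MoE) routing.
# Purely illustrative: this is NOT ERNIE's implementation; all dimensions,
# expert counts, and function names are hypothetical.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def moe_forward(tokens, gate_w, experts, top_k=2):
    """Route each token to its top_k experts and mix their outputs.

    tokens:  (n_tokens, d_model) input activations
    gate_w:  (d_model, n_experts) router weights
    experts: list of (w, b) pairs, each a small feed-forward "expert"
    """
    gate_probs = softmax(tokens @ gate_w)                # (n_tokens, n_experts)
    top_experts = np.argsort(-gate_probs, axis=-1)[:, :top_k]

    out = np.zeros_like(tokens)
    for t in range(tokens.shape[0]):
        # Only the selected experts run for this token, which is how MoE
        # keeps per-token compute low despite a large total parameter count.
        selected = top_experts[t]
        weights = gate_probs[t, selected]
        weights = weights / weights.sum()                # renormalize over top_k
        for wt, e_idx in zip(weights, selected):
            w, b = experts[e_idx]
            out[t] += wt * np.maximum(tokens[t] @ w + b, 0.0)  # ReLU expert
    return out

# Toy usage with random weights.
rng = np.random.default_rng(0)
d_model, n_experts, n_tokens = 16, 4, 8
tokens = rng.normal(size=(n_tokens, d_model))
gate_w = rng.normal(size=(d_model, n_experts))
experts = [(rng.normal(size=(d_model, d_model)), np.zeros(d_model))
           for _ in range(n_experts)]
print(moe_forward(tokens, gate_w, experts).shape)        # (8, 16)
```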

Internally, we are closely monitoring the number of applications powered by ERNIE. As I mentioned earlier, ERNIE now handles over 50 million queries per day, and ERNIE API calls from internal applications currently still exceed those from external applications, though calls from external applications of various scales have been increasing rapidly. This is just the beginning. As more and more end users use ERNIE, whether through BIDU-SW's applications or third-party applications, it will become more powerful, intelligent, and useful, enabling us to nurture an ecosystem around ERNIE. As these applications and models are actively used by end users, they will also bring us substantial inference revenue.

Q: How does the adoption of ERNIE by enterprises compare with peers? Could you please share the latest number of enterprises using ERNIE to build models and applications, help us understand the growth compared with the previous year, and the potential driving factors? Lastly, could you also help us understand whether it can be assumed that enterprises integrating the ERNIE API would rarely switch to other models?

A: As of December last year, approximately 26,000 enterprises of various sizes and across different industries have called our Ernie API from our cloud platform, a 150% increase from the previous quarter. The daily call volume of Ernie API has exceeded 50 million times. We believe that no other company in China can acquire as many customers and receive such a large number of API requests.

Enterprises primarily choose us for several reasons:

Firstly, we have the most cost-effective AI infrastructure in China for model building and inference, mainly due to our strong end-to-end optimization capabilities. As I mentioned earlier, Gen AI and large models are reshaping the competitive landscape of China's public cloud industry and enhancing our competitive advantage. Our strong capabilities in managing large-scale, GPU-centric cloud computing and achieving high GPU utilization continuously strengthen our AI infrastructure. As a result, we can help enterprises build and run their models on our cloud platform at a lower cost and develop AI-based applications.

Secondly, the EB series models have attracted many customers to use our cloud platform. In the past few months, we have continuously enhanced the performance of Ernie and received positive feedback from customers. We also offer Ernie models of different scales to better meet customers' cost structure needs.

Of course, we are the first company in China to introduce model-as-a-service, providing a one-stop platform for LLM and AI-based application development. Our models make it easier for enterprises to use LLM. We also provide toolkits to help enterprises easily train or fine-tune their models and develop applications on our cloud platform.

Therefore, through these toolkits, customers can efficiently train and customize models by integrating their proprietary data, and directly power their applications with the ERNIE API. We can also help customers support different product features by adopting different models through the MoE approach in application development. As a result, companies can focus on identifying customer pain points rather than spending their effort on the underlying engineering. All these measures have helped us gain a first-mover advantage in the fields of Gen AI and LLMs.
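As a purely hypothetical sketch of the workflow described above (customize a model on proprietary data, then call it from an application over an API), the snippet below posts a chat request to a placeholder endpoint. The host, path, payload fields, and authentication scheme are illustrative assumptions, not BIDU-SW's actual cloud or ERNIE API.

```python
# Hypothetical sketch of the "customize a model, then call it from an
# application" workflow. The endpoint URL, payload fields, and auth scheme
# are placeholders, NOT the actual ERNIE or BIDU-SW cloud API.
import requests

API_BASE = "https://example-cloud-platform.invalid/v1"   # placeholder host
API_TOKEN = "YOUR_ACCESS_TOKEN"                          # placeholder credential

def chat_completion(prompt: str, model: str = "my-finetuned-model") -> str:
    """Send a single-turn prompt to a hosted LLM endpoint and return its reply."""
    resp = requests.post(
        f"{API_BASE}/chat/completions",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={
            "model": model,  # e.g. a model fine-tuned on proprietary data
            "messages": [{"role": "user", "content": prompt}],
            "temperature": 0.3,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat_completion("Summarize this quarter's top customer complaints."))
```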

Regarding the last question: as more and more customers use our MaaS platform to develop AI-native applications aimed at attracting users, our cloud platform will generate and accumulate a large amount of user and customer insight, which in turn will help us further improve our toolkit.

As our tools become more user-friendly and help businesses easily fine-tune models and create applications, they will be more inclined to stay with us. Additionally, it is worth noting that at the current stage of using large language models, it is crucial for customers to create prompts that suit their choice of models.

Therefore, because they have to invest a lot of effort in building and accumulating the best prompts for the model they use, transitioning to another model becomes challenging: they would have to rebuild their prompt combinations. As a result, as the adoption rate and active usage of our platform continue to increase, customer satisfaction and switching costs will help improve customer retention.
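As a toy illustration of the switching cost described above (not anything disclosed on the call), the sketch below keeps a library of prompt templates tuned for one model family; moving to a different model family typically means re-authoring and re-validating every template. All model names and templates are invented.

```python
# Toy illustration of prompt "lock-in": templates are tuned per model family,
# so switching models means rebuilding and re-validating the whole library.
# All model names and templates here are made up for illustration.
from dataclasses import dataclass, field

@dataclass
class PromptLibrary:
    model_family: str
    templates: dict[str, str] = field(default_factory=dict)

    def add(self, task: str, template: str) -> None:
        self.templates[task] = template

    def render(self, task: str, **kwargs) -> str:
        return self.templates[task].format(**kwargs)

# Templates accumulated (and A/B tested) against one model family...
lib_a = PromptLibrary("model-family-A")
lib_a.add("summarize", "You are a concise analyst. Summarize in 3 bullets:\n{text}")
lib_a.add("classify", "Label the sentiment of the review as pos/neg/neutral:\n{text}")

# ...do not transfer for free: a different model family usually needs its own
# phrasing, formatting rules, and few-shot examples, so the library is rebuilt.
lib_b = PromptLibrary("model-family-B")
lib_b.add("summarize", "<task>summarize</task><style>three bullet points</style>\n{text}")

print(lib_a.render("summarize", text="Q4 revenue grew 6% year over year."))
```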

Q: I would like to understand how the recent further restrictions on chips in the United States have affected your artificial intelligence development? Are there any updates on alternative chips? Considering chip concerns, how does BIDU-SW differentiate from overseas counterparts in developing artificial intelligence model products and monetization methods? What can we expect to achieve, and what difficulties might we face? How will the company keep up with overseas counterparts in the coming years?

A: In the short term, the impact on our model development, product innovation, or monetization methods is minimal. As I mentioned last quarter, we already have the most powerful basic models in China. Our artificial intelligence chip reserves allow us to continue enhancing Ernie over the next year or two. The chips required for model inference do not need to be as powerful. Our reserves and the chips available on the market are sufficient to support us in powering many AI-native applications for end-users and customers.

In the long run, we may not have access to cutting-edge GPUs, but with the most efficient domestic software stack, overall, the user experience will not be affected. There is ample room for innovation at the application, model, and framework levels. Our end-to-end independently developed four-layer AI architecture, along with a strong R&D team, will support us in efficiently training and inferring models with less advanced chips, providing BIDU-SW with a unique competitive advantage among domestic peers. For enterprises and developers, adopting ERNIE when building applications will be the best and most effective way to embrace artificial intelligence.

Q: Recently, we have seen many developments in text-to-video and video-generation technologies. How do you envision this technology impacting the broader artificial intelligence industry in China, and what is the potential impact on ERNIE? Could you provide a detailed overview of ERNIE's strategic roadmap? Additionally, how is ERNIE currently performing in tasks such as text generation, text-to-image, and text-to-video generation, and what improvements do you foresee in these areas?

A: First of all, multimodality, that is, the integration of text, audio, and video, is an important direction for the future development of foundation models. This is essential for Gen AI, and BIDU-SW has already invested in this area and will continue to do so.

Secondly, when we look at the development of foundational models, the market for large language models is huge and still in a very early stage. Even the most powerful language models in the world are still not good enough for many applications, leaving plenty of room for innovation. Smaller-scale models, MOE, and agents are rapidly evolving. We are striving to make our solutions more easily accepted by various enterprises and capable of solving real-world problems in different scenarios.

Thirdly, in the field of visual foundation models, a particularly important application with huge market potential is autonomous driving, in which BIDU-SW is a pioneer and global leader. We have been using diffusion and Transformer techniques to train our video-generation models for autonomous driving purposes. We are also making continuous progress in object classification, detection, and segmentation to better understand the physical world and its dynamics. This enables us to translate the images and videos captured on the road into specific driving tasks, leading to more intelligent, adaptive, and safer autonomous driving technology.

In conclusion, our strategy is to develop the most powerful foundational models to solve real-world problems and continue to invest in this area to ensure our leadership position.

Risk Disclosure and Statement of this Article: Dolphin Research Disclaimer and General Disclosure