
Ming-Chi Kuo: DeepSeek will accelerate the trend of edge AI

Well-known analyst Ming-Chi Kuo points out that the popularity of DeepSeek will accelerate the development of edge AI, increase training demand for Nvidia's H100, and drive a surge in local deployment of LLMs. TSMC and Nvidia both expect device-side AI to grow significantly in 2026. Local deployment of DeepSeek is still limited to a small number of users, so it has little immediate impact on demand for Nvidia's cloud AI chips, but over the long run device-side usage may replace some cloud demand while also creating new cloud demand. Kuo is not pessimistic about the long-term growth of cloud services, but notes that faster-than-expected development on the device side could weigh on investment sentiment.
According to the Zhitong Finance APP, well-known analyst Ming-Chi Kuo recently stated that the edge-AI trend will accelerate following the rise of DeepSeek. DeepSeek's popularity has directly increased training demand for the Nvidia (NVDA.US) H100, showing that optimized training methods help meet training demand; an even more significant trend is the surge in local deployment of LLMs.
Kuo indicated that both TSMC and Nvidia expect significant growth in device-side AI in 2026. TSMC's earlier earnings call noted that the device-side AI trend would only become apparent in 2026, and Nvidia's AI PC processors N1X/N1 are expected to enter mass production in 4Q25/1H26.
The rise of DeepSeek has directly increased training demand for the Nvidia H100, proving that optimized training methods (which can also be viewed as cost reduction) help meet training demand, and further validating the advantages of the CUDA ecosystem (the reason users choose the H100).
However, an even more significant trend following the rise of DeepSeek is the surge in local deployment of LLMs. The optimized training methods introduced with DeepSeek R1 help improve the performance of small and mid-sized LLMs running on devices, and concerns about the data security of using cloud-based DeepSeek are also pushing users toward local deployment. More open-source models similar to DeepSeek are expected to emerge, so the trend of deploying LLMs locally should continue.
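To make "local deployment of LLMs" concrete, the sketch below loads a small distilled open-weight model with Hugging Face transformers and runs inference entirely on the local machine, so no prompt data leaves the device. The specific model name (deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B) and generation settings are illustrative assumptions, not details from Kuo's note.

```python
# Minimal sketch: running a small distilled DeepSeek-R1 model locally.
# Assumes the transformers, accelerate, and torch packages are installed;
# the model name below is an assumption for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"  # small distilled variant

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(
    MODEL,
    torch_dtype=torch.float16,  # half precision to fit consumer GPUs/laptops
    device_map="auto",          # use GPU if available, otherwise CPU
)

# Chat-style prompt; all inference happens on the local machine.
messages = [{"role": "user", "content": "Explain edge AI in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```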
Kuo stated that local deployment or use of DeepSeek is currently limited to a small number of users, so it has no immediate impact on demand for Nvidia's cloud AI chips. In the long term, device-side AI will replace some cloud services, but its growth may also create new cloud demand (as in the case of the H100), so demand on both sides will continue to grow in parallel and merge into a new AI ecosystem.
Kuo is not pessimistic about the long-term growth trend of cloud services, but attention should be paid to whether device-side AI develops faster than expected, which could cause cloud-service growth to fall short of the market's earlier optimistic expectations and weigh on future investment sentiment.
Looking ahead, Kuo pointed out that the scaling law is accelerating again thanks to the successful mass production of the GB200 NVL72, and that improved visibility into the commercialization of new AI applications (such as robotics, autonomous driving, and multimodal AI) will help reduce the uncertainty around cloud growth. TSMC remains one of the biggest winners of the device-side AI trend (owing to upgrades in device processors), but Nvidia faces significantly more competition on the device side than in the cloud, which is unfavorable for short-term investment sentiment.
