Wallstreetcn
2024.06.12 13:37

"Apple's Most Knowledgeable" analyst Ming-Chi Kuo: "Apple Intelligence" is expected to redefine terminal AI

"At least Apple has successfully conveyed to the public the rich features and unique selling points of its terminal AI," Guo Mingchi pointed out, currently, the "Apple Intelligence" terminal AI large model requires about 2GB or less of DRAM

If the iPhone redefined the smartphone, can "Apple Intelligence" redefine on-device AI?

Renowned Apple analyst Ming-Chi Kuo gave an affirmative answer.

In an article published on Wednesday, Ming-Chi Kuo said that the AI capabilities of Samsung's Galaxy S24 remain fairly limited, and Microsoft's AI PC has not given consumers a clear picture of what it offers, leaving Apple as the one positioned to redefine on-device AI.

At least Apple has successfully conveyed to the public the rich features and unique selling points of its on-device AI.

Ming-Chi Kuo believes that Apple's success in on-device AI is likely to push other tech companies to accelerate their research and innovation in this field, setting off industry-wide competition and rapid imitation. That competition will in turn accelerate growth across the industries tied to on-device AI.

Ming-Chi Kuo also pointed out that the current "Apple Intelligence" on-device AI model requires approximately 2GB of DRAM (Dynamic Random Access Memory) or less.

How "Apple Intelligence" Defines On-Device AI, Judging from the Supported Models

According to Apple's earlier announcement, the iPhone 15 with the A16 processor cannot support "Apple Intelligence," while devices with the M1 chip can.

This suggests that the key factor determining whether a device can support "Apple Intelligence" is DRAM capacity, rather than AI computing power (usually measured in TOPS, i.e., trillions of operations per second).

Ming-Chi Kuo estimates the M1's AI computing power at about 11 TOPS, lower than the A16's 17 TOPS, yet the A16 ships with 6GB of DRAM versus the M1's 8GB. Since the 6GB device is excluded while the 8GB device qualifies, the current "Apple Intelligence" on-device AI model most likely requires approximately 2GB of DRAM or less.
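A minimal sketch of that reasoning, assuming the chip figures above (Kuo's estimates) and an 8GB DRAM cutoff inferred from the supported-device list rather than published by Apple:

```python
# Illustrative only: models the inference that DRAM, not TOPS, decides
# "Apple Intelligence" eligibility. Chip figures are Ming-Chi Kuo's estimates;
# the 8GB cutoff is inferred from which devices qualify, not published by Apple.

CHIPS = {
    "A16 (iPhone 15)": {"tops": 17, "dram_gb": 6},
    "M1 (Mac / iPad)": {"tops": 11, "dram_gb": 8},
}

DRAM_CUTOFF_GB = 8  # assumed threshold implied by the supported-device list

for name, spec in CHIPS.items():
    supported = spec["dram_gb"] >= DRAM_CUTOFF_GB
    print(f"{name}: {spec['tops']} TOPS, {spec['dram_gb']}GB DRAM -> "
          f"{'supported' if supported else 'not supported'}")
```

Under this assumption, the higher-TOPS A16 still comes out unsupported, while the lower-TOPS M1 qualifies on memory alone.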

To power these on-device AI applications, the device needs to be able to run a large model with roughly 3 billion parameters; exactly how much DRAM that requires depends on the compression method applied to the model.
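Kuo does not show the arithmetic, but a rough back-of-envelope estimate (with illustrative bit widths, not Apple's actual compression settings) shows how a model of roughly 3 billion parameters can land around the 2GB figure once the weights are quantized:

```python
# Back-of-envelope DRAM footprint for the model weights alone (ignores the
# KV cache, runtime buffers, and OS overhead). Bit widths are illustrative
# assumptions, not Apple's actual compression settings.

def weight_footprint_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate memory needed to hold the weights, in gigabytes."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for bits in (16, 8, 4, 2):
    print(f"3B parameters @ {bits}-bit: ~{weight_footprint_gb(3.0, bits):.1f} GB")

# Prints roughly:
#   3B parameters @ 16-bit: ~6.0 GB
#   3B parameters @ 8-bit:  ~3.0 GB
#   3B parameters @ 4-bit:  ~1.5 GB   <- in line with "about 2GB or less"
#   3B parameters @ 2-bit:  ~0.8 GB
```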

Ming-Chi Kuo points out that Microsoft treats 40 TOPS of computing power as a key specification for an AI PC, but for Apple, 11 TOPS combined with cloud AI is already enough to deliver rich on-device AI applications.

Consumers shopping for a Microsoft AI PC have to work out whether a machine meets the 40 TOPS requirement, whereas Apple simply tells them which models support "Apple Intelligence," which gives it an advantage in sales strategy.

In the future, the "Apple Intelligence" on-device AI model will certainly be upgraded (most likely to a large language model with about 7 billion parameters), which will require more DRAM to run.
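Applying the same rough weight-footprint estimate to a hypothetical 7-billion-parameter upgrade (again with illustrative bit widths only) shows why the upgrade would demand noticeably more DRAM:

```python
# Same rough weight-footprint estimate for a hypothetical 7B-parameter upgrade.
# Bit widths are illustrative assumptions only.

def weight_footprint_gb(params_billions: float, bits_per_weight: float) -> float:
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for bits in (8, 4):
    print(f"7B parameters @ {bits}-bit: ~{weight_footprint_gb(7.0, bits):.1f} GB")

# Prints roughly:
#   7B parameters @ 8-bit: ~7.0 GB
#   7B parameters @ 4-bit: ~3.5 GB  <- well above the ~2GB the current model needs
```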

When that happens, will Apple use on-device AI upgrades to further differentiate its product line? And whether the user experience is as good as Apple claims remains to be seen.