Wallstreetcn
2024.06.16 05:24

Why can't even the iPhone 15 run Apple Intelligence? The issue lies in memory, not computing power!

At this year's WWDC, Apple announced a new AI technology called Apple Intelligence. However, among iPhones, only the iPhone 15 Pro series supports it, along with iPad and Mac devices equipped with M1 or later chips; other iPhone models do not. Analyst Ming-Chi Kuo believes the reason is that Apple Intelligence requires more DRAM, not more AI computing power. Compared to the 16GB or 24GB of DRAM common in Android phones, Apple devices typically offer less, mostly no more than 6GB. DRAM capacity may therefore be the key constraint on which iPhones can support Apple Intelligence.

At this year's WWDC, the most surprising thing may not be iOS 18, but Apple's AI: Apple Intelligence.

However, according to Apple's official statement, Apple Intelligence will only be supported on the iPhone 15 Pro, iPhone 15 Pro Max, and iPad and Mac devices equipped with M1 or later chips.

In other words: among all iPhones currently on sale, only the 15 Pro series supports Apple Intelligence. Even the 15 and 15 Plus, released at the same time, do not, while the iPad Pro released three years ago does!

Why is this the case?

Recently, well-known analyst Ming-Chi Kuo argued that, based on this device list, the key requirement for supporting Apple Intelligence must be DRAM capacity, not AI computing power (TOPS).

The issue lies in memory, not computing power!

On paper, the M1 chip delivers about 11 TOPS of AI computing power, while the A16 delivers about 17 TOPS. However, the M1 ships with a minimum of 8GB of DRAM (up to 16GB), compared to the A16's 6GB. That extra 2GB could be the key to the puzzle. Kuo speculated:

"Therefore, Apple Intelligence's current on-device LLM should require about 2GB of DRAM or less."

Furthermore, Kuo said that Apple Intelligence's DRAM demand can be cross-checked from another angle:

"Apple Intelligence uses an on-device 3B LLM (likely FP16; the M1's NPU/ANE supports FP16 well). After compression (using a mixed 2-bit and 4-bit configuration), about 0.7-1.5GB of DRAM needs to be reserved at all times to run Apple Intelligence's on-device LLM."
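Kuo's 0.7-1.5GB range follows directly from the arithmetic, assuming the estimate covers model weights only (activations and KV cache would add more). A rough sketch:

```python
# Back-of-envelope check of Kuo's estimate. The 3B parameter count and the
# 2-bit/4-bit mix come from his note; treating the bounds as "all weights
# at 2 bits" vs "all weights at 4 bits" is an assumption for illustration.

def weights_dram_gb(params_billions: float, bits_per_weight: float) -> float:
    """DRAM needed to hold the model weights alone, in gigabytes."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

low = weights_dram_gb(3, 2)    # every weight at 2 bits
high = weights_dram_gb(3, 4)   # every weight at 4 bits
fp16 = weights_dram_gb(3, 16)  # uncompressed FP16 baseline

print(f"compressed: {low:.2f}-{high:.2f} GB")  # 0.75-1.50 GB
print(f"FP16: {fp16:.1f} GB")                  # 6.0 GB
```

The compressed figure lands almost exactly on Kuo's 0.7-1.5GB range, and the FP16 baseline of 6GB shows why compression is mandatory: an uncompressed 3B model alone would consume all of the A16's DRAM.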

Compared to the "large" DRAM of 16GB or even 24GB commonly seen in Android phones, Apple devices usually offer less DRAM, mostly not exceeding 6GB. For older models like the iPhone 13, the DRAM is only 4GB.

Therefore, many analysts believe that **Apple's 2GB "memory barrier" may prompt consumers to upgrade their old devices, and that Apple may offer higher DRAM options in the upcoming iPhone 16 series.**

Ming-Chi Kuo said:

"In the future, Apple Intelligence's on-device AI will definitely be upgraded (most likely to a 7B LLM), which will require more DRAM to run.

It is worth observing whether Apple will use this as a product differentiation strategy between high-end and low-end models."