Zhitong
2024.08.14 03:34

CICC: Apple releases Apple Intelligence smart assistant, heterogeneous chips may become a new direction for AI computing power

Apple unveiled the Apple Intelligence personal assistant at the 2024 Worldwide Developers Conference (WWDC 2024). It integrates powerful generative models designed to handle users' everyday tasks efficiently. The assistant is powered by the AFM foundation models, which were trained on Google's TPU chips to improve computational efficiency. Apple Intelligence also offers a range of functions, including writing and polishing text and creating playful images. For now it is limited to registered developers for trial use; developer registration costs $99 per year.

According to the Zhitong Finance APP, Changjiang Securities released a research report stating that at the 2024 Worldwide Developers Conference (WWDC 2024), Apple (AAPL.US) launched a personal intelligent assistant called Apple Intelligence. The assistant comprises multiple powerful generative models that handle users' everyday tasks quickly and efficiently and adapt to users' current activity in real time. Apple also released the AFM foundation models to power the underlying operating system, and Apple Intelligence is driven by these AFM models. In terms of compute, the AFM models were trained on Google's TPU chips, whose performance rivals that of NVIDIA's flagship compute chips.

Apple Introduces the Apple Intelligence Personal Assistant

Apple introduced the AFM foundation models to power the underlying operating system. At the 2024 Worldwide Developers Conference (WWDC 2024), Apple launched the personal intelligent assistant Apple Intelligence. Apple Intelligence comprises multiple powerful generative models that handle users' everyday tasks quickly and efficiently and adapt to users' current activity in real time. It can write and polish text, prioritize and summarize notifications, create playful images for conversations with family and friends, and take in-app actions to simplify cross-app interactions. Apple Intelligence will be available on iOS 18, iPadOS 18, and macOS Sequoia. At present it can be tried in the iOS 18.1 beta by registered developers (developer registration costs $99 per year); ordinary users will have to wait for the public release.

Apple Intelligence Is Powered by the AFM Foundation Models

The AFM foundation models consist of two parts: an on-device model and a server-side model. The on-device model is designed for specific on-device application scenarios; it handles only language-related, single-modality tasks, can be deployed locally on devices such as iPhone, iPad, and Mac, and contains about 3 billion parameters. The server-side model is designed for private-cloud scenarios; it is multimodal, generalizes better, and can handle more general tasks. These two foundation models are part of the family of generative models Apple has built. Beyond them, Apple Intelligence also includes a coding model and a diffusion model: the coding model, built on the AFM language model, injects intelligent features into Xcode, while the diffusion model helps users express themselves visually, for example in the Messages app.
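To make the two-tier setup concrete, below is a minimal Python sketch of how a request might be routed between a small on-device model and the larger server-side model. The function names and the routing rule are illustrative assumptions, not Apple's actual implementation.

```python
# Illustrative sketch of the on-device / server split described above.
# The routing rule and model calls are placeholders, not Apple's APIs.

def run_on_device_model(prompt: str) -> str:
    # Placeholder for a local call to the ~3B-parameter, language-only model.
    return f"[on-device AFM] {prompt[:40]}"

def run_server_model(prompt: str) -> str:
    # Placeholder for a call to the larger multimodal model in the private cloud.
    return f"[AFM-server] {prompt[:40]}"

def route_request(prompt: str, has_image: bool = False) -> str:
    # Short, language-only requests stay on the device; multimodal or
    # more general tasks go to the server-side model.
    if has_image or len(prompt) > 2000:
        return run_server_model(prompt)
    return run_on_device_model(prompt)

print(route_request("Polish this paragraph for tone and grammar."))
```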

AFM Server Model Outperforms GPT-3.5, Slightly Behind GPT-4

During model evaluation, Apple designed 1,393 prompts to compare the AFM models against other mainstream models. The results show that the AFM server model outperforms the Mixtral-8x22B mixture-of-experts model and GPT-3.5, while trailing GPT-4 and LLaMA-3-70B slightly. On the on-device side, the AFM on-device model performs close to mainstream on-device models on the market: human evaluation shows it outperforms models such as Gemma-7B, Phi-3-mini, Mistral-7B, and Gemma-2B, and is slightly behind LLaMA-3-8B. These results demonstrate the strong performance of the AFM on-device model, which is expected to prove highly practical on devices such as iPhone and iPad.
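As an illustration of how pairwise human-preference results of this kind are typically aggregated, here is a small Python sketch that computes a win rate with ties split evenly; the judgment data is invented and the scoring convention is an assumption, not Apple's published methodology.

```python
# Toy aggregation of pairwise human-evaluation judgments into a win rate.
# The judgments below are invented for illustration only.
from collections import Counter

# Each entry records which response a rater preferred on one prompt.
judgments = ["afm", "afm", "tie", "baseline", "afm", "tie", "afm", "baseline"]

counts = Counter(judgments)
total = len(judgments)
# Common convention: count half a win for each tie.
win_rate = (counts["afm"] + 0.5 * counts["tie"]) / total
print(f"AFM preferred on {win_rate:.0%} of prompts vs. the baseline")
```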

Heterogeneous chips may become a new direction for AI computing power development

In terms of compute, the AFM models were trained on Google TPU chips. Google provided the compute for this training: the server-side AFM-server model was trained on 8,192 TPU v4 chips. During training, Apple divided the 8,192 chips into 8 groups of 1,024 chips each; each group is linked together as a basic unit, the groups run in parallel with one another, and training data and iterations are completed within each group. The on-device AFM-on-device model was trained on 2,048 TPU v5p chips.
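One plausible way to express that grouping in code is a JAX device mesh with data parallelism across the 8 groups and model sharding within each group of 1,024 chips. The sketch below assumes a cluster that actually exposes 8,192 devices and uses illustrative axis names; Apple has not published its training code.

```python
# Hypothetical JAX mesh for 8 groups x 1,024 TPU chips, as one reading of the
# grouping described above. Requires a cluster with 8,192 visible devices.
import numpy as np
import jax
from jax.sharding import Mesh, NamedSharding, PartitionSpec

NUM_GROUPS, CHIPS_PER_GROUP = 8, 1024
devices = np.array(jax.devices()).reshape(NUM_GROUPS, CHIPS_PER_GROUP)

# "data" axis spans the 8 groups; "model" axis spans the chips inside one group.
mesh = Mesh(devices, axis_names=("data", "model"))

# Batches are split across groups (data parallelism); weight matrices are
# sharded across the chips within a group and replicated across groups.
batch_sharding = NamedSharding(mesh, PartitionSpec("data", None))
param_sharding = NamedSharding(mesh, PartitionSpec("model", None))
```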

Google's TPU performance rivals NVIDIA's flagship compute chips. The TPU (Tensor Processing Unit) is an ASIC designed specifically for tensor operations, and it achieves efficient computation through a systolic-array mechanism. Compared with a GPU, a TPU does not need to access memory as frequently; fewer memory interactions translate into significantly higher computational efficiency. As a result, the TPU achieves higher effective compute utilization than a GPU: GPU compute utilization is typically 20%-40%, whereas TPU utilization often exceeds 50%.
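A quick back-of-the-envelope calculation shows what those utilization figures imply for effective throughput; the peak-FLOPS number below is an assumed placeholder, not a benchmark of any specific chip.

```python
# Effective throughput implied by the utilization ranges cited above.
# peak_tflops is an assumed, illustrative figure for a single accelerator.
peak_tflops = 300.0

gpu_utilization = 0.30   # report cites a typical 20%-40% range for GPUs
tpu_utilization = 0.55   # report cites utilization often above 50% for TPUs

gpu_effective = peak_tflops * gpu_utilization
tpu_effective = peak_tflops * tpu_utilization
print(f"GPU effective: {gpu_effective:.0f} TFLOPS, TPU effective: {tpu_effective:.0f} TFLOPS")
```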

Risk Warning

  1. AI technology advances more slowly than expected;
  2. Downstream application demand is lower than expected.