Tech giants keep launching new "anti-NVIDIA" AI chips! This time it's Meta, with an update to MTIA

Wallstreetcn
2024.04.10 20:26

Tech giant Meta has released a self-developed AI chip to reduce its reliance on external suppliers such as NVIDIA. The chip is the latest version of the Meta Training and Inference Accelerator (MTIA), first released last year, and will be used to rank and recommend content on Facebook and Instagram. Meta's shift toward AI services has increased its demand for computing power, and the company plans to invest $35 billion in AI. Even as tech giants like Meta develop their own chips, the AI industry's demand for NVIDIA remains strong.

Author: Zhao Yuhe

Source: Hard AI

Meta Platforms announced on Wednesday that it is deploying a self-developed chip to help support its artificial intelligence services. Analysts believe that this move is aimed at reducing its reliance on chips from external companies such as NVIDIA.

Last year, Meta released the first version of its Meta Training and Inference Accelerator (MTIA). The chip announced on Wednesday is understood to be the latest version of MTIA, and will be used to help rank and recommend content on Facebook and Instagram.

Meta's shift towards AI services has increased its computational power demand. Last year, Meta released its own AI models, competing with OpenAI's ChatGPT. Meta also added new generative AI features to its social apps, including custom stickers and chatbot characters with celebrity faces.

In October last year, Meta announced plans to invest up to $35 billion to support AI, including data centers and hardware. At that time, Meta CEO Mark Zuckerberg stated, "AI will be our biggest investment area in 2024."

A significant portion of this expenditure may still flow to NVIDIA for purchasing the popular H100 graphics cards to power AI models. Earlier this year, Zuckerberg mentioned that the company would purchase 350,000 of these chips, each costing tens of thousands of dollars.

However, more and more tech giants are starting to develop their own chips. Meta now joins Amazon's AWS, Microsoft, and Google in attempting to reduce reliance on costly AI chips. Nevertheless, analysts believe this transition will not happen quickly: so far, the AI industry's demand for NVIDIA's AI accelerators remains strong, and the tech giants' in-house efforts have yet to make a dent in it.

Currently, the AI boom has helped NVIDIA become the world's third-largest tech company by market value, behind only Microsoft and Apple. Its revenue from data center operators totaled $47.5 billion in the 2024 fiscal year, a significant increase from the previous year's $15 billion. Analysts predict that this figure will double again in the 2025 fiscal year.

Meta's stock price rose 0.67% to $520.36 during Wednesday's trading session.

NVIDIA's stock price rose 1.76% to $868.53 during Wednesday's trading session.

Judging from online reactions, Meta's new chip announcement has received positive feedback from many users. Some said that everyone is now starting to develop their own chips, which is the right direction; keep accelerating and don't stop.

Some netizens said that the better Meta does, the more competitive pressure NVIDIA feels, and the faster humanity will reach artificial general intelligence.

Other netizens say that humanity has reached the "technological singularity" (a point at which technology advances almost infinitely fast within a very short time), with significant progress happening every day; soon, just keeping up with AI will be a full-time job.

However, some netizens analyzing from a technical perspective think Meta's chip is nothing special. Some say that when you don't care about tedious general-purpose computing workloads and only want to perform 8-bit integer operations, you can strip out a great deal of hardware; the result is essentially an oversized SIMD core with a rudimentary cache system. Everything then hinges on the compiler, and if they can't build on LLVM, they will find it very difficult.
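To make that comment concrete, here is a toy sketch (not Meta's actual design) of the 8-bit integer arithmetic it refers to: weights and activations are mapped to int8 with a scale factor, the dot product is accumulated in wider integers, and the result is scaled back to a float. Hardware that only needs to do this can drop most general-purpose machinery. The function names and scale values are illustrative, not from any real API.

```python
# Toy symmetric int8 quantization: the narrow fixed-point arithmetic
# the comment says an inference accelerator can specialize in.

def quantize(xs, scale):
    """Map floats to the int8 range [-127, 127] using a per-tensor scale."""
    return [max(-127, min(127, round(x / scale))) for x in xs]

def int8_dot(a_q, b_q, a_scale, b_scale):
    """Dot product on int8 values, accumulated in a wide integer,
    then dequantized back to a float."""
    acc = sum(x * y for x, y in zip(a_q, b_q))  # int32-style accumulator
    return acc * a_scale * b_scale

a = [0.12, -0.5, 0.33]
b = [1.1, 0.4, -0.9]
a_scale, b_scale = 0.01, 0.01

approx = int8_dot(quantize(a, a_scale), quantize(b, b_scale), a_scale, b_scale)
exact = sum(x * y for x, y in zip(a, b))
print(approx, exact)  # the int8 result closely tracks the float result
```

The trade-off the comment alludes to: the multiply-accumulate units become tiny, but choosing scales and mapping whole models onto such hardware is exactly the compiler's job.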

Some netizens note that the new MTIA chip has a small die area and a thermal design power (TDP) of just 90W (typical training accelerators run 350-500W, and multi-chip modules (MCMs) 700-1,000W), with only about one-third the floating-point throughput (TFLOPs) of an H100. Overall it may therefore be a win, similar to Google's strategy of scaling out with many small TPUs. Either way, it is a great time for chips!
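Taking that comment's figures at face value (90W TDP, roughly one-third of an H100's TFLOPs) and assuming an H100 SXM power envelope of about 700W (an assumption, not a figure from the article), a back-of-the-envelope calculation shows why the comment calls this a possible win:

```python
# Rough perf-per-watt comparison. The 90 W TDP and 1/3-TFLOPs ratio come
# from the comment above; the 700 W H100 figure is an assumed SXM envelope.
h100_tdp_w = 700.0        # assumption: H100 SXM thermal design power
mtia_tdp_w = 90.0         # TDP quoted in the comment
mtia_flops_ratio = 1 / 3  # MTIA throughput relative to H100, per the comment

# Relative efficiency = (relative throughput) / (relative power draw)
perf_per_watt_ratio = mtia_flops_ratio / (mtia_tdp_w / h100_tdp_w)
print(round(perf_per_watt_ratio, 2))
```

Under these assumed numbers the chip delivers roughly 2.6x the throughput per watt, which is the logic behind scaling out with many small, cool-running parts instead of a few large, hot ones.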

Other netizens question the point of caring about a chip they will never be able to buy.

From the capital market's perspective, some netizens say that, in any case, this is good news for TSMC...