SoftBank deploys 4,000 NVIDIA Hopper GPUs to upgrade AI platform computing power

Zhitong
2024.11.04 07:21

SoftBank announced that it has installed approximately 4,000 NVIDIA Hopper GPUs on its Japanese artificial intelligence computing platform, raising computing performance to about 4.7 exaflops. The platform was originally equipped with 2,000 Ampere GPUs; SoftBank plans to increase the total number of GPUs to about 10,000 by fiscal year 2025, targeting a total computing capacity of 25.7 exaflops. The platform will first be used by SoftBank's subsidiary SB Intuition, which is dedicated to developing large language models for Japanese.

According to Zhitong Finance APP, SoftBank (SFTBY.US) announced that it has installed approximately 4,000 NVIDIA (NVDA.US) Hopper GPUs on its expanding top-tier artificial intelligence computing platform in Japan. SoftBank began operating the platform, then offering a performance of 0.7 exaflops, in September 2023. It was previously equipped with about 2,000 NVIDIA Ampere GPUs, and with the addition of approximately 4,000 Hopper GPUs, the platform's total computing performance has risen to about 4.7 exaflops.

SoftBank plans to further strengthen the platform from fiscal year 2024 (ending March 31, 2025) through fiscal year 2025 by deploying NVIDIA DGX SuperPOD with NVIDIA DGX B200 systems, increasing the total number of installed GPUs to about 10,000. Through subsequent installations of additional GPUs, SoftBank aims to reach a total computing capacity of 25.7 exaflops in Japan.
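The reported figures allow a rough back-of-envelope check of the per-GPU contribution. This is a sketch under the assumption that the quoted numbers are peak aggregate throughput; the per-GPU value is an average, not an official specification:

```python
# Back-of-envelope arithmetic from the reported platform capacities.
# Assumption: figures are peak aggregate throughput (not a stated spec).
base_eflops = 0.7       # platform at launch, ~2,000 Ampere GPUs (Sept 2023)
current_eflops = 4.7    # after adding ~4,000 Hopper GPUs
hopper_gpus = 4_000

# Average contribution per Hopper GPU, converted to petaflops
per_hopper_pflops = (current_eflops - base_eflops) / hopper_gpus * 1000
print(f"~{per_hopper_pflops:.1f} PFLOPS per Hopper GPU on average")
```

The result, about 1 petaflop per GPU on average, is in the same range as published Hopper-generation dense low-precision throughput, which suggests the 4.7-exaflop figure is an aggregate of peak per-GPU numbers.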

The AI computing platform will first be used by SoftBank subsidiary SB Intuition, which is dedicated to developing a large language model specifically for the Japanese language. SB Intuition aims to build a large language model with approximately 390 billion parameters, supporting multiple modalities, within fiscal year 2024. The enhanced AI computing platform will further accelerate the model's development.