Wallstreetcn
2023.07.17 08:43

Server power consumption soars threefold! Is AI's final battle not just about computing power, but also about electricity?

Computing power is the foundation of AI development, and the bottleneck of computing power ultimately lies in electricity. As the AI arms race intensifies, the battle for electricity becomes increasingly important.

Alarm bells are ringing over both computing power and energy consumption.

Computing power is the foundation of AI development, and the bottleneck of computing power ultimately lies in electricity. How much energy does it take to train large AI models like ChatGPT?

On July 15, Digital Information World published a report finding that data centers consume roughly three times as much energy training AI models as they do running regular cloud workloads, and estimating that the power demand of US data centers will grow at about 10% per year through 2030.

The data show that training OpenAI's GPT-3 consumed 1.287 gigawatt-hours, roughly the annual electricity consumption of 120 American households. And that is only the upfront power consumed to train the model, which accounts for only about 40% of the power the model consumes once it is actually in use.
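For reference, the arithmetic behind that comparison is simple; the roughly 10,700 kWh per year used below is an assumed average for a US household, not a figure from the report:

```python
# Back-of-the-envelope check of the household comparison above.
gpt3_training_kwh = 1_287_000        # 1.287 GWh reported for training GPT-3
household_kwh_per_year = 10_700      # assumed average annual US household usage
print(gpt3_training_kwh / household_kwh_per_year)  # ~120 household-years
```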

In January 2023 alone, OpenAI consumed as much electricity as roughly 175,000 Danish households use in a year. Google AI consumes 2.3 terawatt-hours per year, equivalent to the annual electricity consumption of all households in Atlanta.

AI servers are understood to draw 6 to 8 times as much power as ordinary servers, which drives up power-supply requirements in turn: a general-purpose server used to need only two 800 W power supplies, whereas an AI server now calls for four 1,800 W high-power units. The energy-related cost per server has jumped from 3,100 yuan to 12,400 yuan, a 300% increase.
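Worked through explicitly, the quoted figures look like this (a rough sketch of the stated numbers, nothing more):

```python
# Power-supply and cost figures for general-purpose vs. AI servers, as quoted.
general_purpose_w = 2 * 800      # two 800 W supplies in a general-purpose server
ai_server_w = 4 * 1_800          # four 1,800 W supplies in an AI server
print(ai_server_w / general_purpose_w)   # 4.5x the rated power-supply budget

old_cost, new_cost = 3_100, 12_400       # yuan, energy-related cost per server
print((new_cost - old_cost) / old_cost)  # 3.0, i.e. the 300% increase cited
```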

Data center customers are also paying higher electricity bills. John Dinsdale, chief analyst at market research firm Synergy Research Group, acknowledged that data center operators are passing the additional costs of running AI applications straight on to their customers.

Some data center operators have taken the opportunity to raise commercial leasing prices to cover the added cost of powering and cooling server clusters for increasingly energy-intensive workloads.

According to CBRE Group, one of the world's largest commercial real estate services firms, data center customers range from small businesses to large cloud service providers, and electricity consumption is currently growing faster than operators can expand capacity. As AI use cases multiply, supply constraints are tightening and putting upward pressure on the prices data centers charge. CBRE's figures show that in the first three months of this year, data center customers in Northern Virginia paid $140 per kilowatt per month, up 7.7% from $130 a year earlier.
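To put that rate into perspective (the 1 MW deployment used for scale below is a hypothetical, not a CBRE figure):

```python
# CBRE's Northern Virginia pricing as a growth rate, then scaled to 1 MW.
old_rate, new_rate = 130, 140            # USD per kilowatt per month
print((new_rate - old_rate) / old_rate)  # ~0.077, the 7.7% year-on-year rise
print(new_rate * 1_000)                  # ~$140,000/month for a 1 MW IT load
```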

The Power-Hungry Giant

According to modeling and forecasts by consulting firm Tirias Research, data center power consumption driven by generative AI is expected to reach nearly 4,250 megawatts by 2028, a 212-fold increase over 2023, while the total cost of the associated data center infrastructure and operations could exceed $76 billion.
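Taken at face value, those two numbers imply that the 2023 baseline for this generative-AI load is tiny, which is why the growth multiple looks so dramatic:

```python
# Implied 2023 baseline from the Tirias forecast figures quoted above.
forecast_2028_mw = 4_250
growth_factor = 212
print(forecast_2028_mw / growth_factor)  # ~20 MW implied for 2023
```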

This amounts to an "AI-driven data center revolution," and its growth challenges the business models and profitability of emerging services such as search, content creation, and AI-driven business automation. The projected cost, moreover, is more than double Amazon AWS's annual operating cost.

The firm says the innovations delivered by AI-driven data centers come at a steep price in performance requirements and power consumption. So while the potential of artificial intelligence may be limitless, physical and cost constraints may ultimately rein it in.

To reduce costs, the firm suggests using highly optimized, simpler, and more specialized small-scale neural network models: shrinking cloud-based models, using massive-parameter networks to rapidly train smaller models (see the sketch after the quotation below), and offloading workloads from the cloud entirely, so that AI-driven applications can be distributed more economically and efficiently across smartphones, PCs, vehicles, mobile XR products, and other edge platforms:

"Five years ago, companies sounded the alarm on data center power consumption at the annual Hot Chips semiconductor technology conference, predicting that global computing demand could exceed global power generation within a decade. That was before the rapid adoption of AI-driven technologies, which have the potential to increase computing demand at an even faster rate.

There's no such thing as a free lunch: consumers will demand better AI-driven outputs, which will offset efficiency and performance improvements. As consumer usage increases, costs will inevitably rise. Shifting computation to the edge and distributing it across clients such as PCs, smartphones, and XR devices is a key way to reduce capital and operating costs."
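The "rapid training of smaller models" by a massive-parameter network that the firm alludes to is commonly implemented as teacher-student distillation. The sketch below is illustrative only, assuming PyTorch; the layer sizes, temperature, and random training data are stand-in assumptions, not details from the Tirias report:

```python
# Minimal teacher-student distillation sketch: a large "teacher" network
# supervises a much smaller "student" that is cheap enough to run at the edge.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(128, 4096), nn.ReLU(), nn.Linear(4096, 10)).eval()
student = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 2.0  # temperature that softens the teacher's output distribution

for step in range(100):                    # toy loop on random stand-in data
    x = torch.randn(32, 128)
    with torch.no_grad():
        teacher_logits = teacher(x)        # expensive model does inference only
    student_logits = student(x)
    # KL divergence between the softened teacher and student distributions
    loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Once distilled, the small student can answer requests on phones, PCs, or vehicles, which is precisely the offloading from the cloud that the firm recommends.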

In the current global race to build large models, the energy consumed and the carbon emitted are enormous.

If the current technological roadmap and development model hold, AI progress will create two problems. On one hand, data centers will keep growing in scale, consuming more power while running more slowly.

On the other hand, AI chips are evolving toward higher computing power and greater integration, relying on process technology to keep pushing up peak performance; the more advanced the process, the more power and water it consumes. In addition, with Moore's Law slowing, traditional computing is approaching its limits, running into the bottlenecks of the von Neumann architecture and the physical limits of miniaturization.

Both of these problems will exacerbate the issue of AI energy consumption.

Power Struggle in Full Swing

The enormous demand for electricity undoubtedly puts further pressure on digital infrastructure, as existing data centers are unable to meet the growing power needs of AI.

Last year, a data center in Northern Virginia faced a power outage crisis, highlighting the imminent energy problem.

To address AI's rising energy consumption, digital infrastructure company DigitalBridge plans to invest billions of dollars in building and retrofitting data centers designed specifically for generative AI workloads.

According to Digital Information World, the next generation of data centers will be better equipped than the current facilities in Virginia and Santa Clara, and will need to be sited in low-cost regions where power supply is not a constraint. To meet AI's computational demands, data center operators must adapt and adopt innovative solutions.

As the competition for AI supremacy intensifies, the battle for electricity becomes increasingly crucial.

As generative AI disrupts the data center, the demand for power challenges the business models and profitability of major companies: the innovations generative AI brings are paid for in performance requirements and power consumption.

So while the potential of artificial intelligence may be limitless, physical and cost constraints may ultimately set the ceiling. Major technology companies are exploring every possible strategy to secure power supply and lay the groundwork for a transformative energy future.