Jensen Huang: AI can help improve electricity utilization, and nuclear energy is a good choice for data center energy
Although training AI models consumes large amounts of energy, Jensen Huang believes electricity can be distributed more effectively through smart grids, improving the efficiency of the computing process. He said that "completing the same work can save 100 times, 1,000 times the energy." He is skeptical of outside predictions of massive AI electricity consumption, saying "there are likely redundant calculations."
NVIDIA CEO Jensen Huang believes nuclear power can help meet the enormous energy demands of artificial intelligence (AI) technology and its applications, and that the benefits AI brings to the grid and to society can ultimately offset that consumption.
On Friday, September 27th, Eastern Time, in an interview with Bloomberg, Huang said that nuclear power is a good option for meeting the growing energy needs of data centers. He said:
"Nuclear energy is one of the best energy sources, and a sustainable one. It won't be the only energy source; we need a variety of sources and must balance supply, cost, and sustainability."
With the explosion of AI applications, high energy consumption has become an increasingly prominent issue. Large global companies are now pouring billions of dollars into what they see as the future infrastructure of computing, making the energy problem more acute. In some regions, generating capacity is already insufficient to support new data centers; due to power constraints, some data centers cannot run at full capacity, and others are sited far from population centers.
In addition to being optimistic about nuclear energy supporting AI energy demands, Huang also believes that AI can help improve energy efficiency.
The same Friday, at a Bipartisan Policy Center event in Washington, Huang told reporters that future growth in electricity demand will be offset by reductions in grid waste and by the "amazing productivity" of "better energy, better carbon capture systems, and better energy production materials."
Although training AI models requires large amounts of energy, Huang believes electricity can be distributed more effectively through smart grids, improving the efficiency of the computing process:

"This is the most energy-efficient way to perform computation. Across many applications, completing the same work can save 100 times, 1,000 times the energy."
The massive energy consumption of AI applications has become a highly concerning issue in academia and industry.
At the Davos meeting earlier this year, OpenAI CEO Sam Altman said that the electricity consumed by AI technology will far exceed expectations. A report released by the International Energy Agency (IEA) in January estimated that by 2026, the combined electricity consumption of global data centers, AI, and the cryptocurrency industry could double, far outpacing the roughly 3.4% annual growth expected in global electricity demand over the next three years. By 2026, total data center electricity consumption may exceed 1,000 terawatt-hours (1 terawatt-hour = 1 billion kilowatt-hours).
A Wells Fargo report predicts that after a decade of flat growth, U.S. electricity demand will rise 20% by 2030, with AI data centers alone expected to add about 323 terawatt-hours of demand. For comparison, New York City currently consumes about 48 terawatt-hours per year. Boston Consulting Group predicts that by the end of 2030, U.S. data center electricity consumption will be three times its 2022 level, with the increase driven mainly by AI model training and more frequent AI queries. Goldman Sachs estimates that data centers will account for 8% of total U.S. electricity consumption by 2030.
However, on Friday, Jensen Huang expressed skepticism about outside predictions of massive AI electricity consumption, saying "there are likely redundant calculations." He said the U.S. federal government needs to "better understand actual electricity needs," and that policymakers should "ensure that no U.S. company has to build data centers abroad for energy reasons."

A recent Wall Street News article noted that, needing large amounts of clean energy to power its AI data centers, Microsoft signed a 20-year power purchase agreement with Constellation Energy last Friday, the Three Mile Island deal, which may spur the optimization and expansion of the nuclear energy supply chain, including uranium mining, nuclear fuel processing, and reactor construction and maintenance.
The agreement highlights the market's huge potential demand for clean energy. As data centers and other large consumers increase their electricity usage, demand for nuclear power, a stable, low-carbon option, may continue to grow.
Jensen Huang also mentioned Microsoft's Three Mile Island deal on Friday, calling it "fantastic." He noted that NVIDIA's new generation of chips consumes more power than the previous one but is more efficient, training and running AI software faster while replacing multiple older components.
Separately, Jensen Huang said he is doing his best to serve Chinese customers while complying with U.S. export restrictions: "The first thing we have to do is comply with all the policies and regulations being implemented, and at the same time make every effort to compete in the markets we serve, where many customers rely on us; we will do our best to support them."