Does the United States Have the Energy for an AI Revolution?

Will artificial intelligence create a huge increase in demand for electric generation, including natural gas and nuclear power plants? Will AI redefine the energy transition? Some computing experts think it might.

But with AI development in a state of uncertainty and upheaval right now, most are saying, “Wait and see.”

A German word, “dunkelflaute,” has a place in this outlook. Hyperscale data centers also play a role. A concept known as the Jevons Paradox has a part in it. And of course, so does Elon Musk.

Late January brought a remarkable trifecta of events in artificial intelligence and energy demand.

First, by executive order, the Trump administration declared a national energy emergency in the United States. The order stated that “a precariously inadequate and intermittent energy supply, and an increasingly unreliable grid, require swift and decisive action. Without immediate remedy, this situation will dramatically deteriorate in the near future due to a high demand for energy and natural resources to power the next generation of technology.”

Then, OpenAI, Oracle and investment company SoftBank announced the formation of The Stargate Project, with plans to spend $500 billion over the next four years building data centers and other new AI infrastructure in the United States.

Elon Musk, on his own X platform, wrote that the venture partners “don’t actually have the money.” He later underlined the point by bidding less than $100 billion to buy OpenAI outright.

Also, Chinese AI company DeepSeek released its new R1 chatbot model. Analysts praised R1’s capabilities and DeepSeek researchers announced that their model was trained at a fraction of the cost of earlier, comparable AI models. Basically, AI model training turns algorithms loose on enormous datasets so they can learn to generate the most valid output possible. (See Enspired on page 40 for the full scoop on that.)
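
For readers curious about the mechanics, here is a minimal sketch of such a training loop in Python, using the PyTorch library. The model, the random stand-in data and the step count are illustrative placeholders, not any lab’s actual setup:

```python
import torch
import torch.nn as nn

model = nn.Linear(1024, 1024)  # stand-in for a vastly larger neural network
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

for step in range(1_000):           # real training runs take millions of steps
    inputs = torch.randn(32, 1024)  # stand-in for a batch of real training data
    targets = torch.randn(32, 1024)
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)  # how far off is the current output?
    loss.backward()                         # compute gradients
    optimizer.step()                        # adjust weights toward better output
```

Every pass through a loop like this runs on power-hungry GPU hardware, which is where the electricity demand comes from.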

All of these developments occurred around the same time, and it seemed as if AI were in the Wild West, with gunslingers brandishing Nvidia chips instead of six-shooters. People began to wonder just how much electric power AI development might really need in the future, and whether those demands would challenge existing power systems.

The Good News

The initial response from computing-power experts: The United States has a highly capable, high-capacity electric generation and delivery system, with considerable flexibility. So, energy concerns might be overblown.


Prasad Enjeti, professor of electrical and computer engineering at Texas A&M University, cited the efforts of the Electric Reliability Council of Texas, which oversees the flow of electric power to 27 million Texas customers.

“Electricity demand is rising fastest in Texas, where ERCOT manages 90 percent of the state’s power grid. Recognizing the impact of large energy consumers like AI data centers and Bitcoin miners, ERCOT classifies them as large flexible loads and has established an LFL Task Force,” Enjeti noted.

Some of those LFL facilities, like cryptocurrency mining operations, have agreed to curtail power usage voluntarily during periods of high grid demand or low generation availability, he observed.

“This allows them to participate in ERCOT’s energy and ancillary service markets, offering grid flexibility that helps offset the challenges of surging electricity consumption,” he noted.

In February, the Nicholas Institute for Energy, Environment and Sustainability at Duke University issued the report “Rethinking Load Growth: Assessing the Potential for Integration of Large Flexible Loads in U.S. Power Systems.”

The report was based on an analysis of 22 of the largest U.S. electric-load balancing authorities, serving 95 percent of the country’s peak load. It refers to the power system’s ability to integrate new load demand as “curtailment-enabled headroom.”

Its results “suggest the U.S. power system’s existing headroom, resulting from intentional planning decisions to maintain sizable reserves during infrequent peak demand events, is sufficient to accommodate significant constant new loads, provided such loads can be safely scaled back during some hours of the year.

“In addition, they underscore the potential for leveraging flexible load as a complement to supply-side investments, enabling growth while mitigating the need for large expenditures on new capacity,” the report stated.

Because the power system is designed to meet peak demand, it has extra capacity at non-peak times – sometimes a lot of extra capacity. With demand flexibility, that extra capacity can handle significant new power requirements.
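
The arithmetic behind curtailment-enabled headroom is simple enough to sketch in Python. Every number below is invented for illustration, not taken from the Duke report:

```python
peak_capacity_mw = 85_000  # the system is built to cover its annual peak
new_load_mw = 1_000        # a hypothetical new AI data center

# A stylized year: 8,000 ordinary hours plus 760 near-peak hours.
hourly_load_mw = [60_000] * 8_000 + [84_500] * 760

# Hours in which the new load would push the system past its capacity:
curtail_hours = sum(1 for load in hourly_load_mw
                    if load + new_load_mw > peak_capacity_mw)

print(f"Curtail during {curtail_hours} of 8,760 hours "
      f"({curtail_hours / 8_760:.1%} of the year)")
```

In this toy year, the new load fits comfortably for more than 91 percent of all hours and must back off only near the peak.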

The Caveats

And therein might lie the first point of contention, the first rub. The Duke report categorizes AI training as “delay-tolerant” and subject to “temporal flexibility.”

“The central hypothesis is that the evolving computational load profiles of AI-specialized data centers facilitate operational capabilities that are more amenable to load flexibility.

“Unlike the many real-time processing demands typical of conventional data center workloads, such as cloud services and enterprise applications, the training of neural networks that power large language models and other machine learning algorithms is deferrable,” the report noted.
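
In scheduling terms, “deferrable” means a training job can checkpoint its progress, pause while the grid is tight and resume later. A minimal Python sketch of the idea, with a randomized price standing in for a real grid signal and an invented threshold:

```python
import random

PRICE_CEILING = 100.0  # $/MWh; defer above this threshold (an assumed value)

def grid_price_now() -> float:
    """Stand-in for a real-time grid price or scarcity signal."""
    return random.uniform(20.0, 200.0)

def train_with_deferral(total_steps: int, chunk: int = 1_000) -> None:
    done = 0
    while done < total_steps:
        if grid_price_now() > PRICE_CEILING:
            # Grid is tight: checkpoint and wait instead of drawing power.
            print("deferring: grid is tight")
        else:
            done += chunk  # make training progress while power is plentiful
            print(f"trained {done} of {total_steps} steps")

train_with_deferral(5_000)
```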

However, it’s entirely possible that AI developers don’t look at power demand in the same way. They might be envisioning a continuous supply of available, high-capacity generation to train their models, especially since they see themselves in a race to develop more powerful AI tools.

A second rub comes from the nature of the energy transition. Traditionally, electric systems and grids have been seen as more or less continuously fed by power plants and other generation, like hydroelectric.

Wind and solar power are subject to a concept known as “dunkelflaute,” defined as an extended period of little or no wind or sunlight. Renewable sources of electricity are, overall, less continuous sources of power generation.

“Generative AI does indeed affect the energy transition. Hyperscale datacenter operators – Microsoft, Meta, Google, Amazon, etc. – have outlined strategies based on renewable energy and power purchase agreements,” noted Benjamin Lee, professor of electrical and systems engineering at the University of Pennsylvania.

“Combined with battery investments and flexible load scheduling, moderately sized 50-100 megawatt datacenters might have been able to compute entirely with carbon-free energy. These solutions clearly do not scale when we consider multiple-gigawatt datacenters,” he observed.

Lee said computing is still exploring the frontier of generative AI’s capabilities, requiring continued investment in data center and energy infrastructure.

“Although recent advances – e.g., DeepSeek – have demonstrated strategies for reducing the training costs for large language models, the frontier will likely be characterized by different workloads.

“First, we will look beyond training to inference, the computation required when using trained models to produce responses for users,” Lee noted.

“Inference will consume increasing amounts of energy as users and software applications make more queries to a model, as the number of users and applications increase, and as ‘reasoning’ strategies cause models to compute more to produce a better answer for each query.

“There is significant uncertainty about energy costs for inference because we have not yet found a compelling application that triggers broad user adoption. If such an application is found, inference costs would grow rapidly,” he observed.
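
A back-of-envelope estimate shows how quickly those factors multiply. In this Python sketch, every figure is an assumption chosen for illustration, not a measured value:

```python
wh_per_query = 0.3             # watt-hours for one simple query (assumed)
users = 500_000_000            # hypothetical user base
queries_per_user_per_day = 20  # hypothetical usage rate
reasoning_multiplier = 10      # "reasoning" models compute more per query

daily_wh = users * queries_per_user_per_day * wh_per_query * reasoning_multiplier
annual_twh = daily_wh * 365 / 1e12  # convert watt-hours to terawatt-hours

print(f"about {annual_twh:.0f} TWh per year under these assumptions")
```

Change any one of those multipliers and the total swings by an order of magnitude, which is exactly the uncertainty Lee describes.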

Gigawatt Datacenters

In addition, multi-modal AI incorporates not only text but audio, images, video and other forms of data. Agentic AI selects which of many specialized AI models to invoke on a user’s behalf. In retrieval-augmented generation, an AI model consumes more than just user inputs and considers additional inputs, for instance from databases and search engines.
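
Of those, retrieval-augmented generation is the easiest to make concrete. In this bare-bones Python sketch, both helper functions are hypothetical stand-ins, not any product’s real API:

```python
def search_documents(question: str) -> list[str]:
    """Hypothetical stand-in for a database or search-engine lookup."""
    return ["(retrieved passage 1)", "(retrieved passage 2)"]

def generate_answer(prompt: str) -> str:
    """Hypothetical stand-in for a call to a large language model."""
    return f"(model output for: {prompt[:40]}...)"

def answer_with_rag(question: str) -> str:
    passages = search_documents(question)  # inputs beyond the user's own text
    prompt = "\n".join(passages) + "\n\nQuestion: " + question
    return generate_answer(prompt)         # the model sees question plus context

print(answer_with_rag("How much power will gigawatt datacenters need?"))
```

Each retrieval round adds computation to every query, one more reason inference demand is expected to grow.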

“All of these capabilities will extend the frontier of today’s models and require more computation. Training for these new models requires many GPUs working in a coordinated fashion, which leads to discussion about gigawatt datacenters that deploy a million GPUs,” Lee noted.

Gigawatt datacenters may be coming. DeepSeek hinted that AI training might be less power-voracious than earlier predicted, but some observers then cited Jevons Paradox, where technological advancement that makes a resource more efficient to use can actually result in higher resource consumption.
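
The paradox reduces to simple arithmetic: if training becomes ten times cheaper but, in response, thirty times as many models get trained, total consumption still triples. A Python sketch with invented numbers:

```python
gwh_per_model = 50.0  # energy to train one model before the advance (assumed)
models_trained = 10   # models trained per year before the advance (assumed)

efficiency_gain = 10  # training becomes 10 times cheaper...
demand_growth = 30    # ...so 30 times as many models get trained

before = gwh_per_model * models_trained
after = (gwh_per_model / efficiency_gain) * (models_trained * demand_growth)

print(f"before: {before:.0f} GWh per year, after: {after:.0f} GWh per year")
```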

Currently, uncertainty over AI and future energy demand has brought confusion to all sides: computing resources, power supply and the energy transition outlook.

Writing for the World Economic Forum, Leila Toplic, chief communications and trust officer for Carbonfuture, noted “the frenzy surrounding DeepSeek’s high-performance, low-cost offering has also exposed a deeper, more troubling reality:

“There is no reliable way to forecast AI-driven energy demand.”

