Powering Data Centers a Do-Able Challenge – Duke Study

Some of the panic about data center-driven demand growth may be overheated.

Daily Energy Insider:

Duke University released a study that examines the untapped potential of the U.S. power system to accommodate additional load.

The study, called Rethinking Load Growth: Assessing the Potential for Integration of Large Flexible Loads in US Power Systems, says the U.S. power system has the potential to more quickly add large loads while mitigating the need for costly system upgrades. It adds that this potential exists as long as those loads can occasionally cut their power use when the grid is most stressed.

The analysis also provides an estimate of the volume of new flexible load that could be added within the existing capacity of each of the 22 largest balancing authorities, which represent 95 percent of the power system.

The report's three key takeaways are:

  1. Load flexibility could be an important tool to support economic growth while maintaining grid reliability and affordability. Load flexibility refers to the ability of customers to temporarily reduce their electricity consumption from the grid. That task could be accomplished by using onsite generators, shifting workload to other facilities or reducing operations. This offers a near-term alternative to more expensive—and less climate-friendly—measures.
  2. Balancing authorities could collectively add nearly 100 gigawatts of large loads to the grid with minimal impact. The study introduces a new concept, curtailment-enabled headroom, to describe how much additional load the grid can absorb within existing capacity, given modest, brief reductions in usage. The 100-GW estimate assumes that these new loads would be curtailed for an average of 0.5 percent of their maximum uptime each year to help the grid meet peak demand, which typically occurs on extremely hot or cold days. The average curtailment event would last about two hours, which is consistent with the storage capacity of short-duration batteries.
  3. The estimated annual curtailment time is comparable to existing demand response programs already in place around the country. These programs incentivize customers, mostly industrial and commercial energy users, to change their electricity usage. The changes help reduce peak loads or provide other services, such as targeted deferral of grid upgrades or integration of wind and solar energy that are available at varying times.
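The curtailment arithmetic in takeaway 2 is easy to check. A minimal sketch, assuming continuous (8,760-hour) operation as the "maximum uptime" baseline, which is my reading of the study's framing rather than its exact model:

```python
# Back-of-envelope check on the report's curtailment figures.
# Assumptions (mine): flexible loads run year-round, curtailment averages
# 0.5% of annual hours, and events last about 2 hours each.

HOURS_PER_YEAR = 8760          # 365 days * 24 hours
CURTAILMENT_RATE = 0.005       # 0.5% of maximum uptime, per the study
AVG_EVENT_HOURS = 2.0          # typical event length cited in the study

curtailed_hours = CURTAILMENT_RATE * HOURS_PER_YEAR   # ~43.8 h/year
events_per_year = curtailed_hours / AVG_EVENT_HOURS   # ~22 events/year

# Scale to the headline 100 GW of new flexible load:
new_load_gw = 100
curtailed_energy_twh = new_load_gw * curtailed_hours / 1000   # GWh -> TWh
annual_energy_twh = new_load_gw * HOURS_PER_YEAR / 1000

print(f"{curtailed_hours:.1f} h/year curtailed, ~{events_per_year:.0f} events")
print(f"{curtailed_energy_twh:.1f} TWh forgone of {annual_energy_twh:.0f} TWh/year")
```

Under these assumptions, 100 GW of flexible load gives up roughly 44 hours (about 4.4 TWh) out of roughly 876 TWh of annual consumption, which is the sense in which the study calls the flexibility requirement "modest."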

“Our study demonstrates that existing U.S. power system capacity—intentionally designed to handle extreme peak demand swings—could accommodate significant load additions with modest flexibility measures,” said lead author Tyler Norris, a Ph.D. student at Duke University’s Nicholas School of the Environment with more than a decade of experience in the energy sector. “Overall, the findings suggest that load flexibility offers a promising near-term strategy for regulators and market participants to more quickly integrate new loads, reduce the cost of capacity expansion and enable greater focus on the highest-value investments in the electric power system.”

Jigar Shah on Open Circuit podcast:

So I think there’s real questions around whether load forecasts are significantly inflated, and if a PUC doesn’t properly scrutinize that, as we’re seeing in PJM when the utility load forecasts are just stacked on top of each other, that creates some fairly staggering statistics that don’t necessarily pass the sniff test. In PJM, folks are saying that 30 out of 32 gigawatts, roughly 94%, of new load by the end of the decade is only data centers. I think that’s potentially highly suspect, and a lot of that flows from what PUCs across that network are or aren’t doing around scrutinizing utility IRP load forecasts.

World Resources Institute:

An Electric Power Research Institute paper from 2024, for instance, found that data centers could consume anywhere between 4.6% and 9.1% of all U.S. electricity by 2030. The difference between those figures, around 200 terawatt-hours (TWh), is equivalent to the annual energy consumption of almost 11 million homes.

The consequences of this uncertainty could be massive. If not managed properly, unfettered growth could lead to higher energy bills for consumers, increased greenhouse gas emissions and a less reliable energy system. Utilities across the country are already seeking rate increases in response to new data center demand, coupled with more frequent severe weather events. While some efforts are being made to isolate data center-related costs, homes and businesses could end up stuck with extra costs from overbuilt, unnecessary or underutilized infrastructure. Furthermore, plans to expand natural gas generation and delay coal plant retirements to support data center demand could lock in greenhouse gas emissions for decades, even if that demand fails to materialize.

5 thoughts on “Powering Data Centers a Do-Able Challenge – Duke Study”


  1. I’d seen, in Utility Dive or elsewhere, coverage of how grid demand forecasts are being skewed by data center developers shopping a single planned facility to two or three different locales, so that each locale projects new load for a site that might never get built at all. A potential data center bubble inflated by AI hype, combined with this competitive-bidding practice, could lead to bad siting decisions for new generation.

    Whatever comes of all this, one thing I think would be helpful would be to require the gigantic sites used for training to sit behind large-scale battery installations designed to prevent on-site power fluctuations from disturbing power quality on the local grid feeding them. I’ll link to a paper that isn’t peer-reviewed yet, but looks at the issue.

    Here’s a snippet from the abstract:
    “However, there is a largely overlooked issue as challenging and critical as AI model and infrastructure efficiency: the disruptive dynamic power consumption behaviour. With fast, transient dynamics, AI infrastructure features ultra-low inertia, sharp power surge and dip, and a significant peak-idle power ratio. The power scale covers from several hundred watts to megawatts, even to gigawatts. These never-seen-before characteristics make AI a very unique load and pose threats to the power grid reliability and resilience.”

    “The Unseen AI Disruptions for Power Grids: LLM-Induced Transients”
    https://arxiv.org/html/2409.11416v1


    1. Right, I was looking for that article but couldn’t find it on short notice; let me know if you find it.


      1. I think this was it – longer ago than I was guessing.

        A fraction of proposed data centers will get built. Utilities are wising up.
        One expert estimated that speculative interconnection requests were five to 10 times more than the number of actual data centers, but the scale of the problem remains elusive.

        Published May 15, 2025
        https://www.utilitydive.com/news/a-fraction-of-proposed-data-centers-will-get-built-utilities-are-wising-up/748214/


  2. Jevons is deeply unpopular here, ha ha, but just saying: the data center phenomenon is Jevons Paradox in action – make a commodity more efficient (electricity in this case), and humanity will find some way to use it more than ever before. It doesn’t decrease as a result – it increases.

    Renewables can replace fossil fuel use, yes, but increased usage makes that transition much more difficult and time-consuming.

    One person has written that increasing efficiency in data centers themselves is more likely to increase electricity usage than decrease it:
    https://www.nature.com/articles/s44284-025-00289-9

