How Much Power Will Deep Seek Suck? Can US Grid Handle Data Demand?

We’ve been hearing for several years now that big data centers and AI will need so much power that we MUST embark on a massive program to build fossil gas power plants, or the economy will collapse.
It’s not the first time there’s been a panic about generation capacity, and a few of them in the past did not pan out. Early predictions that the internet would suck up a huge portion of our supply fizzled.
In the 1960s, similar predictions that demand would double every seven years forever stumbled when the energy crisis of the 1970s showed just how cheap energy efficiency was – but not before a massive and costly overbuild of nuclear and coal plants that shook utility finances.
Now, the wild card is new AI technologies that use vastly less energy to accomplish the same ends.
Not saying we don’t need to be building – we do, and it should be wind, solar and batteries – but getting too far out over our energy skis has risks, too.

Deep Impact:

In short, the recent AI bubble (and, in particular, the hundreds of billions in spending behind it) hinged on the idea that we need bigger models, which are both trained and run on ever-larger fleets of GPUs sold almost entirely by NVIDIA, housed in bigger and bigger data centers owned by companies like Microsoft and Google. There was an expectation that this would always be the case, and that generative AI would always be energy- and compute-hungry, and thus incredibly expensive.

But then, a Chinese artificial intelligence company that few had heard of called DeepSeek came along with multiple models that aren’t merely competitive with OpenAI’s, but undercut them in several meaningful ways. DeepSeek’s models are both open source and significantly more efficient — 30 times cheaper to run — and can even be run locally on relatively modest hardware.

As a result, the markets are panicking, because the entire narrative of the AI bubble has been that these models have to be expensive because they’re the future, and that’s why hyperscalers had to burn $200 billion in capital expenditures for infrastructure to support generative AI companies like OpenAI and Anthropic. The idea that there was another way to do this — that, in fact, we didn’t need to spend all that money, had any of the hyperscalers considered a different approach beyond “throw as much money at the problem as possible” — simply wasn’t considered. 

Utility Dive:

A preliminary load forecast presented Dec. 9 by the PJM Interconnection, which hosts proportionally more data center capacity than any other load balancing authority, showed its summer and winter peak load growing by averages of 2% and 3.2% annually through 2045, up from 1.6% and 1.8% growth in its 2023 forecast.

But DeepSeek’s apparent dramatic improvements in efficiency suggest further AI performance gains may require less energy-intensive “compute” than assumed. That threatens “the bull thesis on independent power producers and most integrated utilities [that] is entirely dependent on data centers,” Jefferies said.

… regulated utilities were expected to benefit from data centers driving new generation needs, and “a slowdown in data center projections … would have an adverse impact on the higher premium utilities that investors expect to increase rate base,” Jefferies said.
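For a sense of what PJM's revised forecast means in practice, here is a quick compounding sketch. The roughly 150 GW baseline for PJM's summer peak is my own round assumption for scale, not a figure from the excerpt, and the 21-year horizon assumes the forecast runs from roughly 2024 through 2045:

```python
# Illustrative only: what a shift from 1.6%/yr to 2.0%/yr summer peak
# growth means compounded over ~21 years. The 150 GW baseline is an
# assumed round number, not from the article.
def project_peak(base_gw: float, annual_growth: float, years: int) -> float:
    """Compound a peak-load figure forward at a fixed annual growth rate."""
    return base_gw * (1 + annual_growth) ** years

old = project_peak(150, 0.016, 21)   # 2023 forecast: 1.6%/yr growth
new = project_peak(150, 0.020, 21)   # revised: 2.0%/yr through ~2045
print(round(old), round(new), round(new - old))  # 209 227 18
```

Even a 0.4-point bump in the assumed growth rate compounds to roughly 18 GW of extra projected peak by the end of the horizon, which is why small forecast revisions move so much planned investment.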

Floodlight News:

The new AI model from China, DeepSeek, uses less power and cheaper computer chips than the AI technologies currently in broad use in the United States, according to the Chinese company and analysts.

“There is a reason electricity stocks fell alongside tech stocks yesterday,” said Logan Atkinson Burke, executive director of the Alliance for Affordable Energy. “The news of more efficient AI means the plans and promises for unlimited load growth from AI points to the likelihood that energy needs have been overstated.”

The Alliance is a utility watchdog in Louisiana tracking the development of what would be one of the state’s larger power plants, built mostly to supply a massive Meta data center in north Louisiana. The cost of the center, pegged by Entergy Louisiana at $5 billion or more in state regulatory documents, was later announced as a $10 billion project.

A recent U.S. Department of Energy study found that by 2028, data centers could consume 12% of the nation’s power — they currently use about 4%. A significant percentage of that power would be for artificial intelligence.

Up to 50,000 megawatts (MW) of new electric generation could be needed by 2030 to power this data center boom, according to an analysis by S&P Global. That’s an amount that could thwart the move to fight climate change, activists worry.

But if data centers switch to a more energy efficient technology, like DeepSeek, residential and other customers could be left paying for new energy infrastructure that is not needed, consumer advocates say.
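The DOE projection above implies roughly a tripling of data-center consumption in just a few years. A back-of-envelope sketch, assuming a round 4,200 TWh for total annual US electricity generation (my number, not the article's) and a flat total:

```python
# Back-of-envelope on the DOE projection: data centers at ~4% of US
# power today vs. up to 12% by 2028. The 4,200 TWh figure for total
# annual US generation is an assumed round number, not from the article.
total_twh = 4200
today = 0.04 * total_twh       # ~168 TWh now
in_2028 = 0.12 * total_twh     # ~504 TWh by 2028
print(today, in_2028, in_2028 / today)  # the share triples if total stays flat
```

In absolute terms that is on the order of 330 TWh of new annual demand in about four years, which is the kind of number driving the S&P estimate quoted below.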

Utility Dive:

U.S. electricity demand forecasts are growing, driven by data centers, manufacturing and electrification of homes and vehicles. But it takes years to plan and build power plants and transmission lines, posing a potential barrier to the new loads.

However, the Nicholas Institute report finds that the U.S. grid has headroom to handle significant amounts of near-term flexible load. The researchers found that the balancing authorities with the largest potential load integration capacity at 0.5% annual curtailment are the PJM Interconnection at 18 GW, the Midcontinent Independent System Operator at 15 GW, the Electric Reliability Council of Texas at 10 GW, the Southwest Power Pool at 10 GW and Southern Co. at 8 GW.

“Flexible load strategies can help tap existing headroom to more quickly integrate new loads, reduce the cost of capacity expansion, and enable greater focus on the highest-value investments in the electric power system,” the researchers said.

The results also underscore the potential for using flexible load as a complement to supply-side investments, “enabling growth while mitigating the need for large expenditures on new capacity,” the researchers said.

Growing demand for grid access from new large loads has sharply increased interconnection wait times, with some utilities reporting delays of seven to 10 years, the researchers said.

Data centers typically have not participated in demand response programs, but that may change because of various factors, including changes in computational load profiles, operational capabilities and broader market conditions, according to the report. AI-focused data centers are able to shift their computational loads to different times and locations, the researchers said.

In October, the Electric Power Research Institute launched the DCFlex Initiative, which aims to show how flexible data center operations can support the grid, the report noted.

For the areas studied, the researchers found that load curtailment would occur for 85 hours a year under the 0.25% curtailment rate, 177 hours under the 0.5% curtailment rate and 366 hours under the 1% curtailment rate. On average, for 88% of the curtailment time, half the new load would continue running, according to the report.

The researchers didn’t consider transmission constraints in their study, which could limit available headroom on the grid, according to the report. They also based their calculations on peak demand levels and didn’t consider reserve margin capacity, a factor that could increase potential headroom.
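Two quick sketches on the numbers in this excerpt. First, summing the reported headroom for just the five largest balancing authorities already exceeds the roughly 50 GW that S&P projects the data center boom could need by 2030. Second, assuming the "0.5% annual curtailment rate" means 0.5% of the new load's potential annual energy is shed (my reading; the excerpt doesn't spell out the definition), the 177 curtailment hours imply fairly shallow average cuts:

```python
# Headroom at a 0.5% annual curtailment rate, per the Nicholas
# Institute figures quoted above (GW).
headroom_gw = {"PJM": 18, "MISO": 15, "ERCOT": 10, "SPP": 10, "Southern Co.": 8}
total_headroom = sum(headroom_gw.values())
print(total_headroom)  # 61 GW across just five balancing authorities

# Assumption (mine): a 0.5% curtailment rate = 0.5% of the new load's
# potential annual energy shed. Spread over the reported 177 hours of
# curtailment events, the average cut per event hour is modest.
HOURS_PER_YEAR = 8760
full_equiv_hours = 0.005 * HOURS_PER_YEAR   # ~43.8 fully-shed-equivalent hours
avg_depth = full_equiv_hours / 177          # avg fraction of new load shed
print(round(full_equiv_hours, 1), round(avg_depth, 2))  # 43.8 0.25
```

An average cut of about a quarter of the new load during event hours squares with the report's finding that half the load keeps running for most of the curtailment time.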

Still following up on this. There is a school of thought that more efficient AI, per the Jevons paradox, will simply encourage more buildout of the technology. Stay tuned.

6 thoughts on “How Much Power Will Deep Seek Suck? Can US Grid Handle Data Demand?”


  1. It’s not intelligent: it can’t make a joke or connect a canard; it’s just a bigger, more complex database

    That said, I too have been following this closely from my 20+ years in the business, and the conclusion I’m coming to is the one I’ve had all along: it’s been a vulture capital scam from the start. Musk is particularly good at it: gin up a cool concept, pony up a few bucks, and let others, usually the taxpayers, pay the rest. Kinda’ the opposite of hollowing out and selling off as scrap: puff it up till it implodes, then walk away

    I’m not surprised it would (note the language) run on a modest laptop, but that isn’t to say it will. We’re dealing with open source, The Cloud … it’s pretty hazy what is local and what is not. Almost reminds me of old dumb-terminal technology, AS400s, though here with more robust dumb terminals: basic-function workstations working out of a centralized database. Otherwise I’m reserving judgement till I have more information


  2. “Not saying we don’t need to be building – we do, and it should be wind, solar and batteries – but getting too far out over our energy skis has risks, too.”

    Because of the quick turnaround of building PV solar, grid batteries and even wind power plants, I’m guessing a larger portion of the lead time needed these days comes from physically attaching to the grid (building the transmission lines, building the power connect interface, and testing the new systems).

    Also, while this post focused on the reduced need for powering AI, the load from the increased need for air conditioning—which increases nonlinearly for every degree away from the target temperature—will Shirley eat up a lot of future power generation.
