
…a new study from researchers at Lawrence Berkeley National Laboratory and the consulting group Brattle suggests that, counterintuitively, more electricity demand can actually lower prices. Between 2019 and 2024, the researchers calculated, states with spikes in electricity demand saw lower prices overall. Instead, they found that the biggest factors behind rising rates were the cost of poles, wires and other electrical equipment — as well as the cost of safeguarding that infrastructure against future disasters.
“It’s contrary to what we’re seeing in the headlines today,” said Ryan Hledik, principal at Brattle and a member of the research team. “This is a much more nuanced issue than just, ‘We have a new data center, so rates will go up.’”

North Dakota, for example, which experienced an almost 40 percent increase in electricity demand thanks in part to an explosion of data centers, saw inflation-adjusted prices fall by around 3 cents per kilowatt-hour. Virginia, one of the country’s data center hubs, had a 14 percent increase in demand and a price drop of 1 cent per kilowatt-hour. California, on the other hand, which lost a few percentage points in demand, saw prices rise by more than 6 cents per kilowatt-hour.
That runs counter to traditional wisdom. Economics 101 teaches students that if demand rises, prices tend to go up. But electricity isn’t like most other economic markets. Most of the costs in the system aren’t from pushing electrons through the grid — what experts call variable costs. Instead, the largest costs are fixed costs — that is, maintaining the massive system of poles and wires that keeps electricity flowing. That system is getting old and is under increasing pressure from wildfires, hurricanes and other extreme weather.
Adding power customers, therefore, means more ways to divvy up those fixed costs.
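The arithmetic behind that point is simple enough to sketch. Here's a toy illustration with made-up numbers (the dollar figures and demand levels below are hypothetical, not from the study): when fixed grid costs dominate, spreading them over more kilowatt-hours pulls the average rate down even though total system cost rises.

```python
# Toy model: average retail rate = (fixed costs + variable costs) / kWh sold.
# All numbers are hypothetical, chosen only to show the fixed-cost-spreading effect.

def retail_rate(fixed_cost, variable_cost_per_kwh, total_kwh):
    """Average $/kWh when fixed costs are divided among all kWh sold."""
    return (fixed_cost + variable_cost_per_kwh * total_kwh) / total_kwh

FIXED = 1_000_000_000   # hypothetical annual poles-and-wires cost ($)
VARIABLE = 0.04         # hypothetical generation/fuel cost per kWh ($)

before = retail_rate(FIXED, VARIABLE, 10_000_000_000)  # 10 TWh sold
after = retail_rate(FIXED, VARIABLE, 14_000_000_000)   # 14 TWh: 40% more demand

print(f"rate before: {before:.3f} $/kWh")  # 0.140
print(f"rate after:  {after:.3f} $/kWh")   # 0.111
```

The variable cost per kWh is unchanged, yet the average rate falls about 20 percent, because the same fixed bill is split across 40 percent more sales — the same mechanism the study points to in high-growth states like North Dakota.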
Politicians have also been bickering over the extent to which renewables, such as wind and solar power, raise rates. But the new study shows that the costs of operating and installing wind, natural gas, coal and solar have been falling over the past 20 years. Since 2005, generation costs have fallen by 35 percent, from $234 billion to $153 billion. But the costs of the huge wires that transmit that power across the grid, and the poles and wires that deliver that electricity to customers, are skyrocketing. In the past two decades, transmission costs nearly tripled; distribution costs more than doubled.
And finally, escalating extreme-weather events are knocking out local lines, forcing utilities to spend big to make fixes. Last year, Hurricane Beryl decimated Houston’s power grid, forcing months of costly repairs. The threat of wildfires in the West, meanwhile, is making utilities spend billions on burying power lines. According to the Lawrence Berkeley study, about 40 percent of California’s electricity price increase over the last five years was due to wildfire-related costs.
Lawrence Berkeley National Lab:
4.1. National-average retail electricity prices have tracked inflation in recent years
Over the last five years, national-average retail electricity prices increased sharply in nominal terms—rising by 23 % from 2019 to 2024 (Fig. 3, left panel). However, after adjusting for inflation, prices have been trending downwards for decades and largely remained flat over the last five years outside a bump upwards in 2022 corresponding to the onset of the Ukraine-Russia war. Electricity bills are, of course, impacted by prices and consumption—and so prices, alone, do not fully describe the impact of electricity expenditures on households or the economy more broadly. Yet total electricity costs as a fraction of GDP and residential electricity costs as a fraction of overall household expenditures have also generally been on a downward trajectory.
-----
One key point worth mentioning: a lot of the panic about coming demand from data centers might be due to some shaky forecasting by utilities hoping to squeeze more revenue out of state regulators.
Charles Hua on Open Circuit Podcast:
So I think there’s real questions around whether load forecasts are significantly inflated, and if a PUC doesn’t properly scrutinize that, as we’re seeing in PJM when the utility load forecasts are just stacked on top of each other, that creates some fairly staggering statistics that don’t necessarily pass the sniff test. In PJM, folks are saying that 30 out of 32 gigawatts, roughly 94%, of new load by the end of the decade is only data centers. I think that’s potentially highly suspect, and a lot of that flows from what PUCs across that network are or aren’t doing around scrutinizing utility IRP load forecasts.



Our Austin neighborhood was without power for just over four days during the long 2021 deep freeze, but we lost power for six days after a one-day ice storm in 2022. That summer the city spent a lot of money on tree trimming services. Keeping the overhead lines clear in our tree-filled neighborhood must be a great expense for Austin Energy.
(The first place we lived in central Texas was in a new subdivision with buried power lines. The expense to bury lines in this old neighborhood would be insanely high.)