
Viewpoint: The Unintelligent Powering of AI

Original publication by John Mandyck • December 10, 2025

America is headed toward an energy crunch due to unprecedented demand from AI data centers. Besides driving up costs, that demand is prompting a rush for new power sources to feed the AI beast. With 139 data centers under construction and another 268 planned, nearly every state is affected. If we continue on this path, homeowners and businesses in areas with AI data centers will bear untenable energy costs.

So how can we maintain AI leadership without bankrupting homeowners and businesses? The answer may be surprisingly simple: use less energy in buildings to offset the high energy needs of AI. 

New data shows wholesale electricity now costs 267% more than it did just five years ago in many areas near AI data centers. In some places, like Nebraska and Iowa, where Google is building large data centers, new customers may actually be lowering electricity prices in the near term by spreading the grid's fixed costs over more demand. But in many other places, AI data centers are already pushing existing supply, infrastructure and prices to the brink. This is especially pronounced in the Mid-Atlantic and Northeast. In Virginia, for example, data centers are estimated to consume 39% of total electricity, and the local utility projects they will drive peak demand up more than 75% by 2039, compared with just 10% growth without them. With affordability front and center in national, state and city elections, growing electricity bills will continue to weigh heavily at the ballot box.
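
To make the scale of these figures concrete, here is a minimal arithmetic sketch in Python. The starting wholesale price and the demand index are hypothetical placeholders; only the 267% increase and the 75% versus 10% growth projections come from the figures above.

```python
# Illustrative arithmetic only. The baseline price and demand index are
# hypothetical; the 267% increase and the 75% vs. 10% growth figures
# come from the article.

baseline_price = 30.0  # hypothetical wholesale price five years ago, $/MWh
increase = 2.67        # a 267% increase means prices grew by 2.67x the baseline

current_price = baseline_price * (1 + increase)
print(f"Wholesale price today: ${current_price:.2f}/MWh "
      f"({current_price / baseline_price:.2f}x the price five years ago)")

# Virginia peak-demand projection cited above: +75% with data centers
# vs. +10% without, by 2039.
peak_today = 100.0  # hypothetical index of today's peak demand
with_data_centers = peak_today * 1.75
without_data_centers = peak_today * 1.10
print(f"Peak demand by 2039 (index, today = 100): "
      f"{with_data_centers:.0f} with data centers vs. "
      f"{without_data_centers:.0f} without")
```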
 
At the same time, old power plants are coming back online to meet data center power demand that is expected to more than triple by 2030. It's no wonder: a ChatGPT query takes ten times more energy than a standard Google search. Case in point: Constellation Energy is investing $1.6 billion to restart Three Mile Island in Pennsylvania, the site of America's worst nuclear accident, to power Microsoft's data centers. In other states, coal-burning power plants are extending operations to power AI while spewing climate emissions and pollution that harm public health. In fact, the U.S. Department of Energy now says "most" coal-burning power plants will delay retirement for AI.
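
A back-of-the-envelope sketch of what that ratio implies at scale, assuming a placeholder figure for the energy of a standard web search and a hypothetical query volume; only the ten-times ratio comes from the paragraph above.

```python
# Back-of-the-envelope sketch of the "ten times more energy per query" claim.
# The per-query figure for a standard search and the query volume are assumed
# placeholders; only the 10x ratio comes from the text.

search_wh = 0.3                 # assumed energy per standard web search, watt-hours
ai_query_wh = search_wh * 10    # the article's 10x ratio for a ChatGPT query
queries_per_day = 1_000_000     # hypothetical daily query volume

extra_wh = (ai_query_wh - search_wh) * queries_per_day
print(f"Extra energy for {queries_per_day:,} AI queries: "
      f"{extra_wh / 1000:.0f} kWh per day")
```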

The first, best, cheapest source of energy continues to be the electron not used. Eight years of program data from 21 states shows that improving energy efficiency typically costs 3 to 68 times less than developing new power sources to meet peak grid demand. This approach offers significant potential to power AI while driving down homeowner and business costs.
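
As a rough illustration of that range, the sketch below assumes a placeholder cost for meeting a unit of peak demand with new supply; only the 3x-to-68x range comes from the program data cited above.

```python
# Rough cost comparison implied by the "3 to 68 times less" finding.
# The cost of new peak supply is a hypothetical placeholder; only the
# 3x-68x range comes from the cited program data.

new_supply_cost = 100.0                  # assumed cost of new peak supply, $ per MWh served
efficiency_low = new_supply_cost / 3     # efficiency at the least favorable ratio
efficiency_high = new_supply_cost / 68   # efficiency at the most favorable ratio

print(f"Meeting 1 MWh of peak demand: ${new_supply_cost:.0f} with new supply "
      f"vs. ${efficiency_high:.2f}-${efficiency_low:.2f} with efficiency")
```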

Consider the $1.6 billion to restart Three Mile Island. Imagine what an investment of this size could fund or incentivize: power-conserving air conditioners, office equipment, kitchen appliances and more. For comparison, that's more than the $1 billion New York utilities and the state's energy authority will spend next year on energy efficiency and electrification statewide.
 
Buildings in particular are a major opportunity. Nationally, they account for 75% of electricity demand, and common-sense conservation measures can free up electrons for AI. For example, a tune-up of existing commercial building systems can cut energy use by as much as 30% and lower costs.
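
A toy calculation of what that looks like for a single building, using a hypothetical annual consumption figure; only the 75% share and the up-to-30% tune-up saving come from the paragraph above.

```python
# Toy arithmetic for the buildings opportunity. The single-building consumption
# figure is a hypothetical placeholder; the 75% share and the up-to-30%
# tune-up saving come from the text.

buildings_share = 0.75      # buildings' share of U.S. electricity demand
annual_kwh = 1_000_000      # hypothetical commercial building, kWh per year
tune_up_saving = 0.30       # "as much as 30%" from tuning existing systems

freed_kwh = annual_kwh * tune_up_saving
print(f"One tuned-up building could free roughly {freed_kwh:,.0f} kWh per year")
print(f"Buildings account for {buildings_share:.0%} of electricity demand, "
      f"so savings like this scale across most of the grid")
```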

Another tactic is "load shifting," which moves energy demand away from peak periods through energy storage and demand management. In New York, the Brooklyn Queens Demand Management program offers good lessons. Through incentives for efficiency, demand management, solar installation and more, the program avoided a $1.2 billion investment to expand the local power substation.
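
A minimal sketch of the load-shifting idea, with entirely hypothetical numbers: a battery discharges during the evening peak and recharges overnight, trimming the peak the local grid must serve. Round-trip losses and tariff details are ignored.

```python
# Minimal toy model of load shifting: a battery charges off-peak and
# discharges during the evening peak, flattening demand on the local grid.
# All numbers here are hypothetical; only the concept comes from the text.

demand = [40, 38, 37, 36, 38, 45, 55, 65,   # hourly demand, MW (hours 0-7)
          70, 72, 74, 75, 76, 77, 79, 82,   # hours 8-15
          88, 95, 100, 98, 90, 75, 60, 48]  # hours 16-23, evening peak

battery_power = 10              # MW the battery can charge or discharge
peak_hours = range(17, 21)      # hours when the battery discharges
offpeak_hours = range(1, 5)     # hours when it recharges

shifted = list(demand)
for h in peak_hours:
    shifted[h] -= battery_power   # discharge: serve load locally, cut grid draw
for h in offpeak_hours:
    shifted[h] += battery_power   # recharge when the grid has spare capacity

print(f"Peak demand before shifting: {max(demand)} MW")
print(f"Peak demand after shifting:  {max(shifted)} MW")
```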

Given its unprecedented energy needs, the AI industry should compensate homeowners and businesses for using less electricity so more is available for growing data centers. How that's done will require policy ingenuity to answer real questions of authority and avenue. Will homeowners get a check from Microsoft or Nvidia to buy an efficient dishwasher? Probably not. But could AI tech giants and data center operators pay into a fund to finance building energy efficiency? Maybe yes, especially since such funds already exist in many states.

With five-year backlogs for the gas turbines needed to build new power plants, and a federal government cancelling renewable power development, energy-efficient buildings can become the go-to source to "power" AI by giving data centers a bigger piece of the energy pie. AI's needs are met, building operating costs fall, and so do climate emissions. It's a triple win.
 
A more intelligent approach is needed to power AI. And it's already built into the homes and businesses across America.

John Mandyck is CEO of the environmental non-profit Urban Green Council and Adjunct Professor at the University of Connecticut School of Business.
