As artificial intelligence becomes a bigger part of everyday life, sustaining its progress requires building more data centers. That's why, in January, SoftBank launched an initiative to fund AI infrastructure growth, joining many other tech companies investing in their own AI ventures.
While this expansion is promising, it comes with a challenge: rising energy consumption on aging power grids. Data centers already consume 10 to 50 times more energy per square foot than a typical commercial office building, and that demand is expected to climb further as AI continues to evolve.
Whether our current energy grids can keep up with this demand is an increasingly important question.
While replacing an entire grid is prohibitively expensive, innovations in energy infrastructure technology and in AI training methods can alleviate much of the problem at a fraction of the cost.
US Data Center Demand is Set to Triple by 2030, and Will Rely on Decades-Old Power Grids
By 2028, data centers will consume approximately 7 to 12% of total U.S. electricity, up from 4.4% just two years earlier, according to the Department of Energy. Private organizations such as the MIT Sloan School of Management and Goldman Sachs have published similar projections of rising energy usage as data centers expand.
So far, that has proven problematic in the real world. A recent Bloomberg study found that data center expansion is degrading the electricity that flows into households, particularly in regions where data centers are concentrated, such as Northern Virginia, and even in surrounding rural areas. While the limited supply of available energy is the primary cause of these disruptions, aging electrical infrastructure exacerbates the problem by delivering power that is either too high or too low for household demand, resulting in anything from flickering lights to blackouts, and even house fires.
The situation will only get worse if we don't find a way to update energy grids that are decades old; roughly 70% of the total infrastructure is more than 25 years old.
There is Much Room for Improvement in the Demands AI Places on Power Grids
But thanks to new innovations in energy infrastructure, full-blown grid replacement is not the only tool in our toolbox. For example, Grid-Enhancing Technologies (GETs) use hardware and software to boost power transmission capacity, efficiency, and reliability. In practice, GETs could boost grid capacity by rerouting electricity around congested parts of the grid during peak demand, balancing power flows during off-peak hours, and using sensors to continuously recalculate the flow of power based on real-time weather conditions and demand spikes.
According to an April 2024 Department of Energy report, some GETs can be deployed in just a couple of years, with predicted improvements averaging 10 to 30% at just a fraction of the cost.
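One GET, dynamic line rating, illustrates the weather-sensing idea: instead of a fixed worst-case limit, a transmission line's allowable current is recomputed from live conditions. The sketch below is purely illustrative; the `dynamic_line_rating` function and its coefficients are invented for this example and are not the thermal model (IEEE Std 738) that real deployments use:

```python
def dynamic_line_rating(static_rating_amps, ambient_c, wind_mps):
    """Toy dynamic line rating: scale a line's static (worst-case) rating
    up or down with real-time weather. Coefficients are made up for
    illustration, not taken from a real conductor thermal model."""
    # Static ratings assume hot, still air (e.g. 40 C with little wind),
    # so cooler or windier weather frees up unused capacity.
    temp_factor = 1.0 + 0.005 * (40.0 - ambient_c)  # cooler air -> more headroom
    wind_factor = 1.0 + 0.10 * min(wind_mps, 5.0)   # wind cools the conductor
    return static_rating_amps * temp_factor * wind_factor

# A 1000 A line on a cool, breezy day can safely carry well above
# its static rating -- extra capacity with no new wires built.
rating = dynamic_line_rating(1000.0, ambient_c=10.0, wind_mps=4.0)
```

The point of the sketch is the economics: the sensor and software layer unlocks capacity the conductor already has, which is why GETs cost a fraction of building new lines.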
On the other side of the power equation, data centers training artificial intelligence are guzzling power inefficiently. Demand for data centers is growing because AI models need ever more powerful chips and processors to perform the computations that improve their predictions during training. As models grow, they require more chips and other inputs, driving up energy and computing use.
Many researchers are working to make AI training more energy efficient using a method called Distributed Low-Communication Training of Language Models (DiLoCo). This approach reduces the amount of data that needs to be transferred between computing clusters, significantly cutting energy use. DiLoCo allows AI models to be trained on smaller, more efficient setups rather than relying solely on large data centers. Trials have shown that this method can make AI training more sustainable, marking a major breakthrough in the field.
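DiLoCo's core loop can be sketched in a few lines: each worker trains independently for many local steps, and only the resulting parameter deltas ("outer gradients") are communicated and averaged, rather than syncing gradients every step. The scalar parameter, toy quadratic loss, and simple heavy-ball momentum below are simplifications for illustration (the published method pairs an inner AdamW optimizer with an outer Nesterov-momentum step); this is a minimal sketch of the communication pattern, not the actual implementation:

```python
def local_steps(p, target, lr=0.1, steps=5):
    """Inner loop: one worker's local training on its own data shard.
    A toy quadratic loss 0.5*(p - target)**2 stands in for a real model."""
    for _ in range(steps):
        p -= lr * (p - target)  # gradient descent on the local loss
    return p

def diloco_round(p_global, targets, mom, outer_lr=0.7, beta=0.9):
    """Outer loop: workers train independently, then only their parameter
    deltas are averaged -- one communication per round instead of per step."""
    deltas = [p_global - local_steps(p_global, t) for t in targets]
    outer_grad = sum(deltas) / len(deltas)  # the only cross-cluster traffic
    mom = beta * mom + outer_grad           # momentum on the outer gradient
    p_global -= outer_lr * mom
    return p_global, mom

# Demo: two workers whose shards pull toward 1.0 and 3.0;
# the consensus optimum across both shards is 2.0.
p, m = 0.0, 0.0
for _ in range(300):
    p, m = diloco_round(p, [1.0, 3.0], m)
```

Because each round moves hundreds of inner steps' worth of progress with a single exchange, the clusters can sit in different buildings on modest interconnects, which is what makes the smaller, distributed setups possible.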
Build More Power, but Don’t Forget about the Rest of the Energy Equation
In the face of increasing energy demand driven by artificial intelligence and data center growth, the strain on America’s aging energy infrastructure is undeniable. Yet a slowdown in America’s investment in the AI race would be a mistake. Instead, the US should build more power and improve the transmission side of the energy equation while we’re at it.
While a complete overhaul of the grid remains economically out of reach, technologies like GETs offer a pragmatic and cost-effective way to optimize the supply of energy by improving efficiency without requiring a full rebuild. On the demand side, innovations like DiLoCo show that AI itself can become part of the solution by dramatically reducing the energy required to train models.
If we don't act soon, a deteriorating energy supply will strain AI development. But these advancements show that while the challenge is great, there are practical ways to meet the growing energy demand.
Co-authored by Tech and Innovation Research Fellow Pablo Garcia Quint