Why measuring AI’s energy use matters as much as its economic impact
New research published in the journal Energies suggests that the long-term sustainability of the AI revolution cannot be measured by economic growth alone and may already be at risk if energy productivity fails to keep pace.
Titled “A New Lens on the Sustainability of the AI Revolution,” the research introduces a novel analytical framework to assess whether AI-driven growth is compatible with climate goals and long-term economic efficiency. Instead of focusing solely on GDP or carbon emissions, the study centers on how much economic value societies generate per unit of energy consumed.
Measuring AI through energy productivity, not growth alone
The study introduces a new indicator called Economic Productivity of Energy, or EPE. The metric measures how much gross domestic product is generated for each unit of energy consumed. According to the authors, EPE offers a clearer picture of sustainability than GDP growth alone because it captures the relationship between economic expansion and energy efficiency.
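The EPE calculation described here can be sketched in a few lines. The function below is a minimal illustration, not the study's own implementation, and the units (trillions of US dollars and terawatt-hours) plus the sample figures are assumptions chosen only to show how the ratio behaves.

```python
def economic_productivity_of_energy(gdp_usd: float, energy_twh: float) -> float:
    """EPE = GDP generated per unit of energy consumed.

    Here GDP is in US dollars and energy in terawatt-hours; the study's
    actual units may differ.
    """
    if energy_twh <= 0:
        raise ValueError("energy consumption must be positive")
    return gdp_usd / energy_twh

# Hypothetical economy: $25 trillion GDP on 4,000 TWh of energy per year.
epe = economic_productivity_of_energy(25e12, 4000)
print(f"EPE: ${epe:,.0f} of GDP per TWh")
```

Under this framing, GDP can rise while EPE falls, which is exactly the risk the authors associate with energy-hungry AI infrastructure: the numerator grows, but the denominator grows faster.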
The researchers trace EPE trends across countries and over time, drawing comparisons between different phases of industrial development. One key insight is that not all technological revolutions improve energy productivity at the same pace. Some drive economic growth while temporarily worsening energy efficiency, especially when technologies are deployed before their energy implications are fully understood.
Historical analysis plays a central role in the paper. During the first industrial revolution, the widespread adoption of steam power dramatically increased economic output but led to a sharp decline in energy productivity. Energy use surged faster than efficiency gains, and it took decades before improvements in engineering and scientific understanding restored balance. By contrast, later technological shifts such as electrification and microelectronics were grounded in more mature scientific frameworks, allowing energy productivity to rise steadily alongside economic growth.
The current AI revolution resembles those early, energy-intensive transitions more than the later, efficiency-oriented ones. AI systems deliver strong economic returns but rely on massive computational power, continuous data processing, and energy-hungry infrastructure. Training large models and running inference at scale consume vast amounts of electricity, much of it concentrated in data centers that require constant cooling and backup power.
This dynamic creates a risk that AI-driven GDP growth may outpace gains in energy efficiency, leading to stagnation or decline in EPE. If left unchecked, such a trend would undermine the sustainability of AI-led economic expansion, even if headline growth figures remain strong.
Advanced economies face the highest AI energy risk
The study’s cross-country analysis reveals stark differences in how the AI revolution may affect energy sustainability across levels of development. Advanced economies dominate global GDP and account for a large share of energy consumption, making them particularly exposed to changes in energy productivity.
Over the past four decades, advanced economies have shown a near-linear increase in EPE, reflecting steady improvements in efficiency driven by technology, regulation, and structural change. This trajectory has allowed them to grow economically while gradually reducing the energy intensity of production. However, the study warns that AI could disrupt this pattern.
Because AI adoption is fastest and most intensive in advanced economies, these countries face the greatest risk of an energy productivity slowdown. Large-scale AI deployment in sectors such as finance, manufacturing, logistics, and digital services could significantly increase electricity demand. Without parallel advances in energy efficiency, the net effect could be a flattening or reversal of long-standing EPE gains.
Developing economies present a different picture. Their improvements in EPE have historically lagged behind those of advanced economies, reflecting slower industrial upgrading and less efficient infrastructure. While AI could help leapfrog certain stages of development, the study cautions that without access to efficient energy systems, developing countries may struggle to capture AI’s benefits without exacerbating energy strain.
Underdeveloped economies show high apparent EPE values, but the authors stress that this is misleading. In many cases, high EPE reflects low industrial activity rather than genuine efficiency. As these economies industrialize and adopt AI technologies, their energy productivity could decline sharply unless efficiency is built into development strategies from the outset.
Across all groups, the study underscores that AI’s energy impact is not evenly distributed and that existing inequalities in infrastructure, governance, and scientific capacity will shape outcomes. Countries with strong regulatory frameworks and transparent energy reporting are better positioned to manage the transition, while others risk locking in inefficient growth patterns.
Why governance will decide AI’s sustainability outcome
The sustainability of the AI revolution is not predetermined by technology itself. Instead, it depends on how quickly scientific understanding, efficiency standards, and governance frameworks adapt to AI’s energy demands.
The authors highlight a paradox at the heart of AI deployment. On one hand, AI has the potential to improve energy efficiency by optimizing production processes, reducing waste, improving logistics, and enabling smarter power systems. On the other hand, the infrastructure required to support AI can dramatically increase baseline energy consumption, particularly when efficiency is not prioritized.
Without transparency, the true energy costs of AI remain largely hidden. Many companies and institutions do not publicly disclose the energy consumption of AI training and inference. As a result, policymakers and markets lack the information needed to assess trade-offs or design effective incentives.
To address this gap, the study calls for systematic monitoring of EPE as a core sustainability indicator. Regular reporting of energy productivity alongside GDP growth would make it harder to ignore declining efficiency. The authors argue that linking innovation incentives to productivity-adjusted energy performance could steer AI development toward more sustainable pathways.
Earlier industrial revolutions suffered energy productivity collapses partly because technologies were deployed before their physical limits were understood. AI, the authors suggest, is currently in a similar phase. Rapid deployment has outpaced understanding of its long-term energy implications. Closing this gap will require investment in energy-efficient algorithms, hardware optimization, and system-level design.
Regulation is another decisive factor. Just as environmental standards helped curb pollution during past industrial transitions, energy efficiency standards and disclosure requirements could shape AI’s trajectory. Without them, market forces alone may favor short-term profitability over long-term sustainability.
First published in: Devdiscourse

