
AI could consume nearly half of global datacentre power by year-end, new analysis warns

Artificial intelligence systems could account for nearly half of all power consumption in global datacentres by the end of this year, according to new research — fuelling growing concerns over the environmental impact of AI technologies.

The analysis, conducted by Alex de Vries-Gao, founder of the Digiconomist tech sustainability platform, suggests that AI could represent up to 49% of total datacentre energy use by the end of 2025. The study is due to be published in the energy journal Joule and comes just days after the International Energy Agency (IEA) forecast that AI could require nearly as much electricity by the end of the decade as Japan consumes today.

Based on the electricity drawn by chips from major AI hardware suppliers including Nvidia, AMD and Broadcom, the research estimates that AI already accounts for around 20% of total datacentre energy consumption — a significant slice of the 415 terawatt hours (TWh) that, according to the IEA, datacentres worldwide (excluding cryptocurrency mining) consumed last year.

De Vries-Gao factored in variables such as hardware efficiency, cooling systems, and workload intensity to estimate AI’s growing share of demand. He warns that the pace of expansion in AI hardware and model training could soon drive AI-specific energy consumption to 23 gigawatts — more than twice the total power usage of the Netherlands.
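The arithmetic behind these figures can be checked with a quick back-of-envelope conversion. The sketch below uses only the numbers quoted above (415 TWh, a 20% share, 23 GW); it assumes, for the annualised figure, a constant draw over a full year, which is an illustrative simplification rather than part of the study's methodology.

```python
# Back-of-envelope check of the article's figures.
# Assumption (not from the article): the projected 23 GW is treated
# as a constant draw across a full year to convert it into TWh.

GLOBAL_DATACENTRE_TWH = 415   # IEA figure for last year, excl. crypto mining
AI_SHARE_NOW = 0.20           # estimated current AI share of datacentre use
AI_POWER_GW = 23              # projected AI-specific power demand
HOURS_PER_YEAR = 8760

# AI's current slice of datacentre energy consumption:
ai_twh_now = GLOBAL_DATACENTRE_TWH * AI_SHARE_NOW
print(f"AI today: ~{ai_twh_now:.0f} TWh/year")            # ~83 TWh

# 23 GW sustained for a year, converted from GWh to TWh:
ai_twh_projected = AI_POWER_GW * HOURS_PER_YEAR / 1000
print(f"23 GW annualised: ~{ai_twh_projected:.0f} TWh/year")  # ~201 TWh
```

Under that simplification, the projected demand would roughly double AI's current estimated consumption relative to the 415 TWh datacentre total.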

“These innovations can reduce the computational and energy costs of AI,” said De Vries-Gao. “But efficiency gains can also encourage wider adoption — and ultimately more energy use.”

The analysis comes amid a rapid surge in sovereign AI initiatives, with countries investing in their own AI infrastructure — a trend likely to increase global hardware demand. One example cited is Crusoe Energy, a US-based startup that recently secured 4.5GW of gas-powered capacity for new datacentres, with OpenAI reportedly a potential customer via its Stargate joint venture.

On Thursday, OpenAI confirmed the launch of its first Stargate facility outside the US, in the United Arab Emirates. De Vries-Gao warned such developments could exacerbate dependence on fossil fuels, undermining the green ambitions of leading AI companies.

Both Microsoft and Google have admitted that their aggressive AI expansion efforts are threatening their internal environmental targets, as the energy footprint of AI workloads grows beyond projections.

Despite growing concerns, De Vries-Gao said data on AI's operational power consumption remains scarce, calling the sector "an opaque industry." While the EU AI Act will soon require companies to disclose the energy used to train models, it does not mandate reporting on the energy consumed when models are run day to day (inference), which is increasingly a major contributor to the sector's ongoing emissions.

“We urgently need more transparency on the energy cost of AI,” said Prof Adam Sobey, sustainability director at the Alan Turing Institute, the UK’s national AI research body.

Sobey added that although the front-end energy consumption of AI is high, the technology could still play a role in reducing carbon emissions elsewhere, particularly in sectors such as transport and energy, where AI-powered optimisation tools can lead to significant savings.

“I suspect we don’t need many very good use cases to offset the energy being used on the front end,” Sobey said.

As governments, investors, and companies push further into AI development, the findings underscore the need for greater visibility, regulation, and innovation to balance AI’s transformative promise with its growing environmental footprint.
