The AI Energy Crunch: Bigger Models, Bigger Consequences

Artificial intelligence (AI) is rapidly becoming one of the most power-hungry technologies on the planet. New research suggests that by the end of 2025, AI’s electricity use could surpass that of Bitcoin mining, and even rival the energy consumption of entire nations like the UK or Ireland.
Driven by a surge in generative AI tools and services, the demand for AI chips and data centers has exploded. As more organizations race to build smarter models, the specialized hardware powering them, especially Nvidia’s H100 accelerators, is being deployed at a massive scale. These chips are central to the growing AI infrastructure and require significant electricity to operate.
A single Nvidia H100 chip consumes around 700 watts of power when running complex models. Multiply that by the millions of units now shipping worldwide, and the totals climb quickly. According to researcher Alex de Vries-Gao of Vrije Universiteit Amsterdam, the AI hardware produced in 2023 and 2024 alone could require between 5.3 and 9.4 gigawatts of power, more than Ireland’s total national electricity demand.
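To see how those figures hang together, here is a minimal back-of-envelope sketch in Python. The 700-watt figure is the one quoted above; the assumption that every chip runs continuously at full load, with no cooling or server overhead, is a simplification for illustration, not how the study itself models demand.

```python
# Back-of-envelope: how many always-on H100s correspond to the study's range?
# Assumes each chip draws its full 700 W continuously and ignores cooling
# and server overhead, so this is illustrative rather than rigorous.

H100_WATTS = 700  # approximate power draw of one H100 under heavy load

def fleet_power_gw(units: float, watts_per_unit: float = H100_WATTS) -> float:
    """Total draw in gigawatts if every unit runs at full power."""
    return units * watts_per_unit / 1e9

for gigawatts in (5.3, 9.4):
    units_millions = gigawatts * 1e9 / H100_WATTS / 1e6
    print(f"{gigawatts} GW ≈ {units_millions:.1f} million H100s at full load")

# Output: 5.3 GW ≈ 7.6 million chips; 9.4 GW ≈ 13.4 million chips
```

In other words, the study’s range corresponds to roughly 8 to 13 million H100-class chips drawing full power around the clock.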
But the trend doesn’t stop there. De Vries-Gao’s study, published in the journal Joule, highlights how manufacturing capacity is doubling to keep pace with demand. Taiwan Semiconductor Manufacturing Company (TSMC), a major producer of AI chips, is rapidly expanding its use of CoWoS (chip-on-wafer-on-substrate) technology, an advanced packaging method that combines high-speed memory and processors in a single package. TSMC already doubled its CoWoS capacity in 2024 and plans to do so again in 2025.
If current growth continues, AI systems could require up to 23 gigawatts of power by the end of 2025. That’s roughly equal to the UK’s average electricity demand and more than the entire global Bitcoin mining sector consumes today.
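To translate that power figure into energy terms, a sustained draw can be converted into annual consumption by multiplying by the hours in a year. The sketch below assumes round-the-clock operation, which is typical for data centers.

```python
# Convert a constant power draw (GW) into annual energy (TWh) to put
# the projected 23 GW in context. Assumes 24/7 operation.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_twh(gigawatts: float) -> float:
    """Annual energy in terawatt-hours for a constant draw in gigawatts."""
    return gigawatts * HOURS_PER_YEAR / 1000  # GWh per year -> TWh

print(f"23 GW sustained ≈ {annual_twh(23):.0f} TWh per year")
# Output: 23 GW sustained ≈ 201 TWh per year
```

That works out to roughly 200 terawatt-hours of electricity a year if the hardware runs continuously.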
This surge in AI energy consumption has alarmed climate researchers and policymakers alike. The International Energy Agency (IEA) has warned that AI could soon double the electricity use of data centers worldwide. While companies are working to improve energy efficiency and invest in renewable power, those gains are being quickly outpaced by the scale and speed of new deployments.
Adding to the challenge is the ongoing “arms race” in chip design. Each new generation of AI chips demands increasingly complex manufacturing, and newer packaging methods like TSMC’s CoWoS-L suffer from low production yields, meaning a significant share of manufacturing output goes to waste.
Some tech giants are now grappling with what they describe as “power capacity crises.” Google, for example, has struggled to secure enough electricity to fuel its growing AI infrastructure. In some cases, companies have turned to fossil fuel sources, including natural gas, to meet their needs, with one project securing 4.5 gigawatts of natural gas capacity just for AI workloads.
The environmental impact of AI varies widely depending on where the data centers are located. In coal-heavy regions like West Virginia, carbon emissions from running AI models are nearly double those in states like California, where renewables are more common.
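The mechanics behind that gap are simple: emissions scale with the carbon intensity of the local grid, so the same workload can produce very different footprints in different places. The intensity values in the sketch below are hypothetical placeholders chosen only to mirror the roughly two-to-one gap described above; they are not measured figures for West Virginia or California.

```python
# Emissions = energy consumed x grid carbon intensity.
# The intensities below are hypothetical, chosen to illustrate a ~2x gap;
# they are not official figures for any real grid.

def emissions_tonnes(energy_mwh: float, kg_co2_per_kwh: float) -> float:
    """Metric tonnes of CO2 for a workload on a given grid."""
    return energy_mwh * kg_co2_per_kwh  # MWh * (kg/kWh) == tonnes

WORKLOAD_MWH = 1_000  # hypothetical AI training run

grids = {"coal-heavy grid": 0.80, "renewables-heavy grid": 0.40}  # kg CO2/kWh
for name, intensity in grids.items():
    print(f"{name}: {emissions_tonnes(WORKLOAD_MWH, intensity):.0f} t CO2")

# Output: coal-heavy grid: 800 t CO2; renewables-heavy grid: 400 t CO2
```

The same 1,000 MWh run emits twice as much CO2 on the dirtier grid, which is why siting decisions matter as much as efficiency gains.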
Yet a serious transparency gap remains. Most tech companies don’t disclose where their AI systems run or how much power they consume. Without that data, it’s difficult for regulators, researchers, and the public to fully understand or manage the climate implications of AI’s growth.
As AI continues to expand, its electricity demands are set to challenge not only the power grid but also the global push for climate accountability. The question is no longer whether AI will consume more power, but how much more, and at what cost.
Source: TechSpot