Hate to be a ‘Debbie Downer’ but all those prompts we’re using to make action figures, Ghibli memes, and the countless less exciting life and business prompts we’re stuffing into ChatGPT and other popular generative AI systems are coming at a cost, and one that may be landing on our doorsteps.

Don’t get me wrong: I’m a huge fan of AI, as I think it’s the first technology in a generation with truly society-altering implications. But if you’re like me, you’ve been reading for some time about the ultra-high energy costs associated with Large Language Models (LLMs), especially training them, which, according to the IEEE, “involves thousands of graphics processing units (GPUs) running continuously for months.”

AI model training is resource-intensive. Compared to traditional programming, it’s like the difference between playing checkers and interdimensional chess against all the galaxies in the Star Trek universe. The number of parameters these systems adjust to learn the essence of something, so they can instantly recognize a dog or a tree because they understand what makes one up, is almost inconceivable in human terms.

AI understanding is so much more complex than pattern matching. And not only do these models need to understand these things, they also need to know how to replicate representations of trees, dogs, cars, people, and scenarios, and realistically at that.

Feeding the AI monster

It’s a heavy lift, and as Penn State Institute of Energy and the Environment noted in its April 2025 report, “By 2030–2035, data centers could account for 20% of global electricity use, putting an immense strain on power grids.”

However, those energy costs are rising in real time now, and what I never really accounted for is how energy availability is a sort of zero-sum game. There’s only so much of it, and when some part of the grid is eating more than its fair share, the remaining customers have to divvy up what’s left and shoulder skyrocketing costs to keep backfilling their energy needs (as well as the energy needs of the data centers).

In the US, we’re seeing this scenario play out in our pocketbooks as, according to PJM Interconnection (one of the country’s largest grid operators), energy bills are rising in response to AI’s overwhelming energy demands.

Data centers, which are dotted across the US, are often responsible for serving the cloud-based intelligence needs of systems like ChatGPT, Gemini, Copilot, Meta AI, and others. The need for supporting live responses and fresh training to keep the models in step with current information is putting pressure on our creaky energy infrastructure.

PJM, it seems, is spreading the cost of supporting these data centers across the network, and it’s hitting customers to the tune of, according to this report, as much as a 20% increase in their energy bills.
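To put that worst-case 20% figure in household terms, here is a minimal back-of-the-envelope sketch. The $150 monthly bill is an invented illustrative number, not anything reported by PJM:

```python
# Rough illustration of a percentage rate increase on a monthly bill.
# All dollar figures are hypothetical; only the ~20% rise comes from
# the PJM reporting cited above.

def new_monthly_bill(current_bill: float, rate_increase_pct: float) -> float:
    """Return the monthly bill after a flat percentage increase."""
    return current_bill * (1 + rate_increase_pct / 100)

# A household paying $150/month, facing the reported worst-case ~20% rise:
monthly = new_monthly_bill(150.00, 20)
extra_per_year = (monthly - 150.00) * 12

print(f"New monthly bill: ${monthly:.2f}")        # prints $180.00
print(f"Extra cost per year: ${extra_per_year:.2f}")  # prints $360.00
```

Nothing fancy, but it shows why a rate change that sounds abstract at the grid level turns into hundreds of dollars a year at the kitchen table.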

In need of a solution yesterday

Because we live on AI Time, there is no easy solution. AI development isn’t slowing down to wait for a long-term solution, with OpenAI’s GPT-5 expected soon, Agentic AI on the rise, and Artificial General Intelligence on the horizon.

As a result, energy demand will surely rise faster than we can backfill with better energy management, improved infrastructure, and new resources. The International Energy Agency predicts that in the US, “power consumption by data centers is on course to account for almost half of the growth in electricity demand between now and 2030.”

The issue is exacerbated by a faltering energy infrastructure, in which older power plants are becoming less reliable, and by new rules that restrict the use of fossil fuels. Most experts agree that renewable resources like solar and wind could help here, but that picture has recently become far less sunny.

Tilting at windmill farms

Earlier this month, the Trump Administration issued an Executive Order to “terminate the clean electricity production and investment tax credits for wind and solar facilities.” President Trump famously hates windmill farms, calling them “garbage.”

As the US pumps the brakes on clean and renewable resources, the current grid will continue to huff and puff its way through supporting untold numbers of meme-generating prompts, requests for business proposal summaries, and AI videos featuring people eating cats that turn into pasta (yes, that’s a thing).

At home, we’ll be opening our latest electricity bills and wondering why the energy bill’s too damn high. Perhaps we’ll power up ChatGPT and ask it for an explanation. One can only hope it points us back to this article, but that seems unlikely.
