Christophe
Dec 19
Energy

No, your AI prompts aren’t causing an energy crisis

12 min video · 5 key moments
TL;DR

Your individual AI prompts use negligible energy, but data centers powering AI are reshaping electricity infrastructure and hiking bills for communities near them.

Key Insights

1

0.3 watt-hours per query
In summer 2025, OpenAI and Google disclosed that a typical text query costs between 0.24 watt-hours (Gemini) and 0.34 watt-hours (ChatGPT), roughly equivalent to running a 60W light bulb for 18 seconds.

2

Images and videos cost way more
Image generation uses roughly 10 times more energy than text (3 watt-hours), and video generation uses roughly 300 times more (90 watt-hours). On top of per-prompt costs, a meaningful share of AI's total energy goes toward training models, not just answering individual prompts.

3

One 150,000th of daily emissions
A single chatbot prompt accounts for approximately 1/150,000th of your daily carbon emissions, far less than the savings from switching one burger to plant-based; it's comparable to driving a few feet less.

4

4 percent to 12 percent by 2028
Data centers consumed over 4% of US electricity in 2024 and could reach 12% by 2028. Their power sources are 48% more carbon-intensive than the national average because most still run on fossil fuels.

5

Utilities profit from building infrastructure
Utility companies profit by building infrastructure, not by efficiently delivering power. When a data center arrives, utilities build new power lines and spread those costs across all customers' electricity bills, even those who don't use the data center.

6

Infrastructure costs hit local communities
The real energy crisis isn't individual AI use—it's that data centers are forcing expensive new infrastructure that gets subsidized by regular people through higher electricity bills, particularly in communities hosting those facilities.

Deep Dive

The Energy Myth Gets Numbers

Christophe starts from a personal question: does using ChatGPT carry hidden environmental costs worth worrying about? He discovers that companies have been secretive about energy consumption, but finds a leaderboard run by researcher Jae-Won Chung at the University of Michigan. When Christophe asks Chung for an average energy cost, Chung hedges: there's a 100x difference between the most and least efficient queries depending on model size, hardware, and request length. Then, in summer 2025, the companies actually disclosed numbers. OpenAI's Sam Altman claimed ChatGPT averages 0.34 watt-hours per query; Google estimated Gemini's median at 0.24 watt-hours. Averaging to roughly 0.3 watt-hours, Christophe puts this in perspective: it's like running a 60-watt light bulb for 18 seconds, brewing coffee for 10 seconds, or using a microwave for 1 second.
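
The bulb comparison is just a unit conversion, and it checks out. Here's a minimal sketch in Python; the 60 W bulb figure is from the video, while the ~1000 W microwave rating is a typical value I'm assuming:

```python
# Sanity-check the "0.3 Wh per query" appliance comparisons.
# Assumed wattage (not from the video): microwave ~1000 W.

def runtime_seconds(energy_wh: float, power_w: float) -> float:
    """Seconds an appliance drawing `power_w` watts runs on `energy_wh` watt-hours."""
    return energy_wh / power_w * 3600

query_wh = 0.3  # rough average of ChatGPT (0.34) and Gemini (0.24)

print(round(runtime_seconds(query_wh, 60), 1))    # 60 W bulb -> 18.0 seconds
print(round(runtime_seconds(query_wh, 1000), 1))  # ~1000 W microwave -> ~1.1 seconds
```

The same function reproduces the image and video comparisons too: 3 Wh runs the bulb for 180 seconds (3 minutes) and 90 Wh for 5,400 seconds (an hour and a half).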

The Image and Video Problem

While text queries are cheap, generating images jumps to about 3 watt-hours—10 times more, equivalent to a light bulb for 3 minutes. Video generation is far worse at 90 watt-hours, roughly an hour and a half of light bulb use. These numbers matter because platforms are aggressively pushing image and video generation everywhere. Christophe also learns that the 0.3 figure only counts answering prompts. Training AI models takes energy too. Google's research shows 60% of AI energy goes to answering queries and 40% to training, so the true per-prompt cost is closer to 0.5 watt-hours. Still, expert Andy Masley puts individual usage in context: one prompt equals about one-150,000th of your daily carbon emissions. Switching a single burger to plant-based or driving slightly less saves thousands of times more emissions than skipping a chatbot query.
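
The training overhead and the 1/150,000 figure can be roughly reconstructed. This sketch assumes a grid intensity of ~0.55 kg CO2/kWh (about 48% above a ~0.37 kg/kWh US average) and ~15.5 tonnes of CO2 per American per year; those two inputs are my own ballpark assumptions, not figures from the video:

```python
# Reconstructing the per-prompt carbon math under stated assumptions.

serving_wh = 0.3      # energy to answer one text prompt (video figure)
serving_share = 0.6   # Google: 60% of AI energy is serving, 40% training
total_wh = serving_wh / serving_share
print(round(total_wh, 2))   # 0.5 Wh once training is amortized in

# Assumed (not from the video): data-center grid intensity,
# ~48% above a ~0.37 kg/kWh US average.
grid_kg_per_kwh = 0.55
prompt_kg = total_wh / 1000 * grid_kg_per_kwh   # kg CO2 per prompt

# Assumed (not from the video): ~15.5 t CO2 per American per year.
daily_kg = 15500 / 365
print(round(daily_kg / prompt_kg))   # ~154,000: close to the 1/150,000 claim
```

The point of the exercise is that even with generous assumptions, one prompt is a rounding error next to daily emissions from food and transport.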

Data Centers Are the Real Story

Even if individual prompts are tiny, the technology's total footprint is massive. Data centers made up over 4% of US electricity consumption in 2024, with projections reaching 12% by 2028. Christophe discovers over 4,000 data centers across America on datacentermap.com. Zooming into Richland Parish, Louisiana, on Google Earth, he sees Meta building a facility that would cover a huge chunk of Manhattan and consume as much electricity as 2 million households. The problem: most data center power still comes from fossil fuels, making their electricity 48% more carbon-intensive than the national average. Tech companies are investing in nuclear and renewable energy to eventually reduce per-prompt emissions, but for now, communities hosting these data centers live with the consequences.
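
To get a feel for that scale, the 2-million-household figure can be converted into annual terawatt-hours. The per-household consumption (~10,500 kWh/yr) and the US total (~4,100 TWh/yr) are ballpark assumptions of mine, not figures from the video:

```python
# Ballpark scale of the Louisiana facility. Assumed (not from the video):
# an average US household uses ~10,500 kWh/yr; total US electricity
# consumption is ~4,100 TWh/yr.

households = 2_000_000
kwh_per_household = 10_500
facility_twh = households * kwh_per_household / 1e9   # kWh -> TWh
print(round(facility_twh, 1))   # ~21 TWh per year

us_total_twh = 4100
print(round(facility_twh / us_total_twh * 100, 2))   # ~0.5% of US electricity
```

One facility drawing roughly half a percent of national electricity makes it clear why a few thousand data centers add up to the 4-12% range.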

Infrastructure Costs and Bill Shock

Residents near data centers worry about their electricity bills rising. Christophe learns from Ari Peskoe at Harvard Law School and Michael Thomas of Cleanview that data centers are indeed driving increases—but not because they directly consume so much power. The real culprit is the utility business model. Utilities profit by building infrastructure like power lines and poles. When a data center arrives demanding a gigawatt of power, utilities happily build new transmission infrastructure and spread the cost across all customers in their service area, even those nowhere near the data center. This means electricity bills climb for regular people subsidizing infrastructure they didn't request or benefit from. Utilities have no financial incentive to refuse: they literally make money by building stuff. Data centers are now the perfect excuse to construct massive new infrastructure and charge customers for it.
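
The cost-socialization mechanism is easy to sketch with made-up numbers. Everything below is hypothetical for illustration; the video gives no actual project costs or customer counts:

```python
# Hypothetical illustration of how transmission costs spread across all
# ratepayers. All figures are invented for the example, not from the video.

project_cost = 1_500_000_000   # new lines/substations built for one data center
years = 30                     # amortization period for the rate increase
customers = 1_000_000          # everyone in the service area pays, near or far

per_customer_per_year = project_cost / years / customers
print(per_customer_per_year)   # $50.0 added to each customer's annual bill
```

The individual increase looks small, which is exactly why it stays invisible: the data center triggered the build, but a million households quietly finance it.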

The Invisible Redistribution

Christophe concludes that individual AI use is not worth stressing over—text queries are negligible, though image and video generation deserve more thought. But the broader story reveals an unfair system. AI training and operation are becoming a sizable chunk of total energy consumption, and the consequences aren't evenly distributed. They land hardest on communities hosting data centers, not because those facilities demand extraordinary electricity, but because utilities use them as justification to build expensive new infrastructure and pass costs to ordinary people through higher bills. Those costs may start small and remain invisible to most users, but the technology is radically reshaping infrastructure around us. Someone is already footing the bill, and it's not the tech companies reaping the rewards.

Takeaways

  • Stop worrying about your individual ChatGPT usage—a text query is genuinely negligible in energy and carbon terms. But image and video generation are worth being more conscious about given how aggressively platforms are pushing them.
  • The real problem is systemic, not personal. Data centers are forcing utilities to build expensive new infrastructure that gets paid for by regular people through higher electricity bills, particularly in communities hosting those facilities.
  • Utility companies have a perverse incentive structure—they profit by building infrastructure regardless of actual need. Data centers have become their excuse to expand aggressively and shift costs to consumers.
  • If AI energy is a genuine concern, push for cleaner power sources (nuclear and renewables) and demand transparency about infrastructure costs and who bears them, rather than optimizing your prompt behavior.

Key moments

0:42 · Numbers finally disclosed

In June, OpenAI CEO Sam Altman published a blog post claiming that the average ChatGPT query uses 0.34 watt hours of energy. And Google released a paper estimating that the median text query on Gemini consumes 0.24 watt hours of energy.

1:30 · What 0.3 watt-hours actually means

The classic 60W light bulb, if you leave it on for 1 hour, consumes 60 watt hours of energy. So a typical chatbot query is roughly the same as running that light bulb for 18 seconds.

3:55 · Individual carbon footprint perspective

My best guess right now is that each one of those is probably going to be around one 150,000th of your daily emissions. You could save so much more CO2 emissions by just like switching out a single burger for a plant-based burger.

6:20 · Data center scale in Louisiana

In Richland Parish, Louisiana, Meta is building a data center big enough to cover this much of the island of Manhattan. It'll consume as much electricity as 2 million households.

9:40 · How utilities profit from data centers

The utility says, "Great. I make all my money when I build power lines and then charge it to customers." They suddenly have to build this new power line that wouldn't have needed to be built if not for the data center. When that happens, then someone's bill goes up.
