Here’s your ONE drop:

OpenAI wants to consume 250 gigawatts of electricity by 2033.

Some analysts say that’s about a fifth of America’s entire electric-generation capacity. Just them. One company.

If the target is achieved, that’s roughly on par with the entire electricity output of a major country like France, South Korea, or Brazil.

You’d need around 250 new nuclear reactors to match that power, and at current cost estimates we’re talking about $12.5 trillion. And for context, about 12.5 GW of natural-gas capacity is expected to come online this year, one of the biggest additions ever. OpenAI's target is twenty times that...
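The back-of-envelope math behind those figures is simple enough to check yourself. A minimal sketch, using the article's own numbers (an assumed ~1 GW per reactor and the per-reactor cost implied by the $12.5 trillion total):

```python
# All inputs are the article's figures, not independent estimates.
target_gw = 250            # OpenAI's stated 2033 target
reactor_gw = 1.0           # assumed output of one large nuclear reactor
reactor_cost_usd = 50e9    # assumed cost per reactor implied by the article

reactors_needed = target_gw / reactor_gw
total_cost_usd = reactors_needed * reactor_cost_usd

gas_additions_gw = 12.5    # natural-gas capacity expected online this year
multiple_of_gas = target_gw / gas_additions_gw

print(reactors_needed)         # 250.0 reactors
print(total_cost_usd / 1e12)   # 12.5 (trillion dollars)
print(multiple_of_gas)         # 20.0 (times this year's gas additions)
```

Change any one assumption (reactor size, cost per reactor) and the totals move, but not by enough to shrink the scale of the ask.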

Solar could theoretically handle it, but recent policy changes in the U.S. removed key tax credits and added new permitting hurdles. So… that’s going well.

Even if the exact numbers shift, the scale doesn’t. Training and running these models takes mind-bending amounts of power, and most of us never think about it. We just ask ChatGPT to write an email, edit a photo, summarize a meeting. Like it runs on air.

It doesn’t.

It runs on grids. On water for cooling. On mining for chips. On subsidies that someone, somewhere, pays for.

The AI boom has felt almost abstract, like it’s happening in a parallel universe of venture capital and tech billionaires.

But energy isn’t free. Someone pays.

// Ann

