editor’s note:
hi readers! this is the first edition of “intris”, a weekly look at the most important information i got out of my conversations with ChatGPT this week. some of my “interest” in energy infrastructure came from watching Sam Altman on the B2G podcast and the heat he took on twitter for snapping back at people who don’t understand how OpenAI will pay off its data center commitments. i also constantly wonder what the future of OpenAI will look like, but first i needed to understand some of the outside factors shaping data centers: what the constraints are to building them, why we can’t just use nuclear (sorry if you think that’s a dumb question!), and how this buildout will affect the everyday American.
1. Data centers are gonna cost YOU
It all started with one question that kept showing up in discussions around AI: if data centers keep scaling at this rate, what happens to local energy prices? Not in theory but in actual places where these facilities cluster.
A few things became clear very quickly. Data centers pull enormous, round-the-clock loads that force utilities to build more generation and transmission capacity. Those system upgrades raise costs for everyone, not just the companies requesting the power. In some regions this effect is already visible.
Bloomberg’s reporting shows that wholesale electricity prices in zones with heavy data center growth have climbed as much as 267 percent over five years.
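To put that figure in perspective, a 267 percent climb over five years implies a steep compound annual growth rate. A quick back-of-envelope sketch, assuming “267 percent” means the price grew *by* 267 percent (i.e. ended at 3.67x its starting level):

```python
# Back-of-envelope: annual growth rate implied by a 267% rise over 5 years.
# Assumption: "climbed 267 percent" means final price = 1 + 2.67 = 3.67x start.
growth_factor = 1 + 2.67
years = 5
cagr = growth_factor ** (1 / years) - 1  # compound annual growth rate
print(f"Implied annual growth: {cagr:.1%}")  # roughly 30% per year
```

Even spread over five years, that is an annual increase no household budget absorbs quietly.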
The takeaway is simple. When a data center arrives, a regular household’s energy bill is more than likely going to be affected, techno-optimist or not. The logic is straightforward: the more AI is offered to you, the more you will engage with it, and the larger a stake, direct or indirect, you end up holding in this buildout.
2. The Real Constraint Is Not Land. It Is Power.
After understanding the cost effects, I wondered a lot about how this could affect the environment (which could be a whole blog post in itself). I also thought, if data centers keep being announced at this pace, what is the actual constraint? Is it space? Is it water? Is it real estate? Why are all of these AI company CEOs being dragged through the mud by the press over these buildouts?
Maybe all of these could be true, but the most pressing constraint is power.
Data centers are being built faster than utilities can expand generation and transmission. Entire facilities are sitting dark because the grid cannot energize them yet.
For example, in Santa Clara, it is being reported that almost 100 MW of data center capacity is built and ready but cannot be turned on until the utility completes significant upgrades.
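To make 100 MW concrete, here is a rough household-equivalent comparison. The average-household figure is my own assumption (roughly 10,700 kWh per year, the commonly cited US average), not part of the Santa Clara reporting:

```python
# Back-of-envelope: how many average US homes does 100 MW of load represent?
# Assumption: an average US household uses ~10,700 kWh/year, which works out
# to a continuous average load of about 1.2 kW per home.
data_center_mw = 100
household_kwh_per_year = 10_700
hours_per_year = 8_760
household_avg_kw = household_kwh_per_year / hours_per_year  # ~1.22 kW
homes_equivalent = data_center_mw * 1_000 / household_avg_kw
print(f"~{homes_equivalent:,.0f} homes of average load")  # ~82,000 homes
```

In other words, the idle capacity in one city is comparable to the average draw of a mid-sized metro’s worth of households, and it is round-the-clock load, not the peaky pattern homes produce.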
This is not an isolated event. The pressure appears across the entire energy system. Queue backlogs are growing nationwide, which means thousands of new energy projects are waiting for permission to connect to the grid. Many of these projects sit in line for years while utilities study whether the grid can handle the additional load. Transmission lines also take years to plan and build. Natural gas plants face strict siting limits and community approvals. Renewables need significant storage to deliver steady, round-the-clock power. Each new data center adds even more demand, pushing the grid to solve these challenges at a pace it was never built to handle.
So what could a solution be…nuclear???
3. Can’t Nuclear solve it all?
This is the question I keep seeing on X. If nuclear is clean, reliable, and constant, why is it not simply the answer to all of these constraints?
The short version is this: Nuclear is viable for data centers, but scaling it is slow, expensive, and heavily regulated. The United States has very few new reactors scheduled to come online. Small modular reactors are promising but still in early stages, and the timeline to build large nuclear assets does not match the timeline to build data centers.
For nuclear to truly power the next generation of compute, here are a few factors that would need to fall into place at the same time:
Licensing for new nuclear reactors would need to become faster and more predictable, since current approval processes take many years and create uncertainty for any company trying to build new nuclear capacity.
Designs for reactors would need to be standardized so plants can be built in a modular, repeatable way rather than as one-off projects that take a decade or more to complete.
Federal financing structures would need to be clear enough to support the large upfront cost of nuclear construction and reduce the financial risk for utilities and private developers.
Data center operators and nuclear developers would need long term power purchase agreements that guarantee stable revenue for reactors and provide data centers with reliable, firm power.
Transmission planning would need to move in parallel with reactor development so nuclear power can reach data center clusters without running into grid bottlenecks.
In other words, nuclear is not the switch I thought it was. It is a system. And that system is not yet aligned with the pace of AI infrastructure growth (as far as we know publicly). But it remains a promising long-term solution, and the real question is whether the United States can execute on the timeline required to win the AI race.

