
By 2028, AI companies will need twice New York City's peak electricity demand just for training their models. Just for training. Not for the millions of daily users running queries, generating images, or analyzing documents. Just to teach these systems how to think.

But here's the kicker: these projections are probably missing a zero or two.

While everyone debates whether AI will achieve consciousness or steal our jobs, a massive infrastructure crisis is building beneath the surface. Your association's AI costs might be about to explode—and availability might plummet. The culprit isn't technology or regulation. It's something far more basic: we're running out of power.

The Real Numbers Are Staggering

The official projections sound alarming enough. Frontier AI companies will need 2-gigawatt data centers by 2027 and 5-gigawatt facilities by 2028 to train a single model. Total U.S. frontier AI demand could reach 20-25 gigawatts by 2028.

But these numbers focus on training—the process of teaching AI models. The real explosion comes from inference, which is what happens every time someone uses an AI tool. And inference demand will absolutely dwarf any current estimates.

Here's why the math breaks every projection model: AI creates a value feedback loop. The better these systems get, the more people use them. The more people use them, the more compute power we need. The more compute power we deploy, the better the systems become. It's exponential growth feeding exponential growth.
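The compounding described above can be sketched as a toy model. All of the numbers here (starting demand, growth rates) are illustrative assumptions for the sketch, not projections from this article:

```python
# Toy model of the AI value feedback loop: adoption growth and
# per-user usage growth compound into power demand each year.
# All parameters are illustrative assumptions, not real projections.

def simulate_demand(years: int,
                    base_gw: float = 4.0,        # assumed starting demand
                    adoption_growth: float = 0.5,  # assumed: 50% more users/year
                    usage_growth: float = 0.3):    # assumed: 30% more use per user/year
    """Return yearly power demand (GW) when both factors compound."""
    demand = base_gw
    history = [demand]
    for _ in range(years):
        demand *= (1 + adoption_growth) * (1 + usage_growth)
        history.append(round(demand, 1))
    return history

print(simulate_demand(4))
```

Even with modest-sounding growth rates, multiplying the two factors roughly doubles demand every year, which is why linear extrapolations from today's usage undershoot so badly.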

Think about your own AI usage. A year ago, you might have experimented with ChatGPT occasionally. Now? AI assists with emails, documents, research, images, and analysis. Multiply that behavior change by every knowledge worker, every student, every creative professional. Then add autonomous agents that will run continuously in the background. The infrastructure demands don't just grow—they explode.

We're not talking about adding a few more data centers. We're talking about reimagining the entire power grid.

Geography Suddenly Matters Again

For decades, the tech industry has pretended geography doesn't matter. Your data could live anywhere. Your compute could happen in any cloud region. Distance was dead, or so we thought.

AI infrastructure shatters this illusion. Unlike traditional computing that can be distributed globally, AI training requires massive, concentrated power in specific locations. This constraint is actually an opportunity in disguise.

Data centers can be built in less desirable locations—extreme climates, remote areas, places where land is cheap and populations are sparse. More importantly, we can co-locate power generation with consumption. Build the nuclear reactor next to the data center. Generate the power where you need it.

This approach solves multiple problems. Transmission losses disappear when you don't need transmission. Grid limitations become irrelevant when you're off the grid. Even intermittent power sources like solar and wind become viable when paired with flexible AI workloads that can scale up during peak generation.
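Pairing flexible AI workloads with intermittent generation can be sketched as a simple scheduler that scales deferrable batch jobs with whatever power is available; the capacity figures, reserve, and hourly solar profile below are illustrative assumptions:

```python
# Minimal sketch: scale deferrable AI batch jobs (training runs,
# bulk inference) to track on-site generation, keeping a fixed
# reserve for always-on workloads. Numbers are illustrative.

def schedule_batch_jobs(available_mw: int, job_mw: int = 10,
                        reserve_mw: int = 20) -> int:
    """Return how many batch jobs to run this hour."""
    usable = max(0, available_mw - reserve_mw)
    return usable // job_mw

# Hypothetical hourly solar output (MW): jobs ramp up at midday,
# down to zero overnight.
solar_profile = [0, 0, 5, 40, 120, 180, 200, 170, 90, 30, 0, 0]
plan = [schedule_batch_jobs(mw) for mw in solar_profile]
print(plan)
```

The point of the sketch is the shape of the curve: because training and batch inference can be paused and resumed, the workload follows generation instead of forcing generation to follow the workload.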

The old model of generating power in one place and consuming it hundreds of miles away doesn't work for AI scale. The new model puts generation and consumption in the same location, creating AI-specific power ecosystems.

The Nuclear Option Isn't Optional

The scale of power needed for AI can't be met by solar panels and wind farms alone. While renewable sources will play a role, they can't provide the consistent, massive baseload power that AI infrastructure demands. Coal is environmentally disastrous. Natural gas still produces significant emissions.

New nuclear technology, particularly Small Modular Reactors (SMRs), offers a different path. They're smaller, safer, and can be deployed much faster than traditional nuclear plants. They're manufactured in factories and assembled on-site, reducing construction time from decades to years.

France already generates the majority of its electricity from nuclear power safely and efficiently. They've proven the model works at national scale. Meanwhile, China is racing ahead with new nuclear deployments, adding capacity at a rate that dwarfs U.S. efforts.

The infrastructure math is unforgiving. Meeting AI's power demands while maintaining environmental commitments requires every option on the table, including modern nuclear technology.

The Coming Price Stratification

Current AI pricing already shows massive disparities. Premium models from companies like Anthropic can cost 20 times more than open-source alternatives running on efficient infrastructure. When infrastructure constraints hit, these gaps won't just persist—they'll explode.

We're heading toward a two-tier AI market. Organizations that can afford premium access will get the most capable models with guaranteed availability. Everyone else will compete for leftover capacity, facing rate limits, quality degradation, and service interruptions.

This isn't speculation. We've seen this pattern before in other infrastructure-constrained markets. When demand outstrips supply, prices don't increase linearly—they spike exponentially for guaranteed access while basic service degrades for everyone else.

An Interstate Highway Moment

What we're facing isn't an incremental infrastructure challenge. It's an interstate highway moment—a need for massive, coordinated investment that only comes once in a generation.

The interstate highway system transformed America's economy by enabling efficient transportation of goods and people. The AI infrastructure build-out will be similar in scale and impact. It requires federal investment, state coordination, and local implementation. It needs environmental reviews, land acquisition, and workforce development. Most critically, it needs to happen now.

Every month of delay compounds. While we debate and study, other countries build. While we argue about permits, global competitors lock in advantages that will last decades. The infrastructure we build (or fail to build) in the next five years will determine AI leadership for decades after that.

What Associations Must Do Now

The window for action is narrowing. Here's what associations need to prioritize:

Build hybrid AI strategies today. Don't bet everything on one AI provider or model. Create architectures that can switch between premium and efficient models based on the task. Use expensive models only for high-value work while routing routine tasks to cost-effective alternatives.

Advocate for infrastructure investment. Your members' future depends on policy decisions being made today. Push for expedited permitting for nuclear power. Support transmission corridor development. Advocate for federal lands to be made available for AI infrastructure. 

Educate members about the coming constraints. Most organizations don't understand that AI availability isn't guaranteed. Help members understand why they need to build flexible systems, lock in partnerships, and prepare for a world where AI access might be rationed or priced dynamically.

Invest in efficiency expertise. As AI becomes infrastructure-constrained, the organizations that can do more with less will win. Build expertise in model optimization, efficient prompting, and workload management. These skills will become as valuable as AI expertise itself.
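A hybrid routing strategy like the one described above can be sketched in a few lines. The model names, per-token costs, and the keyword-based task classifier are placeholder assumptions for illustration, not real provider pricing or a recommendation of specific models:

```python
# Sketch of a cost-aware model router: high-value work goes to a
# premium model, routine tasks to an efficient one. Names, costs,
# and the keyword classifier are placeholder assumptions.

PREMIUM = {"name": "premium-model", "cost_per_1k_tokens": 0.015}
EFFICIENT = {"name": "efficient-model", "cost_per_1k_tokens": 0.0005}

# Crude stand-in for a real task classifier.
HIGH_VALUE_SIGNALS = ("legal", "contract", "strategy", "analysis")

def route_task(prompt: str) -> dict:
    """Pick a model for this prompt based on estimated task value."""
    text = prompt.lower()
    if any(signal in text for signal in HIGH_VALUE_SIGNALS):
        return PREMIUM
    return EFFICIENT

print(route_task("Draft a thank-you email")["name"])
print(route_task("Review this contract for risk clauses")["name"])
```

A production router would replace the keyword check with a proper classifier and add fallbacks for rate limits and outages, but even this skeleton captures the core discipline: spend premium capacity only where it earns its price.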

The Future Is Physics, Not Philosophy

The AI revolution depends as much on infrastructure as innovation. Power generation, transmission capacity, and cooling systems will shape AI availability and pricing in ways most organizations haven't considered.

The numbers are clear: current infrastructure projections are likely too conservative by orders of magnitude. Inference demands will dwarf training requirements. Geographic constraints will reshape how we think about data center placement. And the organizations that prepare for these realities now will have significant advantages.

For associations, understanding these infrastructure challenges is essential for guiding members through the AI transition. By building flexible AI strategies, advocating for necessary infrastructure investments, and educating members about coming constraints, associations can help ensure their communities have the access they need to remain competitive.

The infrastructure buildout required for AI represents one of the largest coordinated efforts in modern history. The decisions made in the next few years will determine AI accessibility for decades to come. Associations that recognize this shift and prepare accordingly will be best positioned to serve their members in an AI-powered future.

Post by Mallory Mejias
August 6, 2025
Mallory Mejias is passionate about creating opportunities for association professionals to learn, grow, and better serve their members using artificial intelligence. She enjoys blending creativity and innovation to produce fresh, meaningful content for the association space. Mallory co-hosts and produces the Sidecar Sync podcast, where she delves into the latest trends in AI and technology, translating them into actionable insights.