OpenAI just committed over a trillion dollars to AI infrastructure. AMD will supply six gigawatts of GPUs. Nvidia signed for ten gigawatts. Oracle's providing the cloud capacity to run it all.

These numbers are so big they're almost meaningless. A trillion dollars? Six gigawatts? What does that even look like?

Here's what they mean for your association: AI is about to get substantially cheaper. Maybe even free for many of the things you need to do. And that changes everything about how you should think about deploying it.

The Scale of What Just Happened

OpenAI orchestrated the largest AI infrastructure agreements in history through interconnected deals with AMD, Nvidia, and Oracle. The breakdown:

  • AMD: Will supply six gigawatts of GPUs starting late 2026, with OpenAI potentially taking 10% ownership through warrants
  • Nvidia: Signed to provide ten gigawatts via a $100 billion investment
  • Oracle: Secured a $300 billion, five-year contract to provide cloud infrastructure, funded in part by Nvidia's investment

The total: 16 gigawatts of compute capacity. That's enough electrical demand to power multiple major cities. And this isn't just OpenAI. The entire industry is building at this scale. Google, Microsoft, Amazon—everyone with AI ambitions is pouring resources into infrastructure.

The Circular Money Problem

The financial structure of these deals is worth understanding, even if you're not an investor. The money flows in a circle: Nvidia invests in OpenAI, which buys Nvidia chips and leases Oracle cloud capacity, which Oracle builds using Nvidia's funding. The same companies act as investors, suppliers, and customers simultaneously.

This kind of arrangement has precedent. Think back to the telecom buildout before the dot-com bust, when companies like Cisco and Nortel had similar circular dealings. Many investors lost their shirts. Companies failed. Consolidation happened.

Here's the thing though: the technology didn't go away. The infrastructure that got built during that bubble powered the internet boom that followed. The same will happen here. Many investors will lose money. Some companies won't survive. But the infrastructure being built right now will support AI applications for decades.

For associations trying to figure out AI strategy, the bubble dynamics don't really matter. What matters is the infrastructure being built and what it means for costs.

What the Supply Explosion Actually Means

Basic economics: when supply increases dramatically, prices drop. That's what's happening here.

But demand is also increasing fast. Really fast. That's why these companies are investing so heavily—they see the demand coming. So will prices actually drop if demand keeps pace with supply?

Yes. Because competition is heating up. AMD is challenging Nvidia's dominance in AI chips. New inference-specific chip architectures are emerging from companies like Groq and Cerebras. Google has advanced TPU designs. More players entering the market means competitive pressure on pricing, regardless of demand levels.

Inference costs—the actual cost of running AI models to do work—are heading toward zero. Not literally zero, but close enough that cost shouldn't be a primary consideration in your decision-making.

The Shrinking Model Revolution

There's another factor making AI cheaper that doesn't get enough attention: models are getting smaller, faster, and therefore cheaper to run.

Here's a pattern that's held true for the past couple years: today's small models are roughly as capable as last year's frontier models. Right now, the best open-source models perform somewhere around GPT-4o level, maybe better in some specific areas. GPT-4o was state-of-the-art 12 months ago.

You can run these open-source models on your own computer. Or you can use them through inference providers at a fraction of the cost of frontier models.

The gap between cutting-edge proprietary models and strong open-source models? Less than 12 months, and shrinking to something more like six to nine months. That gap used to be years.

For most association workloads, you don't need GPT-5 Pro or whatever comes next. You need something that can:

  • Analyze member feedback
  • Generate content recommendations
  • Answer basic member inquiries

Mid-sized models handle all of that just fine. And they cost pennies per task. Sometimes fractions of pennies.
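The "pennies per task" claim is easy to sanity-check with back-of-envelope arithmetic. The sketch below uses illustrative per-million-token prices (an assumption; check your provider's actual rates) for a mid-sized open-weight model:

```python
# Back-of-envelope inference cost for one routine task.
# The two prices below are illustrative assumptions, not quotes
# from any specific provider.
PRICE_PER_MILLION_INPUT = 0.15   # dollars per million input tokens
PRICE_PER_MILLION_OUTPUT = 0.60  # dollars per million output tokens

def task_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one task at the assumed per-million-token rates."""
    return (input_tokens * PRICE_PER_MILLION_INPUT
            + output_tokens * PRICE_PER_MILLION_OUTPUT) / 1_000_000

# Summarizing a 2,000-token document into a 300-token summary:
cost = task_cost(2_000, 300)  # roughly $0.0005, a twentieth of a cent
```

Even if your provider charges several times these rates, a routine summarization or classification task lands well under a cent.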

What This Means for Your Budget

Stop thinking about AI as expensive.

The cost of AI inference is eroding toward zero over the coming quarters and years. The massive infrastructure buildout happening right now will make it effectively free for the workloads that matter to associations.

This doesn't mean you won't have costs. You'll pay for platforms, for integration work, for the time your team spends implementing and managing AI systems. But the actual compute cost—the cost of running the AI models themselves—isn't where you should be focusing your concern.

Don't let cost worries prevent you from exploring AI applications. The constraint isn't budget. It's imagination and willingness to try.

Where to Focus Instead

If cost isn't the constraint, what is?

Understanding what problems you're solving. AI is a tool. You need to know what you're trying to build or improve before you deploy it. Don't start with "we need to use AI." Start with "we need to handle member inquiries faster" or "we need to personalize our content" or "we need to reduce administrative burden on staff."

Getting started and putting in the reps. The people who've been working with AI tools for one to two years now have something valuable: momentum. They started with simple tasks. They automated basic workflows. They built literacy around how these systems work. That experience compounds. One small automation leads to ideas for more. You start seeing opportunities everywhere.

Building organizational literacy. AI isn't something one person should handle. It's not an IT project. It needs to be part of how your whole organization thinks about work. That requires time, experimentation, and willingness to learn as you go.

The Incremental Progress That Matters

Not everything has to be a moonshot project. In fact, the most valuable AI work happening in associations right now isn't the big transformational initiatives. It's the incremental improvements.

The unsung heroes are people saving 5%, 10%, 20% on routine tasks. This matters for two reasons. First, the time saved adds up quickly. Second, and more importantly, it seeds the idea machine. When you successfully automate one thing, you start noticing other things that could be automated. You start thinking differently about your work. That creativity compounds over time into real organizational change.

This is worth recognizing and celebrating. When you talk about AI initiatives in your association, don't just focus on the big deployments. Acknowledge the people making small improvements that accumulate into significant impact.

Practical Applications for Associations

You don't need enterprise-scale deployments to get real value. Here are things you can do right now with AI models that cost almost nothing to run:

Member inquiry handling: Route questions to the right department, draft initial responses, flag urgent issues for human attention. A mid-sized model can handle the bulk of straightforward inquiries, freeing your team for complex cases.

Content work: Summarize long documents, extract key points from member feedback, generate drafts of routine communications, repurpose existing content for different channels.

Data analysis: Pull insights from member surveys, identify trends in engagement data, flag unusual patterns that need attention.

Research synthesis: Combine information from multiple sources into coherent summaries, compare different approaches or solutions, identify gaps in your current knowledge.

These applications don't require frontier models. They don't need custom training. They work with off-the-shelf AI that costs almost nothing to run. The barrier to entry is simply deciding to start.
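To make the inquiry-routing idea concrete, here is a minimal sketch of the glue code around a model call. The department names and the reply format are illustrative assumptions, and the actual model call is represented by a hard-coded reply; in practice you would send the prompt to whatever inference provider or local runtime you use:

```python
# Sketch of routing member inquiries with a mid-sized model.
# DEPARTMENTS and the 'department|urgency' reply format are
# illustrative assumptions; the model call itself is stubbed out.

DEPARTMENTS = ["membership", "events", "billing", "education"]

def build_routing_prompt(inquiry: str) -> str:
    """Ask the model to pick one department and flag urgency."""
    return (
        "Classify this member inquiry into one of: "
        + ", ".join(DEPARTMENTS)
        + ". Reply exactly as '<department>|urgent' or '<department>|normal'.\n\n"
        + f"Inquiry: {inquiry}"
    )

def parse_routing_reply(reply: str) -> tuple[str, bool]:
    """Parse the model's 'department|urgency' reply, defaulting safely."""
    dept, _, urgency = reply.strip().lower().partition("|")
    if dept not in DEPARTMENTS:
        dept = "membership"  # safe default: a human re-routes later
    return dept, urgency == "urgent"

# A hard-coded reply stands in for a real model call here:
dept, urgent = parse_routing_reply("billing|urgent")
```

The point of the sketch is the shape of the work: the model does the fuzzy classification, and a few lines of ordinary code validate its answer and fall back safely when it says something unexpected.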

Start Somewhere

A trillion dollars in infrastructure investment sounds like it's about big tech companies jockeying for position. And it is. But it's also about creating abundance where scarcity existed.

The massive buildout happening right now will make AI inference essentially free for the workloads that matter to associations. The barrier to entry isn't cost anymore. It's recognizing the opportunity and committing to getting started.

The associations that understand this will move fast. They'll experiment. They'll fail sometimes, learn from it, and try again. They'll build organizational muscle around AI that becomes a genuine competitive advantage.

The ones waiting for AI to get cheaper are missing the point. It's cheap enough right now. The infrastructure to support you is being built at unprecedented scale, making the future you're waiting for available today.

Pick one workflow. One member touchpoint. One operational inefficiency. Deploy AI there. Learn from it. Build momentum. The cost won't be what stops you—if anything stops you, it'll be hesitation. And hesitation has its own cost that compounds over time.

Post by Mallory Mejias
October 15, 2025
Mallory Mejias is passionate about creating opportunities for association professionals to learn, grow, and better serve their members using artificial intelligence. She enjoys blending creativity and innovation to produce fresh, meaningful content for the association space. Mallory co-hosts and produces the Sidecar Sync podcast, where she delves into the latest trends in AI and technology, translating them into actionable insights.