In 1997, world chess champion Garry Kasparov sat across from IBM's Deep Blue supercomputer in one of history's most famous matches. Kasparov's brain ran on roughly 20 watts of power, about the energy you'd get from eating a single hot dog. Deep Blue, meanwhile, consumed enough electricity to equal 250,000 hot dogs' worth of energy. For the exact same game.
This staggering inefficiency remains at the heart of today's AI infrastructure challenges. It's why your association's AI tools crash during critical moments, why your overall AI spending keeps growing despite falling per-token prices, and why tech companies are seriously considering building nuclear reactors just to keep pace with demand.
Mark Hersam, Chair of Materials Science and Engineering at Northwestern University, thinks the solution lies in copying nature's design. His lab has already demonstrated computer chips that mimic brain architecture, performing machine learning tasks at roughly one-hundredth the power of traditional computers. His recent TED talk explores how this approach could fundamentally reshape the way associations access and afford AI technology.
By 2027, artificial intelligence is projected to consume roughly 100 terawatt-hours of electricity annually, comparable to the entire electricity consumption of Argentina. Data centers will require an estimated 200 billion gallons of water just for cooling. For associations, this could translate into service interruptions, rate limits, escalating subscription costs, and the frustrating reality that the most powerful AI capabilities remain locked away: existing but inaccessible due to power constraints.
The culprit is something called the Von Neumann bottleneck, a relic from the 1940s when we first designed computer architecture. Traditional computers separate memory storage from processing units. Every piece of data must travel back and forth between these two components, consuming massive amounts of energy in the process.
Think of it like having your membership database stored in one city while your engagement analytics tools sit in another. Every single query requires shuttling information across that distance. Now multiply that by billions of operations per second, and you understand why AI infrastructure is buckling under demand.
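To put rough numbers on that analogy, here is a minimal back-of-the-envelope sketch in Python. The per-operation energy figures and workload sizes are illustrative assumptions (off-chip memory access is commonly estimated to cost on the order of a hundred times more energy than an arithmetic operation), not measurements of any particular chip.

```python
# Back-of-the-envelope: how much of an AI workload's energy goes to moving
# data between memory and processor versus actually computing on it.
# All figures below are illustrative assumptions, not measured values.

PICOJOULE = 1e-12

energy_per_math_op = 1 * PICOJOULE          # one arithmetic operation (assumed)
energy_per_memory_access = 200 * PICOJOULE  # one off-chip data fetch (assumed)

math_ops = 1e12          # an illustrative trillion-operation workload
memory_accesses = 1e12   # worst case: every operand shuttles to and from memory

compute_energy = math_ops * energy_per_math_op
movement_energy = memory_accesses * energy_per_memory_access
total = compute_energy + movement_energy

print(f"Energy spent computing:   {compute_energy:.1f} J")
print(f"Energy spent moving data: {movement_energy:.1f} J")
print(f"Share lost to the memory-processor shuttle: {movement_energy / total:.1%}")
```

Under those assumed numbers, more than 99 percent of the energy goes to shuttling data rather than to the arithmetic itself, and that overhead is exactly what brain-inspired designs try to eliminate.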
While this shouldn't necessarily stop you from working with AI vendors today, understanding these constraints helps explain the growing pains and positions you to make informed decisions as the technology evolves.
Your brain doesn't have a Von Neumann bottleneck. Instead of separating memory and processing, your neurons handle both functions in the same place. No shuttling data across vast distances. No wasted energy on transportation. Everything happens locally, dynamically, and efficiently.
This biological architecture allows your brain to perform pattern recognition, creative problem-solving, and complex reasoning while consuming less power than your desk lamp. When neurons need to strengthen a connection, they do it on the spot. When they need to access a memory, it's already there.
The human brain achieves this through approximately 86 billion neurons, each connected to thousands of others through synapses that can strengthen, weaken, or reconfigure based on need. The design is adaptively efficient, constantly optimizing its own architecture for the tasks at hand.
Evolution spent millions of years perfecting this design. We spent 80 years building computers that work completely differently.
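Before turning to the silicon version of this idea, a loose software analogy helps show what "memory and processing in the same place" means. The sketch below is not a model of real neurons; the sizes, learning rate, and update rule are all illustrative assumptions. The point is simply that each connection stores its own strength and updates it on the spot, with no round trip to a separate memory bank.

```python
import random

class Synapse:
    """Toy synapse: it stores its own connection strength (the memory) and
    updates that strength in place (the processing), loosely echoing how
    biological synapses strengthen or weaken locally."""

    def __init__(self):
        self.weight = random.uniform(0.0, 1.0)   # the stored "memory"

    def transmit(self, signal: float) -> float:
        return signal * self.weight              # the local "processing"

    def adapt(self, pre: float, post: float, rate: float = 0.01) -> None:
        # Hebbian-style rule: connections whose ends are active together get
        # stronger. The update happens right where the weight lives.
        self.weight = min(1.0, max(0.0, self.weight + rate * pre * post))

# A tiny network: every unit of memory sits next to its own processing.
synapses = [Synapse() for _ in range(1000)]
signals = [random.random() for _ in range(1000)]
outputs = [s.transmit(x) for s, x in zip(synapses, signals)]
for s, x, y in zip(synapses, signals, outputs):
    s.adapt(pre=x, post=y)
```

Running on a conventional computer, this code still pays the Von Neumann tax under the hood; the analogy only describes the organizing principle that neuromorphic hardware implements physically.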
Scientists at Northwestern and research labs worldwide are developing neuromorphic computing—chips that mimic the brain's architecture by co-locating memory and processing functions.
Hersam's nanoelectronic devices use electric fields to reconfigure themselves, similar to the way synapses adapt in your brain. Instead of fixed circuits shuttling data between distant components, these systems process information where it's stored. The lab demonstrations show machine learning happening at a fraction of traditional power requirements.
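Hersam's devices themselves are beyond a blog-sized sketch, but the general compute-in-memory idea they belong to can be illustrated with a toy crossbar array, a common building block in neuromorphic research. Everything in the snippet below (the array size, the stored values, the plain Python loops) is an illustrative assumption rather than a description of any specific chip: weights sit in a grid as conductances, input voltages drive the rows, and each column's summed current is a multiply-accumulate result computed right where the data is stored.

```python
# Toy model of a crossbar array, the general compute-in-memory pattern used in
# many neuromorphic designs (illustrative only; not Hersam's specific devices).
#
# Weights live in the array as conductances G[i][j]. Applying input voltages
# V[i] to the rows yields column currents I[j] = sum_i V[i] * G[i][j]
# (Ohm's law plus Kirchhoff's current law), which is a matrix-vector multiply
# performed where the weights are stored, with no data shuttled elsewhere.

def crossbar_multiply(conductances, voltages):
    rows, cols = len(conductances), len(conductances[0])
    currents = [0.0] * cols
    for j in range(cols):
        for i in range(rows):
            currents[j] += voltages[i] * conductances[i][j]
    return currents

# A 3x2 grid of stored weights (arbitrary units) and an input signal.
G = [[0.2, 0.9],
     [0.5, 0.1],
     [0.7, 0.4]]
V = [1.0, 0.5, 0.25]

print(crossbar_multiply(G, V))   # one pass computes the whole matrix-vector product
```

In a physical crossbar, all of those multiply-accumulates happen at once in the analog domain, which is where the large energy savings over a conventional fetch-compute-store loop are expected to come from.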
Current AI models that require megawatts of power could potentially run on kilowatts. Tools that today demand massive data centers could one day operate on desktop hardware. The AI capabilities currently reserved for tech giants with billion-dollar infrastructure budgets could become accessible to every association, regardless of size.
Working prototypes exist in laboratories today, demonstrating the core principles at smaller scales. The challenge now is moving from laboratory demonstrations to commercial production—a journey that typically takes 5-10 years but could accelerate given the urgent demand.
While neuromorphic chips won't hit the market tomorrow, understanding this trajectory helps with strategic planning:
Short-term reality (1-2 years): Infrastructure constraints will persist. Build redundancy into your tech stack by working with multiple AI providers. The energy crisis will intensify before improving, meaning occasional outages, rising costs, and limited access to cutting-edge capabilities. This shouldn't discourage AI adoption—just inform your approach.
Medium-term opportunity (3-5 years): As neuromorphic technologies begin commercialization, early adopters will gain advantages through lower costs and better reliability. Start tracking which vendors are investing in efficient AI infrastructure. Consider how dramatically reduced computing costs might change your member services.
Long-term transformation (5-10 years): The efficiency gains from neuromorphic computing will enable new categories of AI applications. Real-time, personalized AI assistants for every member become feasible. Predictive analytics that can process your entire historical dataset in an instant become affordable.
The brain's energy efficiency is one example of biology outperforming traditional computing. Microsoft researchers are exploring synthetic DNA as a storage medium, potentially achieving densities millions of times greater than current hard drives. Amazon optimizes warehouse routing with algorithms inspired by the way ant colonies find food. Neural networks, the foundation of modern AI, were themselves inspired by studying how neurons in the brain connect and fire.
When faced with seemingly impossible technical challenges, there's often value in asking: how does nature solve this? The answers won't always translate directly to your association's operations, but they can spark unexpected insights about efficiency, adaptation, and resilience.
Neuromorphic computing represents a significant shift in how we approach artificial intelligence. When AI becomes 100 times more efficient, it becomes more accessible and affordable for organizations of all sizes.
For now, associations should continue their AI initiatives while staying informed about infrastructure developments. Monitor which vendors are investing in efficiency improvements. Build flexibility into your technology contracts. Understand that today's limitations are temporary, even if the timeline remains uncertain.
The gap between biological and silicon efficiency has persisted since computing began. Hersam and his colleagues are working to close it. Associations that understand this trajectory will be best positioned to leverage these advances when they become commercially available.
Start by auditing your current AI dependencies and identifying which use cases truly require cutting-edge capabilities versus those that can run on efficient alternatives. Build organizational flexibility to adopt new technologies as they emerge. Most importantly, continue experimenting with AI tools today while keeping an eye on tomorrow's possibilities.