The organizations Conor Grennan has trained on generative AI are undeniably impressive: OpenAI, McKinsey, Google, JP Morgan, NASA. But perhaps more impressive than teaching Fortune 500 executives is what happened when he brought his 16-year-old son Finn to teach AI workshops in Nepal. The trip proved that from Silicon Valley to the Himalayas, our collective response to generative AI (the excitement, the fear, the questions) is strikingly universal.
We recently interviewed Conor and Finn on the Sidecar Sync podcast about their Nepal adventure, and their insights offer powerful lessons for any organization grappling with AI education.
An Unlikely Journey
On a plane bound for Nepal, Conor and Finn Grennan weren't sure what to expect. They'd prepared their presentations, discussed their approach, but teaching AI in another country? Neither knew how it would land.
For Conor, Nepal held deep personal significance. Years before becoming Chief AI Architect at NYU Stern School of Business, he'd lived there, founding an organization that rescued trafficked children and reunited them with their families. It was there he met his wife, an American volunteer. As Finn put it during our conversation, he exists because of Nepal.
Now he was returning with Finn to teach artificial intelligence to educators, NGO leaders, and students. Each brought something that made him uniquely suited to the moment: Conor, a non-technical perspective honed since he dove into AI in 2023; Finn, firsthand knowledge of the barriers people face when approaching this technology, since his high school had initially banned LLMs like ChatGPT entirely.
When Students Become Teachers
Their workshop structure was deliberate. Conor would start, explaining why our brains struggle to process AI as anything more than a sophisticated search engine. He'd lay the groundwork for thinking differently about this technology. Then, Finn would take over.
That's when the room's energy shifted.
Here was a 16-year-old American teenager, standing before a packed room of teachers, NGO leaders, and students who had given up their one day off to attend. He talked about how the education system—teacher teaches, student learns, student gets tested—hadn't changed in hundreds or even thousands of years. He explained how AI disrupted this ancient cycle because suddenly, you couldn't tell if you were testing the student or the AI.
It's one thing to hear about transformative technology from an established professional like Conor. But hearing it from a 16-year-old who faces the same homework pressures and school restrictions as their own children? The audience was listening, recognizing that if this teenager could master AI, perhaps they could too.
The Universal Question
As Finn shared specific use cases—how teachers could use AI for lesson planning, how students could use it for language practice—hands shot up around the room. But one question came up more than any other, the same question we've all heard in discussions about AI and education:
"Will this replace critical thinking?"
Finn understood their fear intimately. He'd lived it. When his school banned generative AI, he explained, the technology became like the trees on a ski slope: an obstacle to avoid rather than a tool to embrace. When you're skiing through a forest, you focus on the path, not the trees. By banning AI entirely, schools had turned it into a tree, something students crashed into at 11 PM when papers were due, using it to cheat rather than learn.
The room nodded collectively. Whether in Nepal or New York, the pattern was identical: ban the technology, and students find it anyway, but only in moments of desperation rather than exploration.
Finn then shared one of his early AI experiences that captured the technology's potential. At 14, he had an assignment to write a journal entry from the perspective of a German immigrant arriving in America in the 1850s. Instead of just writing about it, he asked ChatGPT to become that immigrant. Suddenly, he was having a conversation with "Heinrich," who described the smell of sausages from the dock, the sight of horses pulling carriages, the mix of hope and fear in his heart. The AI transformed a routine homework assignment into an immersive historical experience. This wasn't cheating—it was learning through dialogue, bringing history to life in ways a textbook never could.
The Power of the Unexpected Messenger
As the workshops progressed, Conor noticed something profound. Having Finn as his co-presenter created a dynamic he hadn't anticipated.
When NGO leaders saw this teenager confidently navigating AI, when teachers watched him demonstrate practical classroom applications, when students saw someone their age teaching adults, it shattered preconceptions about who can learn and teach this technology. The generational divide that often defines technology adoption simply evaporated.
"If Finn can figure this out at this level of his experience, truly anyone can do it," Conor observed during our interview. The message wasn't about intelligence or technical prowess—it was about willingness to explore.
Fears Without Borders
What struck both Conor and Finn most deeply was the consistency of concerns across cultures. The fear levels, Finn noted, were identical in Nepal and the United States. Despite vast cultural differences, despite being on opposite sides of the world, educators everywhere worried about the same things: Would students stop thinking? Would writing skills atrophy? Would creativity die?
These weren't Nepali fears or American fears. They were human fears about what happens when a technology that seems like magic enters the classroom.
Conor had trained some of the world's most sophisticated companies, but here in Nepal, he was reminded of a fundamental truth: AI isn't really about the technology. It's about people. And people everywhere respond to massive change with the same mix of excitement and trepidation.
The Nepali educators weren't just politely listening either. They pushed back, asked hard questions, and engaged intellectually with the challenges AI presented. Conor noted they were "intellectually honest" about their concerns, asking why they should even allow this technology that might undermine the critical thinking skills they worked so hard to develop in students.
Starting With Need, Not Capability
Throughout their presentations, Conor emphasized an approach that resonated across cultures: don't start with what AI can do. Start with what you need to do.
This shift in perspective transformed how the audience thought about the technology. Instead of seeing AI as an overwhelming force with infinite capabilities, they began to see it as a tool that could address their specific challenges. A teacher struggling with differentiated instruction suddenly saw possibilities. An NGO leader drowning in grant applications glimpsed efficiency. A student practicing a new language found a patient tutor available 24/7.
The same approach that worked in Fortune 500 boardrooms worked in Nepali classrooms because human needs—to save time, to be more effective, to serve others better—are universal.
As Conor explained during our interview, when organizations ask him how AI applies to their specific industry—whether pharmaceuticals, finance, or education—his answer is always the same: "It applies the exact same way to everyone." It's not a specialized tool for different industries; it's a second brain for anyone who uses it. The key is starting with your actual work, your real challenges, not with the technology's theoretical capabilities.
Lessons for Associations
The Nepal workshops revealed several powerful lessons for associations bringing AI education to their members:
- The messenger matters as much as the message. Finn's age and experience made him relatable in ways Conor couldn't be. Associations should consider who delivers their AI training—perhaps your most effective teachers are members who've recently mastered the technology.
- Address fears with honesty, not dismissal. The concerns in Nepal were identical to those in the US because they're human concerns. Acknowledge that yes, AI is disruptive. Yes, it changes how we work. But banning or ignoring it only drives usage underground.
- Start with problems, not possibilities. Instead of showcasing everything AI can do, help members identify their daily challenges first. The technology becomes less overwhelming when it's solving real problems.
- Create space for intellectual honesty. The best discussions happened when participants felt safe to voice their real concerns. Don't rush to defend AI—engage with the legitimate worries your members have.
- Mix generations intentionally. The most powerful moments came when traditional hierarchies dissolved—when students taught teachers, when teenagers instructed adults. This generational mixing can accelerate adoption.
The Path Forward
As Conor reflected in his post about the experience, he's "the opposite of naive when it comes to change in the developing world." He knows that their workshops were "a drop in the bucket." But he also knows that change has to start somewhere, and that this technology can bring good to the world. A filled bucket, after all, begins with a few drops.
The workshops in Nepal revealed something profound: in an age where technology often divides us, our response to AI is surprisingly universal. The questions a teacher asks in Kathmandu echo in Connecticut classrooms. The fears a nonprofit leader voices in Nepal mirror those in Manhattan boardrooms.
For associations, this universality is actually good news. The challenges your members face aren't unique to your industry or geography. The solutions that work elsewhere can work for you. And sometimes, your most powerful AI advocates might be the ones you least expect—perhaps the newest members who see AI not as a threat to their expertise but as a tool to amplify their impact.
As Conor and Finn discovered halfway around the world, the best way to teach AI isn't to focus on the technology at all. It's to focus on the very human need to learn, grow, and adapt—needs that transcend every border, every culture, and every generation.

July 21, 2025