Summary:
In this heartwarming and enlightening episode, co-hosts Amith Nagarajan and Mallory Mejias welcome the dynamic father-son duo, Conor and Finn Grennan, to discuss their journey teaching generative AI in Nepal. Conor, Chief AI Architect at NYU Stern, and his 16-year-old son Finn share stories of cultural resonance, the universal challenges of AI adoption in education, and how peer-to-peer learning can drive real change. From Kathmandu classrooms to critical thinking conundrums, this conversation explores why the messenger matters just as much as the message, and why students everywhere might be the future's best AI evangelists.
Conor Grennan is Chief AI Architect at NYU Stern School of Business and trains large companies on how to drive AI adoption. He's worked with organizations like NASA, McKinsey, PwC, Google, Amazon, and many more. Finn Grennan just finished his sophomore year in high school, where he has been asked by the administration of his school to help set AI strategy.
Timestamps:
00:00 - Introduction
Thank you to our sponsor
Find out more about digitalNow 2025 and register now:
https://digitalnow.sidecar.ai/
Join the AI Mastermind:
https://sidecar.ai/association-ai-mas...
Check out Sidecar's AI Learning Hub and get your Association AI Professional (AAiP) certification:
Download "Ascend 2nd Edition: Unlocking the Power of AI for Associations" for FREE
AI Tools and Resources Mentioned in This Episode:
ChatGPT โ https://chat.openai.com
Claude โ https://www.anthropic.com
AI Mindset โ https://www.ai-mindset.ai/
Conor's Post About their Trip to Nepal โ https://shorturl.at/W3GwR
https://www.linkedin.com/company/sidecar-global
https://twitter.com/sidecarglobal
https://www.youtube.com/@SidecarSync
Other Resources from Sidecar:
More about Your Hosts:
Amith Nagarajan is the Chairman of Blue Cypress, https://BlueCypress.io, a family of purpose-driven companies and proud practitioners of Conscious Capitalism. The Blue Cypress companies focus on helping associations, non-profits, and other purpose-driven organizations achieve long-term success. Amith is also an active early-stage investor in B2B SaaS companies. He's had the good fortune of nearly three decades of success as an entrepreneur and enjoys helping others in their journey.
Follow Amith on LinkedIn:
https://linkedin.com/amithnagarajan
Mallory Mejias is passionate about creating opportunities for association professionals to learn, grow, and better serve their members using artificial intelligence. She enjoys blending creativity and innovation to produce fresh, meaningful content for the association space.
Follow Mallory on LinkedIn:
https://linkedin.com/mallorymejias
Please note this transcript was generated using (you guessed it) AI, so please excuse any errors.
[00:00:00] Amith: Welcome to the Sidecar Sync Podcast, your home for all things innovation, artificial intelligence, and associations.
[00:00:14] Mallory: Hello everyone and welcome to the Sidecar Sync Podcast. We're so happy to have you here with us. My name is Mallory Mejias, and I'm one of your hosts along with Amith Nagarajan, and we have a real treat of an interview lined up for you today. We are interviewing a father and son for the first time on the Sidecar Sync Podcast, Conor and Finn Grennan.
[00:00:38] Conor Grennan is the Chief AI Architect at NYU Stern School of Business, and he trains large companies on how to drive AI adoption. He's worked with organizations like NASA, McKinsey, PwC, Google, Amazon, and many more. And his son Finn Grennan just finished his sophomore year in high school [00:01:00] where he has been asked by the administration of his school to help set up AI strategy.
[00:01:05] Conor and Finn recently traveled to Nepal where they led gen AI workshops for education NGOs and local schools. Finn, 16 years old at that time, was teaching these generative AI workshops. In this interview that you're about to hear, they talk about their experience in Nepal and share their insights from these workshops.
[00:01:26] What might surprise you, it certainly surprised me, is that in a country across the world with a culture that's quite different from that of Conor and Finn's on the East Coast of the United States, the excitement, curiosity, and the concerns around generative AI seem to be quite similar. Another key takeaway from this conversation is that oftentimes the messenger matters.
[00:01:48] Learning about generative AI from someone you consider a peer, either in age or profession or industry, is quite powerful. We have a great interview lined up for you. [00:02:00] It's a feel-good one. I felt like I was smiling throughout the whole conversation, so please, please enjoy this interview with Conor and Finn Grennan.
[00:02:09] Conor and Finn, thank you so much for joining us on the Sidecar Sync Podcast. I was just saying before we started recording how excited we are for this episode. We've never had someone, Finn, as young as you on the podcast, and we've also never had a father and son duo. I also don't think we've ever interviewed two people at once.
[00:02:26] So lots of firsts for us. Yeah. On the Sidecar Sync Podcast, um, we'll
[00:02:31] Conor: find out together how this goes.
[00:02:34] Mallory: I'm hoping first and foremost if you can share a little bit with our audience about your backgrounds and kind of your, each of your journeys with AI. Conor, if you wanna start it off, and then we'll go to Finn.
[00:02:45] Conor: Sure. Yeah. No, thanks for having us. This is really cool. I mean, uh, so Finn and I have been, so Finn is 16. I'm older than, uh, 16, and we have been, you know, doing this together for a while. So it's, it's, we are really excited to come on here and, and share with you guys, especially with your [00:03:00] mission and everything like that.
[00:03:01] Um, so yeah, so for me, I, I don't have a tech background, uh, at all, which, um, I've been at NYU Stern for the last, I don't know, decade or something like that, and I've been in the administration. I've, you know, been, I have an academic background, a writing background, things like that. And a couple of years ago, uh, kind of early 2023 is when, you know, ChatGPT sort of really sprang out and I sort of was like, wow, this is like unbelievable, and thought very quickly this is gonna change everything for our students, who are the MBA, basically the MBA program at Stern and, uh.
[00:03:38] And tried to figure out like, well, how do we use this? So I started talking to companies to figure out how they were using it and found out very quickly that they were not using it. And so ended up going out and trying to teach companies and teach our students and back and forth, and now, uh, my days are really taken up training sort of large organizations on AI adoption.
[00:03:58] And um, [00:04:00] and just to sort of like, as I, as I handed it over to Finn, I'll just kinda give you a little, the anecdote of how Finn and I kind of got involved together. It really started, uh, actually really early 2023, I think, right? Where, um, you know, where I was like, Hey, have you used this? And Finn's like, not really, haven't seen it yet.
[00:04:16] And again, this is very early on and uh, I was like, Hey, so, try it. And I think the first thing he said was like, your school didn't allow it, or something like that early on. And then I was like, Hey, just go give it a whirl, see what happens. And he came back like two hours later, like having kind of quote unquote figured it out, in whatever that means.
[00:04:34] But anyway, so that's kinda how we started this journey together a little bit, but I'll pass over to you, Finn, to, to talk about how you got here.
[00:04:40] Finn: Yeah, absolutely. Um, so I again, kind of first heard about AI in spring of 2023, I believe. Yeah. And when I first heard about it, it sounded like magic.
[00:04:50] I heard about it from some people; they were talking about this like magic thing that could now create poems, essays, do your schoolwork, anything like that. [00:05:00] Then my father, he told me about it and I'm like, okay, this has to be something actually official. It's not some sort of made-up thing people are randomly talking about; this is a real thing.
[00:05:10] So about maybe a week later, my school comes out and is like, absolutely no using AI, don't you dare. Of course, I mean, like, it makes sense. They don't want, 'cause there's so much you can do with it. And so they don't want kids using it, even responsibly. But I expected maybe within the year they would come up with responsible guidelines or something like that where I could start to use AI in a good way.
[00:05:32] Um, my dad started getting into this whole AI business and I kind of kept along, like followed along with that. I saw what he was doing and I used it a couple times, but it wasn't anything serious because of course there was no real place for me to use it. I wasn't allowed to use it for schoolwork, and there wasn't that much else going on at that point where I could be using AI.
[00:05:54] There was a couple creativity projects early on just for like my own fun. I think I was doing a project [00:06:00] about, um, German immigrants. So I asked it to become a German immigrant and just had a whole talk with that and it was really interesting and I could fully kind of start to see the power of what it might be able to do.
[00:06:13] But schools weren't really accepting of that, so I kind of had to stop using it for a while.
[00:06:18] Conor: It was, it was actually really wild, just sort of, and again, it's, it's funny 'cause we're two years into this and schools still have a hard time. NYU still has a hard time. I mean, even a few days ago, I was talking to faculty and they're saying like, so what's our policy?
[00:06:31] And I'm like, it's kind of vague. So every school is, is sort of like that. But um, just to piggyback on what Finn was saying, 'cause it, this was sort of my moment too, of realizing how accessible this was. 'Cause at the, were you 14 at the time? I guess it was, right? Two years ago. And, uh, and it was just funny because I'm like, well let's, and you know, Finn's bright and all that sort of stuff, so I'm like, oh, he'll figure something out.
[00:06:55] But what I did not expect was the creativity that it [00:07:00] launched. Right. And so, and again, that first project that I, I have to piggyback on for a second, because I ended up using a lot of Finn's examples now in my own work. I don't really credit him, uh, because it's better for me to get the credit. But, um, but it was so cool, because like the assignment was, if I recall right, and I do recall, 'cause it was, I used it a lot as an example, was, you know, a kind of a typical, I guess freshman example or whatever, which was, you know, write a
[00:07:26] journal entry from the vantage point of a German immigrant getting off the boat in what, 1850 or something like that? Right. And uh, you know, write what that would be like, which is a typical thing. And then again, what Finn did was he told ChatGPT to become that, and all of a sudden this thing is like, oh, hey, I'm Heinrich whatever.
[00:07:43] And Finn's like, okay, well what do you see? And it was like, oh, well the horses are coming and I could smell the sausages. And Finn's like, well, where are you going right now? And it was, it has been like, that's why I love that word magic. 'Cause it just, it felt like magic, right? It was like unlike anything I've ever seen before.
[00:07:58] Yeah.
[00:07:59] Amith: That's pretty [00:08:00] incredible. As excited as I am about all the AI conversation coming up in, uh, this conversation, I'm really interested, maybe we'll have to do this offline, Conor, in getting some tips from you on parenting, if you can get your teenagers to believe what you say is official, because my kids do not think what I say is official.
[00:08:17] Conor: Alright. Alright. End the interview. That's the interview. That's cut.
[00:08:21] Mallory: I love what you said about AI feeling like magic. I don't know. We're on our, at this point, I don't know what number podcast, this will be probably somewhere in the eighties or nineties. For us, we post one episode a week. I don't know that we've ever really talked about AI feeling magical, but in many ways it does.
[00:08:37] Uh, and even for people who deal with this every day, I'm sure you both have the same experience. You still are surprised. When you try out a new use case, when you try out a new tool and think, wow, like how, how is this possible? So I wanted to, to note on that. And then I also wanted to say or ask you, Finn, you mentioned that your school didn't allow artificial intelligence.
It sounds like maybe [00:09:00] now they do. So do you feel like your school has AI guidelines in place? We often talk about this on the podcast, with the nonprofit market, associations struggling to get those usage guidelines in place, or ones that are really effective. So I'm curious what you've seen at your school.
[00:09:15] Finn: So yeah, it's been an interesting ride. I think, I'm trying to, lemme think. A couple months ago, probably in the beginning of the school year, they kind of let teachers do their own thing. So instead of creating guidelines overall for the entire school, because of course it's gonna vary from class to class.
[00:09:34] You have different sort of projects, different sort of assignments, so they kind of let teachers do their own thing with it. But the funny thing is a lot of teachers also just didn't use it, and it wasn't now. It wasn't because they weren't allowed to use it, it was because they hadn't used it in so long, but they also didn't really know what to use it for.
[00:09:52] They had systems in place right now that already worked and they didn't really want to experiment with it. Um, so the first class I actually used it [00:10:00] in was my French class. Um, and I used it there to talk to a French person in French. And so that was really interesting and I used it for a while, but, I mean, we kind of stopped using it because we started to try and use certain use cases such as like audio.
[00:10:18] Mm-hmm. Talking back and forth with audio. And it didn't work too well, especially during the time it was happening, because, as you know, a 10th grade French class isn't gonna have the best pronunciation. So sometimes AI can't pick up what words we're saying. Yeah, yeah. Um, and so we kind of pushed too far, and when it stopped working, we kind of dropped it altogether as a concept.
[00:10:39] Which, I mean, it's kind of a sad thing because there's so many ways it can be used, but there's also some limits to it. And so of course it should be used for these really amazing ways that it can be used. But for these other things, I mean, you can't let that stop you.
[00:10:56] Mallory: Mm-hmm.
[00:10:57] Finn: And so used it in French class and then [00:11:00] eventually,
[00:11:00] my history class, we were allowed to use it for research, things like that, in certain projects, of course. So they still wanted us to learn research skills and, like, learning how to think critically, how to, like, make sure, um, because this was at a time where AI was hallucinating a bunch, and so they didn't want us to get false information and start believing maybe some crazy conspiracy theories that were on the internet.
[00:11:25] Mm-hmm. Of course it was more regulated than that, but I don't think the teachers really realized that. But then classes like English never used it, never even mentioned it, just because it's hard to use it in a class like that where it's so critical to create your own ideas and think. I think that's mainly the point of a class like English, where you're learning how to generate your ideas and then form 'em on a page without using some, like as I said, magical bot that can do it for you.
That already has all the stuff on all the topics in its mind. [00:12:00] So it's hard to be able to use it in some classes and not be able to use it in others, because it makes it very confusing for some students on where the guidelines actually are. Um, but now I've been working a little bit with my school to create some more, um, general guidelines for the school, just because these teachers need to have some sort of guidelines without just being able to use it like freely, however they want.
[00:12:27] If you let them use it however they want, they most likely won't use it that much. So I've been working with my school in that way. Just talking about like maybe doing some projects for the next year, next school year that require you to use AI at least once in each of these classes so that teachers now get a feel for it if they haven't used it already.
[00:12:49] And then just, I don't know, teaching teachers to make sure that they know how to use it, so when the students have to come to them, you have experts, in a way, right?
[00:12:58] Mallory: Yep. Education is that [00:13:00] first step that we always harp on on this podcast. And I think oftentimes you think guidelines are stifling and that they won't promote innovation, but a lot of times they can make you feel more free to experiment when you have some parameters.
[00:13:12] Yes. So I appreciate you sharing that. Uh, Amith, I'm gonna go to you next. I think you initially saw Conor's post on LinkedIn. Uh, what caught your eye about that, Amith? And, um, yeah, just share what, what was notable about this, this post.
[00:13:29] Amith: I just thought a, a father-son duo helping people in another part of the world to get up to speed on AI was just fundamentally interesting, touched on the heartstrings, and it's also so aligned with, uh, our community of nonprofit leaders.
[00:13:41] So that, all of those things, uh, caused me to reach out. And I think, Conor, I'd been following you on LinkedIn, just seeing your posts for maybe a month or two before that. And I just noticed this one. Uh, so that's how we got in touch, because I thought it would be just a fascinating topic aside from just the general interest.
[00:13:55] I also think that, uh, when you're dealing with a, a different approach or, or a [00:14:00] different audience in this case, um, you can learn things from it. You can learn things from how you're teaching kids in this country, how you're helping, uh, kids in another country, and maybe find ways to apply that, uh, to the work that our friends in the association and not-for-profit community are doing.
[00:14:15] So those are all the things that came to mind and why I reached out to these folks.
[00:14:18] Mallory: Mm-hmm. And that's what I wanna get into next. So, the Post was all about your experience teaching AI to folks in Nepal. So I wanna ask you both, what is your connection to Nepal? What inspired that trip and, and how did it go?
[00:14:32] Conor: Yeah, I mean, so yeah, just, just starting off before we get into sort of what actually happened out there, which was really cool. Uh, so my background is actually, it's, it's, I spent a lot of years in nonprofit, so, you know, first in like a non-governmental organization in, in, uh, Prague and then Brussels, doing conflict resolution work for eight years.
[00:14:51] And then I moved out to Nepal. It's kind of a longer story, but I went out to Nepal and I ended up starting an organization that rescued trafficked children and, uh, reunited them with their families. And that was going on for many, many years. We only, I think it was only last year or two years ago that we ended up rolling that into a local organization after, you know, 17 years.
[00:15:10] So, so we are, you know, I have a lot of experience in nonprofit and we're pretty passionate about nonprofit and uh, you know, and through our church and everything else, we do a lot of stuff like that. So, so for us, I think for Finn and I, you know, when we were thinking about Nepal, Finn, and I had. Uh, done an Everest base camp trip together maybe three years ago.
[00:15:29] Uh, just he and I, and you know, when he was 13, and, uh, it was amazing, right? It was like this phenomenal trip, like 11 days, and Finn got to know a lot of the folks. So I, I lived out there for about a couple of years, so I, I knew the place pretty well. But Finn got to know a lot of the kids, who are now older kids, obviously, uh, that we helped out.
[00:15:48] And it was just like, it's an amazing spot. So even before the AI stuff. It's just a cool country. Right?
[00:15:55] Finn: Yeah. And I also wanna mention just the reason I'm alive and I'm here right [00:16:00] now is because my dad met my mom while he was working in Nepal. Wow. Nepal is why I'm here.
[00:16:06] Conor: That's actually, yeah. So yeah. Quick story on that.
[00:16:08] So Liz is, uh, she's American, but she came out to volunteer in Nepal, and I was always joking that like, you know, I only started working at an orphanage 'cause I thought it would be like the best, you know, pickup line of all time. Right. And like, it just like, seemed like it'd make me look really cool to people.
[00:16:23] And it worked. 'cause then, you know, the love of my life rolled into town and within days I was actually. Thinking, how do I get her to marry me? And I proposed to her very, very quickly. We, we, when we did the dates, we found out that I had only spent, uh, about three weeks in the same room as her before I proposed.
[00:16:40] So I proposed to her very, very quickly. That's Finn's mom. We've been married about 17 years, but yes, so Nepal is also responsible for, for,
[00:16:48] Mallory: So it's full circle, you going back. It's full circle in Nepal. I mean, I love that.
[00:16:52] Conor: Yeah. Yeah.
[00:16:54] Mallory: So what did the workshops look like? How many were there? What topics did you cover?[00:17:00]
[00:17:00] Conor: Yeah. I mean, well, when we thought about how to do it, we really went back and forth, because the truth is we didn't know. Yeah, right. Like I, I, I've done a lot of this kind of work with organizations. I do it for nonprofits too, but I do it a lot with just big companies. So I knew what that looked like.
[00:17:17] I knew what it looked like to teach here, and the way that I taught it was sort of through this, you know, kind of a different way of looking at AI, like through a behavioral shift. It's not about, Hey, this is what it can do. It's like, Hey, here's why our brain has trouble with this, because you actually don't need to know anything.
[00:17:30] Right? It's really, if you just talk to it like a human, you're fine. That's all you actually really need. You don't need to learn anything. Uh, but that's actually, as it turns out, very hard, because your brain has a hard time processing that. It has a hard time looking at this thing and telling itself, oh, this is a person, because it doesn't look like a person.
[00:17:48] So, I knew how that worked and I know it was very, I knew it was very effective in corporate contexts and everything else. And Finn and I had done a couple of things already with school and church [00:18:00] and uh, and we had done things around, I'm like, okay, so what's important for schools and everything? So, so I think we had an idea about how it worked here.
[00:18:08] Do you wanna talk a little bit about like, how we prepared for it? Because when we, again, when we didn't know, how did we even go in, kind of, and what I'm trying to remember, like what our first thoughts were going in. Yeah, that's a
[00:18:17] Finn: great point. Um, preparing. Let me think. I mean, so I think how our presentations there kind of worked were, my dad would start off talking about what AI was.
[00:18:29] Yeah. Kind of how our brains thought about it and maybe why so many people weren't using it. And then about halfway of the presentation, it would transfer over to me and I would talk more about specific education contexts. Yeah. Um, so I would talk about how the education system has worked in the past and why it's actually not built for integrating something like AI the way it is now.
[00:18:50] Yeah. Um, and that was primarily just because, as you guys probably know, the education system hasn't really changed in hundreds and hundreds of years. It's always this [00:19:00] three step cycle, which I brought up there, which was the teacher teaches something. The student learns it through their classwork, homework, whatever they do, and then the student is tested to see how well they know it.
[00:19:11] Um, and it's always pretty much remained the same. And AI is a huge disruptor to the cycle because now even if the students maybe don't do this classwork or homework or whatever, by the time you get around to the student being tested, it's pretty hard to tell now if it's the student being tested or the AI being tested.
[00:19:30] The AI is gonna know a lot more than the students here. Obviously, it has access to a ton of information, and it speaks like a human. So not only is it impossible to tell apart human versus AI at that point, but it's also, you might get rewarded for it, because the AI does such a better job than what a 10th grade student can do.
[00:19:51] And so I just talked about that a bit, and then I went into specific use cases for maybe how it could work in their education system. [00:20:00] And I think that worked pretty well, just because like, it's good, it's a, I know you are kind of anti-use-case in a way. Mm-hmm. Because it's more about just thinking about it, like how you think about it and being creative.
[00:20:13] But I think use cases worked really well here. Mm-hmm. Just because, mm-hmm, it's always helpful to start off by knowing exactly what to do, maybe even the exact prompts to put in.
[00:20:21] Conor: Yeah, I think, I think that's right. It's, and it was, it was so interesting because I think it did a couple of things like, first of all, them.
[00:20:27] See, so we worked with, you know, a couple of nonprofits, a couple of schools, right? And, um, and it was, I, I think it was useful because again, like we didn't know how they'd respond. Like this is a pretty radically different culture. It's about as different as you can get. Yet we also had the theory going in that they, as to Finn's point, education hasn't changed in hundreds of years because it doesn't really exist in a different form all around the world.
[00:20:52] It's sort of, it's, it's, it's almost intuitive. Like cave people were probably teaching in the same kind of way, right? Like it's, you know, [00:21:00] teacher teaches, the student learns, and then the student reflects back, and that's how you grade, essentially. And so we thought, okay, well let's just try this. In fact, our original idea was like, should we teach students?
[00:21:10] But we were like, well, students are probably figuring it out faster than anybody, but if, you know, it's probably the teachers and the, and the parents and the nonprofits that needed to do it, so, so I think one of the interesting things was that I would, yeah, as Finn said, I would sort of like start and teach like, Hey, this is like why we're looking at this differently.
[00:21:27] That's when Finn was sort of saying, yeah, dad, like I know that we don't usually do use cases, and the reason to not do use cases is just, if you teach it like that, then people tend to think of generative AI and ChatGPT as a tool you take off the shelf. Like if you're like, Hey, if you're in sales, here's how you do it.
[00:21:39] People are like, okay, great. If I'm in, this is what I do, instead of thinking, it's actually not about that. It's like, start with what you do. You know, it's, it's more, you know your day better than anyone else. You know what you do better than anyone else. Let's talk about what you do and then bringing generative AI after that.
[00:21:56] And so that's kind of like our, [00:22:00] our approach. And we thought, well, humans are the same with that. But then Finn's approach, as sort of in the second half of the presentation, would be, yeah, but if people really haven't had experience with it, then we're gonna just have to show them. And I was like, yeah, that, that's fair.
[00:22:12] So, so then Finn would come in, first of all, with this idea of, again, we, we like to go super fundamental. Like, listen, here's, this is gonna sound very simple, but when you don't usually put it into terms it's hard to grasp, which is, it's not just like education, it's like, what is education? Education is: the teacher teaches, the student learns, the student is tested, and then that cycle keeps repeating and keeps repeating.
[00:22:35] And so when we were thinking, well, how do we actually get this across? Uh, yeah. It became, you know, Finn ending with, um, okay, lemme just show you, like, I think it was like 10 ways students did this, 10 ways teachers can do this. And it just, it, it was the magic of the moment to, to help them feel that, number one.
[00:22:54] But the other cool thing, which I don't think I expected it only, it only kind of came up later, [00:23:00] um, was that, was that, and this is sort of how you, how we all got together too, is that. You know, with the, with these things, and this is by the way, you guys know nonprofit. I know nonprofit a little bit, but nonprofit is all about just the like, oh, we did not expect this to go this way, but like, let's just ride this.
[00:23:19] Like, oh, we thought we were doing this, but we're actually, turns out we're doing this. You know? And you have to adapt very quickly in the nonprofit world. And with this, I think what we found was we sort of thought, oh, we'll go in and sort of like teach this, but I think one of the really
[00:23:32] kind of positive externalities of this, like in other words, like something that happened that we didn't expect in a very positive way, was Finn as a messenger. That was, I would say, almost one of the most powerful things, because they were like, oh, wait a minute. There's this kid, you know, high school kid, but still a kid. He's teaching it like he knows this so well, and clearly he doesn't have 20 years of experience in something, and he doesn't know coding and he doesn't know anything.
[00:23:55] 'cause I can say that all I want, but they might be thinking, ah, he does. But with Finn it's like, no, [00:24:00] no, he's in high school. Like this is just something that he just started using. And if I can use it, and not to diminish obviously that, but like if, if Finn can figure this out at this level of his experience, it's not about intelligence, but it's about level of experience.
[00:24:13] Truly anyone can do it. And I also think that the impactfulness of it was, and I think it evolved as we were going through, like from one thing to, like, after every one, we would debrief all night, be like, okay, so what could we have done differently? And one of the really cool things I think was realizing, um, you know, the power of this is just how this is really for everyone.
[00:24:34] You know? I mean, it's just like, look at how different these two people are up on stage. And that became half of the message, if that makes sense.
[00:24:42] Amith: That's really exciting. And, you know, you're talking about democratization of the world's best possible delivery of education and what you described of that three step process in education.
[00:24:50] I was gonna ask you, uh, if that is pretty consistent around the world in your experience, and you, you already answered that, that it is, you know, across a variety of, of even very different cultures. Uh, [00:25:00] and I think that's, a lot of that is the, uh, ability to scale education. You know, when we, when we talk about going full circle, anything in terms of whether it's, you know, 16 years or thousands of years, you know, going back to the ancient Greeks, you know, the, the highest quality of education was one-on-one tutoring, but that doesn't scale.
[00:25:15] And one-on-one tutoring is decidedly non-linear. You can just have conversations and go back and forth and test different ideas and be pushed and pulled and all these things. And for the first time ever, we can scale that type of education, which is part of what's so foreign to everyone, because we're not used to that.
[00:25:32] We're not used to having an abundance of that resource. So that's the thought that comes to mind hearing you guys talk about that: it's gonna require a reset in thinking, uh, for the education system, not just to allow it, but to fully, you know, benefit from this incredible, uh, shift that we've had all of a sudden.
[00:25:49] So it's, uh, it's pretty tremendous. Um, one particular thing I wanted to ask you guys, uh, in terms of cultural barriers specifically: it sounds like there was consistency in the [00:26:00] expectations in terms of the way the school system worked. Was there a difference in Nepal in terms of the degree of fear
[00:26:07] that people running schools or involved in the education system had, relative to here? I don't have broad experience with teachers and administrators in the States, but in the limited exposure I've had, speaking at my son's school, uh, a couple times about CS and talking to teachers and administration there, there's a very high level of fear, uh, here, and I've heard similar things anecdotally from other parents.
[00:26:27] So what do you guys think about that?
[00:26:31] Conor: That's a good question.
[00:26:40] Clearly there's a level of fear. And I remember, um, you said this in your presentation a lot, that the fear is sort of justified, right? I mean, teachers all of a sudden know that there's this tool out there where students can literally press a button and, in a completely undetectable way, it spits out a paper on Hamlet [00:27:00] that, you know, they have no idea if a student wrote or didn't write.
[00:27:05] That's one thing. Um, and then just in terms of the cultural thing in Nepal, before I hand it over: I don't know, I was really impressed with their questions, actually. 'Cause we went to different places, and all these places that we went were quite different. So, for example, in one of the places, you know, teachers were more like, yeah, just teach us, and all that kind of stuff.
[00:27:26] But in the other place, I think they were a little more intellectually honest in saying, why are we even letting this happen? You know what I mean? Like, we have a hard enough time getting students to think critically. Doesn't this just remove that? But I don't know how you kind of saw that.
[00:27:41] Finn: Yeah. So on consistency, I think fear levels are completely consistent. Um, just because, lemme see how to phrase this: AI being [00:28:00] banned in the US is pretty much similar to AI being banned in Nepal. They take a full-on approach of just banning it all.
[00:28:06] Yeah, right. Because it's easier to completely get rid of it than to try and control some of it. Um, and it was a little shocking to me that somewhere like Nepal would be very similar to somewhere like the US in levels of fear around this one thing. Yeah. Where the US is supposed to be, like, really advanced in technology. But part of our presentation there was, yeah,
[00:28:30] even the Western schools, even these schools in the US, the top schools, are terrified of AI. Yeah. So it makes sense that you're terrified of AI. But I think the most interesting part of the whole thing was that the fear was exactly the same. And I think you already mentioned this, but with the questions I've gotten from schools in the West, schools in the US, versus schools in Nepal, the most common question out of both was, is this gonna replace critical thinking?
[00:28:54] Right? And it makes sense. It's a huge fear, because if we dive deep into it, [00:29:00] school is meant to teach learning. Of course, all these individual subjects are really important depending on where you go for your future degree or future job. But I don't know how much I'm really gonna use
chemistry in my future job. It's not like the most interesting thing to me, but it's important that I learn how to problem-solve, work through things, and learn how to work hard at these types of things that I might not know as well. So it's more about the learning rather than the actual subject, which is why AI is so risky, because it means you might not need to learn how to think, how to write, how to, um, compile your ideas,
[00:29:42] express them in words or on paper. And so the fear was mostly the same, I would say, across the board. Um, yeah.
[00:29:50] Conor: Yeah, no, that's right. And I was surprised at how many questions we got around critical thinking, because going in, I kind of thought it would be more, [00:30:00] yeah, just teach us what you know, they're not gonna push back.
[00:30:02] And they really pushed back. I mean, not like they were like, get outta here, devil, you know, but more like they really were trying to figure it out. And I think Finn being able to say, yeah, he goes to a good school in Connecticut, and, uh, we're wrestling with it there.
[00:30:19] And I think that, uh, I don't know, sort of normalized it in a way, right? I think they really felt like, oh, right, everybody's dealing with this. In other words, um, you know, it's not like medicine, right? Where you could be living in this, you know, really remote place and be like, I don't know if somebody has diabetes.
[00:30:35] They just have diabetes, and they're like, no, no, no, look, you just have this insulin thing and you just carry it everywhere. So here's the solution. I think this was one of the first things, at least in my mind anyway, where it's a technology that exists the same everywhere. Everybody's wrestling with it in the exact same way, because it's a technology that doesn't really act like a technology.
[00:30:54] It acts more like a person, you know? And so to [00:31:00] not have cracked that code, any more than we could have cracked the code on, like, well, how do you guys parent in America? It's like, well, I don't know, how do you parent here? You know, it's one of those things where it was much more of a relational, impossible-to-solve thing.
[00:31:13] And so that's why I love when Finn would just dive in with, like, yeah, you're scared because it's scary, and because it's disrupting all this. Why wouldn't it be scary? So I thought that was always just really interesting.
[00:31:23] Mallory: Mm-hmm. Yeah, that really resonates, the critical thinking piece. I often think about that myself, and I don't think any of us have all the answers, but it's important to discuss it on platforms like this one.
[00:31:35] I, uh, used to host an intro to AI webinar for Sidecar, and normally the questions we would get, to your point, are like, which platform do you use? Right? Or, uh, Mallory, can you showcase this thing in ChatGPT? And then one time we got a question, and I still remember that moment, where someone dropped in the chat, isn't this gonna make us all lazy?
[00:31:54] I remember that one kind of threw me for a loop, because I was used to answering, you know, the quick one-offs, and I had to sit there for a second and think, [00:32:00] well, I mean, I suppose it could. I hope that it doesn't, as a society, as humanity, but yeah, I mean, it definitely provides some shortcuts. Finn, uh, you being 16 and on the podcast, I have to ask you this question.
[00:32:13] We've been talking about kind of our angle, I guess, as adults in the workforce of the lack of critical thinking that could come from AI potentially with students and in schools. Finn, from your perspective. How does your generation view this technology? Are all your friends using it or some using it?
[00:32:31] Some not? Um, are they taking shortcuts with AI, or what is that experience like from your perspective?
[00:32:38] Finn: Yeah, so I think it's a great question. Um, it really depends person to person, student to student. I don't think there's a collective idea on this for my generation, but I think there are pretty much two main groups.
[00:32:52] Um. So there's this example I used in Nepal a bunch, which is like the ski analogy. Um, and what it is, there's this piece of [00:33:00] advice that says if you're skiing downhill through a forest, you gotta focus on the path and not the trees. And the reason this is, is because if you're only staring at the trees, you're only paying attention to the trees and you're most likely gonna hit a tree.
[00:33:12] 'cause you're not looking where you're going, you're only looking at the trees. Versus if you're only looking at the path and you only follow the path, there's no trees in the path. So you'll most likely get down safely. And it's a piece of advice about like, oh yeah, focus on the good things in life, not the bad, not the obstacles.
[00:33:27] But I think it really applies to AI here too, in that, because it's been banned across the country, basically, and across the world, so quickly, it almost became a tree, in the way that you're telling these students they can't use AI. And so instead of this powerful tool that they could be using to help their learning, you're putting an obstacle in front of them.
[00:33:50] I guarantee you, when the time comes, at, like, 11 o'clock at night, when you have a paper due in 10 or 30 minutes, most likely now, some [00:34:00] students, who have access to this technology, instead of taking a bad grade on this paper or trying to get an extension on it,
[00:34:11] They can now do it in the span of 10 minutes, because the AI will write it for them insanely quickly, incredibly quickly, and almost better than they could write it. And so, in a way, it became an obstacle, something that was only used when they really needed to cheat or to help them out on some sort of assignment.
[00:34:29] But because it was banned, it was an obstacle. And so you have the kids who would use it like that, and then the kids who would be so terrified to even experiment with it, because what if the school saw them on the browser accessing ChatGPT? They'd be terrified that, oh, maybe they might get kicked outta the school.
[00:34:45] Who knows? It's such a severe reaction to it that it seems like the root of all evil at some points, like it's cheating. And so it became such an obstacle that I think a lot of people were either terrified to use it or really [00:35:00] secretive about using it. And because most of our time is spent in school, that
[00:35:06] became almost the only use case for it. No one thought to experiment outside of school. So it really just depends on the student. But if the schools had taken a different approach to using AI, maybe creating great guidelines right away, or safe guidelines, and constantly experimenting with it, I think people's outlook on AI would be incredibly different.
[00:35:31] Conor: That makes a ton of sense to me. I agree. It's hard, right? It would be great to be like, oh, but students just shouldn't cheat or whatever, but we were all students, right? I mean, we were all looking for shortcuts, 'cause we're overworked, and also there's so much pressure on students to get into a good college,
[00:35:53] 'cause grades are essentially wages. Yeah. They're valuable, right? I mean, that's [00:36:00] the value to students. What's more valuable than that? So if you're telling students, don't use this thing, which is unbelievably easy to use, well, good luck with that.
[00:36:08] But, you know, and Finn and I talk about this all the time, it's too bad, because I also think that if ChatGPT did nothing else except help you learn, it would be the most valuable technology of all time. That's how I use it all the time, because, for me, I'm always having to, you know, sound smart about things, and I just don't know about a lot of stuff.
[00:36:27] And so I'm constantly using it to sort of say, hey, this person just wrote me this note, I don't know what that means. Like, what does this mean? And it's, hey, Conor, don't worry, uh, think about it like this. And I'm like, okay, come up with a different analogy. Okay, how about like this? Okay, now come up with another one. And it's an incredible learning device, and you can imagine why, right?
[00:36:45] It's a great learning device because we're flawed as teachers, because you can't possibly get into the head of every one of your students. If you had something that really understood everything you're interested in, to exactly the level it should teach you, and you only had one [00:37:00] student in the whole world, and you could focus completely on them, and come up with amazing analogies at a moment's notice? Great.
[00:37:06] But that's not the world. So as a learning tool, it's amazing. But, as Finn says, you sort of can't just do this. This is why the building-a-plane-while-you-fly-it thing is so tricky, because in schools we're trying to do two things at once, right?
[00:37:22] We're both trying to stop people from cheating, but also thinking, okay, so how do we also get them to learn? And where Finn started this conversation was, the education system hasn't changed in however long, maybe thousands of years. It's hard to change. And by the way, we have the same problem at NYU with faculty, where a lot of 'em are like, yeah, let's do this.
[00:37:43] Let's learn. But then we have certain faculty who have been really successful for 20 years, and they're like, why would I change? I already know how to teach this. What do you want me to do differently? So as a learning tool, it's incredibly powerful. But how do we make that switch? I don't know. I'm not really sure.
[00:37:59] Amith: I think one [00:38:00] interesting, uh, piece we can pull from this that directly applies to a lot of people in our audience in the association not-for-profit community, uh, is this idea of, uh, setting clear guidelines. We talked briefly about that earlier in this conversation, but I just wanna highlight that for a second.
[00:38:15] In the absence of any policy, people are confused. They're not sure what's okay, what's not okay. Uh, sometimes policy is set to outright ban certain things, which results in other behaviors. Uh, and then sometimes you have a policy that says, these are the things that we would like you to do, these are the things that are acceptable but they're risky,
[00:38:33] be aware, and these are the things we really would like you to avoid doing. And if they're reasonable and well considered, communicated, and perhaps collaboratively built, ideally, uh, that can be very powerful, because then you enable people. Then they feel really good about, uh, you know, experimenting with the new technology.
[00:38:49] Um, for the benefit of you guys as our guests on the pod: a lot of times the association community, particularly over the last, uh, [00:39:00] two, three years of adopting this stuff, has said, hey, you know, we're just scared of people using this technology, because we're worried our IP is gonna leak out, or we're worried that the quality will be bad, or we'll misinform someone, all the usual things that people are concerned with, and so therefore we're not gonna do anything about it.
[00:39:12] And oftentimes the worst reaction is the absence of a policy, but the outright banning of things, you know, causes people to still use it, just in clandestine or shadow-type ways. So I think it's interesting to see how, uh, kids reacted in schools. But, um, you know, one thing I'm curious about is, when you're thinking about other people that are, let's say, in high school, but even in college, um, what are some ideas that come to mind for you guys? And particularly, Finn, maybe you can speak from your own experience.
[00:39:45] How can we get younger people excited about ai? What, what are some things we could do? Because the, the association world plays a role in that. You know, associations represent professions from accounting to law, to engineering, to every branch of science and medicine. [00:40:00] And a lot of times they work deeply with, uh, universities and sometimes even high schools to try to get people interested in their fields.
[00:40:06] Uh, what can we do as a profession, and just as a community, to get more, uh, young people interested in AI and open to experimenting with it?
[00:40:15] Finn: Yeah. Well, I think that's a great question. How do we get young people interested in anything? Um, I've been talking to my school about this a bunch, and I think it just has to come from other students.
[00:40:29] Mm-hmm. Because there's an incredibly powerful tool out there, and, as we know, students usually take the path of least resistance when it comes to doing a bunch of work. They don't wanna spend six hours on one project that could maybe be done in two, max, with the help of some extra tools. But if you hear it coming from, maybe, teachers or things like that, it usually feels like a burden, because, let's think about it:
[00:40:54] what do teachers usually give you when it comes to help? They give you maybe extra [00:41:00] work, extra tools, things like that. But it usually feels like, oh, now I'm forced to do this. And I think it's very similar to, I don't know if anyone can relate here, but if I'm doing a chore by myself, like, out of my own heart, I'm like, oh, you know, it'd be great,
[00:41:14] maybe I could clean the kitchen right now, so by the time my parents come back, they'll be really happy. But then I get a text from my parents saying, clean the kitchen, and now it's not gonna be all good, and I'm not gonna wanna do it in the same way I wanted to do it before.
[00:41:27] Mallory: Yeah,
[00:41:28] Finn: Yeah. And so I think it's exactly like that, where, wow, look at this incredibly powerful tool, but as soon as
[00:41:34] teachers or the school say, okay, use it in exactly this way, like, you just start using this tool, it's gonna feel more like schoolwork than some actual really helpful thing. Yeah. But for me, at least, if I'm hearing something from another student, it feels more like a piece of advice, because it's someone who's been through the exact same thing I've been through, at the same time.
[00:41:56] Like they're experiencing the same things I'm experiencing. Maybe the [00:42:00] same teachers I've had, same projects I've had. And so when they tell me something, I really open my ears and listen to 'em. Like, oh, this has helped that person. Great. Maybe this could help me too, because they've done the same thing.
[00:42:12] They know exactly how it feels to be in my position. And so you're always gonna have a different, um, experience, and you're always gonna listen to something in a different way based on who's saying it. Of course, with people in authority, sometimes you're gonna listen to them way more than you're gonna listen to your fellow classmates, teammates, whatever people you're working with.
[00:42:35] But sometimes you're more likely to listen to people like your fellow workmates, just because you know that they know how you feel, and they've done the same thing as you. They're in your position, where the teacher's not in your position. So, yeah, fellow classmates, for schools, is how I think you could really get people excited about this and involved.
[00:42:57] Conor: Yeah, this is why it's fun to work [00:43:00] with Finn on this, because, you know, I know how cool I am, I'm like a super cool dad and everything, and my kids are always telling me, like, gosh, Dad, you're the best dad of all time. And Finn is, um, shaking his head madly, uh, if you're listening to this on audio. But the thing is, as parents and teachers, we actually don't know, right?
[00:43:21] We can kind of guess. This is what's interesting about this, right? It reminds me of, like, in sports, when, you know, the athletes have, like, a closed-door meeting, no coaches, anything like that. It's like, what do we want? And I think students really do know, intellectually, that it's important to learn how to, you know, write and things like that, right?
And to think critically and all that sort of stuff, because it will help them. I don't even know about college, but it'll certainly help them out in the working world. However, you also have this tool, and right now what they need is a good grade. They need that tomorrow, because that is going to impact them hugely.[00:44:00]
[00:44:00] And in some ways, you know, materially, right? If they get a good grade right now, that is kind of like the gold coin that they're earning right now. But then how do we do that? Right. So I think, you know, first of all, hearing, like, what do students, like, what do you guys want to do?
[00:44:17] You know, you sort of just say to the students, go off in a room, no teachers, administration, parents, anything like that, and just come back to us, tell us what you think the issues are, tell us how you would solve them. And I don't think we do that nearly enough, which is why I'm so proud of Finn's school for bringing him in and sort of saying, like, hey, how should we be thinking about that?
[00:44:36] That's number one. But number two is, um, you know, in the same way, when we put guardrails up, I think guardrails are for the good. And Mallory, you sort of said this early on too, guardrails are for the good of the people, you know. Right. It's sort of like, uh, this is gonna be a stupid example, but with our dog, the reason why we don't let our dog beg at the [00:45:00] table or something like that is because our dog, for like two hours, will be
[00:45:03] expecting something and then disappointed. Whereas if the dog just knows, no, this isn't a thing, then the dog can go and chill out, you know, on the couch or something like that, right? It's kind of a dumb analogy, but you see what I mean, right? If we can give students this, it's easier for them to just kinda live their life.
[00:45:19] So, in other words, this is why it's so much on the teachers. And by the way, this is very hard for teachers who have been teaching the same way for many years. Uh, I don't envy them at all, but it's one of these things where it's like, alright, what would it look like if you took your same class and
[00:45:33] required AI? Right. And if you could have a small enough class, it would probably look something like: go home, use all the tools you want, use all the AI, figure it out, and then come in and teach this class to the other 15 kids in the, you know, in the classroom, or something like that.
[00:45:50] That's what you're now able to do with AI. Now, I'm not saying that flipped classrooms can work all the time, 'cause they're hard. But it's that sort of thing. I think it's, first of all, hearing what students want, and second of all, [00:46:00] really putting this on the teachers to set guardrails and to have students walk into a classroom and be like, this is the way we're gonna do it.
[00:46:07] Amith: That all makes a lot of sense. You know, one quick takeaway I'd also offer for our association listeners specifically is you guys have communities with a lot of members and uh, folks who are interested in collaborating. And some of the comments that were made earlier about peer-to-peer learning, encouraging that kind of thing, encouraging that kind of sharing associations are built for that, you know?
[00:46:27] Yeah. And so there's an opportunity there, whether it's for younger folks or for folks you know, well further into their careers. So I think that's a really interesting observation that's super applicable to our community.
[00:46:37] Mallory: I think internally with the association as well, Amith. We talk about the idea of, you know, top-down, or you talked about this in your book, Ascend: Unlocking the Power of AI
[00:46:45] for Associations. Top-down change is important. We need to see leaders educate themselves and be proactive. But at the same time, that bottom-up change, having your staff be the ones that are experimenting, feeling empowered. They're the ones in the [00:47:00] trenches, like Finn said. You know, they're the ones that understand the day-to-day workings of the association.
[00:47:03] So if you can kind of get them together and excited about this new technology, I feel like that's where the power really lies internally.
[00:47:12] Conor: Yeah. Yeah. Actually, just one thing on that too, 'cause I was just with an association the other day, um, around the medical publication industry. It's called ISMPP. And talking to them, it's the same kind of thing, right?
[00:47:27] They're all trying to learn from each other, because even though they're in other organizations, what's so powerful about associations is that it becomes this really safe place. Like, you know, the watering hole, like in Africa or something, where all these different people can come and actually share things before they kind of go back out to, you know,
[00:47:45] competition or whatever it is. Or a nonprofit, still, we're all competing for the same dollars in nonprofit and everything else. But I agree in terms of bottom-up, that's how you have to approach it. You have to have people learning from each other on this. And this is what I love about Finn's approach too, which is, [00:48:00] and this is even me in business: whenever I'm doing a demo, I always try to bring up somebody from the company and be like, okay, let's talk about what you actually do.
[00:48:07] Because that's who people listen to. They listen to people who look like them and sound like them and everything else in the association industries. It's, as you say, I mean, it's exactly what it's built for.
[00:48:18] Mallory: Mm-hmm. I feel like many of our listeners are in the predicament, maybe all of them, where, as an association, their members are looking to the association as the organization that needs to teach them about AI, how AI is going to impact their industry, their profession, uh, now and in the future.
[00:48:37] And so I'm curious for you both coming out of this experience teaching AI in Nepal. If you have any key takeaways, any things that you would do differently that might be helpful to our associations who have that responsibility to bring that education to their members?
[00:48:53] Conor: Yeah. Uh, I'll try to think of something first.
[00:48:56] So I think what I took away from our experience in Nepal [00:49:00] anyway, and even from my work here, is that, um, well, just to kinda give you a sense: I really work across industries. I've worked with pretty much every industry you can imagine. And when I get called into a place, and this is the same for Nepal, it's always like, okay, but how does it apply to us?
[00:49:17] Well, how does it apply to pharmaceuticals? Or how does it apply to finance? And I always say, oh, well, here's how it applies. But the truth is, it applies the exact same way to everyone. The exact same way. Because it is not about, uh, you know, a tool for pharmaceuticals and a tool for education and so on.
[00:49:36] It is just about a second brain for everybody who uses it. It's sort of like, well, how do you use the internet in pharmaceuticals? Or how do you use the internet in, uh, education? It's like, well, what do you need to do? And so you have to forget about what AI can do for a second and start with what you need to do.
[00:49:53] And I think Nepal really drove that home for me. And I was surprised, 'cause we didn't know going in, right? I was like, [00:50:00] I don't know how they're gonna take it. Finn and I are on this, you know, 20-hour plane ride, and we're like, no idea how this is gonna go. You know, we just didn't know.
[00:50:07] But I think what we found very quickly was, this is applicable across the board, but it has to be taught, I think, in this way, and this is our bias, but this is why we do it the way Finn does it and the way I do it, which is:
[00:50:26] forget about, um, you know, is it better for this or better for that? Just think: what do you need to do? And so Finn's use cases were like, if you are a teacher, here are just 10 ways that teachers use it, because this is what you do. If you're a student, here are 10 things that you do. And when I go to businesses, I'm like, just think about, like, how do you start your day?
[00:50:39] Like, what drives value for you during the day? That answer's gonna be different, and it's gonna be the user that determines that. But otherwise, the AI is so flexible that it's less about, well, how should I use it, and more about, talk to me about what you do. So for me, I think the lesson that I took away from Nepal was, even [00:51:00] in probably the most different culture, right,
[00:51:04] we could have possibly gone to, ultimately people still had the same questions. So anyway, that was the part, for me, that I found interesting.
[00:51:11] Finn: No, I mean, I completely agree. Well, I said you were anti-use-case earlier, and this is what I mean. The reason I could speak to teachers and students and give use cases is because I'm a student myself, so I kind of understand this, but it's harder to do this when you're going across multiple businesses.
[00:51:30] And so I think if AI was just a use-case game, if all it was is that, if you had the right prompts, you'd be amazing at AI, it would be incredibly different. And also, I think, Dad, you work with a wide variety of different businesses, and if it was a use-case game, not to be mean, but I don't think you'd be able to, right,
[00:51:50] be as successful, right? I don't think you'd get as good feedback, because you don't understand these certain businesses exactly. You've had a couple [00:52:00] of experiences with different businesses, but you'll never work in pharmaceuticals
[00:52:04] or anything like that. And so it's more of that idea of, yeah, what do you need to do, and teaching, maybe, how to think about AI, but not teaching the right prompts. Because even if you have the right prompts, they might work for very certain things, and you could keep recycling those again and again and again.
[00:52:22] But that's not the true potential of ai, right? Because yeah, there might always be a tool for certain specific things. AI is so wide and so powerful that you can just keep creating your own ways of using it. That's the true power in it.
[00:52:35] Amith: Yeah.
[00:52:36] Finn: So I think that's, yeah, the most powerful part.
[00:52:40] Amith: I think the framing you guys are offering is really powerful, and, uh, I think hopefully our listeners and viewers on YouTube are really getting a lot out of that: the framing and the thinking around how to generally view AI as a companion brain, a second brain.
[00:52:54] Think of it as an assistant, however you like to frame it. Maybe even go to the AI and say, what can you do for me? This [00:53:00] is the problem I have. And, uh, the AI can be quite helpful. The other dimension of change, I think, is both fun and also, uh, perhaps challenging: you know, you get used to the idea of, well, I tried this with AI six or 12 or 18 months ago and it didn't work.
[00:53:14] Or it didn't work well, and now I should just assume that perhaps I could retry the same thing and it will work. Well, that's odd, because that's not normally how software works. Today, for example, uh, we're recording this on May 22nd, which is the public release of Claude 4. And so if you're a Claude fan, as I am, I love that product.
[00:53:32] Uh, and you know, Claude 3.7 was great, and now Claude 4 is gonna do stuff that Claude 3.7 didn't do. I haven't really explored it much yet, but it's one of those things where the interface looks the same. It's still the same text box that's blinking, right, where I have to tell it what to do.
[00:53:46] Uh, and it is a tough thing for people to keep up with. And I think, therefore, you know, coming back to what you were saying, the framing is really important, because you don't make as many assumptions about a specific static list of use cases that work or don't work. You're just really [00:54:00] exploring the space and, uh, asking
[00:54:03] the software, basically, to help you do your job, or help you do whatever it is that you're trying to accomplish. So I think that's a really good way to look at it.
[00:54:09] Conor: Yeah, it's so right. It's actually one of the biggest hurdles that I have: a lot of people tried it early on. And Finn was mentioning hallucinations, and so, uh, I think you were talking, like, in French and stuff like that.
[00:54:20] I'm guessing it's way, way better now, but when you tried it in 2023, it was getting things wrong, and then people were like, oh, all this is, is a really bad Google. Do you know what I mean? Like, it's a Google that lies to you. Nobody wants that. Google's already good at Googling, so why do I want a Google
[00:54:33] that sometimes lies, right? So that's the big thing with hallucinations. That's actually a question I have for you too, Finn, which is, that reduction of hallucinations: do you think people in school, like, understand, or teachers understand, that hallucinations are now really low?
[00:54:49] 'Cause wouldn't that come back? I mean, wouldn't that open up a lot more possibilities?
[00:54:53] Finn: That's a good question. Well, I think, to your point, Amith, as you were saying, it doesn't usually work that way, where you can just try the same thing again [00:55:00] and it works completely differently. And so I really don't think they even realize, because, I mean, with a new technology that's so strong, you try it once and your first impression of it is usually fully set there.
[00:55:14] Like, oh, maybe it didn't work out the first time, so what's gonna stop you from never using it again? Right. There's not that much. Like, people can continue talking about it, but maybe you think, oh, that only works in that certain use case that they're talking about;
[00:55:28] when it applies to my life, it doesn't work at all. Because it is so wide, I think if they knew how much better it got, maybe they would be using it more. Yeah, I agree.
[00:55:39] Mallory: We're almost at time. Finn, I've just gotta say, you are so wise. Some of the things you are saying, I'm like, what? Where's this wisdom coming from in a 16-year-old?
[00:55:48] But I would love for you both to share kind of what's on the horizon for you. Any fun trips coming up? Any more AI workshops? Finn, are you gonna like go into AI when you go to college? Like, do you have any [00:56:00] thoughts there? Just in general? What's, what's next?
[00:56:02] Conor: Yeah. I mean, for me, like, I, you know, I'm just going around and working with organizations, but for me it's almost more fun to sort of see what Finn is doing next.
[00:56:12] Finn: Yeah, I mean, I would love to try and do more stuff. We've had people reaching out, and so we're gonna try and do that. It's mostly a matter of when I don't have school. But yeah, besides that, yeah.
[00:56:24] Mallory: Awesome. Well, thank you both so much for being our first father-son duo, first duo in general on the podcast.
[00:56:32] It was such a pleasure to have you and to hear Finn, especially your insights as a 16-year-old. Um, we so appreciate the time. Thank you.
[00:56:40] Finn: Thank you for having us.
[00:56:40] Amith: Thank you.
[00:56:42] Mallory: Absolutely.
[00:56:44] Amith: Thanks for tuning into the Sidecar Sync Podcast. If you want to dive deeper into anything mentioned in this episode, please check out the links in our show notes.
[00:56:52] And if you're looking for more in depth AI education for you, your entire team, or your members, head [00:57:00] to sidecar.ai.