Leadership Spotlight: Helping Independent School Educators Use AI to Teach, with Eric Hudson
Eric Hudson has spent his career teaching teachers how to use technology for instruction and to reach students across distance. This episode explores how Eric became a thought leader in the space of AI and education, the current relationship between teachers and AI, and where independent schools are struggling and succeeding when it comes to preparing students for a future with AI.
Resources
- Learning on Purpose (Substack newsletter)
- Global Online Academy
- Stanford University Human-Centered Artificial Intelligence
- Ethan Mollick, associate professor at The Wharton School
- Diffusion of Innovations theory by Everett Rogers
- Denise Pope, senior lecturer at Stanford University
- AI Resource Guide for Independent School Trustees, published by ATLIS, NAIS, and ISCA
Transcript
Narrator 00:02
Welcome to Talking Technology with ATLIS, the show that plugs you into the important topics and trends for technology leaders all through a unique Independent School lens. We'll hear stories from technology directors and other special guests from the Independent School community and provide you with focused learning and deep dive topics. And now please welcome your host, Christina Lewellen.
Christina Lewellen 00:25
Hello everyone and welcome back to Talking Technology with ATLIS. I'm Christina Lewellen, the executive director of the Association of Technology Leaders in Independent Schools.
Bill Stites 00:34
And I am Bill Stites, Director of Technology at Montclair Kimberley Academy.
Hiram Cuevas 00:39
And I'm Hiram Cuevas, Director of Information Systems and Academic Technology at St. Christopher's School in Richmond, Virginia.
Christina Lewellen 00:46
Hello, gents. Hiram, Bill, it's good to see you guys again; good to have this little opportunity to chat. How's everybody doing?
Bill Stites 00:52
Doing okay, a little tired. But other than that, I'm managing.
Christina Lewellen 00:56
Bill, you're jetlagged? You just got back from Australia. Are we going to make it? Are you gonna get punchy? Are you going to fall asleep? What's going to happen here?
Bill Stites 01:05
Punchy is par for the course for me, but I think I'll make it. If you see me just nodding off, or it sounds like my head's hitting the mic, just give me a little rattle.
Hiram Cuevas 01:14
I expect all of the above.
Christina Lewellen 01:15
We can mute you if you start snoring. Well, it's good to see you guys. I've been thinking a little bit about the nature of in-person events, because the ATLIS conference is coming up. As we're recording this, we're in the final details; our team is getting really excited, our attendees are getting really excited. Our registration is really hot this year; we have a lot of people coming. And in the last few years, I think we've all come to realize the value of in person: whether we're talking about gatherings for an industry conference or a school, there's incredible value around the face-to-face, and having kids and teachers and faculty in the same space at the same time. Today we're going to explore a little bit of a conversation around AI and what the future of all this looks like. But before we do that, and before we welcome our guest, I wanted to ask you guys: how are you feeling about the face-to-face thing, the in-person community building? How do your schools think about that? How do you think about it, you know, after these few years coming out of the pandemic?
Hiram Cuevas 02:21
I'll tell you, there was no greater time than the conference in Orlando; the energy was so high when everybody was back together again. I think what we're starting to see from a programmatic perspective is that everybody just wants to be together in the community again. That goes for being at your schools, at games for the kids, at performances for the students; proms are going on again. A sense of normalcy has taken over, and I think it was so missed during the pandemic that people can't get enough of it.
Christina Lewellen 02:55
Do you think we value it more?
Bill Stites 02:57
I know I do. I think getting back and getting in with people is probably one of the best things that we can do, because it gives us an opportunity to come together, to share, to meet as a group. The last week, like you're just alluding to, you know, I spent in Australia working with some people at Parker College, and we had been virtual up until that point, just because of distance. And what you come to learn from people when you get together with them, when you get to spend time with them, when you get to see them in their own element, whether it's at a conference or whether it's visiting them at their school, as I was for the week, it really gives you more than what you could ever get from any of the things that we do virtually. And I think we've done a lot of those things well, but I really think that time that you get with people face to face, spending that time, can't be reproduced in any other way, in any other fashion, and I just love it.
Christina Lewellen 03:51
I'm with you. And I think it leads us well into our conversation today around AI. We've been talking about AI; there are a lot of news headlines around artificial intelligence. But there's somebody in particular in our community that I wanted to welcome to the podcast, because he has been offering really incredible thought leadership around where this is going and the things that we need to be thinking about, rather than just sort of locking up and getting scared. And I do think that there are some conversations around being together, the future of the workplace, the future of schooling. You know, if we get all worked up around AI, you can kind of go to the place where robots are taking over the world, right? But I think that instead, we can continue to ask questions about how we preserve community and how we drive even more value from our in-person interactions in a world with AI. So with that, I want to welcome to the podcast Eric Hudson. Eric, you are a consultant and an ATLIS board member, and we are so glad to have you here on the podcast. Thank you for joining us.
Eric Hudson 04:55
Thank you so much for having me.
Christina Lewellen 04:57
So why don't we start by having everybody just understand a little bit about you. Tell us who you are, and tell us a little bit about some of the stuff you've been working on.
Eric Hudson 05:06
Sure. I'm a teacher at heart and by training; I spent the first 12 years of my career in the classroom, mostly teaching middle school and high school English at independent schools. And that's where I really got interested in technology. You know, I was blogging with kids in 2006, to kind of age myself. And I also just really learned about how empowering technology can be when it's applied thoughtfully in the classroom. After being in the classroom, I spent 10 years at Global Online Academy. It's a nonprofit consortium of schools around the world, and GOA does passion-based online courses for high school students, as well as PD for teachers and school leaders. In my role, I wore a lot of different hats at GOA, but the core of my work there was really teaching teachers how to use technology to teach well and to reach kids across distance. And that sort of led me into all the work I've done on AI. I left GOA this past summer, and when I was on my own, the timing was such that AI was really peaking at that moment, and I felt like I had something to say and a lot to learn. And I really care about schools; I really care about doing school well, and for the benefit of students. And so I've been doing a lot of work with schools on how AI fits into that picture, really at all levels: teaching, leadership, and board.
Christina Lewellen 06:36
It makes sense to me that your background is as an English teacher, because your voice has been prominent in terms of AI. There's a lot of noise, and there's a lot of news around AI, but everyone is turning to you. And in particular, you have quite a distinct voice on LinkedIn, where you're sharing your thoughts and you're distilling the things that you're processing, things that you're thinking about. So tell me a little bit about that. I mean, was that kind of accidental or intentional on your part? You have such an incredible presence in that space, and I think a lot of people are really coming to rely on you for it.
Eric Hudson 07:13
Thanks. I have a Substack newsletter called Learning on Purpose. And when I left GOA, one of my sort of post-GOA resolutions was to write more and to get myself into a routine of writing. The Substack really began as a place for me to document my thoughts and research. You're absolutely right, and thank you for saying that, that it's turned into something much bigger than that. I think having a background as an English teacher, and as someone who thinks about writing a lot, probably helps. But also, I was at GOA for 10 years, and the second half of that spanned the beginning of COVID and the first three years of COVID. I really learned a lot about how to talk to people about disruption and innovation, and how to think about big things that feel overwhelming and try to make them a little more manageable, and try to recognize where we can kind of exert our agency and influence over things that feel much bigger than us. And I think I bring a lot of that experience into how I think about AI.
Christina Lewellen 08:21
So let's go there for a second. I'm sure we'll dig into it a little bit. But how about we start with a really high level overview. What do you think about AI? Where do you land on the spectrum?
Eric Hudson 08:32
The spectrum is wide. My last Substack post was about this concept I got from the Stanford Center for Human-Centered Artificial Intelligence called being a human in the loop. It talks a lot about how, you know, AI systems really should be designed for humans to be coaching and driving the bots forward. Right? We shouldn't be designing AI systems to fully take away human agency and human influence. And I think that's a really helpful way to think about the role we should play with AI in schools. I think it really captures well what I believe about all technology and education, which is: we need to center and prioritize our humanity, and we need to center and prioritize how we think about the role humans should play in technology. You know, at least for right now, AI is a tool. And as a tool, we have agency over it. My whole career has been about thinking about ways to nurture and support agency in teachers and students. And so I really like the idea that this is a very powerful tool with a lot of positives and negatives, but it is still a tool that we can use.
Bill Stites 09:42
Eric, the one thing I can appreciate is the fact that you were just here at MKA, specifically working with our English department, which I think was absolutely one of the best things we could do here. Because, as you said, you know, those big things that may seem overwhelming: I think that captured a lot of what our English department, and English departments in general, are feeling; whenever I go to conferences, that's where I usually hear the most concern and skepticism about AI. And the one thing that you just said that really struck me is how to engender this idea of creating agency around it. If you could, can you speak to some of that work, and let people know what you did with our staff here and how you developed that, to bring that about? Because again, that was something that we needed, and that you were able to bring to bear for our school, and I'd love for you to share that with people.
Eric Hudson 10:34
Yeah, I loved that day with your colleagues. It's very, very rare that someone like me gets a whole day with a department, you know, a small group of educators focusing on one thing, and they were great. I think they were also really representative of my experience working with schools, and specifically English teachers, in that there was a real spectrum of feelings about AI, a spectrum of experience with AI, and a real spectrum of curiosity, I guess, about AI. And the thing I did with them that I try to do in every single session I run, whether it's teachers or students or board members or whoever, is: I want us to get into a chatbot. I want everyone to be on their device, using a chatbot. I provide guided prompts, I provide follow-up questions, I provide reflective questions, because I think the best way to make sense of AI is to use it. And I think dedicating time to think about a teaching problem you're trying to solve, see if AI can be supportive to you in solving that problem, and use that experimentation to develop a sort of informed consumer assessment of how well the tool works for you, or doesn't work for you. In my experience, a lot of the fear and anxiety that folks have about AI is mitigated when you actually get into it and understand that these tools are powerful but limited, that doing the things you're afraid of actually shows how hard it is to do the things you're afraid of, right? Like, the output at first is pretty generic; it takes a long time to coach AI to do the things you want it to do. And when you approach it through the lens of what's the teaching problem I'm trying to solve, it goes back to that question of agency. It's like, oh, I can actually bring something I care about, I can bring something that relates to my expertise, and I can use AI to address that thing. And it becomes, in my experience, very empowering for someone to actually be inside the tool and start to see the real practical applications that they could use it for.
Christina Lewellen 12:48
Do you have a sense of where you think it's going? Let's stay in the English teacher space: how successful were you? I mean, were you able to bring them along with you? Or is there still some concern? It sounds like it was a great day, but do you think that educators will come along as they begin to work through their anxieties?
Eric Hudson 13:08
You know, there's only so much I can do in a day, right? My goal, whenever I do a session, is: have I convinced people to keep practicing? I think Ethan Mollick at Penn talks about, you know, everyone needs probably a minimum of 10 hours with AI to actually develop a clear sense of what it can and can't do for them. And so when I spend some time with folks, I want them to work with me, but also feel motivated to keep trying it. In terms of the future, I mean, I see lots of different pathways for educators, specifically in English. I think there's sort of the teacher workflow pathway of: how can these tools support my individual work as a teacher? Support me with generating rubrics and learning outcomes? Support me with making my feedback better, or managing my feedback? How can it support me with, not writing, because I think English teachers are naturally good at writing, but maybe data processing, synthesizing assessment data? I think that's one pathway. I think the other pathway is: how can I work with students with AI? Right, like that kind of AI-assisted assessment idea, which is: how can I bring it into my classroom? You know, how can I weave AI into the writing process as a way to kind of model for students what appropriate use of AI looks like, what AI can do to assist with thesis generation or paragraph construction or editing or copyediting? And then there's kind of the AI-free pathway, which is: now that AI is out there and every student has access to it, how do I have to kind of redesign or reimagine some of the stuff I've been doing for a long time to be AI-aware? Even if I'm not using AI with students, there are probably some adjustments I need to make to the way some of my assessments are designed, the way I structure the writing process.
And I think that's another pathway, especially for those teachers who may not feel super comfortable with tech or may not feel super tech-savvy: there is very much a sort of non-AI pathway to addressing AI. You know, the answer to AI is not necessarily more AI, right? To me, as long as you are addressing it in an intentional way, I'm a little less concerned about the exact way you're addressing it, because I think there are multiple pathways into it.
Hiram Cuevas 15:34
So, Eric, I find it interesting, because you started off by talking about your own personal belief that AI is a wide spectrum, and I think you're touching on that right now, in terms of that spectrum within the teaching industry. We still have teachers who believe that using AI is a form of cheating the profession, and it's fascinating to see that, because they don't want to use it, because they feel like they are selling their students short, as opposed to using AI as that thought partner to assist them with some of the more mundane tasks, or some of the things that they may lack the skills in, where AI can then benefit their teaching practice. The other thing that struck me is when you mentioned the problem versus what I'm going to call the exercise. I think so many times people are used to just solving exercises and knowing what the outcome is going to be. AI is so open-ended that when you type in your prompt, you're introducing a problem; each time you address that prompt differently, it's going to give you a different outcome. You feed that beast, and it ends up learning from you. And so much benefit can come from that process, if our teachers were able to just take that leap of faith and embrace it, more than trying to push it away.
Eric Hudson 16:52
I totally agree. I mean, in some ways AI feels new, but in many ways it's not new. You know, it goes all the way back to the diffusion of innovations curve, which came out, I don't know, in the 1960s. Innovation in general requires vulnerability, right? You have to be willing to fail; you have to be willing to present yourself in a way that you might not feel comfortable with or might not be used to. And especially in schools, a lot of a teacher's identity, and a lot of a teacher's sense of confidence, comes from that expertise, right, the sense of their own expertise. And so to try something new, to do something with students where they're not quite sure of the outcome, they're not quite sure if it will, quote unquote, work: I think that's a big and understandable hurdle for a lot of teachers. So when I design sessions, it's really thinking about: how do we create a space where people feel comfortable being a little vulnerable with colleagues? And how do we have a conversation where we can sort of air out some of our vulnerabilities and process them? Because ultimately, you know, adoption of technology is not about the quality of the tool, right? Adoption of technology is about the willingness of the person who's supposed to adopt it. And so it's not even a leap of faith; it's being able to stretch in a way that doesn't feel like you're letting go of everything that you believe in or hold onto, right? And that stretch can sometimes take a lot of work for teachers, especially after the last four years we've been through in education.
Christina Lewellen 18:28
That is a really great point. And given that, do you feel like schools, based on your experience, and you've been working with schools, you've probably gotten a lot of feedback from schools based on your very public sharing of your ideas: are schools missing the boat on AI? Or are they taking advantage of the opportunity? I worry, based on what you just said. We're tired, right? Educators are tired, administrators are tired. Our societal issues put pressure on schools like never before. And now here comes AI barreling at them. So are we missing the boat? Are we taking advantage of the opportunity? What's your view on that?
Eric Hudson 19:07
The tide is turning, right? I mean, I think if you compare this to when ChatGPT came out, whatever it was, a year and a half ago, to now, schools are far more open, taking much more of a learning stance toward AI. I think when ChatGPT first came out, we all remember, you know, New York City had a very well-publicized ban on ChatGPT, right? The conversation was all about cheating, and like, this thing can write essays, and it's the end of English class, or whatever, you know, sort of think-piece headline you want to come up with. I think that's definitely turned; schools hire me because they're open to AI, right? But, you know, I do think that because of the way ChatGPT rolled out, there's a real lingering culture of secretiveness, or even shame, around use of AI in schools. Like, people are using it, but they're not talking about it, because they're scared to talk about it, and I would put both students and educators in that bucket. And I think that's going to be the thing that slows down schools the most, this lag around: how do we create a sort of open, inquiry-based culture around AI, rather than this sort of "it's bad, don't tell me what to do" vibe around it right now? But I definitely see things shifting. And I also see more investment in learning about the tools, and giving teachers access to, like, ChatGPT-4, or giving teachers access to, like, the higher tier of MagicSchool, or whatever it is, trying to get the early adopters the resources they need to be able to continue to explore. I think those are all positive signs.
Bill Stites 20:52
The one thing I think is really interesting, as you talk about some of the concerns, some of the fears, some of the thoughts around all this, you know, where teachers are landing on the spectrum of this: one of the most interesting things I've experienced in the past couple of workshops or conferences that I've gone to is when we've had student panels and when we've engaged the students in the conversation. And it's very interesting, because I think what you often hear from the educator is how the kids are going to use this for X, Y, and Z reasons, that they're going to do all the things that you're just talking about. When you talk to the kids about it, they're like, no, you know, we kind of use it, but we don't count on it for everything that we need. And I think everyone just kind of needs to take a moment, because I think all too often we think that our students are so far ahead of where we are with this stuff. But when you sit down and you have those conversations, it doesn't often pan out to be the case. Of course, you've got the outliers on either side of it, whenever you've got anything like this, but I think for the vast majority of them, both teacher and student are still in this questioning phase about: how do I use this? How do I use this in a meaningful way? How do I use this in a way which is going to improve my teaching or my learning, or both, depending on what side of it you're coming at? I think that's something we need to do more of: have those conversations.
Christina Lewellen 22:12
Like we can't assume that it's going to be used for evil.
Bill Stites 22:16
Oh, exactly. Yeah. And getting that student perspective. I mean, one of the things that I've always loved about what we do here at MKA is we have our student ed tech leaders and our library leadership kids, and those are the kids that we turn to whenever we've got questions about anything. It's like, you know, what device do you think we should get? Or, what do you think of this new AI thing? And the conversations that come from those interactions are the ones that I think really have the most meaning. And I think, really, if you're going to share anything with educators, it should be those conversations, because it gets everyone on kind of the same level, and you can really have a meaningful conversation and dialogue moving forward, rather than assuming all of these things are going on when they're not necessarily the case.
Eric Hudson 23:02
I couldn't agree with you more. I mean, one of my favorite activities when I go to schools is I ask schools if we can do, like, a lunch or something where students and teachers are sitting at tables together, and I sort of work them through a few discussion prompts and case study discussions, for exactly the reason, Bill, that you're saying: you need to make sure that you understand students' lived experience with these tools, and also that students are on as wide and varied a spectrum as adults when it comes to AI. Over and over and over again, when I talk to students, I am kind of surprised by how few of them are using AI on a regular basis. And I would point folks to Denise Pope at Stanford, who has actually done some surveying on this; she's finding that the majority of students are not using it, and the majority of students who are using it are not using it to cheat. So I think that there's this communication gap to be bridged between educators and students in schools that could be really valuable for both to really understand the perspectives and concerns that folks are bringing to the table.
Hiram Cuevas 24:12
Eric, I like the fact that you've mentioned the communication bridge. I think, regrettably, there are some schools out there that have made AI so punitive that the kids, unfortunately, are very, very scared to talk about it. I've talked to students at a couple of different schools, and I said, "So, are you allowed to use AI?" And they said, "Oh, we can't touch it. We can't look at it. It's blocked at our school." It is like the forbidden fruit of the education field right now. It makes me sad, because I look at how, in the short time that we've been able to explore AI as a faculty on our campus, we started off with that punitive sense, and we're just starting to turn that corner, because they're starting to understand that for our students, this is a life skill. This is not something you can ignore. If they do not have access, or the ability to learn how to use it appropriately, they are going to be behind. And that's not what independent schools are about.
Christina Lewellen 25:06
And I would add, it's the future of the workforce. As we wander around and do sessions, the ATLIS team, we talk a lot about the fact that at ATLIS, we use AI all day, every day, to help us advance our work and take away mundane tasks. And the students at independent schools today are going to be the workforce of tomorrow. These are the folks that I will need to hire to run a great organization, and I need them to be AI literate. I'll also just share one story, Eric, that I have shared publicly among educators, and you can see the eyebrows go up, and I can see their heads tilt a little, where I maybe have cracked some of the hard shell. And that is that our oldest daughter is adopted; we adopted her right before she turned 17. And she bounced around from school to school her entire childhood. And she had some significant holes in her game, including in the realm of reading and comprehension; that core, younger-year foundation that she needed, she just didn't get. And so the high school in our area, the public school, just pushed her through, right? They just needed to graduate her, and then COVID happened and all this, but we're not exactly sure that she was reading. She certainly was not at grade level when she graduated. So fast forward a few years: she wants to go to community college. And she's doing great, by the way; she wants to become a kindergarten teacher. So she's about to head off to a four-year college in the fall. But part of what we recognized or saw was that the community college was giving her zero guidance on AI. And yet, I'm her mother. And so I went home and taught her how to use it ethically; I taught her about the concerns; I told her about the things that I would consider cheating. So she uses AI all the time. Sometimes it's simply to take a professor's prompt and put it into ChatGPT; she tells it to rephrase it, to help her understand the deliverables around it. So it has really advanced her learning. It's like a tutor, right?
And yet, her community college gives her no guidance on how it can be used well, how it can be ethically applied to her education, how it can be a thought partner. I want your reaction on that, because I just took it into my own hands as mom, knowing my child very well (now she's an adult, but knowing my human very well), that she's got the goods to become an educator, but she just was failed by the system she came through, right? So I guess my question then is, it's not very fair to her to not have guardrails. And what I wonder is, are some of our schools in the independent school space letting our educators and our students down by not addressing this policy-wise?
Eric Hudson 27:57
Yeah, I think so. I love that story, by the way. I'm so happy for her. That's fantastic.
Christina Lewellen 28:03
I'm so happy for her too; she's really doing so well. And, you know, I think that she's going to be an incredible success story of how this type of tool can really help, you know, English as a second language learners, or any learner who has any kind of disadvantage. But I sure wish they had given some guidance, yeah, in terms of how it can be used.
Eric Hudson 28:24
I think that connects to a lot of what we're talking about, which is the kind of fear and anxiety among educators, right, a reluctance to put themselves out there on the topic. But I think what you raised is kind of... I have a slide that I share with boards and heads all the time, which is: what is the job of schools in a world being disrupted by AI? Like, you can love AI or hate AI, but you have a job to do, right? And your job is to educate students. And so I would argue that it is our job to educate students about AI, because the job of schools is to prepare students for the world beyond it, right? And so I think there is kind of a missed opportunity, and I would even argue a responsibility, at schools to educate students about AI. I also think that that story illustrates the ways in which AI is not new, in the sense of: what do we consider to be appropriate academic assistance or academic support? What do we consider to be sort of equitable distribution of support and assistance, right? Independent schools have been dealing with questions of tutor support, parental support, peer support, internet support. These are questions that we've been wrestling with for a very long time. And I think we are able, and therefore equipped, to address these questions of what responsible support from AI looks like. You know, the goal of all the Silicon Valley companies is for every student to have a high-quality personalized tutor in their pocket, right? So what do you as a school want to do about that? School is so much more than a personalized tutor in your pocket. But what is it, right? And what do you want to be in a world where every student has access to that kind of thing? Because right now a subset of your students have access to that kind of thing, but we're getting very close to a time when every single one of your students will have access to a high-quality tutor. And so what does that mean?
And I think it's a big, important question that schools need to be thinking about right now.
Bill Stites 30:27
Eric, do you think that there's going to be a gap there between the haves and the have-nots with something like that? You know, I've been playing, as I like to say most people should do with these things, you've got to get in there and play. But in some cases, there's a fee wall that you're going to hit in terms of where these things go. I think about Hudson Harper, who was on the podcast talking about AI, and some of the tools he's developing on the back end: use this, but you need the paid version of this in order to do that. And I just think about, again, whether it's using Grammarly or any of these AI-based tools, there comes a point where we start looking at it and we're saying, okay, we provide everyone a laptop, we provide them all the software, but what are the things? You know, you mentioned tutoring. That's always been, you know, like that thing: can your families afford an independent school education and a tutor? And if the virtual tutors are now introducing these paywalls, where do our students end up at that point? And how do we address that?
Eric Hudson 31:28
That's a great question for all the tech directors on this call, like in terms of budget, and where you spend your money and where you don't spend your money. Like, I think right now it's very hard to imagine a school committing to a single vendor, because it's like the Wild West right now in terms of AI tools. But we're already seeing data out of Pew about an increasing socioeconomic gap between folks who know about and use AI versus those who don't. And I think that, in service of many independent schools' commitment to equity, you have to really address that question of how do we kind of lower the financial barriers to these tools when they get good. And also, you know, what is the sort of nonprofit or educational enterprise financial model that's going to work, that allows schools to adopt some of these tools in a way that protects their data? Because right now, I can't tell the business office at an independent school to upload spreadsheets into ChatGPT to learn how well it does processing data, because you should not be uploading proprietary information into the widely available chatbots. And yet the enterprise model of ChatGPT is prohibitively expensive for the vast majority of schools. And so this financial question is like a huge problem for schools. But I go back to sort of looking at your ecosystem of technology at your school, and maybe using this AI moment as a moment to audit what you've currently got, how well it's serving you, the utility of those things, and where AI can fit in. Teachers talk about this all the time: we can't do anything new until we stop doing something old. So what is the thing that schools might be able to stop doing, technology-wise, or at least integrate with AI, to make some of those tools more accessible, especially for the students who need them most?
Christina Lewellen 33:26
I mean, it's temporary, right? Like, the price is going to come down, and it will also be integrated into the tools we already pay for. So I do think that we're kind of on that bell curve. I think that it's gonna get easier and cheaper as we go, which is why it's also important that you're out there, Eric, banging the drum of: guys, pay attention to this now. Be proactive about these conversations. Because whether you want it or not, it is going to be, and already is, in a lot of our tools. And so it's not just like blocking or getting access to one thing, ChatGPT, which I always say is the gateway drug of AI, because it's just so easy for everybody to understand and use. And yet it is literally one of a bajillion tools that leverage artificial intelligence. So hopefully it won't be that pricey forever. But it's good to be on top of it now and think about it.
Bill Stites 34:18
When you look at the tools that schools are using, I mean, I'll put it this way: you're in two camps. You're either like a Google school, or you're an Office 365 school. And one of the things that you see is, like, with ChatGPT, you know, can students even use it? Do the terms of service allow that? So one of the things I would love to see, for instance: we at MKA pay for the enterprise level of Google for Education; we want all the extra features. At that point, I would love to see Bard included under the core services of their educational offering, and then apply the same types of privacy protections that you've got when you're using those education pieces with Google to those things. Microsoft can do the same thing with Copilot. You've got schools in two camps; they've already had policies around what they're doing with those from a data privacy standpoint; you're already paying for it; give us the tools. Because again, it's in those companies' best interest to do that. Similar to what Adobe does with making the pricing on the Creative Suite that much cheaper for schools: they want to hook you early, they want you to be used to using that tool, so it's a tool that you use when you go off to college or university, or when you go out. So I think the more pressure we can put on those two companies, in particular, to have those tools included in their educational suites, covered the way in which they do from a privacy and data protection standpoint, it benefits us, and I think it behooves them in order to build that base early on. And that's really what I'm hoping we see. Because too often, it's like, you know, we spend half our time now reviewing the terms of service and privacy policies to say, can we even use these tools? And it would be so much easier if we just had it bundled into what we're using. The two biggest companies out there, they know what they're doing, they can figure this out, just make it work for schools.
Hiram Cuevas 36:14
Yeah, Bill, I agree with you wholeheartedly. And I want to comment on something that Eric mentioned, and that is auditing the tools that you already have. So often schools just go ahead into the next fiscal year and pay for it. They don't even take a look to see what the usage history is; they don't take a look at whether you have the right number of subscriptions, seat counts, all that kind of stuff. It nickels-and-dimes school budgets to death. And one of the prerequisites that I try to make, when we have our meetings with the different divisions, is: before you introduce a new application, you've got to make sure that there isn't something that already does that very thing. It just can't be the new shiny. We've had people ask about using ChatGPT; they'd love to go to 4.0, but then, you know, at 20 bucks a pop, it gets very, very costly really, really fast.
Christina Lewellen 37:08
Yeah, absolutely. So Eric, I would imagine that when you're talking to educators, when you're talking to administrators, so just schools in general, it's hard to isolate this conversation of AI, because it is like a spiderweb; it touches everything. If you peel back the onion a little bit, even if they bring you in to talk about AI, and kind of where a lot of your thought leadership is right now, I'm certain that this goes deeper, right? Like strategy and different leadership issues, etc. So what are some of the other trends or issues that you're helping schools wrestle with, either related to AI or kind of independent of it?
Eric Hudson 37:45
Yeah, that's a great question. I'll start from the classroom and kind of work my way up. Schools have been working for a long time on: how can we be more flexible in our assessments? How can we be more personalized in our assessments? How can we connect our assessments to the real world? I think AI can be supportive in all those ways, right? As long as we're sort of helping students and teachers build proficiency in how to use AI in those ways. At the leadership and strategy level, I work a lot with schools on sort of drafting their AI policies. And a lot of that boils down to: what is their position on AI? What do they believe about AI? What do they think AI is good for? And I think that's a really important strategic conversation for schools to be having, which is, how does AI fit into your strategic plan? Even if you've got one now, or you're working on one, where does it fit? How does AI align, or not, with your mission? And again, this idea that AI is a tool: how can you think through your own mission and strategy through the lens of AI? And you know, we haven't really gotten into, from the strategic perspective, AI applications on the non-academic side of schools, like the operations side, the business side, the marketing and communications side. What are applications in those areas of schools that align to our school's mission and strategy? It's something that schools have started asking me to do a little bit more: present to non-academic staff, non-teaching staff, so that they can increase their literacy. Because, Christina, as you said, those folks are in the workforce; they do need to learn how to use it to stay competitive in their industry. And so from a leadership perspective, especially for heads, who are sort of looking at the entire ecosystem of a school, thinking about both the academic and the non-academic can be really helpful in sort of laying out the most effective pathways forward when it comes to AI.
Christina Lewellen 39:50
Yeah, I'm speaking with a lot of accreditation organizations at their events through the summer and into the fall, and it's more on operations than it is academic, where they want to know, like, what are the tools for the business office? What are the tools for admissions and, you know, enrollment? So I've been diving deep into productivity tools. I know what we use at ATLIS, but I also think that there's a lot more that would help. And this plays to burnout and work-life balance; there are health issues that we can address by becoming more efficient businesses. So I'm not surprised that you're getting that request, because I'm definitely seeing it too.
Eric Hudson 40:30
Yeah, AI's superpower right now is data processing, right? Like pattern recognition. You just think about competency-based education for employees and mastering things like Excel, or learning analytics, or whatever. AI can be so powerful in that regard, with students too. But to your point, I think the application is a little more relevant and probably a little easier on the non-academic side of schools.
Bill Stites 40:56
We're replacing our switch stack and our access points, and if our AI policy were so restrictive as to say that we couldn't use it, that would be a problem, because that's the main reason I went with the product that I went with: all the stuff it does to be more proactive. The number crunching, as you said, right? I need something, because I don't have the staff to be sitting there monitoring logs, looking at all these things. I need hardware and software to work together to monitor and to check all those things, to say, hey, Bill, take a look at this, Bill, take a look at that. Because, oh, I would love to be doing that, but I can't. So that's where, you think about all the offices, but on the IT side, on the infrastructure side of it, you can't avoid it, because it's only going to make us better at our jobs.
Christina Lewellen 41:43
ATLIS recently worked with a couple of other organizations in our space, the National Association of Independent Schools and the Independent School Chairpersons Association, to create a guide for trustees around AI: what kind of questions should they be asking? Where does the responsibility lie for trustees, versus things that are clearly more operational? We don't want to get into that space, but trustees also have a role. And I know that you were a fan of this product. We just recently launched it; it will be available to all ATLIS members, NAIS members, and Independent School Chairpersons Association members. What do you think about that, Eric? Like, is there a role for trustees to play when it comes to all this policy conversation and strategy setting around, you know, basically the school's philosophy of AI?
Eric Hudson 42:31
I love that document. I share it, I promote it every time I present to boards. I'm a good board member; I love that document. First of all, because that document really focuses on questions, and I think that that's what boards should be doing right now: asking questions of their heads and of senior leadership at schools, rather than providing solutions. I'm working with a board later this week where that is very explicitly the goal. The board chair said, I want us to articulate the questions we should be asking. Because from my perspective, the primary responsibility of a board is stewardship. And in order to be good stewards of an institution, boards need to be able to look ahead, and they need to exercise foresight, which that ATLIS document talks about a lot. They don't need to be operational; they need to keep their heads of school or their executive directors looking forward. And how do we raise those questions for heads? How do we ask questions that are a little bit bigger and a little bit broader? Especially because inside schools, on a day-to-day basis, it's very in the weeds when it comes to AI. What do we do about this kid who cheated? What do we do about purchasing GPT-4 for this teacher? Right? So I think the role of boards is to just pull the institution up a little bit, try to look at the horizon, and help the head balance the day-to-day decisions with the long-term strategic visioning. So I think there's absolutely a role for trustees to play in all of this, and they need to be educated, too, in order to be supportive of their heads or their executive directors in doing that work.
Christina Lewellen 44:11
Absolutely. You've given me a great pivot point, and that is: you are a great board member. You are a wonderful board member; you've been on the ATLIS board for a bit. So if I can, I'd like to ask you a couple questions about that. Because you have a tech background, you have a teaching background, and now you've shifted into this consultant role. I find you so incredibly valuable at the board table, and Hiram is a board member too, so I've got a witness. But one of the killer roles that you play for us is that you have helped ATLIS kind of re-envision our thoughts around leadership succession planning and how to get great people on the board. You're the chair of what we affectionately call our NomCom, our nominations committee. So tell us a little bit about that, and some of the work that you've been doing in that space. Because, you know, ATLIS is coming up on our 10th anniversary as an organization, and it was time to think about our leadership plans and strategy, and you have just carried so much of that project. So can you tell us a little bit about that?
Eric Hudson 45:18
Yeah, sure. You know, as I said, the nominations committee is a subcommittee of the board, and the sort of charge of the committee was to basically make sure that we were adding new, qualified board members on a certain cycle every single year. And this is my first year as NomCom chair, and my first decision was to kind of stop nominating people for a year, basically going against the job description, right?
Christina Lewellen 45:44
You called the timeout, right? Called the timeout.
Eric Hudson 45:46
Yeah, I called it. Because, as you said, ATLIS is coming up on its 10th anniversary, and it was time to really look at the way we compose the board on a year-to-year basis. Especially given ATLIS's strategic plan, especially given its DEI framework, how do we think about a nominations process that can be as open and as inclusive and as accessible as possible? And so what the committee did this year was look at our existing processes, look at our existing protocols for adding new members to the board, and basically decide that we wanted to move from the sort of networked practice of, "I know this person who would be great, I'm going to suggest that person," a very internal, almost black-box process, toward a self-nomination process through an open application, where we really invite the ATLIS community at large to nominate themselves and to encourage folks they know to nominate themselves. So that we're drawing a little more deeply and a little bit more broadly from the wider ATLIS community, and we're also being very open and transparent about the process that goes into joining the ATLIS board. And I think that's really important for an organization that is growing as fast as ATLIS is and is diversifying as fast as ATLIS is. We just want to make sure that the composition of the board is representative of the experiences and perspectives of the entire community. And so, you know, we're going to launch that new process next year. And I think it was a pause worth taking for the organization. And I learned a lot about organizational pauses in general, I have to say: the sort of relentless desire to keep checking boxes, versus the value of actually, very intentionally, stopping for a while and spending that time reflecting on what you do and why you do it. I learned a lot from that process.
Christina Lewellen 47:48
I think we all did; it was incredible. I mean, because we really only had kind of like one seat to fill. So rather than go through the whole process like we always do, we called the timeout, and then the NomCom was able to do all this incredible, detailed work, like stopping and thinking about it and doing this re-envisioning of the process. I can't wait to roll it out. I think it's going to be incredible.
Hiram Cuevas 48:11
And Eric, I'm supposed to ask you a question, but I want to pay you a compliment more than anything else. As a new board member, and actually as a member of the NomCom, I've been incredibly, incredibly impressed as to how thoughtfully you run that subcommittee. It is so clear to me that you spend the time thinking carefully, very carefully, about what you're going to say and what you're going to write; the intentionality behind it is really awesome to see. Because, you mentioned it earlier, schools are very bad at taking away, and we were successful at doing that. And I think so many of us can learn from that. Just by hitting pause, it added so much fresh air to that subcommittee that it allowed us to really grow in ways that I don't think we ever anticipated. So thank you.
Eric Hudson 49:01
Thank you. That's so nice. Thanks for being a part of it. It helps when you're working with a really good team, I have to say. It was a lot of work.
Christina Lewellen 49:10
It was. You know, everybody rowing in the same direction is always fun. Yeah, it was great. So we kind of started this podcast, and right before you joined us, Eric, I was asking Bill and Hiram about how they feel about in-person things, right? We're getting ready for the ATLIS conference. There's always going to be that, we have discovered, right? Coming through the pandemic, we've discovered there's always going to be a role for community in what we do as humans; we like to be together. That's kind of the nature of how we are. So now I want to ask you that question through the lens of AI. Some people are afraid of AI and think that AI is going to be, you know, robots taking over the world, right? And I think that AI in some ways could amplify the community that we're seeking. If we free up time for teachers, if we free up some of the mundane assessment that has to get done among our learners, does that create more opportunity for community? And I know I'm kind of leading you, so feel free to push back if you feel otherwise. But I think some people worry that AI is going to take away our humanity, and I just wonder how you feel about that.
Eric Hudson 50:18
I was going where you were leading anyway. So, you know, that's how I feel: it's not a zero-sum game. With all technological innovations in school, it's not an either-or. I really believe that AI can enhance the human elements of school. I go back to what I said earlier: if we know that the goal of these companies that own these tools is to have a high-quality personalized tutor in the pocket of every student, okay, but that's just one element of what school is, right? What are the other elements of school? The relationships, the friends you make, the mentors you develop, the social-emotional learning, right? The experiential learning. School is so much more than just instruction; school is so much more than just content delivery or feedback on work, right? And I think that schools just kind of need to think through, like, what is their value proposition in a world where AI is going to be able to take on a lot of that work. And I think schools' value proposition is going to be their sort of status as human-centered organizations. You know, if you look at any data or research on what students love and remember about school, it has to do with relationships; it has to do with meaningful experiences. It doesn't necessarily have to do with this one thing they learned in math in 10th grade, right? And so I think that's kind of the thing to be very aware of: we still are human-centered organizations, and we can really lean into that expertise and experience and integrate AI in ways that really elevate it, so that those two things don't necessarily have to be in conflict.
Christina Lewellen 52:09
I can see why you have such an avid following on LinkedIn; you just bring such clarity around this. I mean, I don't know if you guys, Hiram and Bill, are feeling pretty inspired about all this. But really, it does in some ways give us a vision of what we're pointed toward. We don't need to be afraid of all this. And Eric, it just doesn't surprise me that people are flocking to your thoughts, and I hope that you continue to feed the beast, because we're definitely inspired and encouraged by the things that you're saying. So this is incredible. I really appreciate it. Oh, thanks.
Hiram Cuevas 52:41
Let's just say I was very jealous to see that he was at MKA.
Bill Stites 52:47
Oh, no, we were very lucky to have him. That was for sure.
Christina Lewellen 52:51
There's always this rivalry. Are we fighting over Eric now? Do we need a custody agreement?
Eric Hudson 52:56
Yeah, be nice, boys.
Bill Stites 52:59
It was great having him here. Like I said, the time he had with the English department, but also just the close that we were able to have with some of the leadership team in the different areas, bringing everyone together. You know, it was able to pull everything that happened during the day together, but also open some further conversations to really plan the path forward. It was fabulous having him here, and it's fabulous having him on the podcast. Just being able to get those moments and that time together, I think, is always important.
Christina Lewellen 53:24
Yeah, I'm feeling inspired. Eric, I hope that you will come back, because this is definitely changing, and quickly. So I have a feeling that even if we have a conversation in six months or a year, it'll be a bigger and different conversation. So please come back and see us.
Eric Hudson 53:40
Yeah, I'd love to. Thank you so much for having me. This was a great conversation.
Narrator 53:46
This has been Talking Technology with ATLIS, produced by the Association of Technology Leaders in Independent Schools. For more information about ATLIS and ATLIS membership, please visit atlis.org. If you enjoyed this discussion, please subscribe, leave a review, and share this podcast with your colleagues in the independent school community. Thank you for listening.