Navigating the Impact of Artificial Intelligence on School Programs
The impact of Artificial Intelligence on education demands that our professional practices adapt. Join Kelsey Means and Christopher Esposito to explore the profound implications of AI for school libraries and learning. We will delve into the core challenges AI presents—specifically, the critical need to address bias, promote deep AI literacy as an extension of digital literacy, and ensure accountability in student work. Simultaneously, we will highlight the exciting opportunities AI offers, such as fueling creativity and creating a deeper impact on student learning. Discover how our school is actively addressing these issues, from establishing a Responsible AI in Learning (RAIL) task force and revising AUPs to offering professional development and integrating tools like Gemini AI for students and teachers. Attendees will gain practical strategies for navigating these changes while keeping AI privacy and safety front and center.
Transcript
Welcome, everyone.
Wonderful to have you with us today.
We are going to be talking about navigating the impact of artificial intelligence on school programs.
I think you're really going to enjoy today's sessions.
We have some fantastic presenters with us.
So we've got Chris, who is the Director of Technology at Lancaster Country Day School, and Kelsey, the librarian there, and they're going to co-present for us today.
Chris, Kelsey, welcome.
We're so thrilled to have you with us today.
Thank you, Ashley.
Yeah, thanks for having us.
Yeah, we're happy to be here, and we're happy to share what we've learned on our journey through this, with everybody else and kind of hear what everybody else is doing as well.
All right.
So we already kind of got introduced, but I'm Kelsey Means.
I'm a library media specialist, but also an ed tech instructionalist as well.
This is my second full year at Lancaster Country Day.
I'm also a PhD student at Towson University.
I'm studying instructional technology.
Before coming to LCDS, I did about eight years in public school, so I'm still very new to the independent school world, but happy to be here.
And I'm Chris Esposito.
I'm the Director of Technology here at LCDS.
This is my second year at LCDS.
Prior to that, I was at Charlotte Latin in Charlotte, North Carolina, for six years, as their IT manager.
And before we get too far, I want to make sure I'm going to put in the chat a link to our live presentation, because there will be some links throughout you might want to explore as we go along.
So before we dive into what we're going to talk about today, I just wanted to just give a little background information on our school.
So for those that are not familiar, Lancaster Country Day School is situated in beautiful Lancaster County, tourist destination of Pennsylvania, and the school sits on a 26-acre campus.
It was founded all the way back in 1908, and one of the unique things about our school is that we pride ourselves as being one school under one roof.
So we have preschool to 12th grade all in one building, and it creates a great family atmosphere here.
On average, we have about 600 students total, ranging from preschool to 12th grade.
All right, so that being said, what are we going to talk about today and what are we going to discuss? The goal is to cover four things.
So we want to touch on policy creation, AI philosophy, how to use AI in the classroom, and some helpful tools at the end that maybe you can take back and use in your classrooms as well and your schools.
I preface that by saying neither of us are experts in any of this, but we want to share what we've learned along the way, on our journey since we've been here with AI and kind of hear what you guys are doing in your schools.
And hopefully we both take something away from this.
So before we get into the nitty-gritty, we just want to preface this by saying, here's kind of our overarching philosophy for this presentation.
AI is not going anywhere, and as with any emerging technology, our professional practices have to adapt.
This is no different than any other really big tech change, like the internet or introduction of one-to-one devices.
And even if we were to say, "No, we don't want to touch generative AI at all," it's increasingly being integrated into everything we do.
Every application we use, every digital tool, we're seeing it pop up pretty much everywhere.
So we will focus a little bit on how to get our teachers on board.
But it does have to start with our institutional policy.
All right.
So we're going to take a little journey, a policy journey, through where we started, where we are now, and where we're headed.
So, before we get into that, I just want to say that from a classroom perspective and an IT perspective, the only AI that is allowed to be used here is Gemini, through Google Workspace for Education.
All of the other ones are blocked or should be blocked.
So when we decided to adopt an official AI for the school, that led us to look over our policies, review them, and see what we needed to change and where we needed to go.
As of right now, only middle school and upper school have access to use Gemini, but obviously things can change.
We really would love to get more of that in lower school, but Kelsey will touch on what is appropriate and what's not appropriate for the lower school students later on in this.
So last year, we started in earnest with AI policy review by participating in RAIL, which is Responsible AI in Learning, an accreditation program through the Middle States Association.
So for our RAIL team, we picked, in addition to Kelsey and myself, one person from each division.
So we had someone representing lower school, someone representing middle school, and someone representing upper school, so we can get everyone's thoughts and different perspectives about how it's used, how they feel it should be used, and where it should be going.
So we worked for months, I think it was six months- Mm-hmm ...
on that, kind of digging through what was out there, what other schools were doing from a policy perspective, and we crafted the framework of what the policy here at LCDS should be for students and their families, as well as staff and faculty.
Once we had that framework in place, we broke it into a smaller group called the AI Task Force.
So the AI Task Force was, again, Kelsey and myself, with one representative from each division, but this time we included students, and we had a student focus group that had three upper school students.
And it was really interesting to hear their perspective.
They actually brought a lot of great ideas that we weren't thinking of- Mm ...
from their view, and a lot of what they said was very eye-opening about how it's used in the classroom, how we need to be clear about what we're doing with AI.
And it was really, really great.
The plan is to restart the task force next year as well as the student focus group.
So what we did is we provided a baseline for the professional development that Kelsey and I later did for faculty and staff so that everyone had a basic understanding of where we were starting from.
So that was last year, '24, '25.
So bring us into 2026.
So we're still at it.
This time we've partnered with a group called Consylium, which is basically AI strategy for educators by educators.
We've had a few meetings with them so far, and I have to say, it's really, really great- Yeah ...
the stuff that they've provided, the information that they've given us.
Sometimes I feel like, and you guys may feel like this too, a lot of the AI stuff that we hear is just regurgitated information, and there's only so much you can say.
But it's really nice to hear it from educators because I think that puts a fresh perspective on how we're doing things.
So far with them, we have looked at our policy from last year, got a little feedback on what needs to be tweaked, what can stay, where we should go, where we should edit and change that.
And we also learned about effective AI policy and the life cycle of a policy.
That should always be rooted in the mission and the philosophy of the school.
And you don't want to make it so detailed that, because AI is changing so rapidly, you have to go back and edit that policy every week or every month.
But you also don't want to make it so broad that you can't protect yourself and your students from what is coming out there.
The plan is to keep working with them and, like I said, reestablish the task force, and Kelsey and I have some more PD planned for faculty- Mm-hmm ...
staff in cooperation with Consylium in the fall once everyone gets back.
So, I wanted to share with you what our policy looks like right now.
Obviously, it's a little in flux.
But I have linked here, and if you follow the link to the presentation in the chat, you should be able to open this link.
I basically just took out our AI policy and put it in its own Google Doc so you could compare to maybe what your school has.
So the family and student handbook has a policy.
We're not going to read it in detail, but feel free to take a look.
And we also have a separate policy for faculty and staff.
But in addition to that, this school year, every teacher was expected to put some type of AI statement or policy in their syllabus.
And the departments worked together to kind of make sure they were aligned, like all of humanities kind of worded theirs similarly, et cetera.
We tried to, and I'll talk about this a little bit more in-depth later, but we tried to discourage zero-tolerance policies.
That was feedback we got from the students was, "Well, if I'm not allowed to use any AI, am I allowed to use spellcheck?" To them, it was very confusing to take a zero tolerance stance.
They really wanted us to be as explicit as possible, even assignment to assignment if the teachers can.
So this here is just an example of one of the syllabus AI policies that one of our teachers used, and we kind of used this one as the pilot to show other teachers how to create that.
Which brings me to: let's take a poll.
Again, this should be linked under "Let's take a poll" if you're in the live presentation.
But we just wanted to kind of see, I know policy was slow going.
Trying to get out of it here so I can show you what we're looking at.
Policy was slow going with creation here, and I know it was for a lot of other people.
So I'm just curious, if you want to put in the Mentimeter here.
Let me present it so I can get you the code again.
See if you have a policy.
Oh, yep, I can relink the presentation in the chat.
I just saw that.
There we go.
And this is the time, if you want to make any comments about your policy or your policy process, feel free to put it in the chat or even unmute.
Yeah.
What do you guys think is a good way to create policy? Mm-hmm.
What's a good way to look at this stuff? And for those that have voted yes, and that's really good that a lot of people are voting- Mm-hmm ...
yes, that you have an AI policy, when was the last time your AI policy was revised? Mm-hmm.
And how often do you guys revise it? And again, you can put it in the chat or even unmute if you'd like.
While we're doing this, I also want to add that, so we're a one-to-one school.
We have iPads for first grade through eighth grade, and then, new next year, we're going to do MacBook Neos for all the upper school.
Mm-hmm.
But each family that gets a device has to sign the AUP/RUP as part of receiving it.
So the child has to sign it as well as the parent/caregiver has to sign it and acknowledge that so that we make sure that these things are being read- Mm-hmm ...
by families, by the students themselves, and then that gets handed back to us.
Because sometimes if you bury this stuff in the handbook, it's easy to get overlooked when you're trying to fill out a bunch of paperwork.
But we made it so in order to receive a device, that this is something that you have to read, you have to acknowledge, and both parties need to sign so nobody could say that they were not aware of any of these policies.
Because to use a device, especially these days, is a big responsibility for kids, and they're putting a lot of power in their hands, so we want to make sure that it's being used for educational purposes and it's being used responsibly.
It looks like a lot of people have similar policies to us.
Yeah, this is good.
Yeah, this is great.
Thank you so much for sharing.
I'm going to go back to our presentation here.
So we touched on this a little bit already, but here just our policy philosophy so far.
Obviously, AI should be used intentionally.
This is something we say a lot about all technology.
We want to use AI intentionally.
Like Chris said, school policies should be just right, not too broad, not too specific, Goldilocks, if you will, as the librarian here.
And like I said, zero tolerance is not helpful.
And this was, I think, the most important bit of feedback we got from students was that when teachers say, "Well, you just can't use any AI," it's super confusing to them.
They want to know explicitly what they can and can't do with AI, and they explicitly want to know what are the repercussions if they have been found to be using AI when they're not supposed to.
So that was our goal with teachers and their syllabus policy.
I will also mention here, we're not touching a ton on AI bias or the environmental impact, but our students are very concerned about the environmental impact of AI.
Just this year, just talking to students about AI, that is their number one complaint.
If I have a student that is severely against the use of generative AI- Sure ...
it's most likely because of the environmental impact.
So we didn't want to go through this without acknowledging that.
We are well aware, and again, that's one of the reasons why we stress intentional use of AI, because of that environmental impact.
So, wanted to just mention that.
So let's talk a little bit about AI in the classroom.
You'll notice in bold at the bottom of the slide, I put, "This is super overwhelming for teachers." At least that's what we're finding at our school.
I'm sure you're finding at your school as well.
Any new technology like this that redefines how we teach is going to be a huge adjustment for our teachers.
Again, think introduction of the internet, introduction of one-to-one devices.
These concepts that I'm going to talk about, I feel like they really apply to any new technology, so that if we can do this rollout right for our teachers, we will prepare them for whatever the next big thing is.
We will be prepared to know how to roll it out.
They will be prepared to trust that we have their backs with any new emerging technology.
So the first thing we stressed with our teachers is redefinition of base skills.
So what skills do our students need to know now that this AI tool is available to them? First of all, what skills do we want them to know how to do without AI? What do we want them to know how to do without needing to touch AI at all? But then also, once they master that skill, are we saying, "Okay, you can use AI for that now?" And again, this applies to different disciplines.
This is something that departments have to look at.
We also need another layer of information in digital literacy.
We're all teaching digital citizenship.
We're all teaching information literacy.
But now we need another layer of that, and we'll get into a little bit of the nitty-gritty of that a little later.
But I think it's reassuring if you feel overwhelmed with this, to say, "You're already doing it." We just need to add in a little bit more to specifically address AI.
Prompting an AI is not the easiest thing in the world.
It actually requires students to use some higher level thinking.
I'm going to give an example in upcoming slides about a social studies teacher who scaffolded this really well for his students.
I can't take credit for it at all.
I was just observing in the classroom and really loved how he actually created the prompt for students, so that he could test it out beforehand to introduce them to using AI, by saying, "Okay, here's the prompt.
Copy and paste it into Gemini.
Here's what we're going to get as a result." And I thought that was a really neat way to scaffold for students.
So some strategies to get teachers on board, because we're always going to have teachers or educators or even administrators who are going to say, "I don't want to touch that.
I think it's awful.
I just don't want to deal." The first thing, and this is easier said than done sometimes, is creating that baseline understanding.
What is generative AI? How does it work? What are the pros and cons? Being very transparent about that and saying, "We know that this is a really cool tool, but also we know that it holds a lot of bias.
That it can have students cheat fairly easily.
We know that it has a huge environmental impact." So weighing the pros and cons with teachers is really helpful and being extremely transparent about what AI tools we are okaying to use, why and why not.
We've had a lot of those conversations here at LCDS, a lot of ChatGPT fans, but- Mm-hmm.
Sometimes they prefer one to the other.
Yeah.
And I think part of explaining it to them is that it's easier for us to manage.
There's a ton of reasons why we would want to steer them to one.
Some of them record personal information and some do not, and a lot of times, the teachers don't understand the finer details of these tools.
Yeah.
Thinking about data privacy- Mm-hmm ...
a lot of times is not something that comes to a teacher's mind, but as an ed tech- It comes to our minds ...
and an IT, it comes to our minds immediately.
Also, identifying those early adopters.
Yeah, I love the question about vetting tools.
I'm actually going to share a really helpful tool here at the end, but- Most of the time, community members do run things by us.
So we have- Yeah ...
if they want to use something, there's a form on our intranet where they would have to request access to such and such.
And that would then be vetted through Kelsey, and then it would then be vetted through me and my team once we deem it safe.
And Kelsey will touch on certain factors that we use to determine what is safe, and it will vary by student age as well.
Yeah.
We're also planning on doing a complete app review this summer.
That's basically the one thing I'm going to accomplish this summer probably.
So we're looking at COPPA compliance, data privacy, ads, all the things.
Yeah, sometimes apps change too.
That's the other thing that we want to make sure we're on top of.
Yeah, be very aware of if you're using an app now, maybe they add an AI component in, which is something that we found out that an app that our students were using in middle school, they added an AI component, and no one was aware.
So now that changes how things are viewed from a student privacy lens and things like that.
So always being able to review what's in use on student devices is key because- Mm-hmm ...
like we said earlier, the technology is changing all the time, so we have to make sure we're keeping up with it to make sure that what we're allowing to be used on student devices and to be used in the classroom is compliant.
And we plan on doing that every summer, making sure we have a Google sheet of the app list of what apps are on which students' devices, like by division, but also by subject.
And then I plan on taking all of those and doing an extensive review this summer.
Hopefully to be able to catch, especially if any apps are going to update with an AI component, I want to know if at all possible.
I know that's kind of whack-a-mole, but having an annual review process, we hope will help us keep up with that for sure.
Back to our teachers, it's really helpful if you can get some early adopters on board.
We were able to do that when we-- Actually, funnily enough, the early adopters were ready for Gemini before our department was, which is actually kind of funny when you think about it.
We were going to roll it out very slowly, and we had a couple science teachers who were like, "Oh, I want to use that like week one." And we were like, "Oh.
Oh, okay." So, early adopters really can help spread the word.
Once you've worked the kinks out with the early adopters, then using that as a showcase to other teachers who might be a little more hesitant, can sometimes work to get more teachers to use this in the classroom.
But it's really all about relationship building, for sure.
And sometimes word of mouth is the best way that you can spread this stuff.
You could conduct tons of PD and whatever, but until a teacher sees it used in the classroom by another teacher, they'll say, "Wow, we can do X, and we didn't know that this is something that you can do with this tool." And that's kind of how buy-in spreads.
Yeah, absolutely.
So one of the things that was put on our radar is some type of menu or scale about how AI is being used in the classroom.
We have not officially adopted one yet.
I'm going to show you two different kinds.
This one is a traffic light, where green means encouraged, yellow means permitted with guidance, and red means not permitted at all.
And then the bullet points underneath give examples.
So for green light, this would be brainstorming and outlining, faculty using AI for lesson planning or feedback, students using AI as a study partner to practice language, explore ideas, or staff using AI for those administrative tasks to save time.
Yellow, again, where we need to have maybe a little bit of caution, AI assisted writing, which I know is a huge concern with our English humanities teachers.
But you can do this with proper attribution and reflection.
A lot of times, examples I've seen, because we're not quite there yet with LCDS, but I've seen teachers who require the entire chat be submitted.
Or like for my PhD program, we have to have an AI statement that says what we've used AI for while writing our paper.
That's the standard.
So this, I believe, is going to be an academic committee discussion first, and then we're going to be polling our teachers to figure out which menu or traffic light makes sense to them.
I'm guessing we're going to start more with this traffic light because it's very clear-cut.
When I show you the next menu, it's a little more in detail.
I think maybe we need to grow our understanding of AI to effectively use a more detailed menu.
This I feel like is very clear-cut, which at the moment I think is what our teachers are going to want, but we have not landed on one yet.
Let's see.
Using AI tools to process student data with proper vetting, that's huge.
Like Gemini, again, is locked down, so we can feel confident using Gemini, but we would not want our teachers putting any student data into ChatGPT or something similar.
Research support.
The librarian in me, but I love using this as research support, asking for primary sources.
Did I miss any sources? Here's my bibliography.
Am I missing something on this topic? Is super helpful.
And then AI-generated visuals or content with some type of disclosure.
I think we had a question in the chat that Chris is going to look at.
And then obviously red would be not permitted.
So that would be things like submitting AI-generated work as your own, sharing student information with unvetted AI platforms, using AI to complete assessments, or any type of unsupervised AI use, but especially in lower school, that can be pretty dangerous for those students under 13.
So I'm just going to briefly show you, this is Kincaid School's approved menu, and you can see at the bottom they use Flint, Canva, TypingMind, and other AI tools in Adobe Creative Suite.
This was the example that was given to us.
I really like it.
I think we just need to grow a little bit until our faculty is ready for it. It is a lot more detailed- Yeah ...
and this one had a lot more nuance.
So this identifies AI as what role AI is taking when you're using it.
So is it a thought partner, an editor, an analyst, or a co-creator? And then at the bottom of each, it says, "Students must disclose AI usage, copy of the chat must be submitted." And then as you get to analyst, they actually have to cite the AI, which you can now do in MLA and APA, fun library fact.
And again, for each of these, copies of the chat is required.
And I think that's a really good system to get into.
Yes, you may use AI.
I need to see the copy- Your prompts and chat ...
of the chat submitted with your work.
I think that's more than fair of us to ask of our students.
So let's talk about some real examples of how AI's being used in the classroom so far.
These are just touching on each division here.
Math and computer programming, we know that AI could totally write code, but our coding teacher is like, "I want the students to learn it from scratch, but they can check their code against the AI to see if they have any mistakes or if they get stuck." Really, they're using it here as a study partner.
Same as practice problems for math class.
She'd be like, "If you're stuck on a problem, give the problem to AI, but ask it to create and walk you through a similar problem so you're not just getting the answer to that problem," which I honestly thought was a very creative use of AI as a study partner.
I touched a little bit on our middle school social studies teacher.
He had students working in Gemini to create outlines.
However, they did not give their information that was going to go into the outline to the AI.
They copied and pasted a prompt the teacher created into the AI.
The AI created the outline, and then they plugged in the information.
Which again, I thought was a really creative use of AI to help you, but not to do your work for you.
Middle school science totally wanted to jump into this, which I love.
They used it for brainstorming for science fair ideas, which they had to be really careful with their prompts.
It was actually a little funny because I was in the class while they were doing this, and the way some of them prompted Gemini, it would not give them a response.
So I just want to touch a little bit on these AIs. Gemini is what we're familiar with, but I know there are others that have an educational lockdown version of AI.
It will not give them certain information.
If you ask them controversial questions, unsafe questions- Religious, politics ...
religious.
Things like that.
Yeah.
So it was kind of funny watching them try to get science fair ideas and Gemini saying, "That doesn't sound safe," or, "That doesn't sound appropriate." And so they had to really rethink their prompting.
Which is good in the long run- It's great ...
because it helps them think.
Some of the complaints we get about Gemini for education is that it's harder to prompt, but in a way that's actually good because it forces you to think about your prompting and be really strategic and intentional.
I also did a lesson with this science fair crew, about research questions versus their hypothesis and how we research the science behind their science fair projects, and then we allowed them, once they were about halfway through creating their research questions, to plug them into Gemini and say, "Take a look at these, suggest any changes, suggest additional questions," which I thought was really cool.
And my librarian counterpart in lower school I know has used the AI, like the magic image creator in Canva with fifth graders who were doing a biography project.
It was actually a really cool project.
They ended up creating little graphics that looked kind of like Pokémon cards, but for their person that they were reading the biography for and doing research on, which was pretty cool.
So this next slide is just our hopes and dreams of where we're going to start with AI literacy in each division.
Starting with lower school, we really want to try to do as many unplugged activities as possible.
I feel like our goal is to be even more intentional with lower school tech use.
So we'd like to lay a machine learning or coding foundation with them without even touching the computer, and code.org is great for that.
We also start with the basics of digital citizenship concepts.
I know they learn that in library in lower school.
So again, it's just adding that extra layer of AI literacy by laying that machine learning and coding foundation so they get the general idea of how a machine learns, how the AI works.
In middle school, working on prompt engineering, which is great higher-level thinking skills for our middle schoolers, and as an editor or brainstormer, kind of like the science fair example I gave.
In upper school, we'd love to see it used as research support, editing their writing, and a study and feedback partner.
I've gone into at least one upper school classroom to show them NotebookLM, which is one of my favorite AI tools, to show them how to use it to study or even just to dissect an academic journal article.
NotebookLM, if you're not aware, is another Google tool that allows you to provide the AI with the sources it should pull from.
It does not search the open web, which means you don't have to worry about hallucinations.
And I love that when you ask it questions, it will link back to your source to show you exactly where it found the answer to that question.
So it's a great study partner.
They keep adding tools in there.
You got flashcards, you got your podcasts.
It's really good for analyzing videos too.
Yes.
If you have a bunch of videos on a subject, and you pull those links from YouTube, you can dump them all in there and it'll basically go through them all and give you a summary of what all those videos were talking about.
I've talked a lot about Gemini, but honestly, NotebookLM is probably even better for the students than Gemini, simply because it eliminates those hallucinations, and it's a great study partner.
Highly recommend it if you haven't checked it out.
So our last slide here is just some helpful tools that I'm going to take you through, so give me a moment.
The first is this lovely Google sheet, and you can feel free to click on that link.
It should work.
This was shared with me by our IU13.
We have a group of tech integrators that get together every other month, and this was created by Sarah Wood, who's an ed tech, I believe in Michigan.
Don't quote me on that.
This is a live living document available to anyone who wants to use it, and it basically is a privacy and safety check on different AI tools.
I like going to this Usage tab, because it shows you by grade level what we should use with caution, what completely passes the safety check for each grade band.
So for example, I'm going to pick one that is with caution here.
Let's see.
For School AI.
School AI, yeah.
School AI.
So we'll go to that sheet.
Here it says why it's with caution for each one, but as you scroll down, they've actually picked out the privacy and safety policy.
They actually pull out the pieces from the policy that are concerning, and they have it color-coded over here, yellow, red, like, "Ooh, that's really not good." And as you scroll down, right here, personally identifiable information is collected, so that's a red flag.
It's just they've already done a lot of the work for you.
As someone who has to read lots of privacy policies when teachers ask for new apps, this is great.
Yeah, this is a great starting point to go with- Absolutely.
And it is updated, like Kelsey said, it's a live document.
It is updated pretty frequently to keep up and to keep pace with- Mm-hmm ...
how everything else is changing and being updated.
I know I've already used it at least once or twice when admin has come to me and said, "A student's trying to use Copilot.
Can they use Copilot?" And I'm like, "Well, no, because regular Copilot, you have to be 18 or older." And so instantly that just shut it down.
And being able to have this document to just look up to see if a certain tool is in here is really, really lovely.
And what's good about this is not only if you do need to give an explanation of no, it gives you the why.
Yeah.
So that it's not just a no, but you can explain why, and you can see why.
Yeah, absolutely.
All right.
I thought I had Common Sense open, but I did not.
So hopefully everyone's aware of Common Sense Media.
I love Common Sense Media.
I just linked to their AI literacy lesson collection.
I like to take a little bit from each of these and mash it up for whatever grade level I need, which is great.
I'm doing the same thing with their copyright stuff, which is lovely.
Code.org, I told you, has some unplugged activities for K to five.
Again, this is somewhere to pull from each lesson you like and mix it up.
These are great activities to do with lower school students, and you don't even have to touch the technology, which I feel like is the best selling point for this.
They also do have an entire AI and machine learning section on their website.
This is not completely unplugged, but it gives great machine learning introductions and will walk you through exactly what you need to do.
Google has two, I guess, kind of games that I recommend; I did these with grade level meetings, just as a very short introduction to AI.
The first one is Say What You See, and basically it's like a little game.
It gives you an example, a picture here, and then you have to prompt the AI to create a picture that's as close to it as possible.
And then it will grade you on how well you do.
The second one is Quick Draw- Like Pictionary ...
which is kind of like training the AI to do Pictionary.
It's actually quite fun.
But this is the world's largest doodling data set, and it's shared publicly to help with machine learning research.
So it's probably going to play some sound, but you try to draw these as fast as you can and see how quickly the neural network can pick it up.
I see hula hoop or shoe.
Oh, I know.
It's face.
Okay.
So it already got it.
It's really fast.
It is really fun.
At the end, after you draw six of them, you can actually go and look in the database and see what other people have drawn of those objects that it prompted you to draw, which is pretty cool.
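The Quick Draw dataset mentioned above is published as newline-delimited JSON in Google's "simplified" format, where each line is one drawing record. Here's a minimal sketch of reading a single record; the sample drawing itself is made up for illustration, and the field names follow the publicly documented dataset layout:

```python
import json

# A synthetic record shaped like one line of the published "simplified"
# Quick Draw ndjson format. The drawing coordinates here are invented.
sample_line = json.dumps({
    "word": "face",
    "countrycode": "US",
    "recognized": True,
    "key_id": "0000000000000000",
    "drawing": [
        [[10, 50, 90], [20, 5, 20]],   # stroke 1: list of x-coords, list of y-coords
        [[30, 35], [40, 45]],          # stroke 2
    ],
})

def summarize(line: str) -> dict:
    """Return the label, stroke count, and total point count for one record."""
    rec = json.loads(line)
    strokes = rec["drawing"]
    return {
        "word": rec["word"],
        "recognized": rec["recognized"],
        "strokes": len(strokes),
        "points": sum(len(xs) for xs, _ys in strokes),
    }

print(summarize(sample_line))
```

In the real dataset each category (face, shoe, hula hoop, and so on) is a separate file of millions of such lines, which is what makes it useful for machine learning research.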
So we just want to make sure everybody could access those.
So if you can't, please let us know.
Yeah, students love it.
That's great.
It does get competitive.
Yeah.
It really does.
And I just think it's such a fun way to teach a concept.
Any way we can gamify things a little bit, is great.
So that's kind of an overview of what we're doing, where we are now, where we- How we're doing it, yeah ...
hope to go.
I'm sure this presentation has morphed many times as we've learned more throughout our experience here at LCDS.
But we would love to answer any questions or hear about what other schools are doing.
We love to learn from each other.
Oh, yeah.
Definitely Google and Gamified.
But I think the prompt engineering aspect of the Say What You See, I think I was doing it with 9th or 10th grade last year, and they were really determined to get good prompts in there so they could match the picture. But yeah, feel free to put any questions or comments in the chat or even unmute.
Yeah.
Yeah.
Has anyone seen a return to some more non-tech related assessment? For example, handwriting.
Some of our teachers in the literature department are using blue books a little more for baselining, because you can tell that that's the original student's work.
You get a sense of where their skills initially are, and then after that they might move to typing, and then maybe use AI later in the process once they've typed up that initial draft.
I know our humanities department is doing this, a lot of handwriting in class as writing samples.
And we've seen a return to handwriting in lower school as well.
We are definitely seeing a less tech approach, and it's a conversation that we've had that's ongoing.
Yeah.
I don't know if you wanted to say anything about that.
I think across the board there's been kind of like a, not so much a rejection of a lot of this technology, but more of like a return to basics- Mm-hmm ...
with stuff where technology is still there, but it's not the key driving thing in a classroom.
Which it should not be.
It should help augment and help be a tool, but not the main thing.
But there seems to be, especially in lower grades, I would say.
Oh, sure.
Like, I have two lower school students myself, and in their school the iPad, the apps, and AI are not the main focus, nor should they be. But that seems to be a trend, if you want to use the word trend, which I don't really want to use, but that's how I would describe it right now.
Yeah.
And hopefully we can find the right balance of this stuff, where there is more back-to-basics stuff and the technology, which still has its place, is used more as a tool at the right time- Mm-hmm ...
to teach a certain skill.
And I think we have a leg up here in independent schools with smaller class sizes.
Teachers can get to know their students' writing...
I know writing is a huge concern- Mm-hmm ...
can get to know their students' writing a lot more. I'm coming off eight years in public school, where it's even harder to figure out who's writing what, who's using AI, who's not using AI.
Revision history is a tool that I know teachers would use quite a bit when I worked in public school, and I've shared it with our teachers here at LCDS.
It basically allows you to see a video of the student's writing.
You could simply do this also in a shared Google Doc and look at the revision history, and if you see them copying and pasting large chunks of text-
I'm sorry, we got an echo.
I don't know if it's me.
But- Echo back? Yeah.
If you see them copying large chunks of text, that's kind of a, "Hmm, where did that come from?" And some teachers have told me they've actually asked students.
They're like, "Well, I wrote in another Google Doc," and they're like, "Okay, then show me the other Google Doc." It's a good place to start if they're writing digitally.
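The "large pasted chunk" signal those teachers are looking for can be stated as a simple heuristic. This is not how the revision history tool itself works internally; it's just a toy sketch of the idea, assuming you have two successive text snapshots of a document, with an arbitrary threshold chosen for illustration:

```python
import difflib

def largest_insertion(prev: str, curr: str) -> int:
    """Length, in characters, of the largest single block of text present in
    `curr` but not in `prev` — a rough proxy for one big paste between two
    snapshots of a student's document."""
    matcher = difflib.SequenceMatcher(a=prev, b=curr, autojunk=False)
    biggest = 0
    for tag, _i1, _i2, j1, j2 in matcher.get_opcodes():
        if tag in ("insert", "replace"):
            biggest = max(biggest, j2 - j1)
    return biggest

# Arbitrary cutoff for this sketch; a real check would be tuned per assignment.
PASTE_THRESHOLD = 300

draft_v1 = "The novel opens with a storm. "
draft_v2 = draft_v1 + ("A long analytical paragraph appears all at once. " * 10)

if largest_insertion(draft_v1, draft_v2) > PASTE_THRESHOLD:
    print("Large one-step insertion: worth a conversation with the student.")
```

The point is the same one made above: the flag isn't proof of AI use, just the first clue that prompts the "show me the other Google Doc" conversation.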
That was a good question.
Thank you.
Any other questions? Or not only questions, you guys have any- Yeah ...
experiences of how this stuff is going on in your schools or how, let's see.
I haven't seen a huge shift in what's classwork and what's homework.
But again, this is only my second year at LCDS, and we, I'm thinking of upper school, give a decent amount of homework.
So I don't think that has shifted a lot except, like I said, humanities, they've been doing a lot of writing samples, actually physically writing on paper, which I could see that continuing.
That gives the teachers a good idea of where each student is at for any papers they then type on the computer.
Mm-hmm.
Yeah.
And they were really excited about revision history.
I know the humanities teachers were excited about turning in work that way, so they could kind of get an idea if, at least the first clue, if they're using an AI tool they shouldn't be.
I think there's a tension that some of our teachers are having, and they've expressed to us around not just the tool aspect, but also just the complexity of trying to apply good pedagogy and good understanding of cognitive development and learning to a tool where a lot of the research around it is so new- Mm-hmm ...
and trying to navigate that.
And one of the things that we've actually been tasked with in our revision of our policy is to try and fold some of that, as best we can, into the policy, to try and alleviate some of that cognitive load for our teachers: to be able to discern what kinds of AI use are going to have what kind of effect on- Mm ...
student learning, and where the suggested red lines are, based on the kinds of tasks and learning that students will be engaging with, potentially using AI on the side.
Mm.
So I think that's a really challenging question, because we're kind of trying to predict the future. And the research is just catching up with a lot of this.
There's some research available now, but it's trying to synthesize a lot of the historical research that the education community has built up over a long time around how students learn and develop, and to synthesize that with this new technology that's changing the way that students think in such a fundamental way if it's used wrongly.
Yeah, that's a huge question.
I can tell you there's more research coming, I'm sure, but that's too little, too late sometimes.
I would say almost everyone in my PhD program who's doing their dissertation research right now has some AI component in it.
It's the hot topic.
I would say it's a redefinition of those base skills. I'll give a library example; this is a very small example, but we do teach this in upper school.
But for the most part, we have NoodleTools, which walks the students through how to create proper citations in different citation styles: MLA, APA, Chicago.
I remember learning how to actually write out the citation, how to type it out, opening my formatting book and flipping to the right thing.
And so what we as schools, as departments, as teachers have to ask ourselves is: what base skill do I definitely need them to know, and what is okay to speed up with technology in general, it might not even be AI.
But yeah, that's a huge question that I definitely could not answer in the next 10 minutes or so.
Yeah.
Thank you.
Yeah, I know it's a huge question.
But it's just something we're struggling with, and we're trying to fold it into our policy as well, so to try and relieve the cognitive load.
And I think that's what some teachers are saying: "We're choosing the red option because of that." Mm-hmm.
"Not because we don't want to use it, but because we want to do this really well, and we're worried about the impacts of this tool on learning and cognition for the kids.
And so out of an abundance of caution, given that we don't know, we're opting to just say no to some of these opportunities." And so that's been one of the things that we've been tasked with: to try and fold in- Yeah ...
as thoughtfully as we can into our policy, to try and provide a little bit of structure, and also to have that feed into our PD and the baseline that we develop with teachers, so that we can have that common understanding around, here's our best understanding of how this tool will impact, in a somewhat novel way, the way that- Right ...
cognition is changing and learning is changing to some extent.
Yeah, and this is not an easy or fast task to take on with any technology.
So my role as ed tech is somewhat new to LCDS, so we're seeing, with just about any technology, a little bit of education needed for our teachers as to, "Yes, I want to come into your classroom.
Yes, I want to work with you in real time on this." And I feel like this connects to what you're saying.
But I have a lot of teachers who are just like, "Oh, I'm so sorry." I'm like, "No, this is the model," right? "We come in, we collaborate, so you feel more confident moving forward." Again, not a fast system by any stretch of the imagination, but I think any time we can collaborate and come alongside as co-teachers to our teachers, we can help alleviate some of that cognitive load.
Mm-hmm.
And again, takes time to learn each teacher's style and the curriculum for each grade level and each subject.
But once we have that relationship built, I feel like we can really come alongside our teachers and remove some of the things from their plate in order to get that AI literacy, that tech literacy, that information literacy, whatever it be.
Yeah, because it helps build things like trust- Mm-hmm ...
and it helps build a relationship where the teacher will trust you going forward.
And trust themselves.
And trust themselves.
Yeah.
It helps them feel more confident in what they're doing as far as technology and their own- Mm-hmm ...
understanding of it.
And I think a lot of this, a lot of the questions I've gotten, most of it stems from a lack of confidence.
Yeah.
And I think the more you can instill confidence in our educators, the better that they'll trust themselves and that they'll feel like they're making the right choices when it comes to this.
I think a lot of them feel overwhelmed by it, which is- Mm-hmm ...
I think something we can all relate to.
Things change very quickly, and I think the more we can help them and get into the classroom and feel like we're working as a partner and be transparent- Mm-hmm ...
with what the technology side is doing with the faculty side, I think it'll bring out a better end result for the students, which is what everything should be about.
Yeah.
And my goal with going into the classroom is that the teacher, A, asks for help again if they need it, but B, maybe they're more willing to try the new tool we've just acquired now because we've worked together on something, and it went great.
And that's usually how I try to get a whole grade level band to do the same things.
"Hey, we did this, piloted it with this early adopter.
Now I'd like to come into your class and do it with your class as well." And so far, that has worked for me.
It worked in public school.
It's working so far at LCDS as well, which is great.
Hi, folks.
Can you hear me? Yeah, we can hear you.
Hi.
Sorry.
Walt Warner here.
What you're saying really resonates with me, having spent 45 years in the classroom and as an administrator, and just recently retired.
I have a book coming out that actually focuses almost entirely on what you were just talking about.
And it really focuses on the teacher and building confidence for the teacher.
Oh, okay.
And I really believe in the book, as soon as it comes out, I'm not here to plug things, but since you brought it up, I think it's really the big focus that we all need to be taking.
And again, not to sound like an infomercial, but Gen Edge Consulting, I started the business to help with exactly the issues that you guys are talking about.
And there's a little plug in the chat there.
And what I'm really trying to focus on, it seems like just about everybody in this webinar is well on your way, so you may not need the kind of resources that I'm trying to help provide, but I really want to try to help provide the kind of support and encouragement for teachers, especially to schools that are on the lower end of the technology resource and probably the dollar resource as well.
So, yeah.
That's all the plugging I'm going to do.
I didn't come here to be an infomercial.
But, I did put my website in the chat, and if anybody is interested, even if you don't need the services that we're going to be offering, if you know of a school that might benefit, or if anybody here would like to get to know a little bit more about what we're doing and the book that's going to be published here very soon, please feel free to contact me.
Yeah.
Thank you so much.
It's one thing I could probably talk about for the rest of the afternoon, is collaboration and how we can work together with our teachers to get some of these skills embedded into the classroom.
Probably will be the subject of my dissertation, so I could probably plug and talk about it forever as well.
But I think we can't be islands to ourselves, even when we talk about library media.
That really needs to be embedded into what the teachers are already doing in the classroom.
And I think as ed techs, as library media specialists, we are uniquely positioned to walk alongside our teachers and show them that collaboration can make their lives easier.
Be a partner.
Or yeah, a co-teacher, a partner.
We're not showing up and saying, "Use me." We already have the ideas.
We get to know them and their curriculum and can really intertwine what we're doing, which I get really passionate about.
So thank you.
Yes.
I'm very excited about it.
There's some folks in Toronto that are interested in what we're doing.
And some folks in the UK that I've been collaborating with, and really trying to create a global network of folks.
Again, I spend most of my time in the independent school world, but there's a huge cadre of folks out there that are, some in the independent school world and some in the public school world, that are all facing the same challenges.
And I'm trying to hook up with people, which is why I'm here, at this webinar.
Trying to hook up with people that are looking at the possibilities for AI.
Mm-hmm.
Recognizing what all the potential pitfalls are.
But, I'm really interested in collaborating with and supporting people who are willing to look past the fears and the trepidation and say, "What can we do collectively to move this forward in ways that will make education better for kids?" Because that's really why we're all here, I think.
Absolutely.
All right, y'all.
Well, thank you so much, Kelsey and Chris.
We really appreciate you coming and sharing today.
Fantastic job.
Thank you.
Yeah.
We're glad that y'all took some time out of your busy end of the year to come be with us.
So thank you, and we'll see you all soon.
Thank you, everybody.
Takeaways
- Intentional Policy Development: Schools should develop flexible, mission-aligned AI policies that avoid "zero-tolerance" stances, as students often find total bans confusing and impractical in a digital world.
- Collaborative Task Forces: Building a successful AI strategy requires input from diverse stakeholders, including IT directors, librarians, teachers from all divisions, and student focus groups to gain varied perspectives.
- Redefining Base Skills: Educators must identify which fundamental skills students should master without technology versus when AI can be introduced as a partner to augment learning and higher-level thinking.
- Rigorous Tool Vetting: Protecting student privacy is paramount; schools must establish clear processes for reviewing apps for data privacy, COPPA compliance, and the sudden emergence of integrated AI features.
- Teacher Support and PD: Successful AI adoption relies on building teacher confidence through hands-on collaboration, relationship building, and showcasing successes from early adopters to alleviate the "cognitive load" of new tech.