Info Session: 9ine Privacy, AI, and Tech Academies
Schools are facing a new wave of challenges: rapidly evolving AI tools in classrooms, tightening privacy regulations, and ever-present cybersecurity threats. To help schools navigate this complex landscape with confidence, ATLIS and 9ine have partnered to bring you two powerful professional development programs: the AI & Privacy Academy and the Tech Academy.
In this live info session, you’ll discover how these academies equip school leaders, Data Protection Officers, and technology teams with the knowledge, tools, and hands-on guidance to build resilient, compliant, and future-ready schools.
Transcript
Hi everyone.
We are so glad that you took time out of your busy day to join us.
We've got a really great information session for you today.
We're going to be talking with our friends over at 9ine.
Now, if you're not familiar with 9ine, they have a global presence.
They're based in the UK, but they have offices in Singapore and, I believe, in Canada as well.
So, all over the place.
And so they have a really great global perspective, and they work a lot with schools just like yours.
And so we're excited to bring them in.
We've got James from the 9ine team, and he's gonna be talking to us today, giving us a little bit more information about some of the academies that we have partnered with them to offer.
And these are really great technical deep dives.
James is going to give you a lot more information.
But the thing that I love about these is that it's specific to the independent school community.
So these cohorts are going to stay with you throughout this whole process.
And it's gonna be other ATLIS members, other people from the ATLIS independent school community, that are going through this with you together.
We're gonna talk about who all this is for, but I do encourage you to consider not only this program for yourself, but consider the possibility of going through with a couple of team members or people at your school that can really get in there and do the work and make these things happen.
Um, so we're excited for James to come on and tell you more about it.
James, welcome.
How are you today? I'm really well, Ashley.
Thank you for asking.
It's always so nice to do these with you.
I was a big part of the Privacy Academy last year with ATLIS.
And I have to say, the little cohort groups we had were fantastic.
Got to know everyone really, really well.
I'll talk about the structure and what the Privacy and Tech Academies are, but I think what I loved most about it was that people got to know people, they got to work with each other, right? There's a lot of tricky stuff that we're dealing with here.
So I think the fact that you can make some peer friends in other schools who are struggling with the same challenges as you, you know, obviously you've got the ATLIS connection, you've got 9ine in the background as well, but also you just get to meet like-minded people who are doing the job you do, struggling with the things you are struggling with.
So yeah, we taught you a lot of stuff in the last year as well.
But also I think it meant those groups could take it forward.
They could work together and collaborate and share best practices.
And I think that's what we're always striving to do: form those best-practice groups.
So, I'm the Head of Data Protection at 9ine, and I will be in all the academies.
I'll be in the Tech Academy as well.
We've got a couple of academies that we've partnered up on with ATLIS, so that you can experience them.
We've got the Privacy and AI Academy, which I'll be talking about today.
They're together, they're combined, because AI and privacy have a lot of links.
And then we've got our tech academy as well.
And that's really what 9ine is.
Okay.
If you don't know who we are, we've been around for about 16 years now.
We only work with education; we love it.
But we do have two sides to our business.
So we've got data protection and AI, and then we've got technical.
We've got our consultancy team that sits on both sides there.
I'll be introducing you to some of those in a minute.
And who are gonna be our trainers, uh, for these academies.
And that's why we have two different professional development courses.
One that specializes in the technical side of schools.
How do we do things like firewall configuration? How do we make sure our cybersecurity is in the highest posture it can be, practically speaking? Because it can be tricky in schools.
And then obviously we have things like our PII, our privacy requirements: how do we protect personal data? And of course now we have AI to deal with as well.
Another factor that we have to think about: how do we comply with that? How do we keep our schools safe? So, a bit of an apology on my part.
First of all, I'll explain why two of my colleagues aren't gonna be on the call.
Dan Brooks, who's the head of our technical services.
He's sort of pioneering the Tech Academy.
I'll flick through very quickly and then go back. And then Julia Ick, who's our Senior Data Protection Consultant.
So she's, uh, she's the one who's building out all the privacy and AI academies as well.
Now, the reason they're not on the call today, not to strike a bit of a flat note, is because they're dealing with a very big cybersecurity incident and data breach at a school at the moment.
We do this a lot; it's a big part of the job.
We do a lot of proactive work with our schools: things like our professional development training, helping them with their programs, their cybersecurity.
But we do, unfortunately, deal with a lot of incidents, a lot of breaches.
And I was actually talking to Ashley and Kelsey just before this started as to what the nature of those has been.
We've had a lot of breaches this year.
Uh, a disproportionate amount.
I would say this is probably the most I've ever seen in my time at 9ine.
I've been here for five years, almost six.
And the trend, 'cause it's the same question that our colleagues at ATLIS asked: what are the trends you've seen this year? The trend's been through vendors.
That's not to say that the schools we've been working with haven't had on-premise attacks where their servers have been exploited, or perhaps they've had phishing.
But actually, one of the main things we've noticed here is third parties, it's vendors, where there's been supply chain attacks.
A pretty nasty one in the UK happened recently, where a lot of the sensitive information was breached from a vendor that schools use for screening checks on teachers: making sure they don't have a criminal record, seeing their passports, these kinds of things.
And that all got breached, and the attackers gained access to it.
We've had probably about eight, maybe nine supply chain attacks, as they're called, where vendors were affected.
Um, I think that's, um, gonna be the trend, if I'm honest, moving forward.
If we think about how much we're moving to cloud, uh, how much we want to, to adopt AI tools, this will be very relevant to today.
Um, how much we already rely on EdTech platforms.
I think cyber attackers are starting to think, well, we could spend our time trying to attack a school and just get the data that they hold, or we could hit the jackpot and go for a vendor, a third party, and then we might get 20 schools' data, a hundred schools' data, a thousand schools' data.
Um, so yeah, that's, that's what the guys are doing at the moment.
Uh, and I'm sure they'll give you lots of case studies, lots of examples.
It's what we love to do in these, uh, professional academy courses.
We'd love to go through real life things that happen in education in schools.
We don't need it to be theoretical.
We've got lots of examples.
So Dan Brooks has been here, uh, for a number of years.
He was kind of right there at the start of 9ine.
Um, he has created the tech academy.
And this is all based on his experience.
Dan actually works at 9ine and also in a school, so you can't be in better hands than that.
He knows what you have to deal with on a day-to-day basis.
Uh, his colleague, Marcus O'Brien, principal technical consultant, will also be doing the tech academy as well.
Uh, Marcus has been here almost as long as Dan.
So they've been here for years and years.
They've worked with hundreds of schools over that time.
Their expertise is very much around cyber testing, pen testing, implementation projects.
You know, they know everything you need, from firewall reconfiguration to network installations, even thinking about things like capital expenditure plans in IT.
Um, so you're in great hands with these two.
And then my colleagues in my team: Julia is our Senior Data Protection Consultant.
She has a wealth of experience when it comes to COPPA and FERPA.
She's the expert at 9ine.
So she absolutely is poised for our ATLIS Privacy and AI Academy, because she's the expert.
She understands all of your legal requirements across the US and Canada.
So she'll be the lead on the Privacy and AI Academy.
And then we've got my colleague Katie Snelling, who joined the business just less than two years ago.
Katie has worked in the charity sector for a very long time.
Obviously there are some synergies between education and charity.
So she's been doing data protection there for years.
Uh, she's a very valued new member of the team.
She works with a lot of our clients as well.
So really great team, uh, that's gonna be supporting you through that.
Just as a bit of clarification, it is CPD certified.
Okay? So, you know, it's proper training.
You're gonna be able to put that against your name for both courses.
So if you are thinking, I would like some certified training, whether it's for cybersecurity or privacy, and, back to my friends at ATLIS's point, it doesn't just have to be you on the call.
And I'll talk about the different types of people who might be really relevant to these different types of courses.
'cause we cover a lot of stuff.
But if we do wanna get some CPD certification, we wanna get some official training, we wanna be able to demonstrate, you know, we've got the right skills to manage AI, we've got the right skills to manage data protection.
Or, on the flip side, cybersecurity and technical hardening projects.
Yeah, this is exactly what this is for.
So it's something you can not just learn from, but put onto your professional development record, onto your CV as well.
We appreciate the time commitment you make, so we want you to be able to take this forward.
So, there are two courses.
The first one is the 9ine AI and Privacy Academy.
It's changed a little bit this year.
Last year's Privacy Academy was predominantly data protection with a little smattering, as I say, of AI.
We had a session or two around AI; this year there's gonna be a much higher blend, because AI really is becoming a very pressing topic for you all.
You'll all be at different stages of AI adoption.
Most of you may have a policy now, an acceptable use policy with students.
But you'll still be kind of in the thick of determining: what are the right AI tools for us to use?
Are they safe? Do we need to worry about the ethics around AI? So we're gonna be touching on this a lot, and I'll go through some of the core sessions that we'll be doing.
I would suggest that the Privacy and AI Academy is gonna be very relevant to, you know, the obvious title, which not many schools have, if I'm honest: a Data Protection Officer.
If you do have one, excellent, but most of the time you're having to take that responsibility on as an extra.
From my experience, IT gets this a lot: IT gets given data protection.
But if you're thinking about your DPO, your compliance officer, absolutely anyone who manages risk.
But when we start to think about AI, we're gonna be thinking about, okay, maybe we've got an AI task force, or our EdTech specialist, our EdTech lead, coordinator, or director; this is all gonna be incredibly important to them.
'Cause it's going to be going through and understanding how AI needs to be assessed.
How do we need to understand the ethics around it? What's safe? How do we use it in a manner that's gonna adhere to our policies, improve the education program or improve our administration function, but also not take away from it?
How do we think about the displacement of staff and things like that? Because AI is a very complex thing.
And then again, you may also be thinking about having people there from IT, depending on how much you have to involve yourself with deploying AI tools.
And then we may also be thinking that, um, when it comes to data protection, we've even had people from marketing and communications join.
'cause we talk a lot about breaches.
We talk a lot about information rights, where there is a lot of communication that has to come from that kind of communications team.
You do need to know how to speak to your parents and staff when there's been a breach, and how are we gonna communicate that back to them.
So these are actually surprisingly diverse.
There's a lot of people that will find it beneficial to come to these kind of things because compliance touches us all.
We can't really avoid it.
Um, it's part of our lives now.
And in schools you can be quite heavily regulated.
There's a lot of things you have to consider.
So, for the Privacy Academy, there are some key learning objectives.
And I won't go into these too deeply, because I'll share the presentation deck, but we will be looking to understand the key principles of data protection and privacy laws where they're relevant to you.
You obviously have state laws, but some of you, depending on where you're based, have your own very developed privacy laws as well that we need to consider.
Now, we understand all laws.
That's back to the point: we're global.
So we work with every school you can imagine, in every jurisdiction.
You know, we work with schools from Japan to Canada and everywhere in between.
So we understand all privacy laws.
But obviously Julia's gonna be leading this, 'cause she's got such expertise when it comes to American law.
She understands it like the back of her hand.
We'll also be going through the development of AI laws.
Okay? They're still emerging, they're changing, but we will be giving indications of what they look like currently where you are based, and also how they're developing globally.
'Cause jurisdictions do tend to mirror these kinds of things.
We want to be able to understand how we identify risks.
I think a lot of people always know there are privacy risks.
We know we've got AI risks, but it's kind of getting to the crux of: well, what are they? How do we start to extract them from the hundreds of pieces of technology we use and all the practices we have with our staff, the human beings, the students of the school?
How do we understand what those risks are? How do we start to manage them? Where possible, we'll be teaching you how to apply privacy by design.
Okay? How do we actually make things safe from the outset? How do we design our practices and procedures to protect our PII straight away? And, you know, we'll be going through those kinds of projects, these kinds of things that we can do.
How do we conduct those risk assessments? We'll be training you how to do all these things, so that you understand them.
You will become a data protection and AI expert when it comes to risk and compliance by the end of these sessions; I can guarantee it.
How do we develop our own frameworks? Okay? Do we want AI governance frameworks in our school, from the very senior level, whether it's your board, trustees, or ownership group?
Do we want it to start from the very top, all the way down? How does that filter down to teaching staff as well? Evaluating vendor compliance: so, back to my point, I said it with intent.
Um, vendor compliance has probably never been so important.
How do we assess third parties? How do we understand, particularly with EdTech, where there may be risks? Obviously EdTech is being used by our students of varying ages.
There are a lot of risks that can occur there.
It's not just personal information, and how that's shared with vendors, that we need to be concerned about; it's also child protection, safeguarding, making sure there's no harm occurring to those students.
You know, we see a lot of platforms that are very heavy on gamification, which in small doses is fine, but if every single platform is using gamification, it's a bit of a risk, right? We're encouraging too much screen time, addictive behaviors.
Discord: some of you will be very familiar with Discord; that's a bit of a hot topic at 9ine at the moment.
Lots of schools are starting to use Discord.
Are we aware of those platforms that allow contact with external users outside the school? 'Cause of course we know that can be a risk.
Of course we know that can be a threat. But ultimately, with the vendors, the third parties we're using: do we have all the legal documentation in place to be safe as an organization to use them? If something happened to them, if they had a breach, a supply chain attack, a cyber attack, then, other than, I'm afraid, dealing with the fallout of disgruntled parents, or maybe staff in some cases, would we be legally protected? Would we be safe in mitigation? Because we have the right documentation, we have the right privacy policies, we have the data processing agreements we need from them, so that we are adhering to our law and our vendors adhere to it as well.
Big topic; lots we cover in there.
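To make that documentation check concrete, here is a minimal sketch of a vendor due-diligence record with a completeness check. The field names and the vendor are hypothetical, for illustration only, not 9ine's actual template:

```python
# Hypothetical vendor due-diligence record with a simple completeness check.
# Field names and the vendor are illustrative assumptions, not 9ine's template.

REQUIRED_DOCS = [
    "privacy_policy",
    "data_processing_agreement",
    "breach_notification_terms",
]

def missing_docs(vendor):
    """Return the legal documents this vendor record is still missing."""
    return [doc for doc in REQUIRED_DOCS if not vendor.get(doc)]

vendor = {
    "name": "ExampleEdTech",             # hypothetical vendor
    "privacy_policy": True,
    "data_processing_agreement": False,  # not yet signed
    "breach_notification_terms": True,
}

print(missing_docs(vendor))  # -> ['data_processing_agreement']
```

A real review goes well beyond document presence (jurisdiction, sub-processors, age-appropriateness), but even a checklist this simple makes the gaps visible.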
Accountability mechanisms with AI, responsible use of AI: so we'll be talking about that, you know, right from the very top, how policy is set at the very highest levels of the school.
And then how we filter that all the way back down, so teaching staff understand how to use AI safely with students or in their own role.
How do we integrate safeguarding and ethical considerations into AI systems? Okay, it is tricky at the moment.
It's the Wild West, as me and Mark, our owner, keep calling it.
AI's popping up everywhere at the speed of light.
It's very commercial.
So that doesn't mean every single vendor's necessarily thinking about the risks of AI; they don't necessarily think about whether it is going to impact the students negatively. So we're gonna be teaching you how to do those checks.
How do we understand these things? How do we manage them? How do we communicate this all back effectively to leadership board stakeholders? And that could even be parents, right? How do we make sure that we are demonstrating as a school, as an organization, that we do take this all very seriously.
You know, compliance is key to us.
Safety is key to us.
And we extend that through things like AI and data protection.
And ultimately how do we embed this as a culture? Okay? We want everyone to understand it.
AI is brilliant.
It's got tons of advantages.
Um, our vendors are great.
We want to use these really innovative technologies.
And yes, of course we have to share lots of PII and personal data with them for that to happen.
So how do we build a culture of safety where people can use these brilliant tools, but at the same time, we're not putting anyone at risk, we're not putting students at risk and staff at risk.
So these are all the key learning objectives that we're gonna make sure we distill to you across this.
Now, the Privacy and AI Academy this year is different, as I said, because we've had to introduce a lot more AI into it, 'cause it's such an important topic for you all.
So there are actually gonna be two key components to it.
Whereas last year it was just the main stream, and, like I said, there was one AI session, this year there's gonna be a main stream and an advanced stream.
Now, when you register through ATLIS, you get access to both, okay? And I think you'll probably find all the sessions useful.
As I go through them at a high level, you may start to realize, these would be really great for me.
But actually, that one would be brilliant for one of my colleagues at the school.
I could see why they want to join those.
Hence why we encourage you to bring extra people.
Um, because there are lots of sessions here that really apply to so many people across the school.
The first session is the introductory session.
You get to meet me again.
I love to talk.
So we're all gonna network; I'm gonna introduce you to everyone, all of your cohort for the year.
We'll play a game, we'll do a quiz, do a bit of team building before we kick into these sessions.
So that will be starting on November the 18th.
That's when that first session will be happening.
Then we'll get moving into the serious stuff.
So the first session is gonna be around AI and education.
We're gonna be talking about the regulations around AI, the emerging risks.
And we are noting a lot of risks; we do so much work with schools, we're spotting all those risks.
Um, I'll give you a quick one as an example.
I don't know if you're starting to use those AI meeting note-taking tools, but that's a big risk we're seeing at the moment, depending on how schools are deploying them.
I've got a couple of examples.
Recently, an admissions team had set up a meeting, and they hadn't realized the parent they'd invited had invited their AI meeting tool as well.
The parent didn't join.
The admissions team had a conversation about the parent, the meeting closed, the AI notes went to the parent, and they weren't very happy about some of what was being discussed.
So we know that can be a privacy concern.
We know that can be obviously a reputational concern.
And on top of that as well, we're seeing some schools who really wanna start to embrace this.
But there's a balance of risk there: if we wanna have it at a board meeting, okay, that's acceptable, as long as we're getting consent from everyone.
But there are some schools we're speaking to who wanna bring it into counseling meetings, HR meetings.
And that can be tricky; it can be tricky.
And we have to think about how we do those kinds of things.
So we're talking about the types of risks and the different scenarios we're seeing pop up so that you can consider those, take those back and say, well, you know, we, we wanna do this, but we'll manage it in this way.
Accountability and compliance.
This will be very much talking about the top level.
How do we always get buy-in from senior leadership? Do we know, at the strategic level at the top of the school, why and what we're having to do with data protection and with AI, so it can be filtered down?
Okay, we get that buy-in, we get that commitment.
We'll be talking about data mapping and lawful processing.
So this is very much around: do we know how PII is being managed across the school? Could you take a step back right now and say, I know exactly what admissions does with personal data, I know exactly what marketing does with personal data, and the different types of data subjects and the vendors that we share it with?
Do we have that map, or do we not? And there are times when you will need that map.
If there's a breach, it's incredibly useful.
We can home in very quickly on where the impact might have been, and if we ever have a request from a parent or a member of staff about their personal data, again, we can work that very quickly.
So we'll be teaching you how to do this.
How do you construct that map? Our platform, the 9ine platform, which I'll talk about later, is part of the subscription to the academies.
So you'll get to use the tools that we create in our own software, to be able to do these things quickly and easily.
And, more importantly, you can report on them as well, which is very useful.
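As a rough illustration of what a data map entry can capture (the structure and vendor names below are assumptions for illustration, not the 9ine platform's actual schema):

```python
# Illustrative data-mapping records: which department processes which personal
# data, about whom, and which vendors it's shared with.
# Structure and vendor names are assumptions, not the 9ine platform's schema.

data_map = [
    {
        "department": "Admissions",
        "data": ["names", "dates of birth", "health information"],
        "data_subjects": ["prospective students", "parents"],
        "vendors": ["ExampleCRM"],       # hypothetical vendor
    },
    {
        "department": "Marketing",
        "data": ["names", "photos"],
        "data_subjects": ["students", "alumni"],
        "vendors": ["ExampleMailer"],    # hypothetical vendor
    },
]

def departments_sharing_with(vendor):
    """If a vendor is breached, show which departments are affected."""
    return [row["department"] for row in data_map if vendor in row["vendors"]]

print(departments_sharing_with("ExampleCRM"))  # -> ['Admissions']
```

That breach query is exactly the "home in very quickly" step described above: with a map in place, one lookup tells you which departments and data subjects a vendor incident touches.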
We'll talk about information rights, which vary a little bit from state to state.
But ultimately, data subjects have some rights, depending on where you are based.
And how do we manage those when people ask for them? And do we have the right policies and procedures in place to adhere to them? Breach management and incident response: a very popular session.
Uh, I think all of us worry about having a data breach.
All of us worry: what would we do? We'll be talking about, you know, how do we manage that? What policy do we have? Have we ever practiced that procedure? How do we create those plans? We'll be talking about simulation training.
Like, we need to simulate a breach in our school.
And that's one of the bits of homework we'll be giving you.
So you can actually pull the right people into the room and go, right, we're gonna practice how we manage a breach.
Um, because we want to make sure if it did happen, we'd know exactly what to do and how to respond.
Safeguarding and AI: here we're talking about the safeguarding and AI risks we see with new platforms, whether EdTech or administration, very specifically in those areas.
We'll be talking about privacy by design and security by design, sometimes referred to as information security.
So this bleeds a little bit into technical, a little bit into cybersecurity, but it's very specific to what hardening we have around our systems when it comes to PII and personal data.
Because, you know, you can have the best policies in the world, you can do all the training in the world with staff, but if we do have those weaknesses, those attack surfaces are gonna make it very easy for people to get into our systems and gain access to that data.
So what are the things we can be doing to try and protect that? Risk assessments.
Great fun, aren't they? We'll be teaching you how you actually do risk assessments.
So, you know, when we do notice a very high-risk process at the school, how do we do a formalized risk assessment on that, from a privacy and an AI lens, so we can always demonstrate that due diligence, that we understood it?
And there are risks that we can try and manage there.
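One common shape for such an assessment is a likelihood-times-impact score. The scales and thresholds below are assumptions for illustration, not 9ine's methodology:

```python
# Illustrative likelihood x impact risk scoring on 1-5 scales.
# Scales and thresholds are assumptions, not 9ine's methodology.

def risk_score(likelihood, impact):
    """Both inputs on a 1-5 scale; the score is their product (1-25)."""
    return likelihood * impact

def risk_level(score):
    """Bucket a score into a level for reporting."""
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

# e.g. an unvetted AI note-taker joining meetings: fairly likely, high impact.
score = risk_score(likelihood=4, impact=5)
print(score, risk_level(score))  # -> 20 high
```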
And again, then, creating and embedding your AI use policy.
We do a lot of this at the moment.
You'll have your AI policy as a school; you know where your threshold is, your tolerances; and then we've got acceptable use policies for students, which many of you have probably done already.
But a lot of the schools I work with are thinking very much about plagiarism, how the students are using it, making sure they're not using it to effectively cheat.
We'll be talking about it from other risk perspectives too.
So we'll be talking about things like: do students understand what they can and can't do when it comes to personal data and AI? Do we understand those kinds of risks, and have we got them in the acceptable use policy? Do we explain to staff what we suggest they can and can't do with the AI tools we offer? So that's the main stream; the advanced stream is how it sounds.
It's a little bit more advanced.
But I'm really looking forward to these sessions, 'cause these are what I would call the ultimate problems we see with our clients.
The tricky stuff that we're gonna work with you on, and train you and show you how to solve.
So one of those is a ROPA risk analysis; that will probably mean nothing to most people.
That's back to that data mapping I was talking to you about.
So it's brilliant understanding how PII goes all around the organization.
But the point is, we're doing that to understand what the risks are.
Is there a particular vendor that poses us the highest risk in one department? Is there a particular type of data, or a way we are managing data, or a way human beings are managing data in one of our processes, in one of our departments, that would keep us awake at night? Because that's what that risk analysis finds.
How do we audit that and then go, okay, here are our risks, here's our priority, this is how we're gonna fix them?
It's a really meaningful way of building your privacy and AI risk log, which then becomes an ongoing part of risk management.
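That audit-then-prioritize step can be sketched as sorting a risk log by score. The entries and scores below are invented for illustration:

```python
# Sketch of turning audit findings into a prioritized risk log: audit,
# score, then fix in priority order. Entries and scores are invented.

risk_log = [
    {"risk": "Vendor stores PII outside agreed region", "score": 20},
    {"risk": "Admissions spreadsheets shared by open link", "score": 12},
    {"risk": "Stale alumni records never deleted", "score": 6},
]

# Highest score first: that's what we fix first.
priorities = sorted(risk_log, key=lambda r: r["score"], reverse=True)

for r in priorities:
    print(r["score"], r["risk"])
```

Kept up to date, a list like this is the "ongoing part of risk management": items get rescored as controls land, and new audit findings slot into the same queue.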
Records retention.
Loads of you will have heard of this.
So, data deletion of personal information.
Again, we work with loads of schools, and lots of them have a records retention policy.
They may even have a schedule if they've got that far.
I think the bit that people find tricky is implementing it.
How do we actually find a way to make sure we are deleting this PII and personal data in a timely manner? And of course it varies for different needs.
Some things you have to keep for a very long time and some stuff you shouldn't keep for a very long time.
But how do we do that in 2025? 'Cause there are lots of technical ways of doing it.
You've got your main tenancies, like Google and Microsoft.
They've got brilliant ways of doing it; you can automate deletion of emails there.
But how do we do that? What's the safe way of doing it? Then we think about our MIS and our SIS: what can we do there? So this will really go into the weeds of what we can do, from a technical solution perspective, to help manage data deletion.
And then, when does the human element come in, and what does that look like? So this will be a really good session, 'cause I know most schools I've worked with struggle with this immensely.
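On the technical side, a retention check is ultimately just "creation date plus retention period versus today". The record types and periods below are invented examples, not legal guidance:

```python
# Rough sketch of checking records against a retention schedule.
# Record types and retention periods are invented examples, not legal
# guidance; real periods depend on jurisdiction and record type.
from datetime import date, timedelta

RETENTION_YEARS = {
    "admissions_unsuccessful": 1,   # example only
    "safeguarding": 25,             # example only
}

def overdue_for_deletion(record_type, created, today):
    """True if the record has outlived its retention period."""
    keep_until = created + timedelta(days=365 * RETENTION_YEARS[record_type])
    return today > keep_until

print(overdue_for_deletion("admissions_unsuccessful",
                           created=date(2022, 9, 1),
                           today=date(2025, 9, 1)))  # -> True
```

In practice you would wire this kind of rule into the tenancy's own retention tooling rather than a standalone script, but the logic is the same.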
Managing complex AI and privacy incidents: this is gonna be the case study session.
We're gonna go right into the weeds of real-life examples where we've seen AI incidents and privacy incidents, and unwrap them and talk about how they could have been dealt with differently, what we could have done to make sure that didn't happen.
So you can take that away and apply those best practices back to your schools.
Ethical audits of AI systems.
Okay, so this is very new; it's a very new thing to be doing.
But how do we do an ethics audit on AI systems, so we can demonstrate all those responsibilities we may have, to show that we are taking ethics into account with this new technology that's sort of taken the world by storm?
I'm doing a few of these, and this one I'm super excited about, but it was my idea, so I guess I'm a bit biased.
Visual data ethics: deepfakes, consent, and image safety.
So we live in a world now where people can do a lot with your image.
You know, 10 or 15 years ago, you'd have to have been a Photoshop expert to be able to do anything crazy with an image.
Not anymore.
You can run it through an AI tool and do all kinds of things.
Create deepfakes.
So we have to start thinking about the ethics around that on our website.
We have students on there.
No one is suggesting you cannot have that.
But what can we be doing to protect the student image, the school image? Because we have had instances, one in particular in the UK springs to mind, where a sexually explicit deepfake of a student was created, and the attackers ransomed members of staff with it, threatening that they would release that image of the student; a typical kind of ransom attempt, in that sense, through a deepfake image.
So we'll be talking about how we can actually manage that.
Things like watermarks, et cetera, et cetera.
So again, marketing may find that an incredibly useful session to come to, 'cause images are so powerful for them.
But we do live in a world where it can be used in a bad way.
It can be misused ultimately.
And then strategic leadership.
So we'll be talking very much about policies here.
A lot of you will have policies.
Do we have the right policies? Are they procedurally correct? Um, one of the things that you do get from the academies is that we have an insane amount of resources at 9ine: every type of policy you could possibly need for AI, data protection and cybersecurity we have created so many times before, so we template them all out.
So with each of these sessions, you always get resources.
So policies are a great example, 'cause we'll be giving you our policy templates so you can just launch off them and save yourself a lot of time.
Obviously they'll be legally compliant with your requirements.
You know, you'd have to get a lawyer to do that normally.
Um, but also we'll be helping you.
The consultants will be helping you if you need to jump onto sessions or meetings with them to build out your policy.
So that's a brilliant session as well.
Everyone's welcome to ask questions at any point 'cause I'll talk forever.
Otherwise, uh, the 9ine Tech Academy.
So this one is a little bit different as well this year.
So you'll see here some of these optional modules, right? Okay.
So the ones that are definitely gonna be delivered.
Okay? Now for the benefit of anyone in IT, this is gonna be great for, right? IT director, director of technology, IT technician, network manager.
This is gonna be brilliant.
If cybersecurity is on your mind, which it should be, this is gonna be a great academy course for you, because it's really going to teach you how we build cybersecurity by design: all of those core parts of infrastructure, the principles, the practices of cybersecurity.
I always kind of say, you know, you can do vulnerability assessments, and we do them for schools.
They're very good things to do.
But that feels very much like you're always kind of giving a person a fish.
You're not teaching them to fish.
Um, there's a pun in there somewhere with fishing, I'm sure: phishing.
Um, however, this is more about providing you with the professional development to do cybersecurity yourself, right? To actually make it a core principle of the IT team.
So one of them will be cyber awareness audits.
How do we do cyber audits? How do we do audits ourselves as an IT team? And what are those practices? What frameworks should we be looking to? There are some really good cybersecurity frameworks out there.
Vulnerability management, implementation.
Okay, so we know our vulnerabilities.
How do we manage that? We know things like patch management is a vulnerability, right? It's so hard in a school.
We all have our phone software updates, and I put mine off more than I should for somebody who works in a business like 9ine.
But you have hundreds of devices, sometimes thousands of devices, all of these softwares.
So again, brilliant resources, really good toolkits.
Really good.
The platform allows you to project manage things like patch management as one example.
But how do we do vulnerability management? Because it's one thing being told you have problems.
It's another thing being able to manage them on the day to day when it is really busy.
You are doing lots of other stuff other than just vulnerability management.
So we talk about that, how we can do that, how we can manage these kind of things.
Server infrastructure audit and best practice.
So, you know, how do we get into the weeds of our kind of core infrastructure? You know, it's complex.
There's engineering work required there.
How can we do that kind of thing, and what kind of things should we be considering?
So that'll be our session.
Then we hit an optional module.
I'll explain that in a second.
Backup systems audit and best practice.
Super important when it comes to attacks, because I've been here for six years now and I've seen the worst-case scenarios where schools have had huge denial of service or ransomware attacks and backups weren't in place, or they weren't as reliable as they should be.
Um, so we'll be talking about how do we have robust backup systems.
You know, what is the best way to do this so that we don't have too much downtime with the education program.
We don't lose a huge amount of, not just personal information, but also you know, your company information, your own sensitive information as a business.
So we'll be talking about backups, because they're insanely important when the worst happens.
Uh, 'cause it can make all the difference between restoring things quickly and not being able to restore things very well at all.
To be perfectly frank.
Another optional module, which I'm gonna explain in a second: update and patch management implementation.
'cause it is time consuming.
So we even give that its own session.
Talk about how best to do that.
All of the tips, tricks, resources, toolkits, project management kind of ways we've worked out to do that with small IT teams as much as large ones.
We're very used to working with schools with small IT teams, even one person.
So it's how do we actually make this practically work.
We might have limited, um, capacity in the IT team.
And then one more optional module.
So what we're gonna be doing a little bit different this year, and this was based on feedback from the ATLIS cohorts last year, is we're gonna give you a choice.
You're gonna have a vote.
So when you join the tech academy, these will be the additional sessions that can fill in those three optionals.
And we're gonna just do a fair voting system.
And the top three sessions are gonna take those optional ones.
So I've been talking quite a bit, so I won't go through all of these in detail: incident management, okay, very much from a cybersecurity perspective, though obviously we do more AI and privacy in the other academy; network switching and access controls; wireless audits and best practice; firewall audits.
Obviously firewalls being, you know, one of the most important things you have in IT at a school, 'cause they do so much protection.
Uh, proactive system management, uh, system documentation.
Not the funnest session, but it is the one that I see schools struggle with a lot.
IT teams don't have a lot of time to do documentation.
So again, templates, resources, we give you lots.
Um, disaster recovery.
Whenever I've met a school who wants to start working with 9ine on technical work, disaster recovery is always something they're really keen to kind of get underway with us.
So let's talk about those different scenarios, those different disasters that could occur, right? And how do we have plans and how do we have policy that enable us to kind of recover or act in those different types of disasters.
Active Directory security audit. We had a very serious incident just recently with a school where their whole Active Directory was accessed.
So that might be a popular session.
Email protections and antivirus implementation.
So if you're gonna be one of the people to join, you're gonna get to cast your vote on three of these, and the most popular three will come through.
So, uh, an interesting, different way of doing it from the Privacy Academy.
They've each got their own flavor this year: one's got advanced sessions, one's got pick and choose.
So that should be interesting how that works.
Um, let me slow down for a little bit.
I don't think I've seen any questions, but I'll have a quick check just to make sure.
I think we're okay.
Fine.
So you know who the consultants are going to be.
Okay? So we're gonna have Dan and Marcus for tech and we're gonna have Julia and Katie for privacy and ai.
I will be joining both.
You can think of me like, kind of like the scrum master cheerleader for the sessions.
That's my job.
The other part of the, uh, the academy that you get isn't just the consultancy and the training sessions.
It's actually the 9ine platform: our software, our governance, risk and compliance platform that we've created, which all our clients use.
It's how they manage their AI, cybersecurity and, uh, data protection programs as a school.
So just a little bit of a, if I just go back a second and just give you an example of one of the sessions.
Um, if we look at the Privacy Academy, the structure of this, the way it will work, is that you can see we've spread the sessions out, okay? So it's like a year of collaboration, almost, together.
Uh, we've picked the dates very carefully.
Don't worry.
Again, we only work with schools so we know when you've got your breaks.
Uh, so we won't be overlapping any of those.
Um, and then we've got a nice spread here.
And what will happen is, prior to every session, you'll be given a few things.
So the first one is you'll be given a video to watch and that's where we'll be covering some of the theory.
Think of that as pre-work for the session.
Um, the second thing that we'll be given to you is your resources in advance.
So you can have a good look at them first before you come to the, the training session itself.
Maybe 'cause you've got some questions you wanna ask of them directly.
Um, we'll also be giving you, um, our appendices.
Now all that means is, for the session topic, we give you your legal requirements based on where you are, okay? Per state.
We're very specific, so you can understand exactly what you have to do and don't have to do.
Okay? Now, a lot of you, and I work with so many schools, always wanna shoot for the stars.
Be as safe as possible.
But we do actually tell you your legal requirements ahead of each session as well, just so you can baseline those.
So you'll be given a video to watch. Now, that will be delivered to you via our 9ine Academy LMS, which again is in the platform.
So 9ine has its own learning management system.
It's part of the platform.
Uh, it's one of many modules, and that's where we'll be putting your videos ahead of each session.
You will be able to go in and watch the video, along with any of your colleagues who you sign up for the academy with as well.
There's a quiz, okay? We always wanna check we're actually teaching, right? So there's a quiz afterwards, just to be comfortable that you've kind of got the knowledge there before you head into the training session with the consultants.
All right? So we give you a little bit of pre-information and then we cover it in the training session as well.
Um, those training sessions will be an hour and a half each.
Um, and in those we do a lot of things, so we have breakout sessions.
For every single topic, there are really good case studies and breakout sessions to have, and I've seen them or I've just signed them off.
So this is where the kind of cohort CoLab happens because if it's the tech academy, you're gonna get to work with other IT teams at other schools on these little scenarios.
These are breakout sessions.
And of course with privacy and AI, you're gonna get to work with other people who have to manage AI and privacy at other schools, in this kind of cohort.
Um, and then there'll be some theory.
So what we typically do is the consultants do some training.
Okay? We hit a case study, bring the training into reality.
Um, then we do a little bit more training, then move into a breakout session.
So we can go apply what we've learned already that day and take that into a session where we all work together.
The consultants will come around the room, help you, have a chat with you; we get loads of questions at that point.
'cause it's obviously, you know, you're thinking to yourself, how does that apply to me? This doesn't happen at my school.
How would this work? Uh, and then we will come back and we kinda share best practice and then we close on a little bit more, uh, training.
And then we set you some homework.
Now, homework is typically the resource, right? We're actually saying, here's a brilliant resource, take it away with you.
Uh, try and implement it into the school.
Uh, you do also get consultancy hours as well as part of the package.
So, you know, whichever academy you join, you'll be able to work with those consultants, the trainers, and with me if you want to; we give you consultancy hours. Some of these sessions, you probably won't need the help.
You'll be able to do 'em just fine after the training.
Other stuff you might wanna really get into the weeds on with a consultant in a one-on-one meeting, and use those hours for that help, to actually implement it or customize it or quality assure it.
So it's a really collaborative exercise, this, and you get to know the consultants exceptionally well over that period of time.
So, I don't wanna linger too much on our platform, because if you do register, you will obviously get access to it.
This is the nine platform, okay? This is the governance risks and compliance platform.
You are gonna be given access to almost all of this as a registrant.
Um, so when I talk about things like IT breaches, you'll be able to use our incident management reporting tool.
So you'll be able to use that as a framework to investigate a data breach or a cybersecurity incident during the time we're all working together.
Um, everything is data portable.
So if you're using this for a year, obviously we would love for you to continue using it beyond the Privacy Academy, but that's very much based on how useful it is to you and whether you see it as something to continue.
What I want to be very clear about, though, is that everything you're offered, everything you do, you can download and take away.
You're not gonna lose the progress you make over that year.
You can keep it and use your own methods.
Um, so everything can be extracted.
Uh, for example, when it comes to things like records of processing, that data mapping I talked to you about, there's a tool in the platform for you to do that.
You'll be able to construct it very quickly as well.
By the way, one of our core principles at nine is to make things quick.
'cause we know you don't have a lot of time.
So we always wanna make things quick.
So yeah, it's an example here.
We're able to construct your, uh, processing areas as we call them.
Departments basically.
So admissions and HR.
And then we're able to document all the different processes we have in there.
And then we're able to look at the different types of personal data we're capturing in there.
PII.
So then we have this brilliant reportable inventory.
We can use it for all manner of methods at that point.
Um, one of the things that's very popular, which we'll be giving you access to as well is the vendor libraries.
So the vendor library is very popular because when we talk about those assessments you have to do on third parties, um, many of you will be doing this or, um, from my experience, many of you will be trying to do this because it's very time consuming and quite painful.
Having to go from vendor to vendor to vendor and get all their policies and documentation, which sometimes isn't easy to find or they don't have and then have to assess it.
Um, our tool has done it for you; our consultants have.
So for the consultants who you'll meet, Julia and, uh, Katie, that's a big part of their job.
They spend a lot of their week, about half their week working with schools and half their week doing vendor assessments.
So we've gone off and done these vendor assessments, right? Because again, it saves you time.
Core principle at 9ine: keep schools safe, save you time.
Um, so we've gone and done them.
So there's hundreds in here by the way, hundreds and hundreds.
Um, all your major platforms you think of and some really obscure ones that maybe you've never seen before.
And what we're doing in here is we've done all the privacy checks, we've done all the AI checks, we've done all the um, child protection safeguarding checks and these are very, very detailed vendor assessments.
I can't go into them now.
We wouldn't possibly have the time.
I can give you a very quick example of one because you'll see, and this is why a lot of schools in uh, the US work with us now, um, because this is a big kind of headache for them.
Um, I'll show you the 9ine one, because obviously we have a vendor assessment on our own platform, 'cause we have to lead by example.
Um, this is what the vendor assessments look like.
Um, effectively, we've done all the work for you.
So there's a very comprehensive framework: we do data protection, safeguarding, cybersecurity and AI checks, and we have gone through all the policies and procedures a vendor holds and assessed them all for you.
The key is that you can just get to the crux. When you go to the end of the assessment, which I'll say most people do, you get to just see what the risks and issues are. And guess what? There are no risks and issues with our platform; it would be a bit of a terrible thing if we had risks and issues.
Of course many vendors do have risks and issues.
Uh, and you can see even at this preview stage where you can see some example cards of vendors, you'll be able to see that there are a lot of risks and issues.
You know, we are living in a very fast paced world and vendors aren't always keeping up with their own compliance.
So you can see, yeah, this one has a privacy problem, this one has a privacy problem.
This one here has a safeguarding concern.
Okay? So in the academy we'll be letting you download these and use the ones for the vendors you use in your school.
But we'll be teaching you how to do your vendor assessments as well.
Most people just wanna use this thing; they don't wanna do their own vendor assessments, but that's fine.
We're still teaching.
If you wanna learn how to do it, you'll have access to our resource section.
This is absolutely full of stuff, right? Like when I say every policy you can imagine, I do mean every policy type you can imagine.
We've got some really nice student stuff as well.
We've now got a student privacy policy and a student AI policy, and it's not just that they've been written with students in mind; they've been written for students to read.
They've been brought into more appropriate language for students.
Uh, and it's got some nice design as well just to make it engaging 'cause we want 'em to try and read these things and understand about data protection and ai.
Uh, so all of these are accessible, that's all part of it.
Anything from data protection policies, CCTV policies, AI policies.
But then from a technical perspective, we start to look at disaster recovery plans, business continuity plan templates, these big ones; we have got lots of templates.
It's what we do and it's how we help our clients.
So you'll get access to this as well.
The training academy I talked about, the 9ine Academy, sorry, is where you'll be spending time as well.
And this is the learning management system, uh, we've created.
So this is where pre-session you'll get your training video, you'll be able to take your quiz, you'll be able to see how well you did.
Uh, but don't worry, you do get to go into a session with us afterwards, and we do further training around that as well.
Um, so it's a very comprehensive platform.
Um, many schools from last year continued to use it, because you kind of started to build out your compliance program in it and it was like, let's keep doing that, right? Why would we not keep doing this? Um, but again, I always wanna be clear.
You can download it all.
You're not gonna lose all your progress.
That's not what we're like, you know.
If you don't wanna continue using it, we understand.
That's absolutely fine.
Um, so that is in effect what those academies are.
Um, I think they're more useful to more people than it always seems.
It's not just a data protection person and not just an IT person for each one.
Uh, I think we know AI goes across so many departments now, so many decision makers. And vendors in their own right touch so many people at the school now, and we do spend a lot of time talking about vendors.
Um, I think the training is brilliant for everyone and I think it will definitely build the culture in your school and it will definitely leave you coming away feeling you can succeed through this more because it is tricky.
I mean, AI is a lot of work.
Data protection is a lot of work, cybersecurity is a lot of work.
Um, so we're really hoping to see you, right? Last year was a blast.
I really enjoyed it.
Uh, with the ATLIS cohort. I shouldn't say this, but we do another one.
I really shouldn't say this, but I'm gonna, 'cause I've started now: we do two versions of this.
We do one with ATLIS, and then we do one in Europe and Southeast Asia.
And yeah, you guys are a lot more fun.
I'm just gonna say you are.
You've got a lot more chatter about you, and you're really good people to get to know.
And I've got to know a couple of people from last year really well and I still speak to them.
So that's been brilliant.
Um, so on that note, I'm kind of done.
That's, that is the highlights and I would love to see you all there.
Um, you really are, Kelsey.
It's one of the best communities I work with when it comes to this kind of thing.
Um, but if anyone has any questions, it could be about either academy.
Uh, it could be about the subjects we're addressing, or maybe even anything I've talked about today that you haven't seen and would like to have seen.
'Cause I can always get us to make sure we put some of that into the session.
So at that point, I really appreciate ATLIS inviting me along.
I'd love to see as many of you as possible, and I'm very open to any questions, anything anyone wants to ask at all.
Thanks James.
We're gonna open it up again; if anyone has questions, feel free to come off mute.
Also, know that if you're interested in learning more about these programs or you would like to register, we've dropped a link for that in the chat.
Any specific questions y'all want to ask James while he's here? At this stage, people are thinking, I hope he's not doing the training, 'cause he talks so fast.
But don't worry, my consultants speak slower than me.
They take their time; they're better at that.
They take their time, they're better at that.
Yeah.
So we've got a question from Tracy in the chat.
She says, thank you for the information.
In addition to the privacy information, will these sessions also include ideas on how to use AI to support work in K-12? Brilliant question, Tracy.
And yes, absolutely.
So the AI training we do isn't just about risks.
We wanna be very clear it's benefits and risks.
So we will talk about really tangible things that can be used across K-12 that are gonna be great.
And that could be not just for the education program; we're talking about admin as well, right? These are gonna be great.
You know, I've done a lot of training on school sites.
I've been to lots of schools recently doing onsite training, and I spend about half of it talking about the practical uses.
Like, did you know you could use this in HR? Did you know you'd be able to use this with the students? So we definitely cover the great things you could introduce.
We just balance that with, well, if we're gonna use it in this way, we need to be conscious of this risk as well.
So yeah, it's a really good question, 'cause we forget to tell you that; sometimes we're too heavy on the risk at 9ine.
We do talk about the benefits, we do talk about the strengths and how you can actually implement these things, and how that could change the school as well.
So thank you Tracy.
It's a good question.
So for everyone on the call, um, I also want to, to put a shout out.
We're actually working with 9ine on a lot of different projects, and one of them we're really excited about actually started in Dubai.
It's an international project, and we're working on, Tracy, what you just talked about.
So really that student-facing AI piece, and there's going to be a learning piece, basically some student-facing pieces.
We'd love to get you guys to vet it and to get some input with your students.
And so if that is something that you're interested in, drop your email in the chat or reach out to me personally.
It's ashley@theatlas.org, and I'll get you on that little task force that we're putting together so your students can actually give some real feedback on some of this as well.
So we always wanna loop in our constituents and make sure that they have a voice in these resources, and that they're real, they're applicable and they make sense.
Um, I'll tell you, people that have gone through the program last year really loved it and one of their favorite pieces was not only that hands-on learning, it was a lot of the projects that they needed to tackle anyway and they didn't even know where to start.
It gave them this ATLIS community to go through those projects with, again with direct support from consultants and experts, and you even get some individual hours for your school for those one-on-ones if you need that extra help.
Um, that's included.
So a lot of just great resources to get you further down the path.
So, uh, I hope you guys will check out our academies and let us know what you think.
Um, if you have any questions, we can turn off the recording and stay on for just a moment, but if not, that brings us to the conclusion of this information session.
James, I wanna thank you for being here with us today.
More than welcome.
It was a pleasure.
You can invite me to anything.
I love doing these with you guys. So we love having you here, James.
Takeaways
-
Collaborative Learning
The Academies offer a professional development experience designed around cohorts of ATLIS members, fostering peer networking, collaboration, and shared best practices for tackling common challenges in independent schools.
-
Dual Focus on Privacy & AI and Tech
The program is divided into a Privacy & AI Academy, focusing on data protection, legal compliance, risk analysis (including vendor and AI ethics), and a Tech Academy, emphasizing cybersecurity, infrastructure, and technical implementation projects.
-
Hands-on, Certified Training
The academies are CPD certified, offering structured, year-long learning with expert-led courses, real-world case studies, and interactive learning, resulting in official, recognized professional development credentials.
-
Addressing Vendor Risk
A major theme is the growing threat of third-party/vendor-related cyberattacks and data breaches, with a specific module dedicated to evaluating vendor compliance and managing associated risks for EdTech and AI-enabled platforms.
-
Practical Resources via 9ine Platform
Academy participants gain access to the 9ine platform's resources, including training videos, policy templates, and tools for compliance, risk assessments, and vendor assessments, which support daily department management.