Governing AI Privacy Frameworks with Blackbaud and Veracross
Presented by:
Leaders from Blackbaud and Veracross join the panel to discuss the complexities of AI governance and student data privacy. The conversation covers essential steps for establishing AI policies, understanding international regulations and frameworks like the GDPR and NIST, and the shift toward agentic AI. Gain practical insights on vetting vendors and securing your digital perimeter.
- Blackbaud, cloud software provider serving nonprofits, educational institutions, healthcare organizations, and CSR entities in the areas of fundraising, financial management, and education administration.
- Veracross, cloud-based student information system (SIS) and school management platform designed specifically for private and independent K-12 schools, connecting academics, admissions, accounting, development, and student health.
- Data Harmony: Integrating Systems, Empowering Schools, previous episode of TTWA featuring Blackbaud and Veracross
- NIST Cybersecurity Framework (CSF)
- NIST AI Risk Management Framework (RMF)
- General Data Protection Regulation (GDPR)
- California Consumer Privacy Act (CCPA)
- Coquito, traditional Christmas drink that originated in Puerto Rico
Transcript
Peter Frank:
And welcome to Talking Technology with ATLIS,
Peter Frank:
the show that plugs you into the important topics and trends for
Peter Frank:
technology leaders all through a unique Independent School lens.
Peter Frank:
We'll hear stories from technology directors and other
Peter Frank:
special guests from the Independent School community,
Peter Frank:
and provide you with focused learning and deep dive topics.
Peter Frank:
And now, please welcome your host, Christina Lewellen.
Christina Lewellen:
Hello everyone, and welcome back to
Christina Lewellen:
Talking Technology with ATLIS. I'm Christina Lewellen, the
Christina Lewellen:
President and CEO of the Association of Technology
Christina Lewellen:
Leaders in Independent Schools, and
Bill Stites:
I'm Bill Stites, the Director of Technology at
Bill Stites:
Montclair Kimberley Academy in Montclair, New Jersey.
Hiram Cuevas:
And I'm Hiram Cuevas, the Director of Information Systems
Hiram Cuevas:
and Academic Technology at St. Christopher's School
Hiram Cuevas:
in Richmond, Virginia.
Christina Lewellen:
Hello, gentlemen. How are you today?
Bill Stites:
It's spring break for me here, so it's an
Bill Stites:
interesting time for sure. Reminds me a little bit of the
Bill Stites:
summer, because it's nice and quiet, getting some work done.
Bill Stites:
But you know, doing pretty good.
Christina Lewellen:
Is it kind of weird to be at a school when
Christina Lewellen:
there's no teachers, no kids. Like, is it kind of creepy? Do
Christina Lewellen:
you feel like maybe it's a zombie apocalypse kind of
Christina Lewellen:
situation?
Hiram Cuevas:
She brought it up first. Impressive, she did.
Bill Stites:
It is interesting because it is quiet. I would say
Bill Stites:
it's awesome. There are often a lot of dark hallways because the
Bill Stites:
maintenance crew doesn't turn on the lights. So it can be a little
Bill Stites:
eerie at times, but generally it's pretty nice. It's good to
Bill Stites:
come in dressed a little more casually. Just kind of relax,
Bill Stites:
settle in, you know?
Hiram Cuevas:
But we know Bill has his melee weapon if he needs
Hiram Cuevas:
it.
Bill Stites:
I do.
Christina Lewellen:
Yeah, you're not afraid if there are some
Christina Lewellen:
zombies wandering the dark hallways over at MKA?
Bill Stites:
No, though I do need to hide it because it's
Bill Stites:
quite impressive. So it's got to stay in the background.
Bill Stites:
Otherwise, I do have my trusty letter opener.
Christina Lewellen:
I mean, in a pinch, it would work. Hiram,
Christina Lewellen:
what are you up to? You're not on spring break
Christina Lewellen:
yet, huh?
Hiram Cuevas:
No, we just had our spring break last
Hiram Cuevas:
week, but it was a nice time. Got to visit my parents down in
Hiram Cuevas:
Florida. We just missed the airport chaos that happened on
Hiram Cuevas:
Friday at BWI, Richmond, and the DC airports, and my baby girl is
Hiram Cuevas:
back at Virginia Tech finishing up her sophomore year, so we're
Hiram Cuevas:
good to go.
Christina Lewellen:
I think it's so cute, Hiram. You always refer to
Christina Lewellen:
her as your baby girl. And I'm wondering, like, if baby girl
Christina Lewellen:
actually listens to the pod, how thrilled or not she would be
Christina Lewellen:
about aforementioned nickname of baby girl.
Hiram Cuevas:
Well, she thinks it's hysterical that I have a
Hiram Cuevas:
podcast with two other people.
Christina Lewellen:
We think so too.
Hiram Cuevas:
So she always takes every opportunity to mock
Hiram Cuevas:
me whenever possible. I am jealous of Cameron that she has
Hiram Cuevas:
a toddler because they still love you back then.
Christina Lewellen:
Yeah, that's right. Well, before we get to
Christina Lewellen:
Cameron and our other guests that we have today, I did want
Christina Lewellen:
to let you guys know that not long after I wrap up recording
Christina Lewellen:
with you, I'm very shortly heading to the airport. I have
Christina Lewellen:
some CEO friends who have, I don't know if kidnapped is the
Christina Lewellen:
right word, but they basically said, You're not about to change
Christina Lewellen:
jobs with no time out, with no moment to celebrate, with no
Christina Lewellen:
moment on the beach, and so I'm heading to Puerto Rico this
Christina Lewellen:
afternoon.
Hiram Cuevas:
Oh, nice. Jealous.
Bill Stites:
Notice how she invited us, Hiram. It's not like
Bill Stites:
we'd spend so much quality time with her. It's like, I know
Bill Stites:
she's after the Coquito, and you're Puerto Rican, Hiram. I
Bill Stites:
mean seriously,
Christina Lewellen:
it is about the Coquito first and foremost,
Christina Lewellen:
although it's not exactly the season for Coquito, but I will
Christina Lewellen:
say that we will have to take a rain check on a poolside podcast
Christina Lewellen:
here at Talking Technology with ATLIS. I don't think our
Christina Lewellen:
producer would be very happy. I don't think our editor would be
Christina Lewellen:
very happy. And you know, while I would love you guys to be
Christina Lewellen:
there, it is, as it turns out, a girls' trip, so I'm not sure how
Christina Lewellen:
much you would love it, especially our last pod, I
Christina Lewellen:
talked about how I had adult beverages with Brooke and Bill
Christina Lewellen:
when I was up in New Jersey, and so Brooke and I started going
Christina Lewellen:
down the path of Girl Talk, which, you know what, that poor
Christina Lewellen:
woman deserves a little bit of Girl Talk in your household, and
Christina Lewellen:
poor Bill is just sitting there with wide eyes like I don't know
Christina Lewellen:
that I need to know about all of this midlife change with women,
Christina Lewellen:
things going on in this conversation. So I don't think
Christina Lewellen:
you guys could hang. That's what I'm saying, Coquito or not. I
Christina Lewellen:
don't think you guys could hang with me and my girlfriends in
Christina Lewellen:
Puerto Rico. Just saying
Hiram Cuevas:
You have little faith.
Christina Lewellen:
Listeners, I'd like to hear your feedback.
Christina Lewellen:
Do we think that Bill and Hiram could hang with the women in
Christina Lewellen:
Puerto Rico? I think it's a big no. All right, guys, let's get
Christina Lewellen:
serious as much as I'd love to sit here and pick on you all
Christina Lewellen:
morning, we have some really great guests, and I'm really
Christina Lewellen:
excited about this podcast. Many of you who have listened for a
Christina Lewellen:
while know that we have, on occasion, brought together some
Christina Lewellen:
vendors in our space, some of our vendor partners, to talk
Christina Lewellen:
about a topic. We had one in particular that people really
Christina Lewellen:
loved, and we'll drop
Christina Lewellen:
it in the show notes, so that you can go back and listen to
Christina Lewellen:
this episode. We brought together some folks from
Christina Lewellen:
Blackbaud and some folks from Veracross, and together on the
Christina Lewellen:
same pod, we explored some trends. As some of the largest
Christina Lewellen:
SIS providers in our space, they see a lot. They work with all of
Christina Lewellen:
our schools, right, with many, many of our schools. And so as
Christina Lewellen:
we were sitting in our planning meeting a few months back, we
Christina Lewellen:
thought, let's do it again. Let's bring back Veracross and
Christina Lewellen:
Blackbaud so that we can have a conversation around kind of
Christina Lewellen:
what's shaking right now in our space. In particular, Bill and
Christina Lewellen:
Hiram, the three of us talked about wanting to talk about
Christina Lewellen:
privacy management in this era of AI. And so we're really
Christina Lewellen:
excited today to bring some representatives from these
Christina Lewellen:
companies, and it is now going to be my pleasure to introduce
Christina Lewellen:
them. We have Cameron from Blackbaud, we have Joe from
Christina Lewellen:
Veracross, and we also have Mike from Veracross. I'm going to let
Christina Lewellen:
you guys introduce yourselves. Let's start with Cameron. Tell
Christina Lewellen:
us a little bit about what your role is, and also, welcome to
Christina Lewellen:
the podcast.
Cameron Stoll:
Thank you for having me. Excited to be here at the second
Cameron Stoll:
annual Hunger Games, where the winning, surviving vendor
Cameron Stoll:
will prevail.
Christina Lewellen:
That's hilarious and so
Christina Lewellen:
appropriate. Thank you. Okay, this is already great. You're
Christina Lewellen:
in. You win.
Cameron Stoll:
Okay, great, that's a wrap. I am Cameron
Cameron Stoll:
Stoll. I am Blackbaud's Chief Privacy Officer. I manage our
Cameron Stoll:
privacy and AI legal teams, so a group of nerds who just look at
Cameron Stoll:
data regulation all day long to help us inform our product teams
Cameron Stoll:
and our data teams and our AI teams. I've been with Blackbaud
Cameron Stoll:
for almost 10 years, which is kind of unimaginable. I am a
Cameron Stoll:
lawyer by trade, so I did some unspeakable years in big law
Cameron Stoll:
firms, and then came in house, and I am absolutely never
Cameron Stoll:
leaving.
Christina Lewellen:
Thank you. And now let's go over to Joe and
Christina Lewellen:
Mike,
Michael Martell:
thanks for having us. My name is Mike
Michael Martell:
Martell. I have also been working for Veracross for almost 10
Michael Martell:
years. So Cameron, we're going to have to compare notes, see
Michael Martell:
when we actually started. From a Hunger Games perspective, I
Michael Martell:
think you're outnumbered, Cameron, unfortunately. So
Michael Martell:
probability is on our side for the Veracross team, and that's
Michael Martell:
very true. So I am not a lawyer, but I am an engineer and a
Michael Martell:
physicist. Those are my backgrounds. How I ended up
Michael Martell:
working in the ed tech space is a very interesting story, but I
Michael Martell:
started about 10 years ago by building out our cloud
Michael Martell:
infrastructure and building a DevOps team deeply embedded in
Michael Martell:
engineering, and then decided that I wanted to spend more of
Michael Martell:
my time focused on security, privacy and compliance. So I
Michael Martell:
built out our security and compliance team, and now I am
Michael Martell:
our Chief Information Security Officer emeritus, as I am moving
Michael Martell:
over to build out a business operations function, and am
Michael Martell:
happy to welcome Jo on board.
Jo Bentley:
Thank you very much. It's a pleasure to be
Jo Bentley:
here. I am Jo Bentley, and I am the new CISO for Veracross. I
Jo Bentley:
have all of, I think, four-plus weeks' experience at Veracross.
Jo Bentley:
So I may not be able to answer some of the really hard
Jo Bentley:
questions as it comes to Veracross, but I can answer the
Jo Bentley:
questions as it comes to cybersecurity. I am, through and through, a
Jo Bentley:
cybersecurity professional. I've done all permutations of
Jo Bentley:
cyber, and a level of privacy, but not to Cameron's level. And I
Jo Bentley:
have built out to date about three highly successful
Jo Bentley:
information and cybersecurity programs across various sectors
Jo Bentley:
and industries, from financial services, hospitality,
Jo Bentley:
and student loans. So I do have some insights into independent
Jo Bentley:
schools, but I come at it from a student loans point of view.
Christina Lewellen:
Now I want to start with the fact that
Christina Lewellen:
Cameron and I, at BBCON last year, did this cool panel,
Christina Lewellen:
and that's where some of this idea came from. And as the
Christina Lewellen:
lawyer on the panel, she was kind of like, look, you know, I
Christina Lewellen:
hate to be the bearer of bad news. And yeah, you know, I
Christina Lewellen:
thought that the way that she delivered a lot of those
Christina Lewellen:
messages was very reasonable and accessible. And so that's where
Christina Lewellen:
I want to start, because that's where the idea originally came
Christina Lewellen:
from with me and with our planning is like, wow, I'd love
Christina Lewellen:
to get some of that on the record. And so Cameron, if I
Christina Lewellen:
could go to you first, obviously not asking you to recreate an
Christina Lewellen:
entire panel that we did. But a big part of what stood out to me
Christina Lewellen:
is that you know, you are kind of sounding some alarms in terms
Christina Lewellen:
of your client base and in terms of what you're seeing. So can
Christina Lewellen:
you tell me a little bit about, maybe just what keeps you up at
Christina Lewellen:
night as someone with your background, and obviously you're
Christina Lewellen:
representing to make sure that your clients are safe. So what
Christina Lewellen:
is it about what your clients are doing, or what are the
Christina Lewellen:
issues that you're thinking about the most from where you
Christina Lewellen:
sit in the Blackbaud ecosystem?
Cameron Stoll:
Wow. Well, if you have eight hours to cover what keeps
Cameron Stoll:
me up at night, I think the number one thing is, I'm still
Cameron Stoll:
hearing from schools, "We have yet to finalize our AI policy,"
Cameron Stoll:
and what that means is, all of your teachers are using AI that
Cameron Stoll:
you haven't approved, because we are three years almost to the
Cameron Stoll:
month down this road of the generative AI explosion. So if
Cameron Stoll:
you don't yet have a policy with a firm posture about what AI you
Cameron Stoll:
can use, then they are going to be using it without your
Cameron Stoll:
approval, which means without your controls, without any
Cameron Stoll:
governance. They don't know what data to put in it and what data
Cameron Stoll:
not to put in it. So I think that is probably the lowest-
Cameron Stoll:
hanging, biggest-priority fruit for me: to have a
Cameron Stoll:
stance, even if that position is going to develop over time. Not
Cameron Stoll:
yet having a position because you're risk averse doesn't mean
Cameron Stoll:
that you aren't already entertaining all of that risk
Cameron Stoll:
and inviting all of that risk into your ecosystem. So I think
Cameron Stoll:
that is one big early point I would make for schools. The
Cameron Stoll:
conversation around privacy right now is probably 75% AI, so
Cameron Stoll:
I would have to say that would be the first place I would
Cameron Stoll:
start. It's not the only place I would go, but it's a really good
Cameron Stoll:
place to start.
Christina Lewellen:
Can I ask the same question then of our
Christina Lewellen:
Veracross friends, what is it that's keeping you guys up at
Christina Lewellen:
night? Does that ring true? Pretty similar ideas.
Michael Martell:
Yeah. So I typically come at this question
Michael Martell:
from a technology perspective, because I'll always be an
Michael Martell:
engineer at heart. We struggle to draw a boundary around AI. I
Michael Martell:
feel like, with a lot of other traditional technologies, we
Michael Martell:
were able to differentiate the technologies and put it into a
Michael Martell:
category and put it into a bucket. But AI just breaks
Michael Martell:
through a lot of those categories, a lot of our
Michael Martell:
security perimeters because of its ability to access so many
Michael Martell:
different types of data, reason over data, and connect with
Michael Martell:
other systems. And that's probably the thing that keeps me
Michael Martell:
up most at night. Is not necessarily the technology
Michael Martell:
itself, but all of the connections that the technology
Michael Martell:
has to various systems, auditing those connections, understanding
Michael Martell:
and providing governance for those connections, and I'll echo
Michael Martell:
what Cameron said, many of our schools are still catching up.
Michael Martell:
They don't quite understand just the scope of what's going on
Michael Martell:
here, and they need to figure out a way to express their
Michael Martell:
values in a policy that helps to prioritize and protect what
Michael Martell:
matters most to them, while also providing access to a
Michael Martell:
revolutionary technology, which is AI, and that's a really, I
Michael Martell:
mean, it's a traditionally tricky balance, because there's
Michael Martell:
always going to be tension between security, privacy
Michael Martell:
compliance and the ability to innovate and the ability to
Michael Martell:
educate. But the fact that drawing that security perimeter,
Michael Martell:
that boundary, can be so difficult with AI tools, I
Michael Martell:
think, really amplifies the struggle.
Christina Lewellen:
So that we don't bury the lede on this
Christina Lewellen:
episode, can I ask the follow up in terms of the punch line and
Christina Lewellen:
get to it right now, which is: if there were one or two things
Christina Lewellen:
that you would hope to see schools do in the next 12 months
Christina Lewellen:
to respond to those things that are keeping you up at night.
Christina Lewellen:
What would it be?
Jo Bentley:
I would say, just leaning into AI, have a clear
Jo Bentley:
policy and have a governance structure. But apart from that,
Jo Bentley:
as you continue to engage in AI, it's really exciting. Putting a
Jo Bentley:
tech hat on, it is uber-exciting. It's seamless. It's easy to
Jo Bentley:
engage with it. It answers all of those questions that most of
Jo Bentley:
us had dreamt of asking without having to go to an encyclopedia
Jo Bentley:
or whatever else. But that simplicity, that ease, equally
Jo Bentley:
translates to a loss of data, a compromise of an environment, an
Jo Bentley:
ill use of the data. So if anything, be clear about what
Jo Bentley:
your intentions are, include it in a policy. Have a structure
Jo Bentley:
that allows you to continuously understand how you're
Jo Bentley:
interacting. But I will also add: have fun with it, you know?
Bill Stites:
AI is in every episode that we talk about.
Bill Stites:
Outside of AI, what are the things that you're thinking
Bill Stites:
about here? Because it's something that, when we think
Bill Stites:
about risk, when we think about what we're looking at, when you
Bill Stites:
know, you mentioned evaluating and reviewing a lot of these
Bill Stites:
tools outside of AI, like, what is it that you're worried about
Bill Stites:
at this point, or should we be worried about?
Jo Bentley:
For me, outside of AI, the sheer amount of
Jo Bentley:
regulatory requirements that we have to interact with, both from
Jo Bentley:
a cyber and a privacy point of view, and the changes those
Jo Bentley:
requirements are introducing into the ecosystem is dizzying,
Jo Bentley:
and for someone like me, that keeps me up at night. When
Jo Bentley:
you're a global company, there are various permutations of
Jo Bentley:
these requirements that you have to adhere to, and that in
Jo Bentley:
itself, is a full time job just keeping tabs on what one country
Jo Bentley:
requires versus the other, how to appropriately engage both on
Jo Bentley:
the privacy side of the House and then on the cyber side of
Jo Bentley:
the house, because, God forbid, they're the same. Generally
Jo Bentley:
they're not. They are distinct requirements, and they're coming
Jo Bentley:
fast and furiously.
Michael Martell:
I'd pick up on that, and say somewhat related
Michael Martell:
again, from a technology perspective, aside from AI
Michael Martell:
perimeters, it's identity. We've been talking about identity
Michael Martell:
being the new perimeter for security for quite a while, even
Michael Martell:
before the AI boom about three years ago, the idea that
Michael Martell:
identities can be very easily faked at this point, and the
Michael Martell:
idea that a zero trust framework is table stakes at this point
Michael Martell:
when it comes to identity verification, I've found that
Michael Martell:
many believe that a zero trust paradigm is this technology you
Michael Martell:
can buy, but it isn't. It's a way of thinking. It's a way of
Michael Martell:
designing systems, and I feel like it is more important than
Michael Martell:
ever, now that even the ability to have an identity that is
Michael Martell:
unique to yourself is being threatened every day online.
Cameron Stoll:
I would say, Bill, just to address your
Cameron Stoll:
question head on, other than AI, there is still a lot of movement
Cameron Stoll:
in the privacy field. And to Jo's point, we are seeing an
Cameron Stoll:
explosion of regulation, and it feels like in the past couple of
Cameron Stoll:
years, lawmakers worldwide have discovered a few things: (a)
Cameron Stoll:
technology can be harmful for children, not a surprise,
Cameron Stoll:
probably, and (b) some vendors, not so much vendors like Veracross and
Cameron Stoll:
Blackbaud, which act as very strict data processors, but
Cameron Stoll:
vendors you might use that are actually data controllers,
Cameron Stoll:
they're collecting a ton of data about your kids and from your
Cameron Stoll:
kids, and they're using it in whichever ways they want. And so
Cameron Stoll:
these have become uniting principles for a lot of
Cameron Stoll:
politicians that agree on absolutely nothing else. And so
Cameron Stoll:
you are seeing an explosion of regulation. We see very little
Cameron Stoll:
movement on the federal stage because we can't seem to pass
Cameron Stoll:
anything. So as a result, these laws are very localized and very
Cameron Stoll:
specific. So you're going to have laws that apply to only
Cameron Stoll:
public entities. You're going to have laws that apply only to
Cameron Stoll:
consumer facing entities, and then you're going to have laws
Cameron Stoll:
that maybe try to regulate different types of technology
Cameron Stoll:
use, like social media. And so when you have these laws that
Cameron Stoll:
are not broad and they're not federal, you get a lot of them,
Cameron Stoll:
and that is very challenging for both schools and for vendors,
Cameron Stoll:
because the way jurisdiction works for many of these
Cameron Stoll:
organizations that regulate in the US, schools are not subject
Cameron Stoll:
to them, but then you have laws that do apply to their vendors.
Cameron Stoll:
So you're stuck in this position where maybe the school is having
Cameron Stoll:
to follow principles that the vendor's subject to, even though
Cameron Stoll:
the laws aren't necessarily designed for them, which really
Cameron Stoll:
doesn't do anyone a lot of good. It's not helping
Cameron Stoll:
the schools. It's not helping the vendors, because it is this
Cameron Stoll:
complex patchwork of things that we have to build into our
Cameron Stoll:
products and try to keep customers apprised of and it is
Cameron Stoll:
challenging, and it is super fast moving.
Hiram Cuevas:
So Cameron, what was fascinating is I attended
Hiram Cuevas:
that webinar from the Responsible AI Institute. I was
Hiram Cuevas:
the only K to 12 participant in the audience. Everybody else was
Hiram Cuevas:
corporate, and it was fascinating talking about
Hiram Cuevas:
the complexities that you've just articulated from a global
Hiram Cuevas:
framework that corporate is also struggling with this. And it
Hiram Cuevas:
brings me a little bit of solace to know that, even as a K to 12
Hiram Cuevas:
school with just over 1,000 boys, you know, we're in a very
Hiram Cuevas:
similar place, but at least we are engaging in the
Hiram Cuevas:
conversations that people like Bill and myself have to deal
Hiram Cuevas:
with without a cyber staff. We don't have a chief privacy
Hiram Cuevas:
officer, we don't have a cyber security team. We are it, we're
Hiram Cuevas:
vetting, we're pulling wire, we're doing it all.
Christina Lewellen:
Hiram, does that make you, like, more
Christina Lewellen:
reliant on your vendors?
Hiram Cuevas:
Absolutely. And you know, I think about the pathways that
Hiram Cuevas:
Veracross and Blackbaud have taken in terms of something like
Hiram Cuevas:
AI and some of the policies that they're building, and I think
Hiram Cuevas:
it's safe to say, you know, they're doing it slowly. They're
Hiram Cuevas:
rolling things out slowly because there's intentionality
Hiram Cuevas:
behind it. And I would love to hear some more about the why
Hiram Cuevas:
behind that intentionality as it relates to this complexity,
Michael Martell:
To answer your question directly, Hiram, as to
Michael Martell:
the why behind the intentionality, there are two
Michael Martell:
different sources of that intentionality. One has to do
Michael Martell:
with the way in which we adopt AI as a corporate entity, and that
Michael Martell:
comes down to what I was mentioning before, really trying
Michael Martell:
to carefully understand the data perimeter, trying to carefully
Michael Martell:
separate our customers' data (as Cameron pointed out, we are a
Michael Martell:
processor) from our corporate data, and making sure that our
Michael Martell:
corporate uses of AI are done in a very thoughtful and careful
Michael Martell:
way that strictly maintains that separation, while also allowing
Michael Martell:
us to take advantage of some of the tools that are fairly
Michael Martell:
revolutionary, even in the past few months compared to what they
Michael Martell:
were previously. So that's the first source. The second source
Michael Martell:
actually echoes back to the complex patchwork of global
Michael Martell:
compliance frameworks that we're all subject to, and that is AI
Michael Martell:
in our product, that is an even slower roll, because not only do
Michael Martell:
we have to be very careful about those security perimeters and
Michael Martell:
understanding what it is that the technology will do, but also
Michael Martell:
we have to be respectful of the requirements and the rights that
Michael Martell:
each one of our customers brings, which is very different
Michael Martell:
for a school in Virginia versus a school in Düsseldorf or a
Michael Martell:
school in Australia, and trying to marry up all of those
Michael Martell:
requirements takes time and takes thought and has a real
Michael Martell:
impact on the product itself, because there are additional
Michael Martell:
technical requirements that have to be met in order to be able to
Michael Martell:
safely apply an AI reasoning technology to customer data.
Jo Bentley:
I think I'll add something a bit simpler to
Jo Bentley:
that, which is really being careful about what your use
Jo Bentley:
cases are. As much as we would like to think, with AI appended
Jo Bentley:
to the beginning of every sentence, that AI does it all, in
Jo Bentley:
reality, it doesn't. You have to be clear
Jo Bentley:
about how you intend to use it and for what purpose and what
Jo Bentley:
value it brings to the overall delivery chain. So that in
Jo Bentley:
itself is extremely important: understand your use case, and
Jo Bentley:
ensure that you're delivering the value that you actually
Jo Bentley:
intend to to your customer base, so that there's that level of
Jo Bentley:
intentionality. What is it? Why?
Cameron Stoll:
I would also say there is something very unique about
Cameron Stoll:
serving this customer base that changes some of the
Cameron Stoll:
normal R&D steps. So you
Cameron Stoll:
have unique use cases. You also have a very
Cameron Stoll:
vulnerable data subject population, which are our
Cameron Stoll:
students, and you have a generally fairly risk averse
Cameron Stoll:
group of organizations. And to give you an example, there are
Cameron Stoll:
state laws that are being passed and were passed in 2025, so two
Cameron Stoll:
years after ChatGPT really entered the scene, there are
Cameron Stoll:
laws in different states being passed that say: form a task
Cameron Stoll:
force about how we should be using AI two years after the
Cameron Stoll:
explosion, and also laws saying you cannot use AI in certain
Cameron Stoll:
school settings, and so you have conflicting advice and
Cameron Stoll:
conflicting requirements coming from state-level legislatures,
Cameron Stoll:
and I think it's leaving schools very confused, which just adds
Cameron Stoll:
to that hesitation around adopting AI. And so when we
Cameron Stoll:
build, you know, we're two companies that are building AI
Cameron Stoll:
specifically for social good and for educational purposes, versus
Cameron Stoll:
OpenAI and Anthropic. They are building a Swiss Army knife for
Cameron Stoll:
all different types of utilization. We are building fit-
Cameron Stoll:
for-purpose AI, which takes longer. There's more analysis of
Cameron Stoll:
customer sentiment and how they're going to use these tools
Cameron Stoll:
and what they actually need. Like, what do they need in the
Cameron Stoll:
classrooms? What do they need in the back office? Versus, oh, I
Cameron Stoll:
have AI chat. This is really cool, and I can serve all manner
Cameron Stoll:
of things, right? OpenAI asked people to start submitting their
Cameron Stoll:
medical files into ChatGPT to get medical advice. Then, of
Cameron Stoll:
course, the first comment was okay, but this is not
Cameron Stoll:
appropriate for medical records because OpenAI is not regulated
Cameron Stoll:
by the HHS. They're not subject to HIPAA, and so they don't need
Cameron Stoll:
to comply with HIPAA, so you can put your medical records in
Cameron Stoll:
there. It's fine, but it's not a fit-for-purpose solution, and so
Cameron Stoll:
I think that sort of adds to some of the challenges and
Cameron Stoll:
longer, more thoughtful development cycle.
Christina Lewellen:
Yeah, Cameron, when
Christina Lewellen:
I'm out on the road, I always use the analogy of, like, the
Christina Lewellen:
early days of social media, and we were assured that our data
Christina Lewellen:
would be protected, it would not be monetized, but there was no
Christina Lewellen:
oversight, there were no implications. And then, lo and
Christina Lewellen:
behold, 18 or 20 years later, we found out, Oh, oops, our data
Christina Lewellen:
was monetized. And we were told it wasn't going to be back then,
Christina Lewellen:
when we started wondering about that, like, Hmm, what are they
Christina Lewellen:
doing with all this? Right? And so I feel like we're in that
Christina Lewellen:
same moment with AI. It's like, you know, just because the
Christina Lewellen:
current leaders of those organizations or those companies
Christina Lewellen:
say, Oh no, no, no, we're not training the model or whatever,
Christina Lewellen:
who's to say that they're not or that they won't? Like, there are
Christina Lewellen:
no implications if they do. So I'm with you on that.
Bill Stites:
Years ago, I did a presentation at ATLIS called Not
Bill Stites:
an Attorney, But I Brought One with Me, to the conference with
Bill Stites:
Adam Griffin, who's a good friend of ATLIS, an attorney,
Bill Stites:
because at that point, there were things that we were talking
Bill Stites:
about that as we were trying to make sense of whether it was
Bill Stites:
COPPA or whether it was issues around our ability to digitize
Bill Stites:
media, and so on and so forth. You know, we were trying to
Bill Stites:
interpret what we were reading, because we're not lawyers,
Bill Stites:
because we don't have that CISO hat on, and nothing has changed.
Bill Stites:
As all of you have mentioned, it's only
Bill Stites:
gotten worse with regards to the compliance issues that come out,
Bill Stites:
particularly with the complications of AI. So I'm
Bill Stites:
looking for, from each of you, a really practical
Bill Stites:
direction to go in when we start engaging with vendors like
Bill Stites:
yourselves, who are holding on to large amounts of data from a
Bill Stites:
PII perspective, or from an academic or financial
Bill Stites:
perspective, depending on which one of the modules you have, if
Bill Stites:
we're dealing or looking at companies that deal with that,
Bill Stites:
what are some of the first things that we need to be
Bill Stites:
focused on as we start our process of review and vetting
Bill Stites:
for each one of those?
Michael Martell:
In the 10 years, or almost 10 years, that I've been
Michael Martell:
working in this space, we've seen this explosion of
Michael Martell:
requirements, and we have schools who have a lot of
Michael Martell:
expertise and a lot of resources that they put towards compliance
Michael Martell:
and privacy. They have internal counsel. And then we have
Michael Martell:
schools on the other end of the spectrum who it's, you know, one
Michael Martell:
person doing many things, and on average, it's somewhere in
Michael Martell:
between those two. I would say: approach a vendor after
Michael Martell:
doing a little bit of research about what regulations you're
Michael Martell:
subject to, depending upon what jurisdiction you're in, have a
Michael Martell:
general idea of what the requirements are, and then come
Michael Martell:
to your vendor with a simple set of questions: how will you
Michael Martell:
help me to meet these requirements, understanding that
Michael Martell:
the onus typically is on the school who, from a GDPR
Michael Martell:
framework, is a controller, and that the school's responsibility
Michael Martell:
is to select vendors that meet the requirements that they are
Michael Martell:
subject to. Just understanding that is a really big step
Michael Martell:
towards understanding how to have a really good relationship
Michael Martell:
with a vendor from a security and a privacy perspective. And
Michael Martell:
frankly, it echoes the same process that we go
Michael Martell:
through with our own third-party vendors: understanding first
Michael Martell:
what requirements we're subject to within a given jurisdiction
Michael Martell:
and within a given type of data, and then going to our vendors
Michael Martell:
and having a dialog with them about, how do you help us to
Michael Martell:
meet our requirements?
Cameron Stoll:
Yeah, audit reports. Take a look at the
Cameron Stoll:
audit reports. I mean, at a minimum, SOC 2, and PCI for
Cameron Stoll:
payment processors, for any payment data. I think that
Cameron Stoll:
there's so much complexity in these laws, but ultimately,
Cameron Stoll:
there are really good standards that already exist that maybe
Cameron Stoll:
aren't legally binding standards, but are great
Cameron Stoll:
frameworks for best practices. So NIST is an amazing source.
Cameron Stoll:
They call them crosswalks: if you take the
Cameron Stoll:
requirements of NIST and crosswalk it against HIPAA Security,
Cameron Stoll:
crosswalk it against CCPA and the new security regs that just came
Cameron Stoll:
out a few months ago, they're very, very similar. And so I
Cameron Stoll:
think vendors that talk about adherence to a particular
Cameron Stoll:
standard like this, such as ISO, which is a great international standard,
Cameron Stoll:
will help a lot. I think that'll get you 99% of the way
Cameron Stoll:
there. There are a couple of local state laws that might have
Cameron Stoll:
some deviations, but I think that's a really good place to
Cameron Stoll:
start. And it's very easy to ask for if you remember it: I'm
Cameron Stoll:
looking for NIST CSF and NIST AI RMF, which is their risk
Cameron Stoll:
management framework for AI. Those are very easy to remember.
Cameron Stoll:
Companies that adhere to those, again, will cover so many of
Cameron Stoll:
your bases. The other thing that I would really focus on is
Cameron Stoll:
vendors with purpose limitation language in their contracts, and
Cameron Stoll:
that is a privacy principle which basically means you can
Cameron Stoll:
use personal data only for these things. And if a vendor
Cameron Stoll:
has purpose limitation language that is super broad, that is
Cameron Stoll:
pretty much anything other than providing you with what you
Cameron Stoll:
bought, really examine those closely. I think that will
Cameron Stoll:
help you understand this vendor's posture with respect to
Cameron Stoll:
how they're going to use the data. Are they going to use your
Cameron Stoll:
student data for promotional purposes, right? To show ads to
Cameron Stoll:
kids, to sell to data brokers? There are any number of horrors
Cameron Stoll:
out there. And so if you look at the purpose limitation
Cameron Stoll:
language in your contract, I think that will go a long way
Cameron Stoll:
too.
Jo Bentley:
I'll cap it really quickly. First, I'll go to Bill
Jo Bentley:
and say: this CISO is not a lawyer. So when I turn up, I
Jo Bentley:
would have to turn up with a lawyer as well, but more
Jo Bentley:
practically, augmenting everything Mike and Cameron have
Jo Bentley:
said, first and foremost, it seems simple, but it's
Jo Bentley:
important, what problem am I trying to solve? What vendors
Jo Bentley:
solve that problem? And once you've established that, and you
Jo Bentley:
go into conversations about due diligence and things like that,
Jo Bentley:
it becomes a case of whatever problem you're trying to solve,
Jo Bentley:
you're going with your asset, the most important thing to you.
Jo Bentley:
So always remember in the room, it's your asset. Whoever's going
Jo Bentley:
to engage with it, is required to protect it. So you'll be
Jo Bentley:
looking for, first: are they meeting the needs of the
Jo Bentley:
problem I've defined? Do they meet those needs? And whatever
Jo Bentley:
asset I entrust to whoever the vendor is, are they going to
Jo Bentley:
protect it in the way I would have done myself, very simply.
Jo Bentley:
The other thing to remember, though, is when you entrust an
Jo Bentley:
asset to someone else, you are still accountable for that
Jo Bentley:
asset. So that goes to Cameron's point: look at audit
Jo Bentley:
reports, look at audit trails, because you also want to know
Jo Bentley:
that anyone you're entrusting an asset to is doing exactly what you
Jo Bentley:
expect them to do. Out of sight is not out of mind. With your
Jo Bentley:
most important asset, you must know that they're doing exactly
Jo Bentley:
what you expect them to do, and have a way to engage them so
Jo Bentley:
that they can prove to you that they are doing exactly that: they
Jo Bentley:
are meeting the requirements of the law, they're meeting all
Jo Bentley:
your data requirements. They're meeting your recovery
Jo Bentley:
requirements, in the event, for some reason, they're no longer
Jo Bentley:
there, and if you've given them your data, how do you get your
Jo Bentley:
data back when you no longer want to do business with them?
Jo Bentley:
So, in very practical, simple terms: we can overlay with regs,
Jo Bentley:
We can talk about the complexity. Remember, it's your
Jo Bentley:
assets. Remember you need it treated well, and you need proof
Jo Bentley:
that it's handled in the way that you expected it to be.
Hiram Cuevas:
So when you're working with the third
Hiram Cuevas:
party vendors at your respective corporations, there's likely an
Hiram Cuevas:
opportunity for you all to create an addendum based on the
Hiram Cuevas:
requirements that your respective institutions are
Hiram Cuevas:
interested in. Similarly, I'm finding in the conversations
Hiram Cuevas:
that tech directors are having around the country, we're having
Hiram Cuevas:
to customize our individual vetting and frameworks based on where we are.
Hiram Cuevas:
In Virginia, a whole bunch of new legislation just came out,
Hiram Cuevas:
and I need to be cognizant of those changes. We have alums
Hiram Cuevas:
that are in the European Union, so I've got to be aware of GDPR.
Hiram Cuevas:
We've got folks in Australia. When you start mentioning this
Hiram Cuevas:
to the senior exec team, they don't have that wherewithal
Hiram Cuevas:
about the impact of data regulations across product
Hiram Cuevas:
lines. Do you encourage schools, in looking at
Hiram Cuevas:
these frameworks, to add these addendums to protect the
Hiram Cuevas:
schools, such that we can have a little bit more peace of mind?
Hiram Cuevas:
And should that come from an attorney, as opposed to a Bill
Hiram Cuevas:
Stites or a Hiram Cuevas, who are tech directors? We have lots of
Hiram Cuevas:
frameworks that we use and share amongst our communities, but I
Hiram Cuevas:
think this actually has to come more from an attorney. I think
Hiram Cuevas:
what our schools are really asking for is, who do I ask to
Hiram Cuevas:
help me develop these addendums and actually pass that
Hiram Cuevas:
information on? What would be the best practice?
Michael Martell:
This is something that I work with
Michael Martell:
literally every day, even today. Again,
Michael Martell:
the strategy that I would advise schools to take is to find
Michael Martell:
counsel that you trust, that is going to provide you with easy
Michael Martell:
to understand advice and guidance about the environment
Michael Martell:
you operate in, not necessarily someone who's going to list off
Michael Martell:
a million bullets of legalese, but someone who can understand
Michael Martell:
what your operating conditions are. You know, Virginia or
Michael Martell:
Germany or New South Wales, wherever you are, and then
Michael Martell:
provide back to you practical guidance about what you need to
Michael Martell:
look for. And then to your vendor. We do this every day.
Michael Martell:
Veracross maintains a DPA. We maintain other papers that are
Michael Martell:
privacy and compliance related. And when a school comes to us
Michael Martell:
and asks us and provides us with that succinct, practical
Michael Martell:
guidance, we are always willing to work with them to make
Michael Martell:
updates, changes or addendums, to make a specific version of a
Michael Martell:
contract or a specific version of a DPA that meets that school's
Michael Martell:
requirements, and we continue to do so today, but for schools,
Michael Martell:
the most efficient way to do this is again, sort of getting
Michael Martell:
back to what I said earlier: know the environment you
Michael Martell:
operate in, bring someone in who can help to bridge the gap
Michael Martell:
between your practical day to day experience and the ever
Michael Martell:
changing regulatory framework and knows the right keywords,
Michael Martell:
and then someone who can consume a DPA or a contract or an MSA or
Michael Martell:
a privacy policy and actually do that mapping between the
Michael Martell:
concepts that matter most and how they're expressed typically in
Michael Martell:
these privacy papers. I would also say another practical step
Michael Martell:
that a school could take when they're in this discovery phase
Michael Martell:
before they engage with a vendor is to take a look at the way
Michael Martell:
that the vendor expresses security information publicly. I
Michael Martell:
found that you can really take the temperature of a vendor. Are
Michael Martell:
they speaking to you in simple terms? Do they understand that
Michael Martell:
you aren't a DPO or something like that? Are they open about
Michael Martell:
what they're doing with your data? Do they have something
Michael Martell:
like a Trust Center that you can access or request access to?
Michael Martell:
And I mean, we do this internally as well with our own
Michael Martell:
vendors. When we're selecting third-party vendors, taking the
Michael Martell:
security, privacy, and compliance temperature
Michael Martell:
of that vendor, and coupling that up with someone who can
Michael Martell:
provide that translation, really goes a long way towards being
Michael Martell:
efficient and being open when it comes to engaging with your
Michael Martell:
vendors about things like custom addendums or other changes that
Michael Martell:
might need to be made to the relationship that you have with
Michael Martell:
your vendor to make sure data is protected the way that it should
Michael Martell:
be.
Cameron Stoll:
I would also add, and not to throw shade on my fellow global
Cameron Stoll:
lawyers, we get very precious about our unique little laws that
Cameron Stoll:
we have drafted and passed, but we're really working with a
Cameron Stoll:
pretty common set of principles here. So if you are working with
Cameron Stoll:
a vendor that is a large vendor, maybe a global vendor, most
Cameron Stoll:
likely they have tailored their DPAs to the most stringent laws
Cameron Stoll:
that apply to customers that they sell to. And so oftentimes,
Cameron Stoll:
we get this all the time. Hey, I'm in Arkansas, and we have a
Cameron Stoll:
new student privacy law, and so we have to append this onto your
Cameron Stoll:
DPA. And if you go through point by point, maybe the language is
Cameron Stoll:
slightly different, but they're all the same concepts. You've
Cameron Stoll:
got that purpose limitation, you've got a data breach
Cameron Stoll:
notification provision, you've got security language. So of
Cameron Stoll:
course, they're each perfect, unique little unicorns, but
Cameron Stoll:
these laws really can be very similar, and there aren't a huge
Cameron Stoll:
number of deltas that need to be separately accounted for. The
Cameron Stoll:
other thing that's going to get me in a lot of trouble with the
Cameron Stoll:
lawyer mafia is if you have an enterprise version of an AI
Cameron Stoll:
tool, you can put these documents into the tool and ask
Cameron Stoll:
to see what the delta is. Of course, it is not a substitution
Cameron Stoll:
for legal advice, but it gives you a really good start. Hey,
Cameron Stoll:
this is what I'm required to do. This is what Blackbaud is
Cameron Stoll:
offering me, what is not covered. And then you have a
Cameron Stoll:
really good starting point where you can go to their trust team
Cameron Stoll:
(and our trust team deals with questions from
Cameron Stoll:
customers all the time) and say: where is this provision in your DPA?
Cameron Stoll:
Where is it covered? Hey, I need encryption of this. Where is
Cameron Stoll:
this addressed? And any good vendor is going to be able to
Cameron Stoll:
point out, oh, it's in number seven. And so I think that's a
Cameron Stoll:
really good place to start as well.
Christina Lewellen:
So I'm curious what you each think. I
Christina Lewellen:
mean, we're in this moment where AI is complicating things.
Christina Lewellen:
Global reach is complicating things. What do you think the
Christina Lewellen:
role is of ed tech companies in terms of offering security
Christina Lewellen:
services to schools, just making sure that we understand
Christina Lewellen:
cybersecurity and student data privacy issues. I mean, I would
Christina Lewellen:
imagine that you're doing more education than you ever have.
Christina Lewellen:
Correct me if I'm wrong, but I'm just really curious, what do you
Christina Lewellen:
think the appropriate role is for our vendor community to help
Christina Lewellen:
us out and get this right?
Michael Martell:
This is an interesting question, because
Michael Martell:
the answer, I feel has changed dramatically, even over the past
Michael Martell:
five years. I would say that when I started in this space,
Michael Martell:
around 10 years ago, the idea of an ed tech vendor providing
Michael Martell:
security advice or information to one of our customers was
Michael Martell:
pretty far fetched. First of all, there's the whole Thou
Michael Martell:
shalt not provide legal advice to your customers, which is
Michael Martell:
still true today, but also it just wasn't as front of mind
Michael Martell:
because schools just had different concerns 10 years ago,
Michael Martell:
even five years ago. From my perspective as a system of
Michael Martell:
record (SIS) vendor, these days there isn't a contract
Michael Martell:
negotiation that goes by without questions about security and
Michael Martell:
compliance, and it is top of mind for many people. And I
Michael Martell:
would say that a principle that I like to see vendors adopt,
Michael Martell:
that we've adopted as well, is the idea that a rising tide
Michael Martell:
lifts all ships, especially for our customers who don't have
Michael Martell:
entire teams of security and compliance professionals
Michael Martell:
available to them, we will have deep conversations with them to
Michael Martell:
help them understand what their own positioning is from a
Michael Martell:
technical and a legal perspective, although we still
Michael Martell:
don't really provide legal advice to our customers, but we
Michael Martell:
also take further steps to provide them with some technical
Michael Martell:
insight: what is your network security perimeter
Michael Martell:
looking like right now? We're seeing some interesting stuff
Michael Martell:
going on that's not actually within the box of Veracross, but
Michael Martell:
it's actually in your perimeter. Let's try to help you fix some
Michael Martell:
of those things, because nobody wants an account to be
Michael Martell:
compromised. That's not good for anybody. And if we can help our
Michael Martell:
schools get a bit of an edge in that space through technical,
Michael Martell:
consultative, and other approaches, we're absolutely
Michael Martell:
going to do that, because that investment is worth it for
Michael Martell:
everyone.
Jo Bentley:
Mike has said it extremely well, I think it's
Jo Bentley:
important to inform and educate, and in my mind, that's
Jo Bentley:
where the security services need to sit. I would say I
Jo Bentley:
would be remiss as a security professional if I didn't say
Jo Bentley:
that. I can't roll up my sleeves and walk into a school and
Jo Bentley:
interact with their network, but I can inform you and educate you
Jo Bentley:
on how best to proceed. I can point you to resources. I can
Jo Bentley:
have a webinar with you, but I can't go in and interact with
Jo Bentley:
your underlying structures. I think that's probably where they
Jo Bentley:
would need to be.
Christina Lewellen:
I want to make sure that I give you guys
Christina Lewellen:
each a few moments to talk about what's coming at you in the
Christina Lewellen:
context of your companies, and so what we will be able to look
Christina Lewellen:
forward to in terms of what's next. So you know, many of our
Christina Lewellen:
listeners are either Blackbaud or Veracross customers. So I
Christina Lewellen:
know that you stay in communication with them. I know
Christina Lewellen:
that you're talking to them about these issues on a regular
Christina Lewellen:
basis, but I guess I just want to give you the floor for a
Christina Lewellen:
moment. You know, you've got the mic. What would you want your
Christina Lewellen:
customers to know and what's coming next? What are you
Christina Lewellen:
thinking about? What are you working on? What is it that
Christina Lewellen:
you're helping get out ahead of for our community?
Cameron Stoll:
For Blackbaud, it's certainly about the agents. We have just
Cameron Stoll:
released to GA, just yesterday, a development agent, which is an
Cameron Stoll:
AI-powered fundraising virtual employee, and we are rolling out
Cameron Stoll:
agents in a variety of different solutions and different ways. So
Cameron Stoll:
I can't talk about specifics, because we're publicly traded,
Cameron Stoll:
but I will say that for some of the more predictable and
Cameron Stoll:
annoying workflows that y'all deal with in school
Cameron Stoll:
administration, perhaps admissions, we are certainly
Cameron Stoll:
looking to improve that experience and give you all a bit
Cameron Stoll:
more capacity to do the things that matter, and fewer manual,
Cameron Stoll:
annoying tasks that eat up your day. So I think that's where I
Cameron Stoll:
personally spend a lot of my time, is with the agentic team.
Cameron Stoll:
It's all very exciting. It requires a robust risk
Cameron Stoll:
framework, a governance framework, and certainly more of an
Cameron Stoll:
appetite, because there are certain places where a human is
Cameron Stoll:
not as in the loop as if you were just creating content in
Cameron Stoll:
ChatGPT and porting it over: the ability to send
Cameron Stoll:
emails, for example, or have a conversation. But that is very
Cameron Stoll:
exciting. On the less exciting front, for sure, it is keeping
Cameron Stoll:
up with all of these laws. We're seeing some attempted amendments
Cameron Stoll:
to COPPA, so there's COPPA 2.0. There is a talked-about
Cameron Stoll:
amendment to FERPA, which would be the 10th since the 1970s, to
Cameron Stoll:
update FERPA to accommodate AI; who knows if that's going to
Cameron Stoll:
pass or not. We're seeing these things called age appropriate
Cameron Stoll:
design codes. There's a very robust one in the UK, but some
Cameron Stoll:
US states are starting to adopt it, and that is basically
Cameron Stoll:
requirements around how you build user experience for
Cameron Stoll:
products that interact with children. So that is really a
Cameron Stoll:
huge focal point for us right now. They're getting blocked in
Cameron Stoll:
courts over all sorts of First Amendment concerns (not humans'
Cameron Stoll:
First Amendment rights, but corporations' First Amendment
Cameron Stoll:
challenges to online safety laws). So keeping tabs on these, plus all the AI
Cameron Stoll:
laws and cybersecurity laws that are coming out. But again, if
Cameron Stoll:
you refer back to a nice, robust standard, hopefully you can
Cameron Stoll:
accommodate all these.
Christina Lewellen:
So I'm going to turn it over to Veracross. But I
Christina Lewellen:
also just want to mention Cameron, it's interesting that
Christina Lewellen:
you're talking in this space of agentic AI. I've been on the
Christina Lewellen:
road a lot in the last couple years, naturally talking about
Christina Lewellen:
AI, and what I'm seeing in AI with schools. And I remember two
Christina Lewellen:
years ago, some schools would say, you know, we want to build
Christina Lewellen:
our bots, and then later on, they said, We want to build our
Christina Lewellen:
agents, and are you doing this? And I remember saying in those
Christina Lewellen:
earlier moments, I'm not going to do this. I don't have the
Christina Lewellen:
money to do this. The companies I pay, my subscription-service
Christina Lewellen:
companies, will do this, right? Like, let them do it. Let them
Christina Lewellen:
figure it out. Let them protect us. And chances are, just like that,
Christina Lewellen:
all of these bells and whistles would be baked into either our
Christina Lewellen:
Google Workspace or the suite of products that we're already
Christina Lewellen:
using, including our accounting and credit card
Christina Lewellen:
coding software, right? Like, they're auto-generating what
Christina Lewellen:
they believe the code would be for a transaction. And I'm just
Christina Lewellen:
going to let them do it right? And so it's really interesting
Christina Lewellen:
to hear that that's something that you guys are working on and
Christina Lewellen:
thinking about. Because not that I am a genius in the way of
Christina Lewellen:
tech, but when certain schools who are sitting on a good amount
Christina Lewellen:
of money were like, we're going to build this ourselves, I said,
Christina Lewellen:
Yeah, that's cool, but my guess is that the vendors you're
Christina Lewellen:
already paying are going to be doing this as well. So maybe, if
Christina Lewellen:
you're a smaller or less resourced school, hang tight,
Christina Lewellen:
because I'd be willing to bet that in very short order, the
Christina Lewellen:
things you're looking for, the bells and whistles, will be
Christina Lewellen:
baked into the base model, just because they're competing with
Christina Lewellen:
each other. And so I'm having a full circle moment where I'm
Christina Lewellen:
like, See guys, everybody who's listening, who heard me speak
Christina Lewellen:
two years ago, I told you to just hang tight. They were
Christina Lewellen:
working on it.
Cameron Stoll:
You feel so validated.
Christina Lewellen:
I do.
Cameron Stoll:
It is important to point out we made the decision, I
Cameron Stoll:
think last year, that nothing would be baked into the products by
Cameron Stoll:
default such that you don't know that you're using it.
Christina Lewellen:
That's an important distinction.
Cameron Stoll:
We took into consideration that many of our
Cameron Stoll:
customers are very risk averse, so you have to actually go in
Cameron Stoll:
and turn things on so you're not bringing shadow AI into your
Cameron Stoll:
network. You have to be authorized to do so, and you
Cameron Stoll:
have to affirmatively do that. So that is a choice we made,
Cameron Stoll:
just to be as transparent as possible, because you'll be
Cameron Stoll:
editing a PDF and be like, wait a minute, why is there
Cameron Stoll:
generative AI in my PDF editor? This is, A, unnecessary and, B, a
Cameron Stoll:
little sketchy.
Michael Martell:
I'll hop in and share some of the themes. We're
Michael Martell:
going in a lot of different directions, and we're very
Michael Martell:
excited about them, but there are two that I think will
Michael Martell:
resonate most with this overlap between security, compliance, and
Michael Martell:
AI. I'll start, I think, with the second one first. Helping our
Michael Martell:
customers find comfort in what we can bring to them from a
Michael Martell:
security and compliance perspective is very important to
Michael Martell:
us. I'll use Australia as an example. Australia has a very
Michael Martell:
interesting perspective on data privacy and security that is a
Michael Martell:
little bit different from a GDPR model. It's a little bit
Michael Martell:
different from the way in which many US states and other
Michael Martell:
jurisdictions adopt it. Having meaningful conversations
Michael Martell:
with our customers so we can bring to them what they need and
Michael Martell:
what their underwriters need and what their communities need, is
Michael Martell:
a big direction that we continue to go in. At the end of the day,
Michael Martell:
I feel like, as Cameron pointed out, there are themes and
Michael Martell:
principles that underlie all of these regulations and laws. And
Michael Martell:
being able to connect with people on a human level and help
Michael Martell:
to understand what their values are in that space is very
Michael Martell:
important to us. From an AI perspective: yes, Veracross is
Michael Martell:
working on AI tooling for our customers as well. The idea
Michael Martell:
behind it is that we like to think of ourselves as a
Michael Martell:
community operating system because, like many other large
Michael Martell:
ed tech spaces, we are just so embedded into our customers'
Michael Martell:
lives and their parents' lives and their students' lives. And so
Michael Martell:
being able to safely and transparently bring tools, agents,
Michael Martell:
chat, etc., to our customers that help to
Michael Martell:
support their community in a healthy way is a big part of
Michael Martell:
where we want to go from an AI perspective. And, to echo what
Michael Martell:
Cameron said, to do so in a transparent, safe, and opt-in
Michael Martell:
sort of way. I also get really annoyed when I'm editing a PDF
Michael Martell:
and something pops up and says, Would you like me to help you
Michael Martell:
write this? No, thank you, but maybe, maybe someday. So those
Michael Martell:
are just a couple of the directions that we're heading
Michael Martell:
in, trying to keep the human element forefront, because at
Michael Martell:
the end of the day, schools are communities. They're collections
Michael Martell:
of people who have shared values and a shared direction, and
Michael Martell:
trying to be able to reflect that in technology is something
Michael Martell:
that matters very much to us.
Christina Lewellen:
Okay, so I know we're running out of time,
Christina Lewellen:
and we could probably be here all day talking through some of
Christina Lewellen:
these details, and we'll make sure that all of your contact
Christina Lewellen:
information is in the show notes, so that people can reach
Christina Lewellen:
out to you if you've touched a nerve in any way with some of
Christina Lewellen:
the things you've said. But before I let you guys go, I know
Christina Lewellen:
that the three of us have been in the chat, framing up a final
Christina Lewellen:
question for you, and that is this, in this moment that we're
Christina Lewellen:
sitting in right now with AI with all the privacy issues, how
Christina Lewellen:
would you describe a gold standard, whatever that means to
Christina Lewellen:
you, for schools who are really trying to pursue excellence and
Christina Lewellen:
make sure that their schools and their student community are
Christina Lewellen:
protected. What's the gold standard?
Jo Bentley:
It really depends on where you are, and I'll explain
Jo Bentley:
why I say that. Instinctively, the ISO standard will pop to
Jo Bentley:
mind as the gold standard, especially when we're talking
Jo Bentley:
about cybersecurity. But "it depends" comes into play when
Jo Bentley:
you're talking about portability and expense. For that gold
Jo Bentley:
standard of ISO, you can use it internationally, but at
Jo Bentley:
significant expense. When it comes to standards that are
Jo Bentley:
usable in most places, I would go out on a limb and say NIST
Jo Bentley:
is becoming the de facto standard, for the simple reason
Jo Bentley:
that it's "free" (and I say that in quotes because
Jo Bentley:
taxpayers in the US cover the costs of the NIST standards), but
Jo Bentley:
from your outlay, you don't have to cough up a significant amount
Jo Bentley:
of money to go to NIST and pick up a standard that gives you a
Jo Bentley:
baseline. They do a very good job of defining AI structures, AI
Jo Bentley:
governance, underlying cybersecurity controls, and even
Jo Bentley:
privacy in the last couple of years. So NIST CSF, in that
Jo Bentley:
scenario, becomes the gold standard. But also be aware, if
Jo Bentley:
you're international, you might then have folks asking you for
Jo Bentley:
more than just that, especially when they're dealing with
Jo Bentley:
particular regulators. So baseline on NIST CSF. Its beauty is
Jo Bentley:
you can go deep, as deep as you want to go, into the special
Jo Bentley:
publications. If you just want to stay on the surface with NIST
Jo Bentley:
CSF, that gives you pretty much everything you need. As an
Jo Bentley:
advantage, you now have privacy in there as well. So it covers a
Jo Bentley:
decent landscape for good practice.
Cameron Stoll:
Yeah, I totally agree. I like NIST CSF for
Cameron Stoll:
security. I like NIST AI RMF for AI, and I like GDPR, good
Cameron Stoll:
old GDPR for privacy. I do. I think the way they define their
Cameron Stoll:
terms just makes sense. I think the whole structure makes sense,
Cameron Stoll:
and that, to me, is the gold standard for privacy laws. So
Cameron Stoll:
those are the three that I typically look for adherence to.
Michael Martell:
I agree. NIST is fantastic because of the breadth
Michael Martell:
and the depth that it provides. I also agree that GDPR feels
Michael Martell:
like home. It was really one of the first frameworks that got
Michael Martell:
us all thinking about this, aside from PCI DSS, which also
Michael Martell:
has been around forever, and of course, HIPAA and other
Michael Martell:
specialty frameworks. Aligning on GDPR makes sense to me,
Michael Martell:
because so many of the world's privacy frameworks are derived
Michael Martell:
from the GDPR, not all of them, but many. And I have to say
Michael Martell:
CCPA, from a US-based perspective. If I were to pick two
Michael Martell:
really privacy-focused ones outside of NIST CSF, it would be
Michael Martell:
aligning on the GDPR, because you get so much for free from
Michael Martell:
other jurisdictions if you do that, and then, from a US
Michael Martell:
perspective, aligning on CCPA, because up until recently,
Michael Martell:
they've really been the thought leaders in that space, even
Michael Martell:
extending beyond some of the protections that the GDPR
Michael Martell:
traditionally had.
Christina Lewellen:
Mike, Jo, Cameron, thank you so much for
Christina Lewellen:
joining us. Please thank your teams at Veracross and Blackbaud
Christina Lewellen:
for all the work you guys do to help keep our community safe. We
Christina Lewellen:
appreciate you more than you know, and I appreciate your
Christina Lewellen:
time.
Peter Frank:
This has been Talking Technology with
Peter Frank:
ATLIS, produced by the Association of Technology
Peter Frank:
Leaders in Independent Schools. For more information about
Peter Frank:
ATLIS and ATLIS membership, please visit theatlis.org. If you
Peter Frank:
enjoyed this discussion, please subscribe, leave a review, and
Peter Frank:
share this podcast with your colleagues in the independent
Peter Frank:
school community. Thank you for listening.