Let's jump in. So first, a little bit about us. We're a strategic technology partner, and we help nonprofit organizations leverage technology to fulfill their missions. It's not about the tech; it's about what you need the tech to do for you. If you want to stay connected with us, we've got Twitter, we've got LinkedIn, we've got Facebook, and of course the website, email, all the good stuff. My name is Joshua Peskay. My title is the very strange "3CPO," for all the Star Wars nerds out there. Basically, I function as what's called the CIO, or Chief Information Officer; the Chief Information Security Officer, or CISO; and also a program officer at RoundTable. A friend a couple of years ago thought it'd be funny to roll those into a 3CPO title, and I loved it, so that's my title now. And Kim, tell us about yourself. I'm Kim Snyder, and I've got a slightly easier title to digest: VP of Data Strategy. I've been working with data and nonprofits for a very, very long time, and Joshua and I have been doing this together for a very long time. So it's a pleasure to be here today talking about privacy. It really is. All right, we're going to start with a poll. Rita, if you could launch the first one: we wanted to find out which description best characterizes your relationship with your organization's data. You could be the parent or guardian of your data, in a long-term committed relationship, frenemies, casual acquaintances, or total strangers. And if you want to give us something else in the chat, feel free. Rita, whenever you've got, I don't know, 60, 70% of people having responded, use your judgment, close it, and show us the answers. So Kim, how would you describe your relationship with our data? Oh, that was fast. So we have a lot of parents and very committed folks here today. Well, that makes sense. And a handful of frenemies. Fortunately, very few of you are total strangers to your organization's data. All right, so go ahead.
I don't know if I need you to close that up or if people close the responses themselves, but thank you all for participating. All right, Kim, I think you take over at this point, right? So I'm going to kick us off with, well, first of all: throughout this presentation, you're going to see this little bird, okay? That bird means there's a resource or a takeaway for whatever topic we're covering at that moment. We're going to provide you with a link to this privacy stuff page where you can get all of it. So we're covering a lot of information, but there are a ton of resources you can take away from this today. Okay, so why privacy, and why now? Josh, you can take it to the next slide. Okay, so there's been a lot of buzz around privacy, and a lot of that has to do with emerging legislation that we keep hearing about. Congress is right now slogging through the proposed American Data Privacy and Protection Act. Individual states keep coming up with their own privacy legislation, and we hear about more and more laws being enacted at the state level. This map is from the International Association of Privacy Professionals' legislation tracker, and it keeps getting new colors. Right now we're between sessions, so there's not as much blue, but while state legislatures were in session, there were a lot of blue states, meaning states working on privacy legislation. Okay, so the next reason it's so important is cybercrime. Cybercrime has gone through the roof, especially during COVID. Between cybercriminals having more time at home and workforces becoming more remote, we've seen a huge uptick in cybercrime. And what that means, basically, is that a lot of data is at risk. So the next reason is protecting our constituents.
Okay, and I'm going to say that protecting constituents is always important, but increasingly, in today's climate, personal data can put people at risk. It's a bit more of an abstract reason, but it's a very, very important reason to take data privacy seriously right now. And I will say that data privacy and ethics have a lot in common, and I think that's an area where nonprofits have really connected with data privacy. Okay, that's fine, you can go on. All right, so privacy is not new in the United States. We've had privacy laws like HIPAA, which protects personal health information, and FERPA, which protects student records, as well as various regulations for financial institutions. We've had these for a while. But up until now, what's called privacy law in the US has been very sector-focused: specific areas of practice or industries have had to adhere to these practices. But, and you can click to the next one, the privacy laws arising today at the state level, and the American Data Privacy and Protection Act, speak more to personal privacy, an individual's right to privacy, which is best embodied in the GDPR, the European Union's General Data Protection Regulation, which took effect in 2018. GDPR has become the foundation for most of our state laws, and it's also the foundation for laws we're seeing pop up in other countries: Japan, Brazil, and we hear India is working on privacy laws, so we keep hearing about new ones. GDPR really forms the basis. So what I'm saying is that a general understanding of GDPR principles helps you deal with the myriad privacy laws that are out there and popping up. So you can go to the next slide. So what do we really mean by individual privacy rights? There's a resource for you here.
I'm not going to go through each of these today, because that's a whole webinar unto itself, but GDPR names nine distinct rights that an individual has if their data is collected, and it's increasingly important to understand what each of these rights means. Today I'm going to briefly cover five of them. That is not at all to say the others aren't important, but these five are really the most applicable to any organization collecting data that belongs to people, and that's a lot of nonprofit organizations. And when I say people, that can be employees and job applicants as well as donors and clients. People data is what privacy regulation protects. So let's take a brief hop through each of these. The right of information is really the right to be informed of what data is being collected about us and why. This matters especially when we're collecting data publicly, in our increasingly boundary-less world, where people from the EU may sign up for a newsletter from our website. It's important to know that we need to tell them what we're collecting and why, and GDPR does apply to any person who is an EU resident. So the best practice for the right of information is to have accessible privacy policies at the point of any data collection, and that also covers you when people from the EU sign up for things. The next right, and this is the next slide, is the right of access. What that basically means is that if you collect my data, I have the right to ask you for it and get a copy. That means you have instructions for requesting my information, and those instructions are provided to me in a clear and accessible way. So that's the right of access: I can ask for my data as a right. If you collect my data, I have this right.
Next is the right of erasure. A lot of people think of this as the right to be forgotten, or deleted. That is to say, if you have my data, I have the right to ask you to delete all of my records from your system, unless you need them for legal reasons, like certain donor records you need to keep for financial record-keeping. Unless you're legally prevented, under GDPR you do need to comply with my request to delete my data. Next slide. And again, I'm just cruising through these to give a general understanding. So, less drastic than deletion is rectification. If my name changes, or I see that the data you have about me is inaccurate, I have the right to ask you to please correct it, and you have the obligation to correct it. And with both of these rights, the right to be deleted and the right to have my data corrected, you then need to tell me once you've done it. That's the right of notification, the other one on here. What this means is that you need to be able to do all of these things, and without data management, doing them can be an impossibly tall order. So let's look quickly at what it is we need to protect when we talk about private information. And this is you, Josh. Absolutely. So the difference, as Kim pointed out, between privacy laws and the data compliance regulations like HIPAA, PCI, and FERPA that we've dealt with for quite some time is this idea of data belonging to an individual, and therefore that individual having rights over what is done with that data. So there is information about me: my birth date, my phone number, my address, my email, and many more things that people and organizations will collect on me, and these privacy laws are seeking to give me rights over what companies can and cannot do with that data, and what they need my consent for.
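To make the rights just described concrete, here is a minimal sketch of how access, rectification, erasure, and notification might map onto operations against a single person's record. All names here (the class, fields, and the sample data) are hypothetical illustrations, not part of any law or product:

```python
from dataclasses import dataclass, field

@dataclass
class ConstituentRecord:
    """Hypothetical record for one person's data, with the GDPR-style
    rights discussed above mapped to simple operations."""
    email: str
    data: dict = field(default_factory=dict)
    notifications: list = field(default_factory=list)

    def access(self) -> dict:
        # Right of access: return a copy of everything we hold.
        return dict(self.data)

    def rectify(self, field_name: str, new_value) -> None:
        # Right of rectification: correct inaccurate data, then log
        # that the person must be notified (right of notification).
        self.data[field_name] = new_value
        self.notifications.append(f"rectified {field_name}")

    def erase(self, legally_required: set = frozenset()) -> None:
        # Right of erasure: delete everything we are not legally
        # obligated to retain (e.g. certain donor/financial records).
        self.data = {k: v for k, v in self.data.items() if k in legally_required}
        self.notifications.append("erasure completed")

record = ConstituentRecord("kim@example.org", {"name": "Kim", "gift_2021": 100})
record.rectify("name", "Kim S.")
record.erase(legally_required={"gift_2021"})
print(record.data)  # only the legally required financial record remains
```

The point of the sketch is the shape of the obligation, not the code: every right implies an operation your systems must actually be able to perform, plus a notification back to the person.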
So it's information that really belongs to an individual, which is information about that individual. Let's get a little more specific. There are three broad categories defined by GDPR, and while not all of the other privacy laws use the same language, I think these three buckets are generally applicable to all of the privacy laws you'll encounter. The first is what's often referred to in the US as PII: personally identifiable, or personally identifying, information. That's your name, address, birth date, email, phone, citizenship, your geolocation, stuff like that which allows you to identify the individual. So if it's information that allows me to identify Kim and say, I know who this person is, I know how to reach them, I know where they are, I know how old they are, that is PII, and that is a protected class of information under virtually all of these laws. The next is sensitive information. This is the stuff that has typically been protected before under some of those other laws, not all of it, but a lot of it: your Social Security number, your passwords, financial information, biometric information, health information (PHI, or protected health information, under HIPAA), genetic information. So if I go to 23andMe, that information is protected. Sensitive information is the kind that could cause more harm if it's exposed to people I didn't want to have access to it. And then the last one is special category information. You can think of this as something like psychometric information. If any of you remember the Cambridge Analytica news, that was all about information being collected on people's preferences, political alliances, and friends, all that kind of stuff.
So that falls under special category, which is information about what religion I might be, what sexuality I might be, what philosophical beliefs, political beliefs, all of those kinds of things. And it's also important to understand that if you take information that I have on Kim and combine it with other public information to derive new data, that derived data is still subject to these same laws, and I still need consent to have it. I can't just start creating data on Kim without her consent. Okay, that's a really important point. All right, we have another poll. Rita, if you could pop up our next poll, which is: my organization has received a data subject access request, also affectionately known as a DSAR. If you've never heard of one, that's fine; "what's a DSAR?" is one of the responses. If you have received one, we've got an image of what one can look like on the screen, and we certainly know of organizations that have received these. And Rita, again, I'll defer to you as to when we've got 60, 70% of respondents to close the poll, and let's see. So not a lot of you yet, all right? A lot of you aren't sure what a DSAR is, but a few of you have received one, and those of you who haven't can look forward to receiving one in the future. Hopefully not too soon, but it probably is coming at some point. Go ahead, Kim, do you want to say something? No, no, no. Oh, okay, all right. So, the right to be forgotten. And by the way, just to define it: a data subject access request is a means provided under a lot of these privacy laws whereby a data subject, such as Kim, can come to a company, such as mine, that has collected her data and say, Josh, I would like to know something about the data you have on me. For example, I would like to know what data you have on me. I need to be able to tell Kim all the data that I have.
She might ask, who have you shared my data with, Josh? I need to be able to tell her: Kim, here are all the companies I shared your data with, and here's the specific data I shared with them. Kim may also say, I would like to be forgotten; I would like you to delete all of my data. And if she asks that, I need to delete her data not just from where I collected it. So if Kim filled out a little form on our website to get our newsletter and gave us some information, and then came to us six months later and said, I'd like you to delete that information, I need to delete it not only from HubSpot or Constant Contact or Mailchimp, wherever that went, but from everywhere else it went. Did our marketing person download a CSV that's sitting on their laptop with Kim's data in it? It needs to get deleted from there. Did it get exported and sent to another organization so they could look at who signed up for our newsletter? It needs to get deleted from there. Did it get backed up? This is a really hard one, right? It needs to get deleted from those backups as well. So this is a really challenging thing for many organizations. And we'll say this a few times: before you start getting overwhelmed, you don't have to start doing all of this perfectly tomorrow, but you want to start thinking about how you would do it, so that you're not completely overwhelmed when it starts happening to you. All right, so: basic premise change. There's this quote; I don't know how old it is. Kim, do you know when Clive Humby said this? Is this 10, 20, 30 years ago? I don't know. Probably 20 years ago. "Data is the new oil." It's kind of a gross quote, but... It is. People have heard this a lot. Data is so valuable; the more data, the better. And what I will say is totally applicable to most nonprofits, in my experience, is that they view data as an asset only, meaning the more data we have, the better, and once we have it, we'd like to keep it for as long as we can.
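The "where else did that newsletter signup go?" problem above can be sketched as a simple ledger: record every location a person's data lands in, so that a deletion request becomes a checklist rather than a scavenger hunt. Everything here (the class name, the location labels) is a hypothetical illustration:

```python
from collections import defaultdict

class SharingLedger:
    """Hypothetical 'where did this data go?' ledger. A deletion
    request must reach every copy, not just the system of record."""

    def __init__(self):
        # email -> set of locations holding a copy of that person's data
        self._locations = defaultdict(set)

    def record_copy(self, email: str, location: str) -> None:
        self._locations[email].add(location)

    def deletion_plan(self, email: str) -> list:
        # Every location where the data landed needs a deletion step,
        # including exports and backups.
        return sorted(self._locations[email])

ledger = SharingLedger()
ledger.record_copy("kim@example.org", "mailchimp")
ledger.record_copy("kim@example.org", "marketing-laptop-csv")
ledger.record_copy("kim@example.org", "nightly-backup")
print(ledger.deletion_plan("kim@example.org"))
# ['mailchimp', 'marketing-laptop-csv', 'nightly-backup']
```

In practice this ledger is usually a data map document rather than code, but the discipline is the same: if a copy isn't recorded somewhere, no deletion process will ever find it.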
There's no reason to ever delete data, because why delete it? It's an asset. We might have a use for it someday, so let's hang on to it. Why not? Here's what I'm going to encourage everyone to think about going forward. Now, it's a little immodest of me to have a quote from myself that Kim put in the deck; I'll own that. Okay. And oil is a good analogy here, which is... Oh, thank you, Stephanie: the World Economic Forum in 2012, so not even that long ago. Thank you, Stephanie, for that. All right. Data is both an asset and a liability. Oil kind of works in this regard. You have oil; oil is an asset. If I lose the oil, I've lost whatever value that oil had for me. Bummer. But if I lose that oil in the ocean, I now have additional liability: I have to clean up all that oil, which costs me a lot of money, and I suffer reputational damage because I spilled a bunch of oil in the ocean and people don't like that. I made a big mess. Well, if I have Kim's data, and she has nothing to do with me for five years, and five years later I have a breach and Kim's data is part of that breach, guess what? Kim may have a case to sue for damages. There's liability I may carry for still having Kim's data. And if that data wasn't providing any value to my organization, or not value commensurate with the risk, then it is just a liability, and I should get rid of it. So thank you, Stephanie, for seconding that. A very simple phrase: it is an asset; it is also a liability. And if the liability part is a lot more than the value of the asset, then hey, probably get rid of it. Kim, back to you. Well, even if I don't sue you, Josh, there's the liability of having to look me up, find me, and know that you have to let me know, right? That's enough of a liability. All right, so what are organizations supposed to do now? Let's go to the next slide.
This is a slide I added at the last minute, but I actually feel it's so important for providing a larger context. So I'm going to give you a super high-level view of something; as the title of this presentation suggests, this is a privacy primer, and this is where it really lives up to the name. Okay, so I'm going to introduce a couple of general concepts. First is the NIST Privacy Framework. The privacy practices we're covering today are built from what's known as the NIST framework. NIST is the National Institute of Standards and Technology; I believe that's a division of the Department of Commerce. So while the US doesn't have a federal privacy law, it does have a privacy framework, which is actually legislation-agnostic, which probably makes sense since we don't have a federal law. Anyway, it's an adaptable framework. Covering NIST in depth is a full course (we've actually made one), but I want to introduce you to the framework in general and its different functions. Starting at the top, at 12 o'clock: Identify refers to identifying what laws apply, what data your organization is using, and what risks your systems and the types of data you collect may pose to you. The next one, at two o'clock, is Govern, which speaks to the overall prioritization of, and buy-in for, dealing with privacy within your organization: having a privacy program and defining how privacy practices will be carried out. The next area of NIST is Control, which speaks to the policies that define how employees in your organization handle data. So, data handling policies that tell people, for that example Josh just gave: no, you can't just download all of our marketing data and take it home. Those kinds of rules fall under the realm of Control.
Protect, and you're going to hear more about Protect from Joshua, really speaks to the practices you have around maintaining data security. Some of that has to do with who can access which accounts where. Do you have multi-factor authentication in place on the databases where you have sensitive data? And general cybersecurity practices at your organization: for example, does your staff receive training? Things like that. Finally, at 10 o'clock, is Communicate. That's about regularly educating employees, having something in place so that once you have policies, once you've identified your data and have policies around how you're going to use it, everyone understands what they can and cannot do with organizational data. So that's a lot of information about NIST. A big takeaway about this framework, however, and really why I included this slide, is that NIST was designed from the ground up to be a very iterative framework. It defines different functions and different activities, and it assumes that new practices are going to be put in place incrementally, not all at once. So know that a privacy program takes time to develop. As you hear about the different practices today, you're going to hear about some things you may already be doing, even if only very informally; maybe you have the fields from your database listed on a spreadsheet, okay? That counts. Some other practices may not be underway, but understanding what they are at least gives you a good place to start. This is to say it's easy to feel overwhelmed by the responsibilities around managing data privacy, but I want to say at the top: it's a very iterative process. Okay, so that's my preamble. Now, data governance is another core concept in data privacy, and one of its key elements is ownership.
That's why it was so great to see data guardians and parents here, all right? Because with 25-plus years helping nonprofits with data, the first question I ask when I go into an organization is: who owns this data? And when I hear "everybody, we all do," that's not a really good answer, because what it possibly means is that no one owns the data; no one's really overseeing it. So I think it can be helpful to equate data systems in general, especially when you've got privacy in mind, with unruly, misbehaving middle schoolers, okay? They need a guardian; they need structure. Otherwise, the data misbehaves and you put yourself at more risk, more liability. Sometimes it's not a single central owner; sometimes it's a departmental data steward. That's fine, so long as data sets have owners. Governance, then, and we're going to speak to these, covers the policies and processes around data, the security, and finally education: educating employees, onboarding new employees, et cetera. How do people learn about how you use your systems? Okay, so I think we can go to the next slide. How are we doing time-wise? Good. Okay. So these policies and practices apply to the entire life cycle of data. When we think about privacy, we really need to think about where the data starts at your organization: how do we get it in the first place? As an example, say an after-school youth program may receive applications as a PDF, a mailed document, or even some form of application via the web. Some of these may contain sensitive information, and, not to mention, an after-school program serves minors. So data collection points refers to all those points of input: how do we get this stuff, where does it come from, how did they get it to us? Storage and access is next: once I have a student's application, where do I then store the pertinent data? Do I put it into Salesforce?
Okay, if I put it into our Salesforce system, who has access to it? Everyone, or just the program people who work with those students? And then, what happens to the original PDF application I received and used for entering this data? Storage and access speaks to understanding that part of the life cycle: where are we putting it? Then, how do we use it? Donor data is common at a lot of nonprofit organizations, especially major donor data: what they've given, some of their history. Not everyone can see it, and that's the way it should be, right? And in the times we're in, donor data for certain causes can itself be sensitive, so we want to really protect it: who can see it, who has access to it? The next area is sharing and transfer. Do we share our data? And by sharing, that can be with a consultant, with a third-party organization, or with another application we use, say an email marketing service, okay? Do we understand all of our sharing practices? Do we know what they are? Do we have them documented somewhere? And then finally we get to retention and deletion, which can be one of the hardest things for nonprofit organizations; that's the oil Josh was saying we don't want to give up, okay? So back to the after-school program: once a participant has gone through the program, how long do we hold on to that person's information? We may be legally obligated to hold on to it for a certain amount of time; that's a clear reason. Or do we not really have a reason for how long we keep it, or for when we would want it deleted? Do we have a process in place for deleting it? Okay, so it's all of these kinds of questions.
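One way to picture the life-cycle questions just walked through is a single inventory entry per program area, with one field per stage: collection, storage and access, use, sharing, and retention. The entry below for the after-school example is entirely illustrative; the values are assumptions, not anyone's actual policy:

```python
# Hypothetical data inventory entry tracing one constituent type
# (an after-school participant) through the life-cycle stages above.
afterschool_participant = {
    "collection_points": ["web application form", "mailed PDF application"],
    "storage_and_access": {
        "system": "Salesforce",
        "who_can_see_it": "program staff only",
        "original_pdf": "shredded/deleted after data entry",
    },
    "use": "program enrollment and reporting",
    "sharing_and_transfer": ["email marketing service"],
    "retention_and_deletion": "review for deletion 3 years after program exit "
                              "(confirm any legal retention requirements first)",
}

LIFECYCLE_STAGES = ["collection_points", "storage_and_access", "use",
                    "sharing_and_transfer", "retention_and_deletion"]

def missing_stages(entry: dict) -> list:
    # An inventory entry is incomplete until every stage is answered.
    return [s for s in LIFECYCLE_STAGES if s not in entry]

print(missing_stages(afterschool_participant))  # [] -- all stages answered
```

The `missing_stages` check is the whole idea of the worksheet in miniature: any stage you can't fill in is a gap in your privacy picture.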
And again, this is all the stuff that's covered in this resource. So, on the next slide: this is a lot of the Identify work, the stuff that comes under Identify. These are other resources we give you, starting with a data inventory worksheet. Now, this can be a lot of work initially, so if you're not already doing this, I encourage you to start by prioritizing specific areas or programs where you think there might be concerns because the information you're collecting is particularly sensitive. If you work with a group of at-risk youth or something, that's an area to hit first. Donor data is also often an area to hit first, because of the financial information that's collected. So this data inventory worksheet is a good place to start for documenting the whole life cycle that was on the previous slide, and generally you'll do one for each specific program area or each specific type of individual from whom you collect data. For example, I would create one of these worksheets and fill it out; it's really a sketch. This is meant to be agile; it's not a long business-plan document. It's a one-pager, just to start sketching things out. I would do one for the after-school participant and then trace that information through. The worksheet basically walks you through the typical questions that get asked at the different points in the life cycle. Where does data come from? How do we get it? What personal information do we collect? Do we collect particularly sensitive information or not? And then, how do we use it? Okay, I recommend doing this, especially if you're a data guardian, an organization-wide data guardian helping other departments do this, together with the stakeholders who work directly with that data. So, anything else on this, Josh? Yeah, I'm noticing that we have just an incredible amount of questions piling in. I think we're well over 20 already.
So keep them coming, by the way, everybody. These are great, but I want to continue moving through, and then we'll try to hit as many questions as we can. I'm just checking with Rita and Justin whether there are any they want us to stop and hit now. I'm skimming through them, Kim, and I think I still want to hold them to the end. And just so people know, I think we've got maybe 10 minutes left, and then we'll give lots of time for Q&A. Okay, so these are two other resources. One is a data map worksheet, the leftmost one on this slide. That's a visualized version of the inventory that was on the previous slide, and it can be a good way to present the information. And then the one on top is a spreadsheet template, a data classification table, and that is really the indispensable thing. A well-maintained data classification table can be the single source of truth for all of the different information that's collected at your organization and its sensitivity level. By that I mean: certain information you may consider restricted, meaning it's tightly controlled; if a non-employee is to see it, they need an NDA. Social Security numbers, that kind of thing. Internal data is another kind; it needs to be secure, a little less sensitive, but kept internal to your organization. And finally, there's public information, the information that's publicly available, like an employee's work email address, things like that. It's good to know what's there and what the sensitivity level is, because a well-maintained classification table can really guide you through this process. It also helps you see where your risks may be, and knowing what you collect is essential for any privacy program.
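The restricted/internal/public scheme just described can be sketched as a tiny classification table in code. The field names and levels below are hypothetical examples, and one design choice worth calling out: an unclassified field defaults to the most restrictive level, which is a safer assumption than treating unknown data as public:

```python
from enum import Enum

class Sensitivity(Enum):
    RESTRICTED = "restricted"  # tightly controlled; NDA for non-employees
    INTERNAL = "internal"      # secure, but kept internal to the org
    PUBLIC = "public"          # publicly available, e.g. a work email

# Hypothetical classification table: field name -> sensitivity level.
CLASSIFICATION = {
    "social_security_number": Sensitivity.RESTRICTED,
    "donor_gift_history": Sensitivity.INTERNAL,
    "work_email": Sensitivity.PUBLIC,
}

def handling_rule(field_name: str) -> Sensitivity:
    # Fields nobody has classified yet default to RESTRICTED until
    # someone makes a deliberate decision about them.
    return CLASSIFICATION.get(field_name, Sensitivity.RESTRICTED)

print(handling_rule("work_email").value)       # public
print(handling_rule("home_address").value)     # restricted (unclassified)
```

In practice the "table" is usually a spreadsheet like the template on the slide; the code just shows how a classification level can drive a handling decision automatically once it exists.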
So hopping to the next slide, I think it's our poll. Rita, over to you. No shame in anyone's game on this one; we're not tracking the answers, and we're not reporting them to your boards. We know a lot of folks don't have this, but: do you have a data classification and handling policy? Essentially, something that says this type of data is protected, confidential, or restricted, and if it is restricted, here's what is okay to do with it and what is not okay to do with it. And, Rita, as always, just let us know. All right, so hey, almost 20%. That's way up from where it was a year ago, Kim, so I'm really excited. It must be all those guardians in the house. And Matt from the UK correctly identified that the data mapping form was adapted from something UK-based. You are right on with that. Wow, so you must be very close to your GDPR, the Data Protection Act. Yes, it was adapted because GDPR is the foundation for this, right? Yeah, if you're a UK or EU nonprofit, this has been up in your grill since 2018. Not so much for us in the US. Yeah, the US is a little bit behind, just a little; we'll get there, though. All right. Is this me? Kim, or do you have another slide? Go ahead. So, all right, I'm going to go quickly so that we have a little more time for Q&A. Internal policies are these things: the data handling policy, the type of thing Josh was just talking about. Data classification qualifies certain data to be categorized at certain sensitivity levels: how do we treat it? These inward-facing policies are guidelines, and one of their purposes is employee education. They're a fundamental part of a privacy program and often a starting point once you've done a classification table; it builds from there. External notices, on the other hand, are outward-facing, okay?
So, this might be your organization's privacy policy, accessible to, say, website visitors or people who visit your website and sign up for a newsletter, especially if you've got some people from the EU or the UK. Now, the UK has its own privacy law, which is very close to GDPR. Think of these as trust documents: we're telling people what data we're collecting about them, how it's being used, and what we're going to do with it. We've recently been hearing more about the need for a donor privacy policy, a specific type of privacy policy for donors, and that's really because in the current environment, donating to certain types of causes can pose real risk. So it's important, especially if you work for one of these types of causes, to think through your donor data and the management around it, but also your privacy policies and what you want to be able to do for these constituents, these donors. Think of these as trust and accountability documents. So, the next resource; this is a theme I'll keep returning to: this is an iterative process. This is a data privacy, or information privacy, maturity guide that we put together. We do this type of thing a lot; I think of technology capacity building in general as very developmental, done in stages. So this map, which you can get on that page, shows different characteristics at these different stages. And, oh, he's beating me to the punchline, but if you just clicked... Oh, I'm sorry, Kim. No, if you just clicked, then. Hang on, hang on, there we go. Oh, look at that. All right, my bad, Kim. No, no, that's okay. Because a lot of organizations feel like, oh, I'm starting way down here. The differentiator in all this is really the accidental data nerd you often find at a nonprofit.
So that's the, you know, kind of de facto data owner, data guardian, all right? Or sometimes in some organizations, especially large organizations in this day and age, you might get people more dedicated to privacy. But those are really the differentiators: having someone own it and be responsible for it, okay? All right, we've got another poll, I think. Is that okay? Can I move on, Kim? Yes, absolutely. Great. Oh, thanks, Rita, right on it with the polls. All right, so, oh, you jumped one ahead, unfortunately. We're gonna save that one for the end. Did we not have the awareness training one in there, Rita? The one asking, has your org provided you with awareness training? If we don't have that one, that's okay, and we can try to relaunch the other one later. If not, it is what it is. Waiting for Rita. Do we have the awareness one? Yeah, I don't see that one. Okay, no worries. On we go. All right, so we were gonna ask if you've had awareness training in the last year. Thank you, everybody; without us even asking, everybody lets us know in the chat. Just eyeballing it, it looks like about two thirds no, one third yes, maybe a little less than that. Okay, that's not where I'd hoped it would be. Unfortunately, we're still a little behind on awareness training. So, when we think about protection and defense against cyber threats, there's this Swiss cheese model; the New York Times did a wonderful piece on it during the pandemic. Think of the pandemic model for respiratory viruses: just masking, just getting vaccines, just social distancing. Nothing by itself gives you excellent protection, but as you start putting those things together, you get better and better protection. It's a very similar model. In fact, the term virus itself carried over into the computer world as computer viruses.
And the Swiss cheese model is basically the defense in depth model, which is the term used in cybersecurity: the idea that, again, there's no single intervention that's perfect at preventing cyber threats. In fact, you're never gonna get a hundred percent protection, but by layering these different measures, you reduce both the likelihood of something bad happening and the potential impact, meaning how bad it will be if it happens. And those are both really big things, right? It's a lot better for my organization if one minor bad thing happens over the course of the year, as opposed to five bad things, two of which are really big bad things. That's a big difference for the organization. Cybersecurity is not an all or nothing game. And all of these privacy laws will always have a cybersecurity component, because of course, if you have people's sensitive information, you have to be able to protect it. If it's just exposed for all the world to see, that's not a great thing for anybody's privacy. And the New York SHIELD Act we really like, because it specifically uses this idea of reasonable safeguards. I wanna talk about this term reasonable, because I actually like it a lot. It has a history in the legal space, and I think it's useful in a real-world sense for asking, how do I know if I'm doing enough or not? And I encourage you to answer that question by asking these questions. So if I have Kim's data and I have a breach and I lose Kim's data, and Kim comes to me and wants to know, was I reasonably protecting her data? She could ask me questions like: what data did you have? Why did you still have it? What were you doing to protect it? If my answer to those questions is, well, I collected your data five years ago, then I used a bunch of public resources to aggregate and append that data with more information about you, and I was hanging onto it.
Why? I don't know, because we just hang on to stuff. And what was I doing to protect it? We didn't really have our cybersecurity going yet; we tried to get people to use good passwords. Well, Kim would say, none of that sounds reasonable to me at all. On the other hand, if I said: well, we collected this, and we would have deleted it after a year of you not being a client anymore, but it had only been six months, so we hadn't deleted it yet, but we would have six months from now, because that's our policy. And the breach we had exploited a vulnerability that was on our cybersecurity roadmap, because we'd had a risk assessment done. In fact, we've had one every year for the last three years; our last one gave us a roadmap, and this vulnerability was on that roadmap. We just hadn't gotten to it yet, and it happened that your data was part of that breach. We notified you as soon as we found out, we're really sorry about it, and here's what we're doing to make sure it doesn't happen again. Well, Kim might say, I'm bummed that my data got exposed, but that sounds reasonable. So put yourself through that exercise. Put yourself in the place of a data subject and ask: if the organization had my data and lost it and had to tell me about it, what questions would I ask to determine whether they were reasonably protecting it and had a reasonably good reason to have it? That's very straightforward stuff. Now, a resource for you: we have a cybersecurity readiness checklist. And just to show you that this happens at Roundtable too, a lot of these kinds of conversations about data classification, retention, and what data we need happen at Roundtable. We reviewed this very checklist yesterday. It required that you put in a phone number, because we wanted to be able to call you with a follow-up, to see if we could help you and maybe sell you some stuff. We removed that as a required field.
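The retention logic in Josh's hypothetical, where records get deleted a set period after the relationship ends, can be sketched in a few lines. This is a minimal illustration of his example only; the one-year window and the function name are invented for the sketch, not a policy recommendation.

```python
from datetime import date, timedelta

# The window from Josh's scenario: purge a record one year after the
# relationship ends. In a real policy, this is set by your data
# classification and retention schedule, not hard-coded.
RETENTION_PERIOD = timedelta(days=365)

def is_due_for_deletion(relationship_ended: date, today: date) -> bool:
    """True once a record has passed its retention window."""
    return today - relationship_ended >= RETENTION_PERIOD

# A client who left six months ago is still inside the window...
print(is_due_for_deletion(date(2022, 1, 1), date(2022, 7, 1)))  # False
# ...while a record eighteen months out should already be gone.
print(is_due_for_deletion(date(2021, 1, 1), date(2022, 7, 1)))  # True
```

The point of the exercise isn't the code; it's that a written, dated policy like this is exactly what lets you answer "why did you still have it?" reasonably.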
So for those of you who go to it now, you'll see it's optional. If you want us to call you with a follow-up, put your phone number in; if you don't, we're not requiring it. That was a conversation that just happened, because we want to practice what we preach, all right? And Kim, I think this is you to wrap up, is that right? And we should wrap up quick, because we have a good queue of questions. Yeah, we do have many questions. So, in a world where exposing people's personal data poses increasing risks, and that could be because of the climate we're in, responsible data management is elevated in importance. That's a big takeaway. Data minimization is another concept I want to leave in people's minds before we go off into Q&A, and that is really asking: do we need to collect this information? Do we actually need this? And then finally, reasonable cybersecurity, right? All of these are ethical practices, so I just wanted to touch on that: there's a lot of intersection between data ethics and data privacy. So I think we had, yes, that was the next poll. I don't see the answers. Did we get answers to it, Rita? It's up there; we're gonna relaunch it. Okay. So we've given you a lot of information today, for sure, so it's good to make a plan. Having attended this webinar, what's the next thing? What are you inspired to do, or frightened into doing? Ah, okay. Getting buy-in to prioritize privacy, that's the top one. Well, have a good cry wasn't the top answer, Kim, so I feel good about that. Maybe we should have another session, a have-a-good-cry session. That's 22%, so that's a good number. So, yeah, getting buy-in from leadership is key. And more and more, leadership may be hearing it from their boards, and they may also start hearing it from insurance, right?
Less around data privacy practices so far, but it's coming from an insurer near you soon. All right, so what's next? We have nonprofitit.com, that's a Roundtable URL, slash privacystuff, to access all the resources from the presentation. Is that in the chat? Yeah, Justin's dropped it in chat a couple of times; I'm sure he'll drop it in again now. If you have any questions, email us at outreach@roundtabletechnology.com. And thanks for all the great questions. People are voting on them, so Kim, I'm just gonna go with the audience votes. You can click the little thumbs up next to the questions you see there, and we'll answer the most upvoted first. So the first one, we've got two with four votes that I'm seeing, so I'll go with Anonymous, who asks: can you provide insight on using cloud storage platforms (iCloud, Google Drive, Box, et cetera) in terms of data security and privacy around different types of personal data? I'll stop my screen share now. All right, I think I'll take that one. Kim, is that good with you? Yeah, that's with you. Sure. So first of all, you need to understand what they're doing with the data, because if you're storing the data there, you need to make sure they're not sharing it. For the most part, those parties are not gonna be sharing your data with anyone, but it's still good to make sure. In terms of security, you have what's called a shared security model, and maybe while Kim's answering another question, I'll go find the slide on that and show it to people. But you are still responsible for data classification, for account access, meaning who has access to the data in those cloud platforms, and for securing your account access. So if I have it in Google Drive, Google is unlikely to get hacked on their side and wind up exposing my data.
However, if I have a really bad password and I haven't put multi-factor authentication on the account, and the account has access to all the data even though there's no reason for that person to have access to all of it, and that account gets compromised and someone takes all our data, is that Google's fault? No, that's my fault. So you have to understand the parts of security that you're responsible for versus what Google, Box, or those other folks are responsible for, okay? So that's the answer there; we'll mark that as answered, and let's go to the next one. I think it's Ken's question. By the way, hello, Ken. What framing have you found helpful to kickstart this type of organizational thinking? Like lots of security work, it's often seen as an eventual priority rather than an immediate one. Your auditors and your cyber insurers will solve that problem for you, Ken; I'm sorry to say it. But that would be my quick answer: tell them that if we don't deal with this as an organization, we're going to find out the hard way. Our next audit or some partner organization is going to send us a questionnaire that we're unable to respond to, and they're going to say we won't partner with you, or our cyber liability insurance is going to come up for renewal and the insurer is going to say we're not insuring you. Kim, what do you have to add to that? Yeah, I would agree with that, and also keep an eye on state privacy laws, right? If you're in New York State, for example, and some of you are, there's the New York SHIELD Act, which imposes certain responsibilities if you collect certain kinds of data. So pay attention to the laws. And also, do you collect any data from anyone who's a resident of the EU or the UK? Those come with requirements too. All right, we've got two questions with six votes. So let's go with Stephanie, because Stephanie has been a great contributor in the chat today.
So I'll reward Stephanie, who's tied with a couple of other questions. Are there any canned data privacy programs, services, or trainings that exist that organizations can utilize immediately as a jumping-off point, or is it better to build out governance from the start, specific to the organization's mission and vision? That is a very well articulated question, and funny you should ask, Stephanie. Justin was going to answer this one. So NIST provides a framework. We do have a training that we will make available; it's a six-part training that walks through the different steps. But I would say follow the NIST Privacy Framework, and it's available for free. You can get the framework and the documentation for it, and that's a good place to start. And NIST actually has a ready-set-go guide for small businesses that I think is very relevant to nonprofits. If I can find the link to that while we're still in this webinar, I'll share it, but I'm not that great at that kind of multitasking. Didn't you do a whole class based on the NIST data privacy framework? I did. But where would people sign up for it? Oh, I got you. Okay, that's coming; that's one for our partner in crime, Justin. We're soon to make it available. Yes, I do have the six-part class tailored for nonprofits in this material. All right, ooh, we've got one that got voted up to seven. And thank you, by the way, Stephanie, for your great contributions in the chat during the session today. How do we figure out which data laws apply to us as an organization? That one's straight to you, Kim. So, okay. US data law is sector-based, okay? But you would probably know. Like, if you were responsible for maintaining compliance with HIPAA, you would probably know that. The data privacy laws, though, have to do with region, with where a person is from. Yeah.
It's really important to understand: it's where the data subject is from. Not you, not your business. Now, here's what makes it difficult in the US, lacking a federal privacy law: laws are springing up in different states. The laws currently on the books generally target commercial entities, right? You see the most aggressive laws in California. Well, California is also home to the social media companies, so it's a primary target. And those laws have revenue thresholds, like 25 million dollars or more, right? But given that a lot of us, in this increasingly boundary-less world we live in, end up collecting data from people who live in the EU or a host of other countries adopting laws similar to the EU's, and those laws often don't differentiate by revenue, that's why it's a good idea, and we say this as a general guideline, to work toward compliance with GDPR. Again, not overnight compliance, but the laws that apply to you are gonna be based on who you collect data from and where they live. And that includes employees, and we collect some of the most sensitive data on them. And just gonna give folks a heads up: we're gonna go about five minutes long. Rita and TechSoup have been kind enough to allow us five extra minutes to answer questions, so we'll go to about five minutes past the hour and get through as many as we can. So keep voting; I'm doing my best to keep up. Looks like Kara's question is the next most highly voted, and I think it's another one for you, Kim. This is right in your wheelhouse. Can you give more insight into assigning a data owner at an organization where many departments need access to the data? How do we limit who has access and where ownership is? Love this question. Kim, I know you love it, go for it. This question is the story of my life, actually, but I'll try not to take too much time to answer it.
Okay, first of all, organizational buy-in, or leadership buy-in, is really important. Leadership needs to understand that data needs a responsible party. Now, it can sometimes be one person who has an eye on all the data the organization collects; they may not have their hands in all of it, but they have a view of the organization's universe of data, okay? If that's not possible, and that's often a tall order because that's a CIO type of role, another option is data stewards, meaning departmental data owners. So if you're a big organization with different departments and different systems in them, you may have departmental owners for different sets of data, but then they come together in a data leadership committee. And among those people, you do need to identify an owner, generally a person close to the data and, I'd say, the person most comfortable in that role. And then leadership needs to ensure that person has the time to spend in that role of oversight, because there needs to be a single point of decision making around the data. I hope that helps. All right, our next one: we've got another tie that involves Stephanie. I gave Stephanie the last one, so I'll give this one to Sean. How do you reconcile between NIST and GDPR when you're an international organization? Big question, I know. Okay. Kim, do you want to take this one? I can take this one. Okay, so GDPR is a regulation. It specifies things you need to do, requirements you need to be compliant with, if you collect data belonging to, in GDPR's case, EU residents, right? NIST is a framework that's meant to be adaptable; it's regulation-agnostic. What NIST has are areas of practice. There are some differences. For example, and this is a big question, this is where I'd recommend my class, which talks about this.
For example, GDPR has more requirements around having a data privacy officer in an organization, depending on the size and the sensitivity of the data. That's probably not most of the organizations here, right? That's not so much a NIST thing. NIST is really talking about areas of practice you want to have in place in order to implement a privacy program, including one that's GDPR compliant. One thing I would jump in on, though this is for organizations that have the budget for it: we're using one of these at Roundtable, and it makes just a world of difference. There are compliance platforms coming up now. The one we're using is called DRATA, D-R-A-T-A, and there are a bunch of others out there: Vanta, Tugboat Logic. Sometimes they're referred to as GRC platforms, for governance, risk, and compliance. So you have different controls, or requirements, that are part of different cybersecurity frameworks or different regulations, and these platforms will overlap them. So if you get compliant with GDPR and then you say, we want to be compliant with this particular NIST framework, it'll map all the things you've already done for GDPR onto NIST. So instead of having, like, 144 things you need to do for NIST, you only have 26, which is the delta between what NIST had that GDPR didn't, or vice versa. So those things can really help you, Sean, if you wanna see how close you are to different frameworks or standards, okay? Hopefully that's helpful, but they're expensive, like five, ten, fifteen grand a year. But oh my God, we're going through what's called a SOC 2 for Roundtable, and I have no idea how I would do it without a platform like this; it makes such a difference. I spent a lot of time convincing our leadership to pay for it, and it was worth every second. All right, Stephanie, we have time to get to your question.
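The framework overlap Josh describes is essentially a set operation: controls you've already satisfied for one framework count toward another, and only the delta remains. Here's a toy sketch of that idea; the control names and counts are invented for illustration, since platforms like DRATA and Vanta maintain the real mappings.

```python
# Hypothetical control IDs for two frameworks. Real mappings are far
# larger and are maintained by the GRC platform, not written by hand.
gdpr_controls = {"encrypt-at-rest", "breach-notification", "dpo-appointed",
                 "data-inventory", "access-reviews"}
nist_controls = {"encrypt-at-rest", "data-inventory", "access-reviews",
                 "risk-assessment", "awareness-training"}

# Work already done for GDPR that also satisfies NIST.
already_satisfied = gdpr_controls & nist_controls

# The delta: what's actually left to do for NIST.
remaining = nist_controls - gdpr_controls

print(sorted(remaining))  # ['awareness-training', 'risk-assessment']
```

At real scale, that delta is the difference between Josh's 144 open items and 26, which is where these platforms earn their price.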
So this is a longer one, so let me read it. Data privacy enforcement is undoubtedly a team effort, but the issue I'm coming across is that our organization's staff have different technology aptitudes, right? And we can all understand that, from very savvy to not so savvy. How would we best go about simultaneously educating staff about the importance of data privacy while also expecting them to enforce it? Or what's the best way to start building out data governance standards in this situation? That is a big one. Do you want to take the first crack at it, Kim, or do you want me to? I can take an initial stab, but then I'll have you weigh in. So, it's not uncommon to have different aptitudes; it's just the nature of the beast. And there are many different kinds of learners, right? This is where thinking about ongoing, and I guess the term is, I don't know if we still use this one, Josh, upskilling. Continuous learning. Continuous learning as just an expectation, right? That, and helping people understand what it is they need to do. Some people are going to need more hands-on education. But I'd say there's the introductory awareness training that you want everybody to have, so everyone has a general understanding. Then people may have different data handling requirements based on the data they use in your organization, so you'll want to tailor specific training toward handling that kind of data. If you have a very limited group of people who are the only ones using the very sensitive data, they're going to need special training, maybe special logins and everything, okay? But that's a limited group, so long as your controls ensure they're the only ones who can see it, right? So that's where those types of things go together. Josh?
Yeah, where I would go with that, first, is the idea of technical versus policy-based controls. Controls are things we do to prevent data from being exposed or shared, or mistakes from being made. So we have policy controls, where people accept a policy, but I think many of us can agree that a lot of people don't necessarily read those policies, don't necessarily understand them, and may not know how to follow them; if you say you have to encrypt something, they may not know how to encrypt it. So policy controls are tricky to rely on, for a variety of reasons. But anywhere you can, implement a technical control. So if I have something like data loss prevention in place that notices Kim is trying to email a Social Security number out of our organization in a non-encrypted message, it just stops her. She can't do it, because the data loss prevention program stops it. That's a technical control that prevents data privacy laws from being violated. So where you can manage for people who may make mistakes, implement technical controls wherever you can, as long as they don't unreasonably block the workflows of the organization. And I think with that, we are at time, so I think Rita said we can just say bye. I can't thank people enough for all these fantastic questions. I'm sorry we couldn't get to all of them; please feel free to reach out to us, and we'll be happy to try to answer your questions one-on-one. And thanks to Justin, and thank you so much, TechSoup. And if there were a way to collect all these questions, we'd be happy to write answers to them. Bye everyone, have a wonderful rest of your day. Thank you so much.