I'm just gonna say hi and welcome to everyone. My name is Mark. I am, as the little tag says on my picture there, with Square, a website development company here up in Dollsworth Worth. And I am a volunteer with TechSoup Connect, which you are all here to hear a presentation from tonight. Our guests tonight are from Roundtable Technology, and we're very excited to have them here. I'm going to let them either introduce themselves or let Justin introduce our presenters tonight, and let them take it away without any further ado. Sure, I will pretend like I had something planned and introduce our two presenters tonight. So, Roundtable Technology: we offer premier IT services for nonprofits, and so we have a lot of experience with nonprofits and technology. Kim Snyder is our VP of Data Strategy, and Joshua Peskay is our 3CPO, which I'll let him explain in a minute. They are going to give you a great presentation on data privacy, and I'll let them take it from here. Thank you. I'm Kim Snyder, VP of Data Strategy, and I'll let Josh explain 3CPO. I don't really have to explain it if no one wants me to. I'm just teasing. Oh, I want you to. It's a made-up title, because no one understood my previous title anyway, which was vCIO slash cybersecurity, and I had to explain it. So I function as a virtual CIO for organizations that Roundtable works with, as well as for Roundtable itself. I also function as what's called the CISO, which is a security officer or cybersecurity officer, for Roundtable as well as for various clients that we work with, running the cybersecurity team at Roundtable and designing the security program for our clients. And then I also, along with Kim and Justin, design and plan all of the content and program delivery that we do. I can't take credit for the title. A colleague about a year ago said, you really should change your title to 3CPO, because you're the three C-suite program officer. And I was like, that's great.
I actually changed my title to that. So that's the title; that's the story. And if you're a Star Wars nerd, yes, I understand that C-3PO is the character's name. It's deliberately reversed. Anyway, all right. So we thought we'd start with a little icebreaker where we would just see if anyone wanted to enter into the chat the most private, sensitive information that you would never, ever, under any circumstances want to share publicly. If you would just go ahead, we can have everybody actually put that into the chat for us, and we'll be happy to see it. But... We have your private information already. Yeah. If you don't want to do that, what you could do is tell us the one kind of question or concern or takeaway that you really wanted when you decided, hey, I'm gonna take a Tuesday evening and attend this webinar with these two folks from New York that I don't know. What would you want to get out of it? We'd be interested to hear those questions. So: why is it foolish to use a password manager? We will be happy to answer that, Eli. But wait, we've got some strong names. Wow. Yeah. All right, Eli Van Der Giesen, I hope that I said that well. Short answer is... we'll get to that when we get to the Q&A section, Eli. Clients' identifying financial info, and we'll certainly touch on that. That's not bad. And, do you require password protection for downloaded rosters with PII? Short answer to that, Sarah, is password protection not necessarily, but encryption certainly would be a good idea, and we'll definitely touch on that tonight. So these are all great things. Let's keep those coming in the chat, and we'll try to get to them. We've only got hopefully about 30 minutes of content, so we should have plenty of time for discussion in Q&A. We recognize this is a group that normally has a lot of conversations, so we don't want to bury you with content. All right, and with that, Kim, take it away here.
Okay, so this is really meant to provide an overview of privacy, and privacy today. And I also want to just add a note that we have resources. Josh and I love making resources that we make available to people. So I'm going to show you stuff, and whenever you see that little bird, that means this is a resource that's available. You can get all of these things at roundtabletechnology.com, the privacy stuff, because it's more important that you listen than that you worry about the details; you can hold onto those details for later. So let's talk about why privacy, why now? Okay, there are two big reasons. The first, we could say, is privacy legislation, which has been on the rise. This map on the left is from the International Association of Privacy Professionals; they keep this updated. There's actually more gray here now that committees are out of session, but there was a lot of blue earlier. So lots of laws are being passed in various states, or at least are being introduced and hashed back and forth in committee. Any state in red has a privacy law that has passed, and any state in blue has a privacy law that's getting pretty close to passage, that's in committee. States in gray have one that didn't quite make it out of committee. Remember, we're just right out of the session now. Let me make a quick point about these privacy laws, though, just so people aren't misled by whether your state has an active privacy law. It's really important to understand how privacy laws work: it is not about the state in which the business operates. It's about the state in which the data subject, which we'll define more clearly, resides or is a citizen of. And so whether or not your organization operates in any state that has a privacy law, if you have data on someone who is a resident of that state, then you are subject to that law. So, Kim, you can proceed.
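To make Josh's residency point concrete, here's a minimal sketch in Python. It's an illustration, not legal reference material: the state list and function names are assumptions for the example, and the actual set of states with active laws changes over time. The point it shows is just that applicability follows where the people in your data live, not where your organization operates.

```python
# Illustrative only: NOT a current legal reference. The set of states with
# comprehensive privacy laws changes; this is a hypothetical snapshot.
STATES_WITH_PRIVACY_LAWS = {"CA", "CO", "CT", "UT", "VA"}

def applicable_state_laws(constituent_states: set[str]) -> set[str]:
    """Return the states whose privacy laws apply to your organization,
    based solely on where the data subjects in your records reside."""
    return constituent_states & STATES_WITH_PRIVACY_LAWS

# A Texas nonprofit with donors in California and Virginia is still
# subject to those states' laws, even though it operates only in Texas.
laws = applicable_state_laws({"TX", "CA", "VA"})
```

Under this (simplified) model, `laws` comes back as `{"CA", "VA"}`: the organization's own state is irrelevant unless that state also has a law.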
Yes, and what we're seeing is what we'd call a crazy quilt of different privacy legislation. I'm gonna talk in a moment about creating a standard to base a privacy response or privacy program around, but hold on to that thought. The other big reason is cybercrime, okay? And this chart shows, actually in dollars from 2001, I think it's from the U.S. Department of Justice, the cost of cybercrime. And basically the important thing about this slide is the upward trajectory that continues by leaps and bounds, okay? Nonprofits are not at all immune, although we all know sometimes they think they are. Nonprofits fairly typically say: what do they want from us? We don't have a lot of money. Well, they do want money, and what nonprofits have is data. We have data about people, donors, things like that. And as nonprofits, especially those working in, I'll say, potentially sensitive areas, say media, journalism, women's health, those organizations are likely to be more highly targeted for cybercrime. They do want your money, but that's based on the value of your data. So these are two big reasons why privacy, why now. And I just wanna keep us on the clock, so you can look to that. By the way, Kim, this doubled actually. The most recent number that I looked at, in 2021 it was up even higher in the United States. So we're actually on an almost exponential growth curve in terms of monetary damages from cybercrime. Yeah, COVID really rocked this, because a lot of cyber criminals were stuck at home too. And so, just a tiny bit of history: the U.S. privacy laws have been by and large sector focused, meaning HIPAA, so a lot of us have heard about health information being protected, and FERPA, which refers to education records of students. So we had certain industry-specific laws that we thought of as our privacy laws.
And if we were an organization that received medical payments, then we were subject to HIPAA and we had to buckle ourselves in. And while that is true, the difference is that a regulation like the EU's General Data Protection Regulation, or GDPR, which came into effect in 2018, has a very different mindset about privacy. It's a much more person-focused, as opposed to industry-focused, regulation. GDPR asks us to think of privacy as an individual right. So protecting my privacy is my individual right. Now you could say this is also true of the U.S. laws, you could view them that way, but I'm gonna show you in the next slide the ways GDPR is different. You can move on. So, the individual privacy rights. This slide is a resource, little bird. Okay, these are the individual privacy rights defined by GDPR, and I'm gonna go clockwise, starting with the right of information. If you're gonna collect my data, the right of information means I have the right, A, to know that. I have the right to know why you're collecting the data. I have the right to know how you're going to use that data. So the data use statement: if you're collecting my data, how is that gonna be used? For marketing? For general kinds of records collection? And this does apply to the innocuous little email newsletter signup. If I'm gonna give you my data, I get the opportunity to know why. I also have the right of access, which means I can ask you for my data. If you've collected my data, I can ask you for it. The right of rectification means if you have my data and there's something that I want changed, whether it's inaccurate or outdated, I have the ability to ask you to please correct the data that you have, and you have to have the ability to follow through with that. Then the right of erasure, and that's the one that a lot of people are getting hit with in data subject deletion requests, and there's a whole market for generating those kinds of requests.
And some nonprofits have probably received some of these: you collected my data, I want you to delete the data that you have about me. Then the right to restrict processing means I can ask you to stop using my data. So you can keep my information, but I don't want you to use it for anything. I'm not gonna ask for it back, but you have to be able to stop using it. Notification means if you're going to delete it, or correct it, or restrict the processing, you're gonna tell me, to let me know that yes, you have done it. So these are all just some of my rights. Data portability, and with Facebook I think this one was a little bit more notorious: I've got to be able to ask you to give me my data back so I can take it somewhere else. And Facebook came up with a ridiculous way to do this, to satisfy this law. The right to object and to avoid automated decision-making is probably less of a relevant issue for a lot of nonprofits. This basically speaks to using my data to drive decisions, like AI uses of my data. If you're going to make a decision, solely based on my data, about whether or not I get a loan or get approved for a certain program or housing, I have the right to object to that and to question that. So these are all my rights if you collect my data. That's quite a game changer when we think about data collection. So if you go to the next slide. I just want to jump in for a moment to say that, I don't know about everybody else, you can thumbs-up or whatever if this resonates for you, but as individuals in the world, I think we are all very grateful for these rights and wish we had them protected in the US the way that EU citizens have been protected.
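As a hedged sketch, the rights Kim just walked through can be thought of as the request types an organization has to be ready to receive and track. The enum values and field names below are illustrative shorthand for the GDPR rights named above, not terminology from any regulation's text.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

# The GDPR data subject rights described above, as a simple request-type enum.
class RightType(Enum):
    INFORMATION = "information"              # know what is collected and why
    ACCESS = "access"                        # get a copy of my data
    RECTIFICATION = "rectification"          # correct inaccurate/outdated data
    ERASURE = "erasure"                      # the right to be forgotten
    RESTRICT_PROCESSING = "restrict"         # keep it, but stop using it
    NOTIFICATION = "notification"            # confirm deletion/correction happened
    PORTABILITY = "portability"              # give my data back in a usable form
    OBJECT_AUTOMATED = "object_automated"    # object to automated decision-making

@dataclass
class SubjectRequest:
    """One incoming data subject request, so nothing falls through the cracks."""
    subject_email: str
    right: RightType
    received: date
    fulfilled: bool = False

# Example: logging an erasure ("delete me") request when it arrives.
req = SubjectRequest("donor@example.org", RightType.ERASURE, date(2024, 5, 1))
```

Even a log this simple answers the basic compliance question: which requests came in, under which right, and have we actually closed them out.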
But as nonprofit professionals that are seeking to use data, this becomes challenging for us, because we've thought for a long time that the more data we have, the better we can serve our missions: the more information we have, the more we can leverage that information. And so there's a little bit of cognitive dissonance, perhaps, as individuals who really appreciate these protections, because they can feel kind of onerous in terms of our current data practices. So I don't know if that resonates with folks, but that's certainly how it can feel at times. And Beck Morse asked a question about privacy legislation around the world. GDPR has really shaped a lot of that legislation, so it's a good foundation, and it's also shaping all of the US laws. In fact, on the IAPP website, there's a legislation tracker that shows all of these different rights and which of the different laws include them, right? So think of GDPR as a model, and because there are so many different approaches to privacy, you're safest following it as the standard, as Josh said in the comments, okay? So: what is protected? When we think about personally identifiable information, or PII, this is any information that can identify a person. That can be your name, address, phone number, email, and the list goes on: your IP address, or the device identifiers on our phones, which are tracked and also provide geolocation information as we go around. Those things are all forms of identifiers. The items in green, and this is a resource that you can have, are, I wanna stress, a sampler of many of the common types of personally identifiable information. It's not by any means representative of all of it, and different regulations will cite different types of data, but this gives you a good starting set to think about; gender is also in there. So this information can be used, sometimes alone or in combination, to identify an individual. In red, we have, oops.
In red, we have sensitive information, okay? This is high-stakes information. It causes potential risk to people if it gets compromised. We're thinking here about Social Security numbers, passwords, password hints, right? Because people can be subject to financial risk when some of this information gets out there. Passport information, health records, and we're hearing about this a lot more recently, privacy around health and the vulnerability people have if health records get compromised. Financial records, who's giving money to what organizations, right? That's one form of financial record that's important for nonprofits. We also have credit card information, that type of thing, where identity theft is a risk. The items in orange are what we call the GDPR special categories. GDPR designates these as special types of information that are protected, and this may come as a surprise to some of us in the US: race and ethnicity, religious and philosophical beliefs, political opinions. We can see how, if this information gets compromised in certain settings, it could pose a hazard to the individual. And remember, GDPR is about protecting the rights and safety of the individual, right? Trade union membership is considered a special category under GDPR. Biometric and genetic information, health again, and sexuality. So this is a good guide to thinking about types of protected information, though by no means a comprehensive list. So the basic premise, as Josh was just speaking to: we've heard for a long time that data is the new oil. The more data we have, the more value we have; we need to have data in order to have a lot of value. But, if you click again, this premise in the age of privacy and cybercrime is very much challenged. If you think about data as belonging to the individual, organizations using an individual's data are really borrowing or licensing it.
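The green/red/orange tiers just described lend themselves to a simple classification map. This is a minimal sketch with hypothetical field names, mirroring the sampler on the slide rather than reproducing any regulation's full list: the point is that tagging fields by tier is the first practical step toward the "classification" practice discussed later.

```python
# Illustrative classification map based on the tiers described above.
# Field names are hypothetical; tiers mirror the slide's green/red/orange.
CLASSIFICATION = {
    # green: common personal identifiers
    "name": "pii",
    "email": "pii",
    "phone": "pii",
    "ip_address": "pii",
    "geolocation": "pii",
    # red: high-stakes sensitive information
    "ssn": "sensitive",
    "password": "sensitive",
    "passport_number": "sensitive",
    "health_record": "sensitive",
    "credit_card": "sensitive",
    # orange: GDPR special categories
    "ethnicity": "gdpr_special",
    "religion": "gdpr_special",
    "political_opinion": "gdpr_special",
    "union_membership": "gdpr_special",
    "biometric_data": "gdpr_special",
}

def classify(field_name: str) -> str:
    """Return the sensitivity tier for a field, or 'unclassified' if unknown."""
    return CLASSIFICATION.get(field_name, "unclassified")
```

So `classify("ssn")` returns `"sensitive"`, while a field nobody has thought about yet comes back `"unclassified"`, which is itself useful: it flags data you hold but haven't assessed.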
It's not my organization's data. It's information that still belongs to the person. And that's the paradigm shift that a more privacy-focused environment asks of us. Josh, anything you wanna add? Yeah, the phrase that I use quite regularly is that a lot of organizations, both in the for-profit and non-profit sectors, have thought of data as an asset, as something that has value, and only an asset. So the more that I have, the more value that I can potentially get out of those assets. And what I encourage people to think about is that in the realm of privacy laws and cybercrime running rampant, data is also a liability, right? So if you have an asset at home, some gold bars, let's say, they have value to you, but if they're stolen, you simply lost that value. But imagine if not only did losing those gold bars lose you the value they had, but you then had to pay some penalty for the number of gold bars that you lost: $10,000 per gold bar that you lost track of, because that was part of the law. So these things are liabilities as well, and it's important to understand that. Data is an asset and a liability. So what does this mean for us as organizations? If you go to the next slide, okay, so here's the deal. The right to be forgotten, the right to be deleted, means deleted everywhere. And I've been working, actually Josh and I have been working together for a long time with nonprofits, 28 years. That is true. And we've seen a lot of nonprofit organizations and their data solutions, and I think siloed is a word that a lot of organizations would use to describe themselves. And this is a judgment-free zone; you've got a lot of important mission-based work to do, and keeping track of all these different data silos can be tricky and hard. But the right to be forgotten means not just deleting it from the one place I know it exists. Oh, it's in our CRM, okay, here's this person's record. Oh, but wait, that's right.
We use Constant Contact or HubSpot, and that connects to our CRM. Okay, I guess their data is also there. And oh yeah, Raiser's Edge. And then it was in the report, and we downloaded it and gave it to our mailing house, et cetera. So exports, reports, backups, these are all forms of data. If someone says, delete my records, delete me, I need to be able to delete that person and know that I did, so that when I notify them and tell them I did, I know that their data isn't anywhere. So what does that mean? By the way, sorry. Yes, go ahead. I was gonna say, if any of you read about, or regrettably were victims of, the Blackbaud data breach that people got notifications about, I believe it was March 2020, there were organizations that had not been Blackbaud customers for multiple years who wound up being a part of that breach, because their data was in backups that Blackbaud was retaining that were breached by the attackers. And we'll cover this later when we talk about reasonable safeguards, but think of how you would feel if you had your data breached and had to send out notifications to thousands of your constituents because a vendor that you hadn't worked with in several years still had your data and lost track of it because they failed to delete it. So that's an example of a very large company failing to do this effectively. Go ahead, Kim. And on that, just so you know, all 50 states have breach notification laws, and that's been true for a while; that's not a new thing. They waited an inordinate amount of time before informing people, and the more time that goes by with the data out there, the more people are at risk. So I feel like we're giving you the bad news, but we'll turn it around, I promise. This is another resource. All right, so thinking about information privacy: at Roundtable, we have a very developmental approach to technology maturity.
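The "deleted everywhere" problem just described, CRM, email tool, exports, backups, can be sketched as a fan-out: every system that holds constituent data registers a delete routine, and an erasure request runs through all of them and records the results, so the notification back to the person is honest. Everything here is a hypothetical stand-in (in-memory sets instead of a real CRM or mailing tool), not a recipe for any particular product.

```python
from typing import Callable

# Hypothetical sketch: each system holding constituent data registers a
# delete function; an erasure request fans out to all of them.
data_stores: dict[str, Callable[[str], bool]] = {}

def register_store(name: str, delete_fn: Callable[[str], bool]) -> None:
    data_stores[name] = delete_fn

def handle_erasure(subject_id: str) -> dict[str, bool]:
    """Fan the deletion out to every registered store and return per-store
    results, so the subject can be told their data is gone everywhere."""
    return {name: fn(subject_id) for name, fn in data_stores.items()}

# Toy in-memory "systems" standing in for a CRM and a mailing tool.
crm = {"donor-42", "donor-99"}
mailer = {"donor-42"}

def delete_from_crm(sid: str) -> bool:
    crm.discard(sid)            # real systems also mean exports and backups
    return sid not in crm

def delete_from_mailer(sid: str) -> bool:
    mailer.discard(sid)
    return sid not in mailer

register_store("crm", delete_from_crm)
register_store("mailer", delete_from_mailer)

results = handle_erasure("donor-42")
```

The design point is the registry itself: if a system never gets registered (an old vendor, a forgotten backup, the Blackbaud scenario above), no amount of diligence in the systems you do track will catch it.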
We know that organizations are juggling a thousand things and that this is an incremental, highly iterative path forward, right? A lot of organizations start somewhere like what it says there: reactive. There are a lot of privacy risks, assessments aren't taking place, and privacy is not even on people's minds until someone gets a data subject access request and then, oh wow, what do we do? They're asking us to delete their data, do I even have to do this? And so that can sometimes kick it into gear. Hopefully that, and webinars such as this, help organizations say, okay, let's take a little bit more of a proactive approach. Hey, this is the universe, right, with cybercrime abounding and sensitive information in digital form, and we're gonna take some ownership of that. Okay, we've got limited resources, so we'll appoint our accidental data nerd, and that's, I think, the universal term, right, for someone who owns the data, who minds the shop. That person has an eye on the data in the organization. And so we start with data inventories, data mapping; we start to know where all our data is. And we start to give staff and employees and volunteers some security and privacy training so they know what they're up against, and we understand what our risks are. So that's adapting. And I'd say for nonprofits on these developmental roadmaps, yes, to be at level four, leading, is an excellent place to be, but many nonprofits do quite well at the adapting and integrating levels, two and three, where it starts to become more formalized, someone leads privacy throughout the organization, and information management becomes more of a day-to-day practice. But this is a very incremental path.
And I wanna stress that, but I wanna have this available for everyone here, because it does give you some characteristics and some things to look out for as you think about your own organization and where you are now and where you wanna be. Anything else, Josh? Yeah, so let's jump into our activity. We've been doing a lot of talking at you. So I've got the URL, I'll drop that in the chat and then we'll go there ourselves. Do you want me to talk everybody through it, Kim, or do you wanna do it? Sure, you had been talking a lot. Okay, so here's how this works. Hopefully people are able to get to this link. These are not all the data privacy practices that you might think about, but they're four of what we would consider to be the bigger ones. So, collection and classification: do we actually know, as an organization, all the data that we collect? And do we classify it in any way and say this information constitutes PII, or sensitive information, or a GDPR special category? Do we classify information in any way? So that's one practice. Storage: do we actually know all the places that data goes? So when someone registers for a webinar, do they go into our CRM system? Do they go anywhere else? Where does that data wind up? Does it wind up in a backup? Does that backup ever get deleted if we delete the record? Security: are we thinking about how we're protecting that information, both while it's at rest and in transit, in all the systems it winds up in? And then disposal: do we have any kind of a policy for when we get rid of data? How do we dispose of that data and make sure that it's not there? So what we'd like everybody to do, and this is anonymous, is just grab a dot, and for each of these practices indicate where your organization is at. So if I were doing Roundtable and I said, we're not doing collection, like I have no idea what data we have and we don't classify it at all, I'd put a red dot over there.
If I thought we were doing it really well and really formally, I'd put a green dot there. So everybody go ahead and grab whatever dot you think is appropriate for the level of practice that you have in each area. And while we're doing this, I'll say: if there is anyone who wants to talk about your practice, whether it's, we're absolutely doing this formally and we're pretty confident we're doing a good job, or you're doing it kind of, maybe you could tell us what kind of means to you. And if you're not doing it, tell us why. What are the obstacles, or is it just that you've not thought about doing it yet? So I'll take any volunteer, anyone who wants to talk about their particular practice at their organization. There's a lot of sort of. Yep. And disposal is hard, right? Whoever put the green dot on security, if you're willing to tell us about the formal practice that you have around protecting your data, because that's what we're going to go into next, I'd be very interested to hear from you, particularly if you're willing to speak up. That might be me, Josh, this is Neil. I guess all I would have to say is that we have consultants who monitor pretty carefully with respect to any access of the data, and that we clearly use encryption and other security protocols with respect to access and management of the data. Okay. Thank you, Neil. Does anyone else have anything they want to say on this? I'm also curious whether this is about what you would have expected in terms of responses from other organizations, or whether people find it surprising where we're landing as a group, which is, speaking frankly, a pretty stark lack of formal practices in any of these really, except for just a couple of organizations. So, I couldn't, oh, okay, this is Michelle.
Hi, so we're a very tiny organization, but we've been collecting data for a long time because we're an older organization. That data is held on our machines and on our backups, and we just started cloud backups last year. So the thing that I'm unsure of is, in terms of storage, I know the three devices that we have data stored on, but when we talk about it going to the cloud, I'm not 100% sure how it goes to the cloud. So what does that mean? And then that has an impact on security, because we have a website provider who gives us security, but only from the website perspective. And this data is in a flat database, a FileMaker database. I'm not 100% sure where things go, but we never get rid of anything, so I'm pretty good on disposal. Michelle, yeah, with cloud data backups, so much depends on what provider you're using for that, what practices you're using to back that data up, and how you're protecting it. And Kim, we are actually getting a little tight on time, so I wanna make sure we do have time for Q&A. So why don't we cruise through your last couple of slides and the security section, and then Michelle, we'll circle back to that when we get to Q&A, because we can dive in a little deeper. All right, but thank you, everybody. That was excellent. All right, so let's get back to our webinar. Yeah, Kim, can we try to hustle through? And in fact, why don't you take us to the life cycle? Okay. Yeah, so really what this is asking us to do is to think of data as having a life cycle in our organization and to manage it throughout. And Sarah, I believe, asked a question about how to put these practices and compliance in place, right? And I will answer that this is the topic for another webinar, which I'm sure we will at some point be doing, on your data handling policy, data classification and handling, because that is how you define how people in the organization use information.
In the absence of policy, this can much more easily run amok. But what data life cycle management asks us to do is to ask these questions, and you will get a copy of the slides, about data at all points: from the point that we get it into our organization, to how we store it, how we securely store it, who can access it while it's stored, do we control who gets to? Do we share it with people? Do we get people's consent first? And then finally, do we keep it forever, or do we have a certain deletion cutoff point? Those are all policies. There's no absolutely perfect right or wrong policy. It is something that, as an organization, you would want to develop in a way that makes sense for the culture of your organization. It's a meaty topic, but this is how you address a lot of these practices in your organization. So Josh, on to security. On to security, sure. And the reason that we have to talk about security in a data privacy webinar is that, in effect, if you are collecting data that needs to be protected in any way, you can't really have privacy without cybersecurity. Because if you can't protect the information, then privacy is going to be a real problem for all the information that you have, for reasons that hopefully are obvious. So let's talk about the things that you need to do. Now, cybersecurity is a huge topic unto itself, so for the webinar today, I'm going to try to keep this high level and give you a really practical, simple framework to think about security, all right? The first point is that information management and data leadership is increasingly a separate role within organizations. Smaller nonprofits, many of you here, probably think of everything technology under one roof, as one function. You just have your IT person, and she's responsible for everything.
So whether you need a new computer, need a new database, or need to comply with GDPR, it's all that person. But as these functions get more and more complex, and as organizations get larger, these roles are breaking out. On one side you have the data side of the house: people that are responsible for managing the data in the organization, the governance of that data, figuring out what privacy regulations apply. And then on the other side, you've got the cybersecurity piece: people that are responsible for setting the policies around what personnel can and can't do on their computers, what kind of password systems you use, what kind of authentication systems you use, access controls, all of the technical stuff around cybersecurity, as well as thinking about the overall posture, okay? So whether these are all in one person or one function in your organization, or whether they're broken up, it's important to understand that they are different functions. One side is about understanding what's all the data we have and what kind of protection it needs; the other side is responsible for implementing that protection around the data. Hopefully that made sense. In New York, where Kim and I operate, we got the New York SHIELD Act, which went into effect in March of 2020. You can imagine how much nonprofits in New York were paying attention to that privacy regulation in March of 2020; they were maybe distracted by some other mild thing that was going on at that time. And it is hitting them a lot through audits right now.
And it's the kind of thing, Eric dropped a question in the chat, and to Kim's point earlier when we showed all the privacy regulations, I would expect that states like Texas are going to be seeing regulations like these sooner or later, whether they're implemented at the federal level, or whether they simply wind up applying to Texas organizations because those organizations ultimately have data on subjects who reside in states that have their own privacy laws, whose citizens are therefore protected by them. Even right now, if you have data on a New York person, you're subject to New York SHIELD, right? If you have data on an EU citizen, you're subject to GDPR. And the New York SHIELD Act put together this reasonableness standard for security safeguards. The reasonableness standard has a long history in tort law; if any of you are lawyers or legal folks, you'll be familiar with that. And I think this is the best way to start thinking about security for your organization, and for the people whose data you are collecting, whether they're constituents or donors, volunteers or staff, whoever it is. So let's say that Maria von Haften and I have some kind of a data exchange. I've collected some data from Marie that contains her email address and her home address and her date of birth and some other information. And let's say I collected that information a year ago, and it turns out that I had a data breach, and I have to notify Marie and say, hey, Marie, look, I'm really sorry, I lost your data. Marie then is potentially going to ask me some questions: what data did you have on me? And I'll say, Marie, I had your name and your email address and your date of birth. She'll say, why did you have those? And I'll say, you signed up for our newsletter. She'll say, okay, I get why you had my name and my email address. Why did you need my date of birth for your newsletter? And I'm like, man, I don't know, we just collected it.
All of a sudden it's not sounding real reasonable. Marie is just asking some very simple questions and I'm having a hard time answering them in a way that sounds reasonable. Marie then further asks, what were you doing to protect my data? Had you had a risk assessment done? Were you using any kind of password policies? And I say, oh, really? We meant to do a risk assessment, but we never really got to that. And yeah, our staff use the same passwords for everything. Marie is going to say, what? You weren't reasonably protecting my data. That conversation goes really badly. On the other hand, okay, say it goes like this: I collected it six months ago. I just had Marie's name and email address. Marie asks me, why did you have it? I say, you signed up for our newsletter. Oh, that's right, I did. Okay, that makes sense, you would have that information. What were you doing to protect it? We did a risk assessment. We had a whole roadmap of things that we were doing, and we'd done everything according to the roadmap. Next month we were going to secure our backups, but we hadn't gotten to that yet, and there was a breach with our backup provider and your data was part of that breach. We're really sorry that happened; we were working on it, we just hadn't gotten to it. Then Marie could say, that all sounds reasonable. So that's what I encourage you to go through as a practice. Put yourself in the position of a data subject whose data you have lost and have to tell, hey, we lost this data on you. Think about the kinds of questions they might ask you and whether your answers to those questions would sound reasonable. I know it sounds ridiculously simple, but that actually works pretty well as a starting place for your security. Does anyone have questions on that? Does that make sense to everybody? Marie, are you okay with all that? I hope I didn't traumatize you. Just the hypothetical data breach.
I mean, it sounds vague, and there is a certain level of vagueness to these regulations. Question: do you keep how you got their information, ideally in a data inventory? Yes, the source of the data: where did it come from? Did it come from the person themselves or did it come from somewhere else? That's become more of an issue, by the way, with people getting more data from places like social media; Marie may not even know if she's ever had contact with Josh's organization. Yeah, so Sarah, to your question, Sarah's asking about security and whether you keep track of how you got people's data. In regard to these privacy regulations, under something like GDPR you have to be very transparent about how you collect their data; in fact, you can't collect it without consent. Here in the US, there are quite a few less protections. So, as I'm sure anyone who follows this knows, your data is sold all the time and aggregated by all kinds of parties. We're gonna talk about that in a little bit. So there are not as many protections in the US. Now, the hope is that those protections are coming, and they are coming in some states. We'll see if they make it to a federal level or if they reach a sort of critical mass across all the different states, okay? So Kim, I don't know, did you have anything in particular you wanted to say on the ethics side before we get into our last couple slides? I guess in some ways this is the brighter side of it. Yes, it's done on behalf of regulations. Yes, it's in response to growing cybercrime and threats from all over the place. But at the same time, what these practices ask us to do is really just be more responsible owners of data. And that's where it crosses into ethical data practices. There's a compliance element to this kind of stuff, a burdensome feel, but it is also just the ethical practice of data.
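To make the data-inventory idea from that answer concrete, here is a minimal sketch in Python. All of the field names and the example records are illustrative assumptions, not a prescribed schema: the point is simply that each data element records its source and its purpose, so you can answer the "why did you have that?" question.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical data-inventory record. The fields (source, purpose,
# collected_on) are illustrative, not from any formal standard.
@dataclass
class InventoryEntry:
    data_element: str   # e.g. "email address"
    system: str         # where it lives, e.g. "CRM"
    source: str         # who it came from: the person, a broker, social media
    purpose: str        # why we collected it
    collected_on: date

inventory = [
    InventoryEntry("email address", "CRM", "data subject (newsletter signup)",
                   "newsletter delivery", date(2023, 1, 15)),
    InventoryEntry("date of birth", "CRM", "unknown",
                   "unknown", date(2023, 1, 15)),
]

# Flag anything we could not reasonably justify to the data subject.
unjustified = [e.data_element for e in inventory if e.purpose == "unknown"]
print(unjustified)  # ['date of birth']
```

Even a spreadsheet with those same columns would serve; the value is in being able to surface the entries whose purpose or source you cannot explain.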
And the more our society and culture is saturated in digital information, the more we're gonna have to make these decisions. And we're gonna get dark, well, a little bit dark for a minute, and I just wanna prepare folks for that. To add to this idea of the ethics of it: I've been working for over a decade now in cybersecurity with at-risk organizations and individuals. My colleague Destiny has joined us here; she and I both do a lot of work with journalists, social justice organizations, human rights activists, and climate change activists for whom, depending on the country in which they live, data being exposed about them, the work that they're doing, and the people that they're working with can have real-life repercussions that in some cases can be quite serious: it can lead to being apprehended and jailed, all the way up to being physically harmed. And so the stakes for protecting information on certain stakeholders can get very high. Now, let's put ourselves, if we want, in the shoes of an internet service provider in the state of Texas. Internet service providers are allowed to collect the browsing data of the people whom they serve. And so they're collecting that data all the time so they can sell it to advertisers. Now, on an ethical level, okay, if I'm an internet service provider in the state of Texas and a person I am providing services to is browsing information on how to get abortions, where to get abortions in the state of Texas, that is information that can be legally subpoenaed by the state of Texas in order to press charges against that person or the providers or people with whom they were in contact. So the internet service providers have to ask themselves, do we wanna have this data, right? Because if we have it, then it potentially can actually cause harm to our customers. And put yourself in the position of a nonprofit, wherever you're operating, and think of the data that you have on different kinds of stakeholders, right?
Based on the different sensitivities they may have. There are certain places in the U.S. where obviously seeking information on abortions is now something that is legally problematic. And so if you have information on that, it's highly sensitive. And if you fail to protect that information, or if you collect that information when in fact you have no reasonable need to, you are creating quite significant risk for stakeholders in a way that you could argue is unethical, right? And so that's where this starts to get very challenging for organizations, when you think about the different kinds of information you're collecting on the different stakeholders with whom you work, all right? So that's an increasingly important thing to think about in this world, especially when the information that you collect, that Google collects, that Facebook collects, that TikTok collects, is one legal subpoena away from being in the hands of the FBI or the nation state of India or any entity that may seek to get information about those stakeholders. So Michelle has a question: let's say you don't collect that data, how then do you continually follow up with potential customers? So Michelle, to stay with the specific example, the reason the internet service provider wants to have this information is so they can sell it to advertisers. So if I search for how do I increase my muscle mass, they can send me ads for supplement shops and kettlebell stores and whatever it is. That's why they want to collect that information: they're viewing it as an asset, it has value. However, right, in the climate that we're in, it can also be a pretty significant liability, not just for the ISP, but obviously for their customers, who could be significantly compromised by that same information. So when the kinds of things that you're searching and browsing for can cause you potential harm, that becomes a much more loaded situation. Does that answer your question, Michelle?
Not entirely, because the problem is that, although my organization doesn't operate quite this way, I know a lot of organizations that say if you come to us seeking information, you are a potential customer. And if I shouldn't keep any of your data, how is it that I get potential customers? If you make one inquiry, how do I follow up with you if I don't have that data on hand? But I think it's not so much that you can't collect the data. Responsible data practices, thank you. Responsible data practices mean you're gonna collect that which you need, right? So for these customers, you probably don't need their Social Security number, blah, blah, blah. So it's data minimization: you collect the minimal amount of data that you need in order to perform your business practices, and then you protect it. And this is where the cybersecurity practices come in: awareness training, and, as with some of the questions dropped in earlier, are password managers wise? Absolutely. These are the types of things that you can put into place as your day-to-day practice, building a culture of security and privacy in your organization. That is how you do it, but you do have to do it. Thank you, that helps a whole lot. Because yeah, none of us are ISPs, I don't think, and we're not Google; they have their own reasons, but you do have a legitimate business purpose for collecting some data, and they didn't need to collect all of that. I'm gonna put the slide back up and we'll open it up for questions. So Suzanne has got a question, go ahead. Hello, hello, thank you for taking my question.
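The data-minimization practice described in that exchange can be sketched in a few lines of Python. This is a toy illustration under assumptions of my own: the `NEEDED_FIELDS` set and the example form fields are hypothetical, standing in for whatever your organization has a stated business purpose to keep.

```python
# Illustrative data-minimization filter: keep only the fields the
# organization has a stated business purpose for. The field list is
# an assumption for this sketch, not a prescription.
NEEDED_FIELDS = {"name", "email"}  # enough to follow up with an inquirer

def minimize(submitted: dict) -> dict:
    """Drop any submitted fields we have no business need to retain."""
    return {k: v for k, v in submitted.items() if k in NEEDED_FIELDS}

inquiry = {
    "name": "Pat Example",
    "email": "pat@example.org",
    "date_of_birth": "1990-01-01",  # no need for newsletter follow-up
    "ssn": "000-00-0000",           # definitely no need
}

print(minimize(inquiry))
# {'name': 'Pat Example', 'email': 'pat@example.org'}
```

The design point is that the filter runs at collection time: data you never store is data you never have to protect, justify, or report in a breach.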
So this probably overlaps between information management and data management. I think this would be a hypothetical, let's say I'm asking for a friend: a hypothetical nonprofit that has one employee, and this nonprofit has Google Workspace and a CRM and things like that. The board wants more than just that one employee to have access to the Google Workspace and be an admin; they wanna have somebody else be an admin also for some of these spaces, just in case the one employee leaves suddenly. Who, besides a board member, might be able to hold that access in a safe way? Or is there a way for a second person to possibly hold it without giving them access to the information? Do you follow my question? Yeah, absolutely. Suzanne, that's a super common kind of question and problem in organizations large and small, right? Which is: how do we give someone administrative access to systems without giving them access to sensitive information? And the short answer is it's not difficult, right? You can restrict permissions to certain data within Google Workspace so that even an admin can't get to it. So Destiny and I and Kim can all be admins in the Google Workspace, but Destiny can still make a folder that only she and Kim have access to and that I can't get access to, all right? Being an admin is not going to allow me to get access to it. Now, what I would be able to do as an admin is go and reset Kim's password or reset Destiny's password, log in under their account, and go access it, but they would know that happened, because I wouldn't be able to put the password back; I wouldn't know what it was. So there's that protection in place. Does that make sense, Suzanne? So you can have those kinds of controls in place. Then there's what's called logging, and for this you might need some help.
You could go to Google support for that and just make sure that logging is enabled for any changes to accounts, any changes to folders, any permission changes. Then you would have an audit log if that board member or this other person went and did any monkey business with the permissions on the system. Does that help? Yeah, I think that basically this organization wants two super admins in the case of the Google Workspace, but wants a way that the second super admin can't access certain things. So we might have to have some compromising discussions there and make them an admin with only certain permissions instead of a super admin. Okay, thank you. Thank you. That's certainly the sort of thing that we can help you with in a short consult, Suzanne. So feel free to reach out to Eric or Justin out there in Texas and we'd be happy to get someone to help you with that. Other questions? I think we're getting close to time, Mark. I'm happy to answer other questions that folks have, and I'm sure Kim is too, but I also want to be sensitive to wrapping up if we need to wrap up. Yes, wait, Sarah has a question about encryption in Office 365. And yes, you would need to protect it, especially if it involves people who are EU citizens, but even if not, you do want to protect it, and you can protect and control access in Office 365; you can get pretty fine-grained. My question is, I have been told that we needed additional protections, not just having it sit in Office 365 where only certain people have access. We were told that we needed additional protections, like passwords for any files that have PII, beyond only the people who have access to that SharePoint site with their company password having access to it. Each document would need an individual password.
Yeah, so each document needing an individual password: I hate to throw whoever gave you that advice under the bus, but I'm not familiar with that being an approach that anyone would recommend for protecting data. Thank you. PII being in something like SharePoint is, generally speaking, fine. The vulnerabilities would be primarily around the security of the accounts that have access to it. So if you aren't using multifactor authentication and everybody's password on your Office 365 tenant is "password", then that data is not very secure, but it is in fact encrypted while it's sitting in OneDrive or in SharePoint. And if you wanted to put additional protections on it, Sarah, in order to improve things, you would wanna use the Microsoft 365 security and compliance tools. That gets complex, but there's a function referred to as data loss prevention, or DLP, where you could prevent people from, say, downloading a spreadsheet that has a bunch of PII in it. So you could set rules that say, hey, if a document has email addresses in it or Social Security numbers in it, and someone goes into our SharePoint and tries to download it to their workstation locally, our DLP, or data loss prevention, will say: you can't do that, that's not allowed, because this has sensitive information, so we're not going to allow an unprotected copy to be created on your workstation. Without those protections, you have some risk that people may download that PII and email it around and do all sorts of other things that remove it from those Microsoft protections. But as you've described it, Sarah, I think it's perfectly fine, and password protecting it document by document sounds very onerous and not particularly helpful to me. Yes, and our colleague Destiny has a lot to say about access controls and audits and some of the data loss prevention things you were just talking about, but yes, you can protect it, throughout its whole life cycle.
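To make the DLP idea from that answer concrete, here is a toy sketch of the kind of rule a DLP system evaluates before allowing a download. This is loosely modeled on what Microsoft 365's data loss prevention does with sensitive-information types, but the patterns and the policy below are illustrative assumptions of my own, not Microsoft's actual rules.

```python
import re

# Toy DLP-style check: patterns for content we treat as sensitive.
# These regexes are deliberately simple and illustrative only.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def allow_download(document_text: str) -> bool:
    """Block the download if the document contains sensitive content."""
    return not any(p.search(document_text) for p in SENSITIVE_PATTERNS.values())

print(allow_download("Quarterly totals: 120 donors"))                 # True
print(allow_download("Contact: maria@example.org, SSN 123-45-6789"))  # False
```

A real DLP product layers detection like this with actions (block, warn, audit) configured as tenant-wide policy, which is why it protects data in a way that per-document passwords cannot.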
We have reached the hour mark. I want to thank Kim and Josh for joining us this evening with a great presentation, and thank everyone who joined us this evening. Do you guys want to real quick give a shout-out to your organization and where to find more information? Justin, we'll leave that to you. Yes, as I explained before, if anyone's interested in learning more about cybersecurity and data, you can visit us at RoundtableTechnology.com.