Hi everyone. Thank you so much for joining us today for our webinar, Tools of the Trade: Privacy in the Digital Age. Before we get started, I just want to go over a few housekeeping items. All callers will be muted, so if you have any questions, feel free to use the chat box that you see on the left-hand side of your screen. If you lose your Internet connection, try refreshing your browser and using the link that was emailed to you when you registered. If you're interested, or if you have to drop off early, or if you want to watch the webinar again at a later time, you can visit our website, www.techsoup.org/community/events-webinars, to watch the webinar at a later time. We'll also be sending an email with the presentation, the recording, and the links. And if you're on social media, feel free to send a tweet at us, @TechSoup, using the hashtag #tswebinars. But again, like I said, we'll be checking the Q&A as we go through the presentation today. So just a little bit about TechSoup. We're in 236 countries and territories, and we work with over a million nonprofits. So I just want to take this opportunity to give you guys a chance to use the chat box if you want to chat in where you're dialing in from today, and I can read out a few of the responses. Okay, we have San Francisco. So somebody right next door to us, Plainfield, Indiana, Portland, Lowell. Do we have any international folks? We don't see any yet. We have Prescott, Arizona, Fort Wayne, Indiana, so a lot of Indiana folks on the line. All right, so the chat is working. That's good. So just a little bit more about TechSoup. We work with several technology partners who make our mission possible: Adobe, Intuit, Microsoft, Symantec, just to name a few. If you're interested in learning more about the products that we offer nonprofits, you can visit our website. The URL is here, techsoup.org/get-product-donations, and see what's available to you and to your nonprofits. 
So today we have a presenter, Tracy Ann Kosa, who is a privacy researcher at Stanford. So just a little bit about Tracy. Tracy is a privacy researcher at Stanford University and adjunct faculty at the Seattle University Faculty of Law. Her current work proposes privacy patterns for enforcement and regulatory activity. Dr. Kosa has held a number of leadership roles in privacy at Microsoft and in government. She holds a doctorate in computer science, focused on privacy, a master's degree in ethics and public policy, and did undergraduate work in economics and political science. So I'm going to go ahead and pass it off to Tracy. Thanks, Seema. I appreciate it. I wanted to start off today by saying thank you very much to everybody for taking the time to join us. And I'm very much looking forward to hearing and seeing your questions. So please do actively use the chat box if you have anything you'd like to raise specifically for today. I'm going to read to you a little bit about the Stanford center that I come from, just so you have some good background information. The Stanford Center on Philanthropy and Civil Society is known as Stanford PACS. Their mission is to develop and share knowledge to improve philanthropy, strengthen civil society, and effect social change. PACS is mandated to connect students, scholars, and practitioners, and they also publish a journal called the Stanford Social Innovation Review. That's where our research work is funded out of. To that end, I wanted to take a minute and ask if you would mind sharing with me what type of organization you work for or with. These are some loose categories, and if you don't quite fit the bucket, I understand; it's not scientific, but it'd be great to see your responses. This is fantastic. Thank you all. I'm just watching the poll results come in. Seema, I just hit skip to results to share them with everyone, correct? That's correct, yeah. Okay, great. So you can all see where we're at today. Fantastic. 
It's great to see such a diverse group, and I know you'll probably have questions related to your specific area. So please, again, feel free to raise those as we go along, and I'll do my best to address them now or at a later point. The next question that I have for you is to tell me a little bit about how comfortable you feel in the privacy area. So I would assume lots of you are here because you're interested, but I'd like to gauge how many folks feel like they're experts, or they're really just starting at the beginning, or maybe are familiar with some of the basics from any particular source. And it sounds like that's where a lot of you are probably sitting. Okay, I'll share those results with you. It looks like we have a good standard bell curve which is great to see. And again, questions, comments, concerns, feel free to drop them in. So let's get started. All right. So the Digital Civil Society Lab investigates challenges and opportunities for civil society to thrive in the digital age. And this, of course, has expanded the potential for civil society participation, but it also presents new challenges and threats. And our dependencies on digital software and infrastructure that are commercially built and government surveilled require new insights into how we make these digital systems work and how we can engage with them safely, ethically, and effectively for civil society purposes. What that means for our work specifically is that we're focused on looking at how privacy and privacy requirements fit into digital civil society. So the lab overall seeks to engage researchers, practitioners, and policymakers, and it touches on technology, organizations, policy, and norms. I firmly believe that privacy fits into all four of those buckets and was really happy to see that the Civil Society Lab was interested in funding this work. Today I'm going to walk through a little bit about some of the issues in privacy for a digital civil society. 
We're going to talk a little bit about the hypothesis and design of our research to give you some context for the web app that we've built. It's called Privacy Patterns, and I want to be really clear with you: it was designed and intended to be a free tool. It will always be a free tool, and it's currently up and available to be used and explored, but it's very much in beta. And that's part of why we wanted to share it with you today, to begin to get some feedback and better understand who's going to be using the tool and how we can improve it. So with that, let's jump into privacy. The first thing I want to do is level set what the expectations are when we use this word privacy. There are a lot of legal definitions for it. There are a lot of personal reactions to it. I like to think of privacy as broken out into three different components. We deal with territorial privacy, we deal with physical privacy, and we deal with informational privacy. And all three of these things come together in how each individual thinks about and represents their privacy decisions and choices. To talk a little bit about physical privacy, I think the best example I can give you, given that we're in a webinar, is that if you look around you and examine the personal space you have in a room, that's very much a privacy decision that you make. There's been some really fascinating research in this space done in social psychology. And as a Canadian, I'm happy to tell you that they definitely got it right for me personally. Canadians on average require 2 to 3 feet of physical space around them in order to feel comfortable, probably indicative of a 30 million person population in a very, very large country. But other countries, other cultures, and even families operate with a different sense of personal space. And that's very much a privacy issue. That privacy issue can also be extended to what's called territoriality in law. It refers to your stuff. 
When I drive my car somewhere and park it, I fully expect it to be there when I get back. But what kind of car it is, and what's in that car, also speaks to me and my identity. Personalized license plates can be a good example of that. That's another kind of privacy: the expression of our identity, what information we choose to share. And then of course you have informational privacy. And this isn't just about the information that we create about ourselves when I send an email or a text message. It also includes all of the information that is created around the transmission of that information. So you may have heard the term metadata. Over the last few years it's become more commonplace, but this refers to information that is created by machines when we use them, and that can actually be used for some really interesting and possibly scary behavioral analysis. With that, let's go a little deeper into some of the common requirements when we talk about privacy. You've probably seen things like policies and consent statements, notice and contact information. These are some of the basic requirements that we see in privacy, in law, and also in the expectations that people have of organizations that collect, use, and disclose their information. Consent statements are generally an area of contention right now. For example, in healthcare, consent for treatment can often be conflated with consent for the management of information. There's been quite a lot of research lately that also suggests that, quote, consent is broken. This is not, I think, particularly helpful, since we don't have an alternative to date. But I want you to know that there's a lot of discussion and debate around some of these mechanisms and these common requirements as we move forward today. So when we think about privacy, one of the easiest ways to look at it is by country. 
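To make the earlier metadata point concrete, here is a small, illustrative sketch in Python using only the standard library's email parser. The message and the addresses in it are invented for the example; the point is simply that the headers, the machine-generated record of who, to whom, when, and via which server, exist alongside the content the sender actually typed.

```python
from email import message_from_string

# A minimal, invented message: even two lines of content travel with
# machine-generated metadata (the headers).
raw = (
    "From: alice@example.org\r\n"
    "To: bob@example.org\r\n"
    "Date: Mon, 05 Mar 2018 09:14:02 -0800\r\n"
    "Message-ID: <abc123@mail.example.org>\r\n"
    "Received: from mail.example.org (10.0.0.5)\r\n"
    "\r\n"
    "See you at noon.\r\n"
)

msg = message_from_string(raw)

# The body is what the sender wrote; everything else is metadata.
body = msg.get_payload()
metadata = dict(msg.items())

print(body.strip())              # the "content"
for key, value in metadata.items():
    print(f"{key}: {value}")     # who, to whom, when, via which machine
```

Scaled up to every message, call, and page view, that header-level record is exactly the kind of material behavioral analysis works from. With that aside, back to the country-by-country view.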
The Internet, I think, was very much intended to be a global "information superhighway," which I think was the phrase for a good long time, for those of you who might be old enough to remember. And what we have seen, of course, is that with increasing privacy laws in different countries, the Balkanization of the Internet is really much more of a reality. And the EU GDPR, I think, represents yet another aspect of that, with additional privacy laws that are being passed specifically in the European Union, specifically for those countries, but that will have an impact on how we all use information and what our expectations are. So let's take that back to the United States. I apologize for this eye chart. I want you to take away from it the complexity as opposed to the actual content here. Our research team looked state by state to examine what privacy requirements there might be that we could pull out issue by issue. Now, tools like this, where you can do these kinds of searches, are fantastic. However, they are mostly accessible to lawyers, and by subscription, with not insignificant costs associated with them. They're also behind paywalls. And they can often still be written in legalese. So if you're turning to a law to try and understand what your requirements are, it can be a very challenging situation. Let me give you an example. In Alaska, which I'm picking because alphabetically it was our first result after Alabama, which has no requirements, a privacy breach is defined as, quote, the unauthorized acquisition or reasonable belief of the unauthorized acquisition of personal information that compromises the security, confidentiality, or integrity of personal information maintained by the information collector. Now, I can tell you as someone who has studied and worked in this field for about 20 years, I'm already lost. That's not only a notably long sentence, but it also contains a lot of words that are open to interpretation. 
For example, reasonable belief of an unauthorized acquisition. And then you deal with the concept of what is considered a compromise to confidentiality, what is considered a compromise to integrity. How do you define an information collector? All of these notions and thoughts and expectations are really open to interpretation. Let me dive a little further and give you another example. If we go straight into that Alaska requirement, we know, for example, that you can multiply this by most states in different categories, and then you would have a pretty in-depth notion of breach notification, just for example. On the left side of this slide, you'll see a general breach notification statute quote that comes straight out of Alaska. And it effectively is attempting to narrow the definition by suggesting that if the organization owns the personal information on a resident of the state, then that's a breach. And no notification is required if there's no reasonable expectation of harm to that resident. But if the organization doesn't own the personal information, then it must notify the person that does own that information. And again, to me and to several of the folks that I've worked with on this project, this becomes an extremely complex, almost Venn diagram or flow chart. And bear in mind, contextually, we're looking at one particular requirement for one particular breach notification law in one particular state. So you need to kind of push further and push further and push further on each of these definitions. And it gets complicated. The definitions of personal information also change substantially. So, for example, here there are specific data points laid out on the right hand of the slide that talk about first name, last name, initials, social security numbers, driver's licenses, et cetera, et cetera. 
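That Alaska-style notification flow chart can be sketched as a small decision function. To be clear, this is an illustrative simplification of the logic as recounted here, not a reading of the statute, and the conditions it takes as clean booleans, "owns the data" and "reasonable expectation of harm," are exactly the terms the statute leaves open to interpretation.

```python
# A simplified sketch of the breach-notification flow described above.
# Illustrative only; real statutory analysis is far messier.

def notification_duty(owns_data: bool,
                      subject_is_resident: bool,
                      reasonable_expectation_of_harm: bool) -> str:
    """Return who, if anyone, the organization must notify."""
    if not subject_is_resident:
        # Out of scope for this state's statute.
        return "statute does not apply"
    if owns_data:
        # Owners notify affected residents, unless there is no
        # reasonable expectation of harm to the resident.
        if reasonable_expectation_of_harm:
            return "notify affected residents"
        return "no notification required"
    # Non-owners must instead tell whoever owns the information.
    return "notify the information owner"

print(notification_duty(True, True, True))    # notify affected residents
print(notification_duty(False, True, True))   # notify the information owner
```

Even in this toy form, every branch hinges on a judgment call, which is the point: each definition has to be pushed on further and further before the flow chart can be followed at all.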
You can think about all of the pieces of personal information in your day-to-day life, or even just look at your driver's license, and begin to understand that if you list out data sets in terms of what is considered covered by privacy requirements and what is not, that again gets very complicated very quickly. Well, given that there is not consistency around breach notifications or requirements or definitions of what constitutes a breach, the next question we had was, well, do breaches actually happen? If you're looking at a largely unregulated sector when it comes to privacy, can you say that there was a breach? And this is where things get interesting. This chart represents some data that we obtained from the Privacy Rights Clearinghouse, which is a California nonprofit corporation. It's a 501(c)(3). It has a two-part mission: it deals with consumer information and consumer advocacy. It was established in 1992, so they've been working in this area for quite a while. And they have an open, searchable database of data breaches that are reported by anyone. So, for example, if you know of a data breach, you can go to their website, enter a few data points, and they begin to track those things. In the NGO category that Privacy Rights Clearinghouse uses, which covers non-governmental organizations, or effectively civil society nonprofits, you can see that they've tracked some really interesting breaches. And these numbers represent the actual number of records that were breached. So you'll see there's a fairly significant spike in 2011 of 3 million records that were breached. This refers to the 2011 U.S. Chamber of Commerce China hack, where hackers in China were able to breach the computer systems of the Chamber of Commerce and had access to the information of roughly 3 million members from November 2009 to May 2010. It was discovered in May 2010, but there's still some evidence that suggests those systems were compromised as of March 2011. 
Email communications were also compromised. And there were also things like trip reports, schedules, policy documents, meeting notes, all kinds of information around expense reports, and other bits and pieces of folks who had dealt with the U.S. Chamber of Commerce. So I think we know breaches are definitely an issue. You'll see reporting of them trailed off in 2016 and 2017. And nobody really knows why, since it's a sort of unscientific study, but it represents an interesting notion that NGOs are still considered to have these kinds of breaches. So for another data point, we turned to look at what the coverage of civil society and privacy effectively looks like in the news. What does the coverage look like? Is it advisory? Is it about breaches? What kind of reputational risks are nonprofits facing? And I've pulled into this deck a sort of highlight of some of these issues based on our literature review. This is one of the earliest ones we found, in 2012. Compliance SAI Global is a newsletter that operates predominantly in the U.K., but is of course online. This article references the ICO, the U.K. Information Commissioner's Office, which announced that a charity called Enable Scotland had breached the Data Protection Act. The ICO stated that there were two unencrypted memory sticks and papers, which contained personal details of up to 101 people, stolen from an employee's home. That information included people's names, addresses, dates of birth, and health information. So not an insignificant breach. In this particular article, they report that the ICO stated that the charity did not have specific guidance for home workers on keeping data secure. Portable media devices were not routinely encrypted. And the Chief Executive Officer of the charity had signed an undertaking committing the charity to improve its compliance with the Data Protection Act. 
So again, this is a really good example of an article where you would expect some advice or guidance about how to address some of these issues, but it's a little bit more general than that. It's mostly about the actual context of the issue itself. Let's look at another example, also coming out of the U.K., in 2014. This is a Computer Weekly article, and I should mention all of these links are in the slides, so you can check on any of the details that you would like. The takeaway from this particular article was that the ICO investigation found that the charity failed to realize the website was storing the name, address, date of birth, and telephone number of anybody who had requested a callback for advice on pregnancy-related issues. The personal data was not stored securely, and a vulnerability in the website's code allowed a hacker to access the system and locate the information. This was a really specific, targeted example of where data breach requirements were not in place because the folks who were running the charity, of course, didn't have the cybersecurity expertise necessary to address some of these issues that are now a little bit more common and well-known with respect to running websites. Let's try the next one here. Another article from 2014 was published in the Nonprofit Times. The Nonprofit Times actually has done a really good job in this article of trying to frame this advice and these issues around data breach notification for nonprofits as something that they could act on. In this particular case, they talk about a targeted attack by hackers. The LA Gay and Lesbian Center last year, so that would have been 2013, was a victim of what the charity described as a sophisticated cyber attack designed to collect credit card numbers, social security numbers, and other financial data. 
In a statement that was published at the end of 2013, the center said that there was no actual evidence that anyone's information was acquired, but approximately 60,000 clients and former clients were notified that their information might have been compromised over a 4 or 5 month period. This is a good example of how the reputational issues associated with charities could have a real impact on how it is they manage personal information, and it gives more credibility to why this issue needs to be looked at in greater detail. Another newsletter we found was from a law firm specifically dealing with a lot of nonprofits. They provided the advice that smaller breaches are actually of more concern to leadership and boards at nonprofit organizations. They are managing a lot of personal information. You all are. And that's not just about the members and volunteers. It's also donor information, employee information, all of which can include all kinds of sensitive data points: name, address, phone number, email addresses, et cetera. Some nonprofits have mature privacy and security programs, but I think for the most part what we need is some standardization, and we are starting to see in these articles concerns about the inconsistency of how these requirements or best practices should be implemented from a resourcing perspective. In 2015, things start to shift a little bit. And again, nonprofit news raises really good points about best practices. For example, permitting personal information to be stored on laptops or smartphones is problematic, especially if they are not encrypted. Drawing a line between what employees do on their personal devices versus what they do on work computers is another important point. 
One of the interesting things that the Nonprofit Times highlights here, and this is in 2015, is that the standards are becoming increasingly complicated. Doing best practices in data management and security is especially challenging when data is moving between devices and storage sites. You have many privacy regulations that require businesses to protect personal information no matter where it resides. So that could be on a network, that could be on an employee device, it could be, of course, on paper as well. And so thinking about a holistic approach to dealing with data privacy becomes more and more important. An insurance company newsletter in 2015 had some interesting case studies about a Utah food bank breach that occurred that year. Unauthorized access was gained to 10,385 donors through the website. Again, the Utah food bank had to absorb the cost of offering credit monitoring and restoration services to people who were victims of identity theft as a result, which took a significant bite out of their budget. I'm happy to report we also found a TechSoup article, a really good one, that outlined different notions of nonprofit mandates for the protection of data beyond simply regulation, aka trying to do the right thing, whatever that means. One particular quote jumped out at us: most nonprofit leaders admit knowing too little about the risks and consequences of failing to adequately protect personal information collected from employees, volunteers, clients, and donors. I think the important takeaway from that is that we are talking about different kinds of personal information with different contextual risks around it. The potential breach of donor data has very significant consequences for a nonprofit, but they are very different from a potential breach of employee data. And that's an important thing to consider as you build out privacy requirements. 
In 2017, we start to see a little bit more specificity in some of these articles that are being written. This is a law firm newsletter specifically geared towards nonprofits, and it again reviews a privacy breach that resulted in, essentially, malware being installed on the nonprofit's computer, and then they were held ransom. The files and data that were on those computers were held ransom. The hackers demanded $43,000, and when the organization refused to pay, the hackers posted on Twitter private letters that the organization held in its capacity as a service provider. And consequently, in addition to losing all of their files, the organization also lost funding, because they didn't have any of the administrative data that they needed to apply for grants. This article really stresses that this is a much more common kind of breach that doesn't necessarily make the mainstream news but has dramatic and drastic consequences for the nonprofits that operate in the space and can be subject to those kinds of attacks. Charity Digital News published a series of notifications written in 2017 by Elizabeth Denham, who is the new Information Commissioner in the U.K., at the ICO's office. A lot of these posts, many of them myth-busting pieces, have much more specific guidance in them and talk about how quickly the details need to be provided to a regulator when a breach occurs. This is a fairly substantial change from what we've seen in privacy in the past, where there was an expectation that an organization could identify, stop, solve, and remediate breaches within even as little as 72 hours. The ICO and other data protection authorities are now recognizing that that's not feasible, and that you may become aware of a breach within a certain time period and there is an expectation to share that information with the regulator, but not necessarily full remediation plans or comprehensive reports at the outset of the discovery or detection of an incident. 
Regulators seem to be pivoting to understand that nonprofits specifically need some support and guidance as they go through these kinds of incidents. And I think that's a really good sign. There's another data breach tracking site. This one particularly highlights specific data breaches related to employee data. It was kind of an interesting case. There was an employee who was working at a nonprofit who sent spreadsheets containing information on vulnerable clients to his personal email address without the knowledge of the data controller, which was his employer, the nonprofit. The defendant sent 11 emails from his work email account to his personal account. They contained sensitive data of 183 people, including children. The data points were full names, dates of birth, telephone numbers, and medical investigations, and he had apparently done this in the past, as the nonprofit discovered when it dug in to research this breach. I find this one particularly interesting because there really is no financial motive here, no suspected hacking, no one after identities, none of these other things. This really does appear to be a case of an employee who wanted to do work at home and who just wasn't familiar with what the best practices around the management of client data should be. And it's probably a good example of why training employees on these best practices is a really important part of doing business. Last but not least, I wanted to share this article from Information Age, also published in the U.K. Data protection laws have never been so stringent. This was evident today as it came to light that 11 U.K. charities had been fined by the ICO for misusing information about millions of past donors to seek further funds for future projects. The different organizations are listed in this article. They include Oxfam, Cancer Research, Cancer Support, and the Royal British Legion, and each of these fines ranged anywhere from 6,000 pounds all the way up to 16,000 pounds. 
Collectively, the charities were fined 138,000 pounds, and the ICO decided to keep these individual fines quite low because the donors, the ones who had actually, quote, been exploited, had specifically given statements to the ICO that they would be unhappy at more severe financial punishments, which is kind of a good news story, I guess. The donors of these charities really stood behind them as they faced some rather rigorous review of their data practices. And the ICO also talked specifically about why these charities were targeted. They had screened millions of donors to target them for additional funds. Other charities had traced or targeted new or lapsed donors by putting together personal information obtained from other sources. And other charities yet had traded personal details with different charities to create a large pool of donor data for sale. And I think some of those practices are fairly common, and in the U.K. you will see an increasing tendency towards trying to shut those down, which presents some really interesting challenges. So to sum up, from a literature review perspective, there were really only five points that we managed to cull from this massive amount of academic and industry literature that we found. First of all, make sure that you have advice for employees on how to keep data secure. Second of all, make sure you've got some mechanism to deal with understanding compliance obligations related to managing personal information. Thirdly, be careful that your employees only have access to the information that they need to have access to, that is, legitimate use. And train them about not, quote, snooping in data just for interest's sake. We've seen similar things in the healthcare industry. Fourthly, there are lots of recommendations and restrictions on not permitting personal information to be stored on a laptop or smartphone, that is, a work- or organization-owned laptop or smartphone. 
And then finally, I think what we really see in a lot of this coverage and literature is just that bad things are going to happen. It's not a question of preventing breaches entirely. It's a question of exercising the best due diligence you can in context under those circumstances, and then accepting that you need to have a well-established plan for how to deal with things when they go wrong. Those observations, though, were just the tip of the iceberg. What we also began to look at was what are the open questions that exist for civil society and privacy. So right now, for example, we know there is no overarching privacy regulatory framework that is specifically geared towards civil society in the United States. However, there is other regulation, both in the United States and beyond, that can help inform how civil society should address privacy issues. There's definitely media coverage, which means there are lots of case studies available and different things we can look at as opportunities to learn what we need to do better. We know that breaches do occur, but there's a fairly significant dearth of data on what the actual harms of those breaches are, and how do we quantify that? There's a lot of advice, definitely a lot of advice, but really no information or guidance on how to evaluate all of it. How do you know which privacy advice is good privacy advice? How do you know what to implement under what circumstances and in what order? So that led us to sit down and begin to talk through what the hypothesis and design should be to help solve this problem. The first thing that we did was toss out that global map that I showed you at the beginning of our seminar today that contains a list of all of the privacy, quote, laws, regulations, and obligations by country. There's some fascinating academic research on this. The last total, I believe, was 183 different laws globally. 
If you go down below the state actor level to, say, provinces, states, territories, et cetera, then you will discover a whole other wealth of privacy law and legislation. And again, if you go further still to look at sector-specific laws and requirements, then you have even more. In the United States, there are probably upwards of several hundred requirements that pertain to different aspects of information management, data breach notification, and a little bit about how you need to deal with specific data points, for example, in HIPAA. We decided that that was not helpful. We could look at different ways of reconciling all of those things. And the approach that we took was, where do those laws come from? And in large part, they are based on a set of principles that were generally developed by region. One of the first was the OECD privacy guidelines framework, developed in the late 1970s, that began to talk about what are the principles associated with privacy that we would see and expect, not just legal principles, but also the values and ethics that people hold around privacy. Well, we looked for that kind of document across the globe, and we found them. And what was particularly interesting about these principles is that they begin to align, not necessarily in implementation, but certainly in the values and notions behind what the expectations of privacy are by data subjects. So let me give you an example. This is a sample principle that we pulled out of one of the guidance documents that we have. And this particular principle talks specifically about program management. What are the expectations for having really strong program management for privacy, but also for security? 
In this case, it particularly highlights the need to have guidelines, the need for those guidelines to be context specific, the need for there to be safeguards, specifically privacy risk assessments, the need to incorporate that into the governance structure, the need to have incident management programs, and the need to have compliance monitoring. These are effectively the best practices for program management, and they've been customized for privacy. So we were quite excited to see some of these commonalities start to pop out as we began to review these specific documents. Here's another example. One of the principles calls on organizations to demonstrate that their privacy program is "appropriate," which leaves a lot of room open for interpretation. But if you go back to the requirements set out in the first principle, you can begin to see where you could establish a code of conduct or give binding effect to this program by committing your employees, either through an employee agreement or some other mechanism, to follow these program requirements the same way you would with any other best practices or codes of conduct that are required in an organization. Another principle we saw very well established in these documents was the need to provide notice when there has been a security breach of any kind. Now, different jurisdictions will quantify that differently. Some will say notification is required when there has been a significant breach; others, when there has been a breach that will adversely affect a data subject. I think the takeaway from this really is: if you suspect a breach, go to your regulator and let them decide, or help you decide, what kind of notifications, if any, are necessary.
These patterns that we started to see were really exciting, because once you start to see patterns, you can begin to actually adopt a language to describe what those requirements are and how consistent you can be in applying them. And these were the patterns we found. Globally, we found 12 patterns; we're completely open to the fact that there may be more. But in effect, what we see globally is a requirement for some due diligence around the collection of data; some safeguards around access to that data; limitations on the use of that data, aligned often with consent or accountability; limitations on the disclosure of data; concerns around safeguarding the disposal of data; requirements limiting the retention of data, and the same for archiving and backup purposes; and a real push for organizational transparency, which I think we can all see is becoming more and more important, not just in the civil society sector, about how organizations do all of these things and being open and transparent with data subjects about what the uses of their data are proactively, not simply asking for blanket consents or signatures to end user license agreements. We also see a right of access in a lot of countries, where an individual can ask an organization to see everything that they have on that particular person. Now, there's a lot of debate in this space. In Ontario, for example, there is a carve-out for metadata generated about a person's information. So in other words, if I was on Facebook and wrote a post, I could ask Facebook to give me back that post, which they will share with me along with any other information I've uploaded, but there's still a lot of debate as to whether or not I would also be able to get access to the logs that Facebook tracks on everybody who uses their website. And that's not just a Facebook thing, that's every organization, because are those logs just about me, or are they my data as well?
So there are some interesting ethical issues that need to be sorted out with right of access. Data quality is an interesting one. We often see this represented as integrity. In other words, the information you have about me has to be accurate, and that leads nicely into the right to correction. So if I discover that an organization has data about me that is inaccurate, I have a right to have that data corrected. Once we see these patterns in all of the privacy requirements, it helps us get to: how do we then do privacy? I mean, that's still a lot of work and a lot of open questions under each of those patterns. So we began to look at what recommendations would apply to a specific situation. We needed to talk about that, though. We needed to talk about that in a meaningful way. We developed a questionnaire to help us understand what users were looking for in the management of data. But then we also had to look at how we actually implement those things. How do we prioritize them? How do you find the right answers? It requires a significant effort to curate data, and this is from people who are doing this full-time, without another job on their hands. We settled initially on resources that were published by regulatory groups or nonprofits or advocacy groups that specialize in privacy, for example, the Future of Privacy Forum, the Electronic Frontier Foundation, the ACLU. We looked for templates, for questionnaires, and documents that could be used immediately without requiring subject matter expertise or significant cost or time resources. As you can imagine, this was a significant effort, and we have just begun, I think, to scratch the surface of how we evaluate those kinds of tools. We also wanted to look at other jurisdictions that have different actions based on the type and sensitivity of data. So we're including that as well. We need to know what locations our users are in so that we can customize our advice.
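The twelve recurring patterns described above lend themselves to a small, explicit taxonomy. Here is a minimal sketch in Python; the names are illustrative, not an official list, and the twelfth entry, breach notification, is inferred from the earlier discussion of notice requirements:

```python
from enum import Enum

class PrivacyPattern(Enum):
    """Globally recurring privacy patterns; names and wording are illustrative."""
    COLLECTION_DUE_DILIGENCE = "due diligence around the collection of data"
    ACCESS_SAFEGUARDS = "safeguards around access to data"
    USE_LIMITATION = "limits on use, tied to consent or accountability"
    DISCLOSURE_LIMITATION = "limits on disclosure of data"
    SAFE_DISPOSAL = "safeguarded disposal of data"
    RETENTION_LIMITS = "limits on retention of data"
    ARCHIVING_LIMITS = "limits on archiving and backup"
    TRANSPARENCY = "organizational transparency about data uses"
    RIGHT_OF_ACCESS = "data subject's right of access"
    DATA_QUALITY = "data quality / integrity"
    RIGHT_TO_CORRECTION = "right to have inaccurate data corrected"
    BREACH_NOTIFICATION = "notice when a security breach occurs"

# A taxonomy like this makes advice machine-taggable: each recommendation
# in a tool can be labeled with the pattern(s) it addresses.
assert len(PrivacyPattern) == 12
```

One design benefit of naming the patterns explicitly is that every piece of curated advice can then be indexed by pattern, which is what makes the categorized questionnaire described later possible.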
But one of the real questions that we have is: how do we set civil society up to do things in a way that is easy, and repeatable, and defensible, in a prioritization that makes sense? To that end, one of the things we researched specifically was prioritization. From a practical perspective, I'll walk you through a personal example. When I was first hired to create a privacy program for a healthcare organization 20 years ago, the people I worked for had no idea what I was doing. They knew we needed something to "do privacy" and somebody to "do privacy," but they didn't know what that meant or what it was I would need to do that. They just wanted me to take care of it. In effect, I sat at a desk and made judgment call after judgment call after judgment call about what those privacy requirements were, how they should apply to different lines of business, what constituted good enough, and how to create some kind of effective program that could live on. And I'm not trying to toot my own horn at all. I made a lot of mistakes in doing that, and I think that's normal. The problem is there is no scientific basis for prioritizing or looking at what those requirements are and how they should be implemented. I sat, and I remember very well, counting out the list one by one of all the legal requirements, all the contractual obligations, all the vendor commitments that we had at the time. And I came up with a list of 250 things that needed to be done. Where do you start? Where do you start was the question. And that issue of prioritization remains today. We cannot reasonably ask any organization, but particularly a civil society organization, to undertake 80-plus, 200-plus, 500-plus consecutive privacy tasks. We need to know which ones are a better time and resource investment. So the way we did this was we counted. Privacy regulators publish their findings, and the FTC in the United States is no different from any other privacy regulator.
We looked at every single case that they published, by the tags that they used, or the filings on their site. The lawyers on our team looked at each case, pulled out specific data points that would help us understand what the advice was and what was being recommended, and then we used that information to help weight the recommendations and the prioritization in our tool. So with that, oh, actually, let me speak to this lovely pie chart. I didn't put data points on here because what I wanted you to see was just the visual. I hope your eye is drawn to the bigger parts of the pie, which are in fact exactly where the vast majority of the findings from the FTC sit. As a person who wanted to implement privacy in an organization, you can be damn certain that's probably where I'm going to start. And that gives me a really nice slice of risk. If I just look at half of my risk situation, there are 1, 2, 3, 4, 5, 6 things that I can do right off the bat that will take care of 50% of the regulatory and enforcement risks I have for privacy in my organization. To me, that's when we started to see real value add. So we created a web app and we called it Privacy Patterns, because we're looking for patterns and we want to surface those patterns. This is our site. I will stress again that it's very much in beta, so we're working with a designer and some other folks now to make it a little bit more user-friendly. We started, of course, with the standard that you always do in this space, which is the disclaimer. The lab is providing this information to help people try and prioritize what actions they want to take for privacy for their organizations. It's not legal advice. It's best practices. It's general guidance. It's education. We will always recommend talking to an attorney about some of these more complex issues, but we want to surface for you the ones that can be dealt with using some general information.
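The counting exercise described here, tallying published regulator findings by tag and then taking the most frequent tags until roughly half the findings are covered, can be sketched as follows. Note that every tag name and count below is an invented placeholder, not real FTC data:

```python
from collections import Counter

# Invented tag counts standing in for findings tallied from published FTC cases.
case_tags = Counter({
    "deceptive privacy notice": 60,
    "inadequate safeguards": 45,
    "broken retention promises": 25,
    "undisclosed third-party sharing": 20,
    "no breach response plan": 18,
    "no employee training": 12,
    "other": 20,
})

total = sum(case_tags.values())  # 200 findings in this toy data set

# Walk tags from most to least frequent until ~50% of findings are covered.
covered, priorities = 0, []
for tag, n in case_tags.most_common():
    priorities.append(tag)
    covered += n
    if covered / total >= 0.5:
        break

print(priorities)  # the short list that addresses half the enforcement risk
```

With these toy numbers, just two tags already cover more than half the findings, which mirrors the point of the pie chart: a handful of actions can retire a disproportionate share of regulatory risk.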
Once you get past that disclaimer, the first thing that we want to talk about is what's being done with your data. We broke this out into some of the different patterns that we saw. Looking at, for example: are you interested in collecting data? Are you interested in accessing someone else's data? Are you interested in using the data your organization already has for some other new purpose? Are you sharing that data with someone else or a different organization? Do you have retention schedules in place? Are you interested in archiving and backing up data that you have? This helps us categorize the advice that we would want to give based on those data points that you're sharing with us. We also ask for location. You'll see a sort of random smattering of countries and states here that is not complete, reflective again of our working through these proof-of-concept issues. So you select the particular jurisdictions that you're interested in, and then we will show you the research that we have found in that space and give you a top-three list of the things that you should be doing or taking care of from a privacy perspective for the area that you're interested in, both the geography and the privacy pattern. So let me drill into a little example here. For example, one of the pieces of advice that we often give is that it's important to be specific about data handling practices. We then provide you a link to a document that is a template showing both notice and policy requirements, which you can download, customize for your organization, and publish in about 10 to 15 minutes. Lo and behold, that tool specifically comes from the Digital Impact Group, which I'm sure some of you have heard of and is actually part of our lab. So we were quite excited to see that some of our research actually bounced right back to some of the tools that DCSL offers already. And again, these documents are templatable.
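The flow the site follows (pick your data activities, pick a jurisdiction, get back a top-three list) amounts to a keyed lookup over a hand-curated table. Here is a toy sketch; every activity name, jurisdiction code, and recommendation below is a made-up placeholder, not content from the actual tool:

```python
# Hypothetical curated advice keyed by (activity, jurisdiction).
RECOMMENDATIONS = {
    ("collection", "US-CA"): [
        "Publish a notice describing what you collect and why",
        "Collect only fields tied to a stated purpose",
        "Record the source of each data element",
        "Review collection forms annually",  # ranked 4th, trimmed below
    ],
    ("sharing", "US-CA"): [
        "Inventory every third party that receives data",
        "Put data-handling terms in vendor contracts",
        "Tell data subjects who their data is shared with",
    ],
}

def top_three(activity: str, jurisdiction: str) -> list[str]:
    """Return up to three prioritized actions; empty list if nothing is curated."""
    return RECOMMENDATIONS.get((activity, jurisdiction), [])[:3]

print(top_three("collection", "US-CA"))
```

The design choice worth noting is the empty-list fallback: where the research has not yet covered a jurisdiction or activity, the tool should say nothing rather than guess.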
It was our intention not simply to say you need to be transparent, but to provide an organization with a vehicle to be transparent that was easy to use. We also wanted to put in a quick glossary of terms, recognizing that a lot of these are specific to a given geography or a given industry; a healthcare definition might be different from an education definition. We are trying to solve for the greater problem of the sector. And so a lot of these are going to remain somewhat vague, with the intention of leaving it up to the organization to customize for their own purposes. And that is our tool in a nutshell. So as promised, I just wanted to leave some time for specific questions or other things to walk through, and then I'll share some next steps with you. All right. Thank you, Tracy. That was super informative. So if you guys have questions, again, please use the chat box and I can read them out for Tracy to answer. One question that came to mind when you were going through this: I know a lot of for-profit businesses have a seal of approval that shows they've taken all the steps to be private and compliant. Is there such a thing for nonprofit organizations? That's a great question. I think off the bat, I would tell you that TRUSTe used to be the signature seal for privacy, and I believe they've now rebranded themselves as TrustArc. And I think that they do in fact offer a similar kind of service for nonprofits. The only thing I'm not sure of is how much it would cost. And I think that's the special consideration for this sector: thinking about what the cost-benefit of having those kinds of certifications is, largely dependent on the kind of data that you hold and what your data subjects and employees and donors are interested in. So it's definitely worth exploring. It also looks like there's a question here. Oh, George Hamilton weighed in with a fantastic quote. If you haven't heard it already, you definitely should.
It's a standard security statement, which is that there are two groups of organizations: those who have been breached and those who don't know they have been breached. And it's really true. It's not a question of if. It's absolutely a question of when. And then I see Whitney raised a question about ethics and compliance program frameworks for organizations, and how to guide them in developing a stronger privacy posture. Yes, okay, this is a great document. For those of you who are not aware of it, the U.S. Department of Justice guidance on compliance effectiveness is a fantastic resource, and it also has a really good document that outlines what core compliance programs should have in them as the beginning of creating a privacy program or something related to privacy compliance. I really think the next step after establishing that kind of program framework is to figure out how to measure it. And I know that's a much longer topic, one I speak about regularly and am happy to share resources on. But the next question becomes not just having a policy in place, but having a vehicle and a mechanism by which you can demonstrate the effectiveness of that policy. So that's a really good starting place. And Whitney, I'm happy to talk more about that with you if you have more questions in that space. What else have we got? Oh, the best first step for privacy. Jennifer, that's a great question. I think I'm going to be a little weaselly here and give you two steps. The first thing I think you want to do as someone who is concerned about privacy at an organization is understand what data you have. There are lots of resources to help conduct a data inventory. AIMA is one of them, and I'll send a link around to Seema afterwards to share with you. They have a lot of great resources on their website.
And that's a really good place to start, because if you discover you have a lot of really sensitive data, you may want to take a slightly different first step than if you have somewhat less sensitive data. I think it's probably also worthwhile to think about establishing some kind of policy or guidance document. And I know some organizations are a little loath to implement formal policies. So what I've seen some organizations do is create what are called guidance documents for privacy, as in: these are the values that we hold in the management of your data. And then kind of work through those issues together and make sure that that's reflective of the business practices. In terms of your question about Google Drive, I'm probably going to have to punt on that one, if for no other reason than a lot of it depends on what you mean by safe, what kind of information is involved, and how the drive is implemented. And that's really the same for any type of cloud storage, whether that's Microsoft or AWS or Google. They're all going to have different positives and negatives around their services in relation to compliance. So if you want to reach out to me, I'm happy to discuss that with you a little further offline. Melissa, are there resources for how to decide who sees what in an organization? Yes, many, many resources. One specifically that springs to mind that's a little obscure, but I still like it: the Government of Ontario has a security team that publishes advice and guidance on role-based access controls, containing a really cool matrix chart that effectively sets out what your role is and what your function is, and says, okay, this is the data you should have access to, with recommendations in a really handy little chart. We're going to be including that in our tool as a link. It's not up yet, but again, if you would like me to point you to that document, I'm happy to do so. Seema, is that it, or are there more questions that I don't see? I think that's it.
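The role-by-function matrix chart described here boils down to a small access table: rows are roles, columns are data categories, and each cell says whether that role may see that data. A sketch with invented roles and categories, defaulting to deny:

```python
# Rows are roles, columns are data categories; True means the role may see
# that category. All roles and categories here are invented for illustration.
ACCESS_MATRIX = {
    "intake_volunteer": {"contact_info": True, "health_notes": False, "donations": False},
    "case_manager":     {"contact_info": True, "health_notes": True,  "donations": False},
    "fundraising":      {"contact_info": True, "health_notes": False, "donations": True},
}

def may_access(role: str, category: str) -> bool:
    """Default-deny: unknown roles or unknown categories get no access."""
    return ACCESS_MATRIX.get(role, {}).get(category, False)

print(may_access("case_manager", "health_notes"))
```

The default-deny lookup is the important part: anything not explicitly granted in the matrix is refused, which is the safe failure mode for personal data.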
And I think we have just a few more minutes, so I think that's good. If you guys have any other questions, Tracy has her information, I think, in the next couple of slides. I do. Could I flip to that next slide now? Would that be okay? Yes. Okay, perfect. So thank you very much for your questions. It sounds like you guys are really on point and on top of all of this. I would also ask you for feedback. Any feedback on anything I've talked about today, of course, but also on any other aspects of the project and what we're building; we really want it to work for all of you. And it's really important to us that we both represent your concerns and answer your questions. So I've created a really lightweight, open-ended survey on SurveyMonkey. If you want to fill that out, I would greatly appreciate your feedback. Also, you can just email me directly with any thoughts that you have. And again, I want to stress, this is not about building a product. It's absolutely going to be an open-source tool that we maintain through the lab for this community. And so we would be delighted to have that feedback and those comments or concerns or additional questions that you have. So with that, I will say thank you very much to all of you for your time and turn it back over to Seema. All right, thank you, Tracy. And we can send out that link also in the follow-up email when the webinar is over. So just to wrap up, a little bit about TechSoup again: here's the link if you're interested in our product donations. And then one thing that we always like to do is just ask our attendees to chat one thing that they learned in today's webinar. We also have a post-event survey, so any feedback that you have for us really helps. And then also we are on social media. So if you are on Facebook or Instagram or Twitter, we love social media love. So please give us a like or heart or whatever is relevant to the platform.
And then also please feel free to visit our blog, which is blog.techsoup.org. And we have a few more webinars coming up. We have one next week, Video Storytelling Made Easy with Adobe Spark. And then on 5/31 we have Five Clear Steps to Get Your Nonprofit Cloud Ready. If you are interested in watching any of our archived webinars, you can go to this link to watch this one or any of our other prior webinars. And again, thank you, Tracy, for all that information. That was super helpful. And thank you to Lashika for helping on chat. And lastly, thank you to our webinar sponsor, ReadyTalk. Thanks, and hope to see you all soon. Thank you.