My name is Brian Kordelski. I'm the CRO of Data Sentinel, and we're a solutions company that helps organizations find privacy-based data (PII, PHI, PCI) and do it very rapidly. But what we're talking about today is data ethics and data governance. My slide clicker's not working. Can you grab the other one over there? I was out here last month for the IAPP conference. Does everyone know that conference? It's the International Association of Privacy Professionals. There were 1,100 people, and we weren't having our next wave of the virus, so it was a wonderful event. And what I learned at that event, and I'm always learning even though I'm an old man at this point, is that many of the organizations dealing with their data are really focused on understanding it more ethically and managing it with more controls. But before we get into that, like I said, I'm Brian Kordelski, CRO. Chris Brown?

Hi, I'm Chris Brown. I run the data solutions practice for Prolifics. Nice to see everybody. Within the data solutions practice, we focus on data governance, data integration, master data management, and we also partner with Data Sentinel on data privacy solutions.

Excellent. Can you go to the next slide, Chris? Where's the pointer? Sorry about this. This one doesn't work. Oh, there it goes. Okay. So today we're going to take you through a data ethics overview, data ethics and governance alignment, the importance to your organization, the maturity we see happening in the market, and then how to get started. Because one of the things we find, and we've been doing governance programs for many years, is that this stuff takes too long, right? We get in there trying to investigate, understand, and manage information, and the best way to go after it is with modern technology, but also a modern approach to managing your information. So that's our agenda.
Chris is going to start. We'll start with the framework, and we've all seen various forms of frameworks over the course of the last couple of days in the sessions that have been done. At Prolifics, we've tried to make it as simple as possible, because you can go deep on this. You're all here for a reason: somebody has a vision to put a data governance program in place, and you need the people, the processes, and the technology. We've aligned it to the various components within the data governance disciplines: on the left-hand side, the vision, theory, and strategy side of governance, and on the right-hand side, the technology to use to make your governance program successful. So that's the core framework, and we've been delivering it for many years, working with companies like Data Sentinel to make companies more mature in their governance journey. Now, Brian, how does this tie into data ethics? How does the governance strategy tie into data ethics as one of the core pillars there?

Sure. In my opinion, if you look at this, the key topic in data ethics is really understanding who owns the personal information inside your organization. And I'll give you a story. We're working with a financial services company right now. We scanned all their information, we found all their privacy-based data, and the next conversation was: how do we remediate when there's a problem, right? So, ownership, number one: who owns that within your organization? Is there a policy that's transparent enough? Everyone we talk to these days is trying to operationally control and protect their data using a wide variety of security products, right? But it's really all about transparency across the organization and the right policies and authorizations for specific people to use that data. The worst thing that can happen is it gets out, right?
So, we hear about data breaches all the time, and we know that over 60% of data breaches are caused by an employee of the company that was breached, right? So how do we understand what the policies are, and then how can we put privacy protection in place? One of the things we're seeing more of is the understanding of those core assets. As an organization, we really want to inform our customers, whoever they are, how we're going to use data about a consumer, right? And I'm sure all of us do this: we unsubscribe constantly. We're tired of being dragged through the mud with different data. So what is the intention of the organization? Most organizations say they have good intentions about data, but you need to push that out so your consumers know what the use case is, why you're storing their data, how it's being managed and protected, and what the outcome is. So these are the key topics we see. When someone's building a governance framework, one of the key pillars is really getting at, understanding, and measuring the data. But what is the outcome? What use cases are we trying to protect? Those fall under a variety of topics: access, control, the use cases within the organization for that information, and how you protect it. And when we write data ethics policies, if an organization already has a governance program in place, we're simply adding to it. We're not saying buy more technology or broader solutions. We're saying let's put in place the policies that not only help you understand what these topics are, but ensure you're actually protecting the data, by decorating that framework with the privacy perspective that's needed. So it's an overlay, not a separate activity from your governance program. For sure.
The only thing you need to do is find all the data. And when we say find it, you have to find it everywhere it lives: email attachments, PDFs, photographs, all your relational data. It's not easy to hunt and gather all that down, but we live in a modern world. AI, deep learning, innovation is happening all the time. So you can decorate your governance program with the right solutions, and that doesn't have to be a product; it can be a managed service.

And if you're looking at where all the data is, going back to the previous slide, one of the key things on the right-hand side is data lineage. It's not just about scanning your data; it's understanding where it came from and how it got to where it is right now.

Absolutely. Without the knowledge of where it came from and what steps it went through to get to your fingertips, you're not going to be able to control the patterns that data is running through. So you need to know, at a minimum, the systems of record where the information came from and all the critical data elements within them that are going to cause you risk and compliance issues.

All right. So we've talked a lot of theory, Brian. What does this really mean? I can think of one example myself. My brother-in-law is from Nigeria, and his skin tone is very dark. He's one of these guys who always wants the newest smartwatch and so on. So when the first smartwatches came out, his wouldn't read his pulse, and it was because his skin was so dark. For everybody else in the family, even the lighter-shaded people, the pulse was read okay. So there's a clear problem there with test data management and governing the test data that goes into a smartwatch. Now, that's okay; we had a little giggle about it and moved on with our lives. But what if that exact same scenario happened with a medical device? That turns into a serious problem that could be a life-and-death situation. For sure.
So, that's just one example off the top of my head. Brian, you've been working with other use cases. Tell us some of yours.

Yes. We have a pilot going on right now with a company that's been around for 20 years in the financial services space. One of their biggest challenges isn't the data that's managed, so not the stuff that's in Snowflake or DB2 or whatever core systems they have; they just have massive file-based repositories. And as an organization, just as an example, they've had 400 employees come and go. To deal with data minimization, what they want to do is search for all of that employee data and only retain the information they absolutely need. That might be information on a 401(k) or other HR data, but it's crazy how many organizations hang on to even resumes. And if you think about it, a resume contains some very private information. So the policies we're helping them put into place will hunt, gather, and tag that information and allow them to purge it. Now, vendors and providers like us can't purge the data for the customer; we have to help them do that, right? One of the policies they have to have is a data retention policy, along with data usage policies. And you have to comply with strategies like the ones the federal government is putting into place; if you're in an organization that is or must be compliant, then you're going to have to follow those policies as well. So all of these things have to be considered. And even today, with the AI capabilities that are out there, we have to make sure the AI routines themselves are not biased. So, back to your bias example with poor test data and a smartwatch: the same thing goes when you're running programs in an organization.

This one's mine, right? This one is yours. Okay. So, driving within your privacy program, if you're looking at how this impacts your organization, I'll use a use case.
This financial services company asked us to help deploy data governance technologies. So we deployed a catalog. We didn't scan all of their data; we started with critical data elements. Which systems are we using? What are the key data elements involved? And then they said to us, hey, help us write our data ethics policies. So we had to dig in, do a jumpstart, and meet with all the people to work out who's allowed, who gets authorization to touch certain information. And then you want to drive this solution so it doesn't burden your overall programs, right? I don't know if everyone in here owns governance technology, and I'm not saying it's necessary. Bless you. You can do this with simple policies, rules, and processes, but you have to catalog that information, and you have to agree on the targets of the program. For the company I just mentioned, with all the unstructured data, that was their target; they knew their relational systems were fine. We had to put all of that into a repository. What was critical was getting through that information quickly and, as everyone likes to say, having a democratic approach to sharing it: enabling people visually with the right reports, solutions, and dashboards to drive that across their governance program. So that's really the process. And what I've seen over my history, and I've had the luck to give 300 different governance presentations in 70 countries, is that big bang doesn't work. We all hear agile, agile, agile, but really it's just targeting the solution: keep the group small, run through an agile delivery process, and provision information to people so they're enabled to make decisions.

So, no ocean boiling then, Brian? No ocean boiling. That's what the sun does when it goes down at night.
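[Editor's note: the catalog step described above, recording critical data elements along with who owns them and who is authorized to touch them, can be sketched very simply. This is an illustrative data structure only; the field names and roles are assumptions, not taken from any particular catalog product.]

```python
from dataclasses import dataclass, field

@dataclass
class CriticalDataElement:
    """One catalog entry: a critical data element, its system of record,
    its owner (who answers when remediation is needed), and which roles
    are authorized to access it."""
    name: str                # e.g. "customer_ssn" (illustrative)
    system: str              # system of record it lives in
    owner: str               # accountable owner for this element
    classification: str      # "PII", "PHI", "PCI", ...
    authorized_roles: set = field(default_factory=set)

    def can_access(self, role: str) -> bool:
        # Authorization check against the cataloged policy.
        return role in self.authorized_roles

# Hypothetical entry and access checks:
cde = CriticalDataElement(
    name="customer_ssn",
    system="core_banking",
    owner="privacy-office",
    classification="PII",
    authorized_roles={"compliance_analyst"},
)
print(cde.can_access("compliance_analyst"))  # True
print(cde.can_access("marketing"))           # False
```

Even a lightweight structure like this captures the two questions the speakers keep returning to: who owns the element, and who is allowed to use it.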
And tying into what you're saying here, what we've seen at Prolifics is that some of our clients are very mature on structured data, but governing the unstructured data is a huge challenge at the moment.

For sure. Yeah, absolutely. And with unstructured data, a person's handwritten signature is a piece of unstructured data that has to be obfuscated in some cases. And images: all these places that take a snapshot of your driver's license or a copy of your passport, you've got to find all that too. And that gets really interesting when you're trying to modernize and use technologies like OCR to tear apart a document. We've seen documents with some customers, a 600-page PDF loaded with little bits of privacy data embedded throughout. What do you do with the object? Do you encrypt it? Do you anonymize it? Do you just gather the sensitive data and tell someone where it is? There are multiple ways you can do that. You can say, I found personal data in this document, and I'm telling Chris Brown it's there. So now you have a decision to make. Do you want to obfuscate it? Restore it? Do you want to see who touched it and quarantine the information? Do you want to delete it? These are the things you have to decide. And we can't help organizations decide that; they have to follow their policies. So within an organization, and I was looking at this in Harvard Business Review, and at the IAPP conference one of the senior analysts from PricewaterhouseCoopers was saying, look, these are the processes organizations need to follow. And AI fear is a huge one, right? I work for a company with an AI solution, and when I talk to people, they're always asking, how accurate is your AI? What percentage of my data is your AI going to get through?
If you're saying I can scan 20 terabytes of information in one hour, how accurate is that? How much work do I have to do to go through the results? The second point is around communications: you need to keep that sharp. You don't want to write giant specifications. You really just want to make sure the communications side of the house is clean, but also attached to real data. Don't try to develop rules for information you may not have. So if you're trying to come up with PCI rules around data and you don't have to comply with that mandate, don't focus on that first. Focus on the data that's important for your business. Obviously, if you go to any annual report for any organization, you'll see the letter from the president, and the letter from the president always says: we must be customer first, we're going to have a core focus on our customer program, and that's going to become an initiative within my organization for the next year. Tie those executive-level programs back down to the small projects you're doing and say, this is attached to what the chairman wants; this initiative is funded by that process. That will keep the ball rolling and deliver quick wins. And then, what does ethics mean operationally in your data? Like I mentioned, do I quarantine? Do I delete? Do I remove access? One thing we're seeing in organizations now is that access control doesn't work when people have the right to pick up, click, copy, and attach something into an email. Access control isn't always going to save you, so you need to figure out what ethics means inside your organization. And then obviously you want to continuously educate in the form of workshops, but don't educate on policy. Educate with data and its impact on that policy, and share that across the teams, because people will latch onto that really quickly. So that's where we're seeing good success across these kinds of programs.
Do you want to jump in on some of this? Why don't you start, Brian; let's talk about the maturity model and where people are. Sure. And just give some examples of what this means for a small, medium, or large company. Sure. So focus on the top, right? You've got programs that run all the way from unaware to optimized, and every time I speak to clients about maturity programs like this, they say, I want to get to optimized. And that may not ever happen, right? You may only make it to defined, but you still have the policies in place; if you get audited, you'll have the information in place. But you can't let that burden of saying I want to get to optimized leave you at the unaware stage, right? Because that is where people are very reactive. We were having a conversation yesterday with another customer. They have to comply with the California Consumer Privacy Act, right? Everyone has to comply with some of these policies. And in all reality, they only get nine or ten of those requests a month. So why do I need to buy all this software, put in all this integration, team up all these people, train and educate, when I can do this with some level of defined process? And that is really the evolution we see: that process. You'll notice that when you get to aware, you need to inventory your data sources, and you need to do that continuously, right? You need to keep refreshed information on your data, what's going on and what's happening inside it, and then put some of those policies in place to automatically remediate before something bad happens. And that's where we get into that two to four months. Most projects I see with our clients are a two-week quick start, a two-week pilot: 20 terabytes of data, two weeks, in and out. Here's what's happening inside your data. Don't buy my software; just see what's going on, make a decision about what you want to do, and then we'll have a better conversation.
Then you can scale it out from there. Then you slide them to the right. Yeah, exactly right.

So how do you get started with this? We've got the core governance framework, and everybody in this room is looking at some level of implementing data governance within their organization. But now you're looking at how to bring data ethics into your data governance program. As Brian was saying before, you start small with an assessment. If you flip to the next slide, Brian, this is the approach we like to take: a small workshop to understand where you're at in your governance program. Half a day, with a number of people from your organization, data experts on our side, and data privacy experts from Data Sentinel. We walk through exactly what the business challenges are within your organization and how that ties into the governance framework itself. Once we understand that, it normally fits into the four pillars you see in the middle: the data ethics audit, the data governance maturity audit, the technology audit, and the sensitive data audit. For the technology audit: what technology do you already have that you may not be leveraging, and where are the gaps? We can do a gap analysis and say, well, you may have good structured-data solutions in place, but your unstructured-data solutions are completely non-existent, so you need to look at a technology option there. And in terms of the sensitive data audit, what accelerators can you use to understand where your sensitive data is and what sensitive data you have? Brian, do you want to tie into that and add a bit more flavor?

One of the biggest outcomes isn't, oh, surprise, I have privacy-based data in here. One of the biggest outcomes we see through the sensitive data audit is the clustering process, right?
So one of the results from that audit is to say to a customer, I looked at the 40 systems of record you provided, and here is the primary cluster of your sensitive data. These six systems have highly sensitive information in them; we color those red. These ones are mostly green. What really helps people align is that we do a risk assessment while we're scanning the data to find the privacy information, and we put a dollar figure on it. So this helps people target, right? You see this source of data, and you're looking at a dashboard that says, if your company loses this data, the fine is going to be $2.5 million. We built a risk calculator into the scanning process. So while we're running through that process, we show you where your data is clustered, we show you where your most sensitive data is, and we also show you, hey, this is going to be a high fine if you lose this data. Not to tell someone you're going to be breached or have been breached, but rather to say: focus here first. This is the most important data you have in this process.

And so for those who have maybe struggled to get a data governance program off the ground because you can't really justify the ROI, you've just given two or three use cases where you can go to your executive team and show a significant ROI on a governance program tied to data ethics.

Exactly right. And our longest one is four weeks. Four weeks, in and out; most of them are two weeks. The reporting shouldn't be manual. We use the technology to feed the data back to the customer, so we're not sitting there writing reports. I know consultancies like to build out all these great reports and say, look at this, but it's dead afterwards, right? That report is stale in a week. So being able to do something like that quickly, hand that report or dashboard over to people, and say, here's where we're going to target first, and go after it the right way.
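[Editor's note: the clustering and risk-calculator idea above can be sketched as a toy scoring function: tally the sensitive findings per system, bucket each system red/yellow/green, and attach an assumed per-record exposure figure. The weights, thresholds, and dollar figure below are illustrative assumptions, not real fine rates or the vendor's actual model.]

```python
# Hypothetical sensitivity weights per finding type and an assumed
# per-record fine exposure; both are illustrative, not real figures.
SENSITIVITY_WEIGHT = {"ssn": 10, "dob": 5, "email": 1}
FINE_PER_RECORD = 50  # assumed dollars of exposure per sensitive record

def assess(system, counts):
    """Score one scanned system from its counts of sensitive findings,
    bucket it red/yellow/green, and estimate dollar exposure."""
    score = sum(SENSITIVITY_WEIGHT.get(kind, 0) * n
                for kind, n in counts.items())
    exposure = FINE_PER_RECORD * sum(counts.values())
    color = "red" if score >= 1000 else "yellow" if score >= 100 else "green"
    return {"system": system, "score": score,
            "color": color, "exposure_usd": exposure}

# Hypothetical scan results for two systems of record:
scans = {"hr_files": {"ssn": 400, "dob": 400},
         "web_logs": {"email": 50}}
for system, counts in scans.items():
    print(assess(system, counts))
```

The dashboard the speakers describe is essentially this table rendered visually: the red, high-exposure systems are where a new program should target first.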
That really builds the roadmap, because you don't want to come in and say, I can make you compliant, I can make you clean; you don't want to make big promises. What you want to do is understand the content first and then decide where to go. Excellent.

Is that our last one? We have two more slides here, so I'll quickly summarize. Part of the outcome of one of those audits and assessments is how you tie your business governance to technical governance, and how you tie any compliance to that. So you're buttoning it all together across those three areas of focus, working with both the business and the technical teams, not executing in a silo. And on the compliance governance side of the house, one thing we have to understand as an organization is where we have to comply. Are we talking about mandatory mandates? Are we talking about NIST or PCI? Are we a financial services organization? An insurer? A healthcare company? For compliance governance, the thing I will say that really helps is to come in with the bigger picture, look for all the use cases across those levels of compliance, and serve that data up. That's the best way you're going to get to it.

And so, Brian, one of the other challenges right now: we've got GDPR and CCPA. We've got the Canadian legislation, Bill C-11 and Bill C-64. We've got LGPD, PIPEDA, the new Chinese one that came in. And all the states that are dropping them in. That's just the tip of the iceberg; there's going to be more coming along. How easy is it, then, to use this framework and this assessment for the next set of compliance regimes that are due to come in?

Well, it's straightforward, right? When you're doing a privacy audit, you should look for all the sensitive data, but while you're doing the audit, you should extend to what the policies for compliance are. Most of them overlap, right?
So if you look at GDPR, CCPA, and CPRA, most of them overlap. If there are 20 fields in one, there will be 23 in the next one and 24 in the one after that. So it's not a lot of extra information. Some of them want, you know, clickstream data, and images, and any of the biometric information, right? So you have to hunt for that at the same time. And what I would say is, you can do all of that in the same pass and put your ethics policy on top. And those policies don't have to be mandated by compliance rules, right? They can just be mandated by your organization, to be better shepherds of your data, to make that promise back to your customer, and to tell them how you did it and why it's important. And I think that's the last slide. Yes. So, any questions? We went really fast, so we'll give you guys some time back. Thank you very much. Thank you.