Hello and welcome. My name is Shannon Kemp and I'm the Chief Digital Manager of DataVersity. We'd like to thank you for joining the latest installment of the DataVersity Webinar Series, Data Insights and Analytics, brought to you in partnership with First Hand Francisco Partners. Just a couple of points to get us started for today. Due to the large number of people attending this session, you will be muted during the webinar. For questions, we'll be collecting them via the Q&A in the bottom right-hand corner of your screen. Or, if you'd like to tweet, we encourage you to share highlights or questions via Twitter using hashtag DIAnalytics. As always, we will send a follow-up email within two business days containing links to the slides, the recording of this session, and any additional information requested throughout the webinar. Now let me introduce our speaker for today. Today, John Labley will be discussing Governing Quality Analytics. A well-known industry analyst, John is a business technology thought leader and recognized authority in all aspects of enterprise information management, with 30 years of experience in planning, project management, improving IT organizations, and successful implementation of information systems. He is the president and chief delivery officer at First Hand Francisco Partners. And with that, let me turn the webinar over to John to get us started. Well, thank you very much. Can you hear me okay? Let's do that sound check first. Yeah, you're good. Alrighty. And good afternoon, good morning, good evening to everybody listening, wherever you may be. It's a solo show today; Kelly is on holiday, so you're stuck with just me. Well, let's just start with a question and see where everyone is and what you're thinking. In terms of governing your analytics, your analytics processes, and the teams that do the analytics, how effective are you? Are you very effective? Somewhat effective? Not effective?
Or are you just not thinking about it right now? It's not a focus for you. And I'll talk about the next question while people go ahead and answer that. I do want to remind you, we did get some feedback that while people were ruminating on the question, the timer went off. So this is just kind of a gut-feel thing; don't debate too long on it. The second question is: is your organization taking advantage of analytics derived from big data? Now, we've asked this once or twice before. We're just doing a check to see what the changes are over the course of this year's webinar series. So if you could take a second to answer that one, that would be terrific as well. And while you're doing that, I am going to move on and continue to converse, if that's okay with everybody. Nobody complains. So let's go. All right, the topics for today. We're going to be talking about blending governance and analytics, and of course when we say analytics, big data feeds into there as well. And it's always a question of degree, and it's always a multi-dimensional or multifaceted type of issue. I will tell you right now, we're going to be looking at a couple of multi-dimensional presentations today and try to get you thinking in terms of the different influencing factors that tweak governance. Because on one hand, governance is governance is governance: you have a data asset, and you require the rules that go along with that. But we do have some subtle shading that we need to discuss. We want to talk about the different levels of capabilities that you might have and what type of support you need to give these types of environments. What we aren't going to talk about today is whether the analytics are useful for your organization or not. We're going to assume that the analytic idea, or the big data concept, or your intended use is a home run. It's good to go. So how does governance apply to that?
Taking a look at the poll, we have a lot of Bs. So folks are somewhat effective, and not really sure, though, that they're taking advantage of things yet. Not a big change during the course of the year, but we'll try again before the end of the year. Some quick definitions, in case we have some folks here that are new to data governance; this is really just a fast review. Data governance is the framework for managing your data. It's not doing the management of the data; it's not managing the data per se. It tends to be, or certainly needs to consider, an oversight body that creates the rules and the policies and the processes and things like that. I want to make sure that we cover that here just as a level set. When we talk about analytics, we're not talking about conventional business intelligence or just big piles of data. What we are talking about is using sophisticated algorithms, some artificial intelligence or machine learning. We have data scientists involved. We're quite possibly mining enormous amounts of data. It's more sophisticated than that. Quote-unquote regular data governance has always been applied to traditional business intelligence. What we're talking about here is the subtle difference between governance for the advanced type of application of analytics versus the normal, or what I call traditional, application of data governance. So that's just a level set for us. We do have questions, by the way; we are allowed to ask questions on our series here. You can see that there's a spot to do that, and if you have a question, please enter it. We always leave a little bit of time towards the end of the presentation. So let's just talk about the degree of governance and analytics. And it is a question of degree. Okay. You want to exploit a data asset.
And the more robustly we want to exploit our data, the more we find that there has to be an application of governance when it comes to analytics. Now, let's step away from analytics for a moment. Could I do master data management and govern that, and not have data governance over an early-stage analytical sandbox where we're just getting started? Absolutely. But if you want to get more and more effective and exploit this analytical capability further, you're going to have to think more and more about some insertion of formal governance there. Because things are going to start to enter into this data supply chain. As you get more sophisticated with analytics, you're going to be pulling data from more places. You're going to be correlating external data and internal data. The models are going to be more sophisticated. So you have to have much tighter oversight over quality, sourcing, the distribution of your results, and the meaning of the data you're using and the meaning of your results. Now, there are a lot of other characteristics of oversight, but I picked these four (quality, source, distribution, and meaning) just to represent where the data governance needs to change as time goes along. So, in terms of what kind of data governance we need as we get more robust: a lot of organizations will have what I call data governance "light," which would be some guidelines here and there. It pops up in an area. That describes many organizations when they start doing data governance. As I'm sure is true for many of you on this call, you're just getting started and you're just beginning to realize internally that, oh, this data governance has to be here. We need some oversight. We just can't have all this stuff going on in all these directions at the same time.
So you've already started and you have some form of it. It could be one or two people trying to get some data stewards going, or you could have tried to launch some stewards and are trying again, but you've got some form of it. The next step is what I would call federated oversight, where you start to have a council, perhaps, or some centralized hub of governance. And then there is either a hierarchical or a matrix structure where this central group has influence, to some extent, over either master data or data quality or analytics or business intelligence or some combination of those. Lastly, we get into what I would call a more sophisticated application, and for the sake of this talk, we're going to call that active data governance. What that means is that when you do any project and you're touching data, governance is at the table. Whether it's federated or centralized or decentralized doesn't matter. The PMO has data governance embedded in it. When you do your annual portfolio planning or your budgeting process, data governance is at the table. So regardless of the structure, regardless of the operating model, data governance is embedded. So we have light, which is awareness; we have federated, which is we're starting to get organized and formal; and then we have active, which is we're just up and running. So as we exploit this data more, of course, the more active data governance needs to become, and we'll look at some of the characteristics as to what causes that. Moving forward to the next slide (when you get the slides, here are your definitions): with active, data governance is acculturated and your entire data lifecycle is governed. With federated, we've got that role established. And if it's light, it's a new function and it's somewhat limited. But that little curve there is very, very relevant.
We have to understand that the more we want to climb that curve, the more we want to exploit our analytic capability, and the more active we are, the more likely we are to be able to exploit it. So before I go any farther, I just want to review a couple of terms from a few months ago on the types of analysis, because I'm going to be referring to them. There's descriptive analytics, which uncovers what happened: we have a bunch of data, we look at it and go, oh, that's what happened, and that's why it happened. Predictive is: we have a bunch of data, we see some trends, and we can say, I think that if all these conditions are the same again, this is going to happen again. And the last one is prescriptive, which is what should be done. Can the algorithm make a recommendation based on a recognized set of patterns? The algorithm says, tomorrow you must turn left or turn right, because the data says that's what is appropriate. The only reason I did this slide was so the rest of the conversation could be a little bit tighter. So we have our three types of analytics: descriptive, predictive, and prescriptive. And we talked about some different layers of data governance. So that means we can start to put the two together. Remember, I told you at the beginning we're going to be a little bit multi-dimensional today. The green, of course, is go; that's the ideal spot. So let's take a light form of governance and an organization that is just now embarking on analytics, and there is a tendency, when you just embark, to be descriptive. Here's a bunch of data: what happened? Show us something that explains something to us. Show us some causality instead of just correlation. Now, when we do that, what we find is the typical scenario; we've talked about this in this series this year.
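To make those three terms concrete, here is a minimal sketch that was not part of the talk: the sales figures, trend method, and "stock" decision rule are all invented for illustration, applying a descriptive, a predictive, and a prescriptive step to one toy series.

```python
# Toy illustration of descriptive, predictive, and prescriptive
# analytics. Data, method, and decision rule are invented examples.
sales = [100, 110, 105, 120, 130, 125, 140]  # hypothetical weekly sales

# Descriptive: what happened? Summarize the data we already have.
average = sum(sales) / len(sales)

# Predictive: what will happen? A naive least-squares trend line,
# extrapolated one period ahead.
n = len(sales)
mean_x = (n - 1) / 2
slope = sum((x - mean_x) * (y - average) for x, y in enumerate(sales)) / \
        sum((x - mean_x) ** 2 for x in range(n))
forecast = sales[-1] + slope  # next-period estimate

# Prescriptive: what should be done? A simple decision rule on top
# of the prediction.
action = "increase stock" if forecast > average else "hold stock steady"
print(round(forecast, 2), action)
```

The point is only the progression: the descriptive step looks backward, the predictive step projects forward, and the prescriptive step attaches a recommendation, which is exactly where the talk argues governance has to tighten.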
You're going to have a sandbox or a group of data scientists who are spun up, and they tend to be funded outside of IT, very often inside a business area. So data quality has not been a concern; they're just looking for insight or patterns within that data. There is some awareness of the data source because, of course, you have to get the data from somewhere, but it's wherever they can get it. There hasn't been a lot of scrutiny as to whether it's got legs or not, but for the purpose of this initial go, it's good enough. The only distribution problems you're going to worry about, because you have a very narrow audience, would be compliance-based: for example, adhering to privacy. We have seen examples where this is overlooked. An initial data science group does something really awesome, grabs a bunch of data, sticks it on their laptops, and goes at it, and it's got all kinds of private data in it, and they in effect push the envelope on compliance. So there does have to be some oversight there, but that can be self-governing. And then meaning, or context, or semantics, is rarely a concern. This is where you get schema-on-read: the data is loaded in there, we have an algorithm, we run something against it, and we will tell you the meaning of it when we present it to you. So obviously, an awareness of governance only for the sake of compliance is appropriate here. This is not the ideal situation, however; this is more of the proof of concept. You can do it without any governance, and, again, as we've said, we've seen instances where this has happened. Now, if we get to the predictive state, I need to understand the volume, because data volume will dampen out data quality issues; I just need an awareness that it is the appropriate dampening of the data quality. So: some governance on the quality (are you using the right types of data?), and, moving to the source, some awareness of the source.
You might want to have some oversight as to who's seeing the results. Are they supposed to see the results? This is not a stage where you have the citizen data scientist, where everyone can dive in and mess around with it. There still has to be some throttling, because the algorithms aren't managed really tightly yet, things like that. For meaning, you do have to know the context now. Once I'm starting to predict that, based on a certain set of patterns, this is going to happen, I have to know the context of those patterns and the context of the data that was used. Therefore, my governance is now becoming more important. So if I am in a data-governance-light situation, we're just getting started. If you're going to wade into the analytics area, you had better have these capabilities at hand. If you don't, you're not really going to be in the green; you're going to be to the left of that. When you get prescriptive, we have to get down to the next level, or we have to get into federated. Now, we can start federated in the predictive area. We have tighter data quality because in a federated setup I have some oversight; I can say, let's make sure we're using the right sources and the approved sources. Obviously, the awareness of the sources and the distribution are pretty much the same, and the meaning is the same. It's just that we might want to tighten up the sources, because when you're federated, there's more of an audience now for data governance. If we want to do prescriptive and we're federated, we can certainly go beyond that, but the volume dampening comes in, awareness comes in, the context is required, and there has to be more oversight as to who is seeing this. When we get into prescriptive, your sweet spot is active governance, because when you do prescriptive, you are in essence starting to be data-driven.
You might have operational input from your analytical models, and as such, you had better have some good idea of quality. Quality is important. Reference data management (RDM): the data dimensions you're using to slice and dice and present and decide have to be well-governed. The sourcing has to be known. Where did it come from? You have to know where it is and why you're using it. That's lineage. Distribution has to be well-controlled, because you are feeding inputs that will adjust business processes and business functions, so you have to have some oversight over that. And you have to have an idea of context and manage the models; when I say models here, I'm not talking data models, I'm talking analytical models. So prescriptive equals a more robust form of governance. Predictive, we can deal with light or federated. But if we're descriptive, and we're just starting out to look for some patterns, we can get away with little or no governance, which, again, a lot of organizations do. So if we move on here a little bit, let's find the right button. There we go. Let's talk now about the maturity of organizations, because we talked about governing, but that's not necessarily data maturity. On our little maturity scale, as you get more mature, you do tend to get more prescriptive and trust the data more. But that's based on your use of the data. It's not based on the sophistication of the tools; it's not based on your metadata. It's really: what are you attempting to get out of the data? If you're still looking at something descriptive or prescriptive, fine. You could have it all wrapped up in a little bow, and that's really great. But you're still not getting to that data-driven part, where you're really depending on data to help, where data becomes an equal voice at the table. We all know this: companies are managed by individuals based on experience, for the most part.
The more mature you become, the more the data has that seat at the table to help with decision-making. And that's what we're talking about here in terms of our maturity. Maturity is reflected in the architectures; we touched on this a couple of months ago. We start out with isolated architectures, and the value proposition goes to a limited audience. Once the capabilities are recognized, we evolve the architectures to support an expanded audience and become more insight-driven. But we become data-driven when we start to embed the results (in other words, data has that seat at the table) into tactical operations, or into monetization of data. Architectures evolve along this course; architectures are not static. Just to revisit a comment we made a few months ago: your target-state architecture will not be implemented day one. Just don't count on that. Don't try to do it; it's too hard. You will evolve, so let's just accept that fact and evolve. Now, let's talk about how mature you are in how you use the data. As you can see, the top row here is that descriptive-through-prescriptive line. We can also operationalize data and get into more adaptive behaviors. That reflects the maturity of the organization, the capabilities of the organization, where you can make things more autonomous, and there tend to be architectural characteristics that come along with that. It's not an absolute, so don't send me questions saying, John, we are operationalizing our data and we don't do data streaming; we just look at Facebook feeds or whatever. And I'd say, that's awesome, that's great. But in the IoT world, and in a lot of social media examination, you are streaming. You're trying to get as close to real time as possible.
Again, as I said earlier, there are a lot of dimensions to keep rolling around in your head when you start to think: how do I apply governance now, in this evolving-maturity, evolving-architecture world? So back to one of our little charts. When the initial start of analytics is isolated, it tends to be descriptive, and there's not much of a role for governance. In fact, what we've noticed on a maturity view is that it can often be viewed as invasive. So our advice to organizations (and if we had a third dimension to present this, I would have data governance light hovering over this) is: if you're a young program, or federated, it is best to start by building your relationships with the data science area. They have most likely spun up without any data governance oversight. And in many cases, at least half the time, they don't want to be bothered. The other half of the time they say: we really hope we're a catalyst for data governance, because we can't get much beyond this descriptive phase, or get much into monetizing our data, without some of it; we're basically just proving the concept here. Now, if we want to use this isolated type of architecture and do a predictive or prescriptive type of thing, our data governance is going to depend entirely upon what they're doing with those results. If the results are really significant to the organization, you had better apply governance even in an isolated architecture or a very early attempt to use data analytically. And this is where we see a lot of data governance starting now. We see data governance starting in two big areas right now. One I just mentioned here: getting more into analytics, more into being data-driven. Management realizes that they're trying to build something really cool on a supply chain that is questionable at best, so they get on board the data governance bandwagon.
The other one is the regulatory-driven stuff, which we've had for many years, and we're not even going to talk about that one. However, again, with this one, you build your relationships with data science. If you're federated, or you have a pretty sophisticated governance function, and you've been doing data governance on MDM and data quality and BI and the data warehouse, and along comes analytics, you must now build a relationship with them. I don't think you wade in and say, "We are governance; we are here to help. Now just stand in the corner and watch." That's not going to happen, and there are a lot of reasons for that. A lot of it is that the analytics work has been funded by a business area, and they may have tried to keep a low profile and stay away from governance. Or governance is embedded in IT, and the business area starting this doesn't want to deal with IT in the first place. There could just be some ego: data science is a pretty cool profession, and someone comes along that doesn't know a Monte Carlo simulation from the Monte Carlo road race, and you don't want to talk to them; you're busy. So there are a lot of reasons. That's why I like to say: start by building a relationship first, for these more isolated examples. Now, let's say it's been recognized. Once it's recognized, folks, governance has to be there. You are expanding out into the organization. The architectures are recognized; people say, hey, stuff's in the data lake, we're getting cool things out of the data lake, I want to put stuff in the data lake, I want to get cool stuff myself out of the data lake. Now you've got to have the beginnings of a gatekeeper of some sort, not only for the output but for the input. And that's, of course, pretty much the definition of data governance: especially data quality, especially consistency of the data.
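As a concrete picture of that gatekeeper idea, here is a minimal sketch, not from the talk: the source names, required fields, and null-rate threshold are all invented, but the shape is a simple intake check on data headed for a shared lake.

```python
# A minimal, hypothetical intake "gatekeeper" for data entering a
# shared lake: approved source, expected fields, and a null-rate
# threshold. All names and thresholds are illustrative.
APPROVED_SOURCES = {"crm_export", "pos_feed"}
REQUIRED_FIELDS = {"customer_id", "amount", "timestamp"}
MAX_NULL_RATE = 0.05  # reject sets with more than 5% missing values

def gate_check(source, records):
    """Return (ok, reasons) for a candidate data set (list of dicts)."""
    reasons = []
    if source not in APPROVED_SOURCES:
        reasons.append(f"unapproved source: {source}")
    if records:
        # Consistency: the expected fields should be present.
        missing = REQUIRED_FIELDS - set(records[0])
        if missing:
            reasons.append(f"missing fields: {sorted(missing)}")
        # Quality: how many required values are null?
        nulls = sum(1 for r in records for f in REQUIRED_FIELDS
                    if r.get(f) is None)
        null_rate = nulls / (len(records) * len(REQUIRED_FIELDS))
        if null_rate > MAX_NULL_RATE:
            reasons.append(f"null rate {null_rate:.2%} exceeds threshold")
    return (not reasons, reasons)
```

In practice a real gate would cover many more data quality dimensions, but even a check this small captures the point: input to the lake becomes a known commodity rather than whatever anyone cared to drop in.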
Now, consistency is a dimension of data quality, but I like to call it out separately: what's coming in there should be a known commodity. It could be that what's coming in is changing or dynamic, and if it is, maybe we don't want to use that source; maybe we want to use another source. But then moving on now to data-driven. The organization is now embedding analytics in operations, so data has that seat at the table, right? You are monetizing your data. So for the data-driven uses of that data, data quality, consistency, and data governance now up their game. It's not just the oversight and the characteristics I mentioned earlier, like having the right source, data quality, semantic consistency, things like that. Now we're looking at business alignment. One of the often-overlooked roles of data governance, but it's a role that was defined (I know I talked about it years ago, and I'm sure others have), is business alignment. You have this powerful new tool, a data-driven capability in the organization. You've got a data lake and you're dumping IoT stuff into it, and when the possibilities are limitless, there needs to be a conscious alignment of where the business is going with what's coming out of that. Because at this point, you can actually produce too much out of these data models. You can give management too much to deal with; the metaphor of drinking from a fire hose comes to mind here. Organizations that find themselves data-driven still go through a phase where they have to really assimilate the amount of material that can be presented to leadership. So data governance has this additional up-the-game type of role to get involved in. Governance also needs to be part of this: once you're prescriptive, remember, as I said earlier, you are quite possibly inserting yourself into operations. Well, that's a new business process or a new business function.
Data governance comes to bear and says: have we vetted the new process or the new function? Is it documented? Are there appropriate controls, data controls, within this function? Track the rules, track the machine learning models, and, again going back to business alignment, make sure that what's happening is what was intended to happen. Now, I will say there is an argument to be made that this is data governance pushed to almost not being data governance; it's almost business governance, and that's okay. I don't want to have a philosophical argument here, but it is something I've said on this series of webinars: governance is governance is governance. Data governance is relatively new; other types of governance have been around a long time. What we're seeing here is a real positive sign of maturity, when data governance starts to bump into other types of oversight within the organization. So when that happens and you start to have this debate, "Is this data governance?", that's actually a really good sign that you're doing some marvelously cool things with data, and you're going to have to stand back and say, hey, we're doing something that not a lot of people have done, and we've got to put our thinking caps on. Anyway, as you can see from the chart here, maturity really does have a role in the type of governance as well. So I'm going to pause here and let anyone type in a question if they have one. I need a quick sip of water, so I will take that, and then we'll be right back to it. See, this is normally the point where I would have Kelly make a comment and I could get my water. Not today, but that's okay. Again, let's take a pause here; if there are questions, please enter them. I'm more than happy to entertain your questions towards the end, and we're probably going to have a good 10 or 15 minutes this morning or this afternoon for questions. So what additional capabilities do we need? Well, data governance needs to help remediate the architecture.
Enterprise architecture does oversee architecture, and we have many clients where data governance and enterprise architecture get along famously. We have others where one or the other tries to tell the other what to do, and that's incongruous. That's an apples-and-oranges type of thing; it's not the horse leading the cart, it's the horse pushing the steamship. It's an incongruous example. They are two very, very separate types of things that do run into each other in these instances. So if we are trying to evolve a big data or analytics architecture for a certain reason, then that reason, from a business alignment standpoint, should be presented to or run by data governance, to make sure that there is a tie between a business driver and the expansion of the architecture. The reason I say that is because a lot of organizations will say, wow, this year we're this big, and they'll do an extrapolation and say the sky's the limit, and go out and buy an awful lot of stuff that they don't use for a very long time. I think it's smart to try to prevent that. Another bit: providing principles to guide your decision-making around your architectures. Do you want to expand? Do you make things available to folks? What are the gatekeepers for making data available, for promoting a citizen data scientist? What are those guardrails or rumble strips? What are the principles behind that? There are also the methods for doing architectures, and a lot of architectures are driven solely by estimating physical capacities and latencies and network capabilities and things like that, without connecting those capacities to specific business drivers. It's always good for governance to say: hey, you said you need a machine this big next year, or we're going to put this much more in the cloud; what were the numerical drivers for that? Okay, there are your numerical drivers: you're going to have a 10-fold increase in volume.
What are the business conditions that create that 10-fold increase? Do we have those? Are those realistic? Or are you just buying capacity ahead because you have a good deal? That's possible; that's okay. But let's just go through that little bit of due diligence. Again, interact with enterprise architecture. Interact with procurement. We all know about rogue procurement. I have a reputation for being candid, but candid in a nice way, and this is one of those times. Business areas go out and buy stuff, and they do it for what is considered a good reason: it takes too long to go through IT, or whatever. But the more you become data-driven, the more risk you put your organization in, and you cannot do that. This is becoming no different than if someone in logistics says, what we really need is a new factory because our supply chain is getting bogged down, and then they go out and build a new factory without any other type of planning process in the organization. It's not a far-fetched analogy. That's exactly what you do when you take something that was a good sandbox, drop $20 million on it, go with some enormous capacity, get lots of tools, put lots of stuff in the cloud, and sign lots of agreements for external data coming in: you are doing the same darn thing as building another factory without permission, if you're going to be data-driven. If you're going to embrace that level of maturity, you have to embrace some of the rules that come with it, and this is one of them. And then there's governing the use of the components, and this helps out the data scientists, because you're going to have the citizen data scientists wanting to dive in here, and maybe it's not appropriate, so you have to have some controls over that. Being data-driven, sometimes you don't like the answer, and organizations need to get used to that.
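The "numerical drivers" question above can be made concrete with back-of-the-envelope arithmetic. This sketch is not from the talk, and every figure in it is hypothetical; the point is only that a capacity request should be derivable from a stated business driver rather than from an extrapolated budget.

```python
# Hypothetical worked example: tying a storage request back to a
# claimed business driver. All numbers are invented for illustration.
current_tb = 50                  # current lake size, TB
events_per_customer = 1_000      # estimated events per customer per year
bytes_per_event = 2_000          # average payload size
customers_next_year = 1_000_000  # the claimed 10x customer growth

# New annual data volume implied by the driver, in TB.
new_data_tb = (customers_next_year * events_per_customer
               * bytes_per_event) / 1e12
projected_tb = current_tb + new_data_tb
print(projected_tb)
```

If the request on the table is far larger than what the stated driver implies, that gap is exactly the due-diligence conversation the speaker is recommending.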
Now, I put up a picture. We have a lot of fun on our Halloween edition, and there's a trivia contest, so this is a peek at our Halloween trivia: what is the answer that people didn't like, related to the graphic? If you get it right, you get absolutely no prize at all except the pride of having the right answer. So you have to work with the data scientists to diagnose the impacts of being data-driven. That means maybe governance can do some oversight and ask: do the business function people get together with the data science people, and are they aligned on how to react? I have no correct answers yet; I'm very surprised at that. Determine the organization's tolerance for closed-loop activity. What this means is: back when data warehousing was really, really hot in the late 90s, everyone said this was the next step, that we were going to do closed loops. Well, with the Internet of Things and big data lakes and all these powerful analytics and artificial intelligence, we can now have decisions made hands-off, human hands off. If you're going to do this, do you want that human intervention? For example, we're seeing self-driving cars; that's a bunch of algorithms making a car go somewhere. Self-flying airplanes: that's a bunch of algorithms going somewhere. Those are cultural things, so you have to check on that. Double-check with your compliance area that you can be data-driven, that you're not going to break any laws. And again, you might not like the answer that you end up working with. How do you integrate organizational change into this? Because if you are going to close a loop, if you are going to embed algorithmic results into operations, then you need to have those folks in the role. So now the chief analytics officer or the data scientist should be on a data governance council.
They should have a role in this data governance function. We have data stewards; a lot of organizations we now encourage to appoint an analytics steward, okay? That is someone who makes sure that the models are understood and documented, that you know who wrote them, what the expected result was, and what unexpected results happened. And one thing I read the other day, which is really interesting and which I'm kicking around and offering up to our sophisticated clients, is: can you generate repeatable results? We have seen, and I saw an article connected with the pharmaceutical industry on this recently, which triggered this thinking, that you run an algorithm and you get some type of descriptive or predictive result, and you go, that's the way it is. Organizations that do this need to repeat that model and get the same expected result, just to ensure that their initial assumptions were correct or that the assumptions haven't changed. Some organizations plan on that and some don't. So if your business cycle is a year or six months, then at least once every business cycle you need to rerun a lot of these things just to make sure that nothing has changed structurally to alter the result. If you do have an organizational change area, if you do have an organizational change plan with data governance, trigger that and say, what do we need to do now about being data-driven? What do we need to do about overseeing the insertions we're creating in the functional areas? Taxonomies and algorithms are glossary-type items, so let's consider getting those involved as well. And remember, analytics and big data are all part of enterprise information management, so make sure that the appropriate planning activities and business alignment activities all account for analytics as well as any other type of use of data. Let's see here. Let's just move on to this one. Let's talk about compliance for a minute.
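That rerun-and-compare discipline can be made concrete. Here's a minimal sketch of what an analytics steward's repeatability check might look like: the model, the metric names, and the tolerance are all hypothetical stand-ins, since the talk doesn't prescribe a specific implementation.

```python
import statistics

def fit_model(data):
    """Stand-in for any analytic model; here it just summarizes a metric."""
    return {"mean": statistics.fmean(data), "stdev": statistics.pstdev(data)}

def results_repeatable(baseline, rerun, tolerance=0.05):
    """Compare a rerun's results against the baseline the steward recorded.

    Returns a dict of metrics that drifted beyond the relative tolerance,
    i.e. the signal that an underlying assumption may have changed.
    """
    drifted = {}
    for name, base_value in baseline.items():
        new_value = rerun.get(name)
        if new_value is None or abs(new_value - base_value) > tolerance * abs(base_value):
            drifted[name] = (base_value, new_value)
    return drifted

# Last cycle's recorded result versus this cycle's rerun on refreshed data
baseline = fit_model([10.0, 12.0, 11.0, 13.0])
rerun = fit_model([10.0, 12.0, 11.0, 19.0])  # something structural changed
print(results_repeatable(rerun=rerun, baseline=baseline))
```

The point is not the statistics; it is that the expected result is written down once and every scheduled rerun is diffed against it, so a silent change in assumptions surfaces as a governance event rather than a surprise.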
Another area: as we get more and more into big data, we are also getting a lot more ideas out of analytics to help with compliance. So it's not just blocking and tackling. Well, there are two sides to the coin. There is the analytics solution in compliance: are we managing risk appropriately with the analytics solutions? And then there is actually using the power of analytics to become more compliant. For example, under GDPR, the General Data Protection Regulation, notifying the appropriate authorities of a breach is really, really important, and it's quite a tight timeframe. What if you had a big data or analytics solution monitoring social media, and it discerned that your organization was getting a lot of bad mentions? Then you might want to investigate to see if somebody is upset for some reason, or whether somebody took a video in one of your facilities that went viral. This is a way to get ahead: before you find out about the actual breach or the actual bad occurrence, you can start to detect the rumbles out on the Internet. So there are two sides to the coin: be compliant, support compliance, or forecast compliance, right? Either way, though, data governance can help identify the risk areas as part of business alignment. How are we dealing with these risks? Are you going to use an analytical solution to deal with the risk? Okay, you are. What are the risk levels you are currently experiencing, and what are the target risk levels you want to move to? You know, it's a heat map: go from the hot spot we have here on our picture to the cold spot. Identify the risks. Build out the policies. Integrate with whatever appropriate corporate officer, such as a privacy officer, you need to integrate with. And I'm going to do one more pause here before I lose my voice. Apologies for that. Now we're going to move on to the last topic, which is: what kind of support do we need to do this?
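The early-warning idea described above, detecting rumbles before the formal incident report, boils down to anomaly detection on mention counts. A minimal sketch, assuming a hypothetical daily feed of negative-mention counts from whatever monitoring tool is in place:

```python
from statistics import fmean, pstdev

def spike_alert(history, today, threshold=3.0):
    """Flag a possible incident when today's negative-mention count sits
    more than `threshold` standard deviations above the historical average.

    `history` is a list of recent daily counts from the monitoring feed;
    the z-score threshold of 3 is an illustrative default, not a standard.
    """
    mean = fmean(history)
    spread = pstdev(history) or 1.0  # guard against a perfectly flat history
    zscore = (today - mean) / spread
    return zscore > threshold

# A typical week of mentions, then a sudden surge worth investigating
history = [12, 9, 14, 11, 10, 13, 12]
print(spike_alert(history, today=55))  # True: investigate before the clock starts
```

A real deployment would use the vendor's own alerting, sentiment scoring, and a tuned baseline; the governance point is simply that the threshold and the escalation path are defined policies, not ad hoc judgment calls made after the fact.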
There is a lot of good material on big data, analytics, and data governance in the literature, so I'm not going to try to cover all of it and all of the points that I and other authors have brought forward. I'm going to pick out some high points here that I think are pretty interesting. And I'm going to do that through... oh, sorry about that... through just this one little page here. I want to walk through what it means to treat your data as an asset. We talk about this a lot, but the perspective here is an organization that is going to be data-driven or monetize its data. Okay? So there are some things here that might be a little different, and these are things that data governance can help to encourage, facilitate, or even operate in some cases. First of all, if you're going to use data and really exploit it, treat it as a product. We'll talk about the supply chain details in a minute, but you have a supply chain and you'll need quality control. You'll need a product manager, as such. Treat it as that. If your organization makes things and sells them, you have a perfect metaphor for advancing your analytics agenda, because you want that to mirror your supply chains. Defining your data supply chain is the next important thing you have to do, and governance can certainly help with that: define it from the very source to its conclusion and disposal. Now, that's of course records retention and things like that, but also the production of the data. When we produce a good or service, we take something, either an idea or a physical raw material, and we add to it and increase its value, so we can take the sum of the parts and make more on the pricing side, right? You need to understand that with your data. What is the sum of all of the parts? And that's not only the production and the sourcing, but also the marketing.
Once you start to expand the role of analytics, you want to draw people in and have them be more engaged with data, but also to accept the discipline that comes along with it. There's a cost to that. What about shipping? What about distribution? We have this tendency to think that we're just going to make it available and people can just download files, and we have these situations in organizations where there are hundreds of comma-separated files flying all over the place. We don't want that. We want self-service. We don't want to be self-serving, okay? Establish governance and quality functions. And I mean functions, not the program: embedding these functions in the business. Data scientists can't sustain their data science area without good raw material. They can only go so far in that sandbox environment, and then they have to get disciplined and embed this as a repeatable process, and you have to help them do that. There will have to be standards for the source data. You just can't go grab any old data after a period of time. That also means you might have to fix the source data: old-fashioned data quality programs. Integrate your data strategy with the vision of the organization. This is the business alignment I mentioned. 90% of the same support mechanisms for data and data visioning go along with business visioning and business strategy. That is, what are we going to do with it? What changes are we looking for? What are the KPIs? What's the balance sheet result? What's the income statement result? So think along these terms. These are additional things for governance in a sophisticated, analytics-driven environment. So, just a quick review here, then on to the questions. Application of data governance varies with the architecture and the analytics at play. We saw many dimensions. I wish I could put them on one rotating holographic cube, but you kind of saw what I was talking about. Maturity is a metric of your capability, but not of accomplishment.
You can get things done being data governance light and descriptive. You can do really cool things. You're not exceptionally mature, but you can still do cool things. All right? Data quality and consistency are the obvious things we talk about, but don't forget business alignment. Don't forget engaging with the architecture folks, and don't forget that data-driven means some insertion of oversight into the new roles and things like that. So at that point, I will start to address some questions. I did not see... Shannon, you might want to weigh in and make sure I can see all the questions here. I just see a handful, but I don't see a correct answer to the trivia question. Is it there somewhere? Have I missed it? No, you should see all the answers to the trivia question in the chat there. Finally, someone just popped in. Thank you, Shannon. You broke the logjam. All right. So I will go back. And the answer, of course, was 42, right? So here's a question: how do we ensure that we are successfully implementing our data governance program? How do we measure our success? One of my favorite questions. How do you measure success? Well, first of all, you have to define success. And success is not normally best defined as better data quality or better access to data or those things, or having a golden copy of your data. Those are actually requirements of being more disciplined with your data, because you would not ever choose not to do those. But what is the result? If you assume all of those things go as planned, how is your business or your business processes different? Are you making more money? Can you connect embedding analytics into Internet of Things-type stuff with generating more money? I mean, we are seeing... you can watch television commercials.
You cannot go an evening on primetime television without seeing a commercial for a certain very large hardware, software, and services company on how they will help predict things that are going to happen with their analytics solutions. These are, by the way, legitimate solutions. Some of them they didn't invent; they've been around a while, but they are all really, really good applications of the technology, and they all have direct bottom-line results. So if you're expecting bottom-line results, well, what are they? You measure those. Just because you're a data person doesn't mean you're not allowed to say, hey, did our income go up? Do we have a better return on sales? Is our stock price up? So I start with measuring success with business results. After that, you want to measure the effectiveness and efficiency of your data governance functions. So what's effectiveness? Well, are people coming to meetings? Are you getting your dictionary built out? Are you resolving data issues? Do a survey: are people recognizing the value of this? That's effectiveness. Efficiency is: do people feel that data governance is a lot of extra work, or is it saving some money? You can measure those types of things as well. I hope that answered the question. Perhaps Shannon, before we tidy up here, can point you to DataVersity, where we have several recorded webinars specifically on metrics around data governance that you could avail yourself of and give a listen to. What is governance's role in IT and business? Does it involve making decisions? Well, now that's an interesting question, in that data governance is not separate from IT or business. It is a mandate. Well, let's forget data governance. Let me explain it this way. Let's forget data governance. Let's just talk about governance.
If your organization is publicly traded or is subject to some public scrutiny, say a government organization or a very large not-for-profit, for example, you have governance. Your board of directors or your board of governors or whatever has a charter. And I will just about guarantee anybody that the word governance appears in that charter. What that means is someone's job is to provide oversight to make sure that the proper behaviors are taking place: to ensure the government or the corporation or whatever accomplishes its vision and goals, but does so in a proper manner, whether that's controls, financial accuracy, ethical behavior, whatever. That is the essence of governance. So now we've established the fact that governance is not unusual, and neither, then, is data governance. IT does data governance now. Most IT data governance is not how we want to do it. It is: use this standard service, and when I say service, I mean a web service, or use this standard data source because we built a very expensive ERP system. That is governance; however, a lot of times the party making the request doesn't have to adhere to that guideline. So that is data governance light. It's not official, or it's very nascent, but it is a form of governance. It usually lives in IT. Data governance does not necessarily make decisions, but it certainly supports a set of processes and policies that will ensure that decisions are carried out and that the data aspects, the required support of data in carrying out business decisions, are there, are available, and are accurate. The next question: is it possible to have governance without authority? Yes, it is possible. At the end of the day, in a perfect world, five to ten years in the future, there will be organizations where governance is so embedded that the only authority exercised is a periodic, well-accepted, and invited audit.
Someone can say, you did pretty darn well except on this one control, or this one process could have gone a little better, and everyone says thank you. Now, I don't call that authority. I call that controls. I call that normal corporate behavior. And again, those who have heard me talk many times know I firmly believe that a data governance program, if done perfectly, will actually more or less disappear. It won't appear to be a program. It will be an embedded function in the organization. Let's see here. The next question is more a comment: it surprises me how much people feel data governance is not widely appreciated in these early phases, and support from influential upper management is crucial. Absolutely right. Business needs to be better educated and stop making decisions based on emotions. Absolutely right. And for that reason, what we have is the data governance teams on board. We have a few educated or enlightened middle managers on board. We have upper management that says, yes, of course, go do it; we thought you were already doing this in many cases. And in the middle are people who are not supportive, either through omission or commission, but they are not supportive. And you're absolutely right. It's a lot of education. It's a lot of things like that. What metrics do we use to measure ROI? What kind of ROI? Return on investment of governance? There is no ROI on governance. Is it effective? Is it supporting the business? Is it efficient? By ROI, do we mean does it pay for itself, or does it result in benefits to the organization? Yes, but there are many, many metrics for that. There is no "we have a 14% return on our data governance program." That implies that it's a project, a one-time investment, like we built a factory. And that is the absolute wrong kind of metric and mindset to have. What we want, in the spirit of ROI, is an ongoing benefit.
And that's where the efficiency and contribution-type metrics and the effectiveness-type metrics that I mentioned come in. Let's see here. Well, there are a lot of questions here, but some of them have already been addressed, and I think we have one minute left. So with that, Shannon, I'm going to turn it back to you for a wrap-up and a little guidance on some of those other webinars that talk about metrics. And I will tell everybody here: thank you very much for your time today, and I look forward to talking with you at the next event. Thank you. John, thank you for this great presentation. And thanks to all of our attendees for being so engaged in everything we do. We just love all the great questions that come in throughout the webinars. Just a reminder: I will be sending a follow-up email by end of day Monday to all registrants with links to the slides, links to the recording, and anything else that was requested throughout. I hope everyone has a great day. It's beautiful here; hopefully it is in your corner of the world too. And we will see you next month. Thanks, all. Bye-bye.