Hello and welcome. My name is Shannon Kemp and I'm the Chief Digital Manager of DataVercity. We would like to thank you for joining this DataVercity webinar, A Practical Guide to Implementing Effective BI Governance, sponsored today by Metric Insights. Just a couple of points to get us started. Due to the large number of people attending these sessions, you will be muted during the webinar. For questions, we will be collecting them via the Q&A panel. Or if you'd like to tweet, we encourage you to share highlights or questions via Twitter using hashtag DataVercity. And if you'd like to chat with us or with each other, we certainly encourage you to do so. Just a note: Zoom defaults the chat to send to just the panelists, but you may change it to send to all attendees as well to chat with each other. You can find the Q&A and chat panels by clicking the icons in the bottom middle of your screen. As always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and any additional information requested throughout the webinar. Now let me introduce our speakers for today, Mike Smithman and Marius Moscovici. Mike is the VP of Sales and Marketing at Metric Insights and has over 15 years of product and marketing experience in the business intelligence industry. He helped bring analytic products to market in senior roles at Seagate Software, AIM Technology, T-Leaf, Extero, and Good Data. Marius has over 20 years of experience in analytics and data warehousing. Marius is the CEO of Metric Insights, the leading provider of a BI portal that helps organizations organize their BI environments and ensure users are getting the data they actually need. And with that, I will give the floor to Marius to get today's webinar started. Hello and welcome. Thank you, Shannon. Really great to be here.
And before we start, just to get a level set here: we're going to talk through a practical guide to BI governance, so it'd be good to really understand what BI governance is about. I want to make sure that we're clear that when we talk about BI governance and analytics governance, it is often the case that people just think about data governance. And that can be a very big mistake, because in reality only a small percentage of your user community is consuming your data directly: maybe your data scientists and your BI analysts are writing SQL against the database, or using Python, or querying it directly. But the vast majority of the constituents in your organization are going after data using analytics, whether that be a BI tool that's been implemented as the solution, or multiple tools that have been put together and wrapped in Excel, deployed via Python, whatever the solution you devise might be. So when you think about governance, it's very important to think about both pieces of this puzzle and come up with a solution that addresses both the data as well as the front-end analytics that users are consuming. With that said, I want to set the stage with something that Gartner has commented on. They've said that through 2022, only 20% of organizations investing in information governance will succeed in scaling their initiatives across the enterprise. Think about that for a moment. That means 80% of the governance initiatives that are going to be attempted over this next year are going to fail. That's a staggering failure rate, and I think it's shocking in many ways.
And yet, for many of us on this call, I don't think that's a big surprise, because scaling a governance initiative beyond a small department, getting it to the point where it really is adopted at enterprise scale and doing it effectively so you're generating ROI for your effort, is actually a very complex task. Why is that? Well, it's because there are three gears that have to turn together in unison in order to make it work. You've got to have the right people in the right roles with the right responsibilities. They have to be given a process that really works within your governance framework. And then you have to have technology that meshes with the people and the process to enable that process to be executed effectively. If any one of these gears does not turn, or is not the right size, or doesn't fit with the rest, then the whole thing is just not going to work. The machine will not work and the governance initiative will fail. So let's talk a little bit about what each of these gears is about and what success in each area involves. First, let's talk about roles and responsibilities. Typically what we've seen is that there are three primary roles involved here. There's the business; let's call this role the business owner. These are the folks responsible for defining the rules, the governance, how the data should be interpreted. What defines sales? What defines revenue? What defines churn? All the main things that are going to be measured and shown in a particular report or dashboard or data science visualization have to have somebody at some point defining what they mean. And usually the business has to come up with that consistent definition. So the business owner is responsible for that scope of governance. Then you have the BI analyst, and essentially the BI analyst is responsible for taking that information: I'm going to be measuring sales.
I'm going to build a dashboard that's built on top of sales. Here's how sales is defined; we've gotten that baseline defined for me. Now I'm responsible for putting together the business logic so that these reports or dashboards are created with a consistent definition for those particular measures. The analysts are the implementers of the analytics. And then finally, in some cases, you've got the data stewards. These can be data governance team members, or sometimes data engineers that might be doing pipelining, but their fundamental responsibility is to make sure that the data has arrived in the right place with the right definition, and that the analytics being used are sourced from the right place, so that information is consistent and of high quality. So these are broadly the roles. And of course, an individual may fill multiple roles, the roles can be distributed across the entire organization, and there can be a lot of complexity around this. But broadly speaking, these tend to be the three large swaths of roles. To look at this in a practical sense, let's imagine that you have a sales operations use case. Imagine you have a dashboard that's measuring sales rep attainment against goal in an enterprise. Each sales rep comes in and says, okay, here's how I'm doing relative to my target; here's what my commission is going to be by the end of the quarter if I continue at this pace. In this example, the business owner would be the role responsible for defining the business rules. They would be defining: what does attainment mean? What are the territory alignments? How do we measure the progress of this particular sales rep against goal? And how do we measure their compensation? They would come up with those rules that would be used as the fundamental underpinning of this visualization.
Then the analyst, who might be, say, a sales ops report developer, would come in and say, okay, let me build a dashboard, perhaps in Tableau or Power BI, whatever tool they selected, and make sure that this dashboard uses the rules that have been agreed upon and visualizes everything correctly based on that. And then finally, from a data validation perspective, perhaps a Salesforce data pipelining engineer or ETL engineer is responsible for making sure that, well, did we get the data from the right place in Salesforce? Is the right transformation logic in place to land the data from Salesforce into our data warehouse? Did we pull the targets from the right place in the financial system? Is that all being integrated? And is that information all available to the business analysts so they can make sure they're using the data correctly? So you can see how these three roles work together around the governance of both the data and the visualization for this particular analytic. Let's talk about process next. And here, there's a very common issue that happens all the time; we see this scenario with customers day in and day out. One of the big reasons that governance initiatives fail is that people try a one-size-fits-all approach. Invariably, in an enterprise, nobody really wants to do governance. It's one of those things that you just have to do in order to be effective. Nobody in the organization wakes up in the morning and says, hey, I really want to add documentation and classification and go through a process. People want to just go out there and build reports and solve problems. And so what naturally happens is that one of two things takes place. Either governance is instituted because it is something that's required.
There are regulatory requirements, or it's clearly such a mess in the BI infrastructure that you've got to clean it up because your users are screaming that they don't know where to find things. So you've got that driving it, and in that case you tend to build a process that's very heavy and cumbersome and can be very difficult to execute. Or you come up with something super lightweight, because you think, well, people are not really going to follow this process if it's too heavy, so we're going to go with something really light, more of a guideline-based approach, and make it really easy for people to follow. And if you go with either of those extremes, it doesn't work for many of the assets that you have, because some things require more governance and some things require less. And the worst case of all is if you just meet in the middle and say, well, let's make it halfway between these two; then you've fit no one. The analogy I like to use is that it's as if you opened up a clothing store that sells suits and you sell everybody walking in exactly the same size, representing the average of what people wear. Your likelihood of actually having something fit is very, very low. That's intuitively obvious if you think about suits, and yet we do this kind of thing all the time with governance, and it leads to failure consistently. So don't do that. Don't think one size fits all. What do you do instead? Well, we suggest a classification mechanism where you build a content classification grid. You look at the asset that you're governing, determine where it fits within this grid, and then, based on that fit, you assign the right level of governance to it. This grid is broken up into two axes. On one axis, you look at audience size.
Is this visualization something that's being consumed by a small number of individuals? Is it maybe a departmental solution? Or is it something that's going to go to a very large group of people, enterprise-wide consumption or very large departments? On the other axis, you look at business impact. For the analytic being governed: if something goes wrong, if the data is incorrect, misinterpreted, or used incorrectly, what is going to be the impact? Is it going to have a massive impact on the business? Is this the kind of thing that's going out to your board of directors, that your executives are making key strategic decisions from? Is it going to have a medium-level impact: yes, it's important, but not the end of the world? Or is it something where, yes, this is important to the person using it, but they might have two or three other things at their disposal to tell them that something's going wrong, and therefore it's really low business impact if there's an issue? Every analytic that you can think of, every report or dashboard, can be placed in one of these nine cells. And based on that placement, you then identify who is going to be responsible for the governance process. So for something with low impact and a small audience, probably the appropriate governance process is just for the BI analyst to be responsible for the entire thing: making sure everything has high quality, making sure the data is good, the analytic uses the right logic, and so on and so forth.
On the other hand, for something with medium business impact, maybe a departmental solution with a medium audience size that has some real impact, maybe a sales report that could be problematic if it provides wrong information, you might want to involve the business owner, because you want that extra level of validation to make sure it's consistent with the latest business rules that have been defined. You don't want to rely on just the BI analyst to do it, because people make mistakes and you want a little more assurance. At the other extreme, if you've got something with a large audience and maybe a high level of impact, now you really want to surround it with a process that involves multiple parties. It's a three-party governance process where the BI analyst makes sure the reporting is correct, the business owner makes sure the logic reflects the latest business logic, and the data steward makes sure the data has all been sourced from the right place. Following the same kind of approach, you can imagine filling in the grid, based on what's appropriate for your business, for all the other scenarios. Now you have a mapping that says who should be involved in each step of the governance process for the particular asset you're working with: these assets require three-party governance, these assets require one, these assets require two. That's the starting point by which you can ensure that you're tailoring the governance process to the right audience. So again, in the spirit of making it practical, let's look at some examples. Imagine for a moment that you have a network performance report, something a network engineer is using to see if the system network is overloaded. That example is clearly going to be small business impact and small audience; only a handful of engineers are using it.
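To make the classification grid concrete, here is a minimal sketch of how it could be encoded. The role names and the rule for filling in the remaining cells are assumptions for illustration, chosen to be consistent with the examples given (low impact and small audience gets one-party governance; high impact gets three-party governance, regardless of audience size); they are not a prescribed mapping.

```python
# Content-classification grid: map (audience size, business impact)
# to the parties that must be involved in governing the asset.
# Cell labels and role names are illustrative assumptions.

AUDIENCE_SIZES = ("small", "medium", "large")
IMPACT_LEVELS = ("low", "medium", "high")

def governance_parties(audience: str, impact: str) -> list[str]:
    """Return the roles responsible for governing an asset in this grid cell."""
    assert audience in AUDIENCE_SIZES and impact in IMPACT_LEVELS
    if impact == "high":
        # e.g. a monthly fiscal report: three-party governance even
        # for a small, closely held audience
        return ["bi_analyst", "business_owner", "data_steward"]
    if impact == "low" and audience == "small":
        # e.g. a network performance report used by a few engineers
        return ["bi_analyst"]
    # middle cells, e.g. a sales quota attainment report: two-party
    return ["bi_analyst", "business_owner"]
```

With this in place, classifying an asset is just a lookup: `governance_parties("small", "high")` returns all three roles for the board-level fiscal report, while the engineers' network report needs only the BI analyst.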
And if the report is not showing the right number but the network is performing poorly, they're probably going to know, because people will be complaining about it. So yes, you want to make sure it's right, but the impact of getting it wrong is not tremendous. In this example, the report would fall into that low-impact, small-audience category. And there's another grid here on the right, which I'll talk about in a moment, where you can determine the process that would be followed to recertify an element; we'll hold off and I'll discuss that shortly. Another example is the sales quota attainment report. Here, let's take the example I gave you before, where there is a sales report being used by your sales team to determine how they're performing. That rates as medium business impact, because clearly communicating the right performance versus target to a sales rep is important; you don't want inaccuracies in that. And it has a medium audience size: it's going to the entire sales team. Here you would have two different individuals involved: you'd want both the BI analyst and the business owner in on this. You would also want a process by which this kind of content gets decertified, whereby at the end of every quarter, as the new territory alignments come into place, the content gets decertified and then gets recertified once the business has revalidated that the territory alignments and everything else are correct. And then a final example might be a monthly fiscal report, perhaps a report that goes out to your board of directors and your executives on the performance of your business. It's incredibly high impact. If you get those numbers wrong, there might be Sarbanes-Oxley implications.
Maybe there are fiduciary responsibility issues and liability involved. So in this case, even though it's a small audience with closely held data, it has very high business impact. This is the kind of thing that would require three-party certification, with lots of people and process involved, and it would also probably be subject to some kind of decertification process that occurs on a calendar basis. At the end of the month, you need to re-close the books to validate the financials for the prior month; during the period when the books are being closed, this reporting would be decertified. Then once the finance team has checked all the boxes and said, yes, these numbers are correct, it would be recertified, and people consuming it would know that it's good. So those are examples of the different items. Let's talk for a moment about the process itself. Beyond identifying the different individuals involved, there are also different things you're going to want to do with a particular analytic. For example, you might want to make sure that data quality alerts are in place. Whatever it is that you've created, how do you know the data is good? Specifically, how do you know the data is good today? Maybe the data was loaded correctly last month, but this month's data is not. What kind of checks need to be in place for that? A second step in the governance process might be to ask: what kind of data classification fields do I need to add to this content? Do I need to flag this as containing PII data, or identify that it is internal or sensitive or has to be handled in a special way? That's a governance layer that should be added to any visualization.
You might want to add documentation: things like release notes, the ability to identify what key terms mean within the report, and how that particular report has changed over time, so that somebody consuming it is aware of the fact that, hey, we actually changed the business rule here, and this reflects the sales territory alignment that just took effect. So if your numbers look different, that's why. You might want to tag this with glossary terms so that somebody looking at that sales report understands: what counts as sales? Are third-party sales included? Is channel sales included, or just the sales that I'm selling directly? What exceptions are there to how this works? All of that kind of terminology can be associated with this. And then there's the whole certification process: ultimately, being able to say, once I've validated that everything is right, the definitions are right, the source is right, okay, now we can actually say this was certified, and identify who certified it and when. And then, aligning back to the roles we talked about: depending upon whether it's a one-party, two-party, or three-party certification process, different people will be doing these different steps. A BI analyst may be performing all these steps in a single-party certification model. In a two-party model where the business owner is involved, maybe the analyst is just doing the tracking of data quality alerts, adding the data classification, and adding the documentation, while the business owner is going in and tagging the right glossary terms. The business owner is responsible for really saying, yes, I'm certifying that this is measuring sales in the way that we on the sales ops team have designated as the right way to do it, as well as finally giving it that certification stamp of approval to say yes, everyone in the business can trust this; I verify that this is correct.
And then in a three-party certification model where the data steward is involved, perhaps the data steward is responsible for saying, yes, those quality alerts are in place, you can trust the data, and yes, the right data classification has been assigned; the BI analyst is responsible for making sure that the report accurately brings in the data and the right rules are applied; and the business owner has final responsibility around glossary terms and content certification. This is just an example, but you can see how you can easily craft this in your organization based on your data, your content, and the steps you want to follow in your certification, so that it's not one-size-fits-all. The asset is first classified, it goes into the right certification model in terms of how many parties are involved, and then the right individuals are responsible for the right steps to make sure that certification and publishing of this content is really meaningful and that the right data gets pushed out. Now, you saw a little grid on my previous slide that I want to speak to for a moment. When you think about certification and publishing certified content, it's very important to understand that certification is not a one-and-done phenomenon. You cannot be successful if all you do is certify content and forget about it, because invariably, over time, even content that is absolutely correct today might no longer be correct six months, nine months, a year, a year and a half from now. There might be a business rule that needs to be revisited.
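The division of labor just described can be sketched as a simple step-to-role mapping per certification model. The step names and the exact assignments are illustrative assumptions based on the examples above (the data steward owning data-trust steps in the three-party model, the business owner owning glossary terms and final sign-off in the two- and three-party models), not Metric Insights configuration.

```python
# Assign each governance step to a responsible role, per certification
# model (1 = one-party, 2 = two-party, 3 = three-party).
# Step and role names are illustrative assumptions.

STEPS = ["quality_alerts", "data_classification", "documentation",
         "glossary_terms", "certification"]

STEP_OWNERS = {
    # one-party: the BI analyst performs every step
    1: {step: "bi_analyst" for step in STEPS},
    # two-party: the business owner takes over glossary tagging
    # and the final certification stamp of approval
    2: {"quality_alerts": "bi_analyst",
        "data_classification": "bi_analyst",
        "documentation": "bi_analyst",
        "glossary_terms": "business_owner",
        "certification": "business_owner"},
    # three-party: the data steward owns the data-trust steps
    3: {"quality_alerts": "data_steward",
        "data_classification": "data_steward",
        "documentation": "bi_analyst",
        "glossary_terms": "business_owner",
        "certification": "business_owner"},
}

def responsible_role(model: int, step: str) -> str:
    """Who performs a given governance step under a given model?"""
    return STEP_OWNERS[model][step]
```

Once an asset has been classified into a model, `responsible_role(3, "quality_alerts")` style lookups tell the workflow engine whose queue each step lands in.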
So there needs to be a process whereby you're either automatically decertifying content, like the examples I gave you before: the financials, where content just gets decertified at the end of every month because it has to undergo a certification process, or the sales realignment, where the territories get realigned once a year or once a quarter; or the content has to be reviewed for recertification. Many other reports might be, well, they're probably still good, but somebody should check them every six months, every 12 months, every 18 months, whatever the right interval is to determine that they're correct. So for every asset, you need to ask: is this subject to automatic decertification, or should it be reviewed on a regular basis? And should that process run on a calendar basis, or based on elapsed time, say 12 months after it was published, a year and a half after it was published, whatever is relevant? Based on that, you classify the content and put it into the appropriate workflow so that this recertification or certification review process occurs as needed to make sure the content is still accurate. This gives your users a comfort level: when they see something certified, they know when it was last certified, perhaps when it was last reviewed, and they know they can really trust it. The other key point here is that once assets have been certified, if an asset changes, you need to reevaluate that certification. Now, there are different kinds of changes, high-risk and low-risk, and those should probably be treated differently. An example of a high-risk change would be the logic in the underlying tables used for a particular visualization being modified. In that example, I've changed the logic; the SQL statement has changed.
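The calendar vs. elapsed-time distinction above can be captured in a small scheduling helper. This is a minimal sketch; the policy fields, the month-end trigger, and the 30-day month approximation are all assumptions for illustration, not how any particular product computes review dates.

```python
# Compute when certified content next comes up for decertification
# or review, per the calendar vs. elapsed-time policies described.
# Field names and date logic are illustrative assumptions.

from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class RecertPolicy:
    mode: str                  # "calendar" (e.g. monthly close) or "elapsed"
    interval_months: int = 12  # review interval for elapsed-time policies

def next_review(policy: RecertPolicy, certified_on: date) -> date:
    if policy.mode == "calendar":
        # calendar-driven: decertify at the start of the next month,
        # when the prior month's books are being closed
        year = certified_on.year + (certified_on.month == 12)
        month = 1 if certified_on.month == 12 else certified_on.month + 1
        return date(year, month, 1)
    # elapsed-time: N months (approximated as 30-day months)
    # after the content was certified or published
    return certified_on + timedelta(days=30 * policy.interval_months)
```

A monthly financial report would carry a `"calendar"` policy, while a long-lived departmental dashboard might carry an `"elapsed"` policy with a 12- or 18-month interval.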
Well, that might be cause for automatically decertifying the content and then revisiting it, because something has clearly changed the metric definition. A low-risk change might be somebody just changing a description or something of that nature. In that example, maybe all you want to do is review the item: hey, there's been some change here; it doesn't look like a high-impact change, but let's put it into a pipeline where somebody can review the content for potential decertification. So keep it certified, but make sure the change has been noticed and that somebody has reviewed it and indicated, yes, it's okay. That has to be part of the puzzle, along with the automated time-based and/or event-based review, for content to be continuously certified. So we've talked about the people and we've talked about process. The third really important gear here is technology. What good is a great process, and people who are willing to engage in it, if at the end of the day it's onerous? If there isn't the right mechanism in place to make it easy, the reality is that no matter what, people will want to spend as little time as possible on governance. So you need a technological underpinning, the tools in place, to facilitate effective governance in a way that is very low touch, that requires very little effort from users, and that provides a lot of transparency and visibility around the whole process. There are three parts to this solution, and we'll dig into the details in a moment. The first part is that there has to be a place, a portal, where all of the governed content can be assembled. Think about this governance problem and consider the fact that you might have three or four different BI tools, plus reports that people are consuming in Excel and PDF, and lots of different tools out there.
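The high-risk vs. low-risk handling described above amounts to a small event rule: high-risk changes trigger automatic decertification and a recertification pass, while low-risk changes keep the certification but queue a human review. The change-type names and queue labels below are assumptions for illustration.

```python
# React to a change on a certified asset: high-risk changes
# auto-decertify; low-risk changes stay certified but are flagged
# for review. Change types and queue names are illustrative.

HIGH_RISK_CHANGES = {
    "sql_modified",          # underlying SQL / table logic changed
    "metric_redefined",      # business rule or metric definition changed
}

def on_asset_change(change_type: str, status: dict) -> dict:
    """Update an asset's certification status after a change event."""
    if change_type in HIGH_RISK_CHANGES:
        status["certified"] = False           # automatic decertification
        status["queue"] = "recertification"   # must be fully revalidated
    else:
        # e.g. a description edit: keep it certified, but route it to
        # someone who can confirm no decertification is needed
        status["queue"] = "review"
    return status
```

This is the event-based half of the review process; paired with a time-based schedule, it keeps the "certified" stamp meaningful over the life of the asset.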
If you have that situation and you attempt to instantiate governance across each of those silos independently, that is an effort that's doomed to failure. You're never going to get out of the starting gate on that undertaking. So you have to have a way to pull all that content into a governed space, a single portal, a single access point, where people can get to the content as well as the governance information around that content, the certification, the tagging and so forth, in a coherent and consistent fashion. That's a key pillar of the technological solution. A second piece of the puzzle is the workflows. We talked about the fact that one-size-fits-all doesn't work, which means that the process of classifying content, publishing, certifying, and ensuring things are tied to the right terms all has to be governed through a workflow, and there has to be some technology to support that workflow effectively, so that it is not cumbersome or onerous for the people doing it, and so that there's transparency around where things stand as they move through that workflow. And then finally, there has to be a compliance and reporting aspect to this. You may have the portal and you may have the workflows, but if you can't tell or measure how you're doing, you don't know whether you are adhering to the governance policies. Are things being certified? Are there things that went through the certification process six months ago but have not been reviewed and recertified? Those are basically the checks and balances that ensure things get escalated, that people are aware of the things they need to take action on, and that you can measure the effectiveness of the process. You're doing this governance not just to ensure security, but to boost engagement, to increase data literacy, to improve a lot of things in your enterprise.
So how do you know whether that's happening? You need to make sure you have the tools in place to measure those things, so that you can then say definitively, yes, this process is working; or, if it's not working, let me go make some adjustments. So with that, I'm going to hand off to Mike, who's going to show you some actual examples of how the portal, workflows, and compliance can work together to provide the technical support for the governance process. Yeah, thanks, Marius. I'm just going to take a second here and switch across to our demo environment. Just give me one second while this comes up. So, as Marius said, I'm going to take us through how Metric Insights implements some of these things, and we'll touch on the three areas that Marius spoke about: the concept of having a centralized portal for accessing and governing content, and then the workflow pieces that enable you to publish content in a governed way through those different stakeholders and resources that we spoke about. The first piece I'm going to look at is from the perspective of the end user and what it means to have access to governed content, so that I can better understand the context behind the assets I have access to. Let me just refresh this here. Okay. What you're looking at here is the Metric Insights portal. Essentially, think of this as a piece of technology where, as an end user, when I come in, the complexity of understanding where reports and dashboards exist and what technologies they exist in, and understanding what I can trust, is masked for me, and I can come in and access content in one place. What you're looking at here is a catalog view of content, where there are reports and dashboards coming in from multiple tools, and essentially I have access to all of it in one place.
And so you'll see I've got stuff coming from Tableau and Power BI; I've got spreadsheets and PDF documents all accessible to me, and really, I don't even have to understand where they are coming from. If I click on a particular asset, I get a preview of what that content is. And as we'll see, because this content has been through a governance process like Marius described, when I preview it, I'm getting a lot of context around what it is. I'm seeing things like who the owners are, who has been involved in creating and managing this content. I can obviously see what it's called and its description. I can see how fresh the data is, based on the last time it was refreshed in the particular asset. I can see any classification for the content: is this something that is for internal use only? Does it contain sensitive information? I can see the glossary terms that Marius mentioned, and we'll look at these in more detail; this content has been tagged with metrics that have been defined and described and that have owners behind them. I can see an image of the dashboard from the last time it updated, and what it actually means. So even before I access this content, I've got context around what it is, who's responsible for it, what it contains, and whether it's going to answer the question I have right now. And if it does, then obviously I'm going to want to drill into it. In the Metric Insights portal, we basically embed that report; in this case, it's coming from Tableau, but it could be any underlying technology within this portal view. So it's very simple for the user to interact with it and see the context around it. At the top, you'll see the glossary terms we mentioned, the classification we just talked about, and the ownership. You'll see things like documentation related to this content that was added during the publication process.
So it could be things like release notes or further definitions or help around how this should be used. So again, it's not just about publishing content for users. When that content is governed, I get a sense of confidence as an end user that what I'm looking at is correct and that I'm interpreting and using it in the right way. Another piece of the governance process, both from the publication and creation perspective, but also useful for the end user, is this concept of lineage. It's about understanding, whether I'm checking this report as part of the certification process or just consuming it, where did this report come from and where is it being used? In this case, we can see the Tableau dashboard was coming from a particular instance of Tableau, but also that it's being pushed out in a number of distributions in the organization where people are consuming it. So for a particular asset, you can really understand what components make up that asset and how it is being used. So we've got the context around it. We've also got the context that this particular report has been certified, and we'll touch on how that happens in a minute. But it's important, if we're going to publish content, not only to add the governance layers and context to it, but also to let people know that that has been done and it's been checked. And so the concept of a certification stamp of approval is important, as is having accountability for when it was certified. So as a user, when I come into this, I ultimately know I can trust it and it's something I can be working with.
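The lineage view described here, the components an asset is built from and the distributions it feeds, can be sketched as a small graph. This is a minimal illustration with made-up asset names; it is not Metric Insights' actual lineage model or API.

```python
from collections import defaultdict

# Hypothetical lineage edges: (upstream, downstream). The names are
# invented for illustration only.
edges = [
    ("warehouse.sales_fact", "tableau.sales_dashboard"),
    ("warehouse.customer_dim", "tableau.sales_dashboard"),
    ("tableau.sales_dashboard", "email_digest.weekly_sales"),
    ("tableau.sales_dashboard", "portal.exec_favorites"),
]

upstream = defaultdict(set)    # asset -> components it is built from
downstream = defaultdict(set)  # asset -> places it is consumed

for src, dst in edges:
    upstream[dst].add(src)
    downstream[src].add(dst)

asset = "tableau.sales_dashboard"
print("built from:", sorted(upstream[asset]))
print("consumed in:", sorted(downstream[asset]))
```

The two dictionaries answer exactly the two questions a reviewer or consumer asks during certification: what feeds this report, and who depends on it.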
So the general idea within any portal is to bring everything together, to give the context behind it, to have that layer of certification, to organize content in one place, and to give users a catalog of content where they can come and search through it and understand, based on that metadata, what's available, filtering by things like what has been certified. So again, I'm easily able to find the content that I'm interested in and make sure that it truly answers my questions. That's the end result we're trying to achieve with governance: make it easy for people to consume the right content, and to understand how a report was created and what it's telling them. So let's transition over and talk a little bit about the publishing process behind this. As Marius said, one size doesn't fit all. So whatever technology you put in place, you need the ability to define a particular publishing workflow: how does the content get into the portal with all that context, to the point that we can certify it? And so within Metric Insights, we have this concept of what we call publishing workflows, and I'll show you some examples. They can be as simple as a one-party workflow, as Marius mentioned before, where content goes through a simple review step before it gets published to the end users in the organization, or as complex as, say, a three-party workflow, where the different steps have different constituents, different people responsible for reviewing, checking the data, and adding context before it ultimately gets published. So you need a technology that will support different workflows. And if we look at what that looks like from an end-user perspective, what does publishing content mean?
Well, it's about pushing it through that workflow. So I'm going to take the very simple example where maybe I'm doing all of these review and publish steps myself. So I'm in here with a particular workflow with essentially three steps: new content will come into the workflow, it will go through a certification and governance process, and then it gets published in a complete state out to our users. So as a particular user here, John, there are a couple of dashboards that have come into my queue. You'll see again they're coming from different technologies. And in this simple case I may be responsible for doing some of the things that we spoke about: assigning it to a particular category that users have access to, then taking it into an in-progress review stage where I'm responsible for adding the appropriate metadata. So I may tag it with the key metrics we've spoken about that are included within this particular dashboard. I may classify it: this is for internal use, but it doesn't contain any PII data. I may attach documentation to it at this point before it gets sent out. But ultimately, whether it's a simple example like this where it's just me, or multiple users, at some point, once I'm happy with it, I'm going to save it and move it into a certified state for our users. So in this case, I can move it into our final complete state, where it's certified and available to our end users. And whether it's a simple process like this, or the other end of the scale that we were talking about, where there are multiple stages, where the developers are reviewing it, the business owners are reviewing it, and the data stewards are looking at the data before it ultimately gets published, we can manage these different workflows, with different users responsible for different pieces.
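The simple review-and-publish flow above can be sketched as a tiny state machine where certification is blocked until the governance metadata has been filled in. The state names and required fields here are illustrative assumptions, not Metric Insights' actual workflow model.

```python
# Content moves new -> in_review -> certified; certification requires
# the reviewer to have added category, classification, and glossary tags.
TRANSITIONS = {"new": "in_review", "in_review": "certified"}

def advance(asset):
    state = asset["state"]
    nxt = TRANSITIONS.get(state)
    if nxt is None:
        raise ValueError(f"cannot advance from state {state!r}")
    if nxt == "certified":
        # the reviewer must have added context before content goes out
        missing = [f for f in ("category", "classification", "glossary_terms")
                   if not asset.get(f)]
        if missing:
            raise ValueError(f"cannot certify, missing: {missing}")
    asset["state"] = nxt
    return asset

dash = {"name": "Sales Analysis", "state": "new"}
advance(dash)                           # new -> in_review
dash.update(category="Sales", classification="internal",
            glossary_terms=["bookings"])
advance(dash)                           # in_review -> certified
print(dash["state"])                    # prints "certified"
```

A multi-party workflow would just add more states to the transition table, with different users permitted to perform different transitions.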
And ultimately, what are we trying to do here? If we go back to our portal, we're trying to end up with a set of certified content that the end users will have access to. So: everything in one place, and publishing workflows to get it out there in a governed way, oftentimes in a pretty distributed format. It's not as simple as having one analyst in your organization do this. There are probably multiple teams publishing content through the different workflow formats that we've spoken about. The third piece of the puzzle, for the last five minutes here, is the measurement of the process. Marius mentioned before having checks and balances to ask: is the governance process affecting how we use data in the organization? Is it improving the way we use data? Are we getting better engagement with the BI assets that we are putting out there for our business users? And one measurement of that is looking at how content is being used. Any portal that you put in place like this needs to be able to measure usage at really any touch point within the organization. So I could be accessing reports and dashboards within the portal like we've looked at here. But equally, I may be consuming content because it's gone out in a distribution to my email, and that's ultimately where I'm accessing it, or I could be bookmarking it in the online tool, or I could be accessing it on my mobile device. Anywhere a user is touching a piece of content, we want to make sure we're tracking that and using it as a gauge of how well the assets we're publishing are being used. And so within Metric Insights, we report on that information.
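Measuring usage at every touch point boils down to logging each view as an event, whatever the channel, and aggregating over a trailing window. A minimal sketch with a made-up event shape (this is not an actual Metric Insights log format):

```python
from datetime import date, timedelta

# Hypothetical usage events: portal views, email opens, and mobile views
# are all logged the same way, so engagement can be counted per asset
# regardless of where the user touched the content.
events = [
    {"asset": "Sales Analysis", "user": "john", "channel": "portal", "day": date(2021, 5, 1)},
    {"asset": "Sales Analysis", "user": "ana",  "channel": "email",  "day": date(2021, 6, 10)},
    {"asset": "Sales Analysis", "user": "ana",  "channel": "mobile", "day": date(2021, 6, 20)},
    {"asset": "Churn Report",   "user": "john", "channel": "portal", "day": date(2021, 3, 1)},
]

def rolling_views(events, as_of, window_days=60):
    """Count views per asset within the trailing window, across all channels."""
    cutoff = as_of - timedelta(days=window_days)
    counts = {}
    for e in events:
        if cutoff < e["day"] <= as_of:
            counts[e["asset"]] = counts.get(e["asset"], 0) + 1
    return counts

print(rolling_views(events, as_of=date(2021, 6, 30)))
# Sales Analysis has 2 views in the last 60 days; Churn Report has none
```

Comparing the same rolling count at 30, 60, and 90 days is what surfaces the trends discussed next: what's gaining traction, what's fading, and what's going unused.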
So obviously I can look at views of content over time, but more importantly, I can see on a rolling basis, in this case the last 60 days, what content is actually being used, what's increasing in popularity, what's decreasing over time, and what's going unused. And whether it's the library of content as a whole or just the content that I'm responsible for, the content I've created and pushed out through our workflow process, I should be able to see how that engagement is happening, down to a pretty detailed level as well. Oftentimes we'll look at usage at a very high level. In this case, our Tableau sales analysis dashboard was viewed 747 times in the last 60 days. But I need to really understand the user journey with that content. Was that because I published it 60 days ago, 747 people came in to see what it was all about, decided it was actually not that useful, and never went in again? Or is it something getting continual engagement over time? And so we can look at a visualization like the one we're looking at here, replaying over those 60 days which users or user groups were coming in and engaging with that content. Are they coming in regularly, bouncing in and out and looking at it on an ongoing basis? Or did they look at it once, like John here, and never come back in again? By understanding that engagement and who's looking at the content, we can better promote or categorize it so that people get better usage out of it. Another way to think about the effectiveness of governance, when we're talking about categorizing content, is understanding whether people are even finding the content. One thing that categorization allows us to do, as we saw within the search, is make sure that people can actually find the things that they're looking for.
And so we track things like search performance. When people enter search terms into the portal, are they getting results? Are they actually clicking on content, a successful search because they found what they wanted? Or are they searching for things where either no results come back, or the search is unsuccessful because they're not clicking on anything in the results? We can use that to better understand how we should categorize content, or, if the content isn't available, those searches may be candidates for things we should build in the future. And then the final thing for usage: understanding whether, over time, we are getting better engagement with BI in general across our whole body of content. If we look at 30-, 60-, and 90-day usage of content, are we seeing positive trends where usage is increasing because we're getting better engagement and people are finding the things they really need? So in summary, before we open this up for questions: three things. A portal: get everything in one place so people can find and consume content. Publishing workflows: get content out there with context. And measurement: understand what effect it is having within the business. So I think we're right on 11:45 here, Shannon. Why don't we see if there are some questions? Thank you both so much for this great presentation, lots of questions coming in. And to answer the most commonly asked question: just a reminder, I will send a follow-up email to all registrants by end of day Monday with links to the slides and recording, as well as anything else requested throughout. So lots of great questions coming in here. Diving in: how is a data steward for BI and analytics governance different from a data steward for traditional data governance? I think it can be a broader role.
It depends a little bit on how you define it, but traditionally a data steward in a role purely focused on data governance is just looking at the provenance of the data itself, you know, how it comes in and the quality of it. I think it's important to expand that stewardship role a bit more in this context, to have that data steward also understand what analytics are sourcing this data, extending lineage further than just the final landing place of the data in the data warehouse, because again, it's a holistic approach. So you take the traditional data governance role and add the ability to really explore and understand lineage into the analytics where the data is consumed, and validate the quality around that entire process. And I would say not just, did the data land in the warehouse correctly, but do we have quality all the way into the analytic? Do we have the right dependency checks to make sure that that Tableau dashboard is pulling the data at the right time? That then becomes the full stewardship role that's covered in the BI governance scenario. So that's a great question. Next question: what components and thresholds would be set in order to fire data quality alerts? If the system has a lot of content, where would be an ideal starting point for a BI analyst in order to set appropriate alerts? Yeah, I can take this one. I think this is an interesting one. If you think of the typical flow, there are often checks and balances happening within your ETL processes. Interpreting this question, I think it's important to have checks and balances from a BI and analyst perspective as well.
And the way we see that happening, and we actually have some alerting capabilities that we didn't get to within the demo, is being able to, yes, have some alerting around the more technical aspects of your dashboard. Oftentimes, analysts at our customers will set up alerts where they're looking at things like the size of the data set that is driving a particular dashboard, the number of rows that typically get loaded. And if there's an anomaly, you know, suddenly one day we get half as many rows as we usually do, then that could be an indicator of a data quality issue. But also set up alerts around the business metrics within the dashboard, because if you've got a sales dashboard like the one we're looking at, and suddenly there's a significant drop in sales on a particular day, well, the reality is that could be a business issue, in which case you want to be alerted to it. But by alerting the analysts to that issue first, it could also be an indication that something actually happened with the data, and you may want to check on that before it goes out to the general population. So I'd say set up alerts that, yes, touch on some of the technical aspects, but also use business metric KPI alerting as an early warning that something might be wrong with the data. And if the data checks out, then alert the business users, because something's going on in the business. And I would add to what Mike said, in the vein of where you should start, because you can't boil the ocean, you can't check everything on day one: use that same classification grid that we showed you. Look at the assets that you have today that fall into the parts of the grid with high business impact and high user volume, and then work your way down that high-impact line.
And those are the items that you want to validate using those two mechanisms Mike described: validating things like row counts, but also the metric itself, so that if it has changed in a significant way, I know to look at it. Perfect. So, do you have any suggestions for tracking the impact of the certification process? Any rule of thumb to build this review into the process? In terms of tracking the certification process? Yeah. So I think the key there is, whatever workflow, tool, technology, or capability you use, to ensure that there is logging around it, and that there's reporting around the logging that takes place. So for example, if there's the expectation that something moves through the publishing process and gets certified within a week, then things that haven't started the process, or have started but not finished, should get escalated to somebody via notifications. Same thing if there's the expectation that you're going to recertify after a certain period of time: having that logged, such that reporting goes out to the appropriate party to say, hey, the content that you certified six months ago is due for recertification, hit this link and go recertify. Making that really easy and transparent are the other two key things. Yeah. And I'll add to that, and I'm scanning questions here, and I know we're going to run out of time. Related to that, there are a lot of questions around ROI and budget and your BI practices in general, and around measuring the certification itself as well.
So if you've got a body of content that you're expecting to be certified, how much of it is certified? Is the certified content getting better engagement, because people trust it, compared to the content that isn't certified? If an analyst is responsible for a body of content, how much of his or her content has been certified versus not? So measure the process, but also measure what you have within that library of BI content that you're managing. Yeah. Shannon? Sorry, I was fighting my mute button there. What I was saying is, we do get a lot of questions around ROI. It is a really common question. All too often there seems to be low utilization, which tends to make management hesitant about continued budget allocation. Do you want to expand on that? Yeah, I can start with that. So, you know, I think that's been the reality with BI over the last couple of decades, right? And I think in some respects it's because we've become almost a victim of our own success, in that we've made it too difficult for people to use content. It's easier for a business user who's got a couple of minutes every day to pick up the phone and ask for a set of data or some report, rather than go and find it themselves, because there are just thousands of reports out there that he or she may have access to.
So, yeah, I think the way to start is by getting your BI under control with some of the techniques that we've been talking about. It's less about volume and more about quality: making sure that what you're making available to business users is the stuff they should be using, and that they can easily find and stumble across that content if they need it, so that doing so is easier than, again, picking up the phone and asking for something else. Because the reality is a lot of BI teams spend 24/7 fighting fires rather than doing what they were recruited for, which is probably more advanced analytics that moves the business forward. So I would say the way to start is: get your environment under control, focus on the body of content that's important, have a process to publish that content in a governed way, measure that it is being used, and actually change the paradigm and show your management some higher engagement figures. And map that back to your BI tool usage. The reality is, are you utilizing the licenses that you're paying for, based on the engagement that you're getting with the content? Because if you're not, maybe there's an opportunity to reallocate some of what you have or focus in different areas. I think what Mike said is exactly right, and it reminds me of that old saying Stephen Covey used: when there's no gardener, there is no garden. And this governance process is about creating a garden. It's about creating an environment where the content is curated, useful, and meaningful.
So, you know, yes, there are all these measurements and numeric ROIs, but at the end of the day, the numbers you already have, how much engagement there is, how much duplicative content there is, how much content is going unused, can in and of themselves be used to make the ROI case: look, there's a tremendous opportunity to turn this jungle that we have into a well-governed, well-manicured garden. And I think we have time for a few more questions here. So one question is: I'm having a hard time seeing how analytics governance is different from data governance, other than the topic. Is there something different, something extra that's needed? Is there data governance for analytics? Yeah, I think the person who asked the question hit on the point that there are very similar principles here. The point is that historically, if you look at most organizations and the data governance functions happening in them, they're very narrow in scope. They're focused around the idea that when the data gets into the data warehouse, that's where the responsibility ends: how does the data get in, who do we allow to consume this data, what is the classification around the data? Stopping there is all necessary, but it's insufficient from a BI governance perspective. So the way to think about BI governance is as a broader umbrella that encompasses data governance, but also extends the principles you use in data governance to the analytics themselves: to the visualizations, to the dashboards, to the spreadsheets that are created off of this data and then consumed by the end user. I think that's a key point as well.
I think, you know, oftentimes governance happens in a vacuum, divorced from the users themselves, and hopefully some of the things we showed demonstrate how governance can actually be helpful to the end user, rather than just a checks-and-balances thing that happens behind the scenes. You're going to put this effort into checking the data, categorizing it, tagging it, and certifying it; expose that to the end user through analytics governance practices so that they get the benefit of it. You're on mute again, Shannon. So, what about tools? How do you add tools to the mix, tools and environments? How does that fall into analytics governance? Yeah, I guess I can start. I think the challenge with a lot of environments is that we're all reporting in many different tools and technologies, everything from Excel to reporting in our operational systems to, oftentimes, multiple BI tools within an organization. The challenge in that sort of environment is that some of those tools may have some governance capabilities and others may not, or anywhere in between. And so I think it's very difficult to implement governance in a heterogeneous environment like that if you don't have some layer of technology on top, whether that's Metric Insights or something else, that allows you to manage governance independent of the underlying tools where the content exists. I think that's important to really be able to do it successfully. Otherwise you just end up with governance happening in a lot of different vacuums across the organization, and then it becomes very difficult to report on it and understand how well it's happening. Yeah, and I don't know if that's what the person was getting at with that question, but the other aspect of this is the governance of the technologies themselves.
And I think there's an aspect where this enables that as well, in the sense that if you're measuring utilization at a very detailed level, you can do things like look at how licenses are being used and by whom, and effectively, and there's another area of ROI as well, reallocate license usage in such a way that you're better governing the usage of the tools you have today. And there are a number of ways, we didn't really have time to get into it, but there are a number of ways in which the portal should enable that as well. And then finally, the governance workflow process that Mike showed you can also be integrated into a multi-environment scenario, where you say, I'm going to move this content through my QA environment in multiple stages, and as a final step, it's actually going to go into production. There should be automation around the synchronization of content into the workflow, so that you can determine which things are subject to promotion and certification, as well as, on the back end, potentially which things are subject to going from QA to production. So there's a lot of complexity there that can be handled in the more complex enterprise environments. Well, thank you both for this great presentation and for the Q&A, but I'm afraid that is all the time we have for today. And thanks to all of our attendees for being so engaged. We'll send you the speakers' contact info so you can get them the additional questions that we didn't have a chance to get to and get your answers to those. And again, a reminder: I will send a follow-up email by end of day Monday with links to the slides and the recording, as well as the contact information for you all. Thanks again to Metric Insights for helping make this webinar happen.
Really appreciate it, Mike and Marius. Thank you so much. Thanks everyone for joining us. Thank you. Bye-bye. Thanks all. Have a great day.