Hello and welcome. My name is Shannon Kemp and I'm the Chief Digital Manager of Data Diversity. We'd like to thank you for joining this Data Diversity webinar, The Three Pillars for Effective Business Intelligence Governance, sponsored today by Metric Insights. Just a couple of points to get us started. Due to the large number of people attending these sessions, you will be muted during the webinar. For questions, we will be collecting them via the Q&A panel in the bottom right-hand corner of your screen. Or if you'd like to tweet, we encourage you to share highlights or questions via Twitter using hashtag Data Diversity. And if you'd like to chat with us or with each other, we certainly encourage you to do so. Just click the chat icon in the bottom right-hand corner of your screen for that feature. And as always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and additional information requested throughout the webinar. Now let me introduce our speakers for today, Marius Moscovici and Mike Smitheman. Marius has over 20 years of experience in analytics and data warehousing. Marius is the CEO of Metric Insights, the leading provider of a BI portal that helps organizations organize their BI environments and ensure users are getting the actionable data they need. Mike has over 15 years of product and marketing experience in the business intelligence industry and is the VP of Sales and Marketing at Metric Insights. Mike helped bring analytics products to market with senior roles at Seagate Software, AIM Technology, Tealeaf, and more. And with that, I turn the floor over to Marius and Mike to get today's webinar started. Well, thank you and welcome, everyone. This is Marius, and today we're going to talk about the three pillars of effective BI governance. 
And we hope to leave you not only with some ideas and examples of what makes BI governance work, but also with some tangible methodology that you can apply in your environment. So we'll talk about what effective governance needs, take a look at what the BI governance lifecycle looks like, show you some examples, and then wrap up with Q&A. So rather than boring you with a long definition of what BI governance is all about, I thought I would provide you with a few signposts or markers that you can use to take a look at your own organization and assess how effective a governance platform you have internally. So let's talk about some of these characteristics of well-governed BI environments. First of all, there's happy coexistence between security and discoverability. What do I mean by that? Well, in a well-governed environment, users can find content, and not only can they find the content that they have access to, but they're able to say, here's some content that maybe I don't have access to, but I can request access to it. And that's important from a governance perspective, because if there's something that's near or close to what I'm looking for, then I can request access rather than going off and building a near duplicate of it and wasting a lot of effort. Another key criterion is that a well-governed BI environment is much more like a garden than a jungle. So think about your BI environment, whether you use Tableau, MicroStrategy, or any BI tool. When you go in there, does it look like a well-manicured Japanese garden, the kind where every leaf is in the right place and everything is beautiful and well organized? Or does it look more like a jungle, with duplicate reports, you know, version one, version two, version three of any given report, and reports that are obsolete? 
When they were built six months ago or a year ago, they were useful, but since then definitions have changed, something's changed in the logic, and they're no longer presenting the right information. Is it easy to find things, or are you constantly wading through the clutter of this obsolete or duplicated content, so that the user experience for finding things feels like navigating through a dense jungle? Similarly, ask yourself the question: what's the level of trust that you're seeing in both the data and the reporting? Trust takes on multiple dimensions. It's both about the fact that your users feel like, hey, I know which content I need to go to to find information, but it's also about knowing that the data is accurate and that it's timely. Do I know, and can I trust, that when I'm looking at this dashboard, not only is the underlying data correct, but that I understand what the dashboard is saying, I've got the right context for it, and the information is current? I'm not looking at data from three days ago; I'm looking at yesterday's data. A well-governed environment has a high level of trust. And then, of course, if you have all these pieces in place, you'll see high utilization of your BI tool licenses. BI tool licensing is typically one of the largest items of cost for a BI organization. And yet, when we look at our customers and people in the industry, we find it's not atypical for organizations to see less than 50% of their licenses effectively utilized. And that's a real problem, right? That means half of your cost is going into nothing. In a well-governed environment, there are tools in place to ensure that your licenses are going to the users that will actually use them, and there are dynamic provisioning and de-provisioning mechanisms in place to ensure that over time. 
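To make that licensing point concrete, here is a minimal sketch of how the utilization figure above could be computed from per-user activity records. All names here (the `last_activity` map, the 90-day window) are hypothetical illustrations, not any vendor's actual API:

```python
from datetime import date, timedelta

# Hypothetical activity log: licensed user -> date of last BI tool login.
last_activity = {
    "alice": date.today() - timedelta(days=3),     # active this week
    "bob":   date.today() - timedelta(days=200),   # idle for months
    "carol": None,                                 # licensed but never logged in
}

def utilization(activity, window_days=90):
    """Fraction of licenses with any activity in the last `window_days`."""
    cutoff = date.today() - timedelta(days=window_days)
    active = sum(1 for last in activity.values() if last and last >= cutoff)
    return active / len(activity)

print(f"{utilization(last_activity):.0%} of licenses active in last 90 days")
```

With the sample data above, only one of three licenses counts as active, illustrating the under-50% utilization Marius describes.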
And finally, the largest area of expense for your BI team, in almost every organization, is the staffing, right, the team that you have. In a well-governed, effectively governed environment, you have tools with the capabilities and the visibility to make sure that the BI team is working on those tasks that are going to maximize business value. In ungoverned environments, that's not the case. So, with these sorts of criteria in mind, let's take a deeper look at what makes up BI governance and how to get to that state. First of all, at a very high level, we should think about BI governance as two pieces of a puzzle that have to fit together, and they're equally important. Everyone always thinks about data governance: making sure that there is master data management in place, that the data is accurate, that the right ETL pipelining process is in place. And that's incredibly important. But that's only half of the equation. The other half of the puzzle is analytics governance. Because no matter how accurate the data is, no matter how well structured and clean and pristine it is in your environment, unless you are visualizing it in a way that is consistent and provides the right context to your users, your users won't necessarily interpret that data correctly. If you have three or four different dashboards measuring churn and they use slightly different definitions, well, it doesn't matter if the underlying data is consistent and clean; the experience that you're giving your users is going to be inconsistent, and it's going to feel like they cannot trust the data. This problem is exacerbated in most large organizations, because establishing this kind of governance across the entire BI stack is not just about managing your Tableau environment or your Qlik or MicroStrategy environment or any given tool. 
You need to have a management strategy that works across a portfolio that's very broad, that has a number of different BI tools, and that also covers SaaS offerings; sometimes you've got things inside of Excel spreadsheets that people are using, PDFs, various things that users are creating on their own, maybe in a SharePoint. So there needs to be a strategy and an approach to BI governance that is universal and covers all the different content you have in your enterprise with a consistent fabric. And that's a big challenge. So how do you get there? Well, there are three key pieces of the puzzle that have to fit together. The first one is around data literacy. To have good governance, you obviously need an environment where your users know how to find the data, know what they're looking at, and can interpret and use the right analytics to perform their day-to-day functions. So let's break that down and unpack it a little bit more. Data literacy, as a first step, is all about providing, first of all, a single access point to all the information that you have, and then providing users with a tailored user experience. By definition, you cannot have data literacy in your organization if your users have an experience of going through that jungle, if you're throwing every single analytic and reporting tool at them in a disorganized fashion, where all the tools have different access points, different places where users go and search for information. How can you expect your organization and the business users in your organization to understand where to find the right information and to use the information at their disposal in an intelligent way? 
So unified access, a highly tailored user experience, an experience that is aligned to the user's persona and the way that they use data in their journey within the enterprise, is absolutely critical to be able to establish data literacy. The second item is to make sure that you're thinking more broadly than security. So again, with the exception of maybe a few small, highly sensitive pieces of content, you know, that maybe are HR-related and you don't want to make discoverable, most content should be discoverable. It should be such that if I'm looking for something, I can find that, oh, there's this piece of content, I don't currently have access to it, let me request access, and that goes through a workflow. That avoids a situation where users are misinformed because they think something doesn't exist when it actually does, and it avoids the creation of tons of duplicate content. So having a discoverability capability, and a very powerful search to enable it, is critical. And then having some way to organize, categorize, tag, and manage the metadata around all of your information, including documentation, is super important. If your metadata is buried at the data level, if it's all at the table and column level, or it's inside of the reports that you've created, well, there's no way to establish data literacy in the enterprise. You need to have a mechanism by which users can not only find an analytic but understand, you know, what are the rules that have been defined to come up with these numbers? What's the key lineage information associated with this? How do I interpret this information? And so this is very important. One of our customers said this to me the other day: the user experience that most people should strive for is that of a museum. 
You know, if you think about a museum, maybe an art gallery you like to go to, when you go there and visit, you're not seeing every piece of art that they have strewn around the place. There's probably 10, sometimes 100 times what they currently have on exhibit sitting back in the warehouse, but you're not seeing any of that. Instead, you're seeing the things that are thematic to the presentation, to the particular exhibit that they're showing. And you're seeing that in context. You've got those little placards that explain, you know, what is this piece of art about? Maybe you're listening to an audio guide that describes it, that provides an interpretation. That is exactly the kind of journey that you need to create for your users with your data if you want to create data literacy within your organization and create governance. And then finally, certification is paramount. If I'm using a piece of data, I need to be able to know: is it certified? Has somebody vetted the fact that this information is consistent with the other certified content we have in the enterprise? Who certified it? Who can I hold accountable for that? When was it certified? And is there a process in place to ensure that not only does content get initially certified, but there can be re-certification at the right interval when something changes? All of these pieces have to fit in place so that you can establish data literacy and build a foundation for proper governance. So, with that, I'm going to shift gears for a moment and have Mike show you some examples of how you can achieve these goals using Metric Insights. Yeah, thanks, Marius. So, I'm going to jump across here to the Metric Insights product and, as Marius said, show you some examples of what we're talking about here. Obviously, think beyond the specific technology that we're using here to the process, and how the features we'll show add to what Marius has been saying. 
So, again, what you're going to see here is the Metric Insights platform. We've got it stood up in a typical enterprise that we see, where it's connected to a number of different reporting and BI technologies and ultimately sources of information across the enterprise. And as Marius said, phase one of any sort of governance journey is really about understanding what you have today and having an effective way of cataloging that in one place for your users. So, what you're looking at here is the Metric Insights interface. We're actually in the catalog view, and each of these tiles that you're seeing here is where we've published something to the catalog in a structured way. So, you'll see that I've got content coming in from multiple BI tools. I've got some content coming in from SaaS applications like Salesforce. I've got some documents, some spreadsheets. Again, the intention is that we can bring it all together into a single place where we can start to organize, as you'll see in a minute, all the content that might be useful for the users that are out there. Now, if we start drilling into this, what does this ultimately mean? Well, let's take the example of this sales dashboard that we have in Tableau here. What's actually happening when an analyst chooses to publish this into the catalog? Well, if I click on it, effectively, what's happening behind the scenes is we initially ingest any of the metadata that exists already for this particular report. So, that might be simple things like what it's called, the description, perhaps things like when it last updated. All these are things that we can automate through the plugin that we have into Tableau, and that we want to bring in and surface to the users. But equally, we want to start thinking, as Marius said with the museum example, how do we start to categorize this content in a useful way for the end users? 
And so, having the ability to layer on additional metadata or additional context is critical. So, I'm looking at a preview here of the dashboard. Down the right-hand side, you'll see some examples of that. We probably want to highlight ownership of this content so that we've got some responsibility for it, whether that's business owners, technical owners, maybe data stewards who are responsible for the data behind it. We want to highlight that and communicate it to the users so that they have a point of contact if they have questions. Maybe it's metadata fields that make sense, you know, what's the business unit that this is targeted at, or does this contain personal information that we need to be sensitive to, or lineage information, what's the source of this dashboard that I'm going to look at. So, by pulling in this metadata, we can start to give the end user a picture of what it is that they're going to access here. And, for the sake of the technicalities, this metadata could be manually added within the portal when I publish it here, or it could be synced up and automated with some of the tools that are out there today. So, maybe you're documenting data in a Collibra or an Alation, or simply an Excel or SharePoint list. We want to be able to pull that in appropriately and synchronize it, or, again, add it manually within the tool. You'll see here, just as a point, that we're also collecting images of the dashboard for the user. So, having an accurate visual preview of what the current version of the dashboard looks like, again, adds context to what I'm going to look at and gives me an indication of whether it's going to be useful for me. Now, if I drill into this piece of content, one of the things we're trying to achieve as well for the end user is giving them a consistent way of accessing content. So, I now understand what this is. 
I don't necessarily want to have to figure out: where does this exist in Tableau? How do I access it? How do I work within that particular tool? We want to give users a consistent experience in a governed environment. And so, you'll see actually what we're doing here is we're embedding the Tableau dashboard. This is the live Tableau dashboard, so I would be interacting with it in the same way as I would in Tableau. But we're embedding that within a consistent, I guess, wrapper or framework. So, around the edge is actually Metric Insights, and you can see more of this metadata around descriptions and tagging that we've added. You can see things like the ability to add further documentation and context to the dashboard in a consistent way. So, here we might be tying in release notes and definitions. If I click on this, it may take me out to a wiki or a Confluence page, or we may be uploading documentation. But again, the intention is, if a user is truly going to trust the data that you're providing within the dashboard, we want to make sure that they've got the context to interpret it right. So, that might be things like the objectives behind it, the definitions of the metrics, where it's come from from a data source perspective. Having this consistent view, with a consistent way of categorizing the content, gives the user some trust in the data. And also, by embedding it in this way, regardless of whether this is Tableau, a Qlik, or MicroStrategy, functionally we can start to give users a consistent experience as well. So, whether that's in terms of being able to share content in a consistent way, subscribe to it to receive it in their email, create bookmarks against the content so that they're always coming into their slice of data, even collaboration around it so that they can add context to the content, is key. 
And so, whether I'm accessing Tableau, whether I'm accessing MicroStrategy, Qlik, any reporting tool, even a PDF, I get a consistent way of doing that. So, if we jump back to our homepage here: we're publishing content, we're adding context and categorization to it. Another layer that Marius spoke about is the concept of certification as well. And so, you'll see some of this content has green checkmarks on it, with an indication that it's been certified by a particular person at a particular point in time. Now, certification typically happens again at publishing time. So, as an analyst responsible for publishing content into this catalog, I may also be responsible for checking it, testing the data, QA-ing it, and giving it a stamp of approval, so that when users come in, they're able to see which data they can truly look at and trust. And so, it might be a simple certification by a single person like this, or it might be more of a process that goes through a workflow, where maybe people are staging the content into the catalog, and as an analyst or data steward, I'm responsible for essentially checking that and either certifying and publishing it or perhaps sending it back if I see issues with it. I should be able to do that before it hits the end user and those business users who are going to make use of it. So, that's really phase one: starting to get this consistent way of publishing, documenting, and accessing content. Obviously, as well, there is a security and permissions aspect to this. So, I've just moved across to another browser here. I'm logged in as more of an end user. And obviously, my permissions ultimately dictate what I see and what I have access to, in terms of the categories of content and the tiles that are relevant to me. So, Marius spoke about this concept of discoverability versus security. 
Clearly, there is a security aspect where we want to make sure users are only seeing what they should be seeing. But equally, I should be able to search through that content very easily. So, as a user, I should be able to come in, browse through my categories of content regardless of tool, but also search across that catalog for things that are interesting to me, in a way that makes use of all that metadata and those fields that we've added to the content. So, when I search within the portal here, or within the catalog, it's looking across that content. It's searching through those metadata fields that we had in place. It's allowing me to filter on things that are certified, so that I can again focus on the key pieces of content that make sense to me. Equally, though, with this concept of discoverability, if I'm browsing through the catalog and there are things that perhaps I don't have access to but that have been deemed discoverable within the catalog, I should be able to see that. So, you've got an example here where I'm browsing through some procurement dashboards. I can see they have a lock on them. And again, this is where they've been published as discoverable. My permissions dictate that I don't have access to these today, but there should be a process such that if I see something that looks interesting to me, I can click on it and request access from the owner of that content. That can then be pushed into a workflow, or provisioned by the owner, so that they can give me access, rather than my creating a duplicative piece of content just because I don't have access. So, that really covers this first phase. And as Marius said, again: a single point of access, balancing that with discoverability so I can search for things, utilizing the metadata and the documentation to add context, and making sure that we've certified the content that we really want people to be looking at. Thank you, Mike. 
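To make the publish-certify-discover flow Mike just demonstrated concrete, here is a minimal sketch of how a catalog entry might be modeled. Everything here is hypothetical for illustration — the `CatalogEntry` class, its field names, and the example groups are not Metric Insights' actual schema:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CatalogEntry:
    name: str                        # e.g. "Sales Dashboard"
    source_tool: str                 # "Tableau", "Qlik", "MicroStrategy", ...
    owner: Optional[str] = None      # point of contact for questions/access
    tags: list = field(default_factory=list)
    certified_by: Optional[str] = None
    certified_on: Optional[str] = None
    discoverable: bool = True        # shows in search even without access
    allowed_groups: set = field(default_factory=set)

    def can_open(self, user_groups: set) -> bool:
        # Security: only users in an allowed group can open the content.
        return bool(self.allowed_groups & user_groups)

    def visible_to(self, user_groups: set) -> bool:
        # Discoverability: users also see locked entries they could request.
        return self.can_open(user_groups) or self.discoverable

entry = CatalogEntry(
    name="Procurement Dashboard",
    source_tool="Tableau",
    owner="data-steward@example.com",
    discoverable=True,
    allowed_groups={"procurement"},
)

# A sales user can discover the entry (shown with a lock) but not open it,
# so the UI would offer a "request access" action routed to the owner.
print(entry.visible_to({"sales"}), entry.can_open({"sales"}))  # → True False
```

The key design point is that visibility and access are separate checks, which is what lets the locked-but-discoverable tiles in Mike's demo exist at all.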
And I think you've got a sense from Mike's description that to achieve this, it's not just about technology, it's not just about having the right tools. It's a combination of aligning the tools with the right process. So, you need to think about what the process is that you want to put in place to create data literacy in your organization, how you cover those key criteria that we discussed, and then you can either use a tool like Metric Insights or you can roll your own solution that accomplishes that. So, moving on to the next piece of the puzzle: it's all about optimizing resources. Here we're talking about managing the content, the licenses, and the people to make optimal use of the scarce resources you have to generate business value. So, what does that mean? Well, think about the way that resources get allocated. What does your team work on? Which dashboards do you work on? In a less well-managed environment, what ends up happening is that whoever shouts the loudest wins, right? The teams tend to go work on the projects that the business is yelling about. And obviously, input from the business is very important; you do want to factor that in very carefully. But just because somebody asks for something, and they ask for it very strongly, and you built it, doesn't mean that you actually generated value. Because it really is a function of: how is that content used six months from now? And if it's not being used, then what was broken in the process that created that content? Did you misunderstand the user journey? Did the user misunderstand the requirement? Was there something else that was fundamental? So, is there a process in place that fine-tunes the selection process for what gets worked on, and then reinforces for the team the ability to know that they're working on the right things? That's very important. And what about the people who are not shouting? 
Exactly, that's a great point. As Mike just said, sometimes there's something very important, high value out there, and it's just that that particular user does not happen to be very vocal. So, this challenge is really about saying, okay, take a look at your BI content, your license expense, and the team time that you're allocating, and make sure that it actually generates usage and ROI, that you understand what that is and can go back and try to quantify and measure it, and then iterate on this process such that you're continuously improving. You know, governance is all about sustaining this process. So, we want to share with you a methodology for doing this. If you look at this diagram, you'll see the two areas to optimize: content and resource optimization. We maintain that there's a certain process you should be following to make that happen. The first thing you do always is, of course, build and deploy solutions. You implement a set of dashboards and reports, you put it out there, you train your users, you get them in the system. But it shouldn't be fire-and-forget. You shouldn't be building these things and then just immediately moving on to the next thing without coming back, reviewing, and providing accountability. So, the next step, after that content is out there, is to make sure you're measuring engagement: to understand, over time, what are the usage patterns for this content? Does it have a flurry of activity initially and then die off? Or is there sustained usage? Who's using it? All those things inform a deeper understanding of how effective the launch of that content is and how that effectiveness is sustained over time. Based on this information, you may want to promote and purge content. So I might say, well, it doesn't look like this is effective. I'm going to remove it. 
Or, you know, we should be having a very large community using this, but I only see a few users; this particular group doesn't seem to be engaging with it. Let me promote it to them and see if, by bringing it up into their email and making them aware of it, we can start boosting the engagement. That information, and that process of measuring usage, promoting content, and looking at how effective that promotion is in terms of driving long-term sustained engagement, can then be used to optimize your BI resources, right? So, focusing on the resource side, on the user side, you can say: well, how are my users using my BI licenses? Which are the users that are not using them? Maybe they have not used the license in three months or six months or nine months, even though I'm paying for it. How can I deprovision those users and reallocate those licenses to users that actually will be using them? And then, what's the mechanism such that a user that has been inactive for a while can come back and request access? So, make sure that there's a constant process to fine-tune and maximize usage of those expensive licenses. And then, the same information should inform the management of the team. So, take a look at, you know, what were the projects that we worked on six months ago? What kind of usage and engagement did we get from them? And what does that say about how we should select projects for the next six months? What are the areas where actually there's lots of usage and, as Mike said, people are not particularly vocal? Maybe we should focus in those areas, because those might be ripe for further analysis. And where are we listening to the squeaky wheel but not getting sustained value? And then, all of this closes the loop, going all the way back to tracking usage. 
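The deprovision-and-reallocate step just described could be sketched as a simple policy. The 90-day threshold and all names here are assumptions for illustration, not a prescribed implementation:

```python
from datetime import date, timedelta

def deprovision_candidates(last_activity, inactive_days=90):
    """Return users whose licenses can be reclaimed.

    last_activity maps user -> date of last use (None = never used).
    A user is a candidate if they have no activity within the window.
    """
    cutoff = date.today() - timedelta(days=inactive_days)
    return sorted(
        user for user, last in last_activity.items()
        if last is None or last < cutoff
    )

usage = {
    "alice": date.today() - timedelta(days=5),    # recently active, keep
    "bob":   date.today() - timedelta(days=180),  # idle half a year
    "carol": None,                                # never logged in
}
print(deprovision_candidates(usage))  # → ['bob', 'carol']
```

In practice the reclaimed licenses would feed a request-access workflow like the one described above, so an inactive user who comes back can simply ask for a license again rather than being locked out permanently.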
So, based on that work that you're doing, based on the refinement, you look and say: okay, what is the usage and engagement across not just the use case that I built six months ago, but across my entire platform? And based on that, what changes do we need to make in organization, in alignment of resources, in categorization, tagging, et cetera, and in what gets built next? Iterating on this process continuously just keeps on upping your game from a BI governance perspective and gets you to the point where you're really optimizing the resources at your disposal and maximizing business value. To do this, you obviously need to have usage tracking, you need visibility into BI tool license utilization, and you need mechanisms by which you can promote and purge content. So, let's look at some examples. Okay, so let's go back over to our Metric Insights instance here. The foundation behind this, I guess, is having a consistent way of capturing that usage data. A lot of the tools that you'll be working with probably have some level of that; the challenge becomes, in this sort of heterogeneous environment, how do you get the real picture of what's going on when there's content in different tools and people are consuming it at different touch points? It becomes challenging to do that. So, by having all this data together in this catalog environment, we can start to look at it more holistically. What Metric Insights is actually doing is capturing usage data at every touch point where it might happen. So, if you think of a dashboard, it could be being accessed through the catalog that we just looked at, because people are searching for it or clicking on a tile. It could be being accessed directly in an underlying tool like Tableau, where they bookmark that particular piece of content and go back to it every day. 
It could be that that content is being sent out in email, and I'm consuming it through a distribution, and that's how I have access to it, or on a mobile device. Essentially, we need to look at every touch point to get an accurate view of whether this piece of content is being used. And that's what's happening here: we're capturing this number of views, this engagement number, across the different data sources that we might be connecting to for this information, across the different categories of content that we're organizing this in, and across the different users and user groups within the business. What that starts to give us over time is a view into what content at this point in time is most popular, which may drive other areas that we want to build content in; what's increasing in popularity over time; but also what's decreasing in popularity. BI is this organic beast where what I created a year ago might not be useful now, and what I'm creating now might not be useful in a year, so we need to be on top of how content is being used and what the life cycle of that content is, as Marius said. And then, ultimately, what are we creating that is actually going unused? And for whatever reason, is it something that users don't know about that we should be promoting and shouting about? Maybe we set up a distribution from the catalog that sends this out as a newsletter with new content that's available to users. Or, again, has it even dropped off of this curve on the right here, and actually it's just not useful anymore? I should be able to look at that and take action on it. It's not enough to just look at these high-level numbers. As Marius touched on, you really need to understand the engagement patterns with the content over time. So, even though I see something like the Tableau sales analysis dashboard that we've been looking at has what appears to be fairly high engagement over the last 60 days, what did that actually look like? 
Do I know whether that was because I promoted it on day one and 1,800 users looked at it once and never looked at it again, or am I getting consistent engagement with that content? So being able to drill into it and get useful visualizations on that content matters. This particular diagram shows the report in the middle, and as users sort of bounce in and out of the diagram, it's basically showing me who's coming back in and looking at this content over time. You'll see a bunch of users drifting to the outside; well, that's where they kind of looked at it once and then never came back in. So by understanding at a very detailed level how this content is being accessed and what those engagement patterns are like, I can start to make some intelligent decisions about what I do with it. Do I remove it? Do I promote it? Is it being used in the way that I would expect it to be? So content engagement, looking at it from both the content perspective and the user perspective: which groups of users are engaging, which ones are shouting the loudest and getting content, and who isn't using stuff because they haven't been shouting and we haven't been creating content for them. What are users searching for? This comes back down to our metadata discussion and how we categorize content. Are people searching for things and finding reports or dashboards to answer those questions, or are they running searches that ultimately are not bringing back content, which may inform how we tag or categorize things or what new content we create within the business? So usage is key, and tracking that at the macro level across all the different tools and content that you have out there. And then the second piece of the puzzle is tying that back to the costs, the spend that you're making on the different tools that you have, as Marius set up front. It's not uncommon for people to be investing in licenses that are just completely going unutilized in the business. 
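The day-one-spike versus consistent-engagement distinction can be approximated with a simple heuristic over daily view counts. The thresholds below are illustrative assumptions for a sketch, not anything from the product:

```python
def engagement_pattern(daily_views):
    """Classify how a piece of content is consumed over a window of
    daily view counts. Thresholds are arbitrary illustrations."""
    total = sum(daily_views)
    if total == 0:
        return "unused"
    # If one day accounts for most of the views, it was a one-off spike,
    # e.g. promoted on launch day and then abandoned.
    if max(daily_views) / total > 0.8:
        return "one-time spike"
    active_days = sum(1 for v in daily_views if v > 0)
    if active_days / len(daily_views) >= 0.5:
        return "consistent"
    return "sporadic"
```

A 60-day series like `[1800, 0, 0, ...]` classifies as a one-time spike even though its total views look healthy, which is exactly the case the high-level numbers hide.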
And so in times when we're having to justify every dollar that we're spending across the business, this becomes critical for justifying your BI programs. And so taking that usage data that we just looked at, and mapping it against the licenses that we have and what we're spending for each of the tools that we have, we can start to automate the process of understanding, okay, how are those licenses being used? Are we giving them to users who ultimately never come into the tool? Are we giving users licenses sort of above their grade? You know, a content-publisher license on a particular tool, when all they ever do is receive a dashboard in their email, and maybe we can downgrade the license. Looking at that gives us potential savings around licenses based on what is actually being used in the content that we're creating. And then having a process that allows me to either enable users to request licenses if they don't have them, or automatically deprovision licenses if they're not being used, helps us manage that cost and justify that cost in a much more effective way. Thank you, Mike. So let's look at the last piece of the puzzle. We've established data literacy, we've optimized resources. The last key piece is generating trust, in both the BI team and the content. And when we think about this, I think it's important to start with the premise that no one's going to be perfect 100% of the time. So no matter how good your data pipelining process is, you're going to have days where something happens downstream that's a problem. There's something wrong with the data feed, the Tableau data extract fails, something goes wrong. And so trust really isn't about being perfect; it's really about what you do when you're not perfect. How is it that users understand what's going on? 
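The license right-sizing logic just described can be sketched as a simple rule: map each user's observed usage against their license tier and recommend keep, downgrade, or deprovision. The tier names ("creator"/"viewer") and the 90-day window are assumptions for illustration, not any vendor's actual licensing model:

```python
def license_action(tier, logins_90d, email_only):
    """Recommend an action for one user's BI license based on usage.
    tier: assumed license level, e.g. "creator" or "viewer".
    logins_90d: direct logins to the BI tool in the last 90 days.
    email_only: True if the user only consumes email distributions.
    """
    if logins_90d == 0 and not email_only:
        return "deprovision"          # paid seat, never used at all
    if tier == "creator" and email_only:
        return "downgrade to viewer"  # licensed above their grade
    return "keep"
```

Run over the whole user population, rules like this turn the raw usage data into a concrete savings estimate per tool.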
And so in a well-governed environment, there's transparency, and a business user knows whether the data is accurate or whether the data is delayed or if there's any kind of issue with the data. And they know it in a timely way; they don't discover it on their own and then inform the BI team. You know, you don't get these angry emails or calls saying, hey, what's going on, this data is bad. Instead, before the users even look at the data, they're given the context of, hey, there's a problem and we need to take a look at it. And that's absolutely key. So to do that, you need really three things. You need to have good quality alerting, right? You've got to know that there's a problem, and then that alerting needs to go to the right stakeholder to take action. Similarly, there needs to be communication around that. So just having the alert go to an analyst if there's a problem is probably not sufficient if there's not a proactive way to communicate to the end user that, hey, the data in today's report is either delayed or it's suspect. There's something wrong with the pipeline; don't take a look at it. You know, wait until later today, until we've resolved it. And then that messaging needs to be tailored. So, you know, if I'm in a manufacturing group, I don't want to see a message about a problem with the data that's happening in finance. I need to just know about any issues that are affecting me and the content that I consume. So let's take a look at that. Okay. So let's go back to our browser here. There are really two ways to think about data quality, I think. A lot of times data quality is thought of in terms of the ETL process. You know, what's happening with my ETL? Is it getting through each of the steps successfully? But that doesn't always paint a picture that says the data is 100% correct. 
You know, it might complete, but we missed a bunch of records for whatever reason downstream. And so in a governed environment, using a tool like Metric Insights, what we can start to do is actually tap into some of the data behind these dashboards and reports that we're pulling in. So in the Metric Insights world, we can use our plugins to start tracking different metrics. Those metrics might be sort of IT focused, but they also might be business metrics where, if we track them, a big change could be business related, or it could be an early indication that something's going wrong in the data. Either way, we can start to generate these time series metrics that allow us to easily automate the process of looking for outliers. An example might be here, where we're tracking the row counts, the number of records that are getting loaded into a table, into a data set that is driving that Tableau dashboard that we looked at right at the beginning. By tracking this on a daily basis, when the Tableau dashboard updates, typically we are getting a similar number of records every day, obviously with some fluctuation. But by having Metric Insights, in this case, perform some automated statistical analysis against that time series, any time we see an outlier, such as a drop here, where it looks unusual and it's outside the norm, we can use this to start alerting people of issues. And obviously that might be something like sending an email to the analyst, the person ultimately responsible for the dashboard or the data behind it, to say, hey, something has happened in here. But it's also, to Marius's point, about driving a process that ultimately communicates this to the end users. So it may be that this alert actually sort of stops the press. 
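The automated statistical analysis on the row-count series can be approximated with a simple z-score test: flag any daily load that falls well outside the historical norm. This is a minimal sketch of the general technique with made-up numbers, not Metric Insights' actual implementation:

```python
from statistics import mean, stdev

def row_count_outlier(history, today, z_threshold=3.0):
    """Flag today's load if its row count deviates from the historical
    mean by more than z_threshold standard deviations."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > z_threshold

# Typical daily row counts for the table feeding the dashboard.
history = [1000, 1020, 980, 1010, 995, 1005, 990]
```

With this history, a load of 400 rows trips the alert while 1,008 rows does not; the alert handler would then email the responsible analyst and hold the dashboard update.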
We don't go and update the Tableau dashboard in this case, because we know there's probably something wrong with the data and we want to take a look at it. So we stop the update. But also, if we jump across to our other user here, Avery, in our other browser, we might want that alert to automate a message into the catalog or the portal for any users who have access to that particular Tableau Sales Analysis dashboard, to say, hey, we found some issue with the data that's sourcing this within our warehouse. It's failed, it's under investigation, whatever the custom message is that makes sense in your environment. And having that automated means users coming in can immediately see, okay, something's going on; I probably shouldn't just look at this dashboard and completely trust it. That message is often automated; obviously, it could be manually added as well. But ultimately it's about communicating that back to the user. And so whether it's using sort of IT metrics like we've looked at, or just ingesting the business metrics from that dashboard (the sales data in this Tableau dashboard, say), maybe we want to track those business metrics and use them as a way of determining whether there's an issue with the data. Just because there's a drop in sales in a particular region doesn't necessarily mean that that's 100% accurate. Maybe it's an indication that the data load on that day was incomplete. And so rather than sending this to my Canada regional manager and saying, hey, what's going on, and then having him come back at me only to find out that the data isn't complete, I can check this before it goes out to him or her. And we've got a better relationship around that. So using data as a way of ensuring quality, using automated analysis as a way of ensuring quality, means that we can start getting that trust back with the business users. Just to add one thing to what Mike said there: that message could just as well be telling you about delays. 
So maybe the data is fine, but it's running behind schedule; you can generate a notice, and that creates trust, because users then don't go in and start looking at the dashboard only to realize that, oh my gosh, I'm looking at data that's actually old because it didn't refresh today. So just to wrap up: we've shown you the key pieces that we believe need to be in place to achieve effective BI governance. Once you've put these pieces in place, you can establish that happy coexistence, that balance, between security and discoverability. You can transform the jungle into the garden: declutter, clean up, organize, catalog, pull in your extended metadata, make it a museum that people love to visit. Establish trust, both in data and in reporting, as we've shown you, with transparency and alerting and proper governance around the pipeline of data all the way into the BI tools. All these things in turn give you the ability to increase utilization of your BI tool licenses and make sure that the spend is going to the right place. And then all the usage analytics across all the BI tools within your ecosystem, and what happens as you promote and track engagement around that, can inform the BI team to make sure that they're in fact working on the right projects in the right way, such that they are generating solutions that have sustained business value. So with that, let's open it up to Q&A. Mike and Marius, thank you so much for this fantastic presentation. We've got a lot of questions coming in for you both. And just to answer the most commonly asked question: as a reminder, I will be sending a follow-up email by end of day Thursday with a link to the slides and the recording, along with anything else requested throughout the presentation. So, diving in. How does BI governance relate to DataOps? It seems the objective of BI governance should be a DataOps capability. 
So I think DataOps is a part of the BI governance solution. Again, going back to that slide where we looked at the fact that BI governance is not just about data: clearly, without an effective DataOps structure, without the tool set and the proper level of quality checks and pipelining and all the technologies that typically go into that DataOps infrastructure, you cannot have a properly governed environment. So I would characterize proper DataOps as a prerequisite. But we should also recognize that while it is necessary, it is not sufficient. Let's see what else. Should I go to the next one? So, should certification apply at the, quote unquote, data level or at the BI report? Where should certification be applied? Oh, that's a great question. And I think some of this really depends a bit on the organization and how people work with the data. But generally speaking, we find that there are different kinds of certification. So for the data sets that you're working with to build a report, oftentimes the certification there is denoted by the individual who's in charge of creating that data set. So whether that's a Tableau data set or a view inside your Snowflake environment or Teradata, the person who created that logic can apply certification to that data set to say: this logic is applied in a way that is consistent with our published business standards and rules for the metrics that are being displayed here. That's one level of certification. Another level of certification needs to be at the visualization level, because there, the Tableau developer or MicroStrategy developer or whoever is building that content is basically certifying to say: I've taken this data and I have represented it in my visualization in a way that is consistent with our definitions of the key terminology, which, by the way, I've documented here along with the visualization. 
So that piece creates a continuity of certification from the data all the way to the user. And how often should I recertify something? Yeah, so that again depends upon the data. We've seen situations where certain kinds of content can just be certified once and maybe revisited once a year, or on some fairly infrequent basis, just to make sure that changes have not happened in core business logic. There are other dashboards that, for banks, for example, might represent financial statements: you know, what are my results for this quarter or this year? If it's a monthly financial report, it might need to be certified every month, such that you know you're looking at results for the prior month that have been fully vetted through an internal audit process. So it really depends on the data, the nature of that information, what the end consumer of that data needs to know, and how frequently that needs to be reset. So you really look at it on a case-by-case basis. How, for example, would this work in an international statistics office, where you conduct household surveys, analyze the data, and then publish? I mean, I think it's the same kind of approach as you would have in any organization where you've got a large amount of data collection. We've seen similar issues with survey data in pharma and other places. What you're looking for in that situation, first of all, at the data level, is anomalies that might reflect issues in data collection, right? So whenever you have a dispersed way of collecting data, you're looking for situations where there's any kind of anomaly across the various dimensions of that data that might indicate a collection issue. So that's tier one. And then tier two, once the data has been validated, is the question of, you know, how is that data represented? 
Where there are multiple visualizations representing it, are they representing it in a consistent way? What kind of governance have you put in place to ensure that there is consistency across representations to the audience, whether that's an internal stakeholder or external stakeholders viewing this in some kind of a portal? And then finally, what is the type of documentation that you need to provide those stakeholders to foster data literacy and create the necessary level of context? So the definition of what a term means: does that live consistently all the way into the visualization and the documentation provided with that visualization, so that the user is interpreting the data correctly? And that seems like a scenario where recertification probably happens quite a lot, right? Because the data is being collected and could be very different every time, versus, oh, it's just the standard feed that's coming from this system, and, you know, once I know it's right once, then I'm pretty confident with it. That's great. Exactly right. And in fact, in some of these situations, what we've seen is an automated de-certification process. So as Mike said, when the new feed comes along for the next month or the next period, once that feed has been applied, that content gets de-certified and moved into the workflow where it then has to be recertified through that kind of process. So it's not one-size-fits-all. Take a look at the data, how it's consumed, how it changes, and then come up with the right certification and governance flow that is tailor-fit for that particular use case. And there are a lot of questions here about API integrations and a list of tools that Metric Insights works with. Yeah, so we have a whole bunch of plugins. I'm seeing SSRS here; we've worked with that. I have to check on encoder, not sure. 
But if you go to help.metricinsights.com, you'll see a bunch of our plugins. We are expanding our plugin list on a daily basis; half our development team is working on plugins, so as customers request them, we're building new ones. It's an ever-expanding list. But our ultimate goal is to be able to connect to any source of BI in your organization, as we've spoken about. So if we don't have it today, we can have it. We have a lot today. I love it. So, yeah, if you put the link in the chat or tell me the link, I'll make sure to get that out in the follow-up email as well for everybody. Awesome. So I think we have time for a few more questions here. So: while I believe BI governance is considered by many as part of a data governance program, could an organization start their governance initiative by simply focusing on BI governance first, rather than trying to boil the ocean? Do your clients do this? Yeah, absolutely. And I think that that's the right approach. I mean, we've given you a fairly comprehensive view of what BI governance is all about. But I want to make sure that, by all means, we're not suggesting that you have to do it all at once. In fact, I think that's usually the wrong strategy. It typically needs to be a crawl, walk, run. And you need to look at that spectrum, you know, the different dimensions we presented that you're working across, and prioritize and see where the biggest pain points in your organization are. Are they around data literacy? Are they around trust? Are they around, you know, the kind of notifications we showed you? And I think then prioritize where you want to focus accordingly, right? So maybe start by pulling it all together in a way that makes sense for your organization. Yeah, I think it's a good point. 
Because I think, you know, data governance is kind of the hot topic, and organizations are spending money and buying tools to try and figure it out at the data level. And that's a big beast to deal with, and oftentimes it never sees the light of day in terms of what you're sending out to the end users. A way that a lot of customers have succeeded is to really approach it from the report and analytics perspective. If I'm going to certify and govern a piece of content from the end-user perspective, then that's ultimately going to require me to go and certify the data behind it as well. But again, I'm not boiling the ocean and trying to certify every data point I've got in the organization; I'm doing it in terms of what people are actually consuming. Yeah, and that's where you'll have the highest risk of creating issues for people if they don't understand it. So across many different vectors, you want to be very, very thoughtful and prioritize where you spend the effort. So, does the search just use keywords, or does it use any type of sentiment analysis? The search is keyword-based, but not just keywords; there's a natural language component to it as well, so you can run more of a, you know, regular-language search. And Mike didn't show you this because it wasn't a detailed product demo, but the same search is actually possible from Slack or from Microsoft Teams. It searches across the name, the description, all the tags, and the extended metadata that you loaded either manually or through integration with tools like Collibra or Alation, or Excel spreadsheets where you might have some additional metadata. So it searches across all of that, based on either a simple keyword or more of a natural-language type of search. And the results are ranked, obviously, based on relevancy of search terms, but also things like engagement, right? 
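The kind of ranking being described here, keyword relevance boosted by engagement, can be sketched in a toy scorer: count how often the search terms appear in an item's name, description, and tags, then weight by its view count. This is purely illustrative and not Metric Insights' actual ranking algorithm; the catalog records are made up:

```python
import math

def rank_results(query, items):
    """Return item names ordered by keyword relevance times an
    engagement boost. Scoring scheme is an illustrative assumption."""
    terms = query.lower().split()

    def score(item):
        text = " ".join([item["name"], item["description"], *item["tags"]]).lower()
        keyword = sum(text.count(t) for t in terms)
        # More-viewed content ranks higher for the same keyword match.
        return keyword * (1 + math.log1p(item["views"]))

    return [i["name"] for i in sorted(items, key=score, reverse=True) if score(i) > 0]

catalog = [
    {"name": "Sales Analysis", "description": "regional sales dashboard",
     "tags": ["sales", "tableau"], "views": 500},
    {"name": "Sales Forecast", "description": "pipeline forecast",
     "tags": ["sales"], "views": 10},
    {"name": "HR Headcount", "description": "staffing report",
     "tags": ["hr"], "views": 900},
]
```

For the query "sales dashboard", the heavily used Sales Analysis dashboard outranks the rarely viewed forecast, and unrelated content is filtered out entirely.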
Yeah, the popularity of content that's coming back in the search. I wouldn't say sentiment per se, but, you know, it does take interaction with the content into account as well when bringing back the results. Alrighty, there are so many great questions here; some of the questions I think we could have full webinars on. I'm going to try and fit a couple more in here. How do you manage certification in a complex enterprise where there are lots of stakeholders who need to certify content? So that really comes down to designing a certification workflow that fits with those stakeholders. The examples that Mike showed were both fairly simple, but they're very different kinds of certification flows. The first one was very basic, where a user just certifies their own content. Typically, maybe small organizations might do that, or maybe when you're starting out you would do that. Oftentimes in large organizations there are many users, and sometimes users might not even certify their own content; there might be multiple layers of certification. So the idea is to enable a workflow that supports the needs of your organization. To give you an example, many of our customers use something called content sync, where all of the content that is being created in a particular area of a BI tool is brought into a staging area within Metric Insights. Then a user can use one of those pages that Mike showed you, where they can see all of the items in that staging area that they own, the ones that they created themselves or are owners of. And then they can review each of those items and indicate if they certify it, in which case it then gets moved and published into the right place, where it's visible to the users. 
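The staged, distributed certification flow being described here, together with the automated de-certification mentioned earlier, can be sketched as a small state machine: each stage has an owner who must approve before content advances, and a new data feed drops content back to the start. Stage names, the owner mapping, and the feed hook are all hypothetical, not a Metric Insights API:

```python
STAGES = ["staging", "data_certified", "report_certified", "published"]

def advance(item, approver, stage_owners):
    """Move content one step along the certification pipeline,
    but only if the approver owns the item's current stage."""
    stage = item["stage"]
    idx = STAGES.index(stage)
    if idx == len(STAGES) - 1:
        raise ValueError(f"{item['name']} is already published")
    if stage_owners[stage] != approver:
        raise PermissionError(f"{approver} does not own stage '{stage}'")
    item["stage"] = STAGES[idx + 1]
    return item

def decertify_on_new_feed(item):
    """Automated de-certification: when the next period's feed lands,
    content returns to staging and must be recertified."""
    item["stage"] = "staging"

owners = {"staging": "data_steward", "data_certified": "bi_developer",
          "report_certified": "bi_lead"}
dashboard = {"name": "Sales Analysis", "stage": "staging"}
advance(dashboard, "data_steward", owners)   # data certified
advance(dashboard, "bi_developer", owners)   # report certified
```

The per-stage ownership check is what makes the process distributed: the data steward, report developer, and BI lead each gate only their own step, while a central team can still monitor where every item sits.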
Of course, as part of the certification process, they're adding metadata and conforming to whatever standards have been specified for certification. Or they may say, you know, this is a piece of content we don't really want to publish, in which case it just stays in a non-visible area. So that's just one example. That might be multi-stage as well. So I may see a piece of content that's coming into my area, and it may be that I say, okay, this looks useful, but I actually pass it on to someone else to do the certification. Ultimately someone certifies the data, then it goes to someone else to certify the report, and then it ultimately gets published into another area. But effectively what you're doing is, at each stage, opening that piece of content up to a certain user who is responsible for that stage, until it ultimately gets put in a category that is available to the end users. But in all cases, it's a distributed process, right? In any kind of larger organization, it is not one person or even a single group of people that certify content. To be effective, it needs to be pushed out into the lines of business, to the folks that actually know that content, maybe the authors of that content, with a workflow that supports that distributed process and some centralized governance and monitoring to understand, you know, how much adoption is there in that area? Are there any issues, things like that? Well, Marius and Mike, thank you so much for another great presentation. Great to have you guys back with us. And thanks to Metric Insights for sponsoring today's webinar. I'm afraid that is all the time we have for today. Again, just a reminder, I will be sending a follow-up email to all registrants by the end of the day Thursday, with links to the slides, the recording, and the additional information requested. So, I hope everyone has a great day out there and stay safe. Thanks, everybody. 
Appreciate it. Thanks, Mike. Thanks, Brian. Thank you all for joining us.