I'm Shannon Kemp and I am the Chief Digital Manager of DataVersity. We'd like to thank you for joining today's webinar, Data Maturity: A Balanced Approach. It is the latest in the monthly webinar series from IDERA. A couple of points to get us started, due to the large number of people that attend these sessions. You will be muted during the webinar. For questions, we will be collecting them via the Q&A in the bottom right-hand corner of your screen. Or if you'd like to tweet, we encourage you to share highlights or questions via Twitter using hashtag DataVersity. And as always, we will send a follow-up email within two business days, containing links to the slides, the recording of the session, and additional information requested throughout the webinar. Now, let me introduce to you our speaker for today, Ron Huizenga. Ron is the Senior Product Manager of Enterprise Architecture and Modeling at IDERA. Ron has over 30 years of business and IT experience as an executive and consultant, spanning a diverse range of industries. His hands-on experience in large-scale enterprise initiatives includes enterprise data architecture, business transformation, and software development. His background provides practical, real-world insight into enterprise data architecture, business architecture, and governance initiatives. And with that, I will give the floor to Ron to get today's webinar started. Ron, hello and welcome. Hello, Shannon. Thank you for having me today, and welcome, everyone. This is, I think, a topic that's near and dear to my heart. I've talked about variants of this before, but I've put together a slightly different perspective here, and I just want to talk about how you really achieve data maturity in your organizations. Because when a lot of companies really look at how they're doing from a data perspective, they come to the realization that their maturity level is actually much lower than they think it is internally.
And then you really need to start digging into the factors as to why that may be. So with that, we'll talk about some of that, because achieving data maturity is becoming more difficult all the time, and a big part of that is simply the environment that we find ourselves in. So I'll start by talking about what I call the data ecosystem and the complexities that we're dealing with there, just to set the stage at a very high level. And then we'll actually look at some examples of how companies are doing in terms of information capability in general. From there, I want to take a step backwards, and we'll talk about where maturity standards came from and how they evolved overall. We'll spend a few minutes on that. And then we'll talk about both data maturity and process maturity, and some indicators of those in concert with one another, because when we really look at it, we need to look at those together. The supporting considerations for data in particular are understanding things like the data value chain and the lifecycle of data. We really need to understand all of those aspects, plus the enterprise architecture and governance implications as well. From there, I'll get into some modeling specifics of how we can actually help ourselves achieve maturity on both the data and process side of things. And then we'll wrap it up with a brief summary. Now, as most of you know, my Twitter handle is Data Aviator. That is because I have a keen interest in aviation, which is why you'll see some aviation-themed clip art here throughout. And of course, this is our pre-flight checklist. Now we'll actually start launching into our journey and move forward. I talked about data ecosystem complexity, and when we look at a lot of organizations, we see a lot of different types of things happening.
And we're trying to ingest a lot more types of data in our organizations now than we were before. Everything from raw data from IoT sensors, social media feeds, flat files, and other data feeds, in addition to the typical relational structures. And of course, a lot of us are now utilizing NoSQL data structures in our organizations as well. We're trying to take in information from both inside and outside the organization at any given point in time. So what that really sets up for us is we've got all of these different types of information that we're trying to ingest into our organizations, or that are originating in our organizations, and we're trying to manipulate them so we can make sense of them. So we end up with all kinds of different categorizations of data and all kinds of places that data are stored in the organization. That ranges from raw transient data, which is where we land some of the external data when it comes in, to sandboxes that our data scientists use to try to prove out the conclusions that they're trying to draw from data, analytics queries and those types of things, just to find if there's anything useful in that data that they can carry forward and use further in the organization. What that really means, though, is we need graduated approaches for how we handle our data. So once we've gone through the raw data, we need to say what data is approved for usage that we're going to take downstream, and what we're going to dispose of. So we quite often end up with more data stores. And ultimately we end up with what we would call our trusted data stores, and those can be a variety of different things: things like master data management stores built on different technologies, both relational technologies and many NoSQL technologies. And most organizations don't have just one.
We usually have a mix of a number of different platforms, including home-built databases, databases that come with ERP solutions, and NoSQL data stores. So trying to make sense of all this is becoming more and more complex all the time. What we're ultimately driving at is we want to take that information, some of which we're using in transactional systems, and move some of it forward to decision-support types of activities. So we need to refine or actually modify it somewhat to make it fit for use, to do the analytics and draw the conclusions that we want from it. Ultimately we're taking that into data warehouses, data lakes, and that type of thing, again introducing even more data stores, many of which are on premise, and more and more of which we're seeing as cloud stores as well. So we have a lot of different things going on. We're dealing with data that have different types of origin, internal versus external. We have different raw data types: raw flat files, relational databases, NoSQL data stores. And from a deployment perspective, we also have a mix of on premise versus cloud. So it can become a very complex picture very quickly. And ultimately what we want to get to in a lot of these cases is a point where we're actually able to drive self-service analytics and reporting, with people having confidence in the data that they're utilizing, as well as knowing what that data means and how they should be using it. That's the challenge at the end of the line, and achieving it is becoming more and more complex all the time. To prove that out, let's look at a few different things. There are a few different studies and surveys that have been done in different years, and I'm just going to quote a couple of them very briefly right now.
As recently as February of this year, in the Harvard Business Review, there was a study which actually found that the percentage of firms identifying themselves as being data driven has declined in each of the last three years, from 37% in 2017 down to 31% this year. That is a fairly significant drop, because when you think about it, more and more organizations are spending more and more money on things like AI, big data initiatives and other things, all with the objective of becoming more data driven and utilizing their information more effectively, but we're actually finding that the opposite is happening. So we need to take a look at why. Another study, one that I've quoted a few times as well, was done back in 2015, still very recent, by PwC and Iron Mountain, called Seizing the Information Advantage. They found several things, and these are all things that really tell us that there is a lack of data maturity. For instance, very few organizations are utilizing information to its full potential, and underlying that they see things like deficiencies in technical capabilities and skills, organizations that lack a data culture, and a lack of investment in value-driven information strategies. And when they really looked at organizations, very few actually know how to drive maximum value from the information that they have. At the same time, we're going after more and more information, so the return we're getting on it is less and less, because we need to start utilizing it more effectively before we pile more and more on top of it. So let's talk about maturity standards, bodies of knowledge, and those types of things. We see a lot of different models and a lot of different scales in use in different aspects of our business.
A lot of them go back to what was called the Capability Maturity Model, which was actually stood up in 1988, but it had roots that preceded that by many years. There was a gentleman named Watts Humphrey who worked for IBM for 27 years, and he was looking at different types of things like process maturity. He really started to elaborate on that further when, after he had retired from IBM, he was doing work for the Department of Defense while at Carnegie Mellon University. So that work started around '86 and came out as the Capability Maturity Model in 1988. Like I say, it was sponsored by the U.S. Department of Defense. It's been used as a basis to drive many other standards, because while there's a lot behind it, there's a lot of elegant simplicity as well in terms of a five-point scale where you rate yourselves in terms of relative maturity against the previous levels. That same basic five-point scale has been utilized in a lot of other things that are going on, and it has also spawned a lot of other maturity efforts. One that's fairly well known now is the Data Management Maturity model; I'll come back to that in just a second. But we also see it in other areas of business, like business process maturity. We see other standards like COBIT and ITIL and total quality management. All of these have some type of formalized measurement technique to evaluate yourself on some type of a scale, in comparison not only to how you've been doing in the past but to other organizations. Now, I did mention that I was going to talk about the Data Management Maturity model. That was released specific to data in 2014. Dr. Peter Aiken, a speaker that you've heard frequently on DataVersity and in other places, is a real champion of this and was very involved with it. The DMM took three and a half years of development.
It had major corporations as sponsors. Overall, the contributions to that standard came from over 50 authors, 70-plus reviewers, and over 80 companies. And what they came out with was a number of best practices. When you really start looking at the details of the DMM, they had at that time 414 practice statements as well as 596 functional work products. I'm not going to go into that level of detail today by any means, but I am going to talk about some things that are consistent with the themes they covered, because they had different areas that they looked at, such as data management strategy, governance, data quality, data management operations, and platform and architecture. And as you'll see as we move on, a lot of those overlap with things that we see from the data management body of knowledge as well. And of course, to make sure that we also cover those: there is the Data Management Body of Knowledge, or DMBOK; version two has been out for a while now. But we also see other things in use in our organizations: equivalent bodies of knowledge such as the business architecture body of knowledge and the business analysis body of knowledge, as well as the project management body of knowledge that many organizations are familiar with. What I'm really driving at here is that we have a lot of standards and a lot of bodies of knowledge whose expertise we can bring to bear in our own organizations, so we can actually start to elevate our maturity in how we're dealing with not only data, but also process. To take a step back, let's just look at what maturity is overall. A very simple definition: being mature really means having reached an advanced stage of development. And that's probably a good enough statement, because basically it implies growth.
We're starting at one point and we're moving to another one. Organizational maturity overall requires what I call the balanced approach, and that is you need to have a balance between data maturity and process maturity. You cannot achieve one without the other; it's virtually impossible to do so, and I'm going to dig into that a little bit further. Having both of these types of maturity in lockstep is fundamental to really driving initiatives such as enterprise architecture and its different components that I'll talk about, and also serves as a stepping stone and supporting structure to allow governance in your organization: data governance, process governance, and other things as well. So I'm going to talk about the balanced approach a little bit here, but first I'm going to take the five-point scale. Interestingly enough, even while all these other efforts were going on, I've been using this slide for years. Like I said, the DMM came out in 2014; I put this slide together, I think, in 2009 or 2010. And there's a lot of overlap, because all these different things, as I said, have their fundamental basis in that Capability Maturity Model. The CMM has a scale from one to five, and in this one I actually inserted a level zero, because I've seen some organizations in my consulting career that had virtually no data maturity at all. So I put a zero in here just to show things that are happening in organizations that are really at very early stages and don't have that level of maturity. What I'm also trying to do here, rather than having a detailed formula to calculate your maturity, is to lay out a few basic categories so you can look at these basic areas in your organization. If you see some of these things, it gives you a very quick feel for where you might be placed relative to the maturity levels that you see.
So we basically go from a level of none, right up through initial and managed, up to optimized, which is from the CMM model as well. But we look at things like data governance. At the bottom end of the scale, you may have no governance whatsoever, but as you grow, you start to introduce it, possibly at a project level, then maybe a program level, to the point where it starts becoming ingrained in your organization through divisions, and ultimately enterprise-wide when you get up to an optimized level. A key indicator to look at is whether you're doing master data management in your organization. If you have no master data classification, you're probably near the bottom end of the scale. As you start to evolve, you may have master data, but it's not integrated. Then you start integrating it; maybe you have shared master repositories or MDM stores. Then you have standardized data management services, up to the point where you're growing that in addition to having master data stewards and a data stewardship council overall when you get to the top end of the scale. Data integration as well: if you have an organization that's characterized by many ad hoc point-to-point interfaces, versus a really good middleware implementation using things like service buses or data services, then you definitely have a ways to go in terms of maturity on the data integration front. And data quality is one that's very important. Do you have silos of scattered data, and a lot of inconsistencies that you just accept? As you start to mature, you start to recognize those inconsistencies and put plans together to address them, to the point where you actually have quality built in all the way through your practices, rather than data cleansing at consumption, which a lot of organizations tend to do. How many of you spend millions of dollars on ETL because you're taking data from source systems?
You're trying to apply some sort of cleansing so that you can then utilize it to make decisions. Full maturity means that you've actually gone back to the processes that are collecting and working with that data, and you're building the quality into those steps rather than cleansing after the fact. In terms of overall behavior with respect to data maturity, again, this is a category I put in. You go basically from being unaware that you have a problem, if you're at the very early stages, to the initial levels, where chaos tends to reign. Then you get to a point where you attain stability, and ultimately have a very predictive approach to how you're managing your data when you're at that top level of maturity. There's also a very interesting thing that you can look at here, and that is your IT focus in your organization. If you look at IT in general, if you're focused more on technology and infrastructure, you're probably at the bottom end of the scale, whereas if you're really looking at information and how to strategically enable your business with that information, then you're probably trending towards the higher end of the scale in data maturity. Part and parcel of that are risk and value generation. At the low end of the scale, you have high risk and low value generation, but as you mature, you're transitioning to low risk, to the point where you actually have very high value generation in your organization because you're utilizing the information correctly. Something I'm going to talk about a little more in a few moments with some examples: there's a direct correlation between the usage of data models in organizations and the level of data maturity. Each one of these dots that you see actually corresponds to one of those five levels.
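The point made here about building quality into the collection steps, rather than cleansing at consumption, can be sketched in code. This is a minimal illustration only, not anything from the webinar or from any particular tool; the field names and validation rules are hypothetical.

```python
import re

# Hypothetical validation rules applied at the point of capture, so bad
# data never enters the source system -- the mature alternative to fixing
# the same problems later with downstream ETL cleansing.
RULES = {
    "customer_id": lambda v: bool(re.fullmatch(r"C\d{6}", v)),
    "email": lambda v: "@" in v and "." in v.split("@")[-1],
    "country": lambda v: v in {"CA", "US", "MX"},
}

def validate_at_capture(record):
    """Return the list of fields that violate a rule; empty means accepted."""
    errors = []
    for field, check in RULES.items():
        value = record.get(field)
        if value is None or not check(value):
            errors.append(field)
    return errors

# A record rejected here never needs cleansing at consumption.
print(validate_at_capture({"customer_id": "C001234", "email": "a@b.com", "country": "CA"}))  # []
print(validate_at_capture({"customer_id": "12345", "email": "bad", "country": "ZZ"}))
```

The design choice mirrors the maturity argument: the same rules could be run inside an ETL job, but running them where the data originates means every downstream consumer inherits the quality for free.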
So if you're using data models only for documentation, or maybe some project-focused physical database generation and nothing beyond that, then you're probably towards the low end of the scale. But if you're utilizing the different types of models, ranging from conceptual to logical to physical, and really utilizing models as a design step, then you're increasing the maturity. When you get to a point where you're doing enterprise modeling, including canonical models, looking at data lineage, and also tracking governance metadata in association with those models, you're moving up even further. And ultimately, you want to get to a point where you're using fully integrated data modeling and architecture, business glossaries, and fully defined metadata, really allowing you to take those same definitions to drive things like your self-service analytics as well. Let's look at the other side of the coin now, which I call process maturity. The same type of thing happens here in terms of the processes in our organization. Same scale, one through five, initial right up through optimized, again borrowing those from the Capability Maturity Model. And here I'm just looking at some indicators that can show us where we might actually be in terms of process maturity in our organization. So if we look at the focus of our organizations: if people are really relying on personal methods to accomplish their work rather than standardized approaches, you're probably at a fairly low level of process maturity. But again, as you move up that scale, if you have standardized processes based on best practices in your different work units, ultimately leading to improvements all the way across the organization, then you're moving up to the high end of the scale. Same type of thing with efficiency.
And I'm not going to go through all of these, but if you look at the efficiency row, if you are inefficient with very few measures, and I'm talking about quantitative measures here to analyze your effectiveness, you're probably at the lower end of the scale. When you start moving up the scale, you're going to see things like repeatable processes and repeatable procedures that you can use, common measures and processes that are used across the organization, and then driving that out to multi-functional processes and having ownership or stewardship of the processes as well as the data. In terms of culture, quite often a low process maturity organization can also have a fairly stagnant organizational culture, where you don't really have the foundation to drive commitment and improvement in the organization overall. Whereas when you get up to the top end of the scale, you really have a preventative mentality or approach, where you have systematic elimination of defects and problem causes throughout the processes in your organization, whether it's in your data, your manufacturing processes, or any other processes that happen in your organization. From a pure business process point of view, if you have few activities that are defined and don't have current-state documentation of your business processes, you're probably at the lower end of the scale. And again, you want to work your way up so you have things like fully documented and standardized processes in the different departments in your organization, as well as the cross-functional dependencies in your organization fully understood. Up to the point where you're really doing continuous process improvement, always having quantitative feedback, and that means instrumenting your processes so you know if you're succeeding and how to improve them, by continually evaluating yourself against the measures that you've put into place.
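The idea of instrumenting processes for quantitative feedback can be sketched in code. This is a minimal illustration of the concept, not something from the webinar; the step name, metrics, and validation logic are hypothetical.

```python
import time
from collections import defaultdict

# Hypothetical instrumentation: wrap each process step so every execution
# records a run count, a defect count, and elapsed time. Continuous process
# improvement depends on exactly this kind of quantitative feedback.
metrics = defaultdict(lambda: {"runs": 0, "defects": 0, "total_seconds": 0.0})

def instrumented(step_name):
    def wrap(fn):
        def run(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            except Exception:
                # A failure in the step counts as a defect.
                metrics[step_name]["defects"] += 1
                raise
            finally:
                metrics[step_name]["runs"] += 1
                metrics[step_name]["total_seconds"] += time.perf_counter() - start
        return run
    return wrap

@instrumented("validate_order")
def validate_order(order):
    if "customer_id" not in order:
        raise ValueError("missing customer_id")
    return order

validate_order({"customer_id": "C001234"})
try:
    validate_order({})
except ValueError:
    pass
print(metrics["validate_order"]["runs"], metrics["validate_order"]["defects"])  # 2 1
```

In practice these counters would feed a dashboard or a process-mining tool, but even this small sketch shows the difference between guessing whether a process works and measuring it.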
From an architecture point of view, you may have a lot of disparate IT systems at the lower end of the scale. You may have some random services adoption as you start to become more standardized, but ultimately you get to the point where you're using full services, possibly service-oriented architecture, and a fully process-driven enterprise where everything is integrated together at the end of the day. Again, same types of scales here. When we look at things like productivity and quality: at the low end of the scale, you're going to have low productivity and low quality in your organization, and at the higher end, you're going to be driving higher productivity and value. Same thing with risk and waste: if you're at the low end, you're going to have high risk and high waste, whereas when you've actually matured, much lower waste and much lower risk. In terms of some key things you can look at: at the lower end of the scale, you may see certain predominant philosophies in your organization. If you have a predominantly cost-cutting mentality, you're probably towards the lower end of the scale, whereas as you move up, you're looking at more and more efficiency across the organization, and ultimately, when you get to the high end, you're looking at value generation. Again, I talked about chaos a little bit, but at the low end of the scale, you're probably experiencing chaos in the organization. As you move up, you're having more and more management and alignment of processes, and when you get to the high end, you really have a leadership-driven approach to driving process throughout the organization, starting with the CEO all the way down. Same type of thing here that we saw with the data modeling: if you're using process modeling just to do some basic documentation of processes, you're likely at the low end of the scale.
If you start using it to manage your business processes, then ultimately drive process improvement, and get to a fully mature state where you actually have quantified processes, then you're really going to have achieved process maturity in your organization as well. Now, this next slide I'm not going to go through in any detail at all. What I've done here, just so you'll have it in the handouts when you get the slides, is taken some typical types of behavior for both the data side of things and the process side of things. So if you see certain phrases, such as, for instance, gut feel decisions or that type of thing going on in your organization, that probably puts you down at a low or initial level of maturity. If you see things like master data stewards, which you see there in the fourth column part way down, that tells you that you're probably moving towards a more advanced approach in your organization, or if you actually have cross-divisional data governance, as an example. So these are just some of the telltale signs or behaviors in your organization that you can look at and say, if we're doing one of these, we're likely in one of these different areas across both data and process. I talked about a balanced approach. In terms of achieving organizational growth, you really need to have both. You can't achieve data maturity without process maturity and vice versa. When you really look at it, some organizations tend to be more data-centric, while other organizations tend to be more process-centric. But think of this as a journey where you have people going through an obstacle course and you're tied together with a piece of rope: you can't get very far from one another. So to get through this obstacle course, basically called organizational maturity, you always have to be evolving up the scale together. Perfection would be perfectly balanced, moving up that red line to achieve maturity.
Virtually nobody's able to do that. You're always going to fluctuate a little bit from one side to the other, and probably towards the side that your organization is mostly geared toward. But ultimately, you want to keep it in fairly close balance as you're going up that scale. So let's talk about enterprise architecture for a moment. People typically talk about the different domains of architecture, which are data architecture, business architecture, application architecture and technical architecture. I won't go into the detailed definitions of them, because I think most people are probably familiar with them. A lot of people talk about four domains. Our approach is different: we view it as needing a very solid data architecture foundation to really support the other aspects of enterprise architecture overall, because everything that we do has some type of representation in data. So we need that solid data architecture to support that. In terms of the other three architectures, business architecture is the central pillar, and business strategy as well as business process modeling fall into that. To fully understand a business, you have to understand the data and you have to understand the business processes of how that data is used. Then you have the application architecture, which is how you're actually delivering these capabilities to your organization, as well as the technical architecture, which comprises the components that support that. When you have a balanced approach across all of that, that's where you really achieve enterprise enablement. And that in turn is a structure that really allows you to achieve governance. You have to have this to make sure that you can actually achieve things like data governance, which so many people are struggling with right now. When I look at addressing governance through models, I also look at the data management body of knowledge.
And one of the big things that happened between DMBOK 1 and DMBOK 2 is that orange bubble you see there, kind of in the top right-hand part of it: data modeling and design actually got a home of its own. It was split out of the data architecture aspect of the earlier version, because it's an extremely fundamental discipline that's needed to understand the data in our organizations. All the other things come into play as well. You've heard me talk about data architecture; data quality was mentioned in some of the charts I showed earlier; and of course metadata management, managing all of that metadata around this, is extremely important as well. I'm not going to go through the detailed descriptions of the entire DMBOK wheel here; this is just my representation of the wheel, shown in a slightly different way. Let's look at other aspects specific to data that we need to understand to really achieve data maturity in our organization. The first part of that is understanding the data value chain. If we have just data, like the number three or the number seven, it means nothing to us, because it's just pure raw data. Data represents facts: it's text, numbers, graphics; you may have images, sound, or other things; but without context it really doesn't tell you anything. When we have definitions, format, time frame and relevance added in, which gives us that context, that's when we actually take that next step up, where that data has become information. Information is data in context, and like I said, without context the data on its own is actually meaningless. But we need to go a step further. We really need to understand more about the information: what are the patterns, what are the trends in that information, what are the relationships between the different pieces of information in our organization, and what are the assumptions behind that information? Once we have that additional context, that's when we've actually translated it into knowledge.
Knowledge is information in perspective, and that means it's giving us a viewpoint that allows us to interpret the patterns and trends, and also formulate conclusions based on other experience and information in our organizations. That's what we're ultimately driving at. So the question remains: how do we get from data to knowledge? There are questions we need to ask ourselves and things that we need to do. Where's the data? To answer that, we need to identify the data stores in our organization, and the way to do that is to reverse engineer them, from your databases, NoSQL stores, even raw data sources, into some type of format that you can utilize to actually tie the information together. The format for doing that is data models. What is it? Applying naming standards, because in those data stores you're going to have all kinds of different naming conventions and inconsistencies, and you need to roll up your sleeves and resolve those inconsistencies. There's also something we call, in our product, universal mappings, and what we're really doing there is saying, now that we have these different instances of the same basic type of entity across these data stores, we need to link those together so we know where all the manifestations of customer are, or vendor, or employee, and those types of things. I'll talk about that a little more in a few minutes. Very important: where did the data or information come from, and how is it used? We answer that through a couple of different things: data lineage, visual data lineage in particular, and business process models. They give us the context of where the data is created, how it's utilized, and how it changes in our organization as we're consuming it. And of course, what does it mean?
And that means detailed data dictionaries with proper definitions. Often data dictionaries are technical definitions plus the data specifications, whereas our business glossaries give us the business terminology and the agreed-to definitions that we have for this data; we need both to pull this together. And ultimately, how do I govern this data? That means we have to have practices like reference and master data management, which we can drive out of the metadata from our models and other collections, because it's not just the basic metadata: we'll do things like data classifications and security classifications, and tie the different types of data together with the regulatory policies that come into play to govern them as well. We need to paint this entire picture to really achieve data maturity in our organizations. Another thing that's often overlooked is the lifecycle of data, which describes how each data element is created, read, updated, and deleted. In other words, where does it start in our organization? How is it utilized, and what are all the different things that it goes through? That means we're doing several different things: we're classifying it, we're storing it; how are we using and modifying the data, and how is the data shared? And things that people often forget about: how do we retain and archive the data, and ultimately, when do we destroy that data as part of the lifecycle? We need to understand all of that, and that means the business rules, business processes, applications, and other processes that come into play all need to be understood. Another thing that complicates this is that there may be more than one way a particular data element is created.
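The create/read/update/delete lifecycle described above is often captured as a CRUD matrix. As a rough sketch, with invented process and entity names, such a matrix can also be sanity-checked automatically, for instance by flagging entities that are read or updated but have no documented creation step:

```python
# Hypothetical CRUD matrix: which process step does what to which entity.
# Process and entity names are invented for illustration.
crud = {
    ("Take Order", "Order"):    "C",
    ("Take Order", "Customer"): "R",
    ("Pick Stock", "Order"):    "RU",
    ("Ship Order", "Order"):    "U",
    ("Ship Order", "Shipment"): "C",
    ("Invoice",    "Invoice"):  "C",
    ("Invoice",    "Customer"): "R",
}

def lifecycle_gaps(matrix):
    """Entities that are used somewhere but never created anywhere."""
    created = {entity for (_, entity), ops in matrix.items() if "C" in ops}
    used = {entity for (_, entity) in matrix}
    return sorted(used - created)

# Customer is read twice but no step creates it, so its origin is undocumented.
print(lifecycle_gaps(crud))
```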
Data elements may originate both internally and externally to the organization, and the same type of information may originate in different systems. Depending on the business processes in place, one system may be the source for one element of data while another system may be the ultimate source of another element, even though you find them in the same data store downstream. So again, understanding the lineage of the data is extremely important. To understand this we need to model our business processes and we need to model our data lineage, and that means not only the flow of the data and the integration that's going on, including ETL (extract, transform, and load) activities, and not only for data warehouses and data marts. People tend to think of data lineage as basically taking things from our transactional systems, transforming them, and ultimately moving them into our data warehouses and decision support systems. We need to go further than that. We need to understand the lineage in terms of that lifecycle: which systems the information made its way through on its journey through the organization. And that includes manual processes, where people may have actually looked at information in one system, changed it, or put it into another system manually; that's something I call swivel-chair integration. You won't get to that unless you actually go through and understand and model the business processes.
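That kind of lineage, including manual swivel-chair hops, can be represented as a simple directed graph and traced upstream to find every originating system. The system names and edges below are invented for illustration, not taken from the webinar:

```python
# Hypothetical lineage graph: each key lists the systems it receives data FROM.
lineage = {
    "warehouse.customer_dim": ["crm.customer", "billing.account"],
    "billing.account": ["crm.customer"],
    "crm.customer": ["web_signup_form"],          # automated feed
    "warehouse.vendor_dim": ["erp.vendor"],
    "erp.vendor": ["manual_entry:procurement"],   # swivel-chair integration
}

def upstream_sources(element, graph):
    """Return every system the element's data passed through on its journey."""
    seen, stack = [], [element]
    while stack:
        node = stack.pop()
        for src in graph.get(node, []):
            if src not in seen:
                seen.append(src)
                stack.append(src)
    return seen

print(upstream_sources("warehouse.customer_dim", lineage))
```

A trace like this makes the manual steps visible: the vendor dimension here ultimately originates from a manual procurement entry, which a purely ETL-focused lineage view would miss.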
So if we take that ecosystem I talked about earlier, what I'm doing now is asking: how do we apply modeling, and data modeling specifically, to this data ecosystem? We have several different things in play here, and we'll look at this in a little more detail on the next slide, but what we want to do is bring the entire arsenal of modeling capabilities to bear. From a data modeling perspective, that means conceptual, logical, and physical models, and the variants of those, dimensional models to understand things for data warehousing and that sort of thing, as well as enterprise models or canonical models that may be utilized to drive service-oriented architecture, to really start to pull all things together and link things of interest across our enterprise back to the implementation models where they reside. That also means visual data lineage and, again, the data dictionaries and enterprise data dictionaries, where we have naming standards and what we call attachments or classification metadata that apply across all the different data stores we have in our organization, tied to business glossaries as well. If I take that top diagram and split it out, what I'm really showing here is, of course, that we have the different types of models. Central to that are the enterprise data dictionaries, and we also want to tie things like our enterprise models down to those physical implementation models. How do I do that? I'm going to talk about the concept of linking entity instances: what are they, and where do they reside? And I'm using an enterprise model as the focal point to do that. Again, this is obviously a very small part of what would be an enterprise model, but it has a few basic concepts in it.
So I have key concepts like suppliers, items, customers, those types of things, represented in my enterprise data model, and I should be utilizing my organization's standardized terminology to name those. What I'm showing down at the bottom is that I have different implementation models. In other words, I may have different databases: they may have come from packages I bought, or from solutions built in-house over time, with different naming conventions. But if I start to look at this, I may look at things like suppliers and vendors and say, you know, there's kind of an overlap there. So what we really do is utilize that enterprise model as our focal point to start linking these concepts together, because now when I look at my enterprise model for supplier, I can say: I call it supplier, but it's called this in one implementation model and that in another. We want to do this for our major master data and reference data constructs in particular, and then go out from there. So items to products and parts, as an example; customer to client or customers, and you get the idea; state/province, because we're an international organization and it may be called province in one system and state in another. And we'll start applying these conventions across our models. In ER/Studio we actually have a construct called universal mappings that does that for us, so we can create these links, which are stored in the repository. The way it works is that when you're looking at the enterprise model, you simply look at the where-used view of it and you can see where all these other implementations are, so it really helps you tie that cycle together. Now let's look at process, which is extremely important.
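Outside of ER/Studio itself, the idea behind universal mappings can be sketched as a plain cross-reference from enterprise concepts to their implementation-model manifestations. All model and table names here are invented for illustration:

```python
# Hypothetical cross-reference: enterprise concept -> (implementation model, table).
universal_mappings = {
    "Supplier": [("erp_model", "VENDOR"), ("purchasing_model", "SUPPLIERS")],
    "Item":     [("erp_model", "PART"), ("sales_model", "PRODUCT")],
    "Customer": [("sales_model", "CLIENT"), ("billing_model", "CUSTOMERS")],
}

def where_used(concept):
    """List every implementation-model manifestation of an enterprise concept."""
    return universal_mappings.get(concept, [])

for model, table in where_used("Supplier"):
    print(f"Supplier is implemented as {table} in {model}")
```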
This is basically a high-level process context diagram representing a few different areas, and it gets very complex as you go into lower levels of this particular organization, or into different types of organizations. This example is your typical pipeline company, which has different departments, and what this really represents is what's called the order-to-cash cycle in that organization. If you're a retailer or something like that, you'll also have an order-to-cash cycle, which runs from people placing orders to when you ultimately ship them and get paid. Same type of thing here, and that's what the red flows are: I'm basically documenting some high-level processes that it goes through. The ancillary processes are also identified, but what I'm also showing in this diagram are the solutions that implement these, as well as the data stores, at a high level, that are utilized to store the high-level information. It's a very summarized level of detail, something like a high-level process context. What I want to do, though, to really understand it from a process point of view, is to start breaking that down into different processes. So what I've done here is go a level down, where I've got the different swim lanes. I know a lot of you have probably seen these types of diagrams; particularly if you're a business analyst, you've probably created them. So I'm looking at the different areas of my business, and then which parts of the business are implementing the different parts of the processes, and again I'm tying together, at a high level, the data structures, or types of data, that tie into these processes as well. Ultimately, as I'm drilling into these business processes, I want to get down to a point of view where, from a data perspective, I have the data that's used within a given business area or high-level business process.
So we call them submodels; you may call them subject area models, or those types of things, depending on the terminology you're used to in your organization. But here's an example of the part of the data model that applies to one part of this particular business area. What I also want to do now is take this subject area model and integrate it back to my business process. So I've now gone a level down again in that business process model. What you see here is a very detailed specification of how the data is used in the process. I've got four process steps here: obviously the start node, the processes in sequence, and an end node for this section of the diagram. But I'm showing all these different data structures that correspond to what I saw in that data model. What I'm also seeing is how they're utilized: I'm seeing the create, read, update, and delete designators on these different data flows, showing me how these pieces of information are used in the process. When you get down to this level for your key data and business processes, that's when you have a shot at achieving data maturity. You have to understand the information and how it's used, where it's created, where it's updated, and where it's deleted, to have a full handle on the data and really achieve data maturity in your organization. I talked about data governance, so I'm going to talk about a few things here as well. What you want to have, in addition to this modeling paradigm for both data and process, is business glossaries. The best way to organize those is around the terminology in your organization: you probably want to break the glossaries down the same way. You want to take some type of business decomposition from your corporate model, break it into your different functional areas, and structure your business glossaries, and the terms in those different glossaries, that way.
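A functional decomposition of glossaries like the one described could be sketched as a small tree of areas and terms. The areas, terms, and definitions below are invented for illustration:

```python
# Hypothetical business-glossary tree following a functional decomposition.
glossaries = {
    "Sales": {
        "Customer": "A party that has placed at least one order with us.",
        "Order": "A confirmed request by a customer for products.",
    },
    "Procurement": {
        "Supplier": "A party from whom we purchase items.",
    },
}

def define(term, glossaries):
    """Find a term's agreed-upon definition and the functional area that owns it."""
    for area, terms in glossaries.items():
        if term in terms:
            return area, terms[term]
    return None

print(define("Supplier", glossaries))
```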
Of course, glossaries aren't just for words and meanings. They're also a place where we can catalog things like our data governance policies, and they're also a way, at least in our products, to set up catalogs where we can link out to reference data, whether it's stored in spreadsheets, an MDM data store, or those types of things, and create a focal point that links out to all of your other information management and process management resources in your organization. So from a glossary and term perspective, we want agreed-upon definitions, and those can be very complex definitions including things like formulas, ultimately to understand what a term means and how it's used, as well as identifying the stewards who are responsible for the information. On governance policies: many of us deal with many different regulations, so as you're cataloging your data governance policies you'll probably want to break them down by regulation, and then break those regulations down into the different policy statements that apply to you. You can then start to tie those back not only to your data artifacts and the types of data in your organization, but also to the metadata for the processes where that data is used, so you get an even fuller perspective of not only what's governed but what processes operating against it are governed as well. And then, ultimately, things like a reference dataset catalog: if you have an MDM store, or you have reference data in spreadsheets, maybe stored in SharePoint or Google Documents or wherever, or even on third-party websites, you can establish hyperlinks through URLs to start cataloging and pulling all of this information together. At the end of the day, to achieve maturity, what you really want
is integrated modeling, enterprise architecture, and governance as a platform to drive everything. That means the different types of data models, integrated with your business process models, including visual data lineage, with the data models and your enterprise data dictionaries as the core. Around that sit all the other things you would think of in terms of enterprise architecture and governance: everything from business glossaries and concepts, policies, security, goals and strategies, defining your business rules, identification of your business units and your stewards, and obviously a means to collaborate, discuss, and work with your coworkers to implement these types of things. Now, I've covered quite a bit here, so here's a parting thought on the continuous improvement part: how do we implement lasting change in our organizations? Obviously there's a lot to do here, and you're not going to boil the ocean. You have to prioritize: focus on your key types of information and your key business processes first, then grow out from there, and for each of those have a defined target of what you're trying to achieve. I'm going to use the flying analogy here again. We have a saying in flying, which is "plan the flight and fly the plan," and that includes the contingencies as well. You want to do the same thing: lay out your plan to get to this level of data and process maturity, break it up into meaningful chunks you can accomplish, with some metrics to measure how you're doing along the way, and once you've accomplished one goal, step up to the next one. Think of it as flying every leg of a journey where you're going through different places, and include the contingencies. What you want to do is break it down into small, sustainable changes and continuously improve: plan it, execute, and always incorporate contingencies, because you're going to run into things that you aren't
expecting. And again, what I like to say is: if you don't plan it, your chance of success is virtually zero. And of course, tongue in cheek, I like to say hope is not a strategy. You really have to have the discipline and roll up your sleeves to drive this forward. It needs to be concrete, and what I mean by that is you need a concrete plan of action, and that means you have to have measurements in place to measure what your progress and your results are. Continuous improvement: always evaluate, measure, and adjust based on what you did; rinse and repeat; and keep adding additional parts in small increments to keep this continually growing in your organization. Now for a post-flight debrief: to achieve organizational maturity, we need both data maturity and process maturity. Modeling is essential to drive this, because modeling is how we understand it. We are visual creatures, and we understand more by looking at visual models than by looking at any volume of text or reports or those types of things; a picture is definitely worth a thousand words or more in this context. We use data modeling to tell us the what and the where; process modeling tells us the how, in the context of how things like the data are used; data lineage leads us to understanding the lifecycle, in addition to the process modeling; and then we supplement that with additional knowledge from the other metadata and classification metadata that we create, and our business glossaries for business meaning, and ultimately pull it all together. All of this is core to really driving forward enterprise architecture and governance in our organizations. And again, the approach is continuous, incremental improvement. Celebrate your successes, because it is a difficult journey, so make sure you celebrate along the way to motivate yourself to keep going, and just keep repeating that cycle. And that's it for today. With that, Shannon, I will open it up to questions. Ron, thank you so
much, and thanks to everybody for all the questions that have come in so far. If you do have questions, feel free to send them in via the Q&A section in the bottom right-hand corner. And just a reminder to everybody: I will send a follow-up email by end of day Thursday with links to the slides and to the recording of this session, and anything else requested throughout. So Ron, diving in here: in data model utilization, what does full governance metadata mean? When I talk about full governance metadata, what I mean is that I actually set up classifications in my metadata. To me, a data model is not complete with just all the things we normally think of, like the indexes, the characteristics, the data types, and those sorts of things; I go much further than that. Classifications for master data management: particularly in my enterprise model, and definitely when I'm looking at my implementation models, when I actually bring those in, one of the first things I do is classify them. Does this entity or table represent master data, reference data, or transactional data? Also things like, from a data quality perspective, how volatile is the data, and those types of things. Retention policies: by the time you're through an enterprise model, for example, you should not have one entity in your enterprise model without knowing what the data retention policy is for that type of data, and that's something you want to carry through to the other implementations. Again, different organizations have different things that are important to them. Other things I look at include business value: some entities, or some information in the organization, are higher value than others, and a way of categorizing that value is extremely important because it allows you to prioritize things like data cleansing efforts as well. So building
all that in right from the models, then everything downstream flows from that. So, do you have any recommendations for how the data team and business process teams can work together to increase maturity in both areas? Generally speaking, it really comes down to collaborating as closely as possible, and we actually see this quite a bit when we talk about agile development, as an example. If we're working on a new initiative, and you're doing it properly, you have representation from different areas of the business. So you're probably going to have a team comprised of business users, facilitated with help from business analysts, plus your organization's developers and data modelers, everybody working together as a team. Before you can improve on a current implementation, or put in or develop a new system, you need to spend some time understanding the current state, which means documenting the current state in both the data and process models, working together, because there's an exchange of information, and the tools actually exchange the information as well. Then, as you work on the future state, you're going to be working on it together too. What that means is: do it in sprints, like you do your development, but rather than just focusing on things like software as the focus of a sprint, focus on the overall deliverables of the initiative you're looking at. That includes data models, process models, business glossaries, all the things that are important to you; those all need to be included in the deliverables of the initiatives you're working on. And you just keep working as a team and keep elaborating on it. I love it, thank you. We get that question, or a similar version of it, a lot, so that's great. So, what is the difference between data classifications and security classifications, and should we say
data classification or information classification? I think the right classification is at the information level rather than the data level. Yeah, and that's fine. Basically, what we're driving at in our models is actually putting some type of classification in place. The reason I split out security classifications is that they're still basically a classification, but there are different implications. In our tools we actually have separate security classifications that are implemented the same way under the covers, but they also have an additional ability: when we publish our models out, we can trigger alerts and those types of things. So if somebody happens to be looking at that metadata, we can have informational messages and things like that appear automatically in our collaboration platform, called Team Server. If somebody's looking at, say, private metadata that represents sensitive data or personally identifiable information, we can customize alerts to make them aware of that fact when they're looking at that metadata. And Ron, what's the difference between application architecture and technical architecture? Application architecture tends to focus on the actual applications that deliver the functionality. For instance, if you have an accounting system, or an accounting package that you're running, that would be the application, with the different modules or areas implementing things like general ledger, accounts receivable, accounts payable, and that type of thing. That's the application itself. But of course, to deliver that to the users, it needs to reside on a technical architecture. Your technical architecture is the backbone or underlying infrastructure: everything from your network infrastructure, your connections, your security architecture, and everything else that sits as the baseline underneath all the
other components that you're utilizing. I love all these great questions coming in. So Ron, should the CRUD documentation for data be part of the process documentation, or should we do them separately and then integrate them? You actually need to do it as part of the process documentation. Let me put it this way: you need to step through the process to learn how the data or information is used in your organization. It's those individual process steps that will tell you: am I just reading this data, am I modifying the data in this process step, or am I actually creating new data in this process step? Tying that back to the specific process that did it is extremely important. It may be a very detailed process, but then of course that rolls up into the higher-level processes as well. And to establish data governance, can you start with the pillars of the business architecture before starting on data architecture, referencing back to slide number 13, or must you start with data architecture? I would say they actually go hand in hand. The data architecture supports it, but in reality, especially when we're on this journey where we're trying to keep process and data in balance, to do it properly you want to evolve them simultaneously. There's always going to be a little bit of back and forth: you're going to discover things about data in your processes that you want to reflect back in your data models, and there's data you identify in your data models where you want to ask, in your process model, okay, I know this data exists, how are we using it, where did it come from? That drives you to build out your process models to understand how you're actually utilizing it. So it's a back and forth. And what are the different data policies that can be defined for data governance? I think that question may be a webinar in and of itself, but let's see what you can do with it in a few minutes. Sure,
so what I do when I set up my metadata repository, in our Team Server product, is that for every regulation I'm subject to, I set up a top-level glossary. If I'm subject to GDPR, my top-level glossary is going to be GDPR; if I'm in healthcare, I'm subject to HIPAA; I might be subject to PCI, or those types of things. So I'm going to have an umbrella glossary for each one of those. Then I look at the regulation itself: GDPR and HIPAA each have a number of policy or regulatory statements within them, so I start to enumerate the specific policy statements that apply to me, and I break those out as different pieces, or terms, in that policy glossary. Once I've gone down to that granular level, I can then associate those back to the types of data, and also the types of process, or the specific process metadata, that they tie back to. So I'm basically building up a list of the regulations and the policies within the regulations, and then cross-referencing them to the applicable data and process artifacts. Perfect. And we have a product question here for you, Ron: how can ER/Studio be leveraged to pull together models created in other tools to create an enterprise model? There are several things we can do there. We actually have what are called MetaWizard bridges in our products, so if you have models in other products, we can typically pull those in, turn them into an ER/Studio model, and move forward from there. We actually support a lot of older modeling products in terms of importing metadata, but even so, if you don't have the actual model, but say it's a data model and you happen to have the DDL or the database, we can also reverse engineer from that to give us a starting point. And what tools do you use for metadata management that can be tied into the operations? For our metadata management, we actually use our own
product, within ER/Studio. It's comprised of not only the modeling tools, the data modeling tools and the business modeling tools, but we also have a product in there called Software Architect, which allows us to do application architecture. The metadata is stored in our repository: whichever models it comes out of, it's all saved into our repository, and it's also published into our collaboration platform, where we can associate it with things like the business glossaries and the regulatory policies and everything else. So we have the full metadata repository and catalog; I call it a full data governance catalog. And if that doesn't make you happy, you can also integrate with other products via REST APIs. So, we get a lot of organizational questions: who in the organization should be taking on the role of a data maturity implementation, and what kind of skill sets are needed to manage a data maturity initiative? Well, you're going to get me on my soapbox with that question, because in general, organizations have failed miserably at it. Many years ago we instituted the position known as the chief information officer, with the key "I" there being information. But unfortunately, in a lot of organizations, just because of where they came from, or just their focus, quite often the chief information officer has been more of a chief technology officer, because they're really focused more on technology and systems and that type of thing. As we all know, that's where the role of the chief data officer started to emerge in the industry, and it's gaining more and more of a foothold all the time. So I would really charge the CDO with making sure you're driving at that data maturity, because governance and everything else ties into it, and frankly, it needs to be driven by the business. It's not a technical function; it needs to be business-owned and business-driven. All right, I think we have a couple of minutes left and a
question or two more. Can ER/Studio be used as an enterprise master data management tool? It's not a master data management tool itself, but again, we can set up the classification metadata right at the modeling level, and then we can link out to things like MDM repositories, or wherever you store that data, because on the Team Server side of things we can do things like add a hyperlink out to where you're actually storing it. So it can serve as a focal point to help you navigate to where that is, because you may actually have it in a number of disparate sources, and it allows you to pull all of it together. And Ron, I think we can get one more question in here: do you see metadata management as part of change management, or separate from it? You have to do both hand in hand, because every time we're doing these types of things, we're never working in a vacuum. We actually view change management as a very important part of our metadata. We do things like, when we're making model changes, or when models have changed and we're reverse engineering and detecting those changes, we create change records that we can associate back to tasks or user stories or anything else. So we not only have the what of the change, we also know why it changed, and that's a very important part of your overall metadata, especially when you're going through an audit and the auditor sits across the table from you and asks, why did those data structures change in that particular data store? Well, Ron, this has been another fantastic presentation. Thank you so much. Just a reminder to everybody that I will be sending a follow-up email to all registrants by end of day Thursday with links to the slides and the recording, and you can also get Ron's contact information there if you have additional questions. Thanks again, Ron, and thanks, everybody, for being so engaged. We just love all the questions
and engagement, and hope y'all have a great day. Thanks, all. Thank you, everyone; have a wonderful day.