Hello, my name is Shannon Kemp and I'm the Executive Editor for Dataversity. We would like to thank you for joining today's Dataversity webinar, Data Management Maturity Model. This is the latest edition of a monthly series called Data Ed Online with Dr. Peter Aiken, brought to you in partnership with Data Blueprint. We at Dataversity are very excited about this particular topic, as are our attendees, so keep an eye out for more educational resources on Dataversity regarding this very thing. Now let me turn the floor over to Megan Jacobs, the webinar organizer from Data Blueprint, to introduce our speakers in today's webinar. Welcome. Thank you very much. Let me just give a shout-out and a reminder before Peter gives the introduction. The two most commonly asked questions are: will I get a copy of the slides from the event, and is the event being recorded? As a reminder, I will be sending out a follow-up email within two business days containing links to the slides, links to the recording of this session, and any additional information requested throughout the webinar today. So we'll make sure to get you all that information. And as you can see on your screen, we welcome you to get social with us. Here's the list of additional places to get social with Data Blueprint. We encourage you to share highlights or questions on Twitter throughout the presentation using the hashtag #Dataversity. I'll go ahead and let Peter introduce Melanie at this point. So thanks. Good to be here with everybody. I'm kind of excited about this particular session, in case you can't tell. First of all, I just met Melanie about a year and a half ago, which is sort of inconceivable, given that we've both been working in this space for as long as we have and never run into each other. Melanie's had a number of different positions in data management.
Most recently, she is with the Software Engineering Institute, which is a division of Carnegie Mellon University. She's been working in the CMMI space, and if you don't know what that means right now, hang tight, because we're going to explain it as we go forward. She is the Program Director for the Data Management Maturity Model. She has more than 30 years of experience designing and implementing data strategies. You can read all of that on the slide, but she's worked in the public sector and the private sector and is well known for her architecture and design experience in all kinds of different areas. But I think the thing that's particularly important for this webinar is that she has been the primary author of the Data Management Maturity Model. If you're asking what that is, I'm glad you've joined us. For those who don't know me, I've been doing these webinars with Shannon for years now. I've been playing in this space for a long time and have had lots of fun. It's really a pleasure, and gratifying, to meet you out in the field, because then you say, hey, I'm enjoying the webinars, and that's kind of nice. It's kind of different for us, because we're on the other side of a big screen and we don't see or hear anything until it's question time. But let's drop right into the presentation here. The key, of course, is that if you want to get better at something, you've got to have a framework for how to get better. I'm going to turn it over to Melanie for these first two slides. We're going to give you the bottom line up front, in case some of you have to drop off, and then actually go through it in detail in just a little bit. So Melanie, here's the Data Management Maturity primer for you. Okay.
So the quick facts about the Data Management Maturity Model are that it is a reference model of operational data management capabilities that an organization needs to perform well to support the maximum utility and management of its corporate data assets. It is above all a measurement instrument against which you can evaluate your capabilities and your maturity. So if you want to know, across your program, how are we doing, the DMM can help you get that answer. And if you want to know where are we strong, where are we weak, what should we do next, the DMM can help with that. And it is a very good baseline for developing an integrated strategy and making a lot of specific improvements to the various disciplines in your data management program. It was developed by the CMMI Institute with our sponsors Booz Allen Hamilton, Lockheed Martin, Microsoft, and Canvas Systems, and many contributing experts. And we have conducted assessments against it, piloting recent versions of this model, for Microsoft, Fannie Mae, the Federal Reserve System's Statistics function, the Ontario Teachers' Pension Plan, and Freddie Mac. And our sponsors have also conducted assessments for the Securities and Exchange Commission, the Treasury's Office of Financial Research, and Cisco. So the first thing to note about the model is that it's already tried and proven. It's based on proven advances in this area. Again, this is not something that's just been made up; it's really built upon a foundation. So here's the big reveal; here's what the Data Management Maturity Model looks like. Yes, this is our 100,000-foot diagram of the model's content and basic organization. We could present half of the webinar from this slide all by itself, and we'll get into more detail shortly. Importantly, there are six category groupings with multiple process areas in them.
A process area is essentially a topic or discipline of data management, with all its activities and the work products that support those activities. So there are 20 specific data management process areas and five supporting process areas for all of them, adapted from the Capability Maturity Model Integration, which Peter is about to address. As we move forward, this is the one slide that's the real takeaway for you. Let's dive into the actual program here. We'll start off with some motivation, and then we're going to talk a little bit about how we got to where we are. Then we'll dive into the model itself, so that you're hopefully fairly comfortable with it by the time we finish at the top of the hour. We'll follow with a little bit of guidance on how it should be used. And finally, we'll finish up with some directions on where we're going in the future with this. Now, I'd like to start off talking about data by observing that it's kind of like Maslow's hierarchy of needs. Some of you remember this from college or university or high school. The basics of this framework are that if your food, water, and safety needs are not satisfied, it's going to be very difficult for you to do any self-actualization activities. If you're hungry and wet and naked, you're probably not going to go home and write poetry, create music, or whatever else it is that you do in your self-actualization time, even if that's Call of Duty. Again, people have different things that they like to do with their off time. Data management is an awful lot like that. I use these items here in the golden triangle: master data management, mining, big data. These terms change every few years and we get something new and improved; we call them silver bullets, because a lot of people think they actually solve the problems for them. These are tools, but we need to put the tools in the hands of people who know how to use them.
These data management practices are really just the tip of the iceberg. If you don't understand that there are foundational data management practices underneath that are required in order to support them, your journey into data can be a little bit difficult. The five areas that we have down here are the five components of the model that Melanie showed you just a few minutes ago: data management strategy, data governance, data quality, platform and architecture, and data operations. So we get customers who ask us, can you do the things in the golden triangle without learning all of that other stuff on the bottom? And the answer is, absolutely you can. But if you do your data mining activity without the foundational practices, it will take longer, it will cost more, it will deliver less, and it will present greater risk to the organization than if you instead learn to crawl, walk, and run your way to the top. There's another point on this diagram, too: these foundational data management practices are held together like a chain, and a chain is only as strong as its weakest link. In other words, if you're doing very well in four areas out of the five, but you're doing the fifth poorly, it pulls down the rest of your data management practices, again making your projects take longer, cost more, deliver less, and present greater risk to the organization. Now, most of you are familiar with IT's track record here. It's actually not quite as awful as this diagram shows, but it is definitely not terrific. I have been working in these areas for a long time, in terms of how these data projects go, and it's just a very big challenge, because there's not a lot of good specifics in these areas. Melanie, you wanted to share a specific example that you were working with on this one. Yes, I do. A few years back, at a very large federal agency, there was a massive service-oriented architecture transformation effort.
So the hardware, the messaging, the middleware was all set up and ready to go, and the team engaged in developing the first group of 75 web services, including a couple of key data provisioning services. One was master data for contracts, and there were millions of contracts at this organization. So they looked for the authoritative source for contracts. They couldn't find it. They couldn't identify a single data owner. There was no repository for the information. There were many application database sources for it. There was no business glossary. There was no unifying data model, and there was no governance in place. So all of the fundamental support for a project of this magnitude did not exist. They had to halt the project, manually review over 1,000 contracts, derive a meaningful set of information, and code like crazy to resolve all the discrepancies. Needless to say, it had a major impact on the schedule. So you can see this is just one of many, many examples. In fact, I have personally looked at literally hundreds of IT failures, and without a doubt, 100% of them have data as the root cause. The problem is that in IT there is not a lot of focus on data specifically. There are very few people who are data-educated. And what's worse still, when you look at the keywords that are available in the research community or in the advisory community, notice that the data keywords make up only 6% of them. If we're going to embrace the Internet of Things coming at us, we're going to need a lot more than 6% of the research keywords focused on that particular area, because this leaves us in a situation that I call our bad data decision spiral. The frank and honest assessment is that C-level decision makers are not data knowledgeable, nor are most CIOs. That's a terrible thing to say, and I don't mean any disrespect to the CIOs, because they're quite good at what they do.
But knowing data is not an area that most of them are really strong in. This leads to a series of rather poor data decisions, which means that data as an organizational asset is treated very poorly, and of course we end up with poor quality data as a result, which means we end up with poor organizational outcomes, and unfortunately an ever-repeating cycle as we go through this process. It's very difficult to break out of. The key is, until we figure out that data is an asset and an economic resource that you need to own, control, produce value from, convert into cash, or whatever it is that your organization does, data will remain a poor stepchild, a case of the tail wagging the dog. It's a very big challenge for us, and we're working as hard as we can on it with our partners at Dataversity and other events that we do around the world. So the next question is, how did we get here? We've got a lot of motivation: data is at the root of IT problems. We get a lot of calls here at Data Blueprint where people say, we want to move our data management program to the next level, and if we were a dishonest organization, I could say, pay us a million dollars and we'll tell you that you're there. Of course, we know the world doesn't work that way, and we're also happy to be honest people. So the first question that's reasonable to ask is, what level are you at right now? If you don't know, then you can't very well move to the next level. How do you know where to put your time, money, and energy so that data management best supports the mission? I'm going to tell you a brief little story here, two slides, in terms of the history of this. I was a longtime program manager at the Department of Defense.
One of the tasks I was given at the Department of Defense was to ask: is the Department of Defense doing things like maintaining software, managing data, testing things, et cetera, better or worse than the private sector, or anybody else for that matter? We might want to compare ourselves against the Germans. We were part of the team that funded the original research at Carnegie Mellon that stood up the Software Engineering Institute, so that we could compare ourselves with what other people were doing in their practice areas. I have to tell you a short story on that as well. They also said, by the way, we hear that the Navy is doing some really interesting things in data management, so please go down there. Remember, I was DoD corporate at this point. So I went down to check out what the Navy was up to, and found out that there were a couple of luminaries in the field working down there by the names of John Zachman and Clive Finkelstein. That's how I ran into them. Well, back to our story: the SEI worked on this problem internally for a while and came back with an integrated process and data improvement approach, which you'll be hearing a little bit about today. The interesting thing was that the Department of Defense actually told the Software Engineering Institute to remove the data portion, because their name was the Software Engineering Institute and they didn't feel they had any business telling anybody what to do with data. Of course, we know now that that was an error on the Defense Department's part. And we really did pick up this research that was just lying around down there. I want to give a shout-out to a former colleague of ours named Bert Parker, who passed away a couple of years back but was instrumental in helping us put this thing together. There is an article that we'll send out at the end, along with the slides, that talks about how to use these frameworks in assessments.
This was an imperfect internal assessment at MITRE; you can see it was October '94, based on the CMM. They essentially came up with a normative process for going through and generally framing up the question, how are we doing? And the answer that came back was that it wasn't being done very well. So that was kind of the challenge. The framework Carnegie Mellon created is now used around the world: over 10,000 organizations use it, across 94 countries and 12 national governments; there are 10 language translations and more than 500 partners, and in the year 2013 alone, more than 1,300 appraisals were done. Your next question might be, why? Is it that good? Well, the science actually tells us it is. Look at on-budget project delivery under CMMI versus RUP, Agile, COBIT, or PMI. In fact, if you look at RUP, COBIT, and PMI, the data actually shows that applying those decreases your ability to deliver on budget. The same pattern holds true for on-time performance: again, some percentage of projects delivered on time with each particular framework. So there is good science that says that the things we're talking about here do make a good pathway for people to use, going from place A to place B in their improvement journey. There are a number of different variations on this. You can see that services, development, acquisition/supply chain, and workforce development are four of the main ones that are out there. There are a lot of others, and we're looking to continually expand this particular piece. I'm going to toss it back over to Melanie, who's going to tell us how she became familiar with this portion of it. She was sort of pushing on it from the opposite direction, and we met in the middle, which was just absolutely terrific. So, Melanie?
For the DMM: as we know, in addition to the foundational principles and the previous models that Peter told you about, data management is a very broad and complex topic. It is therefore challenging for any organization to get its arms around data management and do a great job. An effective program requires a planned, strategic effort, phased over a multi-year period. In addition, it's helpful to use a reference model such as the DMM to evaluate what your capabilities are like now, where they need to be strengthened, and whether you should emphasize certain areas over others, which we'll talk about in a few minutes. The DMM was aimed at unifying the perspective on data management across the entire business: within IT, within the lines of business, within the data management professional staff. So everyone could have a similar view, be well educated, and understand the importance of these processes and how their role could contribute. It was conceived as a foundation for collaborative and sustained process improvement. That's why, in late 2009, it was essentially a gleam in the eye of a couple of organizations, and the SEI was brought together with other organizations. Development was launched in January 2011, and for the last two years we've been transforming it to leverage the best practices of CMMI. We had industry peer reviews starting in April 2014, and in August the DMM was released. And just a couple of days ago, wasn't it? Yes, the slide states the day. Okay, so yes, that's the Declaration of Independence and the Founding Fathers at the bottom of the slide.
So no, we were not those dead white guys, but speaking of the model's authors, I thought you should know a little bit about the author community. We put together an author community, extended over the years, of people who had a lot of experience designing and implementing data management in the real world for real organizations, so you can take a look at some of the skills there. The span of consensus was quite broad, quite large. We used a consortium approach, and we required that everything be practically useful. We didn't collectively have a lot of patience for theory alone; we had to keep illustrating with examples from implementations at this organization or that: what actually worked, what didn't work. We employed reference model architects from the CMMI and other SEI model traditions, and a lot of business knowledge experts from multiple industries. And we held extensive discussions resulting in consensus on the content. As drivers, we wrote it for all of us, because we've needed this for 20, 25 years. So, our basic timeline: as I said, we have released the model at this point. We've launched a partner program with 10 partners to date; they are the ones who can sponsor individuals for training and certification. And our full suite of courses comes out starting in fall 2014. Moving ahead here now. So that's how we got here. Now, the part you're all interested in: what is the DMM? We like this title page with the unified puzzle pieces; we think that's sort of emblematic of data management. It was 3.5 years in development, with our four sponsors, 50-plus contributing authors, and 70 peer reviewers, representing altogether over 80 organizations. We have dense and comprehensive content of 230 pages, with 300 practice statements across all the process areas and 300 work products. So those are some quick facts about the model. The orientation of the model is essentially strategic.
We have observed that organizations rarely take this point of view. What ends up happening is that certain things are emphasized by one C-level executive or another, by one primary business line or another. Data management, as we know, has been rather piecemeal, and it needs to be pulled together and thought of in a unified way. The biggest challenges that we took on were aligning IT with the business strategy, aligning the business strategy with the plans for data management, et cetera, and achieving an organization-wide perspective. We are not looking at the bleeding edge of data management. We know that there are innovations every single day: innovations in theory, vendor products, et cetera. We are looking at the tested, proven, and practically implemented state of the practice; the best practices that are proven and tested. The model is for all industries. Melanie, I think the point that you made there is the one that I find most useful when articulating this to organizations. Most data management functions are very effective at the workgroup level. Imagine, I say to these groups, if you could take all of that power, all of those knowledge, skills, and abilities, all that passion around data that exists at the workgroup level, and focus it at the organizational level. Now you have the opportunity to do that. So this illustrates our view that it takes the entire organization, and everyone having an understanding and playing a part, to get the data assets where they need to be to meet the business's strategic objectives and its mission. Obviously, the business owns the data; they are the ones who create and manage it. I just talked to a CIO a few weeks ago who said, well, obviously, the business owns the data; we just provide it to them. But it's been a long road to get them to realize that. So that's where the model is aimed. It emphasizes business decisions.
So there are some topics that are not in the model because they are more technical rather than having a major role for the business. And we consider it a useful tool to create that shared vision and help unify the diverse audiences that have a role in data management. Okay, but what isn't it? It's not a compendium of all data management knowledge. That would be quite presumptuous, because we have a wonderful industry that has had 35 years of evolution, many foundational thinkers who have blazed a path for the rest of us, and some tremendous vendors. I mean, how many bad data quality products do you know of? Almost all of them can do almost anything that you want them to do. We also have many fully mature industry practices, such as data modeling; best practices for data modeling have been around for the last 40 years. So we don't want to be too specific in the model, because if you get lost in the detail, people try to implement the model as a whole, and it's not meant to be used that way. It's not a cookbook, and it does not identify the one best way to do something, because there are many, many good ways to tackle a problem. Again, the question here is how we can do the core functions of data management and improve them organizationally, not just individually. Now I want to take you on a very quick tour of some principles that were applied to build the DMM. Many of these were tested and proven with the CMMI, and they represent the standards that we held ourselves to as the author team. There are sections for each process area, which consist of purpose, introduction, goals, questions, related process areas, the practice statements on which you're evaluated, and the work products that support the practice statements. We tried to make sure that they were all consistent with each other, that the messaging was consistent and built one upon the other through the process area. Each process area can stand alone.
So we tried to carefully factor out overlap as much as possible, even though, as we know, everything in data management supports everything else. We have partitioned them thoughtfully so that you can evaluate each one separately. The practice statements are grouped by level, and Peter is going to address that in a few minutes. All of the statements need to have enough detail to convey understanding. We also tried to write precise, condensed statements, using abstraction as needed. And we borrowed the maturity factors from CMMI; they're known as generic practices in CMMI, and we'll take a quick tour of them shortly. So this is an activity-based reference model. Essentially, you are what you do. We emphasize the creation of effective, repeatable processes that can be leveraged and extended across the organization and reused for maximum value. Do things well. For example, if you have an area in the company that needed to put in place extensive data quality processes, sometimes those processes get lost in one cross-cutting program or one line of business. The aim of the DMM is to discover the strengths that you have, as well as the weaknesses you may have, and to leverage them so that you can make more progress and achieve more. The model is non-prescriptive. Rarely in the model is it imperative that we mention a technology. For example, in metadata management, at level three, we refer to the metadata repository; we think you should have one. But that's about as prescriptive as it gets. There are many great ways to do these things. Next slide, please. So the process areas can stand alone. And as we mentioned, there are also lopsided and piecemeal approaches throughout an entire organization, due to what was funded, who owns the funding, what was emphasized. Sometimes it's: this year you're going to implement data governance, because that's what the board wants. Or this year, my customer account management has to have better data quality.
We're losing business. So it's been kind of an organic growth of data management over the past number of decades, and we're trying to help the industry put a little more structure on that, to simplify the landscape for all parties. Getting into a little more detail very quickly: we put a lot of stress on ourselves over how we formulated the practice statements, and we applied these tests to ourselves in the author team. Each statement needed to be unambiguous, so that it was clear to any reader. It needed to focus on what is done, not how it is done. Statements should not overlap, should be demonstrable by work products, and each should express a single idea. If you encounter Melanie out on the street around the DC area, she can tell you some stories about sitting around and trying these statements out, and just how much intensity there was around this. If you're a data person, you know we argue a lot over very specific things, and this was about how to specify things. It's just been a tremendous amount of work. It must have been an awful lot of fun for you and the team to do this. The thing it reminded me of most was my graduate seminar on five pages of Aristotle's Physics. If you were a philosophy geek, it was fascinating. So yes, it was a lot of fun, it was quite engaging, and it kept our attention for a long period of time. Now, for a practice statement: sometimes a statement may be clear, but it may require more context for all users to completely get the point. When being assessed, yes, no, or partial are the possible responses, and if the response is partial, we ask the organization to come up with an example. That usually happens with a great deal of participation. Other statements may require additional information: extensions, implementation tips, what not to do. This comes from the experience base of our authors and peer reviewers, and also covers what is not included in the statement.
Most practice statements have elaborations, and there at the bottom, I'm not going to read it to you, but that is our one sample of a practice statement from the DMM in this presentation. That is Data Quality Strategy, statement 3.3. You can see that the statement is set out there, and then we give you some helpful tips for how to do it. These are the objective criteria against which everybody is assessed. Yes. So the response would be yes, no, or partial: we do something, and here's an example. And the last slide shows the definitional and implementation-oriented characteristics that we try to build into each practice statement. I can also say that we've tried very hard over the years to do a lot of hard work for you, so that the model is easy to use. These types of attributes help everybody come up with the same kinds of assessments, so we literally are comparing apples to apples. And one word about maturity. It has been the experience of our many CMMI partners and appraisers over the past 20 years that although a specific practice may be done well, if it's not embedded in the culture and organizational practices of the organization, it's not well supported and cannot be sustained. So I'm not going to read these, but essentially we have adopted these support practices from CMMI. These are practices that make sure, and make your organization sure, that the capabilities are really embedded in all of your practices. And I have to say the embedding part is hugely important. We had an organizational design conference the weekend before last, where a gentleman described how, as the chairman of a large international organization, he had used this process to make the organization better. It actually achieved a very significant score, and more importantly, the capabilities and practices of that organization were renowned.
And then his successor came into the organization and basically threw it all out, because it had not been institutionalized, and the organization devolved into chaos. So it was really a cautionary example. Now, you've all seen this already, and I'll walk you through the five category areas and the supporting processes that are down in there as well. What the model says here, the framework, is that for data management to be practiced in a way that helps organizations apply their sole non-depletable, non-degrading, durable strategic asset in support of strategy, there are five things it does: data management strategy, data quality, data operations, platform and architecture, and data governance, plus, as Melanie said before, the standard supporting processes that you need to have in order to do this. And you can see on the right-hand side of this chart she's broken these out into categories. What an organization has to do is think about getting better at each of these foundational practices, and the way you get better is to advance up the maturity levels. I've used this slide successfully in a lot of organizations to explain it. At level one, your organizational practices in data management strategy, in data quality, in data operations, in platform and architecture, and in data governance are performed, but informally and ad hoc; you're dependent on heroic efforts. At level two, your processes are managed: they are defined and documented, perhaps performed at the workgroup level or the business unit level. At level three, defined, we start to see that they are standardized and used consistently, and that, of course, is key to implementing something organization-wide. Next, we get to measured. We now start to ask, how well are we doing it? It's not just good enough to do it; how well are we doing it?
And that leads us to the final level, optimized, which asks: can we do it better? So periodically we get a group of people together who look at our existing data management practices and say, can we apply better knowledge, skills, and abilities? Can we rework the processes? Can we change the interaction of some of these things? Anything you want to do to make it better, this is the framework for doing it. And you'll note on the right that this is based on TQM, ISO 9000, and other processes as well. Did you want to add anything to that one slide, Melanie? I think that's an excellent description of the levels. So each process area has five levels in it that essentially form a graduated path to improvement, and each level builds on the one below it. Taken together, they represent a meaningful outline for where one would start with a process area such as data profiling, for example, and where one would end when the capability is fully built out. What we have next for you are the five areas, and there's a unifying statement at the top of each. So, for example, for data management strategy, what we mean is creating, communicating, justifying, and funding a unifying vision for data management, and there are five subcomponents in this case to walk through. Melanie, do you want to highlight some of these? Yes, there are essentially five process areas for data management strategy. Sometimes people ask, why are there five process areas? Because, as we were emphasizing earlier in the presentation, data management has for most organizations over the years been primarily a piecemeal approach, or it's centered around one big implementation like the enterprise data warehouse, or an SOA initiative, or master data management, or some high-priority project.
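The rule that each level builds on the one below can be sketched as follows; the function and the dictionary layout are hypothetical illustrations of the idea, not actual DMM appraisal logic.

```python
# Illustrative sketch: an organization's rating is the highest maturity
# level reached with no gap beneath it, per the five levels named above.
LEVELS = {1: "Performed", 2: "Managed", 3: "Defined",
          4: "Measured", 5: "Optimized"}

def attained_level(satisfied):
    """satisfied maps level number (1-5) -> whether its practices are met.
    Stops at the first unmet level, since each level builds on the last."""
    rating = 0
    for n in range(1, 6):
        if not satisfied.get(n, False):
            break
        rating = n
    return rating

# A gap at level 3 caps the rating at 2, even if level-4 practices exist
rating = attained_level({1: True, 2: True, 3: False, 4: True})  # -> 2
```

This captures why a graduated path matters: isolated higher-level practices do not raise the rating while a lower level remains unmet.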
But we would like to assist the industry in doing a better job of convincing those who need to be convinced, the executives, the business sponsors, the lines of business, that the better way to approach this is top down, carefully thinking it through as an organization. So the data management strategy process area is about the data management strategy itself: what considerations do you need to think about, and how do you put them together in a reasonable document that everyone approves? Communications is vital for the data management function, those performing data management activities, for communicating the progress of the strategy and the program, and also bi-directionally with data governance. The data management function process area addresses the organization, or the pieces of data management functionality as they're dispersed through the organization. A quick anecdote: one of the long-running arguments we had in developing the DMM, a thread that went on for months, was should we call this the data management organization or the data management function? Function won out because we knew that organizations would be unique and different depending on corporate culture and structure. So it refers to the function itself, and if you have a central organization, it refers to that organization. The business case: how do you justify, against your data management strategy and your strategic business goals, what should be funded? It helps you build a better business case for data management initiatives. Funding is essentially how to get funding justification for the program; there are some organizations that do not have core funding for a data management program, and over time that hurts them. So we're trying to help them think about that. I'll add a little bit on that as well.
Just from a funding perspective, if you have five people in your organization, each making $100,000, working in this data management strategy area, it's reasonable for the organization to come back and say, how was that money spent, and what did I get for it? And that's why those last two are so critical to maintaining that business case. That business case will evolve over time, but if you're not practiced at making it, the first time you're asked to do it becomes a real scramble, and you probably won't have the articulation in your message that you'd like to have. Absolutely. So the governance area: the question we get on this is, why do you have these three process areas in data governance? There are various answers to that question, but when I was at a Big Data Analytics conference, I did a presentation on the DMM, and right after my presentation, Informatica did a presentation. And the slide that came out right away said that governance, business glossary, and metadata management are the heart of their suite and their approach. So it looks like we're on the same page with the industry. The business glossary, of course, is part of metadata management. But it is so important to derive a shared meaning, have everyone agree on it, and build that glossary over time that we have pulled it out separately from metadata management per se. Governance, obviously, is how you plan and structure the governance, and how you make sure the governance is active and engaged over time. Data quality is essentially a category of four process areas that together compose a complete 360-degree program for an organization. Almost everyone on the author team over the years has been a data quality fanatic with a lot of experience developing programs, selecting tools, and coming up with audit methods, compliance, and governance. So it was a big area of emphasis for the model.
So the data quality strategy is what the model advocates the organization do first. In fact, we typically recommend that right after you build or modify your data management program strategy, you immediately undertake this, because this is the heart of business value: what are we going to do, how are we doing now, and what are the key subject areas in the business, the key repositories, and the key applications that really need to be improved to get the biggest bang for the buck against our business objectives? Data profiling we're all familiar with: the analysis of data content in data stores. Data quality assessment advocates that the organization eventually move toward a data-steward-based, subject-area-based, business-driven assessment of data quality. Only the business owners and users of the data can determine the targets they need to run their operations and make good decisions, and the thresholds they're willing to tolerate. So this process area treats that in detail. Data cleansing we all do, so this is how to make it more effective, and how not to spend millions and millions of dollars on it in tiny pieces on every project all over the organization. Those of you who are long-time listeners know that I have a favorite phrase: data ROT, which is redundant, obsolete, and trivial data. Clearly, you don't want to spend your hard-earned dollars in this area fixing things that are trivial. So just small decisions like that are very helpful. Notice also how the quality assessment piece, the measuring of impacts and costs, bears directly on that business case. So remember the first diagram, where everything connected to everything else; you're starting to see that interconnectedness here in these definitions. Platform and architecture: this is essentially a business-centric view of those topics in the DMM.
So we do not talk about the technology stack, for example, in any detail. We are emphasizing that the organization implement a collaborative approach to developing the target state, with the appropriate standards, controls, and tool sets. We have run into organizations where there were beautiful target architectures, but they had been developed solely by one group in IT that had not involved the business owners at the beginning, and the architectures had therefore not been completely adopted. You can't get very far with that, because then you are left re-selling your architecture again and again in every business case. The architectural approach process area is how to fix all that, no matter where you currently are in the evolution of your data layer. Architectural standards are the data distribution, data representation, and other standards that will help the data layer evolve in a holistic, orderly manner over time: which ones to implement, how you decide on them, and how you maintain them. The data management platform: we don't really need to say much about that. Don't buy the technology before you verify that it meets all of the business objectives and requirements; this is about joint decisions on these platforms. Integration covers the best practices, and the involvement of data governance, in integrating data from multiple sources into repositories and destinations. We do not get strictly into the technical realm of ETL best practices, but we refer to them and recommend them. Historical data, archiving, and retention, we know, are separate from each other, and yet they overlap, so we treat them all in one process area. And you would be amazed at the number of organizations that have no archiving strategy. I can't tell you how many organizations have said to me, we just keep everything. Whereas in many cases, if you just keep everything and you don't archive, you're affecting your performance.
So there's lots of information like that in that process area. Just a quick note on the data management platform: earlier data strategy work actually concentrated quite a bit on questions like, how many different data management platforms are you going to support, and can you cut those down, increase them, or do something else? This provides a basis for making those decisions. In the past it would be, well, I just want to get rid of platform X because I don't like it anymore, which is not really a rigorous way of following through on these things. So again, we're back to the science, the objectivity of this, and that's one of the things that's very exciting. So, data operations. The term data operations, of course, can cover a lot more topics; these are the three where business involvement and decisions are critical, and that the collective team felt were not very well done in the industry at large. Data requirements definition is how you develop, validate, and prioritize requirements for data, both at the project and the program level. The data life cycle is about the criticality of mapping data to business processes as the data flows through them. Not having that map over time will put huge obstacles in your way for creating end-to-end data lineage, which you absolutely need for your reporting and your regulators and so on. Provider management is about standardizing the sourcing process and having SLAs, both internally and externally, to ensure that data is provided at high quality from any of your sources. And finally, we get to the supporting processes; these are the ones that are common across most of the CMMI frameworks. Yes, so I'm going to make a slight digression here and do a little pitch for measurement and analysis. So these are the process areas.
If you're doing all of these well, they help you make the processes real: the rubber meets the road, it's embedded, everybody is adopting it, everybody accepts the compliance scheme that the organization has developed against it. The measurement and analysis process area has many helpful tips, by the way, because that one is very fully built out, with 20 years of CMMI experience and a lot more information content than some of the others. And most organizations don't have sufficient metrics in place for the program; I think that's just a fact. So my quick pitch is: why should you take metrics seriously? Because everyone loves metrics: the data professionals, the business sponsors, and especially your executives. They can ask, can you show me exactly where we are? How are we doing? What is our progress? Is this discipline working well? How much data have we cleaned this month? And we've seen that often there aren't a lot of metrics out there, or there are beautiful metrics in only a tiny pocket of the organization. Metrics ensure engagement from the lines of business, and they engender the support of your data stewards. They prove your achievements, they highlight your benefits, and they offer great support for your funding business cases. And one more quick tour: how do you do metrics? Envision the target state. Determine what you need to measure and count in order to achieve that target state. Look at the sources where you can get that information. Build the metrics, start using them, start publishing them, and modify them over time until they are maximally useful. Okay, Peter. So as we're looking at these things, what Melanie has just done is take you through the entire structure and show you the legs, if you will, that each of these pillars needs to rely on. What you don't see is a prescription for how you should do it. It asks, what are you doing? And that is very important, because it gets organizations thinking in terms that are important to them.
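As one illustration of the metrics loop just described (envision, measure, source, build, refine), here is a minimal sketch of tracking a single program metric such as records cleaned per month; the `MetricTracker` class and its method names are invented for this example, not part of any DMM tooling.

```python
from collections import defaultdict

# Minimal sketch of a program metric of the kind described above:
# "how much data have we cleaned this month?" All names are illustrative.
class MetricTracker:
    def __init__(self, name):
        self.name = name
        self.by_month = defaultdict(int)  # month -> accumulated count

    def record(self, month, count):
        """Add one measurement, e.g. records cleaned in a cleansing run."""
        self.by_month[month] += count

    def report(self, month):
        """A one-line summary sponsors and executives can read at a glance."""
        return f"{self.name} for {month}: {self.by_month[month]}"

cleaned = MetricTracker("records cleaned")
cleaned.record("2014-06", 1200)
cleaned.record("2014-06", 800)
```

Even something this small answers the executive questions above: where are we, and is the discipline working?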
It's not somebody external specifying what should be important to you, because every organization is different. That's one of the fun things about working with data: everybody's got their own spin, their own way of monetizing, if you will, the data around this. Also notice she's color-coded them, making it really easy for executives to follow. I'm sorry, that was a terrible thing to say. We do have to keep them interested in this, because there are some dollars at the bottom of it. This is a multi-billion-dollar industry, and we've got many organizations that spend a lot of money on it. So how do you use this framework, now that you sort of understand what it is and where it's going? The answer is pretty straightforward. What I'd love for you to do is start thinking about a number of different ways to use it. We designed the DMM to be an educational tool. It contains, as we've said, and as Peter illustrated with the flow of the levels, a graduated path for improvement. It essentially focuses your collective wisdom to help guide practical action and deliver real benefits. It tells you what to implement, not how, so any organization can use it. You can be a large retailer, a scientific organization, a federal agency, or concerned with national security: the fundamental data management disciplines are universal. If you have data assets, you need this. So it allows you to perform a very thorough and efficient gap analysis very, very quickly. You can see what needs strengthening, and what you have that's great, that was buried somewhere in the organization, that you can now extend and utilize, and of course get business stakeholder support for that. It uncovers capabilities that sometimes you don't know you had until you start looking.
And of course, it builds the support for funding a program, advocating for the program, and helping you develop things like a business glossary. So measurement equals confidence. As I said in our little riff on metrics, the evidence-based evaluation proves to everyone, each person as well as their managers and executives, just how well the program is doing and where it needs shoring up. It allows you to gauge your achievement over time and, as the model gains currency, against your peers. And it fuels a lot of enthusiasm for moving on to the next step. It also helps an organization enhance its reputation. Let's say you're in a highly regulated industry. You can tell the regulator: we've taken this very seriously, we've done a detailed assessment against it, and here is our strategy and sequence plan to improve. With those components, on the left-hand side here, are the data management practice areas: again, data management strategy, data quality, data governance, platform and architecture, and data operations. We marry them with the CMMI levels, and we can now start to put together some really interesting stories. For example, here's one that we did in the insurance industry. I have a database of a number of these that I've benchmarked over the years. You can see the scale across the X-axis there, and on the Y-axis we've got the five practice areas. And unfortunately for this particular group, when we assessed them at this point in time, they were not really even a solid level-two organization. So from an industry perspective it tells us, hmm, maybe as an industry this would be something really worthwhile for us to collectively work on. And it sounds kind of funny, because Melanie and I both end up in various engagements, and people will say, well, you can't really have these two banks in the same room; that would be collusion.
And you say, well, it's not really collusion if you're collaborating at the metadata level. And it actually ends up being very, very useful. Here's another example of the same kind of thing. This is one that we did for an airline recently. We put the results up and said, you're a one, a one, two twos, and a one. And the airline people look at you and say, I don't know what that means or why I should care. And I say, well, here's your competition. All of a sudden I've got their attention, because all of a sudden they realize they're the ones and their competition is the twos. So what this allows us to do is benchmark and say, how are you doing relative to everybody else? And then, where would it make sense to put specific dollars and cents into enhancing the ones and getting them up to twos, should that become part of the organization's strategy? Well, how do we navigate through this big tome? This is very dense material, and we packed it full of the most useful information that we could collectively think of to help you use it. However, and it seems paradoxical, in action it really comes alive. When you take it to an organization and involve all the key stakeholders, it generates a tremendous amount of passion, energy, and liveliness. For example, in one assessment we did, the very last process area was configuration management. By that time, five or more people had joined the group because their friends had told them it was exciting. And if you think configuration management is exciting... We didn't think it would take us more than 20 minutes to derive the score and gather the examples for that particular process area. In fact, we went all the way down to the wire because people were so passionate about it. So that's what we run into, and that's good. This big tome can help you get some real action.
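The airline benchmark described here, a one, a one, two twos, and a one against competitors at twos, amounts to a simple comparison per practice area; the sketch below is a hypothetical illustration of that comparison, with the area names taken from the model and the function invented for the example.

```python
# Hypothetical benchmark comparison: flag practice areas where an
# organization trails its competition, as in the airline example.
PRACTICE_AREAS = ["Data Management Strategy", "Data Quality",
                  "Data Operations", "Platform & Architecture",
                  "Data Governance"]

def improvement_targets(ours, peers):
    """Return (area, our level, peer level) wherever we score lower."""
    return [(area, a, b)
            for area, a, b in zip(PRACTICE_AREAS, ours, peers)
            if a < b]

# "A one, a one, two twos, and a one" against competitors mostly at two
gaps = improvement_targets([1, 1, 2, 2, 1], [2, 2, 2, 2, 2])
```

The output is exactly the list of areas where it might make sense to put specific dollars and cents into getting the ones up to twos.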
So the reason I'm describing the method we use is that it's the method we will teach, train, and certify against, since it has proven to work in a very short period of time. It's a launch collaboration event: we put everyone in a room, and it's a big-tent approach. We use consensus as the means to determine the score for a particular process area. Everybody has a voice, and we also use supplemental interviews as needed, if there are major initiatives where we need additional context. The evaluation is supported by detailed work product reviews, so that's all the evidence that what you are doing has taken tangible form and can be reused. The final report at the end covers how well you scored, what your gaps were, how you can remediate those gaps and leverage your strengths, and a whole lot of synthesized recommendations for what is best for your organization to do next. And in the near future, next year, we will introduce an option with audit-level rigor that can serve as a formal benchmark of maturity; it will leverage the CMMI appraisal method. This is the picture that you get from doing this kind of thing. It's a heat map showing, again, the levels in the various areas based on objective reporting, meaning two people who've been trained in the same method will come up with the same results, which is something we simply haven't had the opportunity to do before. The red line in the middle of this Kiviat diagram, or spider chart, represents doing a great job, which is attaining level three in all of the practice areas. The DMM intends for all organizations to assess this for themselves and try to achieve level three, because if you have done that, then you have a really effective set of processes, they are institutionalized across the organization, and you should be working harmoniously with your business users and all the good things at level three.
If you need to achieve star-quality excellence for competitive advantage or any other pressing business driver, you can aim beyond that, and we've provided a path for that in levels four and five. Finally, this is the cumulative range of scores among the five organizations that have piloted this model. It's not really a formal benchmark, because the scope varied: one organization used half the process areas; another was a fairly narrow scope for one business area. But it shows you, just looking at this, across five different organizations, the range of capabilities that they have, from strong all the way down to hardly having the capability at all. The lowest score you can get is a one, and we have some ones in there. Another really good activity that's been done as part of this is to crosswalk the model with the DAMA DMBOK. Some of you are familiar with the Data Management Body of Knowledge that we put together, and again, there's ongoing cooperation with this and other industry groups. So here's a sales slide, if you will, if you're trying to figure out how to sell this. These are the functional baselines that the DMM provides for you, and together they constitute a great acceleration path for your program. So look at this; there are a lot of things it can help with in terms of the profession. We're real close to the top of the hour, so I'm going to move quickly through these, because one of my favorite statements of all time is something that George Box said, which is essentially that all models are wrong and some are useful. I'm not going to say that this one is wrong; I'm going to say this is actually quite useful, but it's certainly imperfect. And one of the things we need to do is help you all get together and start to improve it. There's a formal process for creating improvements to the DMM for version 2.
I don't think it's quite underway yet, but you're certainly going to be prepared to do the next iteration. You're not thinking that version 1 is going to have everything, right, Melanie? Absolutely. We are very, very eager and willing for industry involvement to help us decide what the next level will be for the Data Management Maturity Model. Again, there's the training that you're putting together that will start, hopefully this fall, where we'll have the first batch of trainers able to go out and do this process; they will be certified in this. There is a certification program that you're looking to put in place? Yes, we have qualifications, and our aim is not only to make sure that anyone coming through the program is fully conversant with the model and able to lead an organization in assessing and implementing process improvements, but also that the industry itself, all the end-user organizations, are very happy with the results. And finally, the partner program: there are a number of different ways that you all can get involved, and we do have a slide here for you at the very end, just to make sure everybody sees it, that says how to get in touch with Melanie to do some more work in this area. So that's her contact information. We're at the top of the hour. I want to remind you all that we have, of course, a monthly series on this, and next month we're going to be diving back into data governance. And while you all are preparing some hopefully good, hard questions for us, we'll get it all set up here. Megan is back online with us. She had a technical malfunction, but we should be all set and ready to go, right, Megan? Yes, thanks for that. Thank you, Melanie. That was a great presentation. Now it's time for Q&A, time for you all to ask your questions. Just click on the Q&A feature at the top of your screen; you should be able to submit your questions through that Q&A window.
We've already had a bunch roll in, so we're going to go ahead and get started. Let's see here. The first one is: is it true to say that the model covers only traditional BI, and not so much the newer world of ingesting varied data, where data quality cannot be driven the way it is in traditional BI? That was a mouthful; I can repeat it. That's a great question. Melanie, do you want me to take it, or do you want to jump in? I'll say a few words, and then I'll turn it over to you. For a long time now I've been like Diogenes with the lamp, looking for the honest man. I've been talking to a lot of big data experts for years, I have reviewed the model with them, and I've asked: please tell us if there are fundamental data management practices we have not addressed that are essential for big data success. And so far, everyone has said, no, this is the prerequisite for what we do with our technology. So I've been looking and looking. I would absolutely agree with that, and I think the question is an excellent one. There are some things happening out there in data management that are very, very interesting, that are going to fundamentally change things in this set of practices. But when I look at this, and again, we looked at a precursor of this as well, there's nothing in here that precludes you. I think it would be an error, you would do yourself a disservice, if you said, well, I'm really focusing on the ingestion process, and therefore I don't need this, or it cannot help me; I think that's just simply wrong. Again, you're not going to go out and ingest a bunch of data without at least having an idea of what strategy you're pursuing. What is the business purpose for which you're ingesting this big data? I've seen a lot of organizations spend a lot of money on big data techniques, but all of them have had a purpose.
We're doing this because... And the other part of it that we see, which is really almost hilarious, is that we've got people out there saying, well, I need a big data strategy, and we say, great, give us your data strategy, and they say, well, I haven't got a data strategy yet. And we say, well, big data is part of data, so it might be helpful to have a data strategy as well as a big data strategy. So, great question. I hope that answered it. If you have more questions, by all means push back on us and let us know. Megan, next question. All right. The next question is: why another maturity model? What makes it different from the other models like MIKE2.0, Gartner's, et cetera? So, fair question. Melanie, maybe I should let you address that one first; I've got some thoughts on it, obviously. Please go ahead, and I'll chime in. The models that have been out there, and a couple of them were mentioned, certainly are good efforts; they are generally ways in which you can describe things. I've actually got a chart I put together a couple of years ago, based on some work that John Ladley had done earlier, that shows how the various models fit together. But they tend to describe overt behaviors and are much more prescriptive in nature. The thing that's been so successful about CMMI is the hard work, and Melanie labored over it, I can't tell you how many hours they spent arguing about the slightest word change, to come up with objective criteria. And this is what's lacking in the other models. They tell you what you should see: you should be able to walk around the organization and see that you're using analytics very well, right? But what does "very well" mean, and what does "analytics" mean, based on that set of statements? So no issues with the other models that are out there.
But I do think that with the CMMI practice areas, we've seen so much success, such widespread adoption. And what people tend to do is come up to a maturity model, and when we say it's CMMI-based, they say, oh, good, we're already doing some portion of that. So they look at it as a delta rather than a brand-new approach. So it's a combination of market force and the careful work that's gone on before in building this next instance, if you will, of the CMMI. And I'm sure there will be other versions of CMMI that continue to come out in the future as well. Thank you. Oh, sorry. That's all right. I didn't say anything. No, that's fine. I think what he said. Well, all right, good. I don't have anything to add. That was wonderful. Okay, so the question is: for governance, how do you address accountability for specific datasets and domains? So I'm going to bring up the governance section. Melanie, over to you. Yes. Well, actually, accountability is the child of ownership. The DMM does not delve into the details of exactly how the governance roles shall be assigned, whereas the DAMA DMBOK, for example, has a very good basic structure that it explains very well: there should be an executive data governance body, there should be a data stewards body, and it defines the roles of data custodians. The DMM does not get to a specific organization or structure; you will not go to the DMM alone to determine exactly how you should set up your data governance program. What the DMM assesses is whether or not the governance activities are active, whether they involve all stakeholders, and whether they are making decisions about all the key factors. On governance, for example, there are at least one or two practice statements in every single process area that specify the involvement of governance. So what we try to do is cover the broad landscape of everything governance needs to do.
Excellent question there. And if you look at the broad category description: active organization-wide participation in key initiatives and critical decisions that are essential to maintain data assets. So this is the general framework, and these are the three areas underneath it that are important. However, unlike some of the other maturity models out there, this one is going to have a group of people formally charged with maintaining it. And we may discover in five years that we need to expand this section and make it more specific, you know, person X is in charge of dataset Z, with more specificity around it, if we don't see scores in this area improving. Yes. And we know how challenging data governance is, because once you have derived the manner in which you're going to tackle data governance across all of these different disciplines, your challenge is to engage people, keep people interested, keep people on task, and keep people engaged in the achievements that you're trying to deliver. For example, a fully populated metadata management repository: that's a huge effort, it takes a long period of time, and you need sustained involvement. So we're not saying that data governance is not challenging, and we know that the industry has emphasized it greatly, for good reasons. And we think it is not necessarily mature, in that not every organization is sponsoring and doing exactly what it needs to do with data governance, but we think it's coming along very well in the industry. So this is a way to check how well you are doing with it for your organization. Megan, next question. The next question is: where can you get a copy of the DMM structure, and where can you go if you have an interpretation question? Okay. The DMM structure is contained in high-level form in this full deck, in our white paper, which is on the DMM website at the address that you were given, and in the DMM itself.
It's also available on the website, and you'll get a copy of this deck. Go ahead. No, let's go on to the next question. How do you determine what maturity level is appropriate for the organization? Can you provide an example if possible? Really good. Go ahead. It really does depend, of course, on your strategy from an organizational perspective. More and more organizations are relying on data in order to implement their strategy, so we tend to see a lot of organizations that are really focused on this. But it's absolutely a practical question. One of the customers we've worked with over the years is a heating and air conditioning vendor. Their idea of IT is: let's make sure the rest of the organization can continue to function smoothly. So it's not a competitive strategic advantage to them; for them, it's an enabler of their existing business practices. Their goals and objectives around data management are going to be considerably different from the goals and objectives of a dot-com organization or something along those lines. I'm not going to elaborate further; that's a little bit different a topic. In terms of who makes the decision of what maturity level is appropriate, the organization makes the decision. But by using the DMM, the organization and all the key decision makers can come to a really well-rounded point of view on what they feel is important, because all you need to do is essentially take the assessment results and compare them to your current business objectives. The importance of one discipline over another, or of spending more money on data requirements or data quality this year, will become apparent very rapidly. So a lot of initiatives, perspectives, and decisions flow out of this, frequently like ripe fruit. Because somebody already knows the pieces, this helps them put the pieces together and, again, make the business case for it.
So in one of the chapters in the monetizing book, a colleague, Linda Bevello in St. Louis, describes a project she was working on. They came to the end of the project from a schedule perspective, and she said, look, the project's not done until I've got a critical mass of data in the system, so as far as I'm concerned, you're not done. By going back and increasing the data in the system, the coverage of the data, which was 40% originally, went up to somewhere close to double that. It was tangible, and she was able to measure tens of millions of dollars of direct business value accruing from the fact that they used their data strategy to improve the measures they were using around delivery of IT systems, in particular the data component of it. There's more I could get into on this one, but certainly there's a chapter of the book that you can look at. And again, Melanie, I know you've run into a bunch of examples as well. The question is, how do you determine the value of different maturity stages? How can we assess the value at different stages using the maturity model? So I'm going to go right back to your strategy. In other words, if you're looking at a framework here, such as the one I put together earlier, and I'll put the results up here again, you're looking at this and saying, okay, where should we be? None of it matters if you don't know what your strategy relative to data is. I can't emphasize this enough. If you go to your organization and ask your Comptroller or your CFO what their strategy around cash management is, they'll be able to tell you very clearly: we're a retail business, things dry up around the middle of the summer, and we have lots of cash around the holiday season. So obviously what we want to do is conserve cash so that we have enough money to buy the inventory we need to sell all the goods and services we'd like to around Christmas time.
That's a very basic strategy for somebody in that particular business. Your data strategy has to be at a similar level of development. And if you go with an oversimplified data strategy that just says all of your data has got to be perfect, well, you've just signed the Consultants' Guaranteed Employment Act of 2014. It's just not reasonable to expect that. So we have to place this in context. Yes, if you take too broad an approach immediately, you know, try to boil the entire ocean in one big program and run at it with hundreds of people like a land rush, you're guaranteed to spend your time and money and probably not have a lot of success. So the data management strategy should start with the business strategy of the organization and the current business objectives, because those will tell you what the priorities are. If you're having trouble figuring that out, go back and revisit your data strategy, or at least see if there's a component of your IT strategy that addresses your sole non-depletable, non-degrading, durable strategic asset. All right. And the next question is, where does the DMM cover the actions taken to analyze and design data structures? I wasn't expecting it to talk about types of models, notation, et cetera, but data modeling and design appears to not be mentioned at all. That's an excellent question. Yes, data modeling and design is not the subject of a process area in and of itself. We've mentioned it all over the place in our descriptions and elaborations. We think it's an absolutely essential skill for the data management program and everything it produces. And many of us on the author team have spent more years doing data modeling than we would admit to, because we'd be worried you would put us in a cube forever.
So no, we're not underestimating the importance of modeling at all, but we think that data design is really mature in the industry, and that the organizations we encounter either have a data modeling center of excellence or have high expectations for the skills and experience of the data modelers and designers in the organization. Could it be better? Could it be more across the board? Yes. But we feel that it is a fully mature skill, and if we went back and named everything out there, we'd have to get into naming other things as well. So let's just take a brief walk through, though. From a strategy and funding perspective, you've got to make sure you have the dollars to attract the type of talent you need into your organization. I'm going to point out just one item in each of these areas. From a governance perspective, you might say no IT project should proceed unless we have a good understanding of the data requirements of the system. From a quality perspective, you might say, look, modelers, if you don't normalize to third normal form during your analysis, you won't have enough understanding of your data to make the required trade-off analyses going forward. From a platform and architecture perspective, you might want to say something as specific as, Oracle's got a great tool for doing this. Right? And from an operations perspective, you might want to say, look, this data lifecycle we're looking at has a conflict with the way we're developing our system, so maybe we'd better come back and take a strategic, supporting perspective. So while the design process is not actually called out in there, we're making the assumption that people really do this. I hope nobody takes this away and says, ah, look, they didn't see data modeling or normalization as important, therefore it's not part of the data management maturity process.
No, that's not what we're saying at all. It's actually embedded everywhere, because one of the things we emphasized was engaging the business as much as possible, so the process areas we selected were designed to be the ones where the business has a very strong voice, should have a very strong voice, and should be well represented. That does not mean we're ignoring where the handoff and synergy with IT occurs, because, of course, like you, many of us came up through IT. Fond memories, right? All right. The next question is, where can classes and exam sites for DMM certification be located? Well, they could be located anywhere, but where are they going to be? Currently, our upcoming classes will be taught in Pittsburgh, which is the CMMI Institute home office. And our first public course for the DMM intro is September 22nd through 24th. There'll be an announcement on our website very shortly so you can register for that course if you're interested. That's the first public announcement of it, isn't it? Yes. Good. Anything to add, Melanie? Not me. Yeah, me neither. Okay. And the next question is, how does the duration of each maturity step depend on company size? Could you say each phase takes some amount of time, say one year? How do you estimate for a customer how long each phase will take? That's an excellent question also. And the answer, and I don't mean to be copping out on this, the genuine answer is that it absolutely depends on the organization. There are some process areas that are longer-term items than others. For example, the data management lifecycle, where you're mapping business processes to data: you can't do that in a month unless you want a thousand people working 24 hours a day under whips and guns. That's something that needs to be done in phases.
So depending on the scope, something like that can be chunked out based on priorities, for example, because there's going to be a big consolidation of a number of operational systems, or a new enterprise data warehouse. That will determine the scoping. So there's no way to say a priori how long a step will take. And certain steps can be done very quickly. For example, you can pretty quickly develop metrics for how you're going to assess the effectiveness of your data cleansing. If you want to do it, put the bodies in the room, follow the primer, and you can produce them and start using them. And let's get a little more specific on this. For example, an organization may have come up with this as a baseline. Remember, this is a hypothetical organization that you put together as part of the training. But let's just assume this was your organization, and let's make the further assumption that you were a bank and that you decided that level three in all the capabilities would also make you comply with Basel III. By the way, it doesn't, so don't take that away. But let's just assume you were working toward a strategic goal along that line. You can look around this chart and see that in some areas you're exceeding three and in other areas you're not. So it does give you a very good way of finding and shaping the problem and saying, we want to take all of the things that are not a three and make them into a three, and therefore we'll be compliant with Basel... I'll make it six, because we don't know what Basel VI is going to look like. So please, nobody misunderstand and say that reaching maturity level three will get you to Basel III compliance; that's not at all what I'm saying. But a framework is a way of organizing ideas so that you can start to assess your progress against them, and it's very, very important to have that. Otherwise, any road will take you there, because you don't know where you're going.
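The gap-shaping exercise just described, compare each capability area's score against a target level and flag the shortfalls, can be sketched in a few lines of Python. The process-area names and scores below are invented for illustration; they are not from the DMM or the hypothetical training organization.

```python
# Minimal sketch: find which process areas fall short of a target
# capability level (e.g., level 3). Scores are hypothetical.
TARGET_LEVEL = 3

scores = {
    "Data Management Strategy": 3.2,
    "Data Governance": 2.4,
    "Data Quality": 2.1,
    "Data Operations": 3.0,
    "Platform & Architecture": 3.5,
    "Supporting Processes": 2.8,
}

# Areas below target, with the size of each gap.
gaps = {area: TARGET_LEVEL - level
        for area, level in scores.items()
        if level < TARGET_LEVEL}

# Report the biggest shortfalls first, to help shape priorities.
for area, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{area}: {gap:.1f} below target")
```

The output is simply the "make everything that is not a three into a three" work list described above, ordered by shortfall.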
Yes, and what we've tried to provide is a framework for what's good. So if an organization has its own business imperatives, either internal or external, then it's going to emphasize the areas tied to those imperatives first. However, if you don't, but you know that the overall program needs to be strengthened and there's no key burning driver, then you can use the model as a path to effectiveness for other purposes: cost savings, greater efficiency, harmony and smooth interaction between IT and the business, or data governance. So you can use it as a guide. Again, sometimes it's as simple as propping up a chart like this on the wall for senior management and saying, you know, if we're going to beat the competition, we'd better be paying attention to some of these issues. Okay, and we have a lot of questions coming in, so we'll see how many we can cover. If we don't get to all the questions, they will be in the follow-up email, so don't fret. Let's see here. The next question is, how much effort, and from whom, is it envisioned to take for an organization to complete its own DMM score? Okay, so you're going to take the model and do it internally. There are many ways to do that. You can put a few knowledgeable people in a room for a number of hours, maybe half a day to a day, and look through the questions that appear in the introductory front matter before the practices. Those questions are designed to elicit whether or not the organization is roughly level-three good with respect to the practice areas. So you can do a quick assessment simply by going process area by process area, working through the questions and evaluating your answers. You could write a quick report from that and get a very good seat-of-the-pants picture, as long as you had in the room those few people who really knew about these business areas and these processes. So what we recommend is: certainly do a self-assessment by whatever means you want. You can do it by a survey.
The problem with the survey, however, is that people will ask you for interpretations, and that's best handled in person. So the survey has benefits and deficits, but it can give you a good, rough estimate. Those are the methods you can use. Another thing we help organizations with is this: sometimes it's easier to say what you're not rather than what you are. So one of the things we do is talk about what you would be if you were a Level 5 organization. Remember, a Level 5 organization is an optimizing organization: it's looking at its existing practices and saying, how can we improve those practices for the overall performance of the organization? So we start off with these groups, as Melanie said, gathered in a room for a little bit, and we ask, what would it mean to be optimized? Well, it would mean that this person, this person, and this person get together periodically and ask, how can we improve the overall performance of the data management practices in this organization? Does that occur? No, it doesn't. Therefore, you're not a 5. All right, are we in fact measuring anything around our data management performance? For example, do we measure the number of data models that are created? Do we measure the quality of those data models? Oh, we're not measuring anything? Well, then you're not a 4. And you see how you can back into it? It's not so much that you're precisely a 2.2, but if you think it's important to be a 4 or a 5 and you're clearly not, that's actually enough information to get you started. And the next question is, could you help me understand how to map our "hows" to the model's "whats," and the gaps between how things are and the ideal of how they need to be? Do you get it, Peter? I think so.
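The "backing into your level" reasoning above, eliminate levels from the top down by answering a few yes/no questions, can be sketched as a small decision function. The questions are paraphrased from the discussion, not taken from the DMM's actual practice statements, so treat this purely as an illustration of the elimination logic.

```python
# Sketch of top-down elimination: rule out maturity levels until one fits.
def rough_level(optimizing: bool, measuring: bool,
                standardized: bool, managed: bool) -> int:
    """Return a rough ceiling on maturity level.

    optimizing   -- practices are periodically reviewed and improved
    measuring    -- data management performance is actually measured
    standardized -- organization-wide standard processes exist
    managed      -- processes are planned and managed at project level
    """
    if optimizing:
        return 5
    if measuring:
        return 4
    if standardized:
        return 3
    if managed:
        return 2
    return 1  # performed ad hoc

# The example from the discussion: no improvement loop, no measurement,
# but standard processes exist, so the organization tops out around 3.
print(rough_level(optimizing=False, measuring=False,
                  standardized=True, managed=True))  # prints 3
```

As the discussion notes, the point is not a precise score like 2.2; knowing you are clearly not a 4 or a 5 is already enough to get started.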
So, Melanie, I think the question is: okay, you're very good at telling us what we need to do; how do we go about mapping our hows to your whats? I think it's a legitimate question, and it's also one of the things we stress: organizations are going to be individualistic in this sense. The needs of two banks in the same competitive market may differ, their strategies may differ, and consequently their hows may be different. But can we offer any guidance around specifying what their hows should be? Every time we engage with an organization in an assessment, it becomes a collaborative, interpretive seminar, basically, with all the participants considering what each practice statement means in their context. So when we're working with them, facilitating essentially, we are allowing the discussion to happen, trying to bring it to some conclusion, and getting consensus on that. That's how we work with it in an assessment at this point to get that interpretation. And of course, some engagements are unusual; for example, one of our assessments was for one function across 12 different verticals. When we got to the analysis of work products, the work products were not titled the way they would be in a typical organization looking at its entire enterprise. We had to analyze the content of each work product, see how it met the capability, and work with our point of contact to jointly determine the applicability. So every organization is unique. Okay, it's real clear that one of the things that would be useful, and maybe we can look at this going forward, is at least an example of an implementation at some point, a little narrative or a case study. So we can certainly take that away as a TBD. Maybe somebody would like to volunteer. Again, we're out here saying we can't do it by ourselves, and we certainly don't intend to.
Time for one more, Megan. Yes, we can get one more in. How is this connected to information management? Where does the scope of the DMM start and where does it end? I will argue strongly that they are intertwined. If you're managing information separately from your data, I think you're introducing so much complexity into your process that it's certainly not worth the effort. We consider data to be the fundamental building block for information, and consequently, you know, this really does apply to information management, but at the most basic level, it is data management. I know this is a very short answer. Melanie, I don't know if you want to add anything more to that or not. I want to take a slight sideways step from that. One of my ambitions, and we have a great deal of interest in this, is to add a process area over time to the DMM dealing with unstructured data: not from the standpoint of any one particular technology, but from the standpoint of how an organization gets its arms around what it could do if it could harness the power of the 80% of its data that's unstructured, what business and technology decisions need to be made, and how they can be planned to give the organization a clearer path. I think that's a missing area in our industry, and we would love to address it in the future. Good. This has been an introduction and overview of this process. We're starting the journey. The actual DMM was only released on the 12th, so we are just starting; it's a few days old now, a newborn. Again, we encourage you to reach out to us if you have other interests in this, want to volunteer for some aspect, or want to point out things we haven't caught in the first version. I think it's going to be very exciting; we're going to be able to take some very good, concrete steps as a profession. So, Shannon, thanks as always for hosting us. Megan, thanks for organizing. Thank you so much.
And Melanie, it's been a real pleasure to work with you on this, and I'm so excited that we're actually able to bring this out together. I'm so excited that Data Blueprint was our number one first partner. Oh, thanks for that. All right, and we'll see you all in September. Thanks, Peter and Melanie, for joining us this month. It was a great topic, and I'm really excited about the work you're doing. And as mentioned, we're going to be publishing more information and education on it on dataversity.net. Thanks as always to our attendees, who are so fabulous, with great questions and so interactive. We always appreciate that very much, and we hope everyone has a great day.