Hello and welcome, my name is Shannon Kempe and I'm the Chief Digital Manager for DATAVERSITY. We'd like to thank you for joining today's DATAVERSITY webinar, Implementing the Data Management Maturity Model: In, Out, and Shake It All About. It's the latest installment in a monthly series called Data-Ed Online with Dr. Peter Aiken, brought to you in partnership with Data Blueprint. Just a couple of points to get us started. Due to the large number of people that attend these sessions, you will be muted during the webinar. If you'd like to chat with us or with each other, we certainly encourage you to do so. Just click the chat icon in the upper right-hand corner for that feature. For questions, we'll be collecting them via the Q&A in the bottom right-hand corner of your screen. Or if you'd like to tweet, we encourage you to share highlights and questions via Twitter using hashtag DataEd. To answer the most commonly asked questions, as always, we will send a follow-up email to all registrants within two business days containing links to the slides. And yes, we are recording, and we'll likewise send a link to the recording of this session as well as any additional information requested throughout the webinar. Now let me introduce to you our speaker for today. Peter Aiken is an internationally recognized data management thought leader. Many of you already know him or have seen him at conferences worldwide. He has more than 30 years of experience and has received many awards for his outstanding contributions to the profession. Peter is also the founding director of Data Blueprint. He has written dozens of articles and eight books; the most recent is Monetizing Data Management. Peter has experience with more than 500 data management practices in 20 countries and is consistently named a top data management expert. Some of the most important and largest organizations in the world have sought out his and Data Blueprint's expertise.
Peter has spent multi-year immersions with groups as diverse as the U.S. Department of Defense, Deutsche Bank, Nokia, Wells Fargo, the Commonwealth of Virginia, and Walmart. He often appears at conferences and is constantly traveling. Let me turn it over to Peter to get today's webinar started, tell us where he is today, and introduce our guest speakers. Thanks, Shannon. Absolutely. Glad to say hi to everybody this morning. A couple of special guests. Some of you may remember Melanie Mecca or have seen her out doing really interesting work. She also has 30 years of experience; we are kindred data spirits here in this process. And she's been working on this particular topic, which is why it's so appropriate for her to join us today. Also Jeff Walcove, who is the data governance architect in the Arizona Strategic Enterprise Technology Office. The key there is to understand that Arizona is trying to manage a lot of things statewide and figure out which things should be managed locally and which things should be managed centrally. So what we've got is an interesting program for you on the data management maturity process, and Jeff is going to give us a little bit of his experience. He'll also be speaking at an upcoming conference that we'll both be at (I don't know whether Melanie's going to join us at this one or not), the DGIQ conference in the first part of June in San Diego. You can hear Jeff do a much more complete version there of what he's going to give us here. So thanks, everybody, for joining us. Let's dive right into the material. We're really talking about how to achieve best practices here. I want to start off with a quote from my book, which is hopefully coming out at the end of this month; this is an actual quote from the book. We've referenced the DMM several times so far.
Now we're going to provide some context to it. All improvement efforts begin with an obligatory assessment process. However, the one that we're going to present to you today is the only proven framework that has literally the benefit of decades of practice and benchmarking data behind it. Organizations that don't use it risk an inability to meaningfully compare results against other organizations and, as a result, adopt unproven methods. So that's really what we want you to think about as we do this. What we're going to talk about today in particular is a little bit of motivation. We'll go briefly through how we got here. Whoops, I hit the slide too fast. There we go. We're going to talk about how we got here in terms of where the research came from; as I mentioned before, Melanie's been instrumental in this. And then we'll dive in and talk about specifically what the data management maturity model is. Many of you have heard about CMM and CMMI. Perhaps more importantly, though, your bosses have very likely heard of CMM and CMMI. And then, of course, the part that everybody's really interested in: how should it be used? We'll turn it over to Jeff and have him work a little bit on that next. So let's get started and dive right in on this particular piece. That was interesting; I went too far again. I'm just terrible with my slides today. Go ahead, Melanie, whenever you're ready. All right. So the data management maturity model is a reference model of best practices that have been tested and proven over the last several decades, and it helps you achieve solid capability improvements. The model is architecture and technology neutral, so it can be applied in any industry, and I'll give examples of that later, and so will Jeff. It is industry independent.
And if you have lots of mainframe COBOL systems that are still in operation, or you have a completely contemporary platform, you are still managing data according to the management practices that the model addresses. Also, if you have all your data in the cloud, you're still managing your data. So it will apply to any situation. It tells you exactly where you are in terms of your capability state at the current time. That gives you a snapshot. And easily falling out of that snapshot is: gee, we need to fix that; look, we have something good here we can use; and this is what we should do next. So it makes the process quicker and easier. You don't have to do a one-year study with a cast of thousands to get to these answers. It helps you to manage data as a critical asset, create a succinct data management strategy that aligns with the business strategy, accelerate your overall program, typically expand the number of people dedicated to data management, improve governance, and of course come up with some high-value initiatives that give you the tactical responses you need right now, as well as place you in a good position for strategic growth. Most people aren't really aware that data is an awful lot like Maslow's hierarchy of needs. At the bottom level, you have food, clothing, and shelter; those are your physiological needs. If your food, clothing, and shelter needs are unmet, then it is unlikely that you will ever be safe. If you're never safe, you cannot love. If you do not have any love and belonging, or you're not attached to a family or a group of some sort, then you have no ability to get to esteem. And of course, esteem is the last step on the way to self-actualization. We learned this in high school. Everywhere I talk, people remember this. What they're not aware of, though, is that data is an awful lot like that. We have this wonderful golden triangle that is what you typically hear, read, or understand about data.
And more importantly, this is what's being communicated to our decision makers, which is a real problem for us. The only thing I've ever changed on this slide are the words that go inside of the golden triangle. We also need, however, to have foundational data management practices. These are the things at the bottom of the pyramid, the five areas that Melanie will get into in just a little bit; we're not going to talk about them right now. But what I want you to understand is that these foundational practices provide, as you might imagine from the word, a foundation on which to build the advanced data management practices. And if you have a weak link in your foundation, as I'm showing in this example with data platform and architecture, the foundation can only be as strong as the weakest link. So in our example here, if somebody said we should put some more money into data quality, Melanie and I would say, no, it won't work. You have to first get your data platform and architecture up to the level that data quality is at before you make any improvements in data quality, because otherwise you are simply pouring money down the drain. The things on the top half are technology focused; the things on the bottom half are capability focused. And this is really what the DMM is about: looking at the people and process side of this, as opposed to strictly the technologies. Now, we get a lot of calls at Data Blueprint on a regular basis: I know, I hear all this stuff, Peter, but I've still got to have it by Friday. Can I do it without doing the foundational practices? The answer is yes, you can. But whatever you're doing will take longer, it will cost more, it will deliver less, and it will present greater risk than if you instead learn how to do this properly, which is our goal in this particular webinar. So what do you get out of doing these fundamental practices soundly and well across your organization?
You are moving in the direction of improving your level of data trust for both your internal and external customers and business lines, because you have measured it and you've shown what you are doing now and what you intend to do, and that will affect the data. For example, if you implement a data quality program and you have a nice metadata foundation for it, you have a data requirements process, and your business representatives as well as your governance stewards are well educated and prepared, you're going to make rapid strides in improving data quality. Because you have more accurate data, it will also improve the decisions that you make about risk of various types for the organization and definitely improve the climate for analytics. We recently gave a presentation at the ISACA conference on why analytics is broken and how to fix it, which talked about analysts and data scientists spending about 40% of their time finding and researching data and at least 20% of their time cleansing and structuring data, which left very little time for what they were hired to do: creative analysis, modeling, and predictive interrogation of the data. So if the data is in good shape, you know what it is, you know where it is, and you're managing it well, your analytics will yield that much more benefit. Also, a lot of the costs of data management are buried in projects. For example, data cleansing of the same data in different data stores is often done repetitively: project A, project B, project C, project D. If you can bring that together into, let's say, a center of excellence and adopt a standard process, you can cut those costs. If you have a better target architecture, you can gradually eliminate point-to-point interfaces and lower the costs of integration testing with each release. So those are a couple of examples of the efficiency gains that you can make.
And if you're in an industry that's highly regulated, demonstrating to the regulator what capabilities you have and the state of your data is very important, and you can always show the improvement plan. Since I worked nine years for FINRA, I'm well aware that if you have made serious, substantive effort towards improving the data, the regulator will accept your improvement plans and just come back and check with you. So this is very useful for highly regulated industries. Thanks, Melanie. So let's talk now about how we got here. First of all, if you don't know where you're going, any road will take you there. We got calls for years and years where people would say, I want to take our data program to the next level. Well, the question is, if you don't know what level you are at, you can't very well move in any one direction. You are currently managing your data, but if you can't measure it, how are you going to manage it effectively? How do you know where to put the time, money, and energy? Now, I had a title at the Defense Department, U.S. DoD Reverse Engineering Program Manager, and we sponsored some research at Carnegie Mellon University asking the question: how can we measure the performance of DoD and the partners that we had? I was also told to go check out what the Navy was up to, and I went down and met John Zachman and Clive Finkelstein down there in the Navy. But we went to the SEI, and the SEI responded with an integrated process and data approach. And DoD actually told them to take the data out, because their name was the Software Engineering Institute. But again, the origin is very clear in terms of where this came from, because this piece, through a series of sort of fortunate incidents, as opposed to unfortunate incidents, grew into the CMMI and the DMBOK and other things that we've talked about here as well.
This is a picture of Burt Parker, who was instrumental at the MITRE Corporation in picking up on some of this research; together we published a paper in 2007 talking about the roadmap and the key process areas approach. We certainly want to recognize Burt for the contributions he has put into this effort. So with that, the thing ended up back at Carnegie Mellon where it started, coming full circle. And Melanie, you became involved after an extensive career in the federal government contracting area, working for a lot of different agencies, right? Yes, many, many agencies. Regulatory agencies, financial agencies, and civilian agencies primarily. But I did a little work with DoD, and now we're back with DoD again. Anyway, the CMMI Institute does support and provide all of the offerings that the Software Engineering Institute provided around those reference models. In addition to the software development and engineering model, CMMI, we have an Acquisition model, a Services model, and a People Capability model, and our latest creation, the Data Management Maturity Model. We also provide training and certification and a lovely partner program to allow our partner businesses to get the most out of our products and expand. And we are now owned by ISACA, which is very helpful because they have 140,000 members worldwide, primarily in the areas of IT governance, IT audit, and cybersecurity. We went to a conference of theirs last week and there were 1,500 people. They were very interested in the analytics presentation, and they had a lot of great presentations. So it's another harmonious community that we're involved in. And this just illustrates that the CMMI for Development, which has been out there now over 25 years, has made tremendous inroads around the globe as a global standard of process improvement for software development and engineering. So it is used in many, many countries. Our greatest areas of expansion now are South America, Central America, and China.
And if you look at the bottom, you can see that we and our partners did 1,900 appraisals against this reference model in 2016. I mentioned before that the real key to this is understanding that because of the rich history and the academic origins of this technique, it has more robustness. Gosh, I think I'm making up words again here. For example, this is one study done by the Conference Board where they looked across CMM, ITIL, RUP, COBIT, and, on the right-hand side, PMI. And they found that projects done with CMM generally performed on budget and on time at a better rate than those without. If you notice what's happening in the statistics, the projects done without are in gray; the projects done with are, in this case, on the left-hand side of that diagram. In each case, what you're seeing is organizations achieving better results. So, as I said at the beginning, a lot of people will ask, can't you use any old assessment process? And the answer is simply no. There is so much research and so many academic findings behind this, corroborated by a number of different sources. It's not just Melanie saying this is good; a lot of people using Melanie's technique say this is good. So this just illustrates some of the other key products in our portfolio. And we are, by the way, coming out with an overarching product called Next Generation that will allow organizations to select from specific process areas in any of our reference models to develop a complete custom capability measurement instrument for their use. That will be coming out in approximately nine months from now, and it should be very useful.
As you might suspect, we're collaborating between the two organizations to eliminate confusion between the tools and to highlight the really nice complementarity here, because Melanie was very familiar with what was going on, and this will extend training to different organizations and professionals and provide benefits to members of the respective organizations. So let's look at what actually comprises the model. We released it after an intensive peer review. Peter was on our list, and he was number one on the list because his last name begins with A. So, lots of respected peer reviewers. We also had Bill Inmon, the father of data warehousing, and Peter Chen, inventor of the entity-relationship diagram. A lot of great people took a look at this. We got 1,200 comments, and we applied as many as possible for version 1.0, with our sponsors Microsoft, Lockheed Martin, and Booz Allen. So we have a lot of practices in the DMM. There are 414 practice statements throughout 25 different process areas that are essentially data management topic focus areas. And we also have maturity practices, which I'll talk about in a moment, but that's essentially how stable the good practices are that you've put into place, okay? And our approach is about what you do. You cannot just talk the talk with data management, because it's your data. It is the substance, the stuff, like the air you breathe or the food you eat or the water you drink. So you have to actually do things. And therefore the DMM emphasizes positive, proactive, behavioral changes across the organization. It emphasizes the utility of repeatable processes that are well constructed so they can be reused again and again, and it places a high premium on leveraging and extending those processes across the organization.
And as Peter could tell you from the past number of decades, if you just allow things to remain project by project with your data, you'll really never get off the ground and never yield the benefits that you hope for from your data assets. The other thing we look at, and there are 596 of these, are work products: artifacts that are produced in the course of performing the processes. So you document your processes; you come up with appropriate standards, let us say, for data modeling, for data representation, for data security, for metadata; you have guidelines and templates; and later you have training so that everyone can be aware of this. All of this goes towards higher quality and better reuse. And the model is very practical. One thing I'll say right in the middle here, since we're talking about implementation: you do not implement the DMM as such, because it is definitely designed not to be a cookbook. Each organization will progress in different ways, depending on where it is, where it's going, the industry that it's in, and the emphasis that it places on various aspects of the data management dial. So let's talk a little bit about what the scale is here. We're gonna do it two ways: I'm gonna give you a simplified version here, and then Melanie's gonna give you a little bit more detail. In order to explain it, I'd say one of the nice things about this is that we don't actually have to explain it to management, because management has very likely already been exposed to it. You get one point in the scoring. It's a one-to-five scale, no big excitement there. And if you have a pulse, effectively, you get one point. There is no zero. We get a lot of organizations that say, well, you know, you say you start at one, but we're actually at a negative one, and that just doesn't work, right? But if you get to the point where you're starting to document and define your practices, they can now be characterized as managed.
And you get three points if somebody then gives you the ability to standardize and use whatever it is that you do consistently. Now let me just recap the scale so far: one point for a pulse, two points if you have any processes. Notice we're not commenting on the specific value of the processes. Melanie's last slide said you are what you do. And this is really looking at it and saying, do you do what you do consistently? Is there a way to standardize that process? Then, to take it to the next level: if we can't measure it, we can't manage it. You've all heard that many, many times, and of course this model incorporates it absolutely. Until we have a standard and consistently used practice, there's no point in taking any measures; but once we do have them, we can start to measure those pieces. Four points for that. And finally, we ought to get together periodically and say, can we improve this using those measurements? We'll look at some very specific examples here, but this intuitive basis is also the basis for TQM, ISO 9000, and other things. There would normally be a Dilbert cartoon here where we would say that an ISO 9000 organization is at level three: you don't care how bad the processes are as long as you follow them consistently. And that's a little bit tongue-in-cheek, but the point is most organizations don't even have that. So these are absolutely steps in the right direction. Now, Melanie, you're gonna talk about them in a little more detail. Yes. So first, if you take a look at the cylinders (I guess they're red and green) on the right-hand side: you go from higher risk to greater quality and stability. You go from ad hoc, inefficient processes to the ability to reuse and save costs and effort. And this is very important.
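The five-level scale Peter walks through can be sketched in a few lines of code. This is an illustrative summary of the narration, not anything from the DMM itself; the level names follow the Performed/Managed/Defined/Measured/Optimized progression he and Melanie describe.

```python
# A minimal sketch of the five capability levels described above:
# 1 = Performed ("you have a pulse"), 2 = Managed (practices documented),
# 3 = Defined (standardized and used consistently), 4 = Measured,
# 5 = Optimized (measurements drive periodic improvement).

DMM_LEVELS = {
    1: "Performed",   # processes happen, but ad hoc; there is no level 0
    2: "Managed",     # practices are documented and defined
    3: "Defined",     # a standard process is used consistently
    4: "Measured",    # the standard, consistent process is measured
    5: "Optimized",   # measurements are used to improve the process
}

def describe_level(score: int) -> str:
    """Return the level name for a 1-5 capability score."""
    if score not in DMM_LEVELS:
        raise ValueError("DMM scores run from 1 to 5; there is no zero")
    return f"Level {score}: {DMM_LEVELS[score]}"

print(describe_level(3))  # Level 3: Defined
```

Note how the `ValueError` mirrors Peter's point that organizations claiming "negative one" still start at level one: the scale simply has no zero.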
Data is very important, and data management in most organizations is still not very well looked after at the enterprise level, and it's not very well pulled together. So what you have is people reinventing the wheel, or halfway reinventing the wheel, and not knowing what assets are available for them to use. So you have higher stress, people running around reinventing processes. Whereas once you achieve level three, as Peter was saying, everyone has clarity. They know how to proceed if, say, I'm about to profile a data store to discover defects in it, or if I need to change an approved term in the business glossary, et cetera, et cetera. So, very quickly: capability is you're doing effective things, you're doing them decently, and you have work products to show that you've done it. That's capability improvement. Maturity is essentially for the purpose of stability and resilience for those processes. It ensures that you can operate under stress conditions. For example, if the financial industry had had better data management practices before the recent crash, they would have been able to reconstruct their data to send to the regulators, instead of it taking weeks and months for people to figure out exactly what happened. And the things that are conducive to maturity are very sensible supports like policy, training, assigning resources, doing quality assurance, et cetera. So once the practices are established, then we can measure maturity through an appraisal. The structure of the DMM includes a lot of contextual information. Each process area can be evaluated by itself if you wanna just pull one out. We've had people say, oh, business glossary, we're doing a business glossary now; I'm gonna go home right after this conference and see how we're doing according to the business glossary practices. So you can do that. You can use the model in whole or in part.
But everything in there tells you basically why this process area is important for the organization to do well, what the goals of doing it well are, what questions you can ask yourself to put a finger in the wind about how you're doing, and what other process areas are closely related to it. But the only things that get scored are the functional practices, for capability measurement, and the infrastructure support, the maturity practices, for a maturity assessment. So let's take a look at those practice areas. We mentioned them before on the bottom of the pyramid slide: again, strategy, quality, operations, platform and architecture, and data governance. Each of those is broken into subcategories. But let's just take the high-level ones first. Data is the most important asset that you have. It's the only asset you have that is not depletable, that doesn't degrade over time, that is durable in nature, and it exists at the strategic level. So wouldn't it be nice if we managed our data assets coherently? Similarly, we have a class of governance professionals now. We have standards in those areas, and there are some people who you can look at and say they are data governance professionals, and others who simply are not. We understand that when we're trying to make data quality improvements to our data assets, the goal is not perfect quality but actually fitness for your purpose, and that is a very important distinction. Then of course we wanna do it with the right tools and with the right methods. Lastly, of course, we do need some organizational support in order to provide all of this capability. And now Melanie's gonna take you through these in a little bit more detail. Okay, so the data management strategy category has five different process areas in it.
Number one is what the industry has been clamoring for for a while, because not only Peter and Data Blueprint but many other thoughtful firms have stressed the importance of an enterprise data management program that manages data assets in the same way that you manage corporate finance, human resources, and facilities management: permanent functional areas that require permanent funding, focus, strategy, et cetera. So that is what we're trying to help organizations do with the data management strategy. A part of strategy is communications, which is vital because the program goes on forever, and the data management function is essentially the shepherd function, primarily for persistent products that are created over time, like the enterprise data model, the business glossary, the metadata repository, et cetera, and architectural standards. Then for data governance, we have governance as such. We define governance primarily as collective decision making across the lines of business about shared data. And we define it like that because it has a compliance component, which is important: who can access the data, who can change metadata in the repository about a data store. But it also has a building and nurturing function as well, a caretaking of the assets. So governance practices are treated in governance. The business glossary is essentially business metadata; it's the cornerstone of everything that you do in terms of where data is or how it's represented in one database or another. It is the business concept, that is, the term that the business agrees to use, and also the definition of what it means to the business. A quick example here: we've had companies who use similar terms differently and really step on their own feet in terms of results.
For example, an insurance company using the words product and program interchangeably across seven or eight different business lines could not come up with a collective risk picture, could not come up with a collective product list, and it confused their forward planning as well as their financial results. So that's what the business glossary is for: to keep things like that from happening. It's a business problem, and this is the solution. Metadata is all other information about the data assets. We call it knowledge management for the data assets, and the model gets into quite a bit of detail on this. Data quality in the model is essentially four process areas that comprise a sort of Larry English total information quality point of view on data quality, which is that it is a 360-degree endeavor to improve the data quality. As many of you know well, you're always trying to get to acceptable quality, and many, many factors comprise the elements that can erode data quality, from bad data entry to bad modeling, to bad design, to tough integration, to external data coming in. So you need a strategy, and that is our first process area. These are more or less in sequential order, although any one of them can be of benefit to the organization in any order. What are you gonna do to improve data quality? And we find, from the hundreds of organizations that we've worked with or talked to, that if they don't have a plan for how to improve quality across those areas of enterprise data that they deem important or critical, then not a lot happens. It gets stuck in the project. I've talked to organizations with 15 or 20 different data quality tools, where one project in one area doesn't even know what tool another area is using.
So if you think of the inefficiency of that, and the fact that almost every organization has a significant amount of redundancy in core data such as product or customer data, then you know that there's a lot of flopping around when everybody should actually be aiming towards a target that is mutually conceived and approved by all the key customers. Profiling is the discovery of problems in an existing data set or unified data sets. What's actually in the data? What's wrong with it? Does our metadata need to change? Data quality assessment is the business user's determination of fitness. What is the data quality in the key area that I can accept? What is my aspirational target for this? For example, 99.9% uniqueness of customer IDs may be a target. This also gets into the application of data quality dimensions as a way of thinking systematically about quality and engaging the business to do that with you. It allows you to create quality rules for critical or highly shared data that can be shared everywhere in the organization: improving quality across the organization in the data areas you care about. And we mentioned this already: additional planning for data cleansing, and being careful about costs by using business impacts, helps you to spend less money and get a better effect. Moving to data operations: organizations have done a very good job on functional requirements for years. You know, I've reviewed many, many functional requirements in the past that were well written and had all these testable requirements in them, and most of them really didn't address the data upon which all these operations were to be conducted. So we look at techniques and emphasis for getting better data requirements, both at the high level for the enterprise as a whole, for a business line, and for an application data store or repository.
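As one illustration of the kind of quality rule described above, the 99.9% customer-ID uniqueness target could be checked with a short profiling sketch like this. The data, function name, and threshold here are hypothetical, made up for the example; real profiling tools do far more.

```python
# Hypothetical profiling check for the uniqueness target mentioned
# above: the share of distinct customer IDs must meet a threshold.

def uniqueness_ratio(ids):
    """Fraction of values that are distinct (1.0 = fully unique)."""
    if not ids:
        return 1.0
    return len(set(ids)) / len(ids)

customer_ids = ["C001", "C002", "C003", "C003", "C004"]  # one duplicate
target = 0.999  # the 99.9% aspirational target from the talk

ratio = uniqueness_ratio(customer_ids)
print(f"uniqueness: {ratio:.1%}, meets target: {ratio >= target}")
# -> uniqueness: 80.0%, meets target: False
```

The point of encoding the rule is the one Melanie makes: once the business agrees on the target, the same check can be shared and applied everywhere the data lives, rather than reinvented project by project.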
Data life cycle management is about deciding which data sources are authoritative and how you are going to track data lineage from sources to targets, and from those targets on to further targets, for the data that needs a complete audit trail. So it's very useful, that particular process area. And we'll show you another slide in a few minutes showing that organizations don't do a very good job in data life cycle management, kind of across the board. Provider management is about quality in the data that you receive, either internal data from other systems within your organization or external data. Do you have quality requirements? Do you regularly meet with vendors? Are they meeting the needs of the organization? There are also a lot of process areas dealing with architecture. However, there's nothing technical in the DMM about architecture in terms of telling you what to build, or even what approach to use to design systems or models. First of all, all of the collective authors of the DMM felt that data design was a fully mature discipline. That doesn't mean every single organization will always have crackerjack modelers or crackerjack enterprise data architects, but the knowledge is out there and would probably form a stack of books all the way up to the attic. There are so many tools; it's been going on since 1975, when Peter Chen came out with entity-relationship modeling. So we don't talk about modeling itself, but we do talk about modeling and other data standards. We talk about engaging the business in coming up with the future architecture. We talk about sensible practices that will make sure you don't make multimillion-dollar errors in choosing a data management platform. And just one example here: in a federal agency, there was a customer relationship management system which was perfectly adequate for its current functioning.
They wanted a few additional requirements for it, and they ended up replacing a system that cost a quarter of a million dollars to maintain with a $20 million vendor system. Then they had the problem of having to upgrade it, with many point-to-point interfaces, every time that vendor product was upgraded, and it became an expense nightmare. So that's an example of how bad it can be, and we want to help organizations not do that. Data integration covers best practices for when you bring data together, and this concentrates on business engagement in those practices and making sure the requirements are met. And of course, historical data, archiving rules, and records retention. And then good housekeeping practices. This is something that's definitely done pretty well on the software development and engineering side of the house, but these processes are usually not applied well to the data assets or to the data management processes themselves. So these are all sensible things: have a process asset library, do some quality assurance, do some metrics that really mean something to you, to your governance representatives, to the enterprise data management group, and to the executives. All good practices, and you will benefit by implementing them. So let's move on and talk about how this can be used, and at this point we're going to bring Jeff up. Jeff, we thank you again for participating with us on this, and we're really looking forward to hearing your story. Sure, thanks, Peter. So implementing a data management program is really an exercise in organizational change. And one of the tools we use to drive change in Arizona, or in any state government really, is policy. Very little happens in state government without a piece of paper telling someone to do something or how to do something. And that's pretty much the same in many businesses as well. Whether you have a policy or not is something you have to look into, and perhaps fill a void.
We've found that it helps when policies are based on a widely accepted framework. For example, our security policies reference NIST controls. When I drafted our state data governance policies without a framework, there was a lot of pushback. People said, where'd you come up with this? Well, it seemed like a great idea; it was something I had read about in a book or felt was a best practice based on experience. But after the DMM came out, I inserted references to it right in the policies. As you can see in this example, I don't know if you can read it, but there are references in the practice statements that point back to how we implement those various recommendations in the DMM. And then the pushback turned into "tell me more," because people wanted to know what the Data Management Maturity Model was, and that helped us move toward adopting the DMM as a framework for data management. This is all about communications, and it gives you a context for communicating when you're implementing policies. Next slide. So Arizona Governor Doug Ducey's strategy for Arizona is built on what we call the Arizona Management System, AMS. It's very strong on metrics, lean, and process improvement. Five main concepts drive it: decide faster, resolve faster, respond faster, put more services online, and reduce costs by improving efficiency. And as I'll tell anyone who listens, good data and good data management are key to all of those, and you can't get them in place without good practices. We chose to adopt the DMM because it provides a methodology to measure maturity in multiple process areas and a way to track it. We emphasize enterprise architecture in Arizona as well, and the DMM methodology allows us to measure the as-is state, identify gaps, and create a roadmap to close the gaps. That's really enterprise architecture best practice. Next slide. So I first heard about the DMM, as it turns out, from a DATAVERSITY presentation back in 2015.
And when we put on our data management conference in 2016, that helped us kickstart the program. Initially it was supposed to be data governance training, and people yawned at that; if you ever want to get someone to fall asleep, talk to them about data governance training. Eventually I changed the name to data management conference, and that's when people started to pay attention. I engaged our communications team to get the right people to the conference, because getting the right people in the room is like 90% of the success factor of any conference or communication endeavor. And we had an executive briefing on the second day. Thanks to the governor's chief of operations, Henry Darwin, we got about 37 directors of cabinet agencies into the room, and we filled the room for that executive briefing. Peter was there, and it was an excellent briefing for them. That's where we introduced the DMM, and I had people come to me afterwards and say, we need to be doing this, and we want to reach DMM level three. They were right on board after that conference. So that really gave our program a big kick. As far as what we're doing currently, we had CMMI's Building Enterprise Data Management Capabilities class for 20 people back in March. We have six people who are very interested and want to come back for advanced training. I approach training a little differently than some others. We've always had a modest training budget, but very often holding the events was just a matter of checking off a box: training is done, what's next? There wasn't any follow-up. There wasn't any program around it. There wasn't any effort to make sure people have the tools they need when they get back to their desks. But when you're trying to drive organizational change, training is an opportunity to build a team. It's a communications opportunity and should be used as one: communicate ideas, share plans, develop strategy, and so on.
For next steps, our data management program is getting broader visibility across the agencies. We just had our conference in April, with about 75 people attending, and the Data Management Maturity Model was pretty prominent; we had Melanie speak to it. People are definitely on board with moving that forward. We're planning baseline assessments for four agencies at this point, and we're on the road to implementing the best practices to help the governor's goal councils achieve their objectives. Some of the goal councils, for example, are around reducing opioid deaths and reducing recidivism, both of which require collecting data from multiple agencies, massaging it, and coming up with all the rules for how we share data, how we match it, and so on. So our data management practice is coming in very handy in that whole effort. And of course the governor's chief of staff is on board with this as well. This effort really started at the grassroots, and one of the goals we've had was to get the attention of the governor's office so that we could get their buy-in. The DMM lent credibility, structure, and measurable goals to our data management program, and that is the kind of thing that did get the attention of the governor's office. Always nice to have friends in high places, right, Jeff? Yes, it is. So when do you employ the DMM? When is a good time to use it? Obviously, what Jeff just told you about is a very well conceived and completely ambitious program to raise data management capabilities across the whole state, agency by agency; it's just a wonderful vision. Within an organization, if you need to do a data management and/or architectural strategy, it's very good to use the DMM for that purpose. It's also very good to use it if you're going to do an enterprise data warehouse or anything else that involves a significant amount of shared data.
And of course, some of you are thinking master data management. Absolutely, that's a textbook case, because master data management firms who provide products, and I'm thinking of one that provides master data management for financial organizations, have said that if their clients had governance in place, standards, a good requirements process, and cemented data guidelines, their implementations would go from a year on average to six months on average. So that's how important it is. Okay, let's take a couple more rather quickly. Obviously governance, because governance often fails when people don't have a governance agenda. In other words, what are we getting together to accomplish? You don't just want quarterly or monthly meetings with no fixed agenda. We find that governance keeps its enthusiasm and effectiveness if you run your governance operation like a series of projects with lots of accomplishments. So let's say we have a customer master data store: let's define the business terms together across all the lines of business using that data store, and finalize that piece of the business glossary. That kind of approach gets you much better results and also sustains energy. And of course, analytics. So these are all areas in which you might consider employing the DMM, in whole or in part, so that you know what kind of support you need for these desired outcomes. Here is one of the things that's so powerful: we tell people all the time, if they're implementing, and again let's just pick on MDM because that's a very reasonable example, that the MDM technology, as Melanie said, works very well. But what we find is that organizations are more successful if they implement the technology in conjunction with governance and quality, because it doesn't do any good to have master data if the master data is of poor quality. So what we've talked about here is two sides of a triangle.
We've got the pieces of people, process, and technology that we want to pull all the way together, and this assessment allows you to take that overall perspective. So what we've outlined here are the data management practice areas, the foundational practices that are necessary but insufficient to do all the fun things that everybody wants you to do with data. We've seen that these practice areas can be performed at specific levels: one, two, three, four, and five. And when we take those two bits and combine them into the assessment process, you can come up with results such as the following. I took the 30 insurance companies that we had done this type of analysis for and presented the results. You can see that these 30 insurance companies did not perform reasonably well in this area, and they all turned around and said, yeah, this is what we've been yelling at our management about, about how inefficient we are at processing insurance data in this area. So that's a summary of a bunch of data. Here's another one, an airline I was working with. You can imagine looking at a group of executives who say, I've got two ones, two twos, and a one, so what? And I say, well, that's where you are, but here is your industry competition. And I've got their attention when I do this, because they know they are the ones and their competition is the twos. And that is, of course, not where they'd like to be. We can also look across all respondents and see how you're doing, how the airline industry, if you will, is doing relative to the others. But most importantly, as Melanie and Jeff both said, this also leads you to see that we've got to take those ones and make them twos. There is no point at all in trying to make a two into a three, because the overall data management program is only as strong as the weakest link, and this chart in particular illustrates that quite well.
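The weakest-link rule above can be sketched numerically: given maturity ratings per process area, the priority is to raise the lowest-rated areas first rather than pushing an already-stronger area higher. The area names and scores below are made-up values for illustration, not an actual assessment result.

```python
# Made-up example ratings per process area on the 1-5 maturity scale.
scores = {
    "data governance": 1,
    "data quality": 2,
    "data operations": 2,
    "platform & architecture": 1,
    "supporting processes": 3,
}

def improvement_priorities(scores):
    """Sort areas so the lowest-maturity ones come first, reflecting
    the weakest-link rule: make the ones into twos before pushing a
    two toward three."""
    return sorted(scores, key=scores.get)

floor = min(scores.values())
weakest = sorted(a for a, s in scores.items() if s == floor)
print("overall program limited by:", weakest)
print("address first:", improvement_priorities(scores)[:2])
```

In other words, the program's effective maturity is bounded by the level-one areas, so a roadmap built from these scores starts there.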
Let's take one more example. You'll notice I didn't tell you which airline or which insurance company; that would all be confidential information. But the World Bank did tell us that we could use their information, because they thought this case was very instructive, and we did too. Here we're looking at a specific subgrouping under governance. We were asked to evaluate the treasury function at the World Bank, the information systems group. And you can see that these are not good results, but the neat part about this analysis was that the business portion of the bank actually set the standard: this was the highest score we had ever obtained on this measurement, out of over 500 different scores. So what you're seeing is a message that says to management, you could go out and hire a bunch of expensive consultants, but the only thing your people really need to do is walk down the hall and ask their colleagues who are doing this well how they're doing it, and whether they would be willing to share their expertise with the rest of the organization. Again, I hope that's really clear from the chart. This organization did not need external support. They needed to walk down the hall and learn from each other, and that was a wonderful message for them. You can see they were also outperforming both the industry benchmarks and the overall benchmarks that we had. In fact, those benchmarks span what I call the era of big data; they are the accumulation of all the measurements. You can see that overall, organizations did not perform well in 2007, and unfortunately, by the time the big data era was over at the end of 2012, the exact same thing was true. We had not significantly improved our ability to help organizations leverage their sole non-depletable, non-degradable, durable strategic asset. So, Melanie, you can talk to us about getting started now.
Yes. So what we have done with the DMM over the past five years, basically since it was in beta, is come up with a method that allows tremendous participant involvement from all of the key business lines as well as the enterprise-facing organizations such as risk, audit, enterprise architecture, business architecture, et cetera. We bring together a large group of stakeholders representing all of the core business lines of the organization, and we navigate this dense material in a fast-paced discussion workshop with everyone in the room. That way, people can bring up accomplishments from various lines of business, and sometimes the other lines of business never even knew about them. They never knew there was anything to leverage, that anyone had documented a process, come up with a standard, or had a good strategy approach. So it's very, very useful. It typically generates a great deal of energy, and the organizational participants make affirmations by consensus. They will decide: we meet this requirement, we don't meet this one, or we partially meet this one, and here's an example of how we're moving toward it. We do some supplemental interviews with interested senior managers and executives who are stakeholders, and we spend time looking at the work products, because the rubber meets the road in work products, and often we will find very good work products that have been insufficiently leveraged. So it can be very useful. The onsite portion takes four full days in all. Then we do a report, and you get the detailed scores, the scoring mechanism, the gaps, the strengths, the contextual information, how to remediate the gaps, and project initiatives that you can very easily put into a roadmap. At the end of day four, the final workshop, the organization has a visual snapshot of exactly where they are. This particular diagram is not from any particular organization.
We changed some of the values, but we have had diagrams that look a little like this. What I want to point out is the three areas that are above level three. That shows that the organization, for one business goal or another, has spent time, treasure, and talent in certain areas, and then you'll see that some of the areas are down near level one, meaning they've neglected those processes. So our recommendation is for all organizations to strive over time to achieve level three, because that is the 85-15 rule: you're going to get a great deal of value out of doing that. And for governance, the soft benefit of assessing the organization against the DMM equals the technical knowledge, project initiatives, and actionable recommendations that come out of it, because people are engaged and they reach agreement. We've never had a fight; they will disagree at times and come to a conclusion. So it educates all of the involved personnel. It's kind of like an education with a fire hose, both for the people facilitating the assessment and for those involved. So it really helps give governance a shot in the arm. Melanie, do you want to skip the Microsoft thing again? I think we did that last time too. Yes, I think we should, because we're getting through it. We are including in your slides a case study on Microsoft, though, which is a very nice case study, so we'll just encourage you to take a look at the slides after we get them out to you. Just a couple more things as we get toward the top of the hour. First, Virginia has actually gone into this in a very significant way, based largely on the Arizona model, where we are doing this type of process around what has to happen, and there's a lot more infrastructure in place here. We're just giving you the tip of the iceberg, and Melanie, I'll let you describe a little bit about the training opportunities that are out there.
In case you're interested in learning more about the DMM, there's a very easy way to do that. We have an e-learning class that takes about eight to ten hours to complete, and it gives you a complete picture of the implementation features of the DMM. What are the benefits of doing these process areas well? What are the obstacles to implementation, and what are the success factors that help you succeed in implementing a data management program across the board? We also offer interactive three-day classes on-site at your organization, and that's very useful, again, for governance engagement and for the education of data stewards. And if you wish to use this model like a pro and go in front of any organization and give them really high value in a really short time, we have an Advancing Capabilities class, a five-day class. What it does is help you really put a polish on your consulting in data management, because you can see how everything ties together and you become very conversant with any gaps you may have had previously. And then there's our Enterprise Data Management Expert course, which is a dual certification for expert consulting and the ability to do an official assessment against the model. We will also have an associate certification in the next few months, available after the e-learning or on-site class has been completed. And I think we can skip these slides. So again, there are lots of partners in this. As I said before, there are lots of vendors that will come at you and talk to you about how to do an assessment.
This is the only authoritative one, backed up by literally decades of research. Just to summarize here at the end: those of you who have been in the webinar before know we stop at the top of the hour and take questions and answers, and Jeff will be with us as well in case you want to ask him some Arizona-specific questions. But the key is to look at these five areas that we have put out here: strategy, quality, operations, architecture, and governance. Rating those on a one-to-five scale will help the organization get better at doing all of these things, which means that when you want to do analytics, data mining, big data, et cetera, all of those things become easier. We're not going to claim they're easy. So we're right at the top of the hour at this point. Again, what we've talked about is where this came from, where we're trying to go with it, and where we want to go now, which is to get you all involved. We have a wonderful set of resources here that we can utilize, but at this point let's turn it back over to Shannon and see what sort of questions you have for Melanie and Jeff, and perhaps even myself. Thank you, Peter, and thanks to Melanie and Jeff for assisting with the presentation. It was a great piece of content today. And just to answer some of the most commonly asked questions we receive, a reminder: I will send a follow-up email by end of day Thursday with links to the slides, the recording of the session, and anything else requested throughout. So let's dive into it, because we've got a lot of great questions coming into the Q&A already. Let me make sure I can scroll up here. So: what is the cost to implement the DMM? Is it affordable for a medium-sized company? Well, actually, the cost is going to vary depending on what level you're aiming at as well as how quickly you wish to progress.
So for the Securities and Exchange Commission, the Chief Operating Officer there wanted to achieve level three across that very ancient and highly siloed organization, and he wanted to do it in three years. That was going to be a total of approximately 40 projects, and I would guess close to $25 million. Now, that was very ambitious given where that organization was and where it wanted to go. For other organizations, take Neoway Solutions in Brazil, an analytics-as-a-service company, a very Silicon Valley-type small company: they implemented improvements to data governance, data quality, metadata, and their business glossary in a period of three short months, and increased their sales by 20%, because their clients loved the fact that they could attest to the data quality and that they had good metadata for the solution sets. So it will vary a great deal with the size of the organization, what areas you emphasize, and what level you want to achieve. And Jeff, you're seeing some of the mid-sized agencies that are interested in this too, are you not? Yes, we have agencies of various sizes, but even the smaller agencies very often have multiple lines of business and multiple different data sources that they have to deal with. So sometimes the size of the agency really isn't the question. We sometimes have very large agencies that have much less complicated data and data issues than some of the smaller ones. So that's a great response. The other thing to think about, too, is: let's just pretend that this costs a million dollars. It does not, but let's just pretend that it did. The only way that management is going to invest in something like this is if you can show them that by investing a million dollars they're going to get that million dollars back and something else on top of it. Because that's what organizations are in the business of doing: leveraging their investments.
We've got an example here in Virginia, where this process helped an agency dedicated to helping children who may be at risk eliminate much of the data collection around the process, which was redundant and trivial. It was simply not useful information that was being collected, and as a result the agency was able to divert a million dollars into services. Because we'd much rather, instead of intervening with the kids after the fact, prevent problems and provide services based on what's actually happening out there. Another way to think about it, though, is to say: look, regardless of the price, I won't say it's trivial to be involved. Somebody needs to be certified. When somebody comes to you and says, I'm going to evaluate you using the Carnegie Mellon standard, you want to make sure they have actually done what they say they do, because we've found there are a lot of people out there who say, oh yeah, we do this because it's standard, everything we do is standard, but they can't actually show you that they've attended the classes or passed the certification. So that's a very important part of it as well. All right, so how does the DMM compare with the Data Management Capability Assessment Model, DCAM, from the Enterprise Data Management Council? Well, that's probably a better one for you, right? We used to be on the same team early in the development of the model, so the models are somewhat similar. The stated objective of the DCAM is to concentrate on compliance with risk data aggregation. So it is intended mostly for implementation of governance, and it is definitely targeted toward the financial industry. Our scope is much broader, and we are completely industry agnostic, although many, many of our clients have been financial organizations: banks, hedge funds, insurance, conservatorships, et cetera. So, complementary in general, right? Sure.
Yes. And if you have that particular scope, if you're a bank and you want RDA compliance and you need to get something implemented right away that was designed from the original model, then you might want to consider that; and if you're not, then you might want to consider us. I love it, it's perfect. So there are a lot of questions about finance and the cost of the DMM, Melanie. Maybe there's a link or something with typical assessment costs that you can point my way, and I can get it out to people. Yes, okay, I'll do that. Perfect. It's essentially Kay Morton at CMMIInstitute.com, our business development manager. Okay, perfect. So, post-DMM assessment, might there be multiple action goals, like formation of a data governance council, creation of policies, definition of data quality metrics? What would be the quick wins that companies should target? Well, yes, there are going to be different quick wins for every organization, because one of the things that we've learned in our careers, plus in working with the DMM, is that each organization is unique. There are no two alike, even in the same industry, that progress identically along the same path or in the same way. Some quick wins we often recommend: if you do not have governance set up, writing a charter and kicking off an executive data governance council to get executive sponsorship for your program. Another very good quick win that is sometimes appropriate is doing a data quality pilot for a very small data set and coming out of that with processes, results reporting, and some quality standards and quality rules that can then be extended to other data sets and other business areas. So those are examples. Peter, would you like to give some more? Yeah, I was just going to take your hypothetical example. So again, these results on the screen do not show any organization's information.
This is an example Melanie made up. But if we were to look at the data quality portion of it, which is the light blue area around four or five o'clock on your screen, you can see that this organization got high marks for having a data quality strategy, but actually didn't do as well when it came down to assessing, profiling, and actually cleansing the data. So clearly, the advice in this case is to take your data quality strategy, which appears to be a higher-than-average work product deliverable, and start to invest in specific profiling, assessment, and cleansing technologies. Now, this doesn't mean go out and buy stuff. It means go out and buy stuff that will help you achieve specific business objectives. You can't do the data quality piece of this absent a data management strategy. Again, you may just have a lot of individuals. Imagine Jeff's position: they said, Jeff, you're the czar here for what's going on, so anybody who has any data problems comes to you. If Jeff turned around and started handing out tools to everybody, that wouldn't really help. We have a saying that a tool with a fool is still a tool. Oops, I said that wrong, right? A fool with a tool is still a fool. And that's really the problem. It's a matter of getting the people and process portions around the technology so that the data strategy can be implemented well, given this type of guidance. Now, what you get from the DMM is a lot more detailed than this. This is a high-level summary of what goes on. Obviously, there are specific recommendations that come out of the assessment on an organization-by-organization basis. One of the exercises that we did in training was we had 20 people from 11 different agencies talking about what they would address first in a data management strategy. And they all came up with different ideas. Some of them wanted to do a business glossary first. Some of them wanted to address metadata first.
Others wanted to talk about data strategy first. It all depends on what your issues are, where your low scores are in the Data Management Maturity Model, and where the biggest impact will be. And one of the things, Jeff, you've been able to do there, too, is you're sort of sitting in the catbird seat where you can watch all of their individual progress. You've also been able to point and say, hey, you people over here in, say, education should really be talking to the people over here in corrections, because they're doing something that you're trying to do. And again, because you have these summaries that come up to you at that level, you're able to direct things in a more programmatic fashion for the good of the whole state. That's right. Just being able to get together and have that conversation is very important. Actually, it's kind of nice when specific agencies are told they do a good job, because otherwise nobody really knows. This is another way of highlighting: wow, that's actually a really, really good piece of work that you've done. And again, that's one of the reasons you're going to be speaking at this upcoming DGIQ conference. Right. Yes, and from our perspective, we've been speaking with a couple of agencies in the state of Arizona about doing assessments for them. And it's wonderful for us, because we're so interested in learning the business of these agencies, things that we didn't know about. Can I name one, Jeff? Yeah. So our first one is going to be the Department of Corrections, which has a very big initiative on tracking inmates as well as lowering the rate of recidivism. So that is a fascinating area in which to apply the best practices to their data assets. We're really, really looking forward to that, and we thank Arizona very much for providing the opportunity to apply the DMM, since we have worked so hard on developing it and making it a useful tool.
So corrections is a different industry from the ones we've seen the DMM applied to. We recently did a gold company, Barrick Gold Corporation, and are about to work with a payment processing firm, Paychex. So it's very, very interesting to see. And I will tell you a little secret here. This is a secret. When we go into any organization, and I know that Peter and Jeff have heard this too, people will tell you, well, such and such an industry, that's one thing, but our data is special. And I have to say that in our work with many, many different types of organizations, federal and commercial, regardless of the industry, the management practices that Jeff is trying to inculcate throughout the state of Arizona and that are represented in the DMM, as well as in the DMBOK, are really pretty much the same. And when we're working with companies for this purpose, we never even look at a data record, because the practices are going to be virtually identical, regardless of the industry. So that's my secret, and I'm revealing it here. Do you disagree, Peter? No, I agree with you. And one of the other things we talked about, in terms of the comparisons that Jeff's able to do within his state by joining these larger groups that we're talking about as well: Virginia actually has the lowest recidivism of any of the states in the country. So Jeff is going to be able to come over here and put his people in touch with our people, who have in fact learned from this and can take some of these pieces home. You won't typically hear governors talking about this, because governors hate to talk about corrections. There's just no good news that can happen in there. But our governor, Governor McAuliffe, did in fact announce just a couple of weeks back that the Virginia Department of Corrections has the lowest recidivism rate in the country, and it is due entirely to these data management practices that we practice here. Super.
Cool, looks like we can learn something from each other. Absolutely, that's what we're here for. So in addition to a lot of questions on just the cost of implementing the DMM, I think the question really stems from: is it applicable for a small or medium-sized business? Is it feasible to implement at a smaller scale? Yes, I'll tell you two examples. One of the agencies that first used the DMM, back before it was released, in an earlier version, was the Department of the Treasury's Office of Financial Research. At the time it was a new agency, and they were getting ready to build a huge data center to do analytics on systemic risk in the financial arena. They wanted to use the DMM to proactively put in place efficient, sound practices so they'd be able to deal with reams and reams of data and the tremendously contemporary technology stack they were implementing. They had 47 people at the time that we were there. They asked us to recommend a bunch of projects and come up with a roadmap for them, and that roadmap had 23 projects. They immediately kicked off six projects. And of course they made rapid progress, because there weren't that many people, and they had a tremendous acceleration in their plans and their implementation. And then, of course, we've used it at very large places like Microsoft. So the size really doesn't matter. Let me say one more thing: for Health and Human Services, we are very close to releasing a derivative product based on the DMM called the Patient Demographic Data Quality Framework, which is 75 interrogative-style questions for ambulatory care practices, large or even very small.
So for a very small ambulatory care practice that is trying to improve patient identity integrity and patient safety through better record matching and uniquely identifying who a patient is, it might take half of one person's time to put these practices in place across registration, care, laboratory, pharmacy, claims, and billing. In a large practice like Mount Sinai Hospital, it would be more people. But yes, it's tailorable to the size of the organization, definitely. And Shannon, you seem to be getting some questions around this. So really, the thing to think of is not so much the size of the organization, but what are people inside the organization spending their time doing? Again, remember Melanie's slide: you are what you do. And if you have organizations that are spending large amounts of time reworking things, changing things around, doing things that seem unnecessary: first of all, it's frustrating to the people who are doing the work, because they're all saying to themselves, there ought to be a better way. But secondly, if you have this working at a reasonably dysfunctional level in the organization, just imagine what would happen if you had standardized practices around this. Again, in a couple of years, Jeff is going to be able to say to employees joining the State of Arizona government team: here are some rules about handling data, here are some processes, and there is expertise in the State of Arizona that you can go to when you run into these kinds of problems. You don't have to know somebody who knows somebody; it's actually out there as a formal body of knowledge. So we really don't want to look at it as if an organization must be of a certain size in order to be able to take advantage of it. It's more a question of what your people are doing and how we can direct those efforts toward more productive types of activities. Certainly makes sense. So what is the first practical, non-academic intervention to build a business case for a business to buy in?
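The patient-matching problem Melanie describes for the Patient Demographic Data Quality Framework usually reduces to comparing demographic fields across records. As an illustration only, here is a toy weighted-similarity matcher in Python; the fields, weights, and scoring approach are invented for this sketch and are not part of the PDDQ itself.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1], case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_score(rec_a: dict, rec_b: dict) -> float:
    """Weighted demographic match score in [0, 1]; weights are illustrative only."""
    weights = {"last_name": 0.4, "first_name": 0.3, "dob": 0.3}
    score = 0.0
    for field, w in weights.items():
        if field == "dob":
            # Dates of birth are compared exactly, not fuzzily.
            score += w * (1.0 if rec_a[field] == rec_b[field] else 0.0)
        else:
            score += w * similarity(rec_a[field], rec_b[field])
    return score

# Hypothetical registration records for possibly the same patient.
a = {"first_name": "Jonathan", "last_name": "Smith", "dob": "1970-01-01"}
b = {"first_name": "Jon", "last_name": "Smith", "dob": "1970-01-01"}
print(round(match_score(a, b), 3))  # high score: likely the same patient
```

A real system would add normalization, blocking, and a reviewed threshold, but even a toy like this shows why registration-time data quality (consistent names, correct dates of birth) drives matching accuracy downstream.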
It helps if you have a burning platform issue, if you have something that has recently blown up or crashed or collapsed or something like that. I'm trying to think what I can tell. Let's just say that an agency has something that made the papers, and it was perhaps not a thing that put the agency or the organization in a favorable light. These would be the kinds of things that help with that motivational process. But unfortunately, like therapy or counseling in general, you're actually better off getting this when you're healthy instead of when you're at your lowest point, because when you're at your lowest point, which is where people typically turn for help, the helping professionals are going to say, well, you really have to change a whole bunch of habits and start doing a bunch of other things, whereas if you go to therapy when you're healthy, you'll actually get healthier, whereas in the other case what you're trying to do is restore health. I don't know if that makes any sense. It definitely does. For us, it was a question of really understanding what the governor's objectives were, and depending on what level you're working at, it could be the director of an agency or the CEO of your company. What are their hot buttons? What are their issues, and what are they trying to work towards? If you frame it in terms of how you can help them meet their goals through a data management program, you're probably on the right track. Actually, I will give you a specific example based on that, because Jeff is right: whoever the leadership of your organization is, they're going to have specific priorities. And by the way, if you don't know them in your public company, look them up in the Form 10-K that gets filed with the SEC, which Melanie was talking about before, because that's where management articulates the specific challenges they have around those areas.
Virginia put in the effort and became the first state to be certified as having no homeless veterans. That's something our governor is very, very proud of. That was also an effort that was done directly at his specific behest, because he cares very much about the veteran community here. That's absolutely lovely. So, along that same line: can you address, post-DMM assessment, there might be multiple actions, like the formation of a data governance program? Oh, we already went through that. What am I doing? I'm rereading questions. How and when can I, so we got that. Sorry, I'm getting caught up here. That's enough material to write a book, huh? We have one: would this certification be suitable for a CDO, enterprise architect, or data architect? I see that on the list. We have this tiny scroll area on the screen for the questions, but anyway, that's a question. And my answer is yes. In fact, we've had a couple of chief data officers in the advanced enterprise data management expert classes. And we've had other people who are quite senior in our industry, with 25 or 35 years of experience, who've said that chief data officers should take these classes. Because data management is such a broad field, with everyone from the summer intern to the CEO and everyone in between touching, creating, managing, editing, and updating data, there's no one professional lifetime, although Peter is trying, that is enough to master every topic throughout data management. So it really, really helps to fill in gaps, straighten out all your concepts, and really know, from the trees to the forest, how it all fits together and why it is all interdependent, like Peter's diagram with the chain. And I put up here a copy of one of the roadmap examples that you did as part of one of the assessments as well. All right, so sorry, let me get back to my place here.
So it was mentioned that 30 insurance companies were surveyed as part of a DMM assessment. Were there any regulation-specific data assessment controls that might be applicable for insurance companies? So, Melanie mentioned that the regulated industries are the ones who can benefit from this a tremendous amount. This was the slide that had the industries on it. We didn't look at any specific data. What we looked at is how the organizations process their data: do they do it in an ad hoc fashion, in a managed fashion, in a defined fashion, in a measurable fashion, or in an optimized fashion? Those are the five scales that we look at. Perfect. Now we have additional questions, too, on the training available that you mentioned, Melanie. So if we can also get a link to that, we'll get it included in the follow-up email, along with links to the slides and the recording. Yeah, that's great, this is awesome. But Jeff, I see you answered a question here. Let's ask it out loud for everybody: did Arizona train people to do the assessment? And if so, how did that work out? So there are three levels of training that the CMMI offers. The first level is the basic Building Enterprise Data Management Capabilities course, and that's the level where we had 20 people from 11 agencies last month. The next level is the advanced training, and we're currently working on getting a class together for that and scraping together the funding. So we have not gotten to the advanced course yet. The plan is that we have between six and 10 people who are interested in that, and we want them to take the advanced training and form a kind of SWAT team, so that we can have them go out not only to their own agencies but to other agencies and do DMM assessments, and turn it into a small army rather than just one person at a time. And I think, Jeff, what the question was asking specifically, though, was that you are training your own folks.
These are not consultants coming in from the outside; these will be capabilities that Arizona has from this point going forward. That's correct. Those are employees that are being trained. And this is the first state that we've seen that has been willing to invest in this, one of the reasons we wanted to invite Jeff to join us on this call today. Yes, that's true. And it's typically the larger organizations who realize that it's far more economical to train a couple of their own in-house experts if they wish to be measuring their capabilities over time and developing these enterprise programs. So we do recommend that. And it's excellent that Arizona is stepping up, because it is better to have more people, especially when you have 100 agencies and you want to provide services to all of them. We have 35 cabinet-level agencies that we want to reach DMM level three within three years. So that's a pretty big undertaking. We're going to need a lot of people doing these assessments and guiding the agencies through their different projects to reach that level. Jeff, I would say you're guaranteed employment at least through 2020, then. Yeah. It's a non-trivial task, absolutely. Yes. Now, you touched on this a little bit as well. You mentioned insurance companies, but were there any banks in the US and Canada assessed? Oh, yes. Yes, the banks were the first customers of the DMM, and a lot of them used it internally. In fact, Citibank was the pioneer in doing an assessment. They didn't do it using this particular method, because they were global and they were huge, so they had a coordinating team sending out detailed surveys. And the model was only about six months old; it wasn't even complete. But from that, they established a very sound governance program and implemented data quality improvements. And that program is still evolving based on that first initial assessment. Wells Fargo had also done early assessments.
And then they came to the CMMI Institute and said, we did so many early assessments with the early model, and we have seven major business lines, and we're big, and we don't want to do them again as a single enterprise assessment, because the business lines won't tolerate that. But we want you to write us something that our well-educated data governance officers can use for each business line. So we developed an abridged version of the DMM called Compass, commissioned by Wells Fargo, and they're using it internally. So there are many ways to put it to use. Many roads to Rome, right, Melanie? That's right. So, have you encountered any companies or agencies that have applied these principles to spatial or geographic data? It looks as if one of them will be coming up for the state of Arizona. Jeff, do you want to talk about that? Well, spatial data is really everywhere these days. There are so many different agencies that use it or refer to it or need it. And the principles are really not any different. It's a different kind of structure and so on, but fitness for use is just as important for geospatial data. We have, for example, roads that suddenly jog to the right, because the system for measuring where they're supposed to be was different over different periods of time, so they didn't line up when they were actually built or fixed. And we have our state cartographer and our Department of Environmental Quality and our Department of Transportation very actively working with us to help develop that knowledge base for the Arizona Geographic Information Council. And, Melanie, there's nothing in the DMM that would emphasize one type of data over another. It's generic data management. So if you have data that you think might not fit, it actually does.
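Jeff's example of roads that "jog" because two data vintages used different measurement systems is exactly a fitness-for-use check. Below is a minimal sketch of such a check: flag any point where two datasets describing the same road disagree by more than a tolerance. The coordinates and the 50-meter tolerance are hypothetical, chosen only for the illustration.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    R = 6371000.0  # mean Earth radius in meters
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Hypothetical survey points for the same road from two vintages of data.
survey_a = [(33.4484, -112.0740), (33.4490, -112.0745)]
survey_b = [(33.4484, -112.0740), (33.4499, -112.0745)]  # second point "jogs" north

def misaligned(points_a, points_b, tolerance_m=50.0):
    """Indices where the two datasets disagree by more than tolerance_m meters."""
    return [i for i, (p, q) in enumerate(zip(points_a, points_b))
            if haversine_m(*p, *q) > tolerance_m]

print(misaligned(survey_a, survey_b))  # flags the jogged point
```

In production this is a datum/transformation problem (e.g., reprojecting older data into a common coordinate reference system before comparison), but the quality principle is the same: measure the disagreement, then decide whether the data is fit for use.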
Yes, one of our students, for her first assessment, assessed the US Geological Survey's ScienceBase, a collection of research databases, all with scientific information, mostly structured and some unstructured. It was a fairly small scope, and the DMM fit it perfectly. So it's pretty versatile. Shannon, we're getting close to our time limit here, aren't we? We certainly are. So we have more questions about availability to different groups and memberships. Again, we can get all that information out to you in the follow-up email as well. But that looks like the bulk of the questions. Thank you so much, Peter. And thank you, Melanie and Jeff, for joining Peter this month for this great presentation. I know our audience is always very interested in the DMM, how it works, how they can get more involved, and how it applies to them. So thank you so much for continuing to join us and pass on that education. And thank you very much to everyone who attended. Melanie and Jeff, thank you both. And just a reminder: I will send a follow-up email by end of day Thursday with all the links to the slides, the recording, and the additional information that we've been talking about throughout the Q&A here. And thanks to all of our attendees for being so engaged in everything we do. We just loved all the questions that have been coming in, and we really appreciate it. We hope to see you all next month in the next webinar with Peter. I hope everyone has a great day, and thank you so much. Thank you. Have a good day. Bye-bye. Bye.