Welcome. My name is Shannon Kemp and I'm the Executive Editor of DataVersity. We'd like to thank you for joining this month's installment of the Monthly DataVersity Webinar Series, CDO Vision. This series is designed to give you year-round education on data strategy topics in addition to our annual face-to-face CDO Vision event, happening this year in Atlanta, Georgia. We're already well underway planning this year's event. In this month's Webinar Series, John Miley and Kelly O'Neill will discuss real-world data strategy success stories, sponsored today by Hewlett Packard Enterprise. Just a couple of points to get us started. Due to the large number of people that attend these sessions, you will be muted during the Webinar. For questions, we will be collecting them via the Q&A in the bottom right-hand corner of your screen. Or if you like to tweet, we encourage you to share highlights or questions via Twitter using hashtag CDO Vision. As always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and additional information requested throughout the Webinar. Now let me introduce our speakers for today. Well-known industry analyst John Miley is a business technology thought leader and recognized authority in all aspects of enterprise information management, with 30 years' experience in planning, project management, improving IT organizations, and successful implementation of information systems. He is the president and chief delivery officer at First San Francisco Partners. Also joining us is Kelly O'Neill. Kelly is the founder and CEO of First San Francisco Partners. Having worked with the software and system providers key to the formulation of enterprise information management, Kelly has played important roles in many of the groundbreaking initiatives that confirm the value of EIM to the enterprise.
Recognizing an unmet need for clear guidance and advice on the intricacies of implementing EIM solutions, she founded First San Francisco Partners in early 2007. And with that, I will turn it over to John and Kelly to get today's Webinar started. Hello and welcome. Hello. Hello, everybody. Hello, Shannon. Hello, Kelly. Good morning. Good afternoon. Good evening, maybe; I think it's evening for some of our folks, looking at who is out there. Today we're talking real-world data strategy successes. We're going to look at a lot of different types of data efforts. We dove into the archives. We looked at work we've done. We looked at work some others have done, and we have gathered together a lovely collection of things that, first of all, because of our CDO Vision theme here, are areas under the purview of, and of interest to, chief data officers or the top data job. The other driver for this was that these are all successful efforts, with success being defined, well, across a bunch of things. We'll take a look at that. But some of the approaches here were different. Some of the answers were unconventional. So let's just kind of move forward here. Let's just keep going here. Kelly is driving today. And first of all, we're going to talk about what is success. Now, success can be defined a lot of ways. When you're doing a strategy, the big success indicator is that it's engaged and it sticks. And you look at it six months or a year later and people are actually executing it. Kelly, what other items, we talked about this when we were kicking around our topics and stuff, what other items are there for success? Well, we do have a slide on that next. So I can go ahead and advance to that slide. Before we get off of this, I think there's a variety of different ways to look at success from a data strategy perspective.
And one of the things that we wanted to do, and you'll see this as John and I go through the slides, is that there's some big picture successes that cross enterprise information management, which is kind of the realm of the CDO from the big picture. And then there's some little picture stories that talk specifically about successes in specific areas like governance. So we wanted to provide those two views, not just thinking only about big picture EIM, but recognizing the smaller successes too. So back to you, John. Well, there we go. And then, well, you know, when are you done? Right? When is a strategy done? Right, Kelly? I mean, a strategy is done when people are carrying out the strategy, and it goes from strategy to business as usual. So it's visible. People are pulled behind the vision. People are thinking more about data. We're seeing this as a theme a lot with a lot of our current clients, as well as historical ones, and this whole data-driven, data-centric type thinking. Obviously, some benefits should be there, right? Efficiency, productivity, reduced costs, and things like that. And of course, getting to market quicker with your projects and having your data program show some benefit. The real thing, though, is that this strategy becomes business as usual, don't you think? Absolutely. And I think that this is one of the most important takeaways from this call: it's important to have some understanding and agreement on what you're trying to accomplish in the first place, so that you can start to work towards that common goal and you can identify some success, as opposed to just, you know, going along and at some point being questioned: why are you doing what you're doing? I want to make sure that we've got an answer for that. Well, you know, strategy becoming part of the fabric of the organization is another way to say that you're done.
I mean, how many organizations, and I think of the thousands and hundreds of thousands of people listening to us today, how many of them have strategies that are on a shelf in a notebook somewhere? That's not really, you're not done. That is not a done strategy. That is a suspended-in-motion strategy. Yeah, absolutely. Come on, why don't we just move forward here and we can start talking about some of these things. I think I've got the first one here. Yes, we did a large energy company, and the background here is they spent a whale of a lot of money on SAP, like a lot of capital-intensive industries have done in the last 20 years or so. Did that big massive one-time data cleanup and went through the whole upheaval of new ways of doing business and adopting things like that. And of course, the driver here was it was going to take care of all their problems and they could get reports and so on, because everything is integrated and all that. And it's no surprise to a lot of people that a year or so after they were done, nobody still trusted the data. There were still data errors, and they actually were still having business issues. They went for a rate increase from a regulator. One of their business units was regulated and had to apply for a rate increase, and they were denied because of the inadequacy of the data supporting their request. And then along comes smart grid and the Internet of Things, and just when you think you're getting things under control, let's just shove five or six terabytes a day of data at you and start to analyze that too. So they were like, wow, this isn't working out. So moving on to the next panel here, we'll kind of get to the solution. Because of the nature of this organization, in fact it had several strategic business units, some regulated, some not. And because of the nature of their culture, their organization and engagement model was not a Gantt chart. It was not a hierarchical type thing.
It was more of a federated matrix where communities were built: some communities were hands-on, immediately engaged with the issues at hand; others were an extended team; others were involved; and others were supportive of the activity. And we went through and defined the interfaces between these and when things moved from one ring to the other. So in the middle, you'd have your analytic project or an MDM project or data quality or something like that. And you had your team, which might be managers and architects and stewards and things like that. And you had the extended team; if it was a little bit bigger, you might bring in the PMO. If it's MDM, you're going to be pressing on some business processes, so we brought in the subject matter experts. And if it was broad-spectrum BI (we did do a redo of the data warehouse too while we were there), you know, that kind of stuff. And then you have your executives and the business areas that are supportive as needed, external files, touching other systems, interfaces, remote locations, that kind of thing. So, again, the key takeaway is not, you know, stuff we did. The takeaway here is that this was a creative approach, and it was reflective of the culture, and it was flexible. It brought roles in as roles could be brought in. Let's go on to the next slide so we can see how the architecture kind of wrapped up. And this is, again, not your typical left-to-right or top-to-bottom layers of boxes, because they had old stuff, they had new old stuff, and then they had new new stuff. And it wasn't just a transformation or a transition; it was a coordination, almost a juggling act, between all of these. And since they wanted to be data savvy, data driven, then metadata, of course, is really, really important. And getting to and using the data is really, really important. But making everything sync up and talk to each other was important too.
It was a lot more than just MDM, which we ended up with, of course, with SAP. Of course, you've got your de facto gold copies already defined, so it was a matter of synchronization rather than some type of hub. But so there is an abstraction process, or processes, going on all the time here to balance things out and make sure things stay in sync. So a canonical type of idea here, and a layer for getting to it that was very flexible and offered lots of choices, was really, really important. So we had to reverse engineer, we had to manage current state, and we had to be able to forward engineer new things as the market demanded. So those are just some of the things that were done there at this client, and still in process. The takeaways here then, moving forward, are first, reorganize around data. That's a key thing. If data is going to become important, be willing to organize around it. And then operationalize these new thoughts. Of course, data quality becomes a mantra. And one other little thing that was done here, that this particular organization fully embraced on their own rather than just having an external group do the organizational change, is that they internally created and trained change teams that were cross-functional, and not only cross-functional but also multi-discipline. There were executive, middle-management, and operational people on each change team. So it's okay to not be standard here is the takeaway for this organization. They were really proud that they didn't look like everybody else. And middle management embraced it because they were fully engaged in the change teams. And we can have hybrid. You can have stuff that's got old stuff in it in your architecture if it's still working and you're not ready to move from it yet. So those are some lessons learned from that one. Time check here. Let's just move on to the next one. Kelly, anything to add from that one, or shall I just forge ahead? Well, I think the one thing is a recognition, just as you said.
But for every organization, as much as companies are trying to improve their maturity, there's usually a constituency within an organization that wants to be cutting edge and bleeding edge and all of that. The reality is just about every company has old systems. So it's really difficult to rip and replace. So considering how to leverage old systems and old processes and optimize them to support what you're trying to accomplish, I think that's a key realistic and practical learning. Yeah. Yeah, reality just gets in the way sometimes, you know. Right. Yeah. So the next one is an organization that had some operational issues. They were in retail, but they were specialty. So anyone who's not been in a cave in the last 10, 15 years knows that retailers have had some real challenges. And they've all had to make some changes. This one was no different. Some very typical data issues. Merchandising was kind of, one year we get lucky at the holiday season, the other year we don't get so lucky at the holiday season. The data redundancy, when this organization was examined, was amazing. It was the highest redundancy I think anybody had witnessed, among the people I've talked to and in my own involvement with this as well. And on top of that, it's a retailer. They don't like to spend a lot of money on things. So there was a lot of old stuff out there. Again, the same problems we always hear. First of all, mundane data problems. We don't trust it. But also business challenges because of the data, in terms of agility, in terms of Wall Street's view of their stock price. Their stock price was explicitly affected by financial analysts taking a look at their operations and saying, wow, you're not as sharp as your competitors. And those were data issues. So those are their problems. Move on here and talk about some of the things that this organization wrapped their arms around. First of all, a full redo of the data architecture.
Get away from the old business model mentality around their outlets and their touch points and their use of the web. And then create an architecture that allowed for many, many, many types of channels, anchoring it, of course, with their operational data and reporting. But different ways of moving data around and different ways of getting things published and subscribed or requested, things like that. We got away from the local data store. That was really, really, really hard at this organization. There was a lot of "it's my data" type things going on there, and that had to become a matter of policy to move away from. Their biggest issues were probably cultural and political there. But all of it, you know, with a good sense of data management and modeling and consistency, and a good semantic layer there to hold it all together. We'll move forward then to the other really cool thing that this company did, which, I just, it gave me goosebumps to watch it, was working arm in arm with enterprise architecture. A lot of us in the data world don't have that level of fortune. But having the enterprise architects there at the table along with the data architects and coming up with something that was not an add-on. A lot of times the data stuff seems to be a bolt-on to other things. We see that a lot with Hadoop and Big Data and predictive analytic type things, where it's just kind of tacked on to the top of things. And it works and there's a benefit to it, but there can also be maintenance and interface issues and things. In this case, everyone really got together and focused on working together on the architectural side, and that was a pretty cool thing. And again, our theme here is some things that we thought were a little bit different in the way they stood out, and away we go. The results and the takeaways for those of you out there in radio land.
One thing this organization did way before a lot of other organizations was create a top data job. They brought someone in; the title was not Chief Data Officer, it was Executive Vice President of Data Assets or something like that. And they had a formal area and were willing to refresh technologies. They also created internal change teams. They took this evangelizing thing very, very seriously there. The real key takeaways, as I said, are that enterprise architects can be engaged. For some of you, it's easy to do that. For some of you, it's hard to do that. But it's a really cool thing. And the executive team was engaged because there was a new person at the table. Honestly, they weren't well-received initially. It was a tough, tough, tough thing, especially in an industry like retail where roles are very long established. We'll just leave it at that. But that was, again, a neat, neat result from that. Let's move forward here to the next one. And I believe that's one for you, Kelly, or do you want to get another word in here? Yeah, I just wanted to emphasize this takeaway of the engagement with enterprise architects because, well, it's relevant to the next case study. But also, if your organization is fortunate enough to have enterprise architecture, that enterprise architecture group has an opportunity to embrace data, data architecture, and a lot of the standards that are created and meant to be used as part of a governance organization. And so partnering with an enterprise architecture group can help identify more evangelists, more people to work with to improve the formalization of processes like governance and other standards, and to document and improve data understanding as it pertains to architecture.
So we've seen that, many times, a critical success factor is that partnership between the two, the data organization and the enterprise architecture organization, because many times the enterprise architecture organization does not sit underneath the CDO umbrella. So it has to be a partnership. It's not a direct line of reporting. Anyway, just wanted to add that. Sorry, absolutely. I was talking into the mute. All righty. Got it. Data-centric project for executive reporting. Take it away, Kelly. Yeah. So this is actually a great example of an organization that, in fact, didn't have strong enterprise architecture. They were very application-centric. And so it was an organization that had strong application-based architecture under what they called an enterprise architecture, but they didn't have an understanding of architecture across applications. So as a result, in the context of this case study, reporting was done on an application-by-application basis. They tried to create a more unified reporting environment by implementing QlikView, but what they had done is they just transferred that application centricity into another environment and called it a QlikView data source. I'm not sure if anybody on the line has experienced that same sort of thing. But the proliferation of QlikView can happen quite quickly and can exacerbate application-specific silos and, therefore, data silos as well. In this organization, data quality was done on an ad hoc basis. So when quality of data reached a critical level of terrible, they would react and clean it up. The business side was not really that involved in data, and they assumed that their IT counterparts were taking care of it, which resulted in some of this reactive and case-by-case cleanup exercise. And going back to their implementation of QlikView, they didn't really look at what they wanted for the future.
They just re-implemented what they had as an existing system, which wasn't really working in the first place. So all in all, there was a very low level of data management maturity and of a view of data across systems and across the enterprise. As a result, there was very little trust in the data, and the business really wasn't involved. So there was a lack of accountability and involvement on the business side. And this issue around trust in data was like a hot potato going back and forth. As a result of the lack of trust in data, it was really hard to do cross-functional or cross-system analytics. And the end result of that was decisions made largely on gut instinct rather than being data-driven. The organization put together a data strategy, which was a great first step in increasing their maturity around data. So the data strategy was created and approved. That data strategy included activities and resources to improve their data understanding through things like metadata. And there was a recognition that data architecture and modeling was important as a capability. And a long-term plan was created to invest in data quality. So I think that, as an example in contrast to your previous case study, John, this was an organization that was just beginning to improve and grow the maturity of their enterprise architecture. Yeah, little incremental bumps are really good. Those little incremental bumps, people underestimate how valuable just a little improvement is. Absolutely. Just a little acceptance of something new. Absolutely. And along those lines, a wholesale change is also very hard. So this organization decided, as a way to make all of this real, to take a project view of increasing their level of maturity. And so they used projects to increase their level of maturity, which I think is a really good, pragmatic way of doing it, to be honest with you.
It may not have an enterprise-wide impact from the get-go, but it is a way of making it real. So this organization identified a near-term project that was already in flight. It was already budgeted. And it had a reporting component. And that reporting component was a great opportunity to either do it the old way or to take a more data-centric approach that would support the data strategy. That reporting project did include a lot of sales, marketing, and operational data of interest from an analytics perspective. And the goal became to build out subject-area-specific data marts that could be used beyond just this application-based reporting. So it was a way to still meet the goals of the project from a reporting perspective, but start to implement and architect future data stores. So, again, a way of leveraging a project to start to implement the more long-term vision of the strategy. And in the process of building these data marts, they profiled the data, mapped the data, modeled the data, captured the metadata, et cetera, et cetera. And this really increased the transparency and understanding of the data, really far beyond what would have been accomplished had the reporting component been implemented in the previous way it was done, which would have just been: we have these 800 reports, let's just implement these 800 reports again. The other thing is that in order to get all of this done, the business was essentially, you know, kind of forced to participate. So those folks within the organization that were backing the data strategy encouraged the members of their teams to participate in this whole process. By participating in the modeling process (oh my gosh, business people involved in modeling, imagine that), they actually learned why it was important for them to be involved.
And those things that they thought were too technical in the past, like data modeling, they realized, the building of a conceptual data model is actually understanding how customers relate to products, which relate to quotes and applications and other sorts of things that are typically called data entities but are in fact business concepts and terms. So this learning process was also a really critical aspect of this way of progressing the data strategy: understanding why information requirements were important, what they mean, and how they can participate. The other group that learned quite a bit in this process were the business analysts, who started to see the data requirements and information requirements gathering process as being as important as the functional or UI requirements gathering process. So the business analyst team were also participating in this process to upskill what they were doing, which was a great learning opportunity for them as well. So just some key takeaways. The result was really the availability of data for cross-functional, fact-based decision-making, and the improved infrastructure for analytics, so that they could look at not just what happened yesterday but what could happen in the future. And then most critically, the business ownership of information assets was really a huge accomplishment for them and a big change from the way that they were viewing data and information in the past. So key takeaways: using projects can help ground new data-centric strategies in something like reality. It can ground governance in reality; it can ground data modeling and data architecture in reality. And adjusting a traditional SDLC to become more data-centric can help both identify and catch potential data discrepancies, and also slowly build out a data infrastructure and level of data understanding without doing it in isolation.
Regardless of whether this organization was actually an agile organization, they did this in an agile way and were able to document enough around the data to increase that understanding. And then another key takeaway is that incremental change is easier. So although it might not be as enterprise-wide or as pervasive, sometimes it's just easier to get done and more practical and more realistic, and therefore it happens. Sometimes, I think, you know, it's big bang versus incremental, right? Incremental may not be super sexy, but it gets done. Okay. We had a question come in that's easy to answer from this point. Someone wanted specific examples of how to reorganize around data, because I said that in one of the first case studies here. One of the examples here is the data-centric approach. Organizations think in terms of processes: do A, then B, then C, then D. Do this, then do that. But they don't ever think of the data that is actually being applied at probably all of those processes along the way. So one way to do that is to just change your SDLC, which, you know, we're talking about all kinds of different examples here, but that's something we do. The whole waterfall process type thingy is secondary to other ways we do things. But any organization, regardless of how it does things, should be thinking of data. So imagine in your head a matrix: across the top, your columns are functions. And along the rows are, you know, create, update, delete operational activities, and then usage activities with the data. You can do that by domain or you can do that by event or by some other type of category. But, you know, take an inside-out look at your organization. That's the first way to reorganize around it. You know, throw away the A, B, C, D process type stuff and start to think of it in terms of keeping the data supply chain vital.
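As a rough illustration of the matrix just described (the function and entity names here are hypothetical examples, not from any of the case studies), a minimal data/function CRUD-style matrix might be sketched like this:

```python
# Sketch of a data/function matrix: columns are business functions,
# rows are data entities, and each cell records how a function touches
# the data (C = create, U = update, D = delete, R = read/use).
# All names are hypothetical examples.

crud_matrix = {
    "Customer": {"Sales": "C",  "Billing": "RU",  "Marketing": "R"},
    "Order":    {"Sales": "CU", "Billing": "R",   "Marketing": "R"},
    "Invoice":  {"Sales": "R",  "Billing": "CUD", "Marketing": ""},
}

def functions_touching(entity):
    """Which functions create/update/delete this entity (the supply side)?"""
    return sorted(f for f, ops in crud_matrix[entity].items()
                  if any(op in ops for op in "CUD"))

def readers_of(entity):
    """Which functions only read/use this entity (the demand side)?"""
    return sorted(f for f, ops in crud_matrix[entity].items()
                  if ops == "R")

print(functions_touching("Customer"))  # ['Billing', 'Sales']
print(readers_of("Customer"))          # ['Marketing']
```

Reading down a column shows everything one function does to data; reading across a row shows every function that keeps an entity alive, which is the inside-out, data-supply-chain view of the organization.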
And that's a great way to start to organize it. I kind of tacked on to this one, Kelly, so I wanted to grab that one before we went too much further. Sure. Questions are coming in. Feel free. Please, please, please help me with my low self-esteem. Please ask some questions. And I'll kick it back to you there, Kelly, for the next one. And then we'll keep moving here. So I think this one's for you, John. This is governance. Oh, it is. Sorry. Yes. And then we go back to me. Okay. Isn't this your favorite? This one was cool. There are some organizations which, and I know by the questions we've received from many of you over the years we've been doing this particular topic, this particular webinar, a lot of the questions are: how do you get started? My organization, it's politics in a way, or culture, or whatever. And there are ways to do it. And this is one where there's an organization that, you know, you've got this spectrum where at one end there's no governance, and at the other end there's what a lot of people used to think was necessary: this, you know, iron-fisted, centralized, enterprise-wide, follow-everything kind of thing. And, you know, no one fits either end of those, really. But what's in between has to be defined. And this organization is, by design, extremely, extremely decentralized. It is designed so that operating units can operate independently within the corporate framework. When you would approach leadership about something that might integrate things, the answer was: as long as it doesn't affect this philosophy, which is, we are this way by design, we want to have all these separate operating units. Okay, great, that's how you want to run it. Now what do we do? Well, there goes the central thing, all of that. But we still want to get things out there that are important, like compliance. Because if there's a compliance issue, the whole organization takes a hit. It's not just that operating unit, all right?
You know, ACME organization is the one that gets the fine. It's the one that gets the reputation ding. It's the one that gets the stock price hit. It's not ACME's operating unit Z, okay? So you have to be aware that there are some enterprise things. But then there's some other stuff where you're not going to stop operating unit Bravo and operating unit Charlie; they're going to do their own thing, even if it's the same data. And you know what? That's it. You've got to live with it. Well, maybe they can collaborate. Maybe they can cooperate. So the term that was used in this is referee, right? The referee gets out there on the field and says, you're following these rules and that's great. Oh wait, you broke a rule. So there's a yellow card, okay? Or for American football, you throw a flag, okay? And you've got to follow the rules. But other than that, the game ebbs and flows and people kind of do what they want to do. So the referee model really, really sat well with this particular culture, because we wanted the right data across all the different levels, a baseline of stuff, so that the reputation and the image are maintained, but folks can kind of go off and do their own thing. So it's a federated, almost confederated type thing. And of course there has to be a roadmap to get there because everybody has to get something. So you've got all these units and all that. We'll go to the next one and we'll kind of talk about how this panned out. What we ended up doing was having some local oversight and some type of central oversight, some formal engagement, again with the support mechanisms. And each unit, all they had to do was follow a basic set of rules. So again, I like the example of soccer, or football if you're in Europe. And that is, the teams ebb and flow. There are no rigid lineups. People fill in the roles where they need to fill in the roles. Yes, your offense. Yes, your defense. Yes, your mid. Yes, your forward. But those are roles.
But there are minimum things that have to happen to keep the game running. And everyone knows those minimum things. The referee's there in case you kind of push the rules a little bit, and then that little card comes out. And that's what went on here. So the local areas, you think of them as the teams or the squads that were there. And they had what was defined by this organization as a minimum viable state. You do your own thing with your data, but you've got to attain just some basic behaviors so we can all talk to each other. And it played out on this operating framework: the local areas, and then the very, very top, where there were real issues or something enterprise-wide had to come down. And they would then, of course, work with their various projects and programs and things like that. Did we end up with metadata by business unit? Absolutely. Did we end up with multiple data models there? Absolutely. This organization, maybe, are they going to have multiple CRMs someday, or multiple BI things or warehouses someday? Absolutely. All right. But there's a minimum state that they're going to aspire to, to make it work for their organization. Let's go on to the next one. Here we'll talk about how that rolled out. Rolling that out was a real set of choreography. They worked really, really hard on this. This is a roadmap, not via projects, but via agile releases. So various things were defined to go into operation to achieve a business-as-usual state. Various things were modified along the way. Various measurements were put in. But when it came to the projects, notice what's not on here: explicit projects. That's up to that area. That's up to the team out there. But whatever you're doing, apply the rules. A really cool, innovative effort; really neat to watch this happen, really neat to watch this unfold. Very, very interesting effort. A lot of work.
I will say this, for those of you listening, you've probably learned this by now. This stuff is not easy. This was a lot of work with a lot of people scratching their heads, sitting around tables, and working within the constraints of the realities of the organization. But from a success standpoint, and we talked about being done and being successful, it is being addressed. It's not accelerating at light speed. To an outside observer, it might seem like watching paint dry, but it is moving forward. So the takeaways from this were all kind of interesting. The big takeaway here was this minimum viable state, where each division had to achieve, just follow, some basic rules. Where there were some critical concepts they couldn't agree on, they agreed to agree on a few things. They also agreed to disagree on a lot of things. And maybe from a pure data architect standpoint, it might make you nuts, but it worked for them, all right? And their data sharing and data consistency is improving at this organization. The minimum state is still progress. That's the takeaway here. You may have an ideal future state, but for a lot of this, you may never reach your ideal future state. It might not be because of anything you did with your strategy. It might be because business changes and the strategy has to change. But having something that you can get to early and make it stick and measure it and make sure it sticks, that keeps you from falling back. Think of it this way: I had a very old car once, and if I had a flat tire, I could not rely on the emergency brake. And so I had to carry a big cinder block in the trunk of my car, and I would put that cinder block behind the tire so I wouldn't roll down the hill as I changed my flat tire. Think of the minimum viable state as the cinder block. It keeps you from moving back to where you started.
Of course, you know, of course we tuned this to the culture we had to, all right? The referee body was something everybody could accept. And then the high-priority areas, some stuff's really visible. They said, well, we're going to work on this, and it's our high-priority area, and they did their high-priority area. This organization chose their own prioritizations but then applied the refereed minimum viable state model to the work as they went. The next one, I believe, Kelly, is for you. Taking myself off mute here. So this was an organization that was going through a transformation of many of their back office systems in order to update them, get them current. Some of them were off support. And as well to increase their capabilities as they were going through a high growth period. So in doing so, and in looking at all of these back office systems, they realized that their customer data was inconsistent across these systems. And if they were truly going to support their future growth, it needed to be rationalized, improved, and potentially centralized. So enter master data management. Master data management was identified as the best way to do this. But they needed a way to make it incremental and specific, not general or big bang, if you will. And they needed to understand their data a bit more thoroughly, look at it across the different systems, not just system by system. They needed to identify those quality gaps that were meaningful to them and that were meaningful to the mastering process. And lastly, they had just launched a data governance initiative. So they needed to figure out how to take this newly created discipline and organization and have them participate in the data mastering process. So the solution was to align MDM to specific business use cases. And these use cases were incremental and would build upon the previous use cases. So the idea was that a complete customer master could be created over time and driven by specific end results.
One of the high-priority business programs, as part of their redoing of many of their systems, was a relaunch of their consumer online portal. And this was chosen as an opportunity to align with the data mastering efforts because it would eventually cover all of their customers. And it was also being rolled out customer segment by customer segment. So this sort of iterative approach worked well and aligned well with the concept of prioritized use cases. The prioritization process was driven by the business to ensure that there was understanding, approval, and commitment to these use cases and expectations. And those use cases were mapped based on the level of complexity, the effort to the organization and in the MDM solution, and the value to the business. So rationalization efforts that integrated different types of parties and customers, that were highly complex and involved multiple business units rather than just two, were mapped on the higher-effort side; they were potentially higher value, but also higher effort. So there was a mapping and a prioritization process that occurred. Then, in order to make this real, it was put into a business-driven roadmap that included the technology implementation and the governance activities that were necessary to meet the use cases over time. So the process of creating and approving this roadmap aligned the organizations to a longer-term plan and the understanding that this was an incremental plan. So again, being driven by the use cases helped to create some business recognition of what is happening, why it is happening, and what's the end result I'm going to see. There was also a training and awareness plan that could be implemented incrementally as well and that was also aligned to those use cases. And having that training plan aligned to the use cases ensured that the understanding of MDM and governance was integrated into that business process in the first place.
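The effort-versus-value mapping described here is essentially a prioritization matrix. A minimal sketch, with entirely made-up use cases and scores (nothing below comes from the actual engagement), might rank candidates by value-to-effort ratio:

```python
# Hypothetical sketch of mapping MDM use cases by business value vs. effort.
# Use case names and 1-10 scores are illustrative only.

use_cases = [
    {"name": "Single customer view for online portal", "value": 8, "effort": 3},
    {"name": "Cross-business-unit party integration",  "value": 9, "effort": 9},
    {"name": "Dedupe customer records in billing",     "value": 5, "effort": 2},
]

# Rank by value-to-effort ratio so incremental, high-payoff use cases go first;
# highly complex, multi-business-unit work naturally falls later in the roadmap.
ranked = sorted(use_cases, key=lambda uc: uc["value"] / uc["effort"], reverse=True)

for i, uc in enumerate(ranked, 1):
    print(f"{i}. {uc['name']} (value {uc['value']}, effort {uc['effort']})")
```

In practice the business, not a formula, made the final call, but a simple ranking like this can make the trade-offs visible during the prioritization workshops.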
So it wasn't done as an isolated activity, but it was truly integrated into the way that that data was being used. And it gave people an opportunity to adapt over time. So essentially a key takeaway and result of this is that because the use cases were business driven, the business community better understood what's in it for me, how am I involved, and what do I need to do about it. The business was also able to influence the MDM plan based on what they felt was a priority. So it wasn't actually prioritized based on which data was easiest to access, or this system by that system, but it was really based on business use cases and what was meaningful to them as a business. As well, the MDM and the data governance participants understood how they would work together and therefore how they would create and agree upon things like definitions, data quality requirements, usage priorities, et cetera, which would be important for the long-term sustainability of the solution. So I think from this organization's learning, the key takeaway of this whole process, and the learning for the folks on the phone, is that mastering data is a business process. It's not just the implementation of a tool, although a tool could make that business process more auditable, easier to trace, et cetera, et cetera. So understanding that mastering data is a business process is a great way to look at the success of an MDM strategy. I'm just going to move on to that next use case, the last use case, and just in the interest of time, want to make sure that we go through this but also have some time for additional questions and conversation at the end here. So this company identified a need for metadata management as a result of an enterprise transformation initiative as well. The company had launched a customer experience initiative to improve the way that they sold and serviced their customers.
The customer experience initiative touched just about every group within the company and had both internal and external goals. So it was clear that in order to implement an improved multi-channel customer experience, they also needed to better understand their customer, better understand their customer data, and therefore master their data. But specifically, what could mastering their data do? So similar to the previous company, an exercise was initiated to look specifically at how mastering data would enable the omnichannel goals, would enable the customer experience effort, and how it would assist in delivering the vision. So the exercise went through and said, okay, how would mastering your data actually improve revenue growth? Well, if we could understand who our customer is better, we could increase our upsell and cross-sell and see an incremental increase in overall premium volume. So they went through, business purpose by business purpose, and understood exactly how master data management could impact that business goal. So again, this is the idea of creating some specific definitions of success that map to the expectation of how that data is being used. The other thing that was quite interesting here is the call-out of some of these improved internal capabilities, where improving those internal capabilities would consequently improve the customer experience. For example, creating an improved understanding of customer knowledge internally would then help people better service and support their customers by making that information more available to the organization and more available to people like call center agents and that sort of thing. It actually improved satisfaction, because agents were able to understand the customer's products, understand what they owned, what policies they had, and respond to them more quickly, which meant that they had happier customers, which meant their call center experience was better.
So lots of specific things here. And then the solution was to map all of those goals from an MDM perspective over time and address both short-term tactical goals as well as long-term predictive analytics goals. Specifically, this was put into a very tangible, tactical long-term plan that clearly identified which lines of business needed to be involved at what period of time and how this was all going to work, so that they could plan for it. There was recognition, you could see by the timeline at the top, that this isn't a quick fix, that this needed to be a long-term plan, and therefore people agreed to their participation and understood what their upcoming participation would be over time. There was the involvement of not just master data management, but data quality and data governance, and how it was all aligned together. And another key success factor here was the identification of training, communication, and general change management and change leadership to make sure that there was a constant understanding of what the change is going to be, why the change is important, how it relates back to those individuals and their ability to do their job, and therefore how it relates back to their own internal customer knowledge and the customer experience plan in general. So just a couple of key takeaways. Aligning an approach like master data management to a strategic initiative helped to ensure the funding and the business involvement. So they just viewed it as part of the overall customer experience plan, not as an isolated technical hub deployment. Creating an iterative implementation ensured that the pace reflected their ability to absorb the data and the available funding. So rather than allocating multiples of millions up front, which wasn't realistic for this company, they were able to pace the implementation according to their anticipated profitability over a period of time.
And clear goals and expectations of what was required now and what was going to be required in the future helped to encourage and involve those key stakeholders because they knew how to be involved. So, key takeaways of this: doing MDM in isolation can create a challenge, and therefore aligning it to some strategic initiatives and strategic approaches is a good way to get it funded, get people involved, get the resources you need, et cetera. Understanding that there will be constraints, like funding, like people, and planning accordingly is really important. So some of you may have looked at the previous slide and thought, oh, my gosh, four years. Who can plan four years out? I can hardly plan four quarters out. You know, it might not work for your organization, but the idea is that there are some constraints in every company; making sure that you recognize what those are and plan accordingly will help you be more successful. And then following a plan to align to a data-driven culture will help to make sure that people understand what they need to do, why they are doing it, and how it is integrated into the overall transformation and experience of the company. So the idea of creating a vision of the future customer experience, why that customer experience vision and omnichannel vision is important, what that new engagement process will be in that picture of your future customer experience, what the plan will be to get to that future state, and how each individual participates in that overall customer experience. All of that is important to involving and rallying the participants to be engaged. I think that's the last case study we've got here, John. Yeah, well, do you have a couple of questions we can get to? Yeah, yeah, we do. We do. The first is kind of a real discrete one, and then hopefully we'll get some of the others in. Within the last two or three case studies, someone asked about what type of metadata tools were out there.
Do you remember, metadata was mentioned prominently in all of these. Obviously, what tools were people using? I know for some of the ones that I'm more intimate with, they were using spreadsheets and some crude leftover data management tools from the 90s, but maybe there's some other ones out there that we wanted to just throw out there. I think there's quite a number of them, actually. Yeah, you know, but this isn't a product pitch, so I don't want to be talking specifically about products necessarily. You know, one of these companies leveraged the metadata capabilities that were available as part of their MDM solution and their ETL solution with an Informatica deployment. Another organization used a third-party metadata tool that ironically wasn't part of their ETL, but they used Ab Initio as part of their metadata and business glossary also. There's lots of metadata tools out there on the market. Just to call out, the business glossary is a type of metadata, so when you're thinking about metadata, it's not just technical metadata, but also business metadata. There's lots of choices out there for you. Some of those metadata tools do support some unstructured data and do work within the concept of a data lake. Sometimes you need to look at other metadata tools if you're dealing with unstructured data versus structured data, so again, more choices for you. Yeah, yeah. One more, another one here. Let's try to get maybe one or two more in here. Should data-related initiatives be led by individual business operations or by IT? In other words, if I want the data to be reliable, should this effort be led by IT or individual business units? I'll let you take 30 seconds on that one, Kelly, and then I'll take 30 seconds on that one.
Yeah, so I think in a perfect world, it would be a partnership between both, because the reality is that the business knows why they need the data and what the data should be used for, and the IT organization generally understands where the data is instantiated in systems and databases and that sort of thing. So in a perfect world, it really is truly a partnership, and each organization has a different role in making the overall data strategy successful. Yeah, and for someone who says, well, I have to have one or the other, here's the risk you're embarking upon, and this is absolutely factual. If it's led by IT, then the impression is that IT now owns the data and is responsible, and people just come up to the window, like they're going to get an ice cream cone, and say, give me the data. And IT hands it out, and they go, well, this isn't what I ordered, and it'll go back and forth. And you will permanently immerse yourself in that fire drill until you start the partnership that Kelly talked about. Conversely, if it's led by the individual business unit, it will be siloed. It'll be absolutely wonderful and an awesome solution for that area, and eventually someone who has to support it will wonder why we are spending all this money for this one thing. So there are big downsides to picking one side or the other. The best way is the way Kelly said. I think we've got time probably for one more here. And we can broaden this one. You mentioned omnichannel in the last case, and I mentioned it around retailers: how are retailers dealing with the data issues that are in omnichannel? And I'll take the retailers. Almost everybody now is in the omnichannel type of mindset. In the retail area, the ultimate goal still stays the same. Somebody buys something, whether it's more visits to your store or more visits to the website, or a seamless experience across everything from apps through store to whatever, that is still the goal. So you have to master those touch points.
The real data thing that I have seen most recently is analytics, predictive analytics, around all the touch points and all the interactions, and what the results of those are, and then trying to predict all of the circumstances. And it's going way, way beyond the fact that I looked at tennis rackets on Amazon and then for the next three weeks my Facebook page shows me tennis rackets. It's going way, way, way beyond that. It's talking about every possible way I can engage and capturing something along those lines. In retail, though, of course, it's executing that through to me buying something, getting it out of the warehouse or off the shelf and into the consumer's hands. Kelly, anything else in the other industries? And then we'll have to wrap up here. Yeah, you know, I think that consumer engagement happens across every industry, not just what we think about retail being, like, you know, Nordstrom or something like that. I was talking with a company yesterday that provides loans, and they have a retail storefront approach, and they're trying to grapple with that same challenge right now: how do they actually attribute the marketing message to why that person comes in and gets a loan? And I think the reality is that in our world, it's not single-stream attribution anymore. You can't say that somebody drove by, or they heard something on the radio, or they got an ad on Facebook, or their friend told them that it was a great place to go. It's not single channel. Sometimes it's all of those. And it takes four, five, ten touch points for someone to decide, I'm going to go walk into that retail establishment. And that recognition, that maybe we can't isolate attribution so purely, is important. And therefore, how do we look at levels of trending, and can we identify relationships in the data that give us some trending analysis to make decisions in an informed way? All right. I think that's it. The rest we'll have to get to at some other point.
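The move away from single-channel attribution that Kelly describes can be illustrated with the simplest fractional model, linear attribution, which spreads each conversion's credit evenly across every touch point in the journey. The channel names and journeys below are invented for illustration; real programs use far richer models.

```python
# Illustrative sketch of linear (fractional) multi-touch attribution:
# each conversion's credit is split evenly across its touch points,
# rather than assigned to a single channel.
from collections import defaultdict

def linear_attribution(journeys):
    """Spread one unit of conversion credit evenly across each journey's touch points."""
    credit = defaultdict(float)
    for touch_points in journeys:
        share = 1.0 / len(touch_points)
        for channel in touch_points:
            credit[channel] += share
    return dict(credit)

# Two hypothetical customer journeys ending in a store visit / loan signing.
journeys = [
    ["radio", "facebook_ad", "friend_referral", "store_visit"],
    ["facebook_ad", "store_visit"],
]
print(linear_attribution(journeys))
# facebook_ad earns 0.25 + 0.5 = 0.75 of the total credit across both journeys.
```

Even a toy model like this makes the trending point: no single channel explains the conversions, but aggregating fractional credit over many journeys surfaces which channels consistently appear.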
Back to you, Shannon, to wrap us up here. Thank you, John and Kelly, for another great presentation, and thanks to Hewlett Packard Enterprise for sponsoring today's webinar. Much appreciated. Just a reminder, I'll be sending a follow-up email within two business days with links to the slides and links to the recording of the session, so by end of day Monday for this webinar. And thanks to all of our attendees for being so engaged in everything we do. We just love all the questions that come in. Everyone have a great day. Thank you. Thanks very much.