Hi, everyone, and welcome to our next EDW session, "Moving the Needle on Data Quality: The Foundational Framework as Implemented by a 110-Year-Old Fortune 500 Company." This talk is being presented by Prathu Shikanchanam, who is the product owner of Data Governance, Data Quality, Metadata, and Data Privacy at Mutual of Omaha. A few logistical notes: all audience members are muted during these sessions, so please submit your questions in the Q&A chat window on the right of your screen, and our speaker will respond to as many as possible at the end of the talk. We also want to note that there is a linked form at the bottom of the page where audience members can leave evaluations for the session. Those are really helpful, so if you have a chance, please fill one out. All right, let's get started. Thank you, and welcome, Prathu Shikanchanam.

Thank you, Natalie. I'm really excited to meet all of you. We'll see more questions, probably, at the end of the discussion. Today I'll be speaking about a foundational framework that we have used and that has helped us move the needle on data quality. We all see a lot of articles and discussions saying that data is a strategic asset and that we want to use data as a differentiator. But at the same time, you may have many analytics and data initiatives going on, and they can only be successful if the analysis they provide is useful, right? For that analysis to be useful, correct, and accurate, it is very important to have the best data quality, and that is where data quality takes on a lot of importance. Oftentimes, though, we see that while data analytics and data initiatives get so much attention, data quality programs struggle to get established in organizations, and that is what I will be talking about.

Again, as an introduction: I'm the product owner of data governance, data quality, security, privacy, and metadata at Mutual of Omaha. Mutual of Omaha is an insurance company, a 110-plus-year-old company, and some of you may know us from Wild Kingdom, which we used to host. Being an insurance company and a 110-year-old company, we have a lot of data, a lot of data systems, and the technology landscape is pretty complex.

Today I'll be talking specifically about the data quality journey we have taken: how we got started, what our roadmap was, what framework we built that worked for us, the metrics we built, and the strategic value this whole program provides to the organization.

Moving on to the starting point. Some of our challenges were, as I said, a complex technology landscape. We had a lot of legacy systems, but as we moved through the years we also added the newest technologies, so the landscape itself is pretty complex. We also had quite a siloed approach for many years. The SBUs were given a lot of independence as they built their systems, and as they were building them, they did not think about how the other SBUs or SSUs could use them. There was some level of integration and discussion, but there was also that independent way of saying, "Okay, I need this problem solved, this is the data I need,
this is the data system," and just standing it up right there. Then we also saw a lack of ownership, in the sense that there was no clarity on who owned what, because there wasn't much of an inventory of assets, data systems, and applications. That was another challenge. The other thing we saw was the perceptions around data quality itself. When I say perceptions, I mean there was no clarity on our definitions. What one person would consider a system of acceptable quality, somebody else might say is not acceptable, because the clarity was lacking and it was all based on people's perceptions and the discussions they had. It was very subjective in nature, and there was no measurement in place. We also saw that there was no clear understanding of the data quality impacts, meaning we knew there were data quality issues, but what did that mean? What was it impacting? How could we improve efficiencies? We didn't have that.

With all this, some of the impacts, as any of you would recognize, were these. Risk posture is one, from the data quality issues we may have. Then there is a lot of rework because of data quality issues; we were continuously having to resolve those issues or redo the work. Then there was improper capacity utilization, which came about because we didn't know what an issue was impacting or how, so we worked on tickets first come, first served instead of working in a prioritized way. The other impact we were seeing, and this was big, was that any new project getting started needed to understand where the data was. The same data might be available in a couple of places, and teams had to figure out which one was the source of truth, which had the better data quality, and where they should take the data from. Those kinds of discovery efforts were really costing the organization.

So we looked at the challenges and the impacts, and that is when the organization said, no, we need an enterprise data governance program, and we need data quality as part of it. When we started the enterprise data governance program, we had data governance, data quality, metadata, and document and content management as part of it.

After that we moved on to the roadmap. We started with a maturity assessment. As many of you may have seen, there are several maturity assessments for data: Gartner has one, Forrester has one, the Data Management Body of Knowledge describes one, and CMMI has one. There are several of those, but because we hired a consultant to do ours, we used the maturity assessment the consultant had. It was not limited to data quality; the scope was broader and covered the entire data management and data governance landscape. So we knew which maturity assessment model we were using, and we went with an external assessment. The conclusion was that it re-validated the analysis we probably already knew; it was a re-validation of what we understood as our impacts and challenges, and that came through in the assessment. Some of the main things they pointed out were that there was no clear definition of data quality,
there was no measurement happening, and again, that gap between perception and practice. So it was a re-validation of what we already had. The consultant also put together an execution roadmap for how they would execute the data quality program. But because of how we were spread out into different SBUs and SSUs, and where we were in our overall data journey, we said we would start the program ourselves and establish it ourselves, so we adjusted a few things to suit that.

The big thing, when we did the roadmap, was that we took a step back and understood that the first thing, as was very evident, was to define data quality and understand the scope of the program: what is it we are trying to address? We put that in place and then started to think about the roadmap itself and what we needed to build into it. At a very broad level, we divided it into a reactive component of data quality and a proactive component. I don't know if any of the articles speak about it that way, but it was very pertinent to us. Think of it like this: we had data quality issues and no good mechanism or visibility into where those issues were. Resolving data quality issues that are already in production is the reactive mode, and we had to establish a framework for that first; then we could move the organization toward proactive data quality, where we can use tools and establish more automated ways of doing things. Today I will be focusing more on the foundational piece, which is the reactive data quality, but also on how we pivoted toward the proactive piece that comes next. We also had that foundation of how we monitor this, how we socialize it, what our communication mechanisms are, training, and all of that. The other important part, and this is because of data governance rather than just data quality, is alignment: data quality was in any case part of data governance, so how do we align with the data management communities of practice that we have? That was very important, and it was key to our success. I'll speak a little more on that.

Once we did that, let's move on to the framework and how we built it. These are the different things we considered: what do we need to address as part of the framework? One is the definition and the scope. What is our operating model? What is the ownership, and how do we establish it? How do we track data quality issues? What should the workflow be? How do we prioritize? These were some of the things business stakeholders were asking for, saying "we would like to see this, we would like to see that," and that is what we were trying to understand as we thought about these components and built the framework.

When we moved to the framework, the first thing we did was define data quality, and this is where I can say we really leveraged the DMBOK, the Data Management Body of Knowledge.
It has been absolutely useful for us, especially for the foundational framework we were trying to build and where we were in our journey. It was a very good starting point for thinking about how we connect all these knowledge areas. It was not just about data quality; it is about data governance and how we operate together as an organization. So it was very, very helpful to leverage the Data Management Body of Knowledge, and we simply adopted its definition of data quality. We also adopted the data quality dimensions, provided them to the organization, and took input from the organization as well. Apart from the data quality dimensions in the DMBOK, there is at least one that we added, where the subject matter experts said, "This may not be 100% a data quality issue, but I have not seen this trend before. We expected 50% of the policies to come through this way by the end of the quarter, but this quarter we are not seeing those numbers; it has dropped quite a bit, or it has increased quite a bit. Is that a true trend, or is there some issue in the data pipeline itself and we are seeing wrong data?" They wanted something for that, so we created one dimension for it, which was very specific to our organization. The rest of our dimensions are basically accuracy, consistency, validity, the same ones you see everywhere. The names may be a little different, but we adopted whatever was in the Data Management Body of Knowledge.

We not only did that, but as we were doing this, the whole data governance program was being established, and we said this is a good book and we should adopt it for the entire organization, because it could help us talk the same language. When we talk about any terminology or any of the knowledge areas, people would understand. We needed that standard language so we could understand each other. So we have an enterprise license for it, and we have been using it very successfully.

Moving on with the framework: the definition, the scope, the operating model, the ownership, queue creation, workflow, and prioritization; we spoke about those, and that is what was on the slide. The next thing we did was write the data quality policy and publish the standard. The data quality policy was the overarching piece, hooking onto what we had in the data governance policy but talking more specifically about data quality, how we are going to operate, and what it means for the organization, laying that foundation. Then we published a standard, which covered the dimensions we were adopting, how we were establishing ownership, and things like that. The next thing we did was socialize across the organization. Again, the data quality program was new and the data governance program was new, so we did a lot of what we called road shows, where we would go to different teams and present where we were in our roadmap, what we were trying to do, what the standards and policies were, and, to some extent, gather their input. We were constantly in the loop with several of the teams, using that as our input mechanism even as we built our framework.
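As a rough illustration of that trend-based dimension, here is a minimal sketch of what such a check could look like, assuming a simple expected-share threshold. The field names, numbers, and tolerance are hypothetical, not the actual rule used at Mutual of Omaha.

```python
# Hypothetical sketch: flag an unexpected shift in a quarterly trend so a steward
# can check whether it reflects a real business change or a data pipeline problem.
def check_trend(expected_share: float, observed_count: int, total_count: int,
                tolerance: float = 0.10) -> dict:
    """Compare the observed share of records against the expected share."""
    observed_share = observed_count / total_count if total_count else 0.0
    deviation = abs(observed_share - expected_share)
    return {
        "observed_share": round(observed_share, 3),
        "expected_share": expected_share,
        "flag_for_review": deviation > tolerance,  # possible pipeline issue
    }

# Example: roughly 50% of policies were expected through this channel, but only
# 31% arrived this quarter, so the check flags it for steward review.
print(check_trend(expected_share=0.50, observed_count=3100, total_count=10000))
```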
Along the same lines, if you look at the Data Management Body of Knowledge, there are several knowledge areas established. With those knowledge areas, we said, okay, let's categorize these as data governance communities of practice and data management communities of practice. We identified those, and we introduced one more thing, data privacy, which you probably will not see in the Data Management Body of Knowledge. But being an insurance company that is heavily regulated, privacy was top of mind for us, and it was not something we wanted to leave out. We especially wanted the privacy office to be more involved with any of the data initiatives, so that they would hear and see what the rest of the organization was trying to do and could provide their input and whatever compliance and control mechanisms were needed. We wanted them to be part of the entire journey we were taking.

So we had these knowledge areas and the communities of practice, but we also established what we called a data review for projects. This is not specific to the data quality piece, but data quality is one of its components, and that is where the importance lies. Think of it this way: any new project, any new data initiative across the organization, would have to submit a data review request, and that request would go to representatives from all of these knowledge areas. The architect of that project would come in and present their vision and what they were trying to do. There were some standard questions they would have answered even before coming in, so the subject matter experts would go in knowing what the project was providing us. Then we would listen to their architecture and what they were trying to do, and check whether they were adhering to the standards and policies from each of these knowledge areas, and if not, what recommendation we should give them. There might be a novel use case, new technologies being brought in, or a new way of using data that is not yet covered by a standard or policy; that also drove what other policies or best practices we needed to create. So this was the data review we did for any project, with follow-up recommendations the project needed to close out on. I wanted to take a moment to explain that, because it was a huge value for us: we were bringing all the knowledge area representatives into one room to discuss what was happening in the organization for new projects and to set those new projects up in the right way, at least. And this is where the Data Management Body of Knowledge, the DAMA material, really helped us. Apart from those knowledge areas, we also had data analytics as another community of practice, and oftentimes they would come in and work with us as well. So the three communities of practice, data governance, data management, and data analytics, all worked very, very closely. It has almost become the way we work together, and it is well known by the data owners and stewards of the organization.
Going back to the framework: we established the definition, the standard, and the policy, and then we defined the scope. Is it one specific data domain, or all enterprise data, or an SBU-by-SBU kind of approach? We went all in; we said all enterprise data. We knew adoption takes time, and we were okay with that, but we wanted to establish the data governance and data quality programs in such a way that the organization knows this is for the whole enterprise, an umbrella, centralized function that works with each of the SBUs and SSUs and with all of the data management functions and knowledge areas. So we established that, and for data quality we also said that, yes, all enterprise data falls into this. Then we said it is production data systems we are worrying about, not POCs or other initiatives you are trying to stand up.

The operating model was another important thing we were thinking about. There are a lot of questions when you stand up a new program: is it centralized, is it decentralized, what do we do? I think we went a hybrid way, or maybe you can still call it centralized. We, as enterprise data governance and data quality, were the centralized program. For data quality, for example, we were establishing the framework, the standards, and the policies the enterprise has to adhere to, and we were monitoring and tracking; we were the centralized function. But at the same time, we were not the ones with the data quality analysts or the stewards who would look at the issues and resolve them. That still lay with the data system owners. So we had centralized program management with decentralized implementation, is what I can say.

The other thing we established was ownership, because clarity around ownership was always needed. For every data system, we started identifying who the business data system owners were and who the IS data system owners were. So we had owners and stewards, on both the business side and the IS side, or the IT side if you want to call it that; we call it Information Services, that is what we have. We established and socialized those owners and stewards, and the enterprise had one place to go and look that information up.
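As a rough illustration of that ownership model, here is a minimal sketch of a data system registry that records business and IS owners and stewards so a ticket can be routed to the right people. The structure and all the names are hypothetical, not Mutual of Omaha's actual inventory.

```python
# Hypothetical sketch of a data system ownership registry: for every production
# data system, record the business owner/steward and the IS owner/steward so the
# enterprise has one place to look up who is accountable. All names are made up.
from dataclasses import dataclass

@dataclass
class DataSystemOwnership:
    data_system: str
    business_owner: str
    business_steward: str
    is_owner: str      # Information Services (IS) side
    is_steward: str

registry = {
    "claims_warehouse": DataSystemOwnership(
        data_system="claims_warehouse",
        business_owner="Jane Doe", business_steward="John Roe",
        is_owner="Ann Smith", is_steward="Raj Patel",
    ),
}

def lookup_stewards(data_system: str) -> tuple:
    """Return (business steward, IS steward) for routing a data quality ticket."""
    record = registry[data_system]
    return record.business_steward, record.is_steward

print(lookup_stewards("claims_warehouse"))
```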
Then we moved on to how we track these data quality issues and what we needed to establish. We looked at some of the tools, we tried them, and we ended up saying that since we were not getting into a tool yet, let's just establish something in Jira that can track these issues. All our other projects were using Jira queues anyway, so it was a good way to link into those projects. So we established a customized workflow in Jira, and I'll speak about that on the next slide. When the consultant came in, they had identified all the data quality issues sitting in the different queues of different teams, and they brought those together. We put all of that into the queue, so the queue was already loaded up with data quality issues to start the program with.

For the workflow, we said a data quality issue can be submitted by any associate; it was open to everybody. If you see it in a report, whether you are just a user or a steward, you can submit the issue. It would first stop at the data quality program, and we would make sure that whatever information was submitted in the ticket was enough for the stewards to act on; if not, we would go back to the submitter and get that information. Then we would pass it on, first to the business steward. We split the steward reviews into a business steward review and an IS steward review, and after both reviews and the resolution happened, the ticket would come back to the data quality program for closure, and we would close it. The reason we differentiated the business steward and the IS steward was that the business steward review was more about, "Is this issue really being caused by my data system, or is it something further down the line, or upstream?" That is what we were trying to understand, and the business steward could always reassign the issue to somebody else if they thought it was another data system causing it; we wanted to give them that option. They would also complete an impact evaluation, which I'll go into more on the next slide. Then they would pass the ticket on to their data system's IS steward. The IS steward would look at the issue, say "this is the kind of resolution approach we need to take," make that decision, document it, and put an effort estimate on it. Then they would create a separate ticket in their own team's Jira and link it here. That is what I meant when I said we were linking to their processes while still keeping ours separate. This was how we avoided disrupting any of their processes.
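To make that ticket flow concrete, here is a minimal sketch of the workflow as a simple state machine. The status names paraphrase the talk; the actual Jira configuration is not shown here, so treat this as illustrative only.

```python
# Illustrative sketch of the data quality ticket workflow described above, as a
# simple state machine. Status names paraphrase the talk, not the real Jira setup.
ALLOWED_TRANSITIONS = {
    "submitted":                  ["dq_program_triage"],
    "dq_program_triage":          ["returned_to_submitter", "business_steward_review"],
    "returned_to_submitter":      ["dq_program_triage"],
    "business_steward_review":    ["reassigned_to_other_system", "is_steward_review"],
    "reassigned_to_other_system": ["business_steward_review"],
    "is_steward_review":          ["resolution_in_progress"],
    "resolution_in_progress":     ["dq_program_closure"],
    "dq_program_closure":         ["closed"],
}

def transition(current: str, new: str) -> str:
    """Move a ticket to a new status only if the workflow allows it."""
    if new not in ALLOWED_TRANSITIONS.get(current, []):
        raise ValueError(f"Cannot move from {current!r} to {new!r}")
    return new

# Walk a ticket through the happy path from submission to closure.
status = "submitted"
for step in ["dq_program_triage", "business_steward_review", "is_steward_review",
             "resolution_in_progress", "dq_program_closure", "closed"]:
    status = transition(status, step)
print(status)  # -> closed
```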
Going more into the prioritization: that impact evaluation the business steward completed was there for a reason. When we first built the workflow, we did not have it, but as we got feedback we kept changing the workflow, and it probably took us a few months to get to a place where we could say, yes, the workflow is in good shape now and we don't have to change it anymore. We have been using that process for quite some time now. In the impact evaluation, the steward would assess how many other data systems are impacted by the issue and what the volume of records is. Oftentimes what we saw was that people would report a data quality issue after seeing a couple of records, without even querying to find how many records were actually in error, and they would sometimes blow it up into a big issue. Sometimes it was true, sometimes it was not. So we wanted to know the volume of records in error, because a larger volume needs more attention and higher priority. That is why we put it there. We also asked: is there any regulatory impact? Is there any financial impact? Is there any customer impact? Those are the things that took priority. Once the impact evaluation was done, the data quality program would step in and put a score to it. We had a weighted score across all of those factors, and based on that score we gave each data quality ticket a priority of low, medium, or high.

Apart from this, when we started the program, the entire organization was not yet working in an agile fashion; the agile transformation came a little later. So the other thing we did, to adapt to agile, was to go to each of the agile release trains; almost every SBU has an agile release train. We asked, "For your train, which data quality issues are you impacted by? Can you give us your top 10 that you would like to see resolved?" Every train marked those, and we brought them all together into the same Jira dashboard. That prioritized list, along with the priority level of high, medium, or low, helped the data system owners prioritize those tickets. That is the kind of prioritization we put in place.
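Here is a minimal sketch of how a weighted impact score like the one described above could be computed. The factors follow the talk (systems impacted, record volume, regulatory, financial, and customer impact), but the weights and the priority thresholds are invented for illustration.

```python
# Hypothetical weighted impact score: the factors follow the talk, but the
# weights and thresholds below are invented, not the program's actual values.
WEIGHTS = {
    "systems_impacted": 0.20,  # how many downstream data systems are affected
    "record_volume":    0.20,  # share of records in error
    "regulatory":       0.25,
    "financial":        0.20,
    "customer":         0.15,
}

def priority(scores: dict) -> str:
    """scores: each factor rated 0.0 to 1.0 during the business steward review."""
    total = sum(WEIGHTS[factor] * scores.get(factor, 0.0) for factor in WEIGHTS)
    if total >= 0.6:
        return "high"
    if total >= 0.3:
        return "medium"
    return "low"

print(priority({"systems_impacted": 0.8, "record_volume": 0.5,
                "regulatory": 1.0, "financial": 0.2, "customer": 0.4}))  # -> high
```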
The other thing we did, as I said, was that the stewards were allowed to redirect an issue to another data system if they felt the other system was causing it. And we took a wraparound approach; that is what I was talking about with the Jira work queues. Each of the teams had their own Jira queues, but they were operating in different ways. We said we did not want to disrupt their processes, because that is how you get engagement when you are establishing a new program. If we had been disruptive and said the entire organization needs to come onto this one queue, stop whatever you are doing, and change your processes, it would not have been adopted with as much success as it was. So we said, you can continue with your process; this is more of a wraparound. We are collecting the metadata about the data quality issue here for tracking and monitoring purposes, but the actual ticket, the resolution, and the SDLC process that needs to happen will stay with your teams. That really helped us push adoption to the teams. The other principle was that IS still retained control of the resolution decision. We provided a framework with inputs for their prioritization, and we said it is the data system owners who prioritize and decide what a good resolution is for the issue. Sometimes it may be a code fix, sometimes a data fix, and sometimes an enhancement.

Moving on to escalation: we said that if an issue is not moving, let's start with the data system stewards, and if that does not work, the data system owner. In parallel, on the data governance side, we also established data domain owners. For claims, we have a data domain owner; for customer, we have a data domain owner; for policy, we have one. So we established the data domain owners, and we are slowly trying to move the organization toward the data domain concept, thinking about strategic initiatives at the data domain level; we are still on that journey. But we started off saying that this can be one escalation path, where we move an issue up to the data domain owner and say, okay, this relates to policy data, and maybe they can say which issues are a priority and whether there are enhancements or projects that need to be done for them. Apart from that, as I said, we had the practice leads for the data governance, data management, and data analytics communities of practice, and the three of them form a triangle. That is where we would escalate some of our data quality issues if they were not getting resolved and they posed a bigger risk for the organization. We don't have a chief data officer yet, but the triangular leadership of those practice leads operates functionally as a chief data officer for us.

Moving on to the metrics. We established the workflow, we had the stewards, and we trained them. Then we said we need to measure this in some way and provide that visibility, first at the enterprise level; we also needed it at the data system level, and because the agile trains were being built and established, we needed it for the trains as well. Apart from that, we established key risk indicators, and we reported the high-impact issues all the way up to the operations risk committee; the KRIs were established for that purpose, and only the high-impact issues were reported there. At the enterprise level, we looked at the number of data quality issues by status: is the business steward review complete, is the IS steward review complete, or is it still with the stewards? Are both reviews complete but the resolution still in progress? Those kinds of things. We also looked at the number of issues by data system and by impact level, at the few data systems where we were continuously seeing a high number of issues, and at monthly trending. We did the same thing at the data system level as well.

This is a mock-up of a dashboard we built in Jira for each of the data systems. As you can see, we had the data quality dimensions that were defined, and we would show how many data quality issues fell under each of those dimensions. This was a good view for some of the stewards and data system owners; maybe not for the individual data quality analysts or associates, but if the data owner is continuously seeing a higher number against a particular dimension, that helps them. The readability here is low, but let me talk through it. This one is about the resolution decision itself: whether it is a code fix or a data fix, or the steward might have said this is not even a data quality issue. How many are being reported as data quality issues when they actually are not? We also gave the stewards the option to say, "We did review this, but we are not going to fix it." That could be for several reasons: they know there is a project going on and the enhancement happening through that project will probably fix the issue, so they don't want to spend time on it now, or the impact may be something they can absorb. It can be that. The data system stewards would make that decision, and this visibility would help the data system owners look at it.
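As a rough illustration of the kind of aggregation that could sit behind dashboards like these, here is a minimal sketch that counts open issues by dimension and by workflow status for one data system. The ticket records are made up; a real setup would presumably pull them from Jira rather than hard-coding them.

```python
# Illustrative sketch of dashboard-style counts of data quality issues by
# dimension and by status for one data system. The tickets below are invented.
from collections import Counter

tickets = [
    {"system": "claims_warehouse", "dimension": "completeness", "status": "is_steward_review"},
    {"system": "claims_warehouse", "dimension": "completeness", "status": "resolution_in_progress"},
    {"system": "claims_warehouse", "dimension": "validity",     "status": "business_steward_review"},
    {"system": "policy_admin",     "dimension": "consistency",  "status": "closed"},
]

def dashboard_counts(tickets: list, system: str) -> dict:
    """Count open (not closed) issues for one system by dimension and by status."""
    open_tickets = [t for t in tickets if t["system"] == system and t["status"] != "closed"]
    return {
        "by_dimension": Counter(t["dimension"] for t in open_tickets),
        "by_status":    Counter(t["status"] for t in open_tickets),
    }

print(dashboard_counts(tickets, "claims_warehouse"))
```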
Then we also had a view by status: whether the business review has started, the business review is completed, the IS review has started, the IS review is completed, resolution is in progress, resolution is complete and the ticket is in the data quality program's review, or it is closed and done. And we had a kanban board for each of those statuses showing who is doing what within that data system. This was a view the data systems used very successfully; it was very useful for the data system owners. Apart from that, we also built agile dashboards for the agile teams, so that was another view we had. This is only one of the dashboards we have; I thought it would be useful for anybody who is initially setting up their program.

Moving on to the program itself: what is the strategic value this program provides to the organization? That was another big thing we had to think about. Even as we were building the smaller components, making revisions, and establishing the program, we were continuously thinking about how we could use the metadata captured through these data quality issues as input for strategic work. That is where I'll cover some of the important pieces.

Just to let you know, it's about five minutes until we go to the Q&A session, so if you have any questions, please feel free to put them into the Q&A chat. Thanks.

Thank you. So we were thinking about how the analysis of these issues could feed into strategic inputs. For example, if we were seeing wrong queries, that showed we needed to do more with the data catalog; maybe people are not yet at a point where they understand the data. Or it could be a skill-set issue with the associates: is there more training we need to do because they don't understand the data, or because they are not able to query the right way? It may be at an intern level or an associate level, but that fed into training. Similarly, is the business process documentation lacking, and does the business process not have the correct controls, so the data itself is coming in the wrong way without enough controls in place? That could be another one. Or, based on patterns and trends, are we seeing a specific data quality dimension that is consistently high? Say completeness is something we are continuously seeing. Then we need to think about the data strategy and how we can correct that, whether there are technical ways of doing it, how we can automate it, and how we can have those automated checks. We also said, let's identify the critical data elements so that we can make some focused efforts, and that will also feed into our data domain strategy, for example. And if there are data inconsistencies across systems, that shows us we need to publish one of them as the trusted source for that specific data.
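Since the talk points toward automating checks for dimensions such as completeness, here is a minimal sketch of what one proactive rule could look like, assuming pandas is available. The column names and the threshold are hypothetical.

```python
# Hypothetical proactive completeness rule: fail if the share of non-null values
# in a column falls below a threshold. Column names and threshold are invented.
import pandas as pd

def completeness_check(df: pd.DataFrame, column: str, min_complete: float = 0.98) -> dict:
    """Return the completeness share for a column and whether the rule passed."""
    non_null_share = float(df[column].notna().mean())
    return {
        "dimension": "completeness",
        "column": column,
        "non_null_share": round(non_null_share, 3),
        "passed": non_null_share >= min_complete,
    }

policies = pd.DataFrame({
    "policy_id": [1, 2, 3, 4],
    "issue_date": ["2021-01-02", None, "2021-02-10", "2021-03-15"],
})
print(completeness_check(policies, "issue_date"))  # 0.75 non-null -> passed: False
```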
I want to cover this next part quickly, because it has been very useful, and especially for new programs it might help, because this is how we connect to the other functions in the organization. For example, let's put the data quality program in the center. How do we connect to data governance? Data governance gives us who the data system owners and stewards are; that is our connection point. For the data domains, what is the connection point? We identified the critical data elements for those domains, we established the data quality rules for the data domains, and we also use the data domain owners as our escalation point. When it comes to the risk organization, risk and compliance, we said these are our risk factors and these are our KRIs, and we continuously report on them so we can understand the risk and put pressure on the organization to get some of those issues resolved. On the other side, we have the data management communities of practice, with all those knowledge areas I spoke about. The data management communities of practice and the data governance communities of practice are continuously integrated with each other. We always have a monthly meeting that we conduct together for all the data owners, stewards, and stakeholders; we get something like 300 to 500 people at that meeting, where we talk about the key initiatives, the new policies, and why we are doing what we are doing. And then the data review, where all the knowledge area representatives are present, was really helpful for us. We also connected with the agile PMO and the data analytics community of practice. All of this helped us pivot to proactive data quality and establish the data quality rules and checks that we will be able to ingest into the data quality tooling; that is where our proactive piece is coming from. I'll stop on this slide.

The wins, again: we have a centralized place, we have visibility, we are not disrupting, and we were able to ensure the teams could adopt and establish the process. So we were able to move that needle for the organization. As for challenges, as you can all see, there is always resistance to change; the work can be seen as additional, but we had enough mechanisms to say, we are trying to make this easy for you, and this is a change the organization needs, and the prioritization also helped them. I would suggest thinking about simple wins when you think of your programs. Have that one first follower: if there is a strong follower, one team that can speak about the program instead of you speaking about it yourself, that is the best thing that can happen. Keep the disruption minimal, think about steward competencies continuously, pitch for the proactive piece, and align with the important use cases. That will help you move the needle.

I'm seeing a question: "To improve data quality from a data perspective, both technically and process-wise, what challenges have you faced and conquered?" The very first thing we faced was the business-versus-IS question, right? Should IS be doing this? Should the business be doing this? That is where we started with the workflow and the framework: okay, there will be a business steward doing this, and IS will be doing this. But IS said, we are trying to pitch this on business value, we want to prioritize what is valuable for the business, so we want the business to come in and pitch in. That is how we covered one of the challenges.
There were several discussions leading up to that, where we said, okay, every data system needs to have a business owner and an IS owner, a business steward and an IS steward, that we could use. That really helped us. The other thing we did was keep the disruption minimal with that centralized model. Before we put that model through, when we said let's go centralized, there was a lot of resistance and a lot of conversation we had to have. We finally said, okay, we are not disrupting each of your queues; you can keep your queues the same, whatever you are using, because agile adoption was also going on at that time, and we knew that at some point they would all be moving to a similar kind of Jira flow anyway. So we said, let's keep the disruption minimal, let's not do it twice, and we addressed that by putting a wraparound mechanism around it. Those were the two things I can think of. And the buy-in, just the buy-in into the program, was a bigger challenge. We had to approach it several ways. One was top-down, where we were constantly doing the road shows and bringing in executive management and the practice leads for those communities of practice to talk about it. At the same time, we were establishing champions within each of the teams and showcasing their successes; we brought some of the data systems in to present their dashboards and what they did, and showed that success to others. The others then wanted to pitch in and do the same thing for their data systems. So that helped.

Another question was about senior leader buy-in and whether they already had a certain amount of data literacy. I would say no. For us, when we started, the program establishment itself was not difficult, because there was some level of risk and compliance factor, the data journey and transformation the organization was looking for, and the initial internal analysis and the external assessment spoke to that. So the program establishment was okay, but for the framework, we had to do continuous socialization with the stakeholders. We brought several of the stakeholders from the SBUs, along with the practice leads, into a room when we were establishing the program and the framework, and we would give them a progress update: where are we in the journey, how is the program coming along, what did we change as a program? And I'm not talking about just the data quality program, because data quality was part of the data governance program. We would continuously provide those updates and let them question us on why we were doing something one way and not another. In some ways, those sessions served to give the executives data literacy on the data journey, because data had never been looked at that way before. That really helped us. They would ask us questions, and we could provide the materials and also the reasoning behind some of the decisions and how they connect to the strategic initiatives or the longer transformation journey the organization was thinking of.
That is where we were able to use the practice leads, bring them in, and bridge that gap between the executive team and the enterprise data governance team. That really helped us.

I think that is all of the questions we have this afternoon. Feel free to continue networking through the SpotMe app and through the speaker pages; I'm sure Prathisha will be happy to answer any additional questions you may have. Thank you for that amazing presentation. For those viewing the session, please take a moment to submit your feedback through the linked form at the bottom of the page; that is very helpful to us. This wraps up today's scheduled events, but the app is always open, so go hang out and do some networking in there. We will resume our live sessions tomorrow morning, 8 a.m. Pacific, 11 a.m. Eastern. Thank you all again, and we will see you tomorrow for the final day of EDW 2021.

And Chris, I'll probably answer you from the speaker spot, Chris. Thank you.