Hello and welcome. My name is Shannon Kemp and I'm the Chief Digital Manager of Data Diversity. We are proud to produce the webinar series of Data Governance Case Studies for the Data Governance Professionals Organization. We'd like to thank you for joining today's DGPO webinar, an opportunity not to be missed on the Data Governance journey. Just a couple of points to get us started. Due to the large number of people that attend these sessions, you will be muted during the webinar. For questions, we will be collecting them via the Q&A section in the bottom right-hand corner of your screen. Or if you'd like to tweet, we encourage you to share our highlights or questions via Twitter using hashtag DGPO. So now let me turn the webinar over to Annika from the Data Governance Professionals Organization to introduce today's webinar and speaker. Hello and welcome. Thank you, Shannon. Good morning, good afternoon, and good evening, everyone. As a reminder, the recording for the webinar will be posted in the DGPO members-only section of the DGPO website in a few days. I would also like to provide a brief overview of the Data Governance Professionals Organization, affectionately referred to as the DGPO. The DGPO is a community of Data Governance Professionals whose mission is to share knowledge, content, and best practices for its members to build a community of practice. Towards that goal, a group of individuals are working on expanding our best practice information for the six areas you see in the graphic on the lower left side of the slide. To learn more about the DGPO, please visit our website at dgpo.org. To honor those companies that have advanced in their Data Governance Program, the DGPO awarded our first Data Governance Best Practice Award this year. 
The award is given to the practitioners within a customer organization in recognition of the business value and technical excellence they have achieved in the design and implementation of an outstanding Data Governance Program. We had 18 submissions, and these companies are being featured in our DGPO webinar series throughout this year. I am thrilled to have the privilege of introducing today's speaker, Kevin Lehmann from AbbVie. Kevin is the Director of Foundation Master Data for AbbVie Biopharmaceuticals and has over 20 years of experience in the life sciences industry, where he has exercised his IT and CPA backgrounds to lead large-scale transformation initiatives. A self-described bean counter at heart, Kevin strives to identify and drive a clear business value approach through the disciplines of data governance. So without further delay, it's my pleasure to hand today's session over to Kevin. Welcome. Thank you, Annika. What I'd like to do is share with everyone the journey that AbbVie has been on. This is really our story of the development of a dedicated master data governance organization for our core transactional data. Who we are, where we've come from, where we're going, and some of the things we've accomplished along the way. Like many of our peers, in many areas we feel like we're still in our infancy, but we're eager to stretch our legs and move from crawling to toddling to running. We've seen some of the discussions from the winners of the award, and it really is something that I aspire to be able to share sometime in the future. We appreciate that opportunity. I am excited to discuss our journey. I'd like to thank the DGPO for giving us this opportunity. We've learned a lot from our colleagues in data governance and appreciate this opportunity to share back. Throughout the discussion, I'll use the terms master data governance and data governance fairly interchangeably, as I think a lot of the principles apply regardless. 
You can't have good transactional or analytical data without good master data. I also want to point out this was very much a team effort. I'm the only one speaking today, but along with me are many others from the AbbVie master data team, IT, and our external partners, and I'll mention some of their contributions throughout the discussion. We're all working hard to bring the very best possible data to our users, customers, and vendors. You'll notice this deck is more of a narrative style rather than strict bullet points, which may help if you look at it a little bit later and don't recall everything that I had to say, or if you want to share it with your colleagues who weren't able to join today. On to the next one. So as we look at what has been accomplished, first I need to share a little bit about AbbVie and a little bit about myself. AbbVie is a spinoff of Abbott Labs. We're a global biopharmaceutical company with over 30,000 employees in a few key therapeutic areas: immunology, oncology, neuroscience, and virology. As was mentioned in the introduction about myself, I have finance as well as IT experience, thus the bean counter bit, with over 20 years in the life sciences space. And I come from a professional background, and I really want to use that mindset to build a professional organization. There should be generally accepted good governance practices just like there are generally accepted accounting practices. And ideally there would be associated training and certifications recognized throughout the industry. Thus my support and appreciation for the activities and services of the DGPO. One last thing: I am recovering from a cold, so if the line goes silent for a few seconds, that's just me pressing mute and clearing my throat. I apologize ahead of time if we need to do that. When we look at our data governance journey, it's really about addressing our global business needs. 
So a little bit of background: in 2013, after becoming an independent company from Abbott, we had decisions to make. We needed to either set up our own new systems or clone the Abbott systems and processes, or some combination of the two. And the decision was made that we were going to set up our own new single-instance SAP environment. So if you look at the lower left, that was the Abbott ERP landscape in 2012. That's where we were at. By the time we finished our foundation program in 2015, we were on a single-instance SAP system. And not only had we implemented a new system, but we had also put in place global harmonized processes for order to cash, procure to pay, record to report, operations, and payroll. And so that model that we used is what we wanted to be able to replicate across the organization wherever possible. So we have a global process organization, end to end. We have gone through the hard work of harmonizing those processes, getting them documented, identifying what were the best practices, and really instantiating them. We also looked at how we could leverage shared services to the best extent. So we have a BPO partner that we use, not just in the transactional processes, but also in some of the master data processes. And we'll talk about that. So coming off of the formal foundation program, which ended in 2015, we had a very unique opportunity to build a good architecture and a supportable, sustainable organization. And this was truly an opportunity not to be missed. And so coming out, we decided that we wanted a master data organization that would reflect some of the principles that we had seen as we built the foundation organization and landscape. And that organization is what we're going to see a little bit more of here now. There we go. 
So our overarching vision was to model this master data organization on the post-implementation, global process organization: harmonized tools and processes, global scope, dedicated focus on improvements, and so forth. And when we were working on the development of this organization, we talked to a number of our consulting partners that had helped us with foundation. And we said, okay, what are some of the best practices in master data governance? What's the best way to structure this organization? What should be some of the goals, key objectives, et cetera? And so what you're seeing here is really a compilation and culmination of those suggestions for best practices and what we could develop within our existing organization structure. Okay? So the data governance program had three primary goals. One was to make sure that we're leveraging this new asset that we've built, the master data component of it. Make sure that we have a team that is focused on the master data, making sure that we've got a dedicated business team. So we sit in the business. We're part of the CFO organization. We're not part of IT. And we want to maximize, then, again, the tools that we have. I'll talk a little bit about that in a few moments. But we have a tool set that we inherited, and we wanted to be able to leverage that set as much as possible. So when we look at our objectives, the first, again, is to make sure we have a business-led master data organization that's going to oversee our vision, drive best practices, and improve our ROI. We had our first significant milestone, which was creating the organization from the different functional areas. So we have our own budget and headcount. We have project funding set aside for our various improvement projects. Not a lot of funding, but some funding. So at least we've got that. 
And then we've been working on aligning our business stakeholders with our processes, goals, and long-term plan, to make sure that we're in sync with their objectives and we can support them, and that they understand and recognize what it is that we're doing from our objectives and best practices. And what we were able to develop along the way is a five-year roadmap of what we want to do and when we want to get it done. Now, it's a roadmap. It's not a funded plan at this point, but at least it helps set the direction, helps us communicate with our stakeholders where we're going, what we want to do, et cetera. And we've made some progress on that, and I'll talk about that as well. From the tools perspective, as I think I mentioned, we have SAP as our global ERP. We have a tool from Back Office Associates called Data Stewardship Platform, or DSP, that we've leveraged for a lot of our front-end data quality, and we're continuing to leverage that tool and continuously improve our processes and efficiencies. We have a dedicated IT team that helps us with the tools as well as with understanding the SAP ramifications of the master data. Some activities we do directly in SAP, and they assist us with that. So we've got a good working relationship with our vendors and with our IT colleagues, and we use and leverage that to continue to build our master data capabilities within the organization. So when we look at the scope of our organization, we do have the global SAP instance. That's our primary responsibility. But we're also looking at how we can improve upstream and downstream systems. I'll talk about a couple initiatives we have underway right now when we get a little bit further in the presentation. But primarily right now we're focused on our transactional data. We have material, vendor, customer, and finance master data objects, as well as teams to support those. We're looking at both the, I guess, what we call the active management of the data. 
So where we have new records being created or existing records being updated. But we're also looking at the quality of the existing data, and in a couple slides I'll flesh that out a little bit. But that's one of the things that we learned from some discussions that we've had with other data governance professionals and some of the seminars that I've been to: really, what are the components of data governance that we need to be focusing on? Initially it was primarily just the assumption that, hey, if we make sure we have good records coming into the system (garbage in, garbage out), quality stays in, I guess. But there's also this concept of the aging of the data in the application, and how do we know where we're at with that. When we talk about the progress of the organization, this is a good part of our journey that I'll talk about here. We started as an official organization January 1st, 2015. And primarily that was more of an accounting adjustment than anything else, a budget adjustment. We moved the headcount from one organization to another. We moved the funding to the central team here. But most of the efforts were really still focused on finishing up the deployments that were ongoing or completing, and really cleaning things up from those deployments. And then we moved into an upgrade of the DSP tool. That was our first major project as an organization. It was a very substantial effort. But we're seeing benefits from that, and I'll talk about those in another slide. As we went through that upgrade, we learned a lot of things about the tool set, some of the capabilities and limitations. We learned some things about how we're working together as a team that I think have helped mature the organization. So we went live, what, September, August, September, something like that, in 2016. And then we took a little pause, a little breather, to evaluate where we were at as a new MDM team. And we had some consulting assistance with that. 
There's a model that they compared us to along 11 different categories of maturity, and it really helped us understand where we were doing well, where we had risks, and where we had opportunities. So that was a couple of months' engagement, but it really helped us better understand where we were at in our maturity and where we needed to go, and it fed a lot of the components into our five-year roadmap. So as we spent that time with our consultants, we were talking to our stakeholders and really, I think, focusing both inwardly and externally to understand what we need to do next. And I'm going to talk about the five-year roadmap, and I don't mean to keep you on a baited hook, but we'll talk about that a little bit further on. What I want to do right now, then, is talk about some of our organization roles, to give an understanding of what we look like as an organization, what we're doing, how we're structured. So when we pulled together as an entity within finance, we had, again, worked with our consulting partners to pull together best practice recommendations, and we are a set of dedicated folks whose focus is split between the strategic activities and the transactional. Of course, we have to focus on the transactional activities first and foremost. We need to make sure that we have new customers set up, new products set up, et cetera. But as we mature as an organization, we want to move away from the more basic transactional activities to more of a strategic focus. And I think you'll see that reflected here in our roles and responsibilities. So when we look at the organization, I'm the director, the master data lead, so I'm responsible for integrating across our different stewards. I mentioned earlier we have a steward per data tower, so a vendor master data steward, customer steward, material steward, and finance steward. And they each have their organizations that support them. 
But I'm responsible for making sure that we're well integrated across the different stewards, that we have a good understanding of the data throughout the organization, that we understand the impacts if we're changing data, aligning with our key business stakeholders, and really setting the direction. It's a director role, right? So setting the direction for the organization, making sure that we're striking the appropriate balance between strategic initiatives and the more tactical, transactional initiatives. And then also working with the stewards and our business stakeholders to make sure that we're continuously improving our processes, that we have both data integrity and good data quality, and that we're cognizant of the business priorities and the projects that our stakeholders are working on so that we can be prepared and support them as well. I'm accountable to our global process leads, so order to cash, procure to pay, record to report. Those are my direct stakeholders, customers, if you will, but also the other business stakeholders. So Treasury, finance, operations, those are all business stakeholders in the master data processes. Our stewards then, so my direct reports, are responsible for executing the strategies for their respective towers. They make sure that they're closely aligned with their respective global process leads, so customer working with order to cash, et cetera. They're responsible for maintaining the data quality, making sure that we're developing usable KPIs, making sure that when projects are proposing changes we understand the impacts of those changes and can prepare for and support them appropriately, making sure our users are knowledgeable of the tool and trained in it, and being consistent across the different data types within their areas and across the other different domains. And then supporting our stewards, there's a master data coordinator. 
This is the individual, then, that is a little bit more tactically focused, but they are the right-hand person of the stewards. And they're frequently the ones that really have the in-depth data knowledge. When you look at the steward role, it's going to have more of a strategic component than a tactical component. And the coordinators, typically, are going to be the ones that really understand the depths of the data. When we also look at the organization that supports master data, I've talked about my team of direct reports and consultants, but we also have our shared services organization. And I would be remiss if I didn't also describe their role. So it's evolving, but as much as possible, we want our shared services organization to be able to handle the more transactionally focused activities, the rules-based activities, that we can move off the plate of my more experienced stewards and coordinators, so that they can focus more on strategic activities and have more of the rote transactional activities handled by the shared services organization. And they're frequently distributed across time zones, so they're more able to support our global organization on a timely basis. And I should also reinforce that our organization supports all the countries, so we have a global support requirement. And so the BPO, by being in the different time zones, can help us with that. Let's talk about communications a little bit. I've mentioned the stakeholders that we have. It's very important to stay closely aligned with our process stakeholders and our other business stakeholders. We provide a monthly update to our stakeholders through what we call end-to-end meetings. And this group really functions as a de facto data governance board. We don't have a single set-aside data governance board at this point. That may be something that we look at in the future. I know a lot of organizations do have one, and it's recommended as a best practice. I don't have one at this point. 
Maybe it's something we grow into. But we do have these monthly updates with our senior management, and that's where we have an opportunity to present updates on our metrics, as well as the projects, the key initiatives that we're working on. We have a formal project process, a methodology and tool for our larger projects, where we go through and identify what are the costs of the projects, what are the impacted organizations, what's the timeframe for getting the project done, what are the business benefits, et cetera. And we review these in a bi-weekly meeting, what we call the change review board (the CIR review board, sorry, but it basically turns into a change review board), so that all of the different functions and organizations that comprise the foundation landscape are aware of these large initiatives. Where we have master data-specific initiatives, we'll provide updates on those through this group, as well as in our monthly meetings. When we talk about the KPIs and the metrics, we have both performance as well as quality metrics that we're reporting on. The performance metrics cover how quickly we're adding new records or how quickly we're processing updates. We talk about the costs for the records. We'll talk about how many records have been archived in the time period, if that's a relevant metric. So we have those performance metrics. We also are working on quality metrics. So that is an area that we're evolving on. I mentioned earlier that we had really focused primarily on our active governance, and now we're focusing a little bit more on our passive, or quality, governance. So we've introduced some quality metrics that we're providing on a regular basis, and we're going to continue to grow that. So those are some of the metrics and KPIs that we report on. One of the things that I've been interested in is comparing performance metrics with other organizations. And that's a challenging thing to do because everybody is different. 
It depends on your industry, et cetera. But if there are other published metrics that someone can point my way, I'd really appreciate it. It would be useful to understand them and just to see how we stack up against other organizations. There is a consulting organization, and I'm not sure if I can name it on the call, so maybe, Shannon, you can tell me later. But we have worked with them, and they do their own metrics gathering and comparison. And so we're working with them on comparing ourselves to the organizations that are within their specific domain. It's not an industry-representative sample, unfortunately, but it's better than nothing. So that's on the metrics and initiatives. And it's been very productive to have these updates. I think that's one of the key criteria to being successful with this type of model in this organization: to make sure that we have really good, open discussions with our key stakeholders. We also have key user meetings. This is at a little bit lower level, where the data stewards are talking about the key initiatives, et cetera, with our power users, our key users. And they go down to a little bit lower level, talking about tips, what issues we're facing, what are some upcoming changes, et cetera. Again, in the communications realm, it's very helpful, very important to keep those discussions going back and forth. And then finally, we have a SharePoint site that has been developed, and we're working to continue to develop and improve it, where we're able to share tips and FAQs, metrics, training, et cetera, with anyone in the organization that is interested or has a use for it. On metrics, there are some things that I can't share, but there are some general things that I can share, and I wanted to provide those. When we implemented our tool, we specifically benchmarked before-and-after calculations on some of our performance metrics. 
And what we saw between before the tool and shortly after the tool implementation was a 12% reduction in our average cost per record, a 14% reduction in time to enter, and a 22% reduction in the number of clicks. The metrics themselves aren't as important as the story that they tell and the fact that we took the time to measure before and after. So I talked earlier about our charter of master data as an asset, and part of what we want to do is make sure that we are taking appropriate and relevant measures and looking at things from a cost-benefit analysis standpoint. So when we look at the average cost per record reducing, that's a solid win that we can take to our senior management and help them understand: this reduction times this number of annual records being created and updated across the organization, that's real savings. That is significant when you extrapolate it across the organization. And so as part of our approach, if you will, as part of our finance approach, these are the types of things that are very important to be able to track and demonstrate. We also have been looking this year at our quality metrics, I mean, where we've identified what are some of the quality metrics that we can track through the system, through queries and so forth. What's the baseline? So where are we at right now? And where do we want to get to? So, for example, I put one up there. So we measured here that we're at 90% as a baseline, for example. So the next step is to work with our stakeholders and say, okay, we're at 90% now, where should we be? Should it be 100%? Is it okay to have less than 100%? And why? And so we're taking a rational approach to these metrics, making sure that we understand, first of all, where we're at right now and where we want to go. And then, as much as possible, we're automating these metrics so that we're not burning a lot of human-being cycles to generate something that a simple query would be able to return for us. 
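As a sketch of the kind of automated quality metric just described, a completeness baseline can come straight out of a query rather than manual review. Everything below (the table, the field, and the sample values) is invented for illustration, not AbbVie's actual data model:

```python
import sqlite3

# Hypothetical sketch: compute a quality-metric baseline (completeness of a
# single field) with a query instead of manual review. Table, field, and
# sample values are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_master (id INTEGER, tax_code TEXT)")
conn.executemany(
    "INSERT INTO customer_master VALUES (?, ?)",
    [(1, "US-IL"), (2, "US-NJ"), (3, None), (4, "CA-ON"), (5, ""),
     (6, "US-TX"), (7, "DE-BY"), (8, "US-CA"), (9, "UK-LN"), (10, "FR-75")],
)

def completeness_pct(conn, table, field):
    """Percent of rows where `field` is non-null and non-empty."""
    total, filled = conn.execute(
        f"SELECT COUNT(*), "
        f"SUM(CASE WHEN {field} IS NOT NULL AND {field} <> '' THEN 1 ELSE 0 END) "
        f"FROM {table}"
    ).fetchone()
    return 100.0 * filled / total

baseline = completeness_pct(conn, "customer_master", "tax_code")
print(f"Baseline completeness: {baseline:.0f}%")  # 8 of 10 populated -> 80%
```

Once a baseline like this is scripted, it can be rerun on a schedule, which is exactly the point about not burning human cycles on what a query can return.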
So that's just an example of some of the things that we've been working on from a performance and quality standpoint. And the next slide will talk a little bit more about some of these activities that we've got going on this year, as well as what's planned for next year and some things coming beyond that. Looking forward to the future: things that we've done this year and really things that we want to do going forward. And this is part of our five-year roadmap to elevate our master data governance across the organization. So the first thing that I want to talk about is this RACI matrix that we did. This was our first attempt at identifying, for our key data elements, the business owners. I think there was a lot of instinct about who the data owners are. But up until this point, we as an organization had never gone through any exercise to verify that our assumptions were correct. So each of our teams went through and identified what are their key data elements, and then went through with their stakeholders and put them into a RACI format. So the key data elements are those critical ones that, if we don't get them right, the transaction is going to fail or bad things will happen. So we had, I mean, this was a multi-month exercise to go through with our stakeholders. But in some instances, it was very clear: Treasury has this one. Okay, Treasury is here, they agree, great, we're done. In some instances, we had others that nobody wanted. And in some instances, we had some that multiple organizations felt that they owned. And, you know, where we all had immediate agreement, okay, that's a nice intellectual exercise. But where we really found the benefit was when we didn't have agreement: where there was no clear owner and we needed to identify one, or where there were multiple owners that disagreed on who owned it and we got to play referee. Now, in my view of the world, our master data organization shouldn't own any of these elements. 
There should be a clearly identified business transactional stakeholder, someone that is going to suffer if something goes wrong with the data. Only in very rare instances would it be the master data team. And I'll give you one example. We had a tax field. And you would normally assume that tax would say, oh yeah, that's my field, you know, we're clear on that. But in the discussion that happened, as I'm told by my finance master data steward, tax said, oh yeah, this was IT's. And IT basically came back and said, oh no, no, this is a tax field. Well, tax says, well, when we were populating it, you always told us to put this value in here, so we thought that was just our only option. IT further explained, well, no, this is a default. You can put whatever value you want in. In that case, we can improve our tax position, reduce exposure, et cetera, if we can change some of these values. Great. So we now have an identified, funded 2018 project to go through and rationalize that field across our master data domain so that we can improve our tax position and reduce exposure, et cetera. So that was a very clear win coming out of this exercise. Other examples, maybe not so momentous: we had a recent discussion on a master data field. The question was raised, well, who owns this field? Oh, well, let's go check the RACI. Oh, this organization owns it. Oh, okay. So off we go to that organization. Hey, we have a problem. Can you help us? So that was an exercise that I think had benefit to the organization. We started small, relatively small, 173 elements, and looking forward later this year and into next year, we want to expand the number of elements that we are evaluating and documenting. The next step then was to take these elements and say, okay, how do we know that our data is good? Why would we have any doubts? If we had good data coming in, well, how can we possibly not have good data in there now? Now, there are a number of different reasons why that may be the case. 
Business rules change. When they change, do we always go back and retroactively apply the business rules to our existing database? Maybe not, sometimes. Data coming in: we did our best to make sure we had good data coming in, but it's possible, due to the rush and the fuss, that some less-than-perfect values are in there now. And then data can go bad. It can go stale. You've got licenses that have an effective period of two years. So is there a trigger to go in after the second year and refresh the license? These are all questions that we look at when we evaluate the quality of our data. And through discussions with other DGPO colleagues, consultants, et cetera, we came to understand that there are different dimensions of data quality. So you can have uniqueness. Are there duplicate records? You can have timeliness. How recent is the data? Are all the licenses unexpired? You look at completeness. Are all of the mandatory fields populated? Again, what is mandatory now might not have been a mandatory field six months ago. And we define accuracy by: does it abide by the business rules? So we've identified quite a few, thousands of business rules, that we have put into our DSP engine. So when you're adding a new customer, is it a U.S. or O.U.S. customer? If it's a U.S. customer, it needs a five-digit zip code. If it's a Canadian customer, it needs to have an alpha in the middle of the postal code. These are business rules that we've put in the tool. And so we know that by the time a record gets added, it's passed all these business rules. Well, that's great. That's good for new records. When we update a record, it applies the business rules to the update. But what about the records that haven't been recently updated? Can we apply the business rules to those records? That's a component of accuracy. Then there's reference data, otherwise known as drop-down boxes. Do we have a field with only 10 current valid values? 
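As a minimal sketch, country-specific postal-code rules like the ones just described might look like this. The regex patterns are simplified assumptions for illustration, not the actual DSP rule set:

```python
import re

# Illustrative sketch of front-end business rules like those described above.
# Patterns are simplified assumptions, not AbbVie's actual DSP rules.
US_ZIP = re.compile(r"^\d{5}(-\d{4})?$")                       # five digits, optional +4
CA_POSTAL = re.compile(r"^[A-Za-z]\d[A-Za-z] ?\d[A-Za-z]\d$")  # e.g. K1A 0B1

def postal_code_valid(country: str, code: str) -> bool:
    """Apply the country-specific postal-code rule to a candidate record."""
    if country == "US":
        return bool(US_ZIP.match(code))
    if country == "CA":
        return bool(CA_POSTAL.match(code))
    return True  # no rule defined for other countries in this sketch

assert postal_code_valid("US", "60064")
assert not postal_code_valid("US", "6006")
assert postal_code_valid("CA", "K1A 0B1")
assert not postal_code_valid("CA", "12345")
```

The same rule functions could, in principle, be run in two modes: at entry time against a candidate record, and as a batch sweep over records that haven't been touched since the rules changed.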
Well, let's go look through that field in the records and see if any records have values which aren't in our current list of valid values. And some of these are very simple to put into a query. Some are more expensive. Some are more challenging. And then you get into some of the more challenging aspects of data quality. What I call: it could be accurate, it could be timely, it could be all of those things, but still wrong. So what we call fit for purpose. If we have a customer record, it can be absolutely a bona fide record. It could be an address that's in the U.S. Postal Service database. It could be timely. It could have all of the necessary attributes, blah, blah, blah. It looks perfect. It's just wrong. It happens to be the guy next door. And so you're not going to know that until you have a failure. So then you have to look at the failures and say, okay, what's the root cause of the failures? Well, there could be a thousand different root causes for a delivery failure. One of them may be that you had bad master data. So that becomes a much more challenging, complex way to investigate the data and determine the quality. Just a couple more points I want to cover real quick. And the reason I spent so much time on quality is because that really is an area that we're investing quite a bit of time in and focusing on. But we also want to look at efficiency. How can we improve our process? How can we streamline it? How can we leverage our BPO for repeatable activities? We recently completed a project to move some of our purchasing, our vendor master activities, that were done by my employee team here to the BPO. And where we can, that's something that we want to continue, to free up more time for the folks to focus on strategic activities. And then, where we can, use robotics. This is something that we're just getting started in. But how can we automate some of these more repeatable activities that we might not even need to move to a human being? 
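The valid-values scan described a moment ago is essentially a set-membership check over the records. A toy sketch, with the field name and values invented for illustration:

```python
# Toy sketch of a reference-data audit: flag records whose value for a
# drop-down field is no longer in the current list of valid values.
# The field name and the values are invented for illustration.
VALID_REGIONS = {"NA", "EMEA", "APAC", "LATAM"}

records = [
    {"id": 101, "region": "NA"},
    {"id": 102, "region": "EU"},   # a retired value, no longer valid
    {"id": 103, "region": "APAC"},
    {"id": 104, "region": ""},     # blank gets flagged too
]

violations = [r for r in records if r["region"] not in VALID_REGIONS]
for r in violations:
    print(f"record {r['id']}: invalid value {r['region']!r}")
```

Checks like this catch the easy dimensions (uniqueness, completeness, accuracy against reference data); the fit-for-purpose case described above cannot be caught this way and only surfaces through failure analysis.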
We can use software robots for this. And so we're just getting started in that area. And then finally, we're looking at expanding the governance. We're looking at enterprise finance master data. We've identified challenges with some of our finance master data across different applications, so not just SAP, but other applications as well. We're looking at how we can use the principles and tools that we have available for our foundation master data, our SAP master data, to improve our enterprise finance master data. So with that, I know you wanted a few minutes for questions. I apologize I ran over a little bit, but in the excitement and spirit of all this, I wanted to share as much as I could. Let me turn it back to you for questions. Kevin, thank you so much for this informative presentation and for sharing insights from your data governance journey at ADV. As a reminder to all of our audience members, please feel free to submit your questions in the Q&A section in the bottom right corner. I see we've received a number of questions from our audience, so in the next few minutes I'm going to try to address as many of these as possible. The first question is: Kevin, you mentioned that master data stewards establish and publish KPIs related to their data domains. Could you give us an example of one of your KPIs and how your efforts were able to improve business outcomes? Sure, sure. So one of the KPIs that we have is the time to add new records, as well as the time to update records. And we've seen a significant improvement since our tool upgrade. I shared some metrics a little bit earlier that were from the early part of this year. We've seen improvement throughout this year, and when we're able to compile our metrics, I expect to see even greater improvements in terms of the time and activity required to update these records. Thank you for that.
Next question: is the data stewardship platform (DSP) tool from BackOffice Associates specific to SAP, or is it compatible with other databases too? So it is compatible with other databases. It is something that we use for SAP. I believe they have, I want to say, some preconfigurations for SAP, but I understand that it can really be used with other databases as well, other tools, other systems. Do you have any system on top of SAP, or is all data entry done in SAP? And also, did you have any issues trying to synergize data elements and rules across multiple countries without a governing committee making decisions? Excellent. So we do have data governance across all of the countries. The tool and process that we use, the DSP tool from BackOffice, sits in front of SAP, and it is used by everyone, regardless of whether you're in the US or outside the US, whichever country. That tool is the front end for probably 80% to 90% of the master data activities. Now, there are a few things that we don't do in DSP that we'll do directly in SAP. Some of those are done by the BPO folks, some by my team members. But when you're talking about the end users, primarily they're the ones working directly with DSP to request a new material, a new vendor, a new customer, et cetera. So we do have consistent tools and processes across the globe when it comes to the SAP master data. We have many questions about tools in general. So besides DSP and SAP, what other tools does your MDM team use? I assume the question is around maybe data quality, perhaps lineage, or other areas of governance. What other tools does your MDM team use? Sure. So our primary tools are DSP and SAP. We use SharePoint as well, primarily as a communications mechanism. And then it's really just Excel as our other analytical tool.
We are working with our IT organization to develop BW queries that we can use to further analyze our data, our performance metrics, and so forth. But that's our primary tool set. Within ADV as a wider organization, we have quite a few different tools in use, but from the foundation perspective, our general area, it's really DSP and SAP. Now, one of the things that we're working on, as I mentioned, is robotics. Depending on how we proceed with that, there may be some new tools introduced, or new capabilities to DSP may be developed. As we look at building further capabilities within our governance organization, we want to explore better ways to identify, document, and store metadata. I know that there are multiple metadata tools out there, so we will have to decide what's the best tool for our metadata. Right now we have a very basic start, primarily in Excel, but if we really want to do metadata right, then we're going to need a tool to support that. There's a comment that came in; it says: got to prove a future economic benefit with some easy visual metrics. Have you developed any visualizations to show your data quality to your stakeholders? Yes, let me back up right there. So this little donut is where we're starting. We have the basics, the beginnings of this, for our different quality metrics. Now, what this doesn't immediately translate to is cost. We can look at quality metrics and say, okay, we have our accuracy at, whatever, 90%, and we want to get it to 95%. We don't currently have a way to translate that into dollars. So what would be the dollar savings of getting it to 95%? What's the dollar cost of getting it to 95%? That may be something that is a little easier to calculate. But when we do large projects, calculating the cost and the savings is something that we need to do for each of those projects.
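The quality-to-dollars translation described here can be sketched as a simple roll-up: per-record rule results become a per-dimension pass rate (the number behind a dashboard donut), and an assumed remediation cost per defect gives a back-of-the-envelope savings figure. All numbers below, including the cost per defect, are hypothetical illustrations, not ADV's figures.

```python
# Sketch: roll per-record check results up into a dimension pass rate,
# then translate a target improvement into rough dollar savings.
# Every number here is an assumption for illustration only.
def pass_rate(results):
    """results: list of booleans, one per record, for one quality dimension."""
    return sum(results) / len(results)

accuracy_checks = [True] * 90 + [False] * 10   # 90 of 100 sampled records pass
current = pass_rate(accuracy_checks)           # 0.90
target = 0.95
records = 100_000                              # records in the domain
cost_per_defect = 25.0                         # hypothetical cost per bad record

defects_removed = (target - current) * records
savings = defects_removed * cost_per_defect
print(f"accuracy {current:.0%} -> {target:.0%}: "
      f"{defects_removed:.0f} fewer defects, ~${savings:,.0f} saved")
```

The hard part, as Kevin notes, is that the cost-per-defect figure rarely exists off the shelf; it usually has to be estimated per project from time savings and failure costs across the organization.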
So at some point, if we're doing a substantial effort, we're going to go through and say, okay, what are the time savings across the organization if we do this? How does that translate into dollar savings, et cetera? From a performance-metrics standpoint, that's easier to do. From quality metrics, it's a little more challenging to make that correlation to cost savings. But the good thing is, people intuitively seem to understand that if we improve the quality, it's going to improve the lives of most people dealing with the data. Well said. The next question is: what process did you use to identify key data elements? It would be interesting to know how many different definitions for each key data element you came across and how you were able to rationalize that across the company. Yeah, okay, that's a good one. I wish I had my stewards here in the room with me; I think they could each give a slightly different explanation. I did allow the stewards leeway in how they went about determining what the key data elements were. But in principle, these were the ones that were really going to screw up the transaction if we didn't get them right. So, for example, the customer address, the vendor address, the vendor bank information if we're doing electronic payments. And in our material domain, of course, there are very key material master data elements that have to be right. If I recall correctly, the material master team held a vote: each of the team members voted on what they thought the most critical data elements were, the votes were averaged, and whichever elements had the most votes won. I thought it was brilliant. Others maybe didn't go into quite that much detail, but the approach was basically the same: what were the elements that were really going to mess us up if we didn't get them right? And as the master data director yourself, who are you accountable to?
Is that an executive sponsor from the business, or is that the chief data officer of your company? Oh, excellent question. So I report to the vice president of Foundation Processes, and she in turn reports to our CFO. So there's one person between me and the CFO. We are part of the finance organization. We are responsible for the maintenance, well, maintenance is not the right word, we're responsible for improving and managing the asset that is our foundation implementation. So when we look at the global processes, the BPO activities, and the single-instance SAP application, this is the business organization that's responsible for making sure we are being good stewards of the asset, that we're continuously improving our processes, and that when we have legal or regulatory changes, we're implementing them quickly and thoroughly. That is all the responsibility, on the business side, of my boss. And then we have an IT counterpart, the vice president of enterprise applications, who has the SAP responsibility directly. That's all the time we have this afternoon. Thank you, Kevin. I want to thank you for your presentation this afternoon as well as your time. I'd like to remind everyone that the recording of this webinar will be posted in the DGPO members-only section of the DGPO website in the coming days. Back to you, Shannon. Thank you so much. And Kevin, thank you for this great presentation, and thanks to our attendees for being so engaged; we just love all the questions coming in. And that's it. Everything's been covered. I hope everyone has a great day. Thanks, everyone. Thanks, all. Thank you.