Good afternoon, everyone. It's pretty good of you all to come talk about metrics right after lunch, so I appreciate that. As he mentioned, Sallie Mae is an organization that helps people plan, save, and pay for their education. We have servicing centers across the United States. Predominantly we're supporting US citizens, but we do give loans to international students and students going abroad.

What I'm going to cover today is really our journey with our data governance program, because what I want to talk about is where along the way we captured metrics, why they were important, and the types of metrics we captured. Partway through I'm going to deep dive into our data quality program and talk about how we formalized doing business value calculations, and really step you through that process. Hopefully it will be straightforward, and if you have any questions, please feel free to ask. I'll also touch on our data quality dashboard so you can see how we're presenting that information to folks in our organization.

For us, data governance is an enterprise program. It sits underneath our enterprise data management strategy, and the data quality program sits underneath the data governance program — you'll see why that's important. It's basically where we pull together the business and IT folks to work through data issues that span our organization horizontally. We do a couple of things. Proactively, we develop policies and procedures. We react to data issues, coming up with resolutions for those issues. And then what I refer to as ongoing efforts is everything you have to do for stakeholder care, which, as many of you know, takes a lot of your time.

How many here already have a data governance program in place? And how many are here because you're going to be starting one and want some ideas? Okay, so we're about half and half, I think. Which is good, because it's interesting to see over the years how the audience is maturing and there are more and more speakers talking about data governance.

Here's our timeline. I'm not going to go through each one of these in detail, but you can see we've been going now for about six years. I am going to touch on some of these phases, though, just to talk about the metrics we captured during each one — this presentation is all about metrics.

Our first initiative was to identify our enterprise data, so it was very important to show the before and after picture, and these were the metrics we captured. We had a lot of conceptual models, and we took those and drilled down to the logical level. Then my group — I had a data modeling team, so we had access to all the physical schemas — took those and drilled up to meet in the middle, and we removed things like processing data, queue-type fields, that kind of thing. As you can see, we were able to demonstrate that once you got rid of all the noise, only about 15% of our data was truly enterprise data. We defined an enterprise field as a field that was used by two or more business areas. Also from this effort, we were able to show, by line of business, the data that each line of business was using. You're probably like us: you have lots of systems that support multiple lines of business, and this was the first opportunity we had to really go through that.
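As a rough sketch of that kind of roll-up — the field and business-area names below are invented for illustration, and the real work was done in our modeling tools rather than a script — classifying a field as enterprise data when two or more business areas use it might look like this:

```python
from collections import defaultdict

# Hypothetical field-usage inventory: field -> business areas that use it.
# In practice this came out of reconciling the conceptual, logical, and physical models.
field_usage = {
    "borrower_ssn":        {"originations", "servicing", "collections"},
    "enroll_begin_date":   {"originations", "servicing"},
    "letter_queue_flag":   {"servicing"},          # processing/queue-type field, not enterprise
    "marketing_wave_code": {"marketing"},
}

# An enterprise field is one used by two or more business areas.
enterprise_fields = {f for f, areas in field_usage.items() if len(areas) >= 2}
pct_enterprise = 100.0 * len(enterprise_fields) / len(field_usage)
print(f"{len(enterprise_fields)} of {len(field_usage)} fields "
      f"({pct_enterprise:.0f}%) are enterprise data")

# Flip the inventory around to answer "what fields does originations use?"
by_line_of_business = defaultdict(set)
for field, areas in field_usage.items():
    for area in areas:
        by_line_of_business[area].add(field)
print(sorted(by_line_of_business["originations"]))
```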
So if the originations area asked, "Well, what tables do we use in this system?", we could now tell them and show them the fields they were using. These metrics became very important, not only to let everybody know what the scope of our project was as we were laying the groundwork for data governance, but also because we could reuse them later — I'm going to talk about the maturity model we developed, and against that baseline we could show how many fields were standardized and what percentage of the total that is. So having a metric like this will help you over the years.

Our next initiative was a pilot project called the Seven Data Elements project. As a company we were moving from predominantly marketing to lenders and schools to marketing to consumers — students and their parents — and management asked us to look at seven fields where they thought the data wasn't very good quality and wasn't timely. So we took that on, and we went through a two-day JAD session to identify all the issues and possible resolutions. We came out of that with 43 action items that needed further investigation, and for the next six weeks we met on a weekly basis, working through those action items and tracking that information. As a metric, we also tracked the number of implementations that were successful, and it turned out that out of everything that needed to be done — which included both business changes and IT changes — we implemented about 75% of the changes. You're going to see that the 70s are a recurring theme here: there are always a few that don't get done because other priorities come in and take precedence. We were actually asked by upper management to keep tracking those changes for the next six months to a year.

Now, at that point I wasn't very good at metrics, and I realized after we had gone through most of that that we really needed to show the value to the organization. So we went back, worked with the marketing VP, identified some scenarios, and were able to show the business value for that project: increasing revenue by $2.4 million, and reducing costs because we were moving from snail mail — sending letters out to borrowers for marketing campaigns — to email, which was a big thing at the time. We were able to show those costs, and that's one of the key lessons learned: look for something like that where you can show actual business value very quickly. This information also funded some strategic corporate initiatives for IT the following year.

Now, very quickly, here's our organization, just so you know what the different groups represent when I refer to them. We have a data governance office, which is very small — just a couple of us — and our role is to facilitate and coordinate the whole program. The data governance council is made up of business representatives from all the lines of business in the company; there are probably about 25 folks. Then at the bottom we also have business subject matter experts — additional people that those data stewards can tap — and we have IT subject matter experts and data folks as well. The new group we started a couple of years ago is the data quality services team, which does a lot of our active profiling, and you'll see they're very involved with keeping the dashboard up and running. So that's just a very quick view, because the council plays an integral role in helping you with your metrics. Now, anything we do in the data governance program has to affect one or more of these corporate drivers.
So we're always asking: is this going to increase revenue? Are we decreasing our costs and complexity? Or is this really a compliance or risk issue that we need to deal with no matter what? If you keep grounding yourself and asking how something fits in, that will also help set you up for how to deal with the metrics and categorize them into one of these buckets.

Early on we weren't very good at quantifying metrics, but what we did was go out and look at industry publications — here's an example from Harvard Business Review. We also talked to marketing and said, if we can give you better information to do better marketing campaigns, how will that help you? In fact, the marketing person we sat down with came up with the very conservative estimate at the bottom. If we assumed we had a million mailings and the average price of our loans is $4,000,000 — she said it was very, very conservative — and we just said we improve our rate by half a percentage point, that was going to result in $50 million more in loan volume. And she was very comfortable with that; it was very, very conservative. I'll keep saying that: one of the things we make sure we do is always go with a very conservative number, so nobody can question where that number came from. I'll give you some more examples of that. At least when we showed this information to upper management, they could all say, yeah, that makes sense, and nod their heads, even if you couldn't quantify it at the time.

Here are some examples of managing cost and complexity. I won't go through them all, but a lot of these examples ring true to our organization, and upper management knew exactly what we were referring to. I also included some examples of compliance initiatives. Again, most people won't give you too much trouble with those, because they know you have to do them. We're regulated by the federal government — the Department of Education for our federal loans — and there are things we need to do to make sure we're in compliance. So go out there, interview and talk to your business areas, and they'll help you come up with these lists; that's how we started.

One of the things we did when we were designing the data governance program was to develop a maturity model for our data. This was based on Gwen Thomas's data maturity model, and we adapted the levels so they make sense to our organization. It was a wonderful way to tag our fields. First of all, remember all those enterprise data fields I talked about, from that earlier project? Those all immediately became a level two, because they were identified and they were enterprise data. Then as we worked on the names and definitions, a field moved up to level three. If we identified and worked through all the issues associated with that field, that's a four, and five means we fixed all the issues associated with it. All of this is kept in our data modeling tool, so we can generate reports, and at any point in time I can show you how many we've done and how they've moved from level to level. Again, a great metric that you can use for upper management.
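A minimal sketch of that level tagging and reporting, just to make the idea concrete — the levels really live as metadata in our data modeling tool, and the field names and level assignments below are made up:

```python
from collections import Counter

# Maturity levels, roughly: 2 = identified as enterprise data, 3 = name and
# definition standardized, 4 = issues identified, 5 = issues resolved.
field_levels = {
    "borrower_ssn": 5,
    "enroll_begin_date": 4,
    "loan_status_code": 3,
    "school_cohort_rate": 2,
}

counts = Counter(field_levels.values())
total = len(field_levels)
for level in sorted(counts):
    print(f"Level {level}: {counts[level]} fields "
          f"({100.0 * counts[level] / total:.0f}% of tracked enterprise fields)")
```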
Another thing we found was that we were doing a lot of work on different things and not really getting credit for it. So we started tracking our time by these different buckets, because in addition to data standardization we were doing a lot of master data management support and helping on project teams, and we wanted to show where our time was being spent.

I'm also going to talk about our issue reporting database. In addition to the time tracking, we said, okay, that works really well for tracking the issues, but we need a way to track all this other stuff we're doing. We do have an issue tracking database — it's just in Access. One thing you have to think about is how to set up your issue numbering, and we've gone to numbering by year: DG2011-001 would be the first issue for 2011, and then we do the same thing for 2012. As you can see, we've logged 184 major issues, and 76% of those have been closed — again in the 70s, so about three quarters of our work we're able to accomplish. We have lots that are in progress and several that are open, meaning they're waiting to be looked at. There are also some in a tracking status that I'm not showing here: even if something is an IT implementation or a change to a business process, as the data governance office we track it and report on it later.

As we were going through data governance, around 2009 we got funding to work on data quality, and we said, okay, those same three corporate drivers that we use for data governance will apply to data quality. Here's a sample of how we represent that, and I'm going to walk through how we get to these slides. This is just a summary slide — I like the graphic, so I left it in. This poor guy is working all the time to capture the metrics, and it is a lot of work; you always have to think about how to update it and how to show it to management. And as I said, just a reminder: use industry publications, and work your way to where you can quantify the benefits.

When we started data quality, we said it would sit underneath the data governance umbrella. This is where the council comes back into play, because they prioritize and rank anything we're going to work on: they feed us the issues and they prioritize which ones we're going to oversee and resolve. Our whole goal was to develop data quality as a competency, just as we did for data governance. We worked with Danette McGilvray, and this is her Ten Steps methodology, which would take a day to go through to do it justice. But I at least put it here so you would know we had a methodology behind what we did. It's important to have a framework — just like we have a framework for our data issues so that every issue is treated the same — and ideally you're trying to resolve the root cause, fix the problem, and prevent it from occurring again.

Now, this is when we got training. Up until then we had done a lot of grassroots efforts, and as you can see, we were moving along with metrics; now we were going to learn some techniques, and the techniques highlighted in yellow are basically what we followed. The anecdotes — that's pretty simple, right? That's just collecting the stories and sharing them. But it worked really well, because everyone in our company understood or knew about that particular issue, or remembered the time something happened and everybody heard about it. We also tracked the uses of the data, and you'll see that in the business value calculations: how is the data being used currently, and how could it be used in the future?
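Circling back to the issue-tracking database for a second, here's a minimal sketch of the year-based numbering and the closed-versus-open roll-up. Ours is just an Access database, and the issue rows below are invented — only the ID pattern and statuses come from what I described:

```python
from collections import Counter

def next_issue_id(program: str, year: int, existing_count: int) -> str:
    """Build IDs like DG2011-001: program prefix, year, then a sequence number."""
    return f"{program}{year}-{existing_count + 1:03d}"

issues = [
    {"id": "DG2011-001", "status": "Closed"},
    {"id": "DG2011-002", "status": "Closed"},
    {"id": "DG2012-001", "status": "In Progress"},
    {"id": "DG2012-002", "status": "Open"},
]
issues.append({"id": next_issue_id("DG", 2012, 2), "status": "Open"})  # DG2012-003

by_status = Counter(issue["status"] for issue in issues)
closed_pct = 100.0 * by_status["Closed"] / len(issues)
print(dict(by_status), f"{closed_pct:.0f}% closed")
```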
My favorite technique is the Five Whys, because it's very simple, it worked really well for us, and we continue to use it going forward. The ranking and prioritization, number five — that's what we ask the business to do. When we meet with the council, they're constantly ranking and prioritizing our work, and we're probably on our fifth or sixth phase of rolling out business value metrics. And then what you need to do is quantify the cost of low quality, and you're going to see how we do that — both to show potential value ("here's the value we could save the organization by cleaning up this data") and, in the past year, to track actuals as well.

So let's dive into some examples of each of these. For the anecdotes, what sits behind each anecdote is probably a one-to-two-page paper with the details of the actual situation: what occurred, and what the issue was as a result of the data being wrong or erroneous. We take all those details and roll them up into number two, which is really summarized examples. That was very easy for management to look at, but then we said we can do better than that, especially when you have to present to a CIO or any C-level individual. So we further summarized those two-page papers into a couple of bullet points — but for us, when our management looks at those, they know exactly what we're talking about. This is here mostly to show you an easy way to present that information.

Now, the Five Whys is, like I said, a very simple technique to use. We do student loans, and there are a lot of what we call borrower benefits — certain programs we offer the borrower if they're good with their payments. For example, if they make their first 24 payments on time, we might give them a percentage off their interest rate or some other incentive. Of course that benefits the borrower, so we started to look into these fields, because it turned out some of that data was wrong. You would ask, well, why is that an issue? In this case the business — the servicing area — would say, well, that means the borrower benefit is inaccurate, and therefore the borrower isn't getting credit for what they've done. Okay, well, why does that matter? Well, they might tell you it matters because then we have to get on the system, go find the issue, and fix it. And you keep digging: okay, why else does it matter? Well, it turns out those borrower benefit fields have to be there and be valid in order to go through the process where we push loans to the Department of Education — that's called the PUT process. If we're not able to put the loans through, that increases our funding costs, so we're actually losing revenue because it's costing us more money. So now, not only is there a cost to fixing the borrower benefits, there's a decrease in revenue that you can quantify from having that data that needs to be fixed. Those are the kinds of conversations you have when you sit down with the business area — you just need to keep drilling down and down as you go.

So when we did the data quality program, just like with our data governance program, we did a pilot project, and I highly recommend doing that.
It really helps you test your roles and responsibilities, and it helps you identify the framework you're going to use and how everything's going to work. So we met with the data governance council, and that was the first time we asked: what would be the top fields you would want us to actively monitor? Of course they came back with a pretty big list. One of the first things we did — we had time-boxed this, because we wanted to get the pilot done within a couple of months — was set aside anything that was really complicated, like looking at three different fields together, where we didn't feel we had the expertise yet, or where it turned out we didn't have the data to profile against, meaning we didn't have access to those systems or it would take time to get that set up. We ended up with a list of 10 fields we were going to look at. That turned out to be a pretty good number, because when we looked at what the business rules would be, it evolved into 22 actual business rules — the business rules vary, so you'll have a separate one for each source system, or you might be looking at three different things about a field, and that turns into three different business rules.

What we wanted to do was find out from the business who really cared about those fields. This is a sample of the spreadsheet we sent out to the council when we prioritized: here are the top 10 — do you care or not about these fields? By line of business, they just checked the box. That told us who we needed to interview and get more information from. If the collections area didn't care about a field, we weren't going to interview them, right? We used this to drive the interviews and know which business areas were going to be involved.

Now, one of the things we had to figure out is: what are the costs if you have poor quality data? We looked at some industry lists — Larry English and David Loshin, and Danette has lists in her book as well. We took a look at all of those and asked, of these things, what makes sense for our company? What do we want to track about the quality? Our chief data steward is great because she's worked in the servicing center and she's also been our controller, so she's a very good finance person. In one of our meetings she said, you know, a lot of this looks very much like our general ledger when we roll our costs up for the company — some of this ties in pretty nicely. So a good lesson learned there: we picked the categories that tie to our general ledger. Every business area reports their finances according to that chart of accounts, so they knew what those buckets represented, and that worked out really well.

So we had to define each of the business value categories and then come up with questionnaires for interviewing these folks. Here's an example, which is easier to follow. We have "lost or missed opportunities" in the line of business; we give that a description, and we map it to a business value category — in this case, funding impact. Then we take that category, and if you look down at the bottom, it shows that funding impact went under the "revenue generated" bucket. The next example is workaround costs and decreased productivity: poor data quality causes manual workarounds, staff have to correct the data, et cetera, so we defined everything that tied to staff costs, and staff costs go under "costs avoided."
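A tiny sketch of that mapping, just to show its shape — the impact and category names are the ones from these two examples, but the structure is illustrative rather than how our spreadsheet is literally built:

```python
# Line-of-business impact -> business value category -> roll-up bucket.
# The roll-up buckets mirror the corporate drivers: revenue generated, costs avoided.
CATEGORY_TO_BUCKET = {
    "funding_impact": "revenue_generated",
    "staff_costs":    "costs_avoided",
}

IMPACT_TO_CATEGORY = {
    "lost_or_missed_opportunities":            "funding_impact",
    "workaround_costs_decreased_productivity": "staff_costs",
}

def bucket_for(impact: str) -> str:
    """Walk an interview finding up to the bucket it reports under."""
    return CATEGORY_TO_BUCKET[IMPACT_TO_CATEGORY[impact]]

print(bucket_for("lost_or_missed_opportunities"))             # revenue_generated
print(bucket_for("workaround_costs_decreased_productivity"))  # costs_avoided
```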
We did that for all of them, and we also had a section for intangible benefits. So now we had the framework for how we were going to capture all that information. We used it to come up with these questionnaires — just a Word document that we use. One thing we did, maybe because some of us had been there a while and know the business areas fairly well even though we're in IT: with my chief data steward's help, we looked at each field and what the business rule was going to be and made an educated guess — okay, we think this is going to impact funding; we think there are workaround costs associated with this; we think they use this in marketing campaigns, so it's going to impact them. We would populate the questionnaire with the categories we thought were going to apply, and the ones we thought were maybes that we needed to ask about, so that when we went into the meetings we were prepped and ready to go. That way you can control the meeting, the discussion, and the questions you're asking. Then we would fill it out and add any follow-on action items and who we needed to contact — because sometimes, if you were looking at address fields, for example, we might work with third-party vendors, and the business would have to tell us, well, we send that out to Novus or somebody, and they'd have to go find out what the cost of doing that was.

One of the other things we did — and on the next slide I'll show it more graphically, so it'll be easier to follow — is that in the beginning we just had the business areas check whether they cared or not. We started to realize that if we want to determine who the approver, or the data owner, or whatever you want to call them, will be for a field or a business rule, we need to know more. We need to find out whether it's a low, medium, or high priority, and we defined what those mean. High: if this data is wrong, it's a complete failure or has major implications — it could be something where we're going to be out of compliance — and we need to know immediately. Medium: there is an impact, it's not as significant as a high, but we have to take a look at it. And low: there are some minor impacts, but it's not a big thing, so if resources are tight, those are the ones that get looked at last.

When developing the calculations, we also needed to look at some results of what the data looked like and set up criteria ranges. For each business rule, we came up with green, amber, and red ranges. The reason we picked green, amber, and red is that our PMO uses those colors for all our IT projects to indicate how the projects are doing, so everybody knows what they mean. Green: everything's okay, you're falling within the range, you're right on target. Amber: well, you've failed, but it's not that bad yet. And red is totally unacceptable. What happens now, since the pilot, is that when we send out that original survey to the business areas, instead of just checking a box, we have them mark high, medium, or low, and then we take the highest ranking and that's what the business rule gets.
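As a rough sketch of that roll-up, and of the green/amber/red ranges — the rule name, thresholds, and rankings below are invented for illustration; the real thresholds get set rule by rule with the business:

```python
PRIORITY_ORDER = {"low": 1, "medium": 2, "high": 3}

def rule_priority(lob_rankings: dict) -> str:
    """A business rule takes the highest priority any line of business gave it."""
    return max(lob_rankings.values(), key=PRIORITY_ORDER.get)

def rule_status(pass_rate: float, green_at: float, amber_at: float) -> str:
    """Map a profiling pass rate onto the PMO-style green/amber/red statuses."""
    if pass_rate >= green_at:
        return "green"
    if pass_rate >= amber_at:
        return "amber"
    return "red"

rankings = {"originations": "high", "servicing": "medium", "marketing": "low"}
print(rule_priority(rankings))                           # high -> originations approves
print(rule_status(0.97, green_at=0.95, amber_at=0.85))   # green
print(rule_status(0.83, green_at=0.95, amber_at=0.85))   # red
```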
So in this example, the check for Enrollment Period Begin Date got a high ranking because several business areas marked it as high. The high also tells us which area — let's say originations — cares the most about that field, or marked it as high; they are then tapped as the approvers and have to follow through on that field. So that's one improvement we've made to the process.

Another thing we realized is that it's very important to do some monitoring of the field before you get together with the business, so that you can show them what's actually out there and you've got some real facts to look at. So we'll run the profile: we've got the business rule done, we run it, and we can tell them how many records we ran against, how many were successful and how many weren't, and that helps them determine the status criteria ranges on the far right. In this example it's a relatively innocuous field, so the ranges aren't that tight — it doesn't get marked as red until it drops to 84% or less. We have other business rules, having to do with paying back the loan and the term counters, where if they're negative — one record with a negative number — it goes right to red and somebody works it. There was a problem with that once; it's actually one of those anecdotal stories that everybody knew about. Code was put in to fix it, and we have not seen one since we've been running this. So that gives you an example. And for all of these we're also showing intangible benefits, because they are important — it's just another reminder that poor data can affect the reputation of the company or cause poor customer service, even if there's no direct quantifiable calculation.

Just to follow up on the pilot: we ended up going back and having more meetings, because we were still learning the process. It took more meetings then than it does now — now that we've got our process down and we're much better at it, we can accomplish a lot in a single meeting. And we just followed up with everyone in emails.

Now, this is the important thing: the spreadsheet. It may be hard to read, so I'll walk through an example, and this will help you understand the Five Whys scenario. These are fictitious numbers, but let's say our business rule is "state code is invalid." We're meeting with the servicing area and we'll ask, okay, what happens if the state code is invalid? Well, the letter may not get delivered, so it gets returned. Why is that an issue? Well, it could be a compliance issue, because if we have to send somebody a privacy notice or some other compliance letter, they're not going to get it — we won't even go into that right now. Then we'll ask, well, what happens when it's returned? Well, we have to have somebody in the servicing center look it up; they may go to the white pages, they may have to pull the original application to try to get the information. And that's when you start digging. Okay, where is that done? Which group usually does that? Oh, that's one of our servicing groups. Okay, are they in Pennsylvania or Delaware? Most of that is done out of Pennsylvania. Now, what we did was go to HR and get our hands on the list of salaries for our company, by geographic location and by level.
Okay, so now, because they said Pennsylvania, I can go to the Pennsylvania grid and look at salaries, and we'll ask, okay, what level person usually does that work? Well, it's usually anyone from a level three to a level five. So right there I'm going to take level three, because it's the lowest, most conservative number. To the salary we add the percentage that gets applied for benefits, so that number includes all of that, and we break it down to what the cost is per hour and then per minute. Then we'll ask: how long does it take them to make that change? Well, it's usually four to six minutes. So what I'm going to plug in is the four minutes. The overall cost of making a change to one record is very small, but if you have thousands and thousands of records with a problem, the numbers add up pretty darn quick.

So that's what we did. We have a spreadsheet with a tab for each business rule. On the tab for that business rule are all the calculations, divided up by whether they're revenue generated or cost reduction. And then we have one tab on the whole spreadsheet that's for intangibles, so we keep those in one place. Okay, is everybody with me so far? So now we have 95 tabs, because we have 95 business rules. The first tab — so now that would be tab 96 — is the summary area. We run the metrics on a weekly basis, but we update the dashboard only once a month, and once a month we have to put all the numbers in, and you don't want to be going from tab to tab to tab. So, for business rule one, I have a guy in my group who's great with Excel, and he linked all those calculations to the front page. When business rule one runs, let's say there are 10,001 errors: all you have to do is go into column D, "number of errors identified," and put in 10,001. It automatically calculates and shows me all the value for potential costs avoided and potential revenue generated. That all happens very automatically.

Then, what we've started to collect — you'll see it down at the bottom right — is actuals. When we work the issues, whether it's a business area fixing the records or we have to open a service desk ticket with another group to make the changes, one of the things we've asked folks on the council to bring back to us is the metric of how many they fixed. So when they say, okay, out of this 100 we fixed all 100, that now goes in, it populates all those calculations, and we're showing that on the dashboard as well. That one's the tougher one, right? Because you have to follow it and be diligent about keeping track of the metrics, and we haven't found a way to automate that any better, other than through the service desk tickets.
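To make the arithmetic concrete, here's a minimal sketch of the per-record calculation that sits behind each tab. The salary, benefits percentage, and error count are invented placeholders; the one real rule is that we always take the low, conservative end of any range:

```python
WORK_MINUTES_PER_YEAR = 52 * 40 * 60  # rough full-time minutes in a year

def cost_per_minute(annual_salary: float, benefits_load: float) -> float:
    """Fully loaded labor cost per minute: salary plus the benefits percentage."""
    return annual_salary * (1 + benefits_load) / WORK_MINUTES_PER_YEAR

def potential_cost_avoided(errors: int, minutes_per_fix: float,
                           annual_salary: float, benefits_load: float) -> float:
    """Errors found x conservative minutes-per-fix x loaded cost per minute."""
    return errors * minutes_per_fix * cost_per_minute(annual_salary, benefits_load)

# Hypothetical level-3 servicing rep in Pennsylvania, 30% benefits load,
# 4 minutes per fix (the low end of the 4-to-6-minute estimate), 10,001 errors.
value = potential_cost_avoided(errors=10_001, minutes_per_fix=4,
                               annual_salary=40_000, benefits_load=0.30)
print(f"Potential costs avoided: ${value:,.2f}")
```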
Okay, so we want to present all this information in a way that's easy to look at but that you can also drill down into. So we developed a dashboard, and the important thing is that you can see it at a very high level or drill down as you go. I think the best thing is to go through the chart, and you'll see things a little better.

When we started, conceptually we said, wouldn't it be nice to look at the state of data quality — and in my mind I was thinking of more of a heat map, so you could see where the dots fell among the green, amber, and red. We knew we wanted to track business value; that was always on our radar. And then, because of the data governance program and knowing how important it is to keep showing value, we said we would do the same thing for the data quality program: track the initiatives and where our time is being spent. The reality, on the right-hand side, is that when we went to implement the tool we were using for the business intelligence piece, we couldn't get the heat map to work, for example, so we went with something else.

At the top — and we'll go into these a little more — is the state of data quality. That shows you at a glance how many of our business rules are in green, amber, and red; you can click on those and drill down, and I'll show you another way to get there — like any dashboard, there are multiple ways to get somewhere. For business value, we're showing the projected versus the actuals, and there's quite a bit of difference. The number of DQ issues at the bottom includes not only the active monitoring but also the one-time, ad hoc checks that we do. And then the number of data quality engagements — we keep this updated all year long; I think this one might have been done at the end of the year, but I've got the numbers covered up anyway. We track these things so everybody can see how much work went on in other areas.

Here's a better view — it's easier to see. This is the February copy, and you can see the number of DQ engagements actually looks different; that's because this is just for 2012 and doesn't show everything we did for 2011. This works for most upper management, who mostly just want to see where the business value is or how many of our fields are in amber or red. But for the next group of folks — usually our data stewards, who are drilling down and looking at this stuff — this is the next lower level. I think we've covered most of this except the current week and previous week columns, and they love to see the trends. We show them whether the trend is up or down, and that tells them, you know, geez, I really haven't had time to go look at this in the last week or two and the trend is going down — I'd better do something about it. We're constantly struggling with getting people to have enough time to work on some of this, because this is just one of the roles they play and they have full-time jobs doing other things. But once in a while a crisis situation comes into play, or an issue — maybe something with customer addresses comes in through compliance — and then it gets attention. The other areas are fields and things that we've already talked about.
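On those trend arrows, the comparison itself is trivial — a sketch, with made-up pass rates:

```python
def trend_arrow(current_pass_rate: float, previous_pass_rate: float) -> str:
    """Compare this week's pass rate to last week's for the drill-down view."""
    if current_pass_rate > previous_pass_rate:
        return "up"
    if current_pass_rate < previous_pass_rate:
        return "down"
    return "flat"

print(trend_arrow(0.91, 0.94))  # "down" -- time to go look at that rule
```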
This gives you more detail on the things we're tracking for program performance. Some of the things we do: we provide consulting support to projects, so we're tracking that. We may actually implement projects — in other words, we do all the business rule development and take charge of the whole thing — and we treat those separately from just helping out a business area or providing consulting. We also do training on the tool: if there's a new area, like a new finance group, or somebody on the enterprise data warehouse team who wants to be trained, a couple of my folks will do that. And we support the tools, so we're constantly doing the upgrades and everything associated with that, in addition to the ongoing monitoring.

Just as we have an issue tracking database for data governance, we have one for the data quality issues as well, and it lets us generate reports. We have a website where we put all our reports, and we slice and dice them: you can see all the DG issues, all the DQ issues, all that are open, all that are closed, all that are works in progress — any way you want to look at it. It helps us stay on track so we don't lose sight of what we need to be working on and following up on.

[Answering a question:] Well, no — it kind of depends. I think most of the issues will show up in the bottom left quadrant; that's what I meant about the ad hoc ones plus the ongoing monitoring. We tend not to create an issue if it's a project-related thing; we keep another spreadsheet for the stuff on the right-hand side. I'm sure there is a little gray area, but we have to decide which side it's going to go on. [Answering a question:] Yes — it's actually the same database, and we can separate out our data governance issues from the data quality issues. In fact, when I was talking about DG2012-005 earlier — well, the DQ issues start with DQ. When we started out, we weren't that fine-grained, and you find after years and years of doing this that it's hard to keep track, so putting the year in the ID for the issues has really helped. Think about something like that; over time it's helpful.

Sure — a good data governance issue might be helping to define the business rules for data coming into our corporate information system, where they're asking which source system trumps the other. That really has nothing to do with quality; it's more of a business-rule-driven thing. Or it's standardizing a field. We have some examples now where we're getting data from a system outside our organization, so for the enterprise data warehouse we've got apples and oranges: the field as defined in our systems is really a month-end date, versus the other system, where it's a cycle date, and the cycle date isn't necessarily month end. So the council will work together to derive what the definition should be, or whether it's really two fields. It doesn't necessarily involve a quality issue per se. Does that help? Okay.

Everybody always wants to see this: what does our infrastructure really look like? We use DataFlux — that's our data profiling tool. We actually do most of our querying against our enterprise data warehouse, which is on Netezza, and we go against the staging tables, so the data is exactly how it appears in the source system. That made it very easy for us to do one-stop shopping: we don't have to have access to all those systems and deal with the nightmare of security and all of that. We've been lucky — about 80% of what we're looking at, we can look at through the EDW staging tables. So the tool runs against that, and we store the results of those runs in an Oracle database. And then there are the Excel spreadsheets we talked about that hold the business value calculations.
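Just to sketch the shape of that weekly run — this is a generic stand-in, not DataFlux itself, and the table, rule, and column names are hypothetical; SQLite stands in here for both the Netezza staging copy and the Oracle results store:

```python
import datetime
import sqlite3  # stand-in for the real databases in this sketch

# A business rule here is just a pair of counting queries against the staging copy.
RULES = {
    "enroll_begin_date_not_null": {
        "total_sql": "SELECT COUNT(*) FROM stg_loans",
        "error_sql": "SELECT COUNT(*) FROM stg_loans WHERE enroll_begin_date IS NULL",
    },
}

def run_rules(edw, results):
    """Profile each rule against the staging tables and store one summary row per run."""
    results.execute("""CREATE TABLE IF NOT EXISTS dq_results
                       (run_date TEXT, rule TEXT, total INTEGER, errors INTEGER, pass_rate REAL)""")
    for name, rule in RULES.items():
        total = edw.execute(rule["total_sql"]).fetchone()[0]
        errors = edw.execute(rule["error_sql"]).fetchone()[0]
        pass_rate = 1.0 - errors / total if total else 1.0
        results.execute("INSERT INTO dq_results VALUES (?, ?, ?, ?, ?)",
                        (datetime.date.today().isoformat(), name, total, errors, pass_rate))
    results.commit()

if __name__ == "__main__":
    edw = sqlite3.connect(":memory:")      # pretend this is the Netezza staging area
    edw.execute("CREATE TABLE stg_loans (enroll_begin_date TEXT)")
    edw.executemany("INSERT INTO stg_loans VALUES (?)", [("2012-01-15",), (None,)])
    results = sqlite3.connect(":memory:")  # pretend this is the Oracle results database
    run_rules(edw, results)
    print(results.execute("SELECT * FROM dq_results").fetchall())
```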
Our dashboard was developed with OBIEE — Oracle Business Intelligence. So that's basically what we have in place.

Very quickly: we ended up creating this graphic to show the business — the data stewards — what we expect of them and what the process looks like. Behind it there could be a couple of pages of detailed action items for them, but basically it just walks them through what we've been talking about. They're going to prioritize and give us the information; the data quality services team is going to do the monitoring; and then it's brought to the council to look at. Based on the monitoring results, if it's green, it stops there — nothing else needs to be done. If it's amber or red, they're notified, and I'll show you on the next page how we do that. Then it's up to them to go in, look at the data, do the root cause analysis and figure out what's really wrong, get the current records fixed, maybe put something in place to prevent it from happening again, and communicate through the whole thing.

What we do — and this is really nice — is an automated email that goes out to any approver who has a business rule showing amber or red. They automatically get a notification that says: by the way, you might want to take a look, one of your field's business rules needs attention. We decided as a team we wouldn't send anything if it was green; they can just assume that, because nobody likes getting emails all the time — you stop looking at them. I included this as an example, and we also put together a quick little chart for the approvers to tell them what their responsibilities are. We always do kickoff presentations for the data stewards or the IT folks, and part of those presentations covers what's expected of you and what the roles and responsibilities are — in this case, what we expect you to do with a business rule when you're looking at it. Even if they're not the ones to fix it, they are ultimately responsible for it, so this gives them all that information.

Going forward — and I think we're just about on time — my focus is going to be doing more and more of the actuals. We're trying to get the data cleaned up and fixed so that we can show more value to the company, which we'll continue to do; that's where the root cause analysis and the tracking of all the fixes need to be. And then we'll continue to expand. We do a lot of project work, especially in the loan acquisition area, where we bring deals of loans onto our systems, and in many areas of finance, where they ask us to help with data reconciliation. We're actually being used in our collections area as well now — they do a lot of manual work. And, for example, we provide a service to schools where we track their cohort rates, which is basically their default rate by segmentation of students; if it gets too high, they lose their standing with the Department of Education and won't be able to offer federal loans to their students. So we actually profile the data in the files that come in from the schools, and we took a very manually intensive two-week process that is now automated and runs pretty much 24/7: it says, okay, these files that came from X University have some bad data, we need to let them know, and they need to resend so we can get that rate put out. [Answering a question:] DataFlux — yes, DataFlux is the tool we're using for that.
Any other questions? Well — the tables it runs against... okay, in case anyone couldn't hear, he asked why we use the warehouse and why we don't check closer to the source. We're checking against the warehouse, but those staging tables are an exact replication of the source — and in fact we run the profiling tool to verify that. So it's checking A to A: when A went to B, it's still A. In a way it is the source, and it was an easy way to go to one place and get all the data we needed, because we're looking at multiple sources, without going through security for every system and then worrying about somebody saying, well, this is our busy time, it must be you guys running your checks that are slowing us down — even though it's a read-only tool. It has worked out really well, so if you have that opportunity, it should work well for you too. Any other questions? I know it was like drinking from a fire hose. Good luck with your metrics, and let me know if anybody has any questions. Thank you.