Now, let's do a quick review of the data quality tools in DHIS2. In DHIS2, we have four key data quality apps. The first one is the WHO Data Quality Tool. The second one is the DHIS2 Data Quality app. We also have the DHIS2 Data Visualizer and Dashboard applications, as well as the Reports app. We'll quickly touch on each one of these to show its unique functionalities, as well as how they complement one another.

Starting with the WHO Data Quality Tool: the primary role of this tool is for national annual data quality reviews. The tool can also be used at lower levels, such as district and facility, for routine data quality reviews. Essentially, this tool bakes the analytics from the WHO data quality review guidelines that I mentioned earlier in the presentation into a DHIS2 application and automatically produces those same analytics for you, right there in DHIS2. So it presents you with dashboards for consistency over time and internal consistency, and it enables you to do outlier detection. This tool was developed several years ago now. The goal is that it will continue to be supported, but all of the data quality analytics that this tool introduced to DHIS2 are being merged into the standard DHIS2 visualization applications, which are the Data Visualizer app and the Dashboard app.

The validation rule analysis in the Data Quality app is another really powerful feature. As we mentioned previously, a validation rule is simply a predefined logical relationship between multiple data items. So in this example, you can see ANC1 and ANC2. This example is saying that ANC2 should be less than or equal to ANC1. That's a logic statement based upon a general rule of thumb. Now, there are certainly situations where ANC2 can be greater than ANC1.
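As a rough illustration, a validation rule like "ANC2 should be less than or equal to ANC1" is just a comparison between two aggregated values. The sketch below is not DHIS2 code; the function name and operator label are purely illustrative:

```python
# Illustrative sketch of evaluating a validation rule such as
# "ANC2 <= ANC1". In DHIS2 the rule, operator, and data items are
# configured as metadata; here they are hard-coded for clarity.

def rule_holds(left_side, right_side, operator="less_than_or_equal"):
    """Return True if the rule is satisfied, False if it is violated."""
    if operator == "less_than_or_equal":
        return left_side <= right_side
    raise ValueError(f"unsupported operator: {operator}")

# e.g. 32 ANC2 visits reported against only 14 ANC1 visits:
anc1_visits, anc2_visits = 14, 32
print(rule_holds(anc2_visits, anc1_visits))  # False -> flag for follow-up
```

A violation doesn't prove the data is wrong; it simply flags a value that breaks the expected relationship so someone can follow up.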
We see in some countries with seasonal trends in births that the cohort going through ANC2 is a different cohort than the one going through ANC1. So practically speaking, it is very much possible for ANC2 to be greater than ANC1. However, it's extremely uncommon. In most countries, the dropout between ANC1 and ANC2 is relatively predictable and reliable, so when ANC2 is significantly bigger than ANC1, that's usually an indication of a data quality issue. That's what we see being indicated here, where 32 ANC2 visits are reported against only 14 ANC1 visits. Again, in most countries, this would be evidence of a data quality issue somewhere in my ANC2 visit counts.

As we mentioned earlier, this analysis can be run at different points: during data entry, while the user is actually entering the data; ad hoc, by any user who has access to the application, at whatever frequency they want; or automatically, on a schedule. And when it's run automatically or ad hoc, you can make sure it sends alerts and notifications to users. I think this is an extremely important functionality of DHIS2: sending data quality notifications to the people who can actually address them.

In the DHIS2 Data Quality app, we also have outlier detection, which produces a report very similar to the validation rule analysis. You can see here that we're able to run outlier analysis using the Z-score and modified Z-score in the Data Quality app. This is good practice if we want to analyze large amounts of data for outliers at one time: for example, if I want to find the outliers across the entire country for an entire dataset over the last six months.
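For reference, the two statistics mentioned here can be sketched in a few lines of Python. The 3.5 cutoff used below is a commonly cited default for the modified Z-score, not necessarily what a given DHIS2 instance is configured with:

```python
import statistics

def z_scores(values):
    """Standard Z-scores: distance from the mean in standard deviations."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def modified_z_scores(values):
    """Modified Z-scores, based on the median and the median absolute
    deviation (MAD); more robust because extreme values distort the
    median far less than they distort the mean."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    return [0.6745 * (v - med) / mad for v in values]

# Hypothetical monthly ANC1 counts for one facility; one obvious spike.
values = [110, 95, 102, 98, 105, 910]
flagged = [v for v, z in zip(values, modified_z_scores(values))
           if abs(z) > 3.5]
print(flagged)  # [910]
```

The modified Z-score is the more robust choice when the series itself contains the extreme value you're trying to find, which is exactly the outlier-detection scenario.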
In most countries, that is going to be a lot of data. Once you start this job, DHIS2 is able to process that large amount of data and produce a single report.

Now, looking at the data quality analytics on the dashboard. As I mentioned earlier, the goal is for the Dashboard application to support the same data quality analytics that are currently available in the WHO Data Quality Tool. When the WHO Data Quality Tool was initially developed several years ago, it was a big step forward for visualizing data quality in DHIS2. But what we have seen from its use is that users want the data quality analytics more easily available to them. Using that application required users to be trained on a whole separate application, and to have the authority and permission to view it. It also presented a bit of a usability barrier: users had to click out of the dashboards, which almost all DHIS2 users know how to use, and into a separate application that presented data a little differently, with a different dashboard design. We found that for many users, especially at lower levels, this was confusing, and they were hesitant to do it. So we've been merging the functionalities of the data quality tool into the standard dashboard, and that's what you see in this example.

In this example, we have a year-over-year chart right on the standard dashboard, making it very easy to see trends over time from 2020 to 2021; we can see that the data follows a similar trend. We're also looking at a scatter plot showing outlier detection. In the image on the right, each one of the dots represents a health facility, and we are looking at the comparison between ANC1 visits and ANC2 visits.
We have defined an acceptable range, or threshold, for this relationship, and that's what you see as the lines going across the chart. Here we have probably applied the interquartile range, a statistical methodology for detecting outliers. All of the dots showing up in green fall within the acceptable thresholds we have defined, and all of the dots showing up in red are outside them. Those threshold lines are the gray lines on either side of the black line, and the black line represents the mean value of all of the data displayed. So here we're able to quickly see where the outliers are, as they show up as red dots on the screen, and we can hover over one of these red dots to see the actual values.

Another important functionality of this scatter plot is the "1% of total values" lines. These are the dotted lines you see going across the chart, and they tell us what value equals 1% of the country's total ANC1 or ANC2 values. Why is that important? Because if an outlier is beyond 1% of the total ANC1 or ANC2 values, that outlier is affecting the total national values. It is big enough that it is actually changing the ANC1 or ANC2 coverage that we see at the national level. Those should be your highest-priority outliers to address: the ones big enough to throw off national statistics. You can see in this screenshot that we're hovering over one of those, where a facility has reported 4,681 ANC1 visits and only 1,780 ANC2 visits. DHIS2 is automatically flagging that the ANC1 value is way too big, so big that it's throwing off your national statistics, and it needs to be followed up on. Is it technically wrong? We don't know. Could it be a data entry problem? Absolutely.
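To make the two thresholds concrete, here is a sketch assuming the chart uses the common 1.5 × IQR rule and using hypothetical facility counts; the real chart applies whatever method and multiplier it is configured with:

```python
import statistics

def iqr_bounds(values, k=1.5):
    """Interquartile-range thresholds: values outside
    [Q1 - k*IQR, Q3 + k*IQR] are treated as outliers."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    return q1 - k * iqr, q3 + k * iqr

# Hypothetical facility-level ANC1 counts; one facility dwarfs the rest.
anc1 = [120, 135, 98, 110, 142, 4681, 105, 128]
low, high = iqr_bounds(anc1)
outliers = [v for v in anc1 if v < low or v > high]

# The "1% of total" line: an outlier larger than 1% of the national
# total is big enough to shift national coverage statistics.
national_total = sum(anc1)
priority = [v for v in outliers if v > 0.01 * national_total]
print(outliers, priority)  # [4681] [4681]
```

The two checks answer different questions: the IQR bounds say "this value is unusual for its peers", while the 1% line says "this value is large enough to distort the national figure", which is what makes it top priority.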
Could it be a programmatic thing that actually happened, where that facility just happened to have such a large number of ANC1 visits, or a relatively small number of ANC2 visits? Possibly. But at the very least, it's worth following up on.

On the dashboard, we're also able to have scorecards looking at reporting completeness and timeliness, as well as line charts looking at those completeness and timeliness trends over time. We also have bar and column charts to look at internal consistency and dropout rates. So do I have an expected dropout rate? How big is my dropout rate? You're able to show negative axes on the charts in DHIS2. You're also able to upload survey data to DHIS2 to look at external consistency. So I can have my ANC1 coverage indicator coming from my routine data alongside my ANC1 coverage indicator coming from my Demographic and Health Survey, and compare the two, all here on a standard dashboard.

Now, looking at the Reports app: the Reports app is a bit of a legacy application in DHIS2. It's one of the first applications that actually produced some type of analytics for the end user, and because it's one of the first apps developed, it is heavily institutionalized in many countries' data quality standard operating procedures. All of the analytics shown in the Reports app can also be shown in a standard pivot table on the dashboard, but we still support the use of the Reports app simply because so many countries depend on it for their standard operating procedures. The Reports app shows two key things: data set reports and reporting rate summaries. The reporting rate summary shows us, as you can see here on the screen, the number of actual reports, the number of expected reports, the reporting rate, the reports on time, and the reporting rate on time. So it gives you the entire breakdown.
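The dropout rate and the reporting rate figures described above are simple ratios. A minimal sketch, with illustrative numbers:

```python
def dropout_rate(anc1, anc2):
    """ANC1-to-ANC2 dropout, as a percentage of ANC1 visits."""
    return 100.0 * (anc1 - anc2) / anc1

def reporting_rates(actual, expected, on_time):
    """Reporting rate and on-time rate, as shown in a reporting
    rate summary: actual and on-time reports over expected reports."""
    return {
        "reporting_rate": 100.0 * actual / expected,
        "on_time_rate": 100.0 * on_time / expected,
    }

print(dropout_rate(4000, 3400))      # 15.0 -> a plausible dropout
print(reporting_rates(90, 100, 80))  # 90% reporting, 80% on time
```

A negative dropout rate (ANC2 exceeding ANC1) is exactly the situation the charts with negative axes make visible, and it is the same signal the validation rule flags.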
One other functionality of the Reports app is that you can upload resources to it: for example, a Word document outlining standard operating procedures, or a PDF giving instructions or guidance on how to perform some of these data quality reviews. All of those can be uploaded as resources, and those resources can then be linked to the dashboard so that a user has access to these key resources, guidance documents, or tools directly on the dashboard.

Here I just want to reiterate a point on outlier analysis. As I mentioned earlier, we see many countries having issues with detecting outliers in a consistent and timely way. And again, in DHIS2, we have many tools for outlier detection. Here on the screen you see the WHO Data Quality Tool, which is showing you a pivot table; the cells in red are automatically detected as outliers. In the bottom-left corner, you see a year-over-year chart where you can easily see there is a spike, and that spike indicates an outlier. And as we previously discussed under the dashboard, you can make scatter plots and year-over-year charts that enable you to detect outliers. The point being made is that there are many tools to detect outliers in DHIS2. The struggle now is not necessarily to develop new tools, but to make sure those tools are implemented, available, and used at the lowest levels.

To partially address this need, we have developed the DHIS2 Data Quality Academy. The Data Quality Academy walks you through all of the functionalities for data quality and the use of data quality tools in DHIS2. It is free and available online; anyone can enroll and take it. It's self-paced, so you can take it at a rate that works for you based upon your availability.
And it also presents many use cases and examples, so that you're able to go in and see what other countries and other users have done, and maybe use that as a framework or foundation for your own practices.