One use case worth discussing quickly in this presentation is the Rwanda data quality case study. Rwanda has developed very clear and specific routines for data quality reviews and checks, and these checks are done at every level: regional, district, and facility. They have defined very specific data quality standard operating procedures (SOPs) that spell out what needs to be done for data management and reporting as well as for data validation and data quality. Multiple data collection tools are incorporated into the SOPs: patient files, registers, and reporting forms. They also have an indicator reference manual, a metadata dictionary that goes hand in hand with the SOPs, so that users know exactly how each indicator is defined and which key metrics are used with it. Their HMIS requires facility-level reporting by the fifth of each month; that is what is defined as a timely report. Data correction is allowed between the fifth and the tenth of the month, so during that window users at facility and district level can change data without having to ask permission. They do their own data quality desk reviews, perhaps some validation or verification of the data, and if they find any issues they correct them between the fifth and the tenth. After the tenth of the month, the data is locked, meaning no further changes can be made unless a formal request to change the data is submitted. That formal request is a fairly involved procedural activity: the facility or district has to send a paper request with justification and rationale for why the data needs to be changed.
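As a quick illustration, the monthly window described above boils down to a simple date check. This is a minimal sketch, assuming only the day thresholds given in the transcript; the function name and states are my own, not part of Rwanda's SOP or of DHIS2:

```python
from datetime import date

# Day thresholds from the SOP as described: reports are due by the 5th,
# corrections are open from the 5th to the 10th, and data is locked after
# the 10th (changes then require a formal paper request).
REPORT_DEADLINE_DAY = 5
CORRECTION_CLOSE_DAY = 10

def reporting_status(today: date) -> str:
    """Return the submission state for the current month's data."""
    if today.day <= REPORT_DEADLINE_DAY:
        return "open"        # timely reporting still possible
    if today.day <= CORRECTION_CLOSE_DAY:
        return "correction"  # facility/district may edit without permission
    return "locked"          # only a formal change request can alter the data

print(reporting_status(date(2024, 3, 4)))   # open
print(reporting_status(date(2024, 3, 8)))   # correction
print(reporting_status(date(2024, 3, 15)))  # locked
```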
So here we see the steps of the data validation process as defined in the Rwandan standard operating procedures. The first step covers service delivery data at health facility level: validation of the data that has been reported. The health facility staff hold a meeting, review the data looking for data quality issues, and submit a meeting report. The next step is at the clinical director level, where the clinical director validates the data submitted by all of the service heads and again submits a meeting report on that validation. The third step is that the data manager rechecks the data; this is what happens between the fifth and the tenth of the month, with the data manager reviewing the submitted data and reaching out to the clinical directors and service heads to address any data quality issues that may be found. Finally, in step four, the head of the facility signs a report certifying that the data has been checked through the three previous steps. At that point there is a coordinating meeting where the data is reviewed again and everyone agrees that the data is clean and ready to be submitted. Rwanda also has a process for reviewing this process: an organization called Integrated Support and Supervision supports the process and checks that it is being followed. Rwanda, like most countries, is very concerned with the completeness, accuracy, integrity, and timeliness of the data. Completeness means all data fields have been completed. Accuracy means the data reported should be the same as the source documents; that is what we defined earlier as verification.
Integrity means the data is free of any kind of personal manipulation; the data has not been tampered with. This integrity dimension is very important, and it is reinforced in their standard operating procedure by the rule that the data is locked and cannot be changed after the tenth of the month. And finally timeliness: the data is prepared and sent in on time. At central level they conduct an internal supportive supervision every six months, when the central level reaches out to districts and health facilities. They have built data quality validation rules, and they present those rules and the resulting analysis back to staff at district and facility level. They review data use and the HMIS analysis and feedback. They review the process for locking and approving the data sets. They also assess performance, and performance is tied to financing and accreditation standards. So here you see that supervision and support are integral to their routine data quality practices, and we find this to be a very good best practice: supervision and support should be provided to all health facilities and all districts at some regular frequency. During that supervision and support, those conducting it need to go through, just as Rwanda has outlined, data quality, data use, feedback, and adherence to policies, procedures, and standard operating procedures. In Rwanda's case they have also tied it to performance-based financing and even accreditation. The last point to make here is that Rwanda follows a somewhat decentralized model, where facilities are responsible for conducting their own data quality reviews at the lowest levels within their catchment areas. So again, the facility goes through the process of conducting those data quality reviews, signing off at the various points, and then has the opportunity to change its data between the fifth and the tenth of the month.
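To make the accuracy check concrete: comparing reported values against a recount from the source documents is conventionally expressed as a verification factor. This is an illustrative sketch, not taken from Rwanda's SOP; the function name and the example numbers are assumptions:

```python
def verification_factor(reported: int, recounted: int) -> float:
    """Recounted (source-document) value divided by the reported value.

    A factor of 1.0 means the report matches the registers exactly;
    below 1.0 suggests over-reporting, above 1.0 suggests under-reporting.
    """
    if reported == 0:
        # Nothing was reported; a zero recount is a perfect match.
        return 1.0 if recounted == 0 else float("inf")
    return recounted / reported

# e.g. 95 visits found in the facility register vs 100 reported to the HMIS
print(round(verification_factor(100, 95), 2))  # 0.95
```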
So now let's take a look at some of the data quality dashboards that they have started to develop in Rwanda. Here is an example of a dashboard monitoring reporting completeness for the maternal and child health data set. You can see that the dashboard uses many different line charts to measure the reporting completeness of different types of health facilities. The top three charts show the same type of metric: reporting rates broken down by facility type, so referral hospitals, provincial hospitals, district hospitals, and so on. What they are able to do in Rwanda is quickly identify the types of health facilities with the lowest reporting rates, which is interesting because different facility types may face different challenges and different types of supervision or oversight. Knowing which ones are performing worst gives them a starting point for corrective steps: reaching out to those facilities, perhaps rewriting standard operating procedures, or putting new policies in place to improve reporting completeness. This is just one example; you can see at the top that they have several other dashboards to monitor data quality. We won't go into those other dashboards, but the point is that there are countries producing very good data quality dashboards, sharing them, and building standard operating procedures around them, and Rwanda is definitely a leading example of that. Another key point of the Rwanda case study is how they conduct monthly coordination. Again, they have outlined a very specific standard operating procedure, and within it they have identified the key decisions and stakeholders.
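A reporting-completeness dashboard like the one described reduces to a simple ratio per facility type: reports received divided by reports expected. A minimal sketch, with made-up facility types and counts (real dashboards would pull these figures from the HMIS):

```python
# Illustrative counts only, not Rwanda's actual figures.
expected = {"referral hospital": 4, "district hospital": 30, "health centre": 500}
received = {"referral hospital": 4, "district hospital": 27, "health centre": 430}

# Completeness per facility type, as a percentage.
completeness = {
    ftype: 100.0 * received[ftype] / expected[ftype] for ftype in expected
}

# Listing the worst performers first points to where corrective steps are needed.
for ftype, rate in sorted(completeness.items(), key=lambda kv: kv[1]):
    print(f"{ftype}: {rate:.1f}% reporting completeness")
```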
So in health facilities they conduct monthly coordination meetings, and at district level they conduct these monthly meetings as well. At central level they hold meetings for planning, policymaking, and adjusting interventions at a routine frequency. They also communicate with development partners, providing data to different partners at different points. Again, these are all done in a routine way, and there are specific standard operating procedures for them. Some of the analysis they have conducted based on their supervision and support is very interesting, in that they actually categorize the types of data quality issues they are seeing. Here we see some data from their supervision and support process showing what kinds of issues are causing the various data quality problems they have detected and corrected. You can see that by and large the vast majority of them are counting errors: facilities incorrectly counting from the paper records into DHIS2. There are other problems as well, for example missing information in registers, typing errors, missing data collection tools, and misinterpretation of an indicator. But the vast majority seem to be associated with counting errors, at least in Rwanda, and I think it is reasonable to expect that this would be similar in most countries.
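The categorization exercise described here is essentially a tally over a log of detected issues. A sketch with a hypothetical issue log: the category labels mirror the ones mentioned above, but the entries and counts are invented for illustration:

```python
from collections import Counter

# Hypothetical supervision findings; each entry is one detected issue.
issues = [
    "counting error", "counting error", "counting error", "counting error",
    "typing error", "missing information in register",
    "missing data collection tool", "misinterpreted indicator",
]

tally = Counter(issues)
for category, n in tally.most_common():  # most frequent category first
    print(f"{category}: {n}")
```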