Thank you for allowing me to present in this academy. It is a unique opportunity to share our experience on the ground, and my presentation will focus on the approach Rwanda used to strengthen the HMIS through DHIS2 and improve data quality. Like many other countries, we faced the challenge of a system that was not really responding to the needs of the Ministry of Health. Before 2011 the system was not web-based, and I remember the reporting rate at that time was 60%. Imagine taking a decision on 60% completeness; you can see the data was not representative. That is when we started looking at a web-based system that could improve the reporting process and the completeness. In 2011 we started reviewing our reporting forms, because we could not have moved to a web-based system while our tools were still not harmonized. Second, we wanted one integrated reporting form, and we also developed SOPs to standardize data collection, so that the methods and interpretation of indicators would be the same across facilities, while reducing the workload on health facilities. Those were the major criteria we used to review the reporting forms. In 2012 we launched the web-based system, built on top of the DHIS2 platform, and in 2013 we developed different tools such as the indicator reference manual. But already in 2011 we had started with the standard operating procedures for data management and for data quality assurance. We then kept integrating other systems, such as case-based systems like eTB, built on top of Tracker. In 2018 we kept strengthening the system by integrating the Data Quality app and other packages.
Before we even brought in the web-based system, the IT solution, we came up with these tools. We developed the standard operating procedures for districts and health facilities that govern data management and reporting, data verification and validation, and we reviewed the data collection tools such as patient files, registers and reporting forms. Later, as I said, we also developed the indicator reference manual to ensure that everything is harmonized in terms of indicator definitions and interpretations, including common errors in interpreting some indicators. That was not all: we also brought in a measure of accountability, the publication of an annual statistical booklet, and the promotion of information use. You may ask why it was so important to standardize these tools and develop these SOPs. First, we had no clear roles and responsibilities at the facility level. I remember in 2011 we used to blame some facilities for not reporting, and they would ask, who was supposed to do it? There were no clear roles defining who does what at which level. Second, we identified a fragmentation of tools, many of them repetitive. Imagine writing the name Andrew in 30 different registers, repeating the same name. So we said, no, it is better to harmonize the tools and ensure that a case is written once or twice, not many times. We also had many fragmented, vertical systems. Another gap was that there were no standard data collection processes; each facility simply reported the way it wanted. And another was the lack of clear indicator definitions, and many more. I will share these so you can read them.
So there was an urgent need to ensure that all of these were harmonized and standardized, with standard operating procedures so that everything is done the same way, and indicators are defined and collected the same way everywhere. When you are harmonizing your processes for data quality, this is what you do: you move people away from the workload where they are dragged into the data, spending much of their time just collecting it and much of their time keeping those papers. So when we talk about ensuring data quality, we mean reducing this workload on health workers so that data collection is simplified and does not take them long. What was the purpose of the SOPs? The data management SOP was to clarify, across facilities, the procedure for data collection (what you do when you are collecting data) and the procedure for data quality assurance (what you do to ensure that the data you collect from the registers matches what you report in the electronic system). It also clarified the storage and retention of health-related records; facilities often used to ask us how long they should keep their registers. Another chapter covered data reporting, setting timelines so that, if you report monthly, we know when data counts as reported on time and when it does not. We also have chapters on data sharing, access and release, and on analysis and dissemination. Under analysis and dissemination we added the key indicators that facilities have to analyze to measure their performance.
All the above chapters highlight the general principles of data management, then roles and responsibilities. If you have someone in charge of data management, what is the role of the data manager? What is the role of the chief nursing officer? What is the role of the clinical director? What is the role of the head of the facility? When everything is well defined, you can tell who did not do what they were supposed to do to ensure that data is reported on time. We also developed the data quality assessment SOP, an operating procedure with four chapters: the introduction (why we have this SOP), the overview of the data quality assessment process, the data quality assessment procedures and data validation, and then something we often do not consider in assessments, giving feedback and debriefing, which is the important part. This SOP defines well how you do the overall assessment, but in a way that you give feedback to the facility so they can act on the gaps you identified. The SOP also includes appendices, because we do not want people to invent their own formats: a final report template; an example notification letter, so if you are going to visit a facility you do not have to write a letter from scratch, you only fill in the facility details; a correction form, so that if you want to make a correction we can track the change logs; and the routine data quality assessment tool. That tool existed before the Data Quality app, but now we are shifting towards the Data Quality app.
The data quality SOP likewise covers the general principles, as I said, defines the roles and responsibilities of each staff member so that everyone in the facility knows their role, and then details the procedures in each chapter. Here is an example of what we have in the data management SOP. First, it defines the types of reports we have, the person responsible for ensuring each report is submitted, the reporting level (central or district), and the time frame: is it immediate, like disease surveillance, or monthly, daily, weekly? Everything is defined. Then the table on my right defines the monthly cycle. From the 25th to the 30th, facilities have to gather all the registers together and ensure that none is missing. From the 1st to the 5th, they report into the system, and the system counts data reported by the 5th as timely. From the 5th to the 15th, the system is open for any data correction, and on the 15th the system automatically locks the data set, so after the 15th they cannot report anything into the system. This gives them the discipline to use each time frame to complete the process that belongs to it; when they know the data set will not stay open for data entry indefinitely, they take the business seriously. In the SOPs and indicator reference manual we also tried to be practical by tracking common errors someone may encounter, which we highlight clearly in the reference manual and the SOPs.
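The monthly reporting window described here can be sketched as a simple status check. This is an illustrative sketch only, not the actual DHIS2 data-set configuration; the function name and cut-off days simply mirror the 5th/15th deadlines mentioned above.

```python
from datetime import date

# Illustrative sketch of the monthly reporting window: submissions by the
# 5th count as timely, corrections are allowed until the 15th, and the
# data set is locked afterwards. (Not the real DHIS2 configuration.)

def reporting_status(submission: date) -> str:
    day = submission.day
    if day <= 5:
        return "timely"
    if day <= 15:
        return "late (correction window)"
    return "locked"

print(reporting_status(date(2023, 4, 3)))   # timely
print(reporting_status(date(2023, 4, 12)))  # late (correction window)
print(reporting_status(date(2023, 4, 20)))  # locked
```

In DHIS2 itself this behaviour comes from the data set's expiry days and data input periods rather than custom code; the sketch only captures the logic facilities work to.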
As I said, we also use the packages to simplify data visualization: the more you want people to use information and improve the data, the more you must ensure that the tools they are using are simple and let them visualize the information they want. These are the different programs, or modules, we have in the system: routine clinical reports; case-based data such as eTB, built on Tracker; medical certification of cause of death; annual reports; daily reports from the hospitals, used by the Ministry of Health to track the daily updates from the facilities; disease surveillance; PBF, performance-based financing; and a COVID module within DHIS2. Regarding mechanisms, I think many people are really interested in the mechanisms we use to ensure the data is of good quality. We have several, but I will highlight the major ones. The one we are all familiar with is routine data quality checks through what we call ISS, integrated supportive supervision: jointly with the development partners, we take data from the HMIS and go down to the facilities to check whether what they reported into the HMIS is exactly the same as what they have in the source documents. That is central-level data checking. We also have program-specific data checking, where the heads of hospitals are brought together (each hospital has at least 12 health centers); when they come together they are given feedback on their data, and from there they are able to check it. We also have validation rules in the system, we promote information use, analysis and feedback from the central-level HMIS team, and we have data set locking and approval.
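The validation rules mentioned above compare two data values with an operator and flag a violation when the comparison fails. A minimal sketch of that idea, assuming invented data element names and values (DHIS2 configures these rules through its UI and API, not through code like this):

```python
import operator

# Hypothetical sketch of a DHIS2-style validation rule: a left-side value,
# an operator, and a right-side value. The rule is violated when the
# comparison is false. Data element names and numbers are invented.

OPS = {"<=": operator.le, ">=": operator.ge, "==": operator.eq}

def check_rule(left: float, op: str, right: float) -> bool:
    """Return True when the rule is satisfied."""
    return OPS[op](left, right)

# Example rule: 4th antenatal-care visits should not exceed 1st visits
# in the same month (a typical cross-check on routine data).
reported = {"anc_first_visit": 120, "anc_fourth_visit": 95}
ok = check_rule(reported["anc_fourth_visit"], "<=", reported["anc_first_visit"])
print("ok" if ok else "violation")  # ok
```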
Then we have performance-based financing, which also promotes data quality, and we have accreditation. Under PBF we have indicators on completeness, timeliness and accuracy, and only when all three of them are at 100% do facilities get the PBF score. With accreditation, most facilities struggle to make sure they reach a certain level. We also have decentralized data review, the monthly data validation done by the facilities, which is an additional part of the data review and the improvement of data quality. Here is an example of the monthly data validation meetings at the facilities. You can see that they come together on a monthly basis and review their tools, checking whether the tallies they made match what is in the registers. You can see people here going through the registers, while the others are presenting the data that has been reported in the system, to validate it before the data set is locked. Here they are also checking whether what they entered matches the source documents. On the data use side we also have coordination meetings, the monthly meetings at the district hospitals: each district hospital has a catchment area, and that catchment area comes together to review the indicators and check whether they are really on track. These monthly coordination meetings also promote information use. We also have the quarterly DHMT meetings, which bring together the overall district; each catchment area has facilities, with selected people attending, and they use the meetings to present the health status of their catchment area.
All of this is aimed at improving information use, but it also contributes to data quality. And not only that: at the central level we have a senior management meeting that also uses the data, and development partners who also look at data quality. All of that contributes a lot to quality data. Like many other countries, we also use the global tools, which have really brought a huge change in data review and checking, such as the WHO Data Quality Tool. It is a really nice tool that we have institutionalized in our standard operating procedures, and we have trained our people to use it, because it can show you the outliers and the inconsistencies in your data when you analyze it as a trend. Here is an example of the WHO tool and some of the charts that we generate. And thank you, Scott and Bob: we really worked with Scott and Bob to ensure that our teams in Rwanda are trained to use it, in a collaboration between the University of Oslo, WHO and the Ministry of Health. To maximize the tool's use, we reviewed our standard operating procedures to accommodate some of its features and to ensure it is part of everyday life in the health facilities. We have also been working with Scott, Bob and the Ministry of Health team on dashboards that allow people to monitor their completeness at facility level and the trend of their data over time, a way of simplifying the visualization of data at their fingertips. And whenever I present this, people always ask me about sustainability: how do we ensure that whoever we train stays in place, that nothing is compromised, and that we have continuity?
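The outlier screening the WHO Data Quality Tool performs on a monthly series can be illustrated with a very small sketch. The real tool offers several methods (including modified z-scores); this shows only the basic idea, with invented numbers, assuming a z-score threshold of 3.

```python
import statistics

# Simplified sketch of outlier screening on a monthly series: flag months
# whose value is more than `threshold` standard deviations from the mean.
# The WHO Data Quality Tool itself is more sophisticated; this is only
# the underlying idea, with invented data.

def flag_outliers(values, threshold=3.0):
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)
    if sd == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / sd > threshold]

# One month (index 4) has an implausible spike, e.g. a data-entry error.
monthly_cases = [210, 198, 225, 190, 2050, 205, 215, 220, 208, 199, 212, 201]
print(flag_outliers(monthly_cases))  # [4]
```

Flagged months are then checked against the facility registers, exactly the register-versus-report comparison the ISS visits perform.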
That is part of why I like these academies. The academy approach is really a good framework to ensure that countries have sustainability and continuity of these good interventions. In Rwanda we have hosted different academies, and even before COVID we were attending them. What we did was have our people attend academies like the Data Quality Academy, and we tried to maximize it by ensuring that each program has a good number of people able to customize DHIS2, configure the Data Quality app, and support the facilities. This is also a safeguard: even if one person leaves the ministry or the facility, we have someone who can train the rest or replace them. I would encourage countries to attend these kinds of academies to gain skills, because the more you attend, the more you learn about new features and the more insight you get from other countries' stories. At the same time, while implementing these good initiatives we had constraints around training: we used to train when we had money for training, but we wanted to train based on the needs from the ground, not based on the funding available. So we thought about e-learning, a remote way to train people, because the more new features we have, the more we need to train people, and the more gaps we see, the more we need to train them. We came up with an e-learning platform that is now online, and we have already managed to train all health centers, district hospitals and district administration staff on the data quality processes, much of it to ensure the Data Quality app is well used.
Our slogan is that it is time to move learning to people instead of moving people to learning. You may ask what the contribution is of quality data, of having the web-based system in place, and not just any web-based system but DHIS2: the more you keep it updated, the more features you gain that can improve your data quality and also raise the confidence of policymakers in using your data. Most of the contribution is on policy and strategic decisions, because we are lucky to have one integrated health management information system that collects all the data, which makes it trusted and used in planning, program planning and management, resource management, capacity building, disease surveillance (where we also use DHIS2), innovation and research, and M&E. Even though I have presented some interventions around data quality, we still have a long way to go; we are not where we want to be. We are still promoting information use, because it is not yet at the level we want, and we are now focusing on the facility level: at the central level use is good, but it is much better to keep pushing facilities and districts to use data themselves, instead of us telling them that, say, malaria has increased. We are now promoting the approach of having the information come from them, bottom-up.
We are also now installing different features and WHO packages to make sure we really maximize their use, because most of them are developed based on global standards, and since Rwanda follows those global standards we take advantage of them: experts have really concentrated on bringing together information that can help, and countries can take advantage of it. What I like is that these packages are free, which lets us really work with them and use them to improve what we are doing in our countries. Of course, when you put these features in place you have to consider building capacity, and those are the future plans we have, to ensure that we at least maintain what we have adopted and implemented across the facilities. As I said, I wanted to show you this for Rwanda (I do not know about other countries): even with all those good interventions in place, we still have much more to do on the analytical features, or favorite views. You can see that most users are using pivot tables, but the system does not only have pivot tables; we have dashboards (the black line, which you can see is still low), and we need people really using the dashboards and the data visualizer, the charts, maps and other features that promote information use. We have already started an initiative with the University of Oslo to find out why people prefer pivot tables over other features, which is why you can see many features being released based on field feedback.
Without taking long, these are the links. I can share the link where you can access all these tools: the SOPs, the reporting forms, the metadata dictionary (which is the indicator reference manual), the annual statistical booklet that shows what we are reporting, and the templates; the list of health facilities can also be found on the website. So thank you. Let's keep to better information, better decisions and better health. This is the end of my presentation. You are welcome, Scott, to take over and guide us to the next step. Thank you.

Great, thank you so much, Andrew. Have you been able to measure the data quality improvement, or the data use improvement, over time? How have you been able to monitor that?

Okay, thank you. First of all, we have different articles, which I will share, that have measured different indicators. More specifically, we have been working with partners in health, and we also have the assessment that we do, the ISS. On the measurements, we have a scoring range: if the error is around zero to five percent, that is okay, but if the error is above five percent, it means we need to put in much more effort to reduce the margin of error. When you look at the results from the ground and the studies that have been done, you can see that the data quality is pretty good for the data we collect in Rwanda. But it depends on the indicator: some indicators that are not much used may be lagging behind, while the key indicators really have over 90 to 95 percent quality.
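The five-percent threshold described in this answer amounts to a simple accuracy check between the recounted register value and the value reported in HMIS. A minimal sketch, with invented numbers (the real ISS tool is a structured paper/electronic checklist, not this code):

```python
# Sketch of the ISS accuracy check: compare the value recounted from the
# facility registers with the value reported into HMIS; the margin of
# error must stay within 5% for the indicator to pass. Numbers invented.

def error_margin(recounted: int, reported: int) -> float:
    """Percentage difference between reported and recounted values."""
    if recounted == 0:
        return 0.0 if reported == 0 else float("inf")
    return abs(reported - recounted) / recounted * 100

def passes_iss(recounted: int, reported: int, tolerance: float = 5.0) -> bool:
    return error_margin(recounted, reported) <= tolerance

print(passes_iss(recounted=100, reported=103))  # True  (3% error)
print(passes_iss(recounted=100, reported=110))  # False (10% error)
```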
Great, okay, thank you. A couple of additional questions. What was the process you went through to develop these data quality standard operating procedures, and how have you been able to implement them? You talked a little bit about the remote training, but maybe give a little more clarity on how the standard operating procedures were developed and how you implement them in practice.

Thank you, that is a good question. First of all, before you even develop the standard operating procedures, you have to understand your gaps and what you want to address. Remember that 2011 was a whole year of preparation, and we spent it really understanding the gaps we had. We already knew that the reporting rate was 60 percent, and we wanted to improve the reporting rate and the completeness. From there we identified different gaps. One was that the reason people were not reporting was that they did not know their roles and responsibilities, so we said we need to come up with something that really defines what Andrew has to do, what another person has to do, and what the synergy between them is; all of that had to be documented, together with the reporting process. We were also moving from a locally installed system to a web-based one. The local system was a little tricky: before 2011 we were forced to go to each facility and install it on individual computers. With the web-based system we had to ensure that whatever would be accessible online was used the same way, with standardized reporting, because we did not have time to go down and train or mentor them at every facility. I always give this example from before 2011: there is one colleague of mine from a facility that is an eight-hour drive away;
he came from that facility with a flash drive carrying a backup that I was to restore on the server. When he was at the MOH, I checked the flash drive and there was no backup on it; a virus had eaten the backup. So the guy was forced to travel back to the facility and come again. You can understand the time wasted in those days. With a web-based system you avoid that, but you still have to ensure that everything is standardized. Regarding training our teams: because it was the first web-based health management information system we had in the country, we used training of trainers, the TOT approach, to reach everyone in the country and transmit the same message. That is what we did; today there may be many other ways to do it, but that is how we did it. So the SOPs were much more about addressing the gaps we identified and standardizing everything regarding data management, verification and the rest, because even users used to ask us: you tell us to verify the data, but how? You could tell them to do ABCD, but a colleague could describe a different way to do it. And then there is the evaluation through the ISS, integrated supportive supervision. What does the ISS do? It goes down to assess whether what we documented in our SOPs is implemented. Before, we would go down and not know what to evaluate. So we said, let's have a standard we can evaluate against: if we say facilities have to hold a meeting once a month, is it done? If they have to document a report, is it documented? Those are the kinds of things we wanted in place, to be evidence-based rather than theory-based. I don't know
if I answered your question.

No, you did, and I think the clarification on how you actually go down and validate or verify against the standard operating procedures is a really critical step. From all of the case studies that have been presented, hopefully one of the very clear messages is that standard operating procedures are absolutely necessary, and you have taken it a step further by having a process to actually verify whether the standard operating procedures are being followed; I think that is really interesting. We have time for a couple more questions, and they are coming in quite quickly now. One question: how is data quality work done with the facility staff? Do you call them into one place, is there a meeting; what is the standard operating procedure for the facility staff themselves to conduct data quality checks?

Thank you, Scott, that is a good question. We have two ways. One is the data checks documented in our SOPs, which require a certain group of people: at a hospital, the chief nursing officer, the clinical director, the data manager, the M&E officer and some people from the services who record the data come together and review the data before they report it into the HMIS. This is done either before the 5th, because the 5th is when we count reporting as timely, or within the 15 days they are given, from the 1st to the 15th, to undergo that process. Within those 15 days those key people I mentioned have to sit down (I have even shown the picture), and reviewing means they bring the registers and check whether what they counted is the same as what they reported. After that meeting they also have what we call the coordination meeting, which is similar, only that the previous meeting focuses
on checking the quality of the data, whether anything was reported wrongly, while the coordination meeting is the catchment-area meeting that brings together the health centers of that catchment area. There, the M&E officer at the hospital presents the key indicators, saying, for example, that we had three maternal deaths last month and one the month before. They may then ask people in the maternity ward why it happened, and as those people explain, they take actions: they may say we have to improve case management in maternity, we have to improve the way we treat the mothers, we have to ensure the ambulance does not take long when transferring a mother to the facility. Those are the kinds of discussions they have, and at the end of the day the actions they take sit on top of the real data they took from the system. And in case a wrong number was reported, you can understand that someone in maternity will say, no, we did not have four maternal deaths, we only had one, so I need to understand where the other three are coming from. So these meetings do two things: one improves the quality of the data, and the other improves both information use (the actions) and the check of whether what the data says matches what people believe they have in their services. And as you said, Scott, whatever is written in the SOPs is evaluated every six months: each six months we go down with the development partners for the ISS, integrated supportive supervision, and check whether facilities are really implementing what is written in the SOPs. That makes them make sure that whatever we have in the SOPs is implemented
and used. We also ensure that the SOPs define well those data-check meetings: what has to be checked, who chairs the meeting, who replaces the chair if that person is absent, and how the meeting is run. You have seen that we have appendices covering the type of report they have to document, the actions they have to take, and how they have to conduct the next meeting by reviewing the previous actions. Those are things we managed to document well so that the SOPs also guide the facilities. Thank you.

Great, that is very clear, thank you, Andrew. We have time for one more question: what is your experience in sharing data with stakeholders, maybe implementing partners and NGOs? How do you regulate the number of data managers in the system, and do you have any challenges with this?

That is a good question; it is a discussion everywhere I have visited. For the facilities and districts, by default all of them have to have access, and we give them access without any extra requirement, because the data they report comes from their own institutions. When it comes to development partners, we also have a mandate to ensure they are part of the process we undergo. What we do in HMIS is assign a focal person to each development partner, who ensures that whatever they request is shared with them. If today a new NGO comes to support the health sector, what do they do? They just make a request, saying, I will need this data to monitor our interventions. We respond, from the Ministry's side, and say, we will be giving you this data; please contact this focal person, who will ensure that whatever data you need is at your
disposal. It doesn't take long to go through the process, so that's how we are doing it. Currently we are not giving everyone access to the system; we keep access limited, but in a way that lets us serve our stakeholders by assigning a focal person. We are also now developing the data warehouse, a portal that will make some of the data available to partners. And we have an obligation to publish, on an annual basis, the statistical booklet that highlights all the key indicators, which anyone, a researcher or a DP, meaning a development partner, can use when reviewing their interventions. Thank you.

Okay, and sorry, one last question here. I think it's a good one, and one I was wondering about myself: are there any penalties, or any rewards, for actually following the SOPs? And if facilities are not following the SOPs, are there any penalties when you actually go into the field, with the ISS?

That's another good one. What we did was link together some of the mechanisms in the accountability framework that we have. In Rwanda, each staff member has what we call KPIs, key performance indicators. If I, Andrew, am working in a facility, I have key performance indicators linked to my terms of reference and to what I have to deliver on a daily basis. That's one. And a portion of PBF, performance-based financing, is linked to my KPIs. But it's not only that: we have managed to link all the mechanisms that promote good data quality and reporting. One is performance-based financing, where people are evaluated on a quarterly basis on different indicators. For example, in data
management, for data quality, we have three indicators: completeness, timeliness, and accuracy. All three are in performance-based financing, and if one indicator is not done well, you get zero; there is no tolerance. It means they have to make sure that all those key indicators are reported, and reported well: for timeliness, you reported on time, 100 percent; for accuracy, you are within a range of five percent; for completeness, you are at 100 percent. Then you get 100 percent. But if one of those is maybe 80 or 60, you get zero for everything, so it means you have to make sure that everything is 100 percent.

Another thing is accreditation; we have managed to link with that as well. When they are moving facilities from level one to level two, level three, level four, they also review these tools: they check whether you have the SOP in place, whether you understand the chapters of the SOP, and whether you follow the process the SOP describes. They check different components, and when they come to data management they check those areas. So if I am facility A and I see that those components really helped my facility reach level two, that is also a motivation.

Another one is e-learning, which we are bringing in. Facilities have been telling us that we train them but don't give them certificates; now we are building a database that records who was trained in a specific period, and they also get certificates. Those are the things that really motivate the facilities: interlinked frameworks in place that motivate them. And as you go on, people start having the attitude of doing it themselves. Maybe one time when you visit us you will see that when the reporting deadline comes, everyone at the facility will be telling
you, oh, we are late to report; they will really be rushing to ensure they report on time, and they make sure everything is accurate, because they know that on a quarterly and six-monthly basis a team of people will go down to check whether what they reported is the same as what they have. Another thing is that after the ISS, the integrated supportive supervision, the feedback is submitted officially; it is the Minister who signs the feedback. Imagine if I am Andrew and I see a letter from the Minister saying that my department is not performing well. So they really make sure that the feedback comes when it is positive instead of when it is negative. And we don't only report what each facility has done specifically; we also give general feedback that can improve the system. We might say that, generally, people don't conduct data validation meetings; that becomes general feedback, and that also drives improvement, because you get your specific feedback plus the general feedback on what we found everywhere, and at least you make sure that next time it is not you who is highlighted in the report.

So I think those are the key things. We have other steps as well: development partners, in their support areas, also conduct self-checks to ensure that people understand how to check the data; districts supervise the lower levels; the district hospitals supervise the health centers; and all of this is documented in the SOP, so we also go down and check whether the hospitals really supervise their health centers. All those mechanisms that bring in supervision, accountability, and roles and responsibilities contribute a lot when it comes to data quality.

That is probably the best way to end the entire academy. Andrew, you actually have the honor of being the very last presentation in the entire academy, and I think this
is a really great one to end it on.
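The all-or-nothing scoring rule Andrew describes for the performance-based financing data quality indicators (timeliness at 100 percent, completeness at 100 percent, accuracy within a five percent range, and zero for everything if any one fails) can be sketched in a few lines of Python. The function name, parameter names, and the way the thresholds are expressed are illustrative assumptions for this sketch, not part of Rwanda's actual PBF tooling:

```python
# Illustrative sketch of the all-or-nothing PBF data quality score
# described in the talk. All names and thresholds are assumptions made
# for illustration, not the real scoring implementation.

def pbf_data_quality_score(timeliness_pct, completeness_pct, accuracy_error_pct):
    """Return 100 only if all three criteria pass, otherwise 0.

    timeliness_pct     -- share of reports submitted on time (must be 100)
    completeness_pct   -- share of expected reports received (must be 100)
    accuracy_error_pct -- discrepancy between reported and verified values
                          (must be within a 5 percent margin)
    """
    timeliness_ok = timeliness_pct >= 100
    completeness_ok = completeness_pct >= 100
    accuracy_ok = abs(accuracy_error_pct) <= 5
    # No tolerance: one failing indicator zeroes the whole score.
    return 100 if (timeliness_ok and completeness_ok and accuracy_ok) else 0

print(pbf_data_quality_score(100, 100, 3))  # all three criteria met -> 100
print(pbf_data_quality_score(100, 80, 3))   # completeness at 80 -> 0 for everything
```

The point of the design, as described above, is that a facility cannot trade one indicator off against another: an 80 percent completeness score is worth exactly as much as zero, which pushes facilities to get every indicator to target.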