Okay. Well, welcome to the next session, everyone. I'm Scott Russpatrick. I'm the DHIS2 Analytics Product Manager, and I am really excited to host a session on data quality and exactly what we're doing with DHIS2 for data quality. We're going to have three really interesting case studies come up, as well as some nice tricks and tips presented at the beginning. But first I'm just going to say a few quick words about the program and introductions. So, first of all, I'm giving the introduction, with just a few more things to say on that. And then we are going to have Bob Pond, who can't join us live. He's on the West Coast of the U.S., so it's a little bit early for him right now, but he has made a video for us. Bob is really one of the global experts on data quality, working with us and the WHO, and he's prepared a quick presentation on how to use DHIS2 standard dashboards for data quality. And then we have three really interesting case studies coming. So, first, Angela Abba from Nigeria, on exactly what Nigeria has been doing for data quality checks. Then we have Tina Kunmunja. Sorry, I just destroyed your surname. I'm so sorry. That's okay. Kunmunjim. I apologize for that. And Tina's going to be taking us through exactly what the WHO has been doing for workforce monitoring and accountability and the data quality checks that they've instituted there. And then, finally, Joseph Fanner is going to take us through exactly what Madagascar has been doing for their data quality checks. Again, just as a reminder, if you have any questions at any point throughout the presentations, we might hopefully have a few minutes at the end to answer those, but more than likely we won't. So please post your questions to the Community of Practice. I'll be having a look at that while the others are presenting, and hopefully we can get back with answers as soon as possible.
One quick point to add to the introduction is what we are doing in collaboration with the WHO. Really, we have been moving step by step with the WHO as they've been updating and improving their data quality approaches. We've been trying to build the same approaches into the technology, so that DHIS2 supports the WHO approaches, and so that we have the training materials and guidance documentation to support the use of DHIS2 in the various WHO data quality approaches. And actually Andrew, in the last presentation, made reference to what they're doing in Rwanda, which is really cool. Essentially, the WHO has two standard approaches. They have the WHO annual data quality review, and we have developed in partnership with them the WHO Data Quality App, which is there to help countries through this review process. It will actually produce an annual data quality report automatically for countries, based upon the standard WHO format. We've also been working with them to support more routine, lower-level district data quality reviews. Again, Andrew actually made reference to this and had a wonderful picture of the data verification and validation exercises they do on a monthly basis in Rwanda. But we're also trying to build out tools and functionalities in DHIS2 to help the user perform these data quality reviews, and I'm going to highlight some of those; Bob, in the next presentation, is going to highlight those as well. And finally, all of this is coming together in our data quality tools, training documentation, and guidance. A couple of quick links for you here: the data quality guide, which we produce, covering how to use DHIS2 for data quality. The various tools, tips, applications, everything that you need to know about how to really get DHIS2 helping you out with data quality is at the first link there, dhis2.org/who-dataquality (or /dq).
And then the second link there is to the WHO website, where they have all of their various data quality tools and guidance documentation. Really a treasure trove of information there, and a lot of really great support documentation. Please have a look at these two. Again, we're trying to make sure they line up and sing the same song, so to speak, but there's a lot of useful information. And then we are trying to summarize and condense all of this information into a data quality academy. We actually have the next data quality academy coming up. It's a Level 2 academy, meaning you need to be a bit proficient with DHIS2; we will get fairly technical on how to configure DHIS2 for data quality checks and how to really utilize some of the more advanced data quality functionalities. That academy is coming up October 19th through 30th. Registration is open on our website, dhis2.org/academy. If you're interested, please go ahead, head over to the academy page, and register. We're very interested in having country teams attend that academy. We're specifically devoting some time to work one-on-one with countries to implement some of the tools and guidance directly into their HMIS if they're interested. But really, this academy is the go-to place to get all of this information condensed down into something that's a bit more actionable. So with that, I am going to switch over to another presentation, so bear with me as I change screens. And because Bob can't join us, because he's hopefully still in bed, he's produced a video for us, like I said. So I'm going to play the video, and Bob's going to take us through how to utilize DHIS2 standard dashboards for data quality checks. Good afternoon. My name is Bob Pond. I'm a consultant working for the University of Oslo and the World Health Organization.
In my brief presentation on standard data quality dashboards, I'm going to share with you a couple of tips on how to use the standard DHIS2 applications which you are all familiar with, the Data Visualizer app and the Pivot Table app, to visualize suspicious values and place them as alerts on a dashboard. The first tip involves using charts to show month-to-month trends and reviewing these charts, particularly at a decentralized level. The second tip is to make use of pivot tables that present a summary of all the extreme outlier values for a collection of key indicators; these are extreme outlier values which have been identified with DHIS2 predictor rules. So let me go ahead and get started. Many of you are probably familiar with how the values of immunization indicators and indicators on maternal health services don't vary a great deal from month to month. And when you configure a chart like this, such as may appear on a WHO standard dashboard, you typically find lines that are almost straight and almost horizontal when you are presenting the trends in immunization services or in maternal health services. But occasionally when you configure a chart like this, you'll find an indicator which shows an aberrant jump in the value for one or several months, such as these values for antenatal care first visits. These are classic outliers. The challenge with using a month-to-month trend chart to identify outliers is that when the chart is viewed at national level, it is only going to reveal the very largest of extreme outliers. And the chart might provide a false sense of security that there are no outlier values for some of these other indicators, such as Penta 3rd doses. In fact, if we had a more sensitive tool, we'd be able to identify extreme outliers even in these other indicators.
Many of you are familiar with the excellent job done by the WHO Data Quality Tool in rapidly screening the data sets and identifying all of the outlier values. In this case, for Penta 3rd doses, we found all of the district-level extreme outliers in the data sets. And we can even use the tool to drill down and identify the specific health facility which is responsible for the extreme outlier. Now, that's great. The use of the WHO Data Quality Tool is to be encouraged and promoted, and people should make greater and routine use of it. But we also need ways of bringing evidence of these suspicious values to the user, so that they don't have to log into a separate application in order to see numbers that just don't look right. So another solution is to decentralize the process of data quality review to the level of the individual district. When people at district level view the same month-to-month trend chart, that chart is much more sensitive for picking up suspicious numbers. In this case, we've filtered the standard dashboard so that it is showing results only for one specific district. And of course, when the district logs into DHIS2, this is the version of the chart that they will see. Again, the chart is showing the values of a couple of maternal health and immunization indicators. But it shows that at the level of district A2, there is this quite suspicious rise in the number of reported third doses of penta vaccine. So the chart can pick up outliers that have a magnitude on the order of a thousand or two when it is viewed at the level of the district. And it's good to configure this type of chart to look not only at the values over the previous 12 months; you can extend it backwards in order to pick up historic outliers from previous years.
This type of chart can be used not only for maternal health and immunization indicators; you can also use it to pick up outlier values in such things as the number of clients currently on antiretroviral therapy, or you could even use it to pick up outlier values in the number of diagnoses of diseases which you don't expect to be seasonal, such as the number of notifications of tuberculosis that are recorded quarterly. When we see this outlier at district level, one of the things that we immediately want to know is: where did it come from? Which health facility? And is there a way to identify that health facility? Well, we could use the WHO Data Quality Tool, but that would require leaving our dashboard. So that brings us to the second tip, and that is that it is possible to configure a pivot table like this, which summarizes in a single pivot table the values of all of the extreme outliers in this district for multiple data elements or indicators. The pivot table actually identifies the specific health facility and the specific month in which the outlier was reported. In this case, the pivot table has been configured with a legend that highlights in red extreme outliers that have a value greater than a thousand. Now, you should be asking yourself, where do these outlier values come from? Do they come from the WHO Data Quality Tool? Well, no, they have been generated by what is called a predictor rule. And predictor rules are amongst the features that are found in the Maintenance app. In this case, predictor rules have been configured to identify a couple of different types of Penta 3 outliers. How does a predictor rule work? Well, it uses a generator expression, which you see here, to evaluate the data on the number of Penta 3rd doses given and determine whether the value of that indicator exceeds a certain threshold.
And if it does, then the value is exported to a new data element that has been configured for this predictor rule, the Penta 3 outliers. And it is this data element which is used to configure this pivot table. Before closing, let me touch on the challenges of identifying suspicious values of indicators which show a seasonal trend, such as the number of reported malaria cases. This year-on-year chart shows that each year there is great variation from month to month in the values of malaria. And this makes it exceedingly difficult to use even the WHO Data Quality Tool to identify suspicious numbers; you can't tell whether the number is due to a possible error or due to a seasonal increase. However, there are workarounds, and this chart shows an example of one such workaround, which looks at the percentage of malaria cases that are in children under five. It identifies months when the age distribution of malaria cases is quite abnormal, such as this value of less than 1% of malaria cases for this facility in February of 2020. So with workarounds such as this, it is also possible to look for anomalies, and therefore suspicious values, in seasonal data. Thank you. Okay, so that's the end of Bob's presentation. I realize that he went quite technical very quickly, and I want to let you know that if you have questions about the predictors, or about identifying standard deviations or outliers with predictors and being able to put those onto dashboards, please don't hesitate to put that question in the chat. We have guidance documentation specifically on this, and I will definitely forward that to you and also talk with you and work through the process with you. It's something that we definitely want to promote amongst the community, as it can be a very powerful tool. So now I'm going to give us another glimpse of a seldom-utilized but very powerful functionality of DHIS2 specifically for data quality, and that's validation rule notifications.
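Bob's under-five workaround above can be sketched as a simple proportion check. This is purely an illustration in Python: the plausibility bounds and the monthly figures are made-up assumptions, and in DHIS2 itself this logic would live in an indicator with a legend, or a predictor rule, rather than a script.

```python
def flag_abnormal_proportion(cases_u5, cases_total, lo=0.05, hi=0.80):
    """Flag a month when the share of malaria cases among children
    under five falls outside a plausible range.

    The bounds are illustrative, not WHO-endorsed thresholds: in a
    setting where under-fives usually account for a large share of
    cases, a month at under 5% is suspicious regardless of season.
    """
    if cases_total == 0:
        return None  # nothing reported, nothing to judge
    share = cases_u5 / cases_total
    return share if (share < lo or share > hi) else None

# Seasonal totals vary a lot, but the age split should stay stable:
months = [
    ("2019-12", 40, 210),  # roughly 19% under five
    ("2020-01", 55, 260),  # roughly 21%
    ("2020-02", 3, 480),   # under 1%: the suspicious month on the chart
]
for month, u5, total in months:
    flagged = flag_abnormal_proportion(u5, total)
    if flagged is not None:
        print(f"{month}: abnormal under-5 share {flagged:.1%}")
```

The point of the ratio is exactly what Bob describes: the total case count swings with the seasons, but the age distribution should not, so a check on the proportion stays sensitive even in seasonal data.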
So one of the things to point out quickly here is that most of what we're doing on a day-to-day basis, most of all of our jobs, revolves around our email in some way. I typically like to ask: everyone here is a DHIS2 expert, a DHIS2 user; how often do you check your DHIS2 dashboards? Have you checked them today? Have you checked them this week? Oftentimes you find that even DHIS2 administrators in countries rarely check their dashboards. But how often do you check your email? You're probably checking your email right now, some of you. And the point to be made is that we need to send the data to where people are actually spending their time, and especially send the alerts and data quality checks to where people are spending their time, and that's usually their email. I think we know through a lot of research that data trust generally is quite low, and because data trust is low, data use is fairly low. And the way that we get around data trust issues is that we need to make sure that people have transparency into the data quality issues. The best way to make sure that people can see the data quality issues is to have DHIS2 detect them and send them to you in a place that we know you're paying attention to, usually your email. DHIS2 can also be configured to send alerts and notifications to email and now even to WhatsApp, which PSI has actually piloted and configured. One thing to be very specific about, though, is that these alerts and notifications need to be useful, right? It can't just be, hey, there's a bunch of problems. They need to actually tell the user what they should do about the problem, and that's where we bring in standard operating procedures. Andrew, in the last presentation about Rwanda, touched on how thoroughly developed their standard operating procedures are in Rwanda and how well they're adhered to. That's a great model for any other country.
We want to make sure that you have really clearly defined standard operating procedures, and then that you're sending people information that aligns with their standard operating procedures, so they can actually respond to the issues that DHIS2 is detecting. They know exactly what to do because it's defined for them in their standard operating procedures. Really quickly here, just an overview of validation rules. Essentially, validation rules, if you're unfamiliar, are predefined logic between different data elements in a reporting form. A very common one would be: the number treated should not be greater than the number tested, right? And if it is, there's probably a data quality issue in either the number tested or the number treated. Validation rules don't necessarily fully assess whether the report is accurate or complete; a validation rule is just, like I said, a predefined rule or logic defining the relationship between two different data items or data elements. And what we can actually do is schedule these validation rules to run routinely. They can run however frequently you want: daily, nightly, weekly, monthly. And then they can push notifications when they detect something. So when they detect that the number treated is higher than the number tested, they can send an alert. And this alert we can define specifically in the validation rule notifications. This is another functionality in the Maintenance app (excuse me, it's actually in the Data Quality app). And here you can have custom subjects, messages, and templates for each one of the validation notifications. Now, here's an example of one of the actual alerts that was sent out. I configured a simple validation rule to run for antenatal care visits, and I said that the count of antenatal care visit number two should not be greater than ANC 1.
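A validation rule of the kind just described can be sketched in a few lines of Python. This is a hypothetical stand-in, not the DHIS2 implementation: in DHIS2 a rule is configured as a left side, an operator, and a right side in the Maintenance app, and the field names and rule list below are invented for the example.

```python
# A minimal sketch of validation-rule evaluation, mirroring the
# leftSide / operator / rightSide structure of a DHIS2 rule.
RULES = [
    # (description, left data element, operator, right data element)
    ("ANC 2nd visits should not exceed ANC 1st visits", "anc2", "<=", "anc1"),
    ("Malaria treated should not exceed malaria tested", "mal_treated", "<=", "mal_tested"),
]

def run_validation(report):
    """Evaluate each rule against one facility-month report and
    return an alert message for every violated rule."""
    ops = {"<=": lambda a, b: a <= b}  # DHIS2 supports more operators
    alerts = []
    for description, left, op, right in RULES:
        if left in report and right in report:
            if not ops[op](report[left], report[right]):
                alerts.append(
                    f"VIOLATION: {description} "
                    f"({left}={report[left]} vs {right}={report[right]})"
                )
    return alerts

# One monthly report with a likely data-entry error in ANC 2:
report = {"anc1": 120, "anc2": 150, "mal_tested": 300, "mal_treated": 240}
for alert in run_validation(report):
    print(alert)
```

Note that, as the talk says, a passing rule proves nothing about accuracy or completeness; it only confirms that two reported values are logically consistent with each other.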
Typically we find that ANC 2 is less than ANC 1. And what happened? I ran this against our demo database, and with just two validation rules, when I ran it for the entire country for a whole year, checking all the data reported in the entire country for the whole year, it produced about 2,000 alerts. Okay, and these 2,000 alerts were sent to me in 19 emails. Now, I asked the question: is that actually useful? If you're a national ANC program manager and you received 19 emails full of 2,000 alerts, is that useful? Well, no, it's absolutely not useful. It's just going to be noise. You're just going to delete all of those emails. There's no way you can respond to that. So what we have to do is make sure that we send the alerts, make them specific and useful enough, and keep them to a manageable amount. So here's actually an example from Rwanda, where they have configured very nicely some key validation notifications and alerts to be sent out. Here, they're using these validation notifications to send out an email whenever an outlier for Penta 3 is detected. Now, I want to make sure that people appreciate that in national databases, data quality issues are not a bug; they are a feature. There will be data quality issues. We just have to make sure that we have the tools, such as these kinds of notifications, that allow us to address them. It's no problem that Rwanda has outliers. It's very normal. Every country has outliers. But Rwanda has been very proactive in setting up these notifications that get sent to administrators' emails. Again, a manageable amount: this administrator is only receiving 23 notifications in this email. This is something that they can be very responsive to. Here's another example from Cameroon. I won't translate it, but Cameroon is using their validation notifications for disease surveillance, so outbreak detection.
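The difference between 2,000 raw alerts and Rwanda's 23 is largely aggregation. One way to get from one to the other can be sketched like this; the function, field names, and data are all hypothetical, intended only to show the idea of collapsing rule violations into a single actionable digest.

```python
from collections import defaultdict

def build_digest(alerts, max_lines=25):
    """Collapse raw rule violations into one short digest message.

    'alerts' is a list of (district, facility, period, message)
    tuples. Rather than one email per violation, we emit a count per
    district, worst offenders first, with one concrete example each,
    so the recipient gets a single message they can act on.
    """
    by_district = defaultdict(list)
    for district, facility, period, message in alerts:
        by_district[district].append((facility, period, message))

    lines = [f"{len(alerts)} validation alerts in {len(by_district)} districts"]
    for district, items in sorted(by_district.items(), key=lambda kv: -len(kv[1])):
        facility, period, message = items[0]
        lines.append(f"- {district}: {len(items)} alerts "
                     f"(e.g. {facility}, {period}: {message})")
        if len(lines) >= max_lines:
            break  # cap the digest so it stays readable
    return "\n".join(lines)

alerts = [
    ("District A", "Clinic 1", "2020-03", "ANC2 > ANC1"),
    ("District A", "Clinic 2", "2020-03", "ANC2 > ANC1"),
    ("District B", "Clinic 9", "2020-03", "Treated > tested"),
]
print(build_digest(alerts))
```

The design choice matters more than the code: an alert stream only changes behavior if its volume matches what one person can actually review and follow up on.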
And here, they're actually saying that if there are any cases of a certain disease reported, then it should trigger a validation notification and an email should be sent. So that's another way of using it, kind of outside the realm of data quality, thinking a little bit more creatively about how to use this same functionality. Okay. So with that, I've ended my presentation. I am now going to hand it over to Angela from Nigeria. So I will stop sharing. And Angela, please go ahead and start sharing your presentation and take it away. Angela, you're still muted. Thank you so much, Scott. So my name is Angela. Good afternoon, everyone. My name is Angela Abba. I work with AFENET Nigeria, and I'm going to be making this presentation on behalf of other co-authors. So my presentation basically is on using innovative approaches to improve routine immunization data completeness and accuracy on the DHIS2 platform in Nigeria between 2019 and 2020. This will be the outline for the presentation: introduction, methods, results, conclusions, acknowledgements, and references. So by way of introduction, of course, we know how important quality data is for planning and of course for resource allocation, generally in the health system and particularly also for routine immunization. Nigeria has been using DHIS2 as the only electronic platform for reporting health data since 2013. However, the routine immunization module was introduced in Nigeria in November 2014, and currently all states in Nigeria have that module and are reporting their routine immunization data on the DHIS2. Over time, reporting has been stable; reporting has been excellent. However, we did notice through in-depth reviews that there were issues with data quality, and there were also inconsistencies in the quality of data that was actually reported.
And so there was a need to actually put in more effort to see that the completeness and accuracy of the data being reported on the system achieve the desired level. So this presentation basically describes one of the ongoing efforts within Nigeria to improve the reporting and the quality of data being reported for routine immunization. So what was our method? Basically, the major framework which we used in implementation was the GEEKS framework. GEEKS is Growing Expertise in E-health Knowledge and Skills. It's actually a US-supported fellowship that allows for transfer of capacity between a mentor and a mentee. In the case of Nigeria, we had the mentor from AFENET, and we had a government officer from the National Primary Health Care Development Agency as the mentee. So we had a transfer of capacity, in terms of how to identify data quality gaps and address those gaps, between AFENET, who is a partner, and of course the government of Nigeria and a staff member from the NPHCDA. These were basically some of the strategies that we used. We started by identifying a baseline. First of all, we had to decide which indicators we were going to track in terms of data quality, because it's quite broad. So we restricted ourselves to looking at completeness and accuracy, and under accuracy, we needed to define which indicators we would be looking at specifically for routine immunization. So we came up with our broad indicators. We then documented a baseline, in February 2019, based on the data that was on the DHIS2. We documented it for all 36 states in Nigeria, including the Federal Capital Territory. Then, over a period of months, we monitored those indicators monthly and weekly and watched how progress was trending.
Of course, we also developed a dashboard that was specific to the project, and we designed the dashboard with all the indicators that we intended to monitor. The dashboard basically helped us to identify outliers during reporting periods, and then we could send feedback to the states. So once reporting begins, we begin to monitor using the dashboard, and then we're able to provide quick feedback to the states. Just as Scott described in his presentation, we used the WhatsApp platform, we used email, and we used SMS to send feedback to the states. We had focal persons identified in each of these states, and feedback was sent to them through these various channels. The feedback was also quite specific, because we could drill down to health facilities to say which health facilities were having these data quality issues and provide that feedback. We then followed up with these officers to be sure that they had implemented the recommendations that had been provided. We also came up with a shared tracking spreadsheet, an Excel sheet, so that the officers could communicate with us and document, okay, this is how far I have gone with the recommendations and actions that were provided. And then finally, we also had capacity building sessions. Like I said, the entire framework was actually in the form of mentorship to transfer capacity to the government officer, and so we had different capacity building sessions between the AFENET staff and the NPHCDA staff. So these were some of our very quick results. Number one, this is a snapshot of the dashboard that we created, the dashboard where we put in the project-specific indicators that we were tracking.
Unfortunately, we couldn't show all of the indicators we have on the dashboard, but this is just a snapshot, which we check monthly and weekly as data reporting starts, so we're able to quickly identify outliers and provide quick feedback to the states. Again, this chart is showing us the completeness of the data set. We're using just one data set as a proxy; in Nigeria, we have four data sets, but this chart is showing one of them, which is basically the major data set used for reporting in country. Like I said earlier, we documented a baseline in February 2019. And as of February 2019, which is the green bars, you can see where we were. Then by March 2020, when the project had started, we took an average of reporting between March 2019 and March 2020, and we could see that there were a lot of improvements in reporting from the states. Notably, there were four states which were the poorest reporters when we started, and a lot of focus was placed on those particular states to bring them up to standard with reporting. Again, we had data entry error proxies. So in Nigeria, HepB 1 and 2 vaccines are not on our schedule; hepatitis B vaccine is actually given through the PENTA. So we were not expecting to see HepB 1 and 2 administered on the DHIS2; we expect to see results for PENTA and not for HepB 1 and 2. However, we did notice that some states and health facilities were actually inputting these data, and so we needed to start giving feedback to the states. So as a baseline, we started in February, and then in March we had to provide feedback. We could see that there has been a decline in the reporting for HepB 1 and 2. However, we're not yet where we expect to be, because we expect to see zeros, but we know that we'll get there over a period of time. So this next result is showing us the vaccines and diluents.
So this is BCG vaccine and diluent; BCG vaccine, as well as measles vaccine, is actually given alongside its diluent. So we expect that if you open a vial of BCG vaccine, you should also open an equivalent number of diluents. We did notice that there were discrepancies between these two, and that was also one of the indicators that we were monitoring. Of course, again, we're not where we expect to be, because we expect that at this point both lines should be touching each other; that is the standard and that's what is expected. But we have noticed over a period of time that the gaps, the discrepancies between the vaccines and diluents, are gradually closing up. And of course, with regular and continuous support to the sub-national level, we will get to where we expect to be. So just to conclude, we know that regular data review, capacity building, tracking, and providing feedback to officers at the sub-national level can actually improve the quality of RI data as well as other health programs' data. So we need to keep tracking and, as much as possible, provide detailed and specific feedback to health officers at the sub-national level. Again, engagement of government officers can also promote accountability and ownership of data and of data quality actions. So even though this intervention stopped as of March 2020, because we had involved officers from the government level, they are still continuing this process, and we know that feedback will of course continue to be provided to the sub-national level. So for the indicators that are still tracked, we know that over time we will still get to the expected values or standard for each of them. Thank you very much. We'd like to acknowledge these agencies: the NPHCDA, the National Primary Health Care Development Agency; the U.S. Centers for Disease Control and Prevention; AFENET; and the Nigeria Centre for Disease Control. Thank you very much. Great.
Thank you so much, Angela. That was really amazing, to see how much progress has actually been made, and even though those lines aren't touching, they are getting really close. It's the first time I've ever seen something like that, so that was really cool. All right. So now we are handing it over to Tina from the WHO. So go ahead and start sharing your screen, Tina, and take it away. You're muted right now, Tina. Should be good now. Yeah, go ahead. We can see your screen. So good day, everybody. This is Tina from the Data, Evidence and Knowledge Management Unit of the Health Workforce Department at WHO Geneva. So, needless to say, I want to present to you in this session how... Scott, are you timing me, or should I time myself? I am, but you can time yourself as well if you'd like. So I'd like to talk to you in this session about how WHO used DHIS2 to improve health workforce data monitoring and availability. Needless to say, given everything that's going on, the need for health worker data is critical, and we've seen that. In the past, yes, there have been a few global calls for strengthening health workforce data, whereby in WHA resolutions member states have been urged to report on a core set of human resources for health data through the progressive implementation of the National Health Workforce Accounts, NHWA. You'll hear more of the NHWA throughout the presentation. We've had resolutions following, with five-year action plans with a strong focus, again, on NHWA and on data and evidence. The SDGs account for monitoring health workforce density and distribution through SDG indicator 3.c. Having said all that, what is NHWA? NHWA is a framework based on the health labor market; it's a health labor market framework. It's a system by which countries can progressively implement or improve the quality and availability of their health workforce data and evidence, and hence, through progressive monitoring of a set of indicators, enable themselves to achieve their UHC and SDG targets.
The framework itself covers a set of indicators on education, regulation, monitoring, migration, employment, information systems, and so forth. So what NHWA implementation has helped us do is to move from a very fragmented situation; as many of you might imagine, health workforce data sits in pockets, and sometimes the systems don't talk to each other and the classifications are not harmonized. It moves us from such a system into one where you have it all consolidated in a unified format with harmonized metadata and indicators, thereby enabling a multi-stakeholder database to really provide evidence for policy decisions. How was WHO going to do that? One of the aspects of NHWA implementation was to provide an environment where the data could be harmonized and collected at the national level. What we did in WHO was to use DHIS2. So we have a DHIS2 instance in WHO, which we lovingly call WIDP, the WHO Integrated Data Platform. We're on 2.30 now, moving to 2.34; we're excited about that. We have a number of users within WHO, and the health workforce data sets have about 400 accounts. We have data entry happening all year round; we don't run a close-and-open cycle, but we do our annual cut for publication and release. How did we manage that? We said, okay, this has to be country-at-the-center. Country at the center, meaning you have the country at the center doing the collection and the validation; that's when you have accountability. So we have a country focal point who enters data into this NHWA data platform, which is the DHIS2. We at WHO also work with external partners, whereby OECD or ICN or other UN entities share health workforce data with us, and we enter it, complementing the data entered by the focal points. We also work with ILO, so we collect the labor force statistics. And we do a fair share of data mining to complement what the focal point has entered.
All of this then goes into a process, a big machinery, where we do a lot of data triangulation and validation. We use different tools to get there, and that is where we highlight some of our quality issues and do country consultations and validations. Eventually the validated data is used for reporting purposes and for different analyses at the global level. One of the features that has really helped us in WHO is the data audit trail and the comment box. Having one transparent place where you can see who entered what and when, with a comment explaining why a value is dipping down or spiking up, has really helped, because we have a system in which the ministries of health and of education enter the data, and then you have colleagues at the country office, the regional office and headquarters. Having one centralized place for data versioning, rather than sharing it in emails, has really helped. Another feature we use quite a lot is the standard deviation outlier analysis, which we use as a big net to catch the really large outliers; from there we take off into a much more granular data analysis and validation process. Mainly these two features have helped us improve quality and give more meaningful feedback to the countries, which is useful for country consultations. But as you can all imagine, having an excellent IT tool isn't enough.
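The outlier net described here can be pictured with a minimal sketch (my illustration, not the WHO's actual implementation): flag any value in a series that sits more than a chosen number of standard deviations from the series mean. The two-SD default below is an assumed cutoff for the example.

```python
# Minimal standard-deviation outlier screen: flag values more than
# `threshold` sample standard deviations away from the series mean.
# Illustrative sketch only, not the WHO/DHIS2 implementation.
from statistics import mean, stdev

def flag_outliers(values, threshold=2.0):
    """Return indices of values lying outside mean +/- threshold*SD."""
    if len(values) < 2:
        return []            # not enough data to estimate spread
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []            # constant series: nothing to flag
    return [i for i, v in enumerate(values) if abs(v - mu) > threshold * sigma]

# A stable series with one likely data-entry error (an extra digit):
reported = [120, 118, 125, 122, 1190, 121, 119]
print(flag_outliers(reported))  # → [4], the 1190 entry
```

Values caught by this coarse net would then feed the more granular review and country consultation described above.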
You really want accountability at the country level, and NHWA implementation has helped improve data accountability there by improving data governance. Having this one-stop shop for a country means it no longer matters that your basic education graduates data lived somewhere else: you can put the core set of education indicators into one system, alongside the core set from the payroll, all of it in the NHWA data platform, which is our DHIS2 instance. That improves data governance and makes data quality issues visible. I will not get into the details of our NHWA implementation guide, but the process itself takes countries step by step through improving their data quality and their data flows. As you can imagine, in many countries the flow is not always the same: you have the civil service, the Ministry of Education, finance, and within the HRH system you have several pipes flowing either way. So understanding your data flow, harmonizing it and putting it into one platform has really helped enable policy-level decisions and actions that have made an impact in countries. A few of the encouraging examples I'd like to share with you: countries have adopted roadmaps by which they say, we are going to implement NHWA and thereby improve our data on selected indicators, be it education, migration, occupational health hazards, whatever it may be. We have a few country examples where this created the evidence to pass a bill to recruit more health workers, which has really made a difference. You see more interest, with stakeholders coming together and seeing: my data is actually being used for evidence, so I need to improve the data and its quality. At the regional level, of course, you have this new possibility of being able to compare countries, because you have one set of harmonized data and harmonized classifications, which enables cross-country comparisons and lets you see what's happening in a region. All of that adds up at the global level, where we have increased data quality, increased comprehensiveness of data, and more up-to-date data, and of course it enables us to provide the global community with a much-needed update on what's happening with the health workforce. Early this year we launched the State of the World's Nursing report, which goes into much depth on what's actually happening and on the policy levers that regulate, or the different policy aspects that relate to, the nursing workforce. Nurses, as you can see, are pivotal to health services generally; in some countries they comprise more than 50% of the national health workforce. Before I end, I have to speak about our new NHWA data portal, also released very early this year. This is a fancy data portal that I, and our team, are really proud of. It pulls out this wealth of information that we now have in the WHO DHIS2 instance: we have country profiles, we have occupation profiles, you can look at the demographics within a country, and it now enables you to run your own data queries. Over the years it has really come to this: improving data quality, as Scott just said, is never a one-off. It should be seen not as a hiccup but as an avenue by which you can improve your data and the comprehensiveness and understanding of it, and this portal is one example. With that I conclude. If there's anything I need to clarify, I'll be happy to; you can always reach us via HRH Statistics. Thank you, that was excellent,
Tina, thank you so much. It's also really incredible to see how much you are using the data audit trail and the comments in aggregate data entry. That is a profoundly underutilized and very powerful piece of functionality, and it's due for a facelift; it's time to do a bit of a revamp and improve it, I think. So I'm going to come back to you and get your thoughts, and maybe we can get something on the roadmap for development. Cool. All right, so, Joseph, are you here? Yes, I'm here. Wonderful, thanks. All right, so go ahead, start sharing your screen, and the floor is yours. All right, can you see my screen? Yeah, sure can. All right, thank you. 
Well, hi everyone. I'm Fallon Joseph, resident advisor in Madagascar, working for JSI. I'm going to present the abstract on the effect of the DHIS2 implementation approach on improving data quality, analysis and use in Madagascar. This abstract was written by myself in collaboration with my colleagues Moussali Jordanson from JSI and Mamisua Rasulu from the Ministry of Health in Madagascar. The topics I'm going to take you through: the objective of the session, the Madagascar health profile, a little background on the study, the DHIS2 implementation process, some DHIS2 outcomes, and finally a conclusion. At the end of this presentation we expect that participants will learn how the use of digital health technology resulted in the improvement of data quality and use, as well as Madagascar's success stories and lessons learned in the implementation process of DHIS2. Needless to say, high-quality data is key to monitoring the progress of programs. As you can see here, Madagascar is the fourth biggest island in the world. We have a population of 27.2 million, 22 regions and 114 districts, and around 4,000 health facilities, including private and public
sector, and around 18,000 community sites. In 2016 the Ministry of Health, in collaboration with MEASURE Evaluation, a USAID-supported project, conducted a baseline assessment of the HIS in Madagascar using the PRISM assessment. The main results of this assessment pointed to poor data quality and insufficient data reporting, because at that time an Access database was used for HIS data management, which led to limited access to health data by stakeholders. That in turn led to the creation of parallel systems: because stakeholders could not access data, they had to put parallel systems in place to get the data they needed. The assessment also pointed out the lack of coordination among stakeholders in terms of HMIS activities. Based on these results, the Ministry of Health decided to convene all the stakeholders involved in the health sector in Madagascar to develop a national HIS roadmap as well as an HIS strategic plan. One of the main activities of this strategic plan was to move from the standalone database to a web-based system, which was DHIS2. During this period the Ministry of Health, in collaboration with its partners, started the roll-out of DHIS2 and set up a national DHIS2 team, which was very important in the process because they are the ones who would build the system. In 2018 the Ministry of Health, as you can see in the picture on my screen, launched DHIS2 as the national platform for integrated data in Madagascar. They organized a pilot test of DHIS2 in two regions and 14 districts, and six months later rolled DHIS2 out to all 114 districts. From 2019 to now, the Ministry of Health has monitored the use of DHIS2, and we can see some improvement in terms of data quality. As you can see on the map here, the roll-out of DHIS2 was made possible by the contribution of USAID-funded projects like
MEASURE Evaluation, Mahefa Miaraka, ACCESS and IMPACT, among others, as well as the World Bank. As a result of this shift to the web-based system we can see clear improvement in data quality, if you consider the main indicators used to measure it: completeness, timeliness and accuracy. In 2016, when the standalone Access database was used, we had a completeness rate of 36%; with the use of DHIS2 it has gone up to 91%, with similar gains in timeliness and accuracy. And you can see there that the Ministry of Health conducts regular data quality visits to monitor the use of DHIS2 and keep improving data quality. In terms of lessons learned in this process, we can highlight the fact that Madagascar used an inclusive, participatory approach in the implementation of DHIS2, which created ownership of the system, and the training of the national team: as I explained earlier, the national Malagasy team is the one leading the implementation of DHIS2 in Madagascar. Obviously they have had assistance, from the university, by participating in the DHIS2 Academy, by engaging with the DHIS2 community, and with support from HISP and so on, but what is very important is the fact that they are able to manage the system themselves. Needless to say, the use of DHIS2 increased timely access to data. As I explained before, with the Access database, to get access to data stakeholders needed to go to the department in charge of HMIS at the Ministry of Health and request it, and after some days the data would be sent through email. Now, with DHIS2, they can access data from their own office, and data is being used by the Ministry of Health and stakeholders in the planning process. For example, in the development of the health sector plan, the five-year health sector strategy for Madagascar, DHIS2 was one of the data sources for the HMIS component. The use of DHIS2, as you
all know, also facilitates integration and interoperability between systems, so the country is now moving toward the integration of all the parallel systems, such as immunization, LMIS and TB. What I can say is that the approach taken in the use of DHIS2 provides an implementation model that helps the country move toward self-reliance. All of that was made possible by the contribution of all the partners. Obviously we face some challenges, like all low- and middle-income countries: internet connection, staff turnover, and heavy procurement processes for equipment and materials at the site level. But still, the country is moving to roll out DHIS2 at the community level, down to the communes. That's what I would like to share with you; if there are any questions, I'll be there to answer them on the DHIS2 community. Thank you. 
Thank you so much, Joseph. It really makes those of us who develop DHIS2 proud to see how much it has helped and how well it has been implemented in Madagascar. It's a really compelling story, definitely, so thank you for sharing. With that we are out of time for this session. Please, again, direct your questions to the community of practice; we'll be there to answer as much as we possibly can. And with that, thank you for your attendance, and I'll hand it back over to Max to get everything lined up for the next session.