which is coming from the community. So the protocols also need some enhancements so that they are not outdated and stay in line with the latest DHIS2 versions used in the industry. Technology, of course, is the most expensive thing that we have, apart from the people and building the capacities. So you need to understand that whenever you're investing in technology, you should ensure that it is long-term and sustainable, and a continuous needs assessment should be carried out so that you identify gaps every few years and can plan the investments accordingly, so that you're not investing only in the people and the processes, because technology also plays a key role. As we have seen in many DHIS2 implementations, as the scale grows, you have to similarly grow your technical architecture and infrastructure. Next slide. So we've seen what makes a good DHIS2 implementation, but how do we actually measure it? It's very important for us to measure the different aspects that we spoke about: we need to measure people's capacity, we need to measure the existence and sustenance of processes, and we need to see how well the implementation is supported by technology. Keeping all these areas in mind, the teams at the Global Fund, Oslo and the HISP groups worked together to build a comprehensive toolkit which has pinpoint questions against each of these categories, trying to understand where your implementation stands today, what key gaps you see based on the assessment results, and how you can plan to improve your implementation moving forward: through the problem identification steps, the problems identified, and the kinds of interventions that will resolve those problems. So basically the tool allows you to measure your progress on the systems you are running over time, by measuring each of these individual components that we discussed. 
And then of course, unless you have your fixed priorities, your long-term vision, it's difficult to get funding or attention from the funding partners that we have in countries. So you need to have this homework done in advance so that you can encourage investment in core DHIS2 capacity, which contributes to a sustainable implementation. And then of course you have multiple funding partners funding their own priority areas, but each of the funding partners can also contribute towards the core areas, which could be processes, which could be people, which could be technology. So you need to align all these segments together so that you have a coherent pool of funding, which again gets distributed into multiple streams depending upon the priorities that the funding partners support. So basically, through this maturity framework, you can evaluate your readiness: where you are, and what you need to do in case you have plans for further scaling of your DHIS2 implementation. In the sample screenshot which you see, there are three components, and we'll take these components one by one: one is foundational, the second is aggregate, the third is tracker. It's very important to set the foundation of your DHIS2 system well in advance and with good quality, so that it can be further built on by making it more comprehensive, moving to aggregate data collection, and then even going to the next and final step of introducing individual-level data. 
So what the tool does is measure and understand how the country is progressing in its health information system strengthening, beyond just summarizing what activities are funded. Activities, every country is doing; but are those activities in line with your priorities and with your actual problems and actual needs? That needs to be understood, and your implementation plans need to change so that they include the observations which come out of this tool. Next slide. Now, the benefits of using this tool: since it is endorsed by major funding partners, the Global Fund and Gavi, they're trying to drive their funding decisions based on the outcome of these assessments. So this is trying to encourage investment into core DHIS2 capacity in the foundational areas. We all of course want the fancy things to work with: we want aggregate, we want to move to Tracker to capture individual data, because it gives us an opportunity to do much finer analysis. But in order to do that sustainably, you need to ensure that your foundational aspects are well funded, well structured and well organized. And then through this assessment you have an opportunity to align investments and interventions together. I think on day one, Ola in his presentation discussed making one big country work plan, where you could have all the activities that a country is supposed to do in the near future, and then align those activities to donor priorities and present that as one big country work plan, so that each donor can identify the areas where they can support, because their priorities lie in those areas, and then you could have one joint stream of funds which can be used for different areas together. But that only becomes visible when you do such an assessment, and it gives you a picture of where we stand and what we need to do further. Next slide. 
So yes, as I've already mentioned, what the maturity profile tries to do is give a good indication of where the country stands and whether it is in a good position to scale up further, moving from aggregate to tracker and further to finer detail. So we can look at the individual components now. In this slide there are some recommendations which we're trying to give, but they are not cast in stone. Of course the country can decide things on their own, but these are recommendations which we feel should be followed to be able to have a sustainable implementation. So as I mentioned, the tool measures maturity in these three core areas. Starting from the bottom, for the foundational area we talk about legislation and governance, security, infrastructure, the facility and population profiles, training of end users, the quality of DHIS2 metadata and organisation units, and having a core team in country. So these are the foundational aspects; each of them has a set of questions which a country has to assess based on the current situation, grading whether these foundational areas are currently not yet achieved, at early progress, adequate or mature, depending upon the current situation. The same questions and the same scoring mechanism are followed for aggregate and for tracker as well. Next slide please. So let's look at the foundational aspects. You see here that most of them are crucial for decision making. So you need strong leadership and governance focusing on your digital health strategy, of which DHIS2 is a part. You should have dedicated mechanisms to think of the strategy in terms of investment and in terms of funding support. So you should know how to reach out to the funding partners and present your case in a way that is convincing enough for them to fund interventions in your country. 
Then we come to the core team, where you need a dedicated team to maintain DHIS2. You need to think of security and compliance, because you are starting with managing aggregate data, but it doesn't take much time for a country to start collecting individual-level information as well. Metadata and organisation units, as you understand, are the baseline for any DHIS2 instance, so they need to be of good quality. The capacity building of users again is very important, because they are the ones who bring life to your information system, so it's important that they're well trained. And the facility and population profiles are your basic denominators, without which you can't do your analysis and can't measure the impact of the interventions that you're going to do. And infrastructure, as I mentioned, is very important for your implementations to actually work on the ground. So the recommendation here is this: when you look at the foundational topics and do an assessment, and you are already using aggregate programs, then before you plan to scale, check that none of your foundational topics is at not yet achieved. So if you see that you're already using aggregate data sets for data collection, but on assessment of your current situation with the toolkit you find that any of your foundational areas falls at not yet achieved, then we need to take a pause and rethink the strategy for how those areas will be strengthened before we scale things up in the aggregate domain as well. Tracker, we all know, configuration of course is one thing, but the actual implementation of tracker is a very large-scale exercise which needs investments in terms of devices, in terms of trainings, in terms of technology. 
So you need to make a very crucial decision when switching from aggregate to tracker, ensuring that your foundational aspects and your aggregate domain aspects are relatively stable before you jump into collecting individual information. So to ensure that you are moving in the right direction before implementing tracker, you should first of all have the institutional buy-in and support from all the stakeholders. Since we're dealing with individuals' data, it's very important that we make conscious decisions in terms of what kind of data we're trying to collect and who should have access to this data. These principles should be defined well in advance. Funding again: you need funding for your DHIS2 configuration, and you need funding for implementation, training and continued support. So you should identify whether funding is available to support this particular intervention or not. Again, I just spoke about individual data, so data privacy is a question which we need to answer, making sure that our systems are aligned to it. And whatever data exchange standards are in place which could potentially be used moving forward, they should be thought of well in advance. Capacity and competence again depends on the HISP group you're working with, because they do have the capacity and competence to work with you. But it's important that as the implementation progresses, the country core team also builds capacity and competence, so that they're not totally reliant on HISP group support but are also self-reliant, can manage things, and grow together with the HISP group in terms of their own capacity. Infrastructure again, we've already discussed: devices and their availability, servers and support. The larger your scale, the better prepared you should be to handle the challenges which come with infrastructure. So all these parameters should be taken into account before moving into a tracker implementation. 
We have seen that when they are, things are relatively smoother and you face fewer challenges as you move ahead. So the generic recommendation is that the foundational domains should be at least at early progress or adequate before you move on to a tracker implementation, and since we're talking about individual data, you should have enough security and compliance procedures in place for DHIS2, and security should be at least adequate. These are just generic recommendations; of course there's nothing to stop a country from using tracker, it's just that you should be well prepared so the implementations don't suffer from challenges as they go ahead. So it's better to be well prepared with the foundational activities, and specifically the ones which are relevant for tracker. Next slide please. Yeah, so again, for tracker systems it's not that you just start the implementation and it will work smoothly. As you saw in the case of Nepal HIV, which the presenter discussed yesterday, just designing the system and implementing it did not really solve the purpose; they had to focus on continuous improvements at all levels. They had to instill in users the reasons why they need to use the system. So it doesn't stop at designing and implementing; you need to continuously invest in people, and as the program grows, you'll need more funding and bigger budgets to be put in place for the individual data domain. So you start with HIV, then you think of the other programs, TB, malaria, and things like COVID, which can arrive without any prior warning. So you need to be well prepared, and you need to ensure that you are continuously looking at the performance of the implementation and the gaps which are there, which could be in the three domains that we discussed: gaps in the people's capacity, gaps in the processes, or gaps in the infrastructure. Next slide please. 
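The generic recommendation above (foundational domains at least at early progress or adequate, and security at least adequate, before moving to tracker) can be expressed as a simple rule check. This is only an illustrative sketch, not the toolkit's actual logic; the domain names and the readiness function are invented for the example.

```python
# Illustrative sketch (NOT the toolkit's actual logic) of the generic
# tracker-readiness recommendation: every foundational domain should be
# at least "early progress", and "security and compliance" at least
# "adequate", before a tracker implementation is attempted.

GRADES = ["not yet achieved", "early progress", "adequate", "mature"]

def grade_rank(grade: str) -> int:
    """Map a maturity grade to an ordinal rank (0-3)."""
    return GRADES.index(grade)

def ready_for_tracker(foundational: dict[str, str]) -> list[str]:
    """Return the blocking gaps; an empty list means 'ready' by this rule."""
    gaps = []
    for domain, grade in foundational.items():
        # Security carries a stricter minimum than the other domains.
        minimum = "adequate" if domain == "security and compliance" else "early progress"
        if grade_rank(grade) < grade_rank(minimum):
            gaps.append(f"{domain}: {grade} (needs at least {minimum})")
    return gaps

example = {
    "leadership and governance": "early progress",
    "security and compliance": "early progress",   # below the "adequate" bar
    "infrastructure": "adequate",
}
print(ready_for_tracker(example))
# -> ['security and compliance: early progress (needs at least adequate)']
```

The point of the sketch is simply that the recommendation is a threshold per domain, with one domain (security) held to a higher bar.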
Okay, so that was my last slide. Maybe I can quickly show you the tool. Currently the maturity assessment is being done in the countries which are Global Fund eligible, facilitated by the HISP groups, but other countries are also encouraged to use the same tool. So if you want your country implementation to be assessed using the DHIS2 maturity assessment toolkit, you could reach out to your HISP group. They will definitely help out and work with you, can facilitate a detailed orientation on the toolkit, and can also give examples of how they have done similar exercises in other countries. So if there are any questions we can discuss those, or else I can just give a quick review of the tool itself. Yeah, so there's a question in the online chat regarding a tool for readiness assessment before doing a DHIS2 implementation. We have a session on that: just after this assessment session we have a review of the readiness assessment tool. So as you can see on the screen, it's an Excel-based toolkit where several subdomains are defined for different areas. For example, you start with leadership and governance. There are some guidelines given on how you can assess this particular subdomain. And if I move to the right, you have these grades available: not yet achieved, early progress, adequate and mature. For each question you have a definition or description of the measuring criteria, in terms of how you can grade this particular statement. So if your situation in the country matches any of these four, then you can grade yourself, or the country implementation, by adding in the score here. You have the option to select, say, early progress, and if you want to add more information you can put in a description as well: what the current situation is and why you have marked yourself at early progress. Similarly, you have sections on strategy and investment. 
So these are some of the questions trying to understand whether a strategy is available or not, whether an assessment exercise has been done or not, whether a work plan along with budgets is available or not, and similarly whether you have sustainable funding available or not to support your digital health interventions. Then you move to DHIS2 security and compliance. We have some questions related to the availability of manpower who can handle DHIS2 security in terms of your servers, your SOPs and everything; the ownership of the data as well as technical ownership; and whether you have the key SOPs in place or not. Third, we have the core team for DHIS2 administration. This is about having internal personnel within the ministry who could handle the DHIS2 implementation on their own, and the availability of SOPs for system management, et cetera. Then we come to the building blocks of DHIS2, the metadata and organisation units: what is their quality, and for assessment of the quality you have the categorization and the description given here, from which you can choose what matches your existing implementation. Then we come to the end users: how well they're trained at the district level and the facility level, and whether the country has a specific instance dedicated to training use in the country. That can be evaluated. Next we come to our denominator data for the workforce, population and human resources; that is assessed here. Next we have infrastructure: availability of computers, mobile phones, server hosting mechanisms, budget for server hosting, mobile device management, inventory; all those questions can be answered. And in case any of these questions don't apply to your interventions, you can skip those, and they don't contribute to the overall score. 
Then we have the aggregate sections, where you indicate whether you're using the DHIS2 platform for aggregate HMIS. If yes, then you could say "at scale", which means it is being used at the national scale. If it has just started, then it's a pilot. If it is still under development, you could put "under development", or if for this specific use case DHIS2 will not be used at all, then you can just select "not used". And if it's not used, you can indicate whether it is planned. But if it's at any stage other than "not used", then you can try to answer these questions. Some of them might not be relevant, but then you can skip those as per the present state of implementation. Then we move into some programmatic focus, talking about the data sets, the data for aggregate programs for HIV. Then we talk about DHIS2 being the main source for HIV data collection, with no parallel systems in country; standards-based configuration; the kind of analytics and dashboards which are available; and then private sector data, community data, completeness and timeliness, the whole set of factors which define your program performance. And how the coordination is between the HMIS and the technical program staff internally, and of course with external stakeholders, can also be evaluated here. Then we have data use guidelines. The same set of questions is then repeated for the aggregate TB program and the aggregate malaria program. Then we have a section for immunization: if routine immunization is part of your DHIS2 implementation, you could assess that using the same set of questions. Aggregate surveillance: if you're doing that using DHIS2, you can assess your IDSR program. Same for COVID-19 surveillance, because a lot of countries use DHIS2 for COVID-19 data management related to surveillance, immunization, et cetera. So we have a section on that as well. 
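The gating just described, where the implementation status decides whether a section's detailed questions apply at all, can be sketched as a small helper. The function name and error handling are invented for illustration; only the four status labels come from the tool.

```python
# A sketch of the status gating described for the aggregate sections:
# only "not used" skips the detailed questions (you then just record
# whether the use case is planned); any other status opens the section.

STATUSES = {"at scale", "pilot", "under development", "not used"}

def questions_apply(status: str) -> bool:
    """True if the section's detailed questions should be answered."""
    if status not in STATUSES:
        raise ValueError(f"unknown implementation status: {status!r}")
    return status != "not used"

print(questions_apply("pilot"))     # -> True
print(questions_apply("not used"))  # -> False
```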
CHIS: the few countries which manage data for community health as well, through DHIS2, can also assess that. Then we come to the tracker components, trying to understand the tracker capacity that the core team has in country; there are some questions on that. And then we move into the program-specific trackers, for HIV. We have components for the privacy impact assessment, the hosting arrangements, quality of metadata, alignment with the WHO standard packages, use of Android, completeness of case-based reporting, whether this data is contributing to routine HMIS reporting or not, and the data use at the different levels; and similar questions for TB case surveillance as well as for the malaria elimination program. Then we have the electronic immunization registry, same set of questions, followed by COVID-19 immunization; likewise if you're using individual tracker for AEFI, case-based surveillance, and COVID-19 surveillance. And in case your country has done mass immunization campaigns using DHIS2, then based on the latest mass vaccination campaign carried out, you can assess this section as well. And you can put in a quick summary. So how it works, and where the HISP group will help you, is that you fill in this tool and score each of the sections, adding your notes. There is a centralized database where all this data gets imported, and you get some standard reports which are automated and give you a summary; the screenshots of these are shown in the presentation. Using that, you get outputs giving a grade for each of your domains: foundational, aggregate and tracker. So you can see which of these domains are at early progress, which are adequate, which are mature, and which are the ones not yet achieved. Those can be your primary intervention areas moving forward. So I'll stop here. If there are any more questions, please feel free to ask in the chat and we can respond to them accordingly. 
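The last step described, turning question scores into one grade per domain for the automated summary report, could look roughly like the sketch below. The cut-off thresholds here are invented for illustration; the talk does not give the toolkit's real ones.

```python
# A minimal sketch, with INVENTED thresholds, of grading a whole domain
# (foundational / aggregate / tracker) from its question scores, the way
# the automated summary reports present domain status.

def domain_grade(scores: list[int]) -> str:
    """Collapse 0-3 question scores into one of the four maturity labels."""
    avg = sum(scores) / len(scores)
    if avg < 0.5:
        return "not yet achieved"
    if avg < 1.5:
        return "early progress"
    if avg < 2.5:
        return "adequate"
    return "mature"

report = {
    "foundational": domain_grade([1, 2, 1, 0]),   # avg 1.0
    "aggregate":    domain_grade([2, 2, 3]),      # avg ~2.33
    "tracker":      domain_grade([0, 1, 0]),      # avg ~0.33
}
print(report)
# -> {'foundational': 'early progress', 'aggregate': 'adequate', 'tracker': 'not yet achieved'}
```

A domain landing at "not yet achieved" in such a report is exactly the kind of primary intervention area the speaker mentions.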
Any questions from the participants? Yes, Dr. Tanthu. Yes, this is an interesting tool. Can we use this tool to assess the capacity of the system specifically for DHIS2, or can we apply it to the whole health information system? I think all of you know the answer to this question, right? Okay. The way we designed this was, of course, since we largely work on DHIS2, that was the focus; but if you want to assess any health information system which is HMIS plus program-specific, you can use the same tool, because the principles are common across the information systems that we build for health. Thank you, sir. Yeah. Thank you. For this tool, we cannot just give the answers alone; we need to bring more people in the team together to give the answers. And then the final score will automatically be generated, right? Suppose we assess the capacity of our data system now in our country; the tool can generate which areas we need to improve, right? And is this based on Excel, or on something else? Yeah, so currently this is based on Excel. What has been done is that this Excel can be imported into a DHIS2 instance, where all the calculations happen and you get a final scorecard of the maturity assessment. There you get area-wise grading of the different domains by their status, which are defined here. But that's a fair question: if countries want to do the assessment locally and not import the data into DHIS2, then maybe the tool needs to have some internal calculations, and also an output like the one you would get from DHIS2. So we'll take that recommendation and follow up on it. 
Yes, and I think this tool is very detailed by program, and also year on year when applying it in countries, so that we can use it to assess where we need to improve the system: whether it is governance, capacity building, infrastructure, or something like standards compliance. Okay, so first, in our country, we would really like to use this tool to assess which level we have reached now. We know that we started to implement the system in 2014, as presented this morning, and we apply aggregate, Event Capture and Tracker Capture to different programs, including the Global Fund programs like TB and malaria in our country. Currently we have a team in our department in the ministry of health. Since 2014 until now, I don't know for how long until the system can be handed over to the country, so that the programs don't need to keep asking HISP Vietnam for support. I don't know how long until the country can stand on its own and be more sustainable. Because in our country, we have mainly relied on support from global partners to keep the system functioning. I don't know, when the partners are gone, how the government can continue to sustain the system. One more question: we know that all the data is now kept on cloud servers in other countries, I don't know where exactly. The government is asking me, because there are questions about the security; they want to hold the data in the country. I don't know whether we need to move it or just keep it on that cloud server. So I have those two questions. Thank you. Thanks. For the first question, about reliance on the HISP group: I know this is a growing challenge for many countries. 
I'll present a little bit of an assessment framework that we can use to get a baseline skill set and do some planning around that capacity building more long term. So hopefully that will help answer your question later on. For the second question, I think this is something you might want to take up with John and Sam, so you can get a little more advice specific to your needs. It's a difficult question to answer in many cases. I know you've been using cloud servers for many reasons, but that's a challenge that's always there with internet-based systems. I would suggest we get into a little more detail with John and Sam, who are more aware of the implementation in your country. Any other questions? Oh, yeah. So just a point to add. This is now being discussed with the Global Fund, so very soon you'll have the same assessment tool in your country, and the HISP group will work with you to assess all these programs with this tool. So there was a question earlier about whether this tool can cover other systems or technologies. When we developed this maturity assessment tool together with the Global Fund, we of course looked at a lot of the other tools that are available for maturity assessment, et cetera, and we agreed that we would focus on the DHIS2 aspects and make it a DHIS2-specific maturity tool. But obviously many of these areas, like infrastructure and governance, link to the broader system, so there are some linkages in here to other maturity assessment tools. I think the Global Fund is also working with the UN's ITU, for example, on a digital health maturity tool, and if you have already responded to that, there are answers there that you can bring back into this tool, so you don't need to do everything twice. If you're interested in looking at some of these other tools, we can find the references, because we looked at them when we developed this one. 
But the idea with this specific tool is that it is focused on the DHIS2 implementation. We have got two questions. One is whether the government can take over the health information system. Usually, even the US government cannot maintain their own systems by themselves. Why? Because the technology is changing, technology is always updating, so technical expertise always lies with technical partners like this. So handing over to government: definitely we want to, and we are gradually doing so, but the point is that it works up to certain limits. When we hand over the system, the routine work they are able to do. But when it comes to security, specific technical issues, version upgrading or other complicated issues, then they ask us to do it. So that is how it may be handed over to you in future; but you should also stay linked with the HISP group. Usually governments never have all the capacity in-house; even then, they have to hire external experts. Second part, data security. In many countries, for example in Bangladesh, the ICT law says that no data shall reside outside of Bangladesh; all data has to stay inside the country. So how? If you want to build your own data center, it's very costly. Don't go for that, because it is extremely costly and very risky. Better to have a commercial cloud provider inside your country; that's going to be a good solution. You can try with that, and definitely the HISP group is there to help you. Yeah, I'd just like to add on this question of country capacity. I think the goal for us, for the HISP network, is that the government takes full ownership and has the capacity to maintain the system. 
So I think it will always be a transition, and we'll come back to some guidance and recommendations we have in terms of what kind of people and what skills are needed in order to gradually take over more and more of the responsibilities and activities. I think what Hannah was referring to is that it's the same in Norway: the ministry of health of course relies on some outsourcing for specific parts. For example, it's very common to outsource the server infrastructure to another company or another government entity. But what is important is that there is capacity within the ministry of health to understand the system and understand what the needs are, so that you can manage these agreements with a local company or an external provider of certain services. And that's the critical part: that you have this in-house. Gradually building more IT and information system capacity within the government, maybe not expertise on every single aspect, but at least a high-level understanding of what processes and skills are needed, and then being able to manage, where relevant, these agreements with the companies that can provide these services. Yes, thank you. You mentioned that the tool can be used to assist planning and strengthening activities. How? Because we want the government team to use the data for better planning and better results. But right now we are thinking about how we're going to do it and what tool we can use to help the government use the data in DHIS2 to do the planning for the results they need. So I can try to respond, and then I think we'll probably get back to it later in the session. Yesterday in my brief presentation, I had some similar slides around the maturity profile and assessment. 
And I think doing the assessment to identify the gaps and the status of each of these components is just the first step. The second step is for the HISP groups, together with the governments, to really review these results and come up with some recommended actions together. And that is important input to a bigger digital health strengthening plan. Through the tool you can clearly identify which areas, for example on metadata, need to be addressed, or on capacity building, like the core team, what needs to be addressed. And you can also do this assessment on a regular basis, every year or every second year, to monitor progress in these areas. So the tool itself is not a template for the plan, but it will provide very important input to your digital health strengthening plan, and in collaboration with the HISP group you can work on that plan. We can also help; we have other tools that can help with budgeting and some more specific planning templates. But first of all, I think it really gives you the best input you can have in terms of what needs to be strengthened, and then we can also guide you on how to strengthen it and what the recommended activities are. So for each HISP group, when they do this assessment, now on behalf of the Global Fund, together with the governments, filling out this spreadsheet is one thing, but the second part is to write a summary report where they do their own analysis of the results, based on discussions with the government about what the priorities are. 
They can also recommend, if your priority is for example to strengthen the TB tracker, based on the whole assessment, the status of the foundational areas and the status of the aggregate areas, what steps are needed to have a successful implementation of the TB tracker. That may involve improving the server, improving the general capacity, buying more devices, or improving connectivity. There are a lot of steps needed to get to those goals, and that's part of this report, this executive summary, where the HISP group provides some recommendations on how to reach them. And I think that could feed into a more detailed plan. We're also working with the Global Fund and Gavi, which of course are funding a lot of the HIS strengthening plans in countries through their country grants. So they are also part of this process: they know the tool, and they can recognize the areas that should be prioritized in a Global Fund grant or in a Gavi grant. And that can also help if you need to get funding for HMIS: in discussions with the EPI program when you discuss the Gavi grant, or in discussions with the HIV, TB and malaria programs when you discuss the Global Fund grant, you can have this kind of system strengthening, these HIS/HMIS strengthening activities, prioritized in those projects. Okay, I'd like to continue and talk about how we can support the country in a more sustainable way, how we can make the system more long-term, because we know that the partners will some day withdraw and the government will have to continue. So yes, we are not political people; we are data people. We have a feeling that, all the time, when the system has a problem, we need to request the HISP team to support. In the government we only have some local staff, or they are contract staff. 
We also need data people; when a project ends, those staff need to go to another job. So let's talk about the government: what kind of team are we going to build? We need to start thinking now about how many people we need to build within the ministry of health, because we also have students at the National University of Health Sciences, and we have people currently working in the ministry of health who are still young and can continue. So what fields do we need to start thinking about, and which people should we send abroad for training, so that whenever they finish they can come back, take over from the HISP team, continue to build the capacity, and keep the system functioning? Because the government has already adopted it as the main platform for the ministry of health. So now we need to keep the momentum so that we can make it sustainable. In the past this was all supported, because the government has a lot of partners to support it: computers, internet, it doesn't matter. But we worry about the people who are going to maintain the system. Of course, we cannot rely forever on the HISP teams being there to support us. And I don't disagree with Hannah that we may need others; even Norway also needs the private sector to support the system. But as for the political people, we need to make them understand that somehow we need our own people to maintain the system. We cannot make it happen immediately; it will take some time. I don't know if we can look at how to build up people along those lines. Apart from that, this is a really good tool for us. Thank you. So I think I'll try to address that in my presentation a little bit more, just so you can see the makeup of some roles and how we can support that kind of long-term sustainability of system strengthening a little bit more. So, yeah. So last June, we had the HMIS conference in Oslo. And we had a session on very much this topic. 
And we invited some of the countries that, if we use the term here, are mature, that have been using DHIS2 for 10, 15, 20 years, that have very strong in-government teams and are, let's say, less reliant on HISP day-to-day support. So we had countries like South Africa, Ghana and Rwanda, and we challenged them a bit to talk about what it took to become this mature and more self-sustainable. There were, of course, a lot of different details, but some of the common factors were very much around leadership and governance: that you have a long-term, consistent plan focusing on building that capacity year after year, instead of constantly changing path. There was one good example from Rwanda, where since 2011 they have been very systematic in defining what core team they need, what the skills are, and what the training opportunities are, sending the right people to academies and gradually building up that capacity, and also using those skills to do trainings and build capacity over time. I think "over time" is really the key phrase here. And Sherad will talk more about how you can do more specific planning around capacity. I think we should get started so we can see some of the other contributing factors to this maturity assessment. So that maturity assessment is quite long, but even within it there are other components that you need to look at in a little more detail to get the results to fill it in. Even though it's quite comprehensive and you're able to assign scores, the scores need to be evidence-based. So we need to actually perform some other kind of legwork to get the information we need in order to assign a score. It's not just something we can typically discuss and come up with; it requires a little bit more insight than that. 
So I'm just going to talk about three contributory assessments to the maturity profile. The first is a metadata assessment. We've talked about data quality previously; there's also this aspect of metadata quality, and we'll briefly discuss why this is a problem. We did discuss data quality, and I'll present very quickly some data quality assessment frameworks and tools, not all the different features and things like that, but templates for reports and things of that nature that can help you when you're performing these. And then we'll also talk about the core team needs analysis. This is all about long-term capacity building, strengthening, and planning for different positions. And when we refer to capacity building, it actually extends beyond training, right? These are not interchangeable terms. One part is human resource planning, where we're trying to figure out what the different roles and responsibilities are and how you retain that staff; the other part is the training component, which is building people's skills up over time. Okay, so we'll talk about the metadata assessment first. This is basically looking at the quality of your configuration, for the most part. We looked at data quality, and we can think about the same thing for our configuration. I'm going a bit quick; I'll post all these frameworks and tools online. So when we're talking about this metadata assessment, there are really two ways we can look at it. One is reactive: there's a problem in our system and we need to assess it. And for most mature systems, this is where we are. The system has been going for many years, we have all kinds of things configured in our DHIS2 configuration, and we need to figure out what the problem areas are. The second is proactive. And this is better both for new systems and for old systems once we fix them. 
How do we stop this from happening again? Let's talk about the reactive measures first, because for most mature systems this is the process we have to undergo. We have some tools built inside of DHIS2, we have built a specific tool for this, and we also have some manual review procedures; not everything can be automated. So why do we have to look at this, and what effect does it have? I've listed a couple of points here on the various types of individuals it might affect. For the end user in particular, we've had some mention of data use, and this has been a common thing for many people I've talked to: it really creates confusion when they go to create data outputs, because they're not able to get the data they need, they're not able to find the indicators they need, and they can't really trust the values they're looking at. Also, anytime they want to disaggregate those outputs, by sex, by age, or by any other particular disaggregation, this can often create a lot of problems. And they might not even be able to see those items to access them. Then on the data quality side, there are quite a lot of problems that poor metadata can create: looking at long-term trends over time, trying to figure out your completeness when things are unassigned or assigned incorrectly; there's a multitude of issues that can result from this. Then from the administrator side as well, there are a lot of problems; I've listed some here just to give some food for thought. These poor configuration practices can have a wide-ranging waterfall effect on your system and create a number of different challenges. So here's a quick example of an assessment that my colleague Olaf performed some time ago, where he was looking at the analytics side. 
And in this system, there were all these dashboards; 50% had nothing on them, and almost 90% of them weren't shared with anybody, so no one could utilize them except the single person that made them. A couple were shared with individual users. And then there were all kinds of other problems with the way things were named and organized, so it was very hard to find anything in the system itself. If this is how things are set up in the system, it makes data use very challenging. One thing of course is understanding the data, but if you're not even able to make a basic chart or graph or map, you're going to have a lot of trouble interpreting that information. So this is how problems with the metadata configuration can cause all of those other challenges that we talked about. So we've created a tool, and I'll show it in a second, basically to help us assess the metadata in the system. Let me pull it up. Okay, so we've created this tool basically to help us perform this assessment, because before, it was kind of difficult to assess some of this configuration on our own. We run this against DHIS2 instances. It can be any instance you have, your malaria instance or your HMIS instance. It's good to actually assess your instances separately, just to check whether there are shared problems, and if there are, what we do to fix them. So we get this summary report. If we scroll down, you see there are a number of issues that we identify. Each of these comes with a level of priority, a basic description, and the number of occurrences within the system itself. I'm just going to scroll down a little bit. Some of these are a little technical, but don't worry about that; it's more about the principle. You can see that each issue has a level of criticality, if you will. 
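A dashboard audit like the one described above can be sketched as a small script. The field names and access strings below imitate the shape of DHIS2 dashboard metadata, but the sample data and the `audit` helper are invented for illustration, not a real API call.

```python
# Sketch: summarizing dashboard problems from DHIS2-style metadata.
# The sample records mimic what a dashboards metadata export can look
# like; the data itself is made up for this example.

dashboards = [
    {"name": "Malaria",   "dashboardItems": [],       "sharing": {"public": "--------", "userGroups": {}}},
    {"name": "HIV",       "dashboardItems": [{}, {}], "sharing": {"public": "--------", "userGroups": {}}},
    {"name": "National",  "dashboardItems": [{}],     "sharing": {"public": "r-------", "userGroups": {"g1": {}}}},
    {"name": "EPI draft", "dashboardItems": [],       "sharing": {"public": "--------", "userGroups": {}}},
]

def audit(dashboards):
    """Return the share of dashboards that are empty or not shared with anyone."""
    total = len(dashboards)
    empty = sum(1 for d in dashboards if not d["dashboardItems"])
    unshared = sum(
        1 for d in dashboards
        if d["sharing"]["public"].startswith("--") and not d["sharing"]["userGroups"]
    )
    return {
        "total": total,
        "pct_empty": round(100 * empty / total),
        "pct_unshared": round(100 * unshared / total),
    }

print(audit(dashboards))  # half the dashboards are empty, most are unshared
```

Run against a real export, the same two numbers reproduce the "50% empty, 90% unshared" kind of finding mentioned above.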
The most critical ones usually need to be fixed as soon as possible. We give a number, let's say there are none in this system, and then a description, and we do this for each of these issues. So it's quite a detailed report, but the idea is for someone to filter through it. You shouldn't really present this to any decision-making body or any partner agency; rather, you would work through it and write a summary of what your most critical issues are. You want to triage and prioritize those that are most relevant to you. Of course, you'd want to document all of these problems, because you might want to take care of them at some point, but you really want to have some mechanism to prioritize and evaluate them effectively. Fixing these is, of course, another challenge. So we offer this tool, and you can get a little support in running it on your system, but once it's set up you can get the results fairly quickly and review them together with somebody. And it integrates all the automatic checks that I've talked about: this includes the data integrity checks inside of DHIS2, which you can run at the same time if you want, along with all the additional checks that we've added, which gives us this automated report on the configuration. It will look at a number of different aspects of your system. I apologize, I'm going a bit quick, I know, but I've posted this example report in the Google Drive, so you can have a look at the full details of the report and see all the different information it contains. Some of it is a little technical, some of it is not too bad. There are a lot of terms that maybe not everyone will recognize, but some are all about the analytics side as well, like dashboards that haven't been looked at, charts that haven't been looked at in years, data sets that have no data. 
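The triage step described here, working through the report and surfacing the most critical findings first, can be sketched as follows. The issue names, severity labels, and counts are invented for illustration; the actual report uses its own categories.

```python
# Sketch: triaging metadata-assessment findings by severity.
# The issues and severity labels below are hypothetical examples.

issues = [
    {"name": "Data elements with no analysis",       "severity": "WARNING",  "count": 120},
    {"name": "Invalid indicator expressions",        "severity": "CRITICAL", "count": 4},
    {"name": "Category options not in any category", "severity": "SEVERE",   "count": 11},
    {"name": "Orphaned dashboards",                  "severity": "INFO",     "count": 57},
    {"name": "Data sets with no data",               "severity": "WARNING",  "count": 0},
]

# Most urgent first; an issue with a count of zero needs no action.
ORDER = {"CRITICAL": 0, "SEVERE": 1, "WARNING": 2, "INFO": 3}

def triage(issues, max_items=2):
    """Return the names of the most urgent non-empty findings."""
    found = [i for i in issues if i["count"] > 0]
    found.sort(key=lambda i: ORDER[i["severity"]])
    return [i["name"] for i in found[:max_items]]

print(triage(issues))
```

The point is the mechanism: filter out non-findings, sort by criticality, and summarize the top items for the team rather than handing anyone the raw report.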
So things like that, most people can recognize. And we just use this to support our assessment of metadata, for the maturity profile of course, but this is also something you can run fairly routinely on a development system to assess the health of your overall configuration. We don't want to end up where many systems are right now, where there are so many issues that it's overwhelming to manage. Of course, that's something we can still help with. But if you run this routinely, just like a data quality check, you can keep track of the health of your system, and when you see some of these issues pop up you can take care of them. They will happen, these things do happen; you just want to try to get ahead of them. Then we also have the other side of things, which is the proactive side. This is more about preventing those challenges from happening, right? These are procedures, training, mitigating measures we can implement either before or after. You might find those issues, fix them all, and then not want them to happen again; or you might just be starting from scratch and want to implement the best practices you can in order to prevent those things from happening in the first place. So as an example, we have some SOPs, standard operating procedures, available that give some guidance on considerations to make to prevent these challenges from happening in the first place. We're still working on packaging these appropriately. I'm happy to share this example with you, of course, but we're still working on writing several guiding documents about all of this, basically to prevent some of the challenges that we identify via the assessment from happening in the first place. 
And this is kind of the next level of implementation support, because in academies and things like that, or even local trainings, you learn where to click and how to do something inside of DHIS2. But this is much more of a coordinated procedure, something you can follow and implement at scale so you prevent some of these things from happening. As an example, this one is for adding aggregate metadata, and there's all kinds of information on the category model, on data sets, on user groups, on sharing, things of that nature. This example uses specific groups, but it's just a template that can be changed. We do this in order to prevent those things from happening in the first place, and to make sure, of course, that this helps people access their data. The whole idea is not just so we can have a clean DHIS2; of course that's great, but the end goal is really to support data quality and data use and make it easier for users to access the information they need. All right, so when we perform this assessment, especially in mature, large systems, the number of issues is often too much to deal with all at once. So we need to prioritize the issues that were identified. As an example, I've listed some issues; I'm not going to go through each exact issue, but let's say we have a handful of them. Oftentimes, experienced implementation team members can advise you on what is a higher priority versus what is something nice to fix but a little time consuming, which you can park for now and come back to later. Of course, you want to document that, like I said, so you can come back to it when it does need to be fixed. But the idea is to prioritize, because you will often find some quite critical issues that need to be taken care of, and that can sometimes be challenging to fix in and of itself. 
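One way an SOP like the one described above becomes enforceable at scale is as a small pre-import check. The naming convention, pattern, and sample metadata below are entirely hypothetical; a real SOP would define its own rules for names, groups, and sharing.

```python
# Sketch: checking that new metadata follows a (hypothetical) SOP
# naming convention before import, e.g. "<PROGRAM CODE> - <Description>".
import re

# Assumed convention: 2-10 uppercase letters, " - ", then a description.
PATTERN = re.compile(r"^[A-Z]{2,10} - .+")

new_data_elements = [
    {"name": "MAL - Confirmed cases"},
    {"name": "confirmed malaria cases"},  # violates the convention
    {"name": "EPI - BCG doses given"},
]

def violations(elements):
    """Return the names of elements that break the naming convention."""
    return [e["name"] for e in elements if not PATTERN.match(e["name"])]

print(violations(new_data_elements))
```

Running a check like this on every batch of new metadata is the "proactive" side: it catches deviations before they accumulate into the kind of backlog the assessment finds later.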
And then, with these procedures that I mentioned, in response to the most critical issues you've identified, it's often a good idea to write some of these procedures (I'll share the procedure for one aspect with you), help the team, and perform some training on implementing those procedures to prevent these issues from happening again. So here's the overall process of the assessment from start to finish. It can take a little bit of time or it can be quick; it just depends on the results, how organized everything is, and the setting in which you're performing it. But you want to make sure you involve the relevant team members and keep them updated, and at the beginning define the scope of the assessment. Then you have to identify the extent of all these problems, through some of the tools that can help you with that, as well as some of the manual review processes. I didn't talk about those, but I can share some information on them. Throughout this whole process you want to make sure that you're continuously presenting the findings back. Chancely mentioned the involvement of external individuals; especially if external support is helping you with this, it's a good idea that they're continuously updating you on the progress internally, so you can present this back to whoever needs to know about these things. Through that assessment, you then want to identify strategies to mitigate these problems. That can be very challenging. Right now we're still working on some toolkits to help countries implement detailed mitigation measures for the problems found. We don't have all of that as yet, but we are working towards building more detailed tools that will help countries solve these problems. 
Of course, we have some small areas where we can offer some support, but we are still working on that and hope to have some more information soon. Then you want to prioritize these, implement them in the system, and of course present back your findings. You can run the assessment again; it's a pre- and post-evaluation, of course, noting that if you implement a fix, the issue shouldn't pop up in the assessment anymore. That's how both ends of the process give you a check on your results. And like I said, it's something we can help you run. All right, switching gears a little bit, and I apologize if I'm moving around a little, but the next thing I wanted to talk about is the data quality assessment. This is now looking at the data itself; before we were looking at the configuration, now we're touching on the data. Within the maturity assessment that Sora presented, actually reviewing the data itself is not a requirement. But for the overall health of the system, it's obviously a very good idea to see the quality of that data. In the maturity assessment, what you're looking at is more the procedures and having forums for discussion of data quality. But there are many other areas where this type of assessment should be performed, especially if you're doing any training on data use, because you can't really be confident in your data if there are issues with its quality. So what do we typically want to assess when we're performing a data quality review? We can look at it in two different ways. We can perform a rapid assessment, and there are some things we can do very quickly regardless of our system, at least in my opinion. 
And then there are more detailed assessments, where you can add on extra components; that can take a little bit more time, but it can give you more insight into your data quality as well. We can use a number of DHIS2 features to assess this. Of course there are other tools as well, but DHIS2 can really support a lot of this. I have a presentation that I shared earlier today on this; the reason these in particular take a little bit longer is that you may need external data sources of some kind to support them. The outcome of these assessments is a little more challenging to define. Often, the data quality challenges that we see are a result of challenges at the lowest level of data collection. Sometimes it can be very simple, maybe just a simple data entry error, but sometimes it's more serious: health workers might not actually understand service delivery guidelines and are misclassifying services for that reason. In that case, that's a more serious area where other interventions will be needed. But performing this should also help generate some discussion around implementing, for example, some of the DHIS2 features to support routinely reviewing data quality. If this is being done and the results are poor, it may be because the data quality hasn't been looked at in a while. It should also lead to drafting things like manuals for actually using these features, training guidelines, and standard operating procedures: how often should the data be reviewed? Who should review it? How does the data get changed? All those things that seem small, but if no one is assigned, they don't really get done. And then, of course, the actual training of staff in both the process and the tools. As an example as well, I will upload this template. 
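One of the quick checks a rapid assessment can include is flagging extreme values, similar in spirit to the outlier analysis DHIS2 offers. The threshold and the monthly series below are illustrative, not drawn from any real dataset.

```python
# Sketch: a simple z-score outlier check on a reported monthly series.
# A value more than `threshold` standard deviations from the mean is
# flagged for follow-up; the data and threshold are illustrative.
import statistics

def outliers(values, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    if sd == 0:
        return []
    return [v for v in values if abs(v - mean) / sd > threshold]

# Twelve months of (hypothetical) confirmed cases; one obvious data entry error.
monthly_cases = [102, 98, 110, 95, 105, 100, 940, 99, 103, 97, 101, 104]
print(outliers(monthly_cases))  # the 940 stands out
```

A flagged value is not automatically wrong; as noted above, it may be a simple entry error or a genuine event, which is exactly why the follow-up discussion at the collection level matters.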
So this is an example of a template report using some of the DHIS2 tools. There's also some guidance on setting things up before you perform the assessment. And this is just for a rapid assessment, not for an extended, long assessment: just some concrete guidance on how you set things up, what some typical measures are that you can look at, how you might want to frame your report, and what type of recommendations you might consider making. Of course, this is just a suggestion, but I know it's often hard when you're staring at a blank page: where do you actually start, right? So it's not really meant to be copied word for word, but it's meant to give you a sense, an idea, of what type of information you could produce when you perform this type of report. So I'll upload this; it is a template, so there are areas that you can replace. I've tried to give, although I notice it's not highlighted well in this PDF, some distinct suggestions on where these various measures should be run and at what level of your system you should look, just to account for the issue brought up before, this issue of sensitivity, where problems that are visible at the facility level you won't really see at a higher level. So I make some suggestions there as well. This just helps us assess our data quickly, and of course, like I said, there can be more comprehensive assessments. So this comes back to Chancely's comments; now we're moving on to the next one, the core team assessment. I apologize, I have a lot of different assessments and tools that I'm presenting very quickly, but like I said, I will share them all online. So this concept of a DHIS2 core team has been around for a very long time, and others can probably speak to it better than me, but when we talk about this idea of a core team, what is it actually? 
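The sensitivity point, that a problem visible per facility can disappear in a higher-level figure, can be shown with a tiny completeness calculation. The facilities and reporting counts are invented for illustration.

```python
# Sketch: reporting completeness at two levels. A facility with very
# poor reporting can hide inside a reasonable-looking district average.
# All figures are hypothetical.

facilities = {
    "Facility A": {"expected": 12, "received": 12},
    "Facility B": {"expected": 12, "received": 3},   # the real problem
    "Facility C": {"expected": 12, "received": 12},
}

def completeness(expected, received):
    """Percentage of expected reports actually received."""
    return round(100 * received / expected)

per_facility = {
    name: completeness(f["expected"], f["received"])
    for name, f in facilities.items()
}
district = completeness(
    sum(f["expected"] for f in facilities.values()),
    sum(f["received"] for f in facilities.values()),
)
print(per_facility, district)
```

The district figure of 75% looks tolerable, while Facility B sits at 25%: hence the template's suggestion to run certain measures at the facility level, not only at aggregated levels.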
So we mean a combination of experts, both within and outside of the government, who are responsible for overseeing the DHIS2 implementation across all domains within the country. Within health, for example, it doesn't mean just the HIV program or the malaria program; we're looking at an integrated system, with people who can support these different program areas. And of course you can have program staff supporting as well, but you really want this kind of overarching, system-wide contribution. We suggest that these core teams exist in every country, really because we want countries to build the necessary technical capacity to maintain their DHIS2 implementation over time. And the assessment of this core team capacity is one of the blocks in our maturity profile, so it directly contributes to the result of that tool, and it's of course useful for many other purposes. So we've tried, together, to identify a couple of key roles within this DHIS2 core team. Where we can, we've noted that some are optional or maybe less of a priority at the beginning, while others are required to get things moving. So we've identified these different staff who can contribute to this core team. This includes what we call a DHIS2 operational lead; I'll take one role and try to explain it a little bit more later. DHIS2 implementation experts, who are working on the configuration and maintaining DHIS2 over time. I think I have security listed twice, sorry about that: someone looking after security, and trainers. And you don't need one distinct person per role; roles can be shared, of course. The implementation person might also be doing some training, for example. A server administrator, and subject matter experts, people who are knowledgeable in the use of the data, whichever domain or health program area you're looking at. 
Program managers, who are actually responsible for managing projects, budgets, et cetera. And app developers; this is one of the optional roles. So what does this core team actually do? They support the DHIS2 implementation overall, and there are many different functions that they perform. They interact with users, sometimes even users in the field. They help design the actual DHIS2 database. Sometimes they act more in a role of harmonizing indicators, looking across various data sets, forms, individual paper registration forms, all the different workflows that are available. They design reports and support the creation of dashboards. They also support the maintenance of DHIS2: all the configuration, the server components, hosting, integration with other systems (we will have a session on integration tomorrow), cleaning up the database, as we talked about with the metadata assessment earlier, and upgrading DHIS2. They perform capacity building, which is a big area and a big effort: training end users, training staff in the field. And of course we have the DHIS2 network, which is meant to support these local teams in implementing these actions. So in this area, where we're trying to assess the core team, what are we actually trying to do? This exercise is trying to outline the gap between the current skill sets and the required ones. Maybe in your mind the requirement is simply that they need to do everything, but we need to break that down into smaller components and understand: what are the specific skill areas they need to build on, and where are they right now? 
Because we need a baseline, to make sure we can develop a training pathway or training protocol for that individual that will actually work in practice, and not just be scattered, selecting topics that we want them to learn about but that they might not be ready for. We need to do it in a way that makes sense over time. So we have a tool for this as well, and I'm going to explain one of the roles I showed earlier together with the tool. For each role in the core team that I've described, we have a detailed list of skills, and they're assessed on a scale of one to four. There's a description of what each of these levels means, but at its most basic, one is the lowest level and four is the highest. Based on this assessment, we can identify areas that need specific attention. There aren't really recommendations given based on the scores in the different areas as such; it's much more about where your priorities are. It lets you understand where those skill levels lie, and maybe you're even missing people entirely for a certain profile, and then what do we do from there? So the idea is to develop a structured training plan. When we work in this area of training, it's often a little ad hoc: there'll be a request for a training, or there'll be projects with certain training aspects tied to them, and we go ahead and implement those. But really what we're looking to do is look at this with a much broader view and a much longer timeline, knowing that by the end it will be much more sustainable than where we are now, where there are some challenges. So scores of one or two might need specific attention; however, they might also be lower priority. So what is the outcome of this? When we finish, what are we getting out of it? 
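The scoring logic described above, comparing where a person is on the one-to-four scale against where the role needs them to be, can be sketched like this. The role, the skill names, and the required levels are invented for illustration; the real tool defines its own skill lists per role.

```python
# Sketch: turning 1-4 skill scores into a training-priority list.
# Skills and levels are hypothetical; 1 is lowest, 4 is highest.

# Required level for a (hypothetical) DHIS2 implementer role,
# and one person's assessed current levels.
required = {"server administration": 3, "metadata configuration": 4, "basic budgeting": 2}
current  = {"server administration": 1, "metadata configuration": 3, "basic budgeting": 2}

def training_gaps(required, current):
    """Return skills below the required level, biggest gap first.

    A skill with no current score is treated as level 1 (not achieved).
    """
    gaps = {skill: required[skill] - current.get(skill, 1) for skill in required}
    return sorted((s for s in gaps if gaps[s] > 0), key=lambda s: -gaps[s])

print(training_gaps(required, current))
```

This is the baseline-to-plan step: the output orders the skills to prioritize in a structured training plan, rather than picking training topics ad hoc.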
So we have to review the assessment results and decide what the priority areas are for training, and also for human resource development. We should not separate these two concepts, because when people say capacity building, they often think only about the training aspect. It's also actually tied to the people that are there: we need the right people, and we need mechanisms to retain them, to evaluate their skillset, and to make sure they're managed appropriately. As an example, there might be entire roles where no one has even been identified as performing them, and therefore no current skills exist in the country. In that case, what do you do? Do you select someone existing and build those skills over time, or do you select someone new and build their skillset up from the baseline? So there are some considerations to make here. Okay, so what we've tried to do for each of the roles I identified on that list is create a profile of sorts. This includes a description of what they do, some of the typical work tasks they will perform, and some of the qualifications they will have. We're still working on this, and I apologize that it's been in this draft state, but we're trying to finalize it. As an example, I've taken one role, the DHIS2 implementer. It's a working draft and it's meant as a template, to give people background and more detail on what each profile should contain and what the typical tasks are. Because the next step is the assessment of skills. So as an example, we take each role identified in that slide deck and build out a profile for them in terms of the typical operations they would perform, to give everyone a better sense. 
Because just listing the roles is often not enough; it doesn't give us any real clarity. These still need to be reviewed a little, but then we have, as far as we can, example work tasks. And I say examples because this doesn't mean this is exactly what they're going to do. Maybe it will change a little, especially with input: everyone has to review internally what the actual tasks are that persons will perform, with different contextual considerations, and different experts might also contribute to reviewing these and making changes. So we've tried to be as detailed as we can without being overwhelming, much more to provide a menu of items that people can select from, or get an idea from, when they're trying to figure out who can actually do this work, or who the right person is. Another thing I'm trying to explain is that you're not really looking for the perfect person with all of these qualities; you're looking for someone who has the potential to get there. It's unrealistic to expect someone to come in with all of these things already, right? So for each of those roles, we've created a profile. They're still drafts, so I apologize, but I will upload them so you can have a look and get a better sense of each of the role types you're trying to represent. Okay, this is just one of them for now. Similarly, for each of the roles we've also developed this baseline, what we call a needs assessment. In this tool we've listed all the roles, the same roles I talked about and that we have profiles for, and for each I've gone ahead and listed a number of different skills that the person should have.
And whether those skills exist or not, that's one thing: if they don't exist, of course you can't assess them, but if they do, you can go through this assessment item by item. I know it's a bit hard to see, but the tool is also online. The scores run from one to four, and for each area there's a description of what each level actually means. If a skill is "not achieved", what does that mean for the person? What does "early understanding" mean? I still have to fill these in for some of them. So these are the categories we have: not achieved, early understanding, adequate, and mature. It follows similar wording to the maturity profile, a little different, but we've also provided explanations of what each of those categorizations means, so it can help you assess the skills of the individual. Then for each skill we have a quick label and a more detailed description, and this relates back to the skill profile we've created for each job. The idea is to go through this, assess the person's skills, and see where they are. It's not meant as a judgment; it's much more meant to give us a baseline, a place to start. If this person, for example a program manager, doesn't understand anything about basic budgeting for DHIS2, that's not a criticism, but it's something that might need to be worked on quite a bit, and that's where we start. And like Orla said, there are some more tools we can provide, and of course we can provide some extra guidance as well.
But if this person is very mature, very comfortable budgeting DHIS2 implementations and understands all the inputs that are required, that's probably an area you don't need to focus on so much compared to some of the other areas. And for all these profiles we have a similar arrangement. So for example the implementer, the role profile I pulled up before: same thing, we have all these different skills in configuring aggregate, configuring tracker, working with data quality, and so on. Depending on where they fall in those areas, we can help you identify some of these priorities. So this is meant as a tool to give you a much more structured way of assessing the skill set of an individual, and a way to document it. And of course, you could perform this evaluation more than once. But it's really meant to feed into a larger training plan, something you could build over a couple of years, where you can take this person and walk them through many different areas. It also helps provide some evidence of where you are now, and some justification when, say, you're requesting that someone come and perform a certain type of training in your country. It helps to say: well, we've done this assessment of their skills, there are really some areas that could be strengthened, and we've identified some of them here. Okay, so that was very quick. I know I've jumped through many different tools, and I apologize for that, but I will end the presentation for now. If there are any questions, just feel free to ask them in the comments. Thank you very much.
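The step of feeding the assessment into a training plan can be sketched as a simple ordering: address the weakest skills first, leave mature ones alone. The skill names and scores below are hypothetical, not from the real toolkit.

```python
# Illustrative sketch: order assessed skills from weakest to strongest so a
# multi-year training plan can address the biggest gaps first.
def training_priorities(skill_scores, mature_level=4):
    """Return (skill, score) pairs sorted ascending by score, excluding
    skills already at the mature level, which need no focus."""
    gaps = [(skill, score) for skill, score in skill_scores.items()
            if score < mature_level]
    return sorted(gaps, key=lambda item: item[1])

# Example scores for one individual (made-up values).
scores = {
    "budgeting for DHIS2": 1,
    "configure tracker": 2,
    "configure aggregate": 4,
}

for skill, score in training_priorities(scores):
    print(f"priority (score {score}): {skill}")
# priority (score 1): budgeting for DHIS2
# priority (score 2): configure tracker
```

The mature "configure aggregate" skill is filtered out, matching the point that areas the person is already comfortable with don't need the same attention.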
[Audience question, inaudible.] Yes, it's online.