We just want to let you know that we will be recording this meeting, so that you can share the recording with your colleagues if you wish to cascade the training, and so that you can refer back to it afterwards. Just a few notes on GoToMeeting itself. You may mute your line if you're not speaking, so that everybody has a quiet line while somebody else is speaking. You can do that by clicking on the green button that you see on your screen. If you have any questions or comments during somebody's presentation, or if you'd like to put a comment or a question in the queue, you can use the chat box in GoToMeeting that you see on your screen. And if you're on social media, we invite you all to post on Facebook, Twitter, Instagram or any other social media channels that you use, with the hashtag that you see on the screen, which is #EndChildMarriage. And you can tag us at @girlsinspire and @co4g. So without further ado, I'd like to pass the mic to Mrs. Frances Ferrera, who is a senior advisor for the Women and Girls Initiative and the team leader for education at the Commonwealth of Learning.

Thank you, Cherise. Good morning to you in Vancouver and good evening to the rest of you in Asia and Africa. I'm currently at Mobility Africa, so it's four o'clock in the afternoon where I am. This session will be one hour. May I kindly request the two people whose lines are active, that is Richelle Karim and Ms. Banu, to click on the green button and mute their lines? Thank you very much. The session will be one hour. There are three presenters: myself, who will give you an overview of Girls Inspire's M&E agenda; then Cherise, the project coordinator for Girls Inspire, who will give you an overview of the process, tools and the platform we are using.
And then we have brought in Kuntal Di from the Mandeshi Foundation in India, who will provide us with practical experience of how they bring theory to practice in regard to M&E. So let me join Cherise in welcoming you all again. This session is about why good data is important, why M&E is important. It is Girls Inspire's vision to create enabling conditions for sustainable livelihoods for women and girls that will break the cycle of child, early and forced marriage. Therefore, good data is important to measure our progress. The first slide talks about the three critical functions of M&E. There are still two lines that are active that create a lot of noise. Who is this person? Can you kindly mute your line? The name appears as Mustafa; I don't know where he is from. And also Ms. Daisy, can you kindly mute your computer? So first of all, M&E lets us learn from past experience. M&E helps us to be accountable, but it also helps us to manage our projects effectively, because it can help us to improve our service delivery, to improve our planning and to allocate resources accordingly. In short, M&E helps us to make better informed decisions, to improve project performance and to achieve planned results. The next slide, please. Next slide, Cherise. Thank you. The monitoring and evaluation design, that is the next slide. Basically, the monitoring and evaluation design articulates what data is required and what methods are going to be used to collect and analyze this data. So first of all, we need to clearly articulate the purpose of the evaluation: in other words, why the evaluation is being conducted, at what particular point in time of the project, by whom, and who will use the information. Then, what is the focus of the evaluation? In other words, the key questions that the evaluation seeks to answer. I will talk a bit more about that a little later.
Also, the sources and methods for obtaining information; I have a slide that shows our data sources. The procedures that will be used to analyze and interpret data and report results; Cherise will reflect on that a bit. And the targets that must be reached for the project, which are captured in the performance management framework. And then the evidence that will be used to demonstrate project performance and results, the outputs and outcomes; I will also reflect on that a little later. Next slide, please. Thank you. The next thing that we should consider is the approach. The Girls Inspire project uses a results-based management approach combined with developmental evaluation. Developmental evaluation means that we also make use of case studies, observations and site visits. That is developmental evaluation. Results-based management is when we reflect on the results; I will show you in the next slide what that means. The results are what we measure: we reflect on the results and we measure them. Now, we are using a participatory approach because we want our evaluation to be relevant, we want it to be culturally sensitive, and we want it to be useful to our intended users. You would remember that we sent out the tools to the partners and we asked you to comment on them. We asked you to tell us whether they are culturally sensitive. But we also want you to take ownership of them in that process. Next slide, please. And because we are involving our partners, our approach and methodology can be utilized by development practitioners like you who have limited experience with M&E. You do not have to be an expert in M&E to work with us on our M&E plan. But it can also be used by experts in the field of M&E. Next slide, please. A very important focus of our work is gender equality, and therefore our M&E strategy has a strong emphasis on measuring gender equality.
On this slide, you will see a definition of gender equality from Global Affairs Canada, which is one of the sponsors of our project: women and men enjoy the same status and have equal opportunity to realize their full human rights and potential to contribute to national, political, economic, social and cultural development and to benefit... Wrong slide. Next one, please. What's happening to the slides? Yes. ...and to benefit from the results. Just leave that slide there. So, for data collection, it is critical to understand the lived realities and needs of women and girls. We know that in the past the main focus was men. But for us, to ensure that there is gender equality, we need to measure it; otherwise, we will not see how we make progress or do not make progress. And in our specific case, where our focus is on ending the cycle of child, early and forced marriage, this issue starts from a very early age. But the data that is available at this time focuses mainly on women and girls between the ages of 15 and 49. There is a paucity of data in regard to women and girls between the ages of 10 and 14, and we know from experience in the field that there are indeed a lot of girls in that bracket who are also affected by this. So for Girls Inspire, it is important to gather data on the whole spectrum that is affected: first of all, girls up to the age of 18 who are involved in or affected by CEFM, and then those over 18 who were already victims and are now facing the consequences, so we can see how we can help them. The next slide, please. The next slide: the theory of change. Now, why is the theory of change important? The theory of change, colleagues, is the basis, the starting point, the blueprint of our work. It indicates the logical sequence of our work.
But it is not only the blueprint of our work, because the theory of change tells us what the problem is, what our vision is, where we want to go. And in between, there is a causal relationship between what it is that we will do, our strategies, our team, et cetera, our outputs, up to our outcomes and then to our vision. So it gives us that reflection. But it also allows us an ongoing process of reflection to explore how the change is happening, because through the M&E results, through the results that we get from our data collection, we feed it back into our theory of change so that we can see how the changes have taken place. So the data feeds back into the theory of change, and that is very important. Those of you who have focused on or explored our theory of change before on our community of practice would have noticed that we have put some data in there for you to see how the change is happening. So that is why it is so important to start from the design, where we have the theory of change as our blueprint. From there, our theory of change goes into our performance management framework. The next slide, please. In this slide, you will see... the next one, please. Next slide. I'm not sure how clear this is from where you are; the previous one, please go back to the first one. That one. Kindly just stay there; I will guide you when to go to the next one. I'm not sure how clear this slide is, but we are recording the session so you can go back and look into this again, listen to what I'm saying and follow it again. Here, you remember I talked about how we use a results-based management approach for our M&E. This is what it is. In the very first column are our results. The second one is the indicators. The next one is the baseline data. Then the targets, the data sources, the data collection methods, the frequency, and the responsibility.
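For readers who find a concrete sketch helpful, the framework columns just listed (results, indicators, baseline data, targets, data sources, collection methods, frequency, responsibility) could be represented as a simple record. This is a hypothetical illustration only; the class name and all sample values below are invented, not actual Girls Inspire data.

```python
# Hypothetical sketch of one row of a performance measurement framework.
# The column names come from the talk; everything else is invented.
from dataclasses import dataclass

@dataclass
class PMFRow:
    result: str                     # the planned result (first column)
    indicators: list                # measurable indicators for that result
    baseline: int                   # baseline value before the project
    target: int                     # target to be reached
    data_sources: list              # where the evidence comes from
    collection_methods: list        # how the data is collected
    frequency: str                  # how often it is collected
    responsibility: str             # who is responsible for collecting it

row = PMFRow(
    result="Girls complete life-skills training",
    indicators=["number of girls completing the life-skills course"],
    baseline=0,
    target=500,
    data_sources=["attendance records", "registration forms"],
    collection_methods=["paper forms", "Fluid Survey"],
    frequency="quarterly",
    responsibility="partner M&E focal person",
)
```

The point of such a record is simply that every result travels together with the indicators and sources that justify it, which is what the framework slide shows.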
You remember in the beginning, I said that the monitoring and evaluation design will tell us what data is collected, at what time, by whom, and who is going to use it. This performance measurement framework outlines that for you in this format. The indicators are very important. They are those things in the second column that can be counted or measured and that provide a reliable means to measure progress towards our results, which are in the first column. The indicators tell us that so many girls will be trained, so many girls will... I'm not reading from there, I'm just making examples. This number of girls will complete life skills; this number of girls will find employment; et cetera. We can count that number of girls, so it should be measurable. In other words, our indicators should help us to credibly demonstrate progress, impact and accountability. We cannot just put indicators there or measure anything. That is why it is so important that proper planning takes place. From the theory of change, we go to the logic model, and from there to the performance framework, where we list everything accordingly. I want to go to the next slide, where you see the summary of all our results. On this slide, we see the results that we are measuring. As you go along, there is the ultimate outcome, the intermediate outcomes, the immediate outcomes, the outputs. These are our results. This is a sample of the outputs, but there are more outputs. And below the outputs, you will find the actions, the activities; those are the things that you are doing in the field. So for us to see whether we have achieved these results, that is where the indicators come in. But then it is important how we chose to define our results. Because, next slide, please, that first intermediate outcome is "increased access to safe, quality, gender-sensitive ODL opportunities for women and girls in selected countries". How do we make sure that we measure it correctly?
Because remember, I said the indicators should give us the credibility that we have measured it, so that we can be accountable at the end of the day. So that first result that we have there, we have defined it; we have broken it down into its variables, and for each variable there is a definition. From there, we have broken it down into the description that you can see there, for example "safe" and "quality": what is our understanding of them? And when you go to the last column, that is a very important column. That column lists the tools that you are using, that we are using, to measure each and every one of these results. So we cannot simply say a result is measured just like that. We have to make sure that we have some credibility and say, well, how do you measure this result? So in that last column, we refer to the tools that we are using, the data collection tools that you are using and putting on Fluid Survey, which Cherise will talk more about. That column tells you exactly which tools are used to measure which result. And that is the credibility that I am talking about, so that at the end of the day we can bring it all together. Next slide, please. The data collection is in three phases: we have the baseline data collection, we have the monitoring, and then we have the final internal evaluation. So right now, the majority of you, except for Spark, who I think have started with monitoring already in some instances, are still collecting baseline data. The baseline data is the information that we collect to understand where we are now. Remember the theory of change: before anything happened to this group, where are they? So that at the end of the day, we can say that because we have provided training to this group of girls, because we have provided training to this group of staff members, this is the change that has taken place.
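The role of the baseline described here, as the fixed point against which later monitoring data is compared, can be sketched in a few lines of code. The indicator names and numbers below are invented for illustration; they are not project figures.

```python
# Minimal sketch of comparing monitoring data against a baseline.
# Indicator names and counts are invented examples only.
baseline = {"safe_centres": 0, "girls_trained": 0}
monitoring = {"safe_centres": 4, "girls_trained": 220}

def change_since_baseline(before, after):
    """Return the change for each indicator relative to the baseline."""
    return {key: after[key] - before[key] for key in before}

delta = change_since_baseline(baseline, monitoring)
```

Without the baseline dictionary, the monitoring numbers alone could not demonstrate change, which is exactly the argument being made in the talk.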
When we go to the monitoring phase, you will see that now there are so many more girls, there are so many more centers that were established, et cetera. But the baseline may say there are no safe centers for girls. So in the monitoring and in the validation phase, we can go back and measure that against the baseline to see how the changes took place. So it is very important that we know where we start. The process of checking progress and quality over time, that is the monitoring phase. During the internal evaluation phase, we take the data gathered over the course of the project and we make a final assessment: we collect data one final time and we write up an evaluation report, and we can now measure that against where we started, look at all the results and say these are the changes that took place and this is why we are here. The next slide tells us the data sources, I think. The next slide, please. Yes. Where do we get our data from? We get our data from, remember I said earlier, the developmental way of collecting data: the real-time observations, the case studies. Then we have attendance data and activity records. Remember, recently we requested you to upload attendance records. That is valuable data, because it tells us the age groups of the girls, it tells us where these girls are, it tells us whether these girls attended a skills session, a life-skills session or a vocational-skills session, et cetera. Another source is lessons learned; again, this is developmental. And then surveys with institutions, surveys with girls and women, surveys with communities and with employers, and those are the tools that you are currently using. The next slide, please. The next slide shows the four tools that we are using; I think I have alluded to those already, and it will be over to Cherise to bring it all together. The final slide. Here on this slide is an example, and this is adapted from somebody else's presentation.
We will put the acknowledgement on it so that you know this was not done by me. This is how our M&E framework fits into the whole process. This is a cycle, our M&E cycle. We have the Girls Inspire plan, that is our strategic plan, and we have our results framework, the one that I showed you earlier, or part of it. From there, we develop our M&E plan. We do our monitoring and then we do our evaluation. The monitoring feeds into our data hub. Then we interpret this data, we make decisions, and it is fed back into our plan, so that if we need to make improvements, it goes back there. That is why it is a cycle. Here on the right-hand side, you will see that we use the data to report to the initiative's stakeholders and also to partners. That is how far I will go. I will now hand over to Cherise to continue the presentation into the more practical and more interesting part, where you are involved. Over to you, Cherise.

Thank you very much, Frances. Thank you for that overview. As Frances said, we just wanted to give you a snapshot, an example, of exactly how we pull all of that data together from all of those data collection sources that Frances was referring to and make sure that they are aligned within the performance measurement framework that she has also spoken about. What you see on the screen at the moment is what you saw earlier when Frances was speaking, which is our performance measurement framework. Specifically, we wanted to draw your attention to a specific result, which is the 1300 result. For the purposes of this presentation, we will try to map out exactly how we measure that result within the context of our data collection tools. For the 1300 result, which aims to measure enhanced economic leadership and family decision-making of women and girls, you will see, as Frances mentioned, what our indicators are, what our targets are, and we have mapped out what the data sources should be to gather data specifically for that result.
Here, again, you have seen this chart before, which is the definition of our indicators. You see at the bottom of the page where the 1300 result sits, how we have broken it down into two variables and how we have defined those variables. On the right-hand side of the screen, where you see the arrow pointing, is how we want to measure a specific variable. For the purposes of this presentation, it is the economic leadership variable that we want to measure. We determined that we need to measure that with survey questions, specifically the code that you see at the bottom, W6, which refers to the women and girls baseline templates and tools that all of you are familiar with within the Girls Inspire project, the baseline tool that Frances talked about earlier. That is the sixth question in the tool itself. This is a snapshot of that tool, and a lot of you are familiar with it as an M&E focal person within your organization or a project manager who has been training your staff and your own data collectors on these tools that we have been using for Girls Inspire. You would be familiar with this specific tool, and where you see the red circle is how we measure the 1300 result, in the sixth question of the women and girls survey. All of the data that we receive from here, and this is just a snapshot of how we receive it, comes in on Fluid Survey, which is what we decided to use for this specific project. It is the web-based platform that all of you are familiar with within Girls Inspire, and you will see here that we receive data from all the different sources. For example, in Tanzania, and Godfrey Manubi would be familiar with this, and Salima Panda as well, who is currently on the line, from the districts that they are working in. You would also see responses from Spark, from the different centers that they are working in, specifically which villages within those centers they work in.
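The mapping just walked through, result 1300's "economic leadership" variable measured by question W6 of the women and girls baseline tool, could be represented as a small lookup table. Only the 1300/W6 pairing comes from the presentation; the function and structure around it are a hypothetical sketch, not the project's actual tooling.

```python
# Sketch of a result-to-survey-question lookup, mirroring the example in
# the presentation: result 1300, variable "economic leadership", measured
# by question W6 of the women and girls baseline tool. The structure and
# function around that one mapping are illustrative only.
tool_map = {
    "1300": {
        "economic leadership": ["W6"],
    },
}

def questions_for(result_code, variable):
    """Return the survey question codes that provide evidence for a variable."""
    return tool_map.get(result_code, {}).get(variable, [])
```

Keeping such a table explicit is one way of getting the "credibility" Frances describes: for any result, you can point to exactly which question produces its evidence.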
So you will see responses from Rawalpindi, for example, and Peshawar, from the different communities. And you will see on the right-hand side the responses to that specific question number six, where they specified what kind of livelihood they are involved in at the moment or want to be involved in. This next slide talks about the other tools that we use to gather data that speaks directly to our results, and many of you are familiar with these tools. For example, there are the baseline and monitoring tools that we have just talked about; the registration forms for the girls and women; the attendance data form that you see on the screen at the moment; and the semi-annual reports. As Frances talked about earlier, the other sources would be the case studies, observations and lessons learned, and the specific materials, for example, that you are using in the field for the skills training and the life-skills training that you conduct for the girls and women. So these are all of the different tools that we gather data from. And as you know, we collect all of that data, and you upload it for us on different platforms like Fluid Survey, which we have talked about, and also Dropbox, where you place all of the materials. From there, we pull all of that data from all of those different sources and place it into what we call the data hub, which is what you saw in the diagram that Frances showed you earlier. This is where all of our data sits. So we work through all of the data that we receive from you, and what you see on the screen is a snapshot of that data hub. On the left-hand side, you will be able to see a snapshot of the performance measurement framework specifically and the results that we are speaking to and the numbers within the performance measurement framework: the results, the indicators, what our targets are for the next three years and what the targets are within the first year that we are working in.
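The data hub described here, which pulls records from many sources into one place and tallies them against targets, could be sketched roughly as follows. The partner names, counts and the year-one target are all invented for illustration; they are not Girls Inspire figures.

```python
# Rough sketch of tallying progress per partner against a shared target,
# as a data hub might. All names and numbers are invented examples.
records = [
    {"partner": "Partner A", "girls_reached": 120},
    {"partner": "Partner B", "girls_reached": 340},
    {"partner": "Partner A", "girls_reached": 60},
]
target_year_one = 600

# Aggregate uploads by partner, then compare the total against the target.
by_partner = {}
for rec in records:
    by_partner[rec["partner"]] = by_partner.get(rec["partner"], 0) + rec["girls_reached"]

total_reached = sum(by_partner.values())
progress = total_reached / target_year_one  # fraction of the year-one target
```

This mirrors the hub's two jobs as described: progress per country or organization, and how that "tallies up as our progress as a team".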
We are also trying to provide links to the evidence. That would link to, for example, the course materials that you have provided to us, which you will see here on the 1100 performance measurement result. And you will be able to see how we track our progress within each country and within each organization and how that tallies up as our progress as a team. So that is just a snapshot of how important data is to us and how we pull all of that data from all of those different sources into one place, which is our data hub. And as Frances mentioned, this helps us track our progress and feed that back into our strategy going forward in the project. So before we go forward, I would like to hand the microphone back to Frances to introduce our next guest speaker. As she said, he will be able to give you a real-life, on-the-ground snapshot of what monitoring and evaluation looks like in this context for Girls Inspire.

Thank you, Cherise. I forgot to unmute while I was talking, so my apologies for the time lapse here. I am pleased to introduce to you Mr. Kuntal Di. Kuntal is from the Mandeshi Foundation, and those of you who have heard him speak before know that he is very, very passionate about the work he is doing. He is a photographer, he is an evaluator, he is just a very passionate person, and he tries to capture everything he does. We were so inspired by the work that they are doing at the Mandeshi Foundation, and more specifically by the leadership provided by Kuntal, that we have asked him to share with you all in this presentation how they go about putting our work into practice. So without further ado, I will hand over the mic to you, Kuntal. And I guess Cherise will also assist you with the presentation from where she is in Vancouver. Welcome, Kuntal.

Thank you. Good morning, good afternoon and good evening to all of you. Cherise, could you put up the first slide that I had sent? Hello, can you hear me?
Hi, Kuntal, yes, we can hear you. The slide should be coming onto the screen shortly.

Without taking much time, I would say that Cherise and Frances have already given a very strong background of what we are trying to do and the importance of data and the evidence that supports the data we are providing. Well, to us, the designing of the program is something that integrates the approach to M&E. M&E, in some sense, is also how we look at whether the program is running on the course, on the path, that we envisaged at the beginning. What we try to do in designing the program is to break down exactly what we are trying to do and the supporting numbers. You can see from the slides that we first segmented our target groups vis-à-vis the level of learning that they are going to go through and also what they are going to get, in terms of whether it is basic training or advanced training, and also the medium of learning, which is very, very important for all of us, because we are primarily trying to use something called technology-mediated learning. We need to tag that technology part for each target group. So that is what we started with: we segmented the target groups in terms of levels of learning and the approach of learning, so that it is easier for us to track each group. Cherise, can we go to the next slide? Yeah. So of course, once we have the target groups, we need to find a strategy which lets us achieve the number that we have decided for each target group. And what we have done is first look at the community and the locality in terms of general social mobilization. But one important part of this general social mobilization was also to give a kind of snapshot of what we are going to do in future, because this is one of the issues that happens when we design a program or do mobilization: people try to understand, and there is always a question, so what? So what if we do this?
So we tried to create, and it is not really the right terminology in this context, but something like a learning objective: where are we trying to reach and where are we trying to go through our learning? That was an integral part of the entire mobilization and enrollment process. Cherise, can we go next? One of the reasons we decided to break the entire program down into three or four major phases is that it is very, very contextual, and as Frances rightly outlined, it has to be culture-sensitive, it has to be locale-sensitive and finally it has to be community-sensitive. So to understand whether we, or our program, are sensitive enough to address certain communities in certain locales, we decided that breaking the program down into phases would be much better, because this is the first time we are doing something like this. Experiment and failure are the only ways to learn. To learn, and to build the process of learning in for ourselves, we decided that instead of doing it in one go, where in one go we do the baseline, in one go we do enrollment and then start the training program, we broke it down into three different phases. Cherise, can you go next? At this juncture, of course, the data gathering is, as Cherise and Frances said, a continuing and integral process of the entire project, because monitoring and evaluation to us is not only something to showcase the success or the failure of the project; it also helps us in course correction if we are deviating from the path. Some of the deviations could even be contextual and absolutely necessary for the project to succeed. So to understand those deviations and to keep on track, continuous data gathering is absolutely essential. So we started mapping each one of the tasks that we are doing.
Since we have many branches, it was important to see that each branch has a benchmark of its own, plus, as an organization, Mandeshi has benchmarking and standardization by which every field implementer follows the process of implementation. So at baseline, we created a management information system which gives us an overview of which branch, which implementer, what village and what community is at what stage of the program. Even if it is just the baseline: where are we? This location in the project, this virtual location where we stand today, is very important for us, because otherwise we do not know how much of the path we have already covered and how much we are still to cover. So you can see on the left that baseline and enrollment mapping has already happened, but we are looking at further monitoring, for which we have identified the key parameters by which we can understand whether we are achieving the target. So attendance, interest and performance for a specific group of participants, and those map to certain abilities, certain outcomes vis-à-vis the participants, that we have drawn out on top, like the ability to showcase their skills. Now, unless we capture their attendance and interest, we do not really know whether they will be able to showcase their skills at the level that we want. So we broke it down again vis-à-vis the segments of participants that we created in the beginning and mapped the abilities that we are looking for at the end of the project. Can we go to the next slide? Can we go to the next slide, please?

Kuntal, I think there is just a bit of a delay when it moves to the next slide. What I am seeing on my screen at the moment says "quantity" at the top. Is that what you see now on your screen?

Oh, no worries. Thank you. So as we go ahead with the monitoring and evaluation, it is essential that we look at both aspects of the project: quantity and quality.
We should be able to identify what quality we are capturing and what quantity we are capturing. Reaching certain numbers may not be enough; those numbers need further breaking down in terms of the quality that has been achieved. This is what the outcome is. Parts of it will be intangible. A girl's ability to understand menstrual hygiene is something that is reflected, perhaps, when she becomes a mother. So that part of it cannot be captured within one year. But at least by knowing what level of understanding she has, we can understand and outline the quality that we want to achieve and whether we have achieved it or not. So again, for every segment, we have started breaking down the quality and the quantity and how we are going to achieve or record both. These are still at a preliminary stage; we are still developing our indexes, so it will take some more time, perhaps looking at the enrollment. And this is where the continuing data collection becomes so important, because the enrollment forms are our guideline for designing the indexes, the monitoring and evaluation indexes, for both quality and quantity. So out of 1700 girls, how many girls do we really want to understand the health and hygiene issues? When do we say, okay, we have succeeded, more or less? All 1700, a little less than 1700, or more? Similarly, what level of health and hygiene have they understood? Are they practicing? Are they influencing others? Those will be the indexes that we develop in the next stage. Cherise, can we move on to the next slide? So again, as we said, each one of the participant segments was broken down. Now, there is one particular segment which is very, very difficult to monitor and evaluate, because it is a large section of 4,000 young women who go through this program on community radio. One, it is very difficult to pinpoint and see that somebody is going through the program in the evening when it is aired, or in the afternoon when it is aired.
Second, the communities that we are targeting are the poorest of the poor, so they may not have a radio at home. So this becomes a really fuzzy area, and this is a big challenge that we have taken up, saying that no, we will have to use this wonderful tool which can reach a large number, so how do we track it? One of the things that we have tried to look into is what the approximate listenership of the radio can be, depending upon the number of villages it reaches. We created samples out of those villages, saying, okay, in every village can we look at 50 people and check whether they are going through the radio program every day or not? Then, what are the possible qualitative analyses that we can do? Is the neighborhood talking about the radio program? Are people who are not enrolled in the program listening to it? Are they interested, or have some of them come up saying, okay, I like this awareness part of it, but can I get trained more? Can I be a part of the original program that is happening through other media? These were some of the key issues, or key parameters, that we tried to define when it came to the community radio. Can we go to the next slide, please? So again, breaking it down: the enrollment, and the qualitative and quantitative means through which we capture evidence. Whereas regular attendance is data which shows participation and that the program is running well, we also needed photographs to support it, saying, okay, this is what is happening in the training sessions, and these are the videos that demonstrate that we are taking the sessions in the right direction. So each piece had a certain kind of qualitative evidence that was essential for us, and we defined the medium through which it is recorded accordingly. Again, going back to the radio program, which is the biggest headache as far as monitoring and evaluation is concerned.
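The village-sampling approach Kuntal describes, checking roughly 50 people per village and scaling up, amounts to a simple listenership estimate, which could be sketched as below. All figures are invented for illustration; they are not Mandeshi data.

```python
# Sketch of estimating radio listenership from per-village samples
# (about 50 people surveyed per village, scaled to village population).
# Village names and all figures are invented examples.
samples = {
    "Village 1": {"surveyed": 50, "listeners": 30, "population": 800},
    "Village 2": {"surveyed": 50, "listeners": 20, "population": 500},
}

def estimated_listeners(village_samples):
    """Scale each village's sampled listening rate to its population and sum."""
    total = 0
    for village in village_samples.values():
        rate = village["listeners"] / village["surveyed"]
        total += round(rate * village["population"])
    return total
```

With the numbers above, the estimate is 0.6 × 800 plus 0.4 × 500, that is, 680 listeners across the two villages. The qualitative questions that follow in the talk (is the neighborhood discussing the program, are non-enrolled people listening) would then sit alongside this quantitative estimate.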
We started saying, okay, can we at least see whether a group of girls who are provided with a radio are listening to the program every day, or can we do random interviews in the community, in the neighborhood, asking, have you heard this program? Did you like it? And so on and so forth. Can you move on to the next slide? So until now we have deployed two types of tools for recording data on the ground. We started the baseline before the FluidSurveys platform was set up, so we started with printed forms. Now we realize that each of the tools we use has certain positives and negatives. For example, if I take printed forms for the baseline, they require a minimum of training. The implementers have already been doing such things for a very long time; they are familiar with the process, so it is much faster. But without a digital version, it is difficult to collect such a huge amount of data, and difficult to analyze the data on a single platform. Similarly, each of the tools we deploy will have its advantages and disadvantages, whether it is an analog tool, that is, a physical tool, or a digital tool. Records from local institutions, which are readily available, are a great source of data. We went to the Anganwadis, which are like childcare units, to get the birth dates and ages of the girls we were talking to. But there is also a disadvantage, which all of us have faced: we cannot validate the data. We are not too sure if it is right or wrong. Cherise, next slide please. So when we migrated to the digital platform, again we started seeing that there are issues with each tool that we use: connectivity issues, intensive training and practice, and not all of them are comfortable yet with a touch-based interface. Some of them have used mobile phones, but old ones with buttons, and the touch-based interface takes its own time to get used to.
So, as I have said in a couple of earlier meetings, we had to find alternatives for each of them. What if plan A doesn't work? What do we bank upon? So we created a kind of matrix saying, okay, fine, if our FluidSurveys baseline forms cannot be filled in, can we send an Excel sheet via Dropbox? If the local translation is not possible on FluidSurveys or in Microsoft Word, can we use Google Docs in some other way? So again, the tools are based entirely on what we want to do, mapping their advantages and disadvantages against the needs we have, against what we are trying to do. We created a matrix saying: this is how we work; if this is the data and this is the platform, and it doesn't work, then there is a backup. This slide talks about it, and I have requested that it be shared with all of you so that you can see how the matrix is created, so that we understand, if this is not working, what to bank upon. And the last slide. Well, I think that is my take on monitoring and evaluation. It has to be part of what we want to achieve. Only when we break the achievement down into its basic components do we have a strategy for monitoring and evaluation, and that strategy reveals what kind of tool and what kind of approach we should take. So thank you. That's all. Thank you. Thank you very much, Kuntal. We would like to open the floor to questions and answers, but first I would like to hand over to Frances for reflections on the past presentations. Thank you very much, Kuntal. That was an excellent presentation. And I think that is what we wanted to share with the partners, and with others who are not directly partners but who are invested in this project through their own passion for the work we are doing: that it can happen if you have the passion and if you have a plan. And you have put it so clearly and so logically that I think you have inspired some of the other partners.
Everybody wants your presentation. The presentation will be part of the recording; however, we will send it to you as a separate standalone presentation too, because these are all OERs, and you should just acknowledge where you got them from. But I think we wanted to give you this teaser so that, if you are not inspired enough yet on your M&E, this should now be something you want to use. I specifically wanted Kuntal to focus, and he did, on how they do the radio recordings, because, Resvan, you and your colleagues are doing the night shows or the boat shows. So maybe you can get some pointers from what they are doing at Mandeshi to collect data from their radio programs. And some of the other partners in Mozambique, I know, will also use the radio. The pre-conference workshop in Malaysia later this year will be an opportunity where we can have more discussion on M&E, and specifically on these issues. Since we have only a few minutes left, I will allow a few questions. Cherise, if you don't mind taking the chair. Further questions you can put on the community of practice or on Basecamp. Thank you, Cherise, over to you. Thank you very much, Frances. We are opening the floor for questions and answers. I see one coming in from Resvan on the chat box, specifically addressed to Kuntal. Resvan, would you like to unmute your phone and address your question directly to Kuntal? Hi Kuntal, thanks for the presentation. You were trained as an engineer and industrial designer; then why did you decide to work in non-formal and informal education? Thank you, Resvan. Well, it's a long story, but I started primarily working with the craft sector. But then I realized that, you know, there is something that holds all of this together, and I think that is what really pushed me into this, saying, well, there has to be a common factor between everything that we do.
We can create policies, but unless we understand a policy, we perhaps do not appreciate it or do not follow it. And the only thing that can coherently put all these things together is education, is learning. So that is why I moved into what I do today. Does that answer your question? Yes, thank you very much. Thank you. Thank you, Kuntal, for sharing your background, and Resvan, for that question. Mostafa has put in a comment saying it is a very innovative process, and Sabine as well. Mostafa, would you like to speak more on that comment? Yes, it is a very nice design that Kuntal presented. I think the whole thing is a very innovative process. Monitoring and evaluation is a continuous process, one of continuous improvement. So I liked his presentation very much. It is very nice, and the scope for improvement is there. If you want to honor the theory of change, the scope for improvement should be there, because the plan may not always go as designed. If we do not redesign the plan as the process unfolds locally or in communities, it will not meet the ultimate goals of the project. So I think it is really nice; it is a totally innovative process. Monitoring and evaluation actually helps the process to be redesigned again and again, and continuous improvement is there. So thank you very much. Just a little comment: the whole thing is very clear today, and that is why there are no questions. Thank you. Thank you very much, Mostafa, and thank you for joining us from Bangladesh Open University. We really enjoy having you and others outside of the Girls Inspire project join us on our webinar sessions. It is great for knowledge sharing and capacity building, and for sharing our experiences. We are seeing a lot of comments coming in on the chat box, with Sabine and Sajida expressing their appreciation for the presentation that Kuntal has made.
If there are no other comments or questions, I'd like to thank you all and hand over to Frances to make the closing remarks. Ah, sorry. Thank you so much, Cherise and Kuntal, for co-presenting with me. I think it is a very exciting time for us, and I hope the theory that I explained at the beginning and the tools that Cherise explained have come together in Kuntal's presentation. But a very important thing I wanted you all to see is that while M&E is an integral part of the project, it should also be an integral part of the institution or the organizational plan. We cannot just follow instructions from COL. Kuntal has explained to us how they developed their own framework, their own plan, how they allocated the numbers, et cetera, so it is easy for them to do the job. As I mentioned earlier, hopefully later this year we can spend more time on M&E when we meet in Malaysia. Thank you so much, Kuntal, for your presentation and for the preparations you have made. The same to you, Cherise, and also to Jasmine, who made the preparations and prepared my presentation. Thank you so much. But most of all, thanks to each and every one of you who have joined us in this session; the session cannot take place without participants. Thank you so much for being here. And I have noticed that it is somebody's last day today at Spark and in this team. I am not sure which name I saw there; who is the person? Sajida? Or is it... Fatima. Fatima, thank you so much. It is the second time we are losing somebody; we just lost the M&E person in CMES as well. The last day, I think, was Tuesday or so. And now we are losing you. We really hope that the next person can quickly pick it all up, but as we know Spark and CMES, we know it will happen. We wish you all the best in your future endeavors, wherever you go. Thank you so much to everyone. And we will send you the link.
Cherise, Jasmine, thank you so much. Goodbye from my side. Thank you all. Thank you, Frances. Have a good evening, everyone. Thank you for joining us. And as Frances mentioned, we will be sharing the recording of the session and also Kuntal's presentation as an attachment. Thank you all, and have a good evening. Thanks, everyone, for the wonderful... Thank you, Godfrey. Okay, thank you and bye. Thank you, Kuntal. Thank you, Sajida.