All right, well, welcome everybody to today's talk. We want to thank Mary for joining us and giving us her time today to talk about monitoring and evaluation, both of which are really important to think about for your research projects or just research that you might be doing. She's going to start by giving us her bio. If you have questions, you can either send them to me through the chat or put them in the Q&A, and I'll share them with Mary. All right, thanks, Mary. My pleasure. Let me see if I can share my screen right now. Are people seeing that? Yeah. Okay, it's full screen, it looks good. Yep. Okay, perfect. Well, thanks so much, Heather, for organizing this. I think it's wonderful. It's unfortunate that everybody's in their own little separate corner of the world right now, but I'm glad we're able to come together to talk about interesting topics and hopefully learn a little bit about monitoring and evaluation today. So as Heather said, I do have a background in monitoring and evaluation. While I was a Tufts student, I did a bunch of IGL programs, and EPIIC was probably the highlight. I graduated with a bachelor's in international relations and a concentration in environmental economics, and I went straight to SAIS, the School of Advanced International Studies at Johns Hopkins, where I did a master's degree in international economics and international development. When I came out of SAIS, I really believed that the best way to do good development work was to have great data and good research behind what works and what doesn't. So my first big job out of SAIS was working for Innovations for Poverty Action, which maybe some of you are familiar with. It does big evaluation projects, mostly for bilateral aid donors.
My project was working with USAID in Uganda, designing an evaluation for a country-wide food and nutrition project, specifically a nutrition education project. For that, I did a lot of data collection, six months of data collection in a row. And that was when I learned I really wanted to try to do something a little more direct, a little less research-based and a little more hands-on, doing more project implementation. So I switched from development work and moved toward humanitarian work. I didn't know too much about the humanitarian field at the time, so if anybody has questions about that, I'd be happy to field those as well. Essentially, I went on ReliefWeb and applied to all the jobs, and I ended up getting a job in South Sudan doing camp management. And since my background was in monitoring and evaluation and I have a strong skill set there, I ended up transitioning into M&E at the country level. This organization calls the role an AMEU manager, an Appraisal, Monitoring and Evaluation Unit manager. So that was in South Sudan, and then I did essentially the same job for Iraq and Northeast Syria. In that position, I was managing a team of expats and national staff and designing AME plans for all of our different projects. Those were two of our larger country programs, so it was many millions of dollars, with lots of baseline evaluations, end line evaluations, and day-to-day monitoring and reporting. I did that for about five years, and then I decided it was time to go back to school. So now I'm finishing up my third year as a PhD candidate at Johns Hopkins SAIS. I'm in the African Studies Department, and I'm planning my field work in South Sudan, looking at the relationship between administrative unit proliferation and inter-communal conflict.
That research is going to require a lot of focus group discussions and key informant interviews, and I'm also using a large conflict event data set from ACLED. So it's a mix of qualitative and quantitative methods. And as you can see from the slide, I've done some other smaller projects in various countries. So that's me. What we're here to talk about today, though, is what AME is. I realized one of my friends didn't even know that abbreviation yesterday, so if anybody asks what you know about AME, you'll at least know they're talking about monitoring and evaluation. Wikipedia gets a bad rap, but I looked through a bunch of definitions and this is a good one, so perhaps we can take a second to break it down a little. Monitoring and evaluation is used to assess the performance of projects, institutions and programs set up by governments, international organizations and NGOs. Its goal is to improve current and future management of outputs, outcomes and impact. Monitoring is a continuous assessment of programs based on early detailed information on the progress or delay of the ongoing assessed activities. Evaluation is an examination concerning the relevance, effectiveness, efficiency and impact of activities in the light of specified objectives. So monitoring is usually an ongoing process, whereas an evaluation is usually done at the end to determine how it all went, at a very basic level. Now, you might be wondering, why would I do all of this work if I don't have to? Well, the reality is you often do have to, because if you are not providing your own funding for whatever kind of project you're doing, donors usually want to see how the project went, and that all equates to monitoring and evaluation. Even if it's simple (how many trainings did you provide? how many shelter kits did you give out?), that's pretty basic M&E.
A lot of donors want to know a little more: how did people's attitudes change, what did they actually learn in the training, what are the community's feelings about a given issue, and did your project have any effect on those? All of that falls into what donors want to hear about to make sure their money is being well spent. On the other side, you as an NGO worker or an individual want to make sure that your project is actually doing what you set out for it to do, and you want to find out about problems as soon as you can so that you can change course if you need to. The very bottom line is you don't want to be making the situation worse for anyone. Unfortunately, in some cases I have seen wells built in the middle of rivers, so you've now spent all this money and you can't even use the well. You want to be monitoring so you can identify early that this is not a successful project and change tack as soon as possible. As a corollary to that, M&E is a great way to improve your accountability to, the hot lingo here, affected populations. Accountability to affected populations: the people on the ground receiving your program are the real reason you're doing it, and the reports you write should ultimately be going back to them. Doing all of this work is a great way to stay accountable. And much like the other reasons, you want to make your project even better for next time. If there are weaknesses, you want to improve the project design so you can do a better job going forward. And if it's all right, I think I'll just save questions till the end, because I think a lot will be covered here. Okay, great. So then, what does this nebulous idea of M&E look like? You know you're monitoring, you know you're evaluating something, but what does that actually mean? A lot of it is not very glamorous.
It is sitting at your computer, in your office or at home, planning out what you're going to do and trying to identify what the problems are going to be. On the back end, it's the analysis. You've collected all this data, either a ton of paper surveys or a ton of electronic surveys, and you have to clean the data and do some analysis. And then of course an Excel sheet is not what you're going to share with donors or your boss or management; you're going to write a report. All of that is sitting at your desk. The other part of it is obviously the data collection: actually talking to people, observing your project, and getting the data that you want. That usually happens on the project site. So how do you do this data collection? There are a lot of different tools, and I'm sure you're familiar with some, if not all of them. Household surveys are a very popular tool. You go up to a shelter or a house, knock on the front door and say, hello, is there anyone here who can help me with this survey? Maybe you're looking for women, maybe you're looking for men, maybe you're looking for elderly people, and maybe that household has somebody with the time to take the survey. A lot of the time, yeah. As you run down the list of those, can you give an example of the type of project that might use each one? Yeah, I have that on my next slide. Okay, great. Yeah, sorry. And then for key informant interviews, same idea, but instead of checked boxes, you are going to somebody with specific knowledge or expertise in whatever you're looking for. Focus group discussions are with a group of community members or students or workers, whatever group you're looking for, and you're asking questions to the group.
An observational checklist is you going out with a clipboard and asking: is the fire hydrant in functional order, are the lights working, are people talking to each other, whatever your checklist includes. And then there's input monitoring and output monitoring: did this site physically get set up? Did the trucks come? Did the grain come? Did it get distributed? Not complex, but absolutely necessary. And then of course another in-the-office activity is the literature review, because the last thing we want to do is reinvent the wheel. Usually there's another NGO or bilateral agency that's already commissioned a report on whatever it is you're working on, so you want to make sure you've found it. The literature review is you Googling, or Google Scholaring, and making sure you're not going to get blindsided and that you're not missing something integral. So yeah, we have all of these different tools. So we have M&E for measuring and for explaining. You are usually trying to measure something with quantitative methods. Maybe you're familiar with qualitative versus quantitative methods; M&E breaks down into these two major categories. A lot of times you need to count things. You need to order preferences: do people like the blue buckets over the red buckets over the green buckets? You can just put check boxes. Then the other side: you built the well and it works, but you've now observed that only 30% of the community is using it. You can say 30% of the community is using the well, but that's not really telling you as much as you want to know. So now you want to explain why something is happening. You need to talk to people, and you need to use qualitative methods to determine what's actually happening. It's a lot more about the process rather than the outcome.
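To make the measuring half of that split concrete, here is a minimal sketch, with entirely invented survey responses, of how closed-ended answers become the kind of number you can report, and how that number then raises the qualitative question:

```python
# Hypothetical closed-ended responses to "Do you use the well?" (yes/no).
# Quantitative M&E turns these into a number you can report to a donor.
responses = ["yes", "no", "no", "yes", "no", "no", "no", "yes", "no", "no"]

def usage_rate(responses):
    """Share of respondents answering yes to a closed-ended question."""
    return sum(1 for r in responses if r == "yes") / len(responses)

rate = usage_rate(responses)  # 0.3, i.e. 30% of the community uses the well

# The number alone can't explain itself. A low rate is exactly the kind of
# finding you would take to qualitative methods (focus groups, key informant
# interviews) to learn *why* people aren't using the well.
needs_follow_up = rate < 0.5
```

The split mirrors the slide: the quantitative pass tells you it's 30%; the qualitative pass tells you why.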
All right, so for most projects you're going to use a combination of both. You're going to put both into your report writing, and you're going to put both into the design of how you're going to monitor and evaluate your projects. As Heather mentioned before, some tools are better than others for certain things. So we have qualitative tools and quantitative tools. The quantitative tools are usually closed-ended in nature. Think about a survey; maybe you've written some surveys in the past. You ask, how many times in the last week have you eaten apples? How many times in the last day have you washed your hands? There is an answer to these things, and you can record it on a smartphone or on a clipboard fairly easily, and it doesn't require a lot of narrative. So household surveys are good for this kind of quantitative work, as are observational checklists and basic input and output monitoring. In terms of the examples I was brainstorming: with household surveys, you're often covering a distribution. How many people received the distribution? And then you ask them: did you use the items in the kit? Did you use all of the items in the kit? How much did you like the items in the kit? You can design closed-ended questions for all of these things. And then you can be counting, or you can be asking people: did you use this water point? Did you go to school? Did you like the new latrines? How much did you like them? It's all the things that you can put into a format that can be ordered and measured. Observational checklists are often used as a monitoring tool rather than an evaluation tool, because you want to be doing them on a regular basis. While I was working in camps, as camp managers we would have observational checklists to make sure that the services were in functional order.
So I would have one of my assistants go around and ask: are latrines A, B, C, D and E in functional order? Is there soap at the latrine? Is there a line? Are they well marked? Are the tarps in good condition? If you identify any problems, and you're doing this on a daily or weekly basis, you bring them to the partner that should be addressing them, and hopefully the situation gets solved quickly. And then basic input-output monitoring is just what it sounds like: did you actually do what you were supposed to do? In a lot of cases I refer back to camp management, because that's what I've done the most of. A lot of our output indicators were: did you have this women's meeting? Did you have a youth meeting? Did you have a meeting of the chiefs? You have to record that you had the meeting by keeping an attendance list and taking notes from the meeting, and then you can say, yes, we did the meeting. On the other hand, you have qualitative tools. These usually use open-ended questions, and they focus on the why, the process: why did something happen the way it did? Key informant interviews are just what they sound like, interviews with a key informant. They're not man-on-the-street interviews where you're going up to anybody; you're going up to a targeted person. For my field work that's coming up, I'm not talking to everybody for my key informant interviews. I'm trying to find parliamentarians. I'm talking to journalists, political risk analysts, and civil society organization leaders. You have specific groups of people that you're targeting, and they have to be involved in whatever your geographic scope is: I'm not talking to political risk analysts in Russia, I'm talking to ones in South Sudan. You can also do this on a group basis, and when you do, it's called a focus group discussion.
You could do that with a group of experts, but if they're experts, you probably want to be doing it one-on-one. Whereas if you're talking to a group of citizens that have been the recipients of some policy, or you're trying to gain insight into community perspectives on a certain issue, you want to talk to a group of people. Maybe you want to talk to a group of women about certain issues, or a group of men, or a group of recent immigrants or recent arrivals to a camp. You pick out a group of people that fit whatever specifications you have, usually between six and eight people, and you interview them as a group for maybe an hour or an hour and a half. Sometimes, if you can sequence things, it's nice to do your quantitative data collection, look at your results, and then layer on your qualitative data collection. Say you have eight different sets of latrines in the camp, and you've asked people which latrines they use, and nobody is using latrines H and I. You have done your checklist, you know they're in good working order, so you have no idea why people aren't using them. So then you get a focus group together, or maybe you talk to the leader of that block in the camp, or you do both. And they say, well, Mary, this is controlled by a gang and nobody lets us use these, or, well, Mary, there are snakes and nobody can use them because one guy got bitten, or whatever the case may be. But you wouldn't necessarily know to ask those questions if you didn't already have your quantitative data. That's a luxury; you don't always get to do it, but if you do, it's something to keep in mind. These are just some pictures I've taken over the course of where I've worked; I know it's nice to visualize. In the top left, this is my office, in Erbil in Iraq, where you spend a lot of time. Geography is really important.
We were doing a lot of Mosul work, rapid needs assessments in each of the neighborhoods, to try to help NGOs target the neighborhoods that needed the most support, and then what kind of support: did they have water access? Did they need cash infusions? Did they need food distributions? My team and I would go in and do rapid needs assessments across these different sectors and then publish that data very quickly, work all night and have it out for the next day. We were doing bigger, longer-term projects as well, but that was key. In the top right, that's a camp that I helped open while serving a dual role of camp management and M&E manager. I keep talking about observational checklists, but part of that is often also taking photos. You're not necessarily going to remember everything, and you're not necessarily going to be able to describe everything; photos are a wonderful asset. You can see exactly how everything was set up a month later when you go to write your report, and you can see if everything is well branded. That's something a lot of donors like to see as well: if USAID gave the money, they want to see USAID on everything, and photos are a good way to document that sort of thing. In the bottom left, we have a quick survey being administered by one of our enumerators to a community member, and in the bottom right, a focus group discussion of men, also in Mosul. Once again, in the top left, there's more observational checklist-type work. You want to document that the water point is functioning. You can also see that there's some pooling; maybe the water partner has to think about extending it with concrete block or doing something to prevent too much mud. There's no line; at a lot of water points there's a long line, so that's a good sign. You'd also want to document what time of day the photo was taken. There's a lot you can learn from a photo; that is the point.
And then in the top right, another focus group discussion with men. In a lot of cases, unfortunately, there is a bias in certain countries toward men being able to participate in data collection, whether because of language barriers or time constraints; women don't have the same opportunities. The bottom left is the story of my life in so many places: power strip after power strip. I've ordered hundreds of phones at this point, basic Samsung Galaxy models. I use a data collection tool called KoboCollect, and it is Android only; outside of the US, Android phones are the norm. Depending on what country you're in, enumerators might have their own phones, but in most countries they don't, or the phones aren't reliable enough, so you supply your enumerators with phones and then you're responsible for charging them. So you'd better have a functional generator, inshallah. That is what I'm surrounded with all the time. And in the bottom right, we have data collection in action, and finally, at last, a woman being interviewed. In terms of who's doing all of this data collection, analysis and writing, the answer is: it depends. A lot of NGOs don't have an independent monitoring and evaluation team. If you're hired as the program officer or the program manager, you are also responsible for the M&E for the project: you have to design your own tools, do the data collection, and then do the reports as well. You can imagine there are some limitations to this. First of all, time and resources, but there's also less independence, because you're evaluating your own project. Of course you want your project to have succeeded, so there's mission creep, and there's just an incentive to make your project look good. I'm not saying it's there all the time, but it's a limitation.
Other NGOs have designed their budgets and team structures so that there's an M&E team in addition to the project management team, and as an M&E person, I've usually worked for NGOs that have those dedicated M&E teams. Don't get confused if you're looking at a job posting that doesn't have M&E officer as the title; there are a million and one ways that NGOs write out M&E. As you can see, I looked some up, and I've worked under a few of these different titles, and they essentially amount to the same thing. The A is usually for appraisal; the L is usually for learning. NGOs use a variety of words to make themselves sound cool sometimes. In this case, these teams are usually independent, or considered independent, branches of the NGO. They're one step removed from the project, so hopefully they aren't as biased. There are still significant limitations in that respect, though, because you're still working with, and living with, the project team, and you don't want your friends' work to be a failure either. The third group is independent consultants. These are usually professionals that want to be less tied down to one specific project, or maybe they're very good at doing one kind of evaluation, like country-wide evaluations of early childhood education programs. It's usually a big NGO or a UN agency hiring a consultant to do a big evaluation. You put together your whole program of what you would do if you got the contract, then you win the contract and do the evaluation. For some NGOs, this is the kind of person they rely on to do baseline and end line evaluations for their bigger projects. They don't expect their project managers to do it, and they don't have in-house capacity, so they budget $5,000 or $10,000 to hire somebody to come in for two weeks to write the survey, collect the data and do the analysis. That's how they do it.
So there are pros and cons to all of them, and usually NGOs use a combination of all three. As for when M&E happens, as you can probably already sense, it's happening all the time. Before the project starts, you are thinking about how you're going to evaluate it and what you're going to do every day to make sure it's on track. You're also thinking ahead and helping your grants management team write new proposals, because NGOs are never stagnant. If you stop, you're not going to get the next grant; you're going to go out of business, unfortunately. So you need to be thinking about what other kinds of grants you can take on, and keep moving. You're also doing needs assessments in potential areas to figure out what projects you can propose to donors that are necessary and that the community wants. And at the very beginning of a project, before any food has been distributed, before any trainings have taken place, you really want to do your baseline evaluation, because that's the only time you can capture a true baseline. Once the project is in motion, this is when your monitoring happens: you've set up your tools and you're doing your regular data collection. Humanitarian projects tend to be very short-term; maybe you get a grant for six months, maybe for a year. If you're doing development work, you might have a three-year or five-year project. You don't want to wait until the end of the five years to do something significant in terms of an evaluation, so maybe you do an evaluation after a year or two. That was what I was doing in Ethiopia for USAID. They had this endless emergency food distribution program that, at the time we evaluated it, had been going on for eight years with no end in sight. It was officially deemed a midterm evaluation even though there was no end in sight, but they really wanted to get some data. So maybe you're doing something like that.
You're also going to be contributing to whatever monitoring and reporting requirements the donor has. They always want updates on the indicators, and you're going to be responsible for that in most cases. And you're going to be working on accountability to affected populations as well: if your project has any feedback mechanisms, you're going to be working on those throughout the life of the project. At the end of the project, you've got to sum it all up. You've got to do your end line evaluation, your final reporting, and maybe a separate lessons-learned document as well. And at this point I'd like to note, for those folks that have done more stats and econometrics, that there is a difference between an end line assessment and an end line evaluation. NGOs and donors have a tendency to call everything an evaluation when, in the purest sense of the word, it's really more of an assessment, because in almost all cases of development and humanitarian work, you don't have a control group. I am not measuring the impact of my project against a group that could have received food and didn't; I'm measuring the change in the population that did receive food. It's a subtle difference, but we're not doing any kind of controlled comparison at all. That was what I did in Uganda with IPA: we had groups that received nutrition education and groups that didn't, and we did all of our assessments with both groups. But as an NGO, you're not seeking out an entirely different county and saying, can we survey you as well? That's not the expectation. So just so you know.
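That distinction can be shown with a little arithmetic. All of the numbers below are invented; the point is only the shape of the two calculations.

```python
# Invented baseline / end line scores on some nutrition-knowledge indicator.
treated_before, treated_after = 40.0, 55.0  # households that got the program
control_before, control_after = 41.0, 49.0  # comparison households (the IPA-style
                                            # design with a group that got nothing)

# End line *assessment*: change in the served population only. This is what
# most NGO "evaluations" actually report, since there is no control group.
assessment_change = treated_after - treated_before

# End line *evaluation* in the stricter sense: a difference-in-differences,
# netting out the change the comparison group experienced anyway.
evaluation_effect = (treated_after - treated_before) - (control_after - control_before)
```

Here the assessment would report a 15-point gain, while the controlled comparison attributes only 7 of those points to the program; the other 8 would have happened anyway.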
In terms of what this looks like in practice, I've talked about it a little, but I know when I walked out of Tufts, and even when I walked out of SAIS, I really had no idea what went into being a humanitarian aid worker or how the cycle worked. So, to make it maybe a little clearer in your mind: you're in South Sudan and you are the M&E officer, and USAID puts out a call for proposals. USAID has said, you know what, we have these camps and we need somebody to manage them. Your grants management team, which is just the people responsible for bringing in the money, and your country director say, we want to put together a proposal for this potential funding. You as the M&E officer say, great, I want to support this. So maybe you do a needs assessment: you go out to the camp and ask the community, what are your biggest needs? You observe, you talk to some other NGO workers, you find the gaps, and you bring that information back to your proposal team. Then they can write, hey, we did an entire needs assessment and these are the top five needs for the community. You also want to do the lit review in case anybody has done something similar, and look at what you've done before so you're learning from past projects. So you are part of this whole push to put together a nice proposal. At the same time, the grants management team is deciding, okay, we're going to do some seed and tool distributions here, build a well here, and do some vocational training there, and you are asking, how am I going to monitor activity A, activity B, activity C? You organize your broad plan for how that might work if you get the money, because USAID is going to want to see how you would monitor it.
So hopefully you win the grant, or you win some of the money, and then you usually should have started last month; they rarely give you the money on time. So you have to prepare for that and have your tools ready to go, so you can get into the community as soon as possible and get that baseline done. At the same time you're doing the baseline, you're also making sure that, if you said you're going to do a midline evaluation, that midline tool is written in good time, and if you said you're going to have observational checklists, those are ready to go. You're writing a lot of tools at the outset. Then as the project gets underway, like I said, if there's monthly reporting, or maybe even weekly reporting if you work with some UN agencies, you manage those requirements along the way. All right, so now you're in it. For those people that aren't as familiar with aid or humanitarian funding, there are big players, and it really just depends on how you do the numbers, but you're going to get funding from ECHO, the European Commission's humanitarian aid and civil protection department, and from USAID, or OFDA, the Office of US Foreign Disaster Assistance, the branch of USAID that works on humanitarian issues. You're probably going to get funding from Japan and from Canada; those are among the biggest donors. And then the other place you get funding from is the UN. It's not the UN's money per se; it's Germany's money being funneled through UNICEF, and UNICEF is picking the ten different project implementers across the country. So you're reporting to UNICEF, and UNICEF is reporting back to Germany. A lot of it looks like UN funding, and it is, but it's coming from the bilateral donor originally. So you have to be responsive to both groups, the bilateral donors and the UN donors.
Another thing I wish I'd had a better conceptualization of before I got there is what these projects actually are. There are a million different things you can think of to try to help people in need in humanitarian situations, but there's a fairly standard list of the biggest project types, and I've listed some of them here. You're doing food distributions, seed and tool distributions, and shelter distributions. You're constructing WASH facilities: mostly latrines, plus handwashing points and water points. In a lot of cases, people have set up camp in a place that doesn't have enough water, so NGOs are responsible for trucking water into the site on a daily basis. In some cases you're doing education in emergencies, the educational component for those partners. There's always healthcare in pretty much any humanitarian situation; if you're working for the healthcare partner, you might be doing vaccinations, nutrition programming, surgical work, outreach, messaging, lots of projects like that. If you're in the protection sector, you've got women's empowerment programming or child protection programming. It might be one-on-one support, or it might be that every person 60 and older gets a mattress, something like that; you're doing either population-level or individual-level protection work. And then you're doing infrastructure work. I've built a lot of bridges, and you build a lot of community infrastructure: meeting rooms and roads. You're always rehabilitating roads, because they're never concrete roads like we have in the US; they degrade with time and the elements. I built greenhouses, and when I say I, obviously I hired some people. I did not build my own greenhouse, but I was the project manager for a variety of things.
And sometimes you're building docks if you're next to a river, something like that. Schools are popular. And then maybe you're doing vocational services or technical training. So there's a lot of hairdressing projects or motorcycle repair projects or radio transmission projects. So trying to give people a pathway towards more sustainable jobs is a great use of funding. And then sometimes they're climate related. So there's a lot of resiliency programming that happens, whether that is planting trees or water catchment systems or irrigation. There's a variety of different projects. So resiliency projects are, unfortunately as we all know, more and more needed. So you've got your project, whatever it is. In this case, this is a resiliency program. And you want to think about what your project is doing and what it is trying to do. So you have a resiliency project. You've got a community that doesn't get enough rainfall that's being affected by climate change. What can we do to help them in the case that next year's rainfall is also terrible or there's conflict or there is some kind of, you know, I don't know, a global pandemic. How can we make them be more resilient? So maybe you're going to create committees to help people share resources. Maybe you're going to build some infrastructure. Maybe you're going to train people on how to be advocates to the local government so that they have more of a voice. Whatever it is your project is doing, you want to think about it step by step. So you have your inputs, so your trainings per se. So you're going to have, I don't know, your whiteboards, you're going to have your chairs, you're going to have your tables. Everything's got to be in place for training. And then your output is actually having that training. And then the intermediate outcome is hopefully that somebody learns something from your training, either who their local official is or that I have the right to a cleaner planet or whatever it is. 
And then your outcome is maybe that somebody did talk to their local official. And then the impact is hopefully that the community is less affected by a shock in the future because they are healthier or something like that. Health is a strong one. So you want to think about your project step by step. You don't get from point A to point F automatically. Your project hopefully is physically doing something and you want to think about how it all comes together. So using a log frame like this is a pretty typical way to make sure you're not skipping any steps. And then this presentation is, I think, in the chat. So don't worry, everybody can have it for as long as they want. So now you've thought about what your project is doing and hopefully how it's going to do it. So it's time to make an M&E framework. So a big step for your framework is picking indicators. I know I've talked a little bit about indicators before. In most cases, if you do have a donor, they're gonna either have a set of indicators that you can pick from or maybe specific indicators that you have to use. So if they've picked them out, okay, that's what you do. If they haven't picked them out, my big piece of advice is do not make up indicators. I promise no matter how original and creative and unique your project is, somebody has done something either identical or very similar in the past. And it is irresponsible and you can't compare results and you're not getting as much bang for your buck if you're making up indicators. So I put links to a couple of the sectoral indicator lists just to give you a sense that there is a very standard set of WASH indicators, a very standard set of food indicators, et cetera, et cetera. That doesn't mean you use all of them. That just means you don't sit in your bedroom and come up with indicators by yourself. Look for what's available. There are professionals that spend their lives coming up with good indicators. 
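The results chain she walks through, from inputs to outputs to outcomes to impact, can be sketched as a simple checklist. Everything below, the level names and the resiliency-project entries, is illustrative, not a standard schema:

```python
# A minimal sketch of the log frame described above, using the
# resiliency-project example. All entries are illustrative.
log_frame = {
    "inputs": ["whiteboards", "chairs", "tables", "trainer"],
    "outputs": ["advocacy training delivered to community members"],
    "intermediate_outcomes": ["participants can name their local official"],
    "outcomes": ["participants raise concerns with their local official"],
    "impacts": ["community is less affected by the next shock"],
}

# The point of the log frame: don't skip any step from A to F.
LEVELS = ["inputs", "outputs", "intermediate_outcomes", "outcomes", "impacts"]

def missing_levels(frame):
    """Return any results-chain levels that are empty or absent."""
    return [level for level in LEVELS if not frame.get(level)]

print(missing_levels(log_frame))  # an empty list means every step is covered
```

The check is trivial, but it mirrors what the log frame is for: forcing you to write down every link in the chain rather than jumping straight from activities to impact.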
So don't make up your own. So you do wanna make sure that they are well-defined, so it is very clear what you are trying to measure. You wanna make sure that they are context-appropriate; the indicators that measure childhood success in the classroom in New York are not the same for measuring childhood success in classrooms in South Sudan, maybe. So you wanna make sure that it makes sense, whatever you're measuring. You don't wanna make them up, like I said. And then I do have a link to a really nice little guide for how to put it together. So you might be familiar with SMART indicators. You definitely wanna make sure that all of your indicators are SMART. And so what is SMART? They are specific. They are measurable. They are achievable. They are relevant and they are time-bound. So you don't want multi-part or general indicators that say my project will make women smarter, or healthier and safer, or something. You can all identify a lot of weaknesses with that. You wanna quantify the women. You wanna quantify the timeframe your project is going to cover. And you can't have an indicator that has both health and safety. So you say the women that have been involved in trainings for the last six months will have feelings of increased safety by 10% or something like that. And that's not a great indicator, but the idea is there. You wanna make it stronger. So often the indicators that you're using are related to KAP. So what's a KAP survey? That's the bread and butter of a monitoring and evaluation specialist's toolkit. You're always doing KAP surveys. So you're talking about knowledge indicators, attitude indicators and practice indicators. So your project was doing a WASH training. So after the WASH training, you're going to ask people how many of the ways that we can transmit germs can you recall, something like that. And so people say we can transmit germs via our hands, via food, via XYZ. 
And so hopefully before and after the training, people can identify more after the training. And then you might ask people, how important do you think it is to wash your hands before eating? And maybe before the training, they said it's somewhat important. And hopefully after the training, they say it's very important. So they had a change in attitudes. And then after a month after the training, you are asking how many times in the last week did you wash your hands before eating? Something like that. That's not a great timeframe. You probably want to use like 24 hours. How many times in the last 24 hours before eating did you wash your hands? Something like that. So did they change their behavior? So now you've got great indicators. Congratulations, that's an accomplishment. So what do you do after you've got good indicators? You have to figure out when you're going to be putting them in place and measuring them and then who is going to be responsible for doing all of this work that goes around data collection. So how do you decide how we're going to do this? Most of the time it's dictated by your team. What are your team's capabilities? What are your capabilities? What is your budget? And maybe there's also some safety constraints or logistical constraints as well. So you've got to be realistic about what's possible. So if you have $500 for the whole project, you don't want to be saying, I'm going to do weekly assessments with 50 people. No, you want to say, I'm going to use an observational checklist that I am going to implement on a weekly basis. And then at the end, I am going to interview 50 people or something more realistic. You do want to keep it simple. You want to keep it streamlined. But at the end of the day, you want to make sure you do it, make sure you monitor and evaluate adequately. So in some cases, NGOs kind of consider M&E last when it comes to budget considerations. 
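The before-and-after KAP comparison she describes boils down to a percentage-point change per indicator. A small sketch, with made-up handwashing figures (the indicator names and numbers are invented for illustration):

```python
# Illustrative KAP (knowledge, attitudes, practices) results for the
# handwashing training example; all figures are made up.
baseline = {  # share of respondents, before the training
    "knows_3_transmission_routes": 0.25,   # knowledge
    "says_handwashing_very_important": 0.40,  # attitude
    "washed_hands_before_eating": 0.30,    # practice
}
endline = {   # share of respondents, after the training
    "knows_3_transmission_routes": 0.70,
    "says_handwashing_very_important": 0.85,
    "washed_hands_before_eating": 0.55,
}

def change(indicator):
    """Percentage-point change from baseline to endline."""
    return round((endline[indicator] - baseline[indicator]) * 100)

for ind in baseline:
    print(f"{ind}: {change(ind)} pp")
```

Keeping the three KAP levels separate matters: knowledge can jump while practice barely moves, and that gap is itself a finding.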
If that's the case with your NGO, you do need to be an advocate as well and say, I thought about this creatively. I thought of 10 different ways we could do it. I need at least this much time and staffing to get it done. So it can be a battle, not always. So then you're putting it together in this lovely framework. And I usually use Microsoft Word, that's very popular, and you're making a table. So you've made this table with your indicators and then you are sharing it with whoever needs to see it. So it's usually not just you alone in the world. And even if it is, you want to have your friend read it so that you haven't missed something crucial. So management reads it, the grants team reads it, and then you adjust accordingly. So once you have this finalized, you have to make your plan. So in terms of what the framework looks like, we have the indicators. We have the clear definition of what the indicator is. We have where we started. So maybe 10% of people considered handwashing very important. Our target is that 70% of people consider handwashing very important. We know that we're going to be measuring it with KAP surveys at baseline and endline. We're going to measure it twice. We're going to measure it at the beginning and the end. Who's going to measure it? It's going to be the M&E officer. And where will it be reported? It's going to be in the yearly report. All right. So you want to have that clearly done for every indicator that you have. So this was a little bit of an example, one that I found; this is a more American-based example. But here you have your indicator, with your big goal being the percentage of grade six primary students continuing on to high school. So they've defined it and they have their baseline. They have a modest yet important target of a 10% change that they want their program to achieve. Their data source is going to be the enrollment records. So that's a pretty cost-effective way of getting that data. 
You're going to be measuring it once a year. The program manager is going to measure it and then you are going to put it into your annual enrollment report. And then you're going to be breaking it down in terms of your outcome indicators and your output indicators as well. So outputs are the more basic level. Like I said, did the meeting happen? In this case, did the kids complete a summer reading camp? And then your outcome is hopefully that when they attended the summer reading camp, their reading got better. So your outcome is that reading proficiency improved. So you've got the levels of indicators there. So for every project, no matter how big or small, you're going to make one of these tables. This is another example with more traditional humanitarian indicators, but same idea. So now you've got this beautiful framework that you've put into a Word document that everybody has signed off on. But that doesn't really explain everything about how you're going to do it. So you have to make a more robust plan. So the M&E plans, for the projects that I do, they're usually 20, 30 pages long, and each project gets its own document. And then in your plan, not only do you have your framework, you have the budget for each one of the activities you're going to do, you have the timeline. So you've broken down when exactly this is going to happen. You have who's doing it. So it's not just me. I have Sebastian and Samantha and Kevin who are going to be leading it in area X, Y and Z. We're going to be hiring 10 enumerators. We're going to be doing a week-long training. We're going to require this many cars, this many phones, this much paper, this many, you know, whatever it is your project needs. So everything that you think you're going to need goes into your plan. And so for each activity, you've got the five W's of what's happening for that specific activity as well as the budget and the timeframe. And you just do it for every single activity. 
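One row of the framework table she describes (indicator, definition, baseline, target, data source, frequency, who measures, where reported) could be held as structured data, which also makes it easy to compute progress against the target. The field names and numbers below are illustrative, not a standard format:

```python
# A sketch of one row of the M&E framework table described above,
# using the handwashing example; all field names and values are illustrative.
framework = [{
    "indicator": "% of respondents who consider handwashing very important",
    "definition": "Share answering 'very important' on the KAP survey",
    "baseline": 10,  # percent, at project start
    "target": 70,    # percent, by project end
    "data_source": "KAP survey",
    "frequency": "baseline and endline",
    "responsible": "M&E officer",
    "reported_in": "yearly report",
}]

def progress(row, current):
    """Share of the baseline-to-target distance covered so far."""
    return (current - row["baseline"]) / (row["target"] - row["baseline"])

print(round(progress(framework[0], 40), 2))  # 0.5 means halfway to target
```

The same structure repeats for every indicator, which is why a single table (in Word, Excel, or anything else) works: each row answers the who, what, when, and where for one measurement.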
I like to break down the timeline for a project usually in Excel. So depending on the length of the project or how specific we need to be in terms of our timeline, you'll either have each day be a column or each week be a column. And then you have each activity be a row. And an activity is not baseline, baseline survey. Your activity is researching indicators on food security. It is refining questionnaire. It is programming questionnaire. It is pre-testing questionnaire. It is debugging questionnaire. It is, you know, there's a lot of little activities that go in and that's what you need to do to make sure that you're going to stay on track and help your team stay on track because you're not usually doing one of these. You're usually doing 30 of these. So everybody needs a clear guideline. So you've got this wonderful plan and it's got a great timeline and you're in budget. And so once you've got your document, you will share it again. You share it with everybody who is going to be useful in the success of this project. So it's not just you. If you're working in any kind of a dangerous environment, you might have a security team. So you wanna make sure that the security team signs off on everything. You probably don't have your own cars as a M&E person. So you're using either project cars or capital cars. So you wanna make sure that the cars are lined up. You wanna make sure that if you need any extra translators or any extra time in the county offices, everybody is aware that everything is happening. So this is for transparency and also making sure that it'll go smoothly. And then once everybody has read it and you've answered their questions and they've answered your questions, this is your plan. So this is what you use as your Bible going forward. Of course, things still change, but this is integral to making sure that you stay as close to being on track as possible. So once you get your plan, you're ready to rock and roll. 
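The Excel timeline she describes, one activity per row and one week (or day) per column, can be mocked up in a few lines. The activities and week spans below are invented examples in the spirit of the baseline-survey breakdown above:

```python
# A sketch of the Excel-style activity timeline described above: one row
# per activity, one column per week. Activities and week spans are illustrative.
activities = {
    "research food security indicators": (1, 1),
    "refine questionnaire":              (2, 2),
    "program questionnaire":             (3, 3),
    "pre-test and debug questionnaire":  (4, 5),
    "train enumerators":                 (6, 6),
    "collect baseline data":             (7, 9),
}  # values are (start_week, end_week), inclusive

def gantt_rows(plan, n_weeks=9):
    """One text row per activity: 'X' in active weeks, '.' otherwise."""
    return {name: "".join("X" if start <= w <= end else "."
                          for w in range(1, n_weeks + 1))
            for name, (start, end) in plan.items()}

for name, row in gantt_rows(activities).items():
    print(f"{name:36s} {row}")
```

The point isn't the tool; it's the granularity: "baseline survey" is not one row but six, so the team can see exactly which small step is due each week.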
So what else do M&E teams do in addition to creating and implementing these M&E frameworks and plans? Often, as I alluded to before, they're responsible for feedback mechanisms. So they're a key accountability tool in most development and humanitarian projects. So they can take the form of suggestion boxes or complaints desks or hotlines, depending on the context. I've worked in a lot of places without cell service. So a hotline doesn't make sense, but complaints desks in camps make a lot of sense because everybody is geographically close. So you have a complaints officer sit there all day and then people come up and say, you know, the latrine isn't working. I don't like my community leader. I didn't get the full ration of food, whatever it is. And then you try and address those complaints and bring them back to the appropriate source. And then at the same time, sometimes you're responsible for the feedback mechanisms about your own organization. So complaints about corruption or about harassment, something like that. Sometimes M&E teams can be responsible for that. Oftentimes those go straight to the corporate offices. So usually country-level M&E teams don't worry about that. Then at the same time you're doing all these projects, you're making mistakes, you're finding out what works. You are having great successes. You don't want to lose all that when the project manager leaves the job after a year. You wanna capture as much of this institutional knowledge before anything else happens. So you want to capture that in a lessons learned document. So, like me, you sit down over a cup of coffee with the project manager, you spend 15 minutes, and you say, you know, tell me the top five things that went right that we should be thinking about for the future. Tell me the top five things that were terrible. And, you know, often it can be working with certain vendors. It can be the way you organized activities. 
It can be the way you approached community leaders. So this is an invaluable tool for the next person who is trying to do this. So you wanna capture that. And a lot of what you're doing is data analysis and report writing, especially for people that are Tufts graduates. You're not gonna be the one that speaks Dinka and is out collecting surveys all day. You're gonna be the one that is writing those surveys in English and getting them translated. And then you've got the data and you're sitting there with Excel and you are doing the data processing. So you're gonna sit there and you're gonna do a lot of data analysis and report writing. In terms of the actual skill level you need when it comes to statistical analysis, I know that can be intimidating for some people. It is not overwhelmingly high. It was much higher when I worked for IPA. I did need advanced econometrics for that. I used Stata every day. That was far more challenging when it came to that side of operations. For normal NGO work, you don't need Stata. You don't need SPSS. You need to be good at making pivot tables. You need to have the basics of statistics so that you're not making simple errors in terms of the results that you're sharing, but this is not a brain trust. This is getting the process moving and then doing intermediate data analysis at most. And then in terms of how these reports look, your reports should speak for themselves. You're not gonna be there with the donor to explain everything. They need to be clear. They need to be well-written. They need to be aesthetically pleasing. You need to have captured the project and the challenges in a way that is appropriate and compelling to the donors. So being a good writer and a good communicator is, as with most things, a bonus. So if you are captivated by the promise of M&E, what might make you a better M&E person? So you need to be well-organized. 
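The pivot-table level of analysis she's describing, group by one column and average another, is easy to sketch without any spreadsheet. The survey records below are made up for illustration:

```python
# A pure-Python sketch of the pivot-table style summary described above:
# average knowledge score by gender, from made-up survey records.
from collections import defaultdict
from statistics import mean

records = [
    {"gender": "F", "score": 7}, {"gender": "F", "score": 9},
    {"gender": "M", "score": 6}, {"gender": "M", "score": 8},
    {"gender": "F", "score": 8},
]

def pivot_mean(rows, by, value):
    """Group rows by one column and average another, like an Excel pivot table."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[by]].append(row[value])
    return {key: mean(vals) for key, vals in groups.items()}

print(pivot_mean(records, "gender", "score"))  # {'F': 8, 'M': 7}
```

That really is the skill ceiling she's describing for day-to-day NGO reporting: disaggregating results by group and presenting the averages clearly, not regression work.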
You do need to be efficient and you need to be good at multitasking. Like I said, you're usually working on 10 things at once and you can't get frazzled. You need to just push ahead and try and use some time management skills. You do need strong communication skills. You are writing a lot of reports and you are managing a team. In most cases, if you're an M&E officer, you are managing the enumerators. If you're the M&E manager, you're managing your officers and whoever is doing the data collection. You do need to work in multicultural settings. I can't even think of a context where an M&E officer is not working in a multicultural setting. You do need to be able to organize and lead trainings. For every one of these baseline assessments or KAP surveys, you need to train your team on what exactly the question means. How do we translate it? What do we do if a respondent says XYZ? These are all things you cover in training, so you lead a lot of trainings. Also, like I said, you have to advocate for your department and your team, because your work is just as essential as, if not more essential than, that of the other departments, so you need to make sure that they get what they need. The more you know about the projects that you're evaluating, the better it is. It takes less time in terms of the questions and indicators that you are designing, you know what will work with the project manager's schedule, and you can be on top of it. You don't wanna miss that seed and tool distribution because you just didn't know that there was gonna be a spring distribution. The more you know about the project, the better it will go. You do need to have electronic data collection skills. Like I said, the one I prefer right now is Kobo Toolbox. It was ODK Collect way back in the day and then Kobo Collect. 
So the Toolbox is a great way to take your survey that you've drafted in a Word document and put it into the format that will go on the phone, and then bring it back from the phone to your computer fairly easily. So it's much less buggy than other software I've used and it's very intuitive. And it's free, so if you are gonna do a survey this summer or just wanna play around, you can create a free account and have some fun making some surveys, interview the two other people you're quarantining with. You should be comfortable working with Excel, like I said, and you should be able to put together a sample that's appropriate for your survey. There are a lot of tools online out there, so you don't need to do all the stats by hand, but you should not get caught at the end of your project and say, well, we can't say anything about women because we didn't actually interview enough women, or we can't say anything about children because we didn't interview enough children. So you wanna have enough knowledge to create a viable sample size. Of course, as with probably all work, a commitment to the work your organization is doing and the people it's working with is integral. Most of this work is not well-paid and it is not glamorous, so you have to be there for the right reasons, and being passionate about your work and the work your organization is doing goes a long way towards that. And you're not always gonna be in the office, so you are gonna be working in far-flung corners of wherever the projects are happening, so you have to be willing to get out of the office and into the field. You have to be willing to talk to county commissioners or community leaders or the recipients of your projects and really try and understand what's happening with your project. You need to be flexible, you need to be adaptable and you still need to get stuff done. 
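On the sample-size point, one standard approach for estimating a proportion is Cochran's formula with a finite population correction. This is one common convention among several, and the camp size in the example is invented:

```python
# A sketch of a standard sample-size calculation (Cochran's formula with
# a finite population correction). 1.96 is the z-score for 95% confidence;
# p=0.5 is the conservative default when the true proportion is unknown.
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Minimum sample for estimating a proportion at the given precision."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)        # finite population correction
    return math.ceil(n)

print(sample_size(2_000))  # roughly 323 households for a camp of 2,000
```

Note this gives the overall sample only; if you need to say something about women or children specifically, as she warns, each subgroup needs its own adequate count, which usually means stratifying the sample rather than hoping the subgroups fall out of a single draw.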
So the more creative you are and the more ability you have to troubleshoot problems, the better it's going to be and the more fun it is. So that is all I have for us today. That's my email and you're more than welcome to email me in the future. I'm more than happy to talk about M&E, and in the present, I would love to field some questions. Great, so questions, you can send them through the chat or through the Q&A. I guess, Mary, one thing thinking about it, for students currently at Tufts who are thinking about research or a research project, or for groups like Engineers Without Borders, what can they take from this to put into place? And also, if they have multi-year projects, how do they handle the transfer from one leadership team to the next? Yeah, I mean, I think it's never too soon to think about your evaluation tools. So even if you're not going to be able to get to the field per se in the near future, you do want to do the research on the kinds of projects you are thinking about so that you know the indicators that are out there, and maybe there's other students or there's other NGOs that have done similar work, so you don't want to reinvent the wheel. So that is certainly something you can do right now. I would also suggest if you are thinking about surveys at some point in the future, this is a great time to familiarize yourself with Kobo Toolbox so that you can be ready to go. If you do find pre-existing surveys or questionnaires out there, who knows, maybe you can even program them now so that they're super ready to go when the time comes. Yeah, and I mean, we can always be better with our interviewing skills; it doesn't necessarily come naturally. So you can always practice interviewing. There are really good tool guides about how to interview respectfully and appropriately in different contexts, so you want to make sure that you are psychologically prepared to do that and you don't have any faux pas there. All of those are some things you can do. 
Can you send links to those tool guides? Do you have those? I can look up some, yeah, I can also send ones that I rely on. One of the things is also, one of the questions is about how do you, like, you went to grad school, and what are the tools you came out with to be able to do what you did between Hopkins and going back for your PhD? Like, were you learning on the job? Did you come out with specific skills, you know? Oh yeah, there's a lot of learning on the job. I think, like I said, the first job I got out of SAIS, I thought was my dream job. It was working as a program coordinator for IPA. They are the most well-respected RCT evaluation organization for bilateral development work in the world. So they are the gold standard for what you're talking about when you talk about evaluations. So to get that job, I did need to have taken all the appropriate econometrics courses. I had done a variety of those as well as advanced econometrics and microeconomics of development. So that was a concrete skill. And then I think I also was able to get the job because I had studied abroad in Uganda. So they wanted somebody who had Uganda experience, and I had done some evaluation work at that point in Nicaragua before then. So these were small things, but for a first job, that's what it's all about. Since then, I have done less rigorous M&E work in the sense of the actual caliber of the econometrics involved with it. On the other hand, it's far harder because I spent six months writing one survey for IPA. I spend maybe a maximum of a week writing an average survey in the NGO world. And you have to process a lot more information, a lot more quickly in that case. So I wouldn't have needed econometrics classes to do the work I do as an M&E manager. On the other hand, the more experience you have leading teams and being a good communicator and having strong writing skills, the better you're going to be as an M&E person. 
One of the things that makes me attractive for M&E positions is I like to do public speaking. I like to go to different clusters, and clusters are how the NGO humanitarian space is organized. So there's a lot of cluster meetings. So I represent the organizations I work for well, and we always look like we're being very active in the field of M&E because we are, and also we're able and willing to talk about it and to share our work and present our work. So that's been very valuable. Great. Another question is, what are some of the common beginner mistakes that people make in moving through this? I mean, you talked a little bit about it in terms of designing indicators, but what are the things to watch out for, especially if you're evaluating your own project? Yeah, I mean, I don't think I can emphasize it enough. Don't make it up. I have done that. That's why I say it. We all think we are the first ones to do whatever project we're doing, especially maybe if you're working in another language. You know, I'm thinking of the work that I did in Nicaragua. I was working for a women's empowerment organization there and it was a unique project and we were doing a unique evaluation, but if I were doing it now, I would have looked up women's empowerment indicators. I would have looked up different kinds of evaluation tools that do exist. And I just didn't know enough to know that they did exist and that I didn't have to reinvent the wheel. In terms of beginner errors as well, I think it's sometimes harder than you think to properly train enumerators. So if it's just you doing the data collection, you can sometimes get away with doing a lot of things in your head. 
If it's just you, maybe you don't need to make it clear what the definition in each language is, but if you're doing it with a team or if there is going to be a translation component, I have gotten into situations where enumerators translated words differently, or the way I understood it was different than the way they understood it. So it is very easy to undertrain your team members. It can be so tedious because you've talked about a question for half an hour, but you do need to put that time in to make sure that the data you're collecting is actually good. You really can't overthink most of the components of putting together a good framework and a good plan. The more time and effort you put into it, the better the product is going to be. And does that go for, like, when you transition or a team is transitioning in and out, like making sure that, if you're gonna be taking data later, you're using the same kind of language? Absolutely. Yeah, so I mean, as we talked about having well-defined indicators, you should have well-defined activities as well. And the M&E plan should be very well-defined in terms of your sampling strategy and how you engaged with the people that you are engaging with. In a lot of cases, maybe you're interviewing every third house or every fifth house. Well, how did you decide what is your fifth house? Is it you walk to the center of the village and then walk north, or are you walking west, or what is it? And then what is your protocol for if somebody who doesn't live there answers the door? What is your protocol if somebody says, oh, come back later? Like, you have to have very well-defined plans for all eventualities. So that, yeah, having a handover plan is essential. I have been in plenty of situations where I've left halfway through because there are so many projects. So my handovers are extremely long and detailed. 
And they also rely on those M&E plans that we've put together with hopefully a great deal of detail as well. Great. One of the questions is also about what happens if you're indicated, you're expecting one outcome from a project and as you start to do the evaluation or the indicators are saying, oh, no, it's not going anywhere near the way you want it to. Dude, that happens. That happens a lot. Yeah. You know, I think that goes back to being realistic and having achievable indicators. And so the more you know about the context of where your project is happening, hopefully your indicators are gonna be more manageable. And maybe that goes back to the literature review and finding out what's already been done in that area. So hopefully not too many of your indicators are moving in the wrong direction. If it is a project where success is required for future funding, one of the ways that I try and make sure that our projects are measured accurately, but also in ways that will help promote success is you have two or three different ways of measuring the same thing, or you have grouped things in such a way that you can say a bunch of different things about your data. So it's definitely a bit of manipulation, but you also want to not rely on one question because you might also just have written a bad question or people are responding to it in a way or for some reason that you didn't expect. So you built a beautiful school, you ask people, do you like the school? Everybody says no, and then what do you do? You know, and they didn't like the school because of some reason totally unrelated to your project. But if you ask like, you know, how much more do you like having a school than not having a school, or how much has this improved your community life and how much have your children enjoyed the opportunity to attend school, you can ask a similar thing five different ways and that will help give you a better picture of what's really happening. 
And honestly, if your project is just not working, that's a great finding too. Always write up the negative results, because you're not the only one trying to do this or thinking about doing this, and it's gonna be incredibly valuable for the next person doing a lit review to know that this has been tried and, done this specific way, it isn't working. And then for our last question, we have: what's been your most challenging experience doing this? Well, I mean, I should say I worked about, I worked three years in South Sudan. So they've unfortunately been experiencing a civil war for the last eight years. And I've also worked in Mosul and I've worked in Northeast Syria. So I mean, I've been shot at a few times; that's always challenging, running for your life. Let's say the most challenging experience. You know what? I think sometimes the most challenging experiences are really, like, motivating your team and making sure that they stay on task, managing a bunch of different personalities. I've worked in South Sudan where my team is usually really responsive if not entirely high capacity. When I worked in Iraq, people were much more high capacity but much more temperamental. So I had a couple of staff members try to quit in the first week because they did some data collection and it was okay, but it wasn't great. So, you know, I told them like, this is great and these are some things that we can do better for next time. And they considered that such an affront to their dignity that they all quit on me. So then I had to beg them all to come back because I couldn't hire another team in a week. So, you know, I've definitely, you know, I've had malaria, I've had every kind of diarrheal illness out there. I have slept on the ground. I have slept with snakes. I have gotten rat bites. It's like, there's a lot of crazy things that can happen, but, you know, I think humans are sometimes the real wild card. 
Well, thank you for taking the time today and thank you for offering. And you said that if groups have questions about their own projects or students have questions about their research, it's okay for them to reach out to you. Yeah, 100%. You know, I would love to hear about your projects and help brainstorm what might be the right tools for you and what's manageable with your budgets and your timelines, because there's no one-size-fits-all approach to M&E. Yeah. Thank you. Thank you so much. No, my pleasure. Thanks. We'll see you soon. Bye. See you. Bye.