Hello and welcome to the Impact Evaluation for Social Entrepreneurs workshop. I am Rachel Brophy, and I'm joined today by my colleague Carly Gallo. We are both at the Resolution Project, a nonprofit organization based in New York that works internationally with young social entrepreneurs who are starting, supporting, and launching their social ventures.

Hey everyone, I'm so excited, along with Rachel, to be giving this workshop today. We are going to try to make it as interactive as possible, so throughout our presentation we're going to pause for some interactive components where we'll work through various exercises to make sure this is both theoretical and practical. So definitely get paper and a pencil or pen and get ready to work with us. There will be designated times when we'll be asking questions, so you can use the Q&A function during those times.

So what is impact evaluation, also sometimes called metrics, evaluation, and learning? Impact evaluation assesses what changes can be attributed to a specific project or program. It involves comparing actual changes with expected changes (your goals), comparing achievements to a previous period of time, or measuring against the achievements of a similar project or program. Impact evaluation is directly tied to an organization's strategy and learning. Strategy can be defined as a roadmap to achieving an organization's mission in an effective and efficient way: in this process an organization will set goals, and then through impact evaluation you'll understand how it is achieving those goals. Learning happens when an organization adjusts its programs and strategies as needed, based on the impact evaluation data that is collected. So for example, at the Resolution Project we use reports from our fellows, who are our constituents, as well as from their guides, the volunteers with the organization who mentor our fellows.
And we collect these reports to assess the impact the fellowship is having on our individual fellows, and the impact their work, their social ventures, is having on their target beneficiaries in the communities they're looking to work in. And we use that data to help us improve the work we're doing: we use it to improve the fellowship, to refine the programming we're offering, and to make sure we're continuing to be relevant to the population we're working with.

Yeah, and to get a little more specific about how we collect this data: every six months we ask both our fellows and our volunteer mentors, whom we call guides, to fill out what we call a semi-annual report. Both of these reports are tailored to each party's experience. Through these reports we find out, like Rachel was saying, a lot of our programmatic enhancements. Each report is read by two staff members, and we take a lot of time to analyze and then learn from these findings. In addition, we also have something called a monthly guide report that we ask each guide on a fellow team to complete monthly. These reports are much shorter, but we are working with over 500 fellows in 81 countries, so they help us have a really good temperature read on how our fellows are doing: if there are any red flags, if there's any immediate resource allocation we need to tend to. Those are two really specific examples of how we're collecting impact data at the Resolution Project.

So let's go ahead and jump right in. How do we do this? Here is a good high-level overview of what we're going to be talking about during the workshop. First, setting your SMARTIE goals. We're going to talk about setting two to five SMARTIE goals, and the way we define SMARTIE here is strategic, measurable, ambitious, realistic, time-bound, inclusive, and equitable.
Those last two may actually be new to some folks who are familiar with impact evaluation, and we're going to talk about those soon. Most of these goals should relate to your venture's mission, and at least one of them should focus on running an effective venture. There is also the logic model: logic models are visual tools for mapping how a venture's activities will lead to impact, and creating one will help you think through the steps and assumptions in your venture. Then you're creating a measurement plan: measurement plans are tools for thinking through the information you will need to collect to show that your venture is making progress toward its goals. You'll also figure out how you're going to collect that data, and when. And then finally, monitor, measure, learn, and repeat. This will not be the only time we say this during this workshop: planning for impact evaluation is not enough. One of the most important parts of the job is to monitor what the venture's achievements are and use that data to inform how you are going to run your venture. So planning for it is not enough, collecting is not enough; you need to analyze and repeat.

Diving a little deeper into SMARTIE goals, here we have the acronym. The S stands for strategic, or specific, as you may have heard it. It reflects an important dimension of what your organization seeks to accomplish, whether programmatic or capacity-building priorities. Measurable we can define as including standards by which reasonable people can agree on whether the goal has been met. So when you're measuring something, this can be by numbers or by defined qualities; just make sure it's consistent. Then ambitious, or achievable: challenging enough that achieving it would be significant progress, a stretch for the organization. You don't want to give yourself a goal you know you achieved last year and just repeat it.
That's not enough; you need to stretch yourself a little, make it a little bit ambitious. But at the same time, moving on to our R, you need to make it realistic, and make sure it's relevant as well. Make sure it's challenging but also takes into account the resources you have available, especially if you're working on social ventures or nonprofits. Take into consideration what resources you have at your fingertips, and make sure it's possible to track this and worth your time and your team's energy to do so. Then time-bound: you would think it's probably one of the easiest ones, but I think it's one of the ones people forget most often, and it's one of the most important. So make sure your goals include a very clear deadline. At Resolution we do this by providing a month and a year, so "by August 2021 we will have achieved this," but you can also go by quarter. Just make sure it is time-bound. And then inclusive: it brings traditionally marginalized people, particularly those impacted by your work, into the process, activities, decision-making, and policy in a way that shares power. This part is critical, and it might be new to some folks out there who are familiar with impact evaluation. When it comes to inclusivity, at Resolution we've created something called a fellow council. This is a junior board of fellows that we touch base with to do temperature checks, to see if we're staying honest to our fellowship and actually creating strategic goals that meet their needs, and we often have them weigh in on what strategic goals should be shaping the fellowship. And then equitable: it includes an element of fairness or justice that seeks to address systemic injustice, inequity, or oppression. This one might be a little trickier, and it may take you a little bit longer to think through when it comes to your programming, but it's really important to include.
It's really critical to make sure you're including it. So like we said before, we're going to spend some time now giving you all a chance to jot down a goal for your project, organization, or initiative. We'll have the SMARTIE acronym up here, and we're going to give everyone about three minutes. If you don't have a project in mind, we've created a fictitious example for you: Handwashing for Health, which seeks to improve hygiene and reduce the spread of infectious diseases by promoting handwashing and other good hygiene practices. You can use that example throughout if that's helpful for you. We're going to give everyone three minutes now. Now is also a great time for questions: please put them into the Q&A, and Rachel and I would be happy to answer them.

Alright, I hope everyone had some time to write down a goal. I know three minutes is not enough, but hopefully this got things going. I'm going to hand it over to Rachel now to introduce our logic model.

Thanks, Carly. Yeah, setting goals is really important, but being able to actually track your goals and measure your progress is equally important, so we're going to talk about a logic model. You have your goals, and then you're looking to measure your inputs, activities, outputs, and impacts. Inputs are the materials used by a program. For example, inputs for a coat drive would include coats, boxes, maybe a truck to transport the coats: those are the materials used by the program. Activities are what the program does with the inputs to achieve its goal, which can include specific activities as well as strategies employed by the organization toward larger goals. Again, with that coat drive example, activities would be publicizing the drive, actually collecting the coats, delivering the coats, things like that. Then you're looking at your outputs.
Those are the specific products of the activities, often measured by numbers; for the coat drive, the number of coats collected. Outputs are very commonly reported by an organization as a measurement of impact, but they're really more a barometer of effort. You'll oftentimes find yourself reporting on your outputs, or thinking about your outputs, as the thing you have done, but that's not why you did the venture in the first place. That's not why you're doing a coat drive: you didn't do it to collect coats, you did it to make an impact. So it's also very important, when you're thinking about your goals, to think about your impacts: the changes to the beneficiaries as a result of your program. They're related to the outputs, but they're not the same. Impacts can be changes in lifestyle, behavior, skills, attitudes, decision-making, things like that; they relate to the differences in beneficiaries after the program. With the coat drive example, impacts could be reduced sickness due to cold exposure or, if you're talking about children, fewer days of school missed. Impacts should be defined using SMARTIE principles. So going back to that goal, think about what the impact is that you're trying to measure.

If we look at the logic model in this framework, you see inputs lead directly to activities, outputs, and impacts. We have two new features here: indicators of progress and measurement techniques. Your indicators of progress are the measures or metrics you use to determine whether you're achieving success. For example, again with that coat drive, your indicator of progress might be a reduction in sickness among beneficiaries. The indicator must be clearly measurable and help you identify whether or not you've achieved your desired impact. And your measurement technique is how you're collecting that information.
Oftentimes you'll collect information through surveys, questionnaires, observations, interviews; there are a lot of ways you can collect it. This can seem a little overwhelming or daunting to think about, so one example we love to work through is a sandwich. Thinking about a sandwich through the logic model: okay, you're going to make a sandwich. Your inputs are the things that go into the sandwich: your bread, lettuce, maybe tomatoes, chicken, whatever it is you're putting in your sandwich. Those are your inputs. Your activities are actually making the sandwich: maybe you have to cut the bread or the vegetables, maybe you have to cook the chicken, whatever it is you're doing. That's the actual activity of putting the sandwich together. Your output is a sandwich. Again, this is where a lot of organizations stop, this is where a lot of work sometimes comes to a halt, and you're like, well, my output: I made a sandwich. But you don't make a sandwich just to make a sandwich; you made a sandwich because you were hungry. So your impact needs to be: did it reduce your hunger? In this case, your indicator of progress might be the amount of snacking you do over the next couple of hours, and the way you'd measure that is by observing your eating habits. So thinking about the sandwich is a great way to assess this: you're not looking to create a sandwich, you're looking to eat the sandwich and reduce your hunger. That's the impact, and you need to measure whether or not it's had the intended impact.

So again, we're going to take a little bit of time. We might reduce it just slightly, so our apologies if we come back a couple of minutes earlier than it says, but we want you to continue working on your example: take that goal you had started to flesh out and set up a logic model, thinking about what your inputs are, what your activities are, your outputs, and the impacts.
How are you going to know you're making progress, and how are you going to measure that progress? Again, if you didn't come up with an example or don't have anything in mind, you can use our fictitious venture here, thinking about a goal related to that. And just start thinking about the various aspects you'd have to consider while building a logic model. We'll come back in three to five minutes, so take some quick time to jot it down and think about what you would do.

All right, I hope everyone had some time to start their logic model. I know five minutes is very ambitious, probably not achievable, for actually creating one. And I know there was a question that came through; later in the presentation Rachel is going to address it.

Great. So moving on to the measurement plan. Measurement and analysis should be integrated into the venture's overall work plan. Use the frequencies from your measurement plan to determine when to build in each aspect of collecting data, analyzing data, and making decisions using data. Make sure your team is looking at the metrics you're collecting and using them to inform planning, resource allocation, and requests for help. As you use your measurement plan, consider whether you need to revise elements of your programs (is your data showing that one program is more effective than another, or less efficient than another, and why?) or revise elements of your goals and measurement plans (are your targets reasonable? Are you achieving what you set out to achieve?).

There are a few best practices to keep in mind that will help inform your work. First, consider resource realities. While understanding your progress is critical for strategic and fundraising reasons, you need to balance the resources spent on evaluation with those available to run your programs.
Again, if you're working on a social venture or a nonprofit, just be really mindful of the resource realities you have. Your mission is always going to come first, and measurement will come second to that. Second, value consistency: for your data to be meaningful, make sure you're measuring and reporting consistently. This means at the same intervals. We used that example before of our semi-annual reports every six months: use the same intervals, the same cadence. This is going to be helpful for you and your team when evaluating, but it's also going to be helpful for community members to know when they're expected to report back. It also means using the same measures. Whether you're measuring in dollars (how much money was raised), liters (how much clean water has been collected or distributed), or number of students impacted, make sure you're consistent with your measurements. And it means using the same methodology, which is really critical, every time you're collecting this data. Whether that is through a report, a survey, interview questions, or even budget documents, make sure they're consistent every time you collect this data. Third, prioritize transparency and inclusiveness. This data is not just for you. It's not for you to collect, analyze, and then just keep on your Google Drive somewhere. This is something you need to be sharing with your partners and your community members, to keep everyone engaged and expand the pool of those who are helping you ensure the venture is effective. What we do at Resolution is collect all of the good quotes we find at the end and share them with the team. This helps to reinvigorate everyone; it reminds everyone why we come in at 8:30 or 9am every morning to make sure we're helping our community. It reminds us of the work we're doing.
We also use a lot of our data in our annual reports, to report back to our corporate partners. It's really critical that you're not just keeping this data to yourself: share it, be transparent with it. And finally, avoid overburdening your community members. You need to strike a balance between getting the information you need and the amount of time and effort it will take your community members to provide that information. There's no magic formula behind this, but it's really important to ask your community members directly what's reasonable for them. For us, we work with undergraduate students a lot; for the fellowship, you have to be an undergraduate student to apply. So when we're asking for semi-annual reports, we're trying to work around finals season, around when our fellows are graduating, when they're having major life transitions. We have 500-plus fellows, so we can't always get it right, but just be really mindful of when that busy season hits or what that time during the quarter looks like. If people are particularly busy, you're not going to get the best data.

Amazing. So here we have a measurement plan example. I'm going to walk you through the example, and then Rachel is going to have a lot more to connect it to the things we've been talking about, but we wanted to provide you with a tangible example, again continuing with Handwashing for Health, or H4H, to solidify what one would look like. And again, this is a very short measurement plan; I hope all of you take more time and have a much, much more expansive one. Here we're looking at our inputs. For H4H, those are volunteers, a handwashing curriculum, instructional videos, and games. These are the inputs, what they're putting into the work. Then activities: for activities, we have training sessions. The activities are where the volunteers are delivering the handwashing curriculum.
That's where they're delivering the instructional videos and games, through these training sessions. The outputs are the students trained and the students using good hygiene practices. Like Rachel mentioned before, this is oftentimes where organizations will stop. They'll say, oh, this is the number we need. You need to go beyond that. There's much more you need to collect, and it needs to be more specific, to make sure you're actually meeting the goals you've set for yourself and making an impact. Moving on to indicators of progress, we have really well-defined, specific indicators here: the number of students ages six to 10 trained by H4H, the number of students using good handwashing practices, the number of family members using those practices, and the average number of annual sick days. These are our indicators of progress; this is what we're going to look at to see how we're doing. Then measurement techniques and tools: this is what we're going to use to measure those indicators of progress. H4H has participant logs; that's where you can see how many students ages six to 10 were trained by H4H. Annual follow-up sessions with schools are a great opportunity to see the number of family members who are using these practices and the number of students using good handwashing practices. And then student attendance data: that's where we're going to get the average number of annual sick days taken. And that's a really critical number, right? Because that is the end game for us: we want to decrease it. So when we look at it and put it all together, the impact we're going to have over the next five years is to halve the number of school absences due to sick days, from 10 to five, for 1,000 students ages six to 10 in our made-up Resolution Land. That is the impact we're looking for at the end.
I'm going to hand it over to Rachel, because I know she has a lot more to say on this, and I think we've had a few questions come through as well.

Thanks, Carly. Yeah, so you're looking at all of those components of the logic model, and now you're looking at your actual measurement plan: how are you going to measure this? What are the factors you need to consider? You want to think about what you want to accomplish, how you'll measure your progress, where your data will come from, and when and how you're going to measure it. I know a lot of times it can feel very overwhelming: you've got a big project, you're starting to think about how you're going to make an impact, and it can get a little overwhelming. But if you start by asking yourselves the very basic questions (who, what, where, when, why, and how) you can start to build out a plan. You've identified all of your inputs, your activities, your indicators of progress, your measurement techniques; now just think about how you're going to do it.

So again, we're going to give a little bit of time to go through some examples. I know there have been a couple of questions; we're going to make sure to address them either toward the end in the question section or, if they're embedded in the presentation, we'll get to them at that point. But here we again want to give you a little time to play with it. You've hopefully identified a bit of a goal and figured out some of your logic model, your indicators of progress, your inputs. Now just spend some time with that who, what, where, when, why, and how, and start to actually sketch out a bit of a plan. You're not going to accomplish all of this in five minutes; this is meant to give you a little bit of a crash course, a 101, into the whole process.
And again, if you haven't come up with a goal, if you're sort of new and still thinking about what your project might be, you're welcome to use our fictitious venture. Just a quick note on that: we've been using this example for a long time, well pre-COVID, so our apologies if it's a triggering example; it certainly wasn't meant to be. It was just an example we've been using for a long time. So if you want to use our fictitious example here, you're welcome to do that. And then we'll circle back in five minutes and make sure to address the end of the presentation as well as the questions that have come through.

Alright, same as with the logic model, it's very ambitious to create a measurement plan in five minutes, and I hope you all at least started to think about this. I'm going to hand it over to Rachel to close things up a little, and then we're going to address some of the questions you all have been asking.

Thanks, Carly. Yeah, and actually this last piece relates to the first question we got, which was fantastic: how do you measure progress in real time? How do you tie monitoring to improvement and course correction? The answer is that planning isn't enough; you have to actually use the data you're collecting. It should be iterative and ongoing. Oftentimes we think about really lofty long-term goals, and those are great; we should have long-term goals for the work we're trying to do. But short-term goals are sometimes actually even more important: setting those goals that are time-bound, like Carly talked about at the beginning. At Resolution we'll oftentimes talk about maybe an 18-month goal or a six-month goal. What are the goals you're looking to achieve in the short term? What is it you're looking to have an impact on, and how does it factor into a longer-term goal?
Oftentimes the progress you're seeing on a short-term goal can really help you quickly course correct and figure out what's working. At Resolution we've oftentimes used our data to course correct, to rethink things. We have some assumptions: a program is going to be great. Then we get data from our fellows saying, actually, it'd be better if it was this, we'd prefer this instead, or this was great, but what about this? And that's allowed us to shift, to change, to evolve. So I think oftentimes we think of impact evaluation as a long-term thing, and it is, but it's also short-term. It should be iterative, it should be ongoing. We should be thinking about what the short-term goals are and how those goals lead into a longer-term goal; they really sort of snowball into each other. If you think about the short-term things, sometimes that can really help you figure out how to use the data you're collecting in real time to course correct. So that's it: analyze, learn, and repeat. That is really how Resolution does it.

We hope this is a helpful model. Again, this was a really lofty thing to do, to go through an entire metrics and evaluation system in an hour-long presentation, or just under. I know we've only hit the tip of the iceberg here, but we have some great questions, so we're going to continue to answer those. Again, please do use that chat if you have any questions, and you'll find my email and Carly's email there, so please feel free to reach out. I know we only have about nine minutes left, and we might not be able to answer everything, but please do feel free to contact us and continue the conversation. Carly, did you want to answer the question about how to convince donors to resource metrics, evaluation, and learning?

Absolutely. I think that's a really phenomenal question.
It seems like it should be easy to convince everyone why this is important, right? Because at the end of the day, it's really one of the only ways you can determine if you're having an impact. But when it comes to your donors, your supporters, your sponsors, this is how they're going to see that their money is being put to use. That is their case to bring back to their CSR teams, to bring back to their companies on a larger scale, and say, this is the impact we've had this year. It's not enough for them to just funnel money into your organization without actually seeing how that money is being used. By investing in impact evaluation, investing in monitoring and evaluation, taking that time, you're able to make the case for why they should be giving you more money: they're able to see the impact their money has had, they'll have a trail to see where their funds have gone, and they'll continue wanting to work with you. So it's really just about being able to deliver what you know they want to see at the end of the day, through impact evaluation and the data you're collecting.

Fantastic. I know we have a number of other great questions here. One of them is sort of related to what Carly was just talking about, which is: how much of your budget should you be allocating toward metrics, evaluation, and learning? If you go back to our SMARTIE goals, one of them was about resources. You really should think about the resources you have. Evaluation is really important, but doing the work you're doing is even more important, so make sure you have the appropriate resources. At Resolution, we're a staff of about 13. Carly and I are the ones who work on impact evaluation, and it's only a small portion of what we do. That doesn't mean it's not important to the organization, but it means that's what we're able to allocate to impact evaluation. So you can do this without a lot of resources.
And when you think about your budget, it's great to think about putting some money aside, but there are some great free tools out there you can use. SurveyMonkey can be a great tool; it's got a lot of data analytics already built into it. Google has a great suite of resources as well. So there are a lot of free or low-cost things out there you can think about using. If your organization is a nonprofit, you might have Salesforce as your CRM; a lot of nonprofits do, because Salesforce provides really great packages to nonprofits. Resolution uses it: we do a lot of our data collection and analysis using tools that are either native to Salesforce or plug into it. That has been one of the places where we've allocated resources to technology, but there are some really great free resources out there if you're just starting out. It doesn't have to be sophisticated. Think about some of the surveys you take: if you're on a website and they want your feedback, those aren't necessarily always the most sophisticated, but they can collect some really important information. So the resource this takes is oftentimes probably even more time than it is money.

Rachel, to that point, I think it would be really important for us to mention that you can also get creative with the people you have in your network. For Resolution, a lot of our impact evaluation reports were created from working groups, from people in our network. A lot of those questions didn't come from us, because we're so close to the work; they came from these working groups, from people who are familiar with Resolution but also really versed in impact evaluation, so they're subject matter experts within their field. So figure out who's in your network and see how they can help you, maybe by creating a working group around when you should have reports and what should be on those reports as well.
I think that doing a deep dive into the resources you already have is a really great suggestion.

Great. I see a number of other questions here; I'm just going to read them out, mostly because Carly can't see them. We've got a question here about insights into and measurement of social or behavioral change, which can be more qualitative. A lot of us in the nonprofit sector and social impact are oftentimes working on qualitative things. So what are those qualitative variables (leadership, ownership, social cohesion, equity) and how do we measure them? I'll let Carly think about that for a second as I answer one of the other questions here, which is: how do you get data that's real and unbiased if you're expecting people to report it themselves? This is something we've tackled a lot at Resolution. We have self-reported data; we ask our fellows to report. They could be making things up. We don't think they are, but we are relying on them to provide us information. I think there are a couple of ways we can do this and feel comfortable about that information. One, we also survey their volunteers, our guides, so we have a little bit of a check: we ask our fellows some questions, and we ask the mentors who are working with them, and that helps us assess whether the individual implementing the project and someone who's working with them are seeing things in the same way. That can be a really helpful check. Two, we build a relationship with our constituents. We want them to be honest with us; we cannot improve as an organization, or improve what we deliver, if they're not honest with us. We make it really clear to them that filling out impact evaluation reports every six months is not an evaluation of them. This is not a time for Resolution to check up on them or make sure they're doing what they said they'd be doing. That's the relationship we have with our constituents.
You may have a very different relationship with the people you're getting your data from, and I think it's important to think about what that relationship is, and whether it might impact the type of information people will provide you. Sometimes there are some really good public-sourced databases you can compare your results to. That doesn't exist for everything; for our work it's not as applicable, because we're running a very specific, unique program, but sometimes there are really great databases out there. In the universities, a lot of students in grad school are doing some really great work, and that can be a great source to check your data against: a public-sourced set of resources. Carly, do you want to go ahead and answer the next question, about qualitative variables?

Yes, I will try to do it within the minute and a half we have left, and whoever asked that, please feel free to email me; I would be happy to have a much broader conversation on this. When it comes to qualitative data, it also ties into what Rachel was just saying about the nature of the relationship we have with our constituents and our community. When we're collecting qualitative data, we have questions built into our semi-annual reports where we can measure their progress and leadership, where we can measure different kinds of qualitative data, but we also have different checkpoints with them, where we have individualized conversations. These are in kind of an interview format, but much more comfortable, where we ask how their fellowship is going, and that's where we'll find a lot of that qualitative data: through those conversations. Again, we make sure there's consistency in how we're measuring, making sure we're asking the same questions of each of our fellows, but that's the type of relationship we've built to be able to collect that data.
Those checkpoints are at very transformative parts of their fellowship: after the six months of their onboarding process, and then after two years. We've found those are really the inflection points for figuring out how they're doing within their fellowship and within their leadership roles as well. I know that we're at time.

Yes. So thank you all so much for being here. We really appreciate your attendance, your interest in impact evaluation, and your dedication to making sure you're tracking the impact you're hoping to have. Again, please feel free to contact us; our emails are rachel@resolutionproject.org and carly@resolutionproject.org. We'd love to continue the conversation. I know we have a couple of questions left; unfortunately, we're at time. We really hope this was a great intro to impact evaluation, and please do feel free to continue the conversation with us. Thank you so much, everyone.