This is Think Tech Hawaii, Community Matters here. Good afternoon, and welcome to another episode of Pacific Partnerships in Education. I'm your host, Ethan Allen, here on Think Tech Hawaii. Thanks for joining us today. We're going to be talking, as we usually do, about education issues, particularly ones that impact the Pacific. Today's talk is not going to be quite so specifically Pacific, a little broader; it's going to be on the subject of evaluation. And to help me out here, I have Ms. Sonia Evanson. Welcome, Sonia. She's joining us via Zoom meeting here. Welcome. Thank you, Ethan. Sonia is an evaluator at Pacific Resources for Education and Learning, a colleague of mine, and has been there for quite a number of years. She has done a wide range of evaluation projects, both with PREL and with other groups across the Pacific. She has bounced all over the Pacific doing evaluation work out in the islands, and has done a lot here on Oahu, so she is a very broadly experienced evaluator. So let's just jump into this. Talk a little bit, if you would, Sonia, about what you see as the central issue in evaluation. What is evaluation all about? Thank you, Ethan. So to start with evaluation, there are many definitions of it. Oftentimes it's judging the value or worth of something, but a more updated definition would be collecting data to inform important decisions about something. So depending on why you're evaluating something, you collect different types of information, to decide whether to continue a program or to change the direction of a program. You're going to find out what's working well and what's not working well. And in general, you have two different avenues of evaluation. You have formative, which tells you how you can improve your program as you go, and summative, which is pretty much the story when all is said and done, and you need to talk about the impact of your program. So depending on the use of it, you get different things along those different avenues. Sure.
And some people add in an even earlier phase, right? The front-end evaluation, to establish the need or the context of a problem, right? Right. You start with what the problem is, and a needs assessment would tell you what the problem is, and then you address the problem. And evaluation is often then of the project, or at the end of the project. And like you and I were discussing earlier, oftentimes in the past, the evaluator would come in as the cold-hearted observer and tell you how you did, without involving any of the clients or stakeholders. It would be kind of like writing a consumer report on a vehicle: this is what works, this is what doesn't work, without getting involved with the many facets of the program. But now there are many different types of evaluations, often called participatory evaluations, where you get involved with the staff and participants, you see different aspects, and you get different opinions from different stakeholders as you go. Right. And it makes sense that the earlier form you spoke of is more useful; there's not much to do at the end if your evaluator says, hey, you did this well, you didn't do that well. It's sort of, well, the project's over and done with, and we may have learned something, and perhaps next time, if we do this again, we can make use of that. But in no sense does it inform the ongoing project, right? And further, it has that bad connotation of evaluation as being this very negative thing, right? Evaluation as grading a project, or assigning blame for what's going wrong on a project, or putting responsibility on certain people for certain parts of a project. All these loaded aspects, right, which mean that sometimes people get very nervous when they hear, oh, we're going to be evaluated, right? I'm glad you say that, because my professor at Claremont, Donaldson, wrote many articles about the fear of evaluation.
And it's right: nobody likes to be judged for anything, and nobody likes to be told they've done something wrong. They like to be told what they're good at, but they don't necessarily want to be told, oh, this isn't right, or you didn't meet this. So it has taken on a bad reputation sometimes. One of the things, before we get into that story, I want to talk a little bit about: one of the hardest parts about being an evaluator is people's misunderstanding of when it happens. Because oftentimes I've gotten hired at the very end of a project to come in and tell the story of the project, and the project's already happened, and I haven't had a chance to collect relevant data while it's happening or set anything up. So if nothing else comes out of this show today, I would hope that people would understand that you involve an evaluator very early on, even in the planning phase of the project, so that they can get involved at the right point and collect data that's meaningful, and have a conversation with the clients about what is meaningful to them, but set it up ahead of time so that you collect that valuable data when you can. Exactly. This is what you mean by being an intelligent consumer of evaluation: bring that evaluator on well before your project starts, so they can help you frame that project and figure out, what do you really want to know? What's your real central question, right? What do you want to see different at the end of this work than is the case right now? And how do you know that it's different, right? Those are core questions; an evaluator can't come in at the end and start addressing them very effectively at all. But if you start months before the project actually starts, you can really have deep, rich conversations about what the real central issue is that they want to change, and how they'll know that change has happened, right? Right. And one of the important things I like to start a conversation with is: why do you want an evaluation?
And oftentimes it's mandated by a funder, and people are like, oh, it's something we have to have, and it's kind of painful, like going to the dentist; not quite as painful as that, but not quite desirable. But I often have a conversation of, so can we use this for something? Can we use this for you? So it's not just a report that sits on a shelf because somebody required it, and you just went through all this exercise of troubling the staff to collect data that doesn't feel like it's connected to anything. So really it's about having a conversation with people about what is most meaningful to them right at this time, because if it's a brand new project that's getting off the ground, you're going to ask a different type of question than you would for a mature project that's been around 10 or 20 years, that's really settled in its processes and how it does things, and where you can expect a different type of result. So you really have to custom-make your evaluation approach depending on how new the project is, or what phase the project is in; it depends on context, which brings us to the Pacific. Yeah, before we jump into that, let me just note that you've made an interesting parallel. It almost sounds like you view evaluation as almost a sort of coaching process, right? That is, if you want to improve your physical fitness, you go out and you find a coach who can help you determine what muscle groups you want to build up, what you want to change, maybe your posture, your body language. They work with you, they help you identify the specific exercises you'll do, they encourage you. And this is much more the kind of process you're talking about: an evaluator works very closely with the project people so they can, as you put it earlier, really be the best they can be, right? They can do things as well as they know how to do, and even perhaps better than they knew they could do them, right?
Rather than being chastised after doing them not quite so well, they can actually, because of the formative nature and the front-end work, do them better, right? And the very act of asking, by doing this, what is going to happen, really helps people, hopefully, streamline what they do, because I have seen many projects do way too much stuff. You state your objectives, and then we talk about how to measure those objectives, and have conversations about how this should lead to that. It helps people really think through what their program is, and I really want to help people make life easier, not harder, and not do so many things all over the place: really focus in on what it is you're actually doing. And I'm going to take a little side track just to tell you the story of why I became an evaluator. Sure. I was a program director before, and I had an evaluator who asked me these pointed questions, and because of his questioning I was like, oh my gosh, what am I doing? What am I doing in this program? And it made me rethink my program, but it also introduced me to the field of evaluation, which is why I'm an evaluator today, because it was a whole new way of thinking that I appreciated. Yeah, and of course that's how it should be. Evaluation really should be a win-win kind of situation, right? When you're providing more or less objective, external advice to somebody, you're watching the program get underway, you're watching the processes they're using, you're watching how they're implementing what they said they would do, and you can step in and say, I see you're talking to these people in 20-minute segments; I wonder if you'd do better talking to them in hour-long segments. Or, you know, I see you're asking them for written feedback; maybe this group would do better if you asked for oral feedback. You can help them really address their issues in ways that they, as program people, perhaps aren't going to see; they're too close in, right? Right, right.
And you set up your kind of questions ahead of time, you know. You ask them what the program is, and then you look at what data points are going to tell you that. And of course, getting that data in early, in time to do something about it, is the key. The other part about collecting data, and another one of the pitfalls I've fallen into, is collecting too much data: what am I going to do with all of this? My bottom line is, don't ask it if you're not going to use it. Surveys: people don't answer surveys. I mean, if you get more than 10 questions, if you get pages and pages, people lose interest, and then you don't even get honest responses sometimes. Yeah, yeah, it's very, very context-dependent, as you noted there. If you're talking to groups, for instance, of college students who are in class when they take surveys, you may get a fairly good rate of response; they're already in class, they can respond on their mobile devices, it may work quite well. If you're trying to survey groups of people out in remote islands of Micronesia, you know, you're not going to reach them online particularly easily. Even if you give them, you know, hard-copy surveys, it's going to take a while for them to fill them out; they may not get returned right away. Yeah, it's a very, very different game. So you do want to be very careful, certainly, because the whole point, as you say, is to help make the project run well, to help keep it focused, so the work of the project is directed toward and is accomplishing the objectives of the project, and not just sort of keeping people busy doing lots of stuff that may not be helping, you know.
So what we were talking about just now is methodology: what method of getting information is going to give us the best information. And that, too, is kind of an artistry. I mean, there are different types of methods that tell you different things, and sometimes you want to get information from several different data sources. Maybe you talk to parents and students and an outside observer about how the student is doing, so you get three different viewpoints. It's called triangulating, where you try to get to the truth of the matter; that's one of the methodologies. But other methods besides surveys are focus groups, talking to people, interviews, observations. There are many different tools that we have in our toolkit to get that information, all with pros and cons. Exactly. And in the second part of the show, we're going to explore some of those tools a little more closely and talk a little more in depth about some evaluation work that you've done, and actually that you and I have done together. But right now we're going to take a brief break here; we have about one minute off, and then we'll be back, and Sonia Evanson will help us explore the area of evaluation a little more. Hi, I'm Lisa Kimura. I'm the host of Family Affairs on Think Tech Hawaii. Join us every Tuesday at 11 a.m.
to talk about the issues that really matter, everything from policies that need to be changed in Hawaii to the fact that we need better gender equality so that we can all have a better shot. Again, join us every Tuesday at 11 on Think Tech Hawaii for Family Affairs. Aloha. And aloha, my name is Calvin Griffin, the host of Hawaii Uniform, and every Friday at 11 o'clock here on Think Tech Hawaii we bring in the latest in what's happening within the military community, and we also invite your responses to what's happening here. For those of you who haven't seen the program before, again, we invite your participation. We're here to give information, not disinformation, and we always enjoy responses from the public. So join us here, Hawaii Uniform, Fridays, 11 a.m., here on Think Tech Hawaii. Aloha. And we're back here on Pacific Partnerships in Education, here on Think Tech Hawaii, with me, Ethan Allen, your host, and joining us via Zoom meeting today is Sonia Evanson, an evaluator from Pacific Resources for Education and Learning. We're talking about evaluation. In the first part of the show, we discussed some of the rationale for evaluation. We made, I hope, the strong point that evaluation should be done early, that it's not a negative sort of thing; it's a very positive thing if done well. It can lead to better programs, can lead to wins for everyone: your funder, your program people. And we were just starting, as we went to break, to talk a little bit about some of the approaches, the methodologies, the ways of doing evaluation, and we had talked a bit about surveys. But, Sonia, you were saying there are other approaches to use when surveys aren't necessarily the way to do it, right? Right. So not everybody... and especially in the Pacific, with paper-and-pencil surveys you're dealing not only with whether or not there's literacy there, but it's a different language, and then things get lost in translation. So there are many reasons why surveys might not work, other
than the fact that people don't like to respond to surveys. People do like to talk, though, and so answering questions, either in a one-on-one interview or in a group interview, seems to be a way to get information out of people. It's a little more time-consuming, and it takes, you know, more resources to do it, but then people can talk freely and bounce ideas off each other, and you learn much richer information about a project than you would get out of a survey. It gives you the time to ask other questions beyond what you thought you were going to ask in the survey; you can ask probing questions to find out what they really meant. Yeah, so, I mean, I understand that that's valuable. As you say, the downside is it takes a lot more time, a lot more effort, but yes, as I've certainly found in my work out in the Pacific Islands, getting people to respond to surveys was very hard. But there are even subtleties, of course, to interviews, right? Different cultures have different attitudes, right, and in Pacific Island cultures, talking about yourself is not particularly considered something that is done, particularly by certain subsets of Pacific Islanders. Right, right. And so, for example, in the Marshall Islands, a woman can't necessarily ask a man certain questions, so it has to be a male interviewer asking a man. Then there are some age differentials. I get away with a lot more because I'm not from there, so I don't fall under some of the rules. But there are a lot of different kinds of cultural nuances that are good to know so that you don't offend anyone. Sometimes, too, there's a reluctance to tell you anything negative, so you have to couch a question in a way that gets at the truth of the matter. So instead of saying, what didn't you like about this workshop, you just say, what would make it even better? You have to spin it a little bit more positive, because people like to please. So that's a general rule right there. There is a very common
phenomenon, sometimes called the Pacific Island yes, right, where people do not want to say no to you. They don't want to be in a confrontational situation, and so they will say yes even when they don't exactly mean it, and they just keep things very positive. And so, as you say, if you ask them what was wrong, of course nothing was wrong at all. Yeah. And then there are the social interactions if you're talking to a group, right. In particular, if you have an older relative present, you as a younger relative are not allowed to contradict that person, right? That's considered extremely rude and not done. You have to think about the groupings, if you have groups of people, and then you have to think whether that's the right method. I want to tell you a little story about when I walked around doing interviews. It was actually for a project for you, Ethan; it was on an outer island of Yap, to find out about water-use practices: did they clean their water filters? And so I went around the village and I was asking, do you clean your filter? And the answer was always yes. Do you treat your water? And they're looking at me for clues. Do you boil it? Yes. And then, how long do you boil it? And they said, an hour. So they were looking to tell me the answer that I wanted to hear, and I was like, oh, this isn't working out. So I realized I had to change tack and not use that approach, and rather walk around the village and observe actual practices: go look at the water filters myself and see if they cleaned them, or go look at whether they did treat their water. And even those who said they did, they didn't; they drank straight out of the catchment. So observation was my backup plan, because that interviewing wasn't working. Right. You know, that's a good point. Sometimes you plan one thing, like a survey, and that doesn't work; you go to your backup, interviews, and that still doesn't work. So, yeah, that's always the bottom line: you have to go and observe the behavior yourself. But that again is
even more time-consuming, right, and intensive, and there are certain things you're not going to get out of that. But you bring up a very good point: people do like to make others happy in general, and they want to please you, so they're trying to guess what answer you want to hear, and they want to give you that answer. And that, of course, is not what you're after; you're after some sort of, at least, quasi-objective truth, right? Yeah, getting at the truth is kind of the bottom line of evaluation: what is the truth, or what is the actual, real situation? So, on whether or not people are learning what's intended: sometimes you give them a real test, or you ask them to rate themselves on what they've learned. The self-rating isn't quite as accurate as a true test of learning, but then true testing comes with a lot of faults as well, as we know. People are up in arms sometimes about whether the SAT actually measures people's ability, so there's a lot of debate about the appropriateness of tests. And then there's cultural appropriateness: if you have a test about tobogganing in the snow and you give it to somebody in Samoa, they don't even know what snow and tobogganing are, so is it appropriate to ask them those kinds of questions? Right. It's much better if you can look at the outcome that you wanted. And you mentioned earlier my Water for Life project, and one of the things we really wanted out of that was to know whether people's practices of dealing with their water were changing. It was nice to be able to go back toward the end of that project and see that many more places that had earlier had very non-functional rainwater catchment systems now had clean roofs, clean gutters, and first-flush diverters that they presumably were using; they had a good new tank, or a newly cleaned tank, perhaps with new hardware on it for the spigot and all; and they were apparently treating their water a little differently than they had before, which was
again sort of the bottom line: did they understand that it was valuable to keep water clean and protected? So yeah, you know, I would have felt very bad, and it would not have been successful, to go and give them some sort of an actual test on, do you know this, do you know that, do you know the other thing about water. But the fact that I was seeing more people involved in monitoring the water and keeping the water systems clean in these communal systems, and paying attention to the quality of the water, and reporting data on the water, even back to the EPA in many cases: these were all indicators that this project was being successful in terms of getting people to change their behavior in a desired way toward their use of water. Right. And so we... go ahead. I want to talk about buy-in to evaluations. So oftentimes you have a project, and you have to collect this data because it's mandated by somebody, and the staff are going through the motions and not bought in. But buy-in would more likely happen if the staff were involved in that conversation up front, like, well, how do we know that these students are learning this thing? If the design of the evaluation included their process, and they understood, well, this is what our project is trying to do, and this is how we're trying to get them there, then hopefully they would buy into the data collection itself, because that's often one of the more painful pieces: the data collection and the time it takes to do that. And sometimes data accuracy is a problem, too; you collect data, but it's not necessarily accurate. I know it's hard enough to get just plain attendance counts for folks out in the Pacific region; it's just not part of the thought process that this is important. But I think people need to know, why are we collecting this? It's important for what? I'm often like that myself: I don't want to do something unless I know why, and then I buy into it, and
then I'm more likely to willingly do it. Right. And the participatory aspect of the evaluation is really important, too, because the people on the ground, the stakeholders, if you will, of your evaluation, often will have very good ideas about what is feasible, what is sensible, what's going to resonate with the people who are actually doing the project, who are, you know, really involved in the day-to-day execution of it. You may think it's very reasonable to ask them to observe their students and fill out a survey or a checklist on what students are doing; they may not think that's a sensible way to do it. They may want to, as you say, sit and talk story about that at the end of the week and say, hey, here's what went on this week; I saw so-and-so doing this, and that showed me they have now learned how to, you know, accomplish this task, right? And it's a very different idea, and by ignoring your participants you put a lot of the project goals at risk, I think. Right. But it does take time, and it takes a lot of conversation, and it takes a lot of reflection. But I love the idea of reflection, because it just helps make your project stronger. You do have to build in time for it, though; no joke about that. I mean, it's just time-consuming, but it can be very worth it. Yes, indeed. Hey, and speaking of time, I'm told now that we are out of time; this project has come to an end. I thank you for all your good insight, Sonia. This was really, really informative for me, and I'm sure for our audience as well. Sonia Evanson has been helping us explore evaluation. Thank you so much, Sonia. Good luck, and we'll see you online. And I hope you will come back and join us in another couple of weeks for another episode of Pacific Partnerships in Education. Until then, I'm Ethan Allen, signing off.