Hello, everyone. Welcome. Nice to see you on this lovely sunny (where I am) Thursday afternoon. If you can turn your videos on, it'd be lovely to see you. Just admitting a few more people. Hello, Deborah. A few familiar faces there, which is really lovely. Well, welcome everyone, and thanks so much for joining this session this afternoon. Hopefully we'll have a lot of good conversations; this is really a conversation starter and a place for conversation. Before I start, I'd like to acknowledge the country where I'm sitting today. I'm sitting on the land of the Gadigal people of the Eora Nation, and I would like to pay my respects to Elders past, present and emerging, and thank them for their custodianship of this land. Before I get underway, just to note that there are three groups presenting today, with a mix of different people talking and being part of this process. This session was brought together by three groups. Firstly, the New South Wales Regional Network of the AES. If you don't know the network, it provides networking and seminar events such as this, assists in planning the professional learning program, and advises the AES board and its committees on member issues in our region. We've got a really active group in New South Wales, so if you do have any issues, Flo is over there, he's fantastic, so you can reach out to him and also me. Please feel free to put anything in the chat box. Secondly, the ACFID MEL Community of Practice. ACFID communities of practice are organized and run by ACFID members themselves; they're self-organizing and really about creating space and dialogue to improve practice. And thirdly, the RDI Network, which is a network of practitioners, researchers and evaluators working in international development, supporting collaborative partnerships.
So this is a collaborative partnership to improve the uptake and use of evidence in policy and practice. That is who is leading the show today. I'll just get started by sharing this with you. Hopefully everyone can see that in full form, Flo. Is that looking good for you? Now we see them, all your miniature slides. Yeah, that's good. Perfect. Excellent. All right. Wonderful. So I've just done the Acknowledgement of Country. We can't be as interactive as we'd like on these online forums, but please use the chat box, and also raise your hand if you need to. And if you can, rename yourself, particularly when we go into breakout rooms; it's really nice for people to know who you are. So firstly, before we get started, we're going to do a quick poll, and Flo is going to flash the poll up to you. Do you need me to stop sharing whilst you do that? Can people see it? Yeah, we can see it. Is it possible to put closed captions on? I'm not sure I can do that from my end, Karen; maybe, Flo, you can do that from your side. I'll try. I don't think we have that feature in this Zoom; perhaps it's an old Zoom. Is everyone seeing the poll? Yes, I've been able to do the poll. Great. Okay, everyone completed? Thumbs up. A few thumbs up. Great. Okay, Flo, what do you think? Are we ready to launch the poll? A few people are just joining; I'll just admit them and give you a couple more minutes for the poll. Sorry, I was doing some searching around captioning. Usually it's automatic on our UTS system, so maybe this is an old version of Zoom. Okay, did you want to share the results? I think I've shared them, but I'm not sure if you can see them. Can you? Yeah, we can see the results. I think they're quite interesting. Okay. So at the beginning we can see an even spread of people across levels of experience. Can you hear me? Yes. Yes. Yeah.
And a few more people with experience in participatory evaluation. And, as we were expecting, the majority of people have low budgets for evaluations, which is something we anticipated as part of this presentation as well. Any other comments from people? I'd just like to know what you mean by participatory evaluation. Great. Excellent. Well, let's talk about that as we go along. Wonderful. Okay. Good. All right. Well, a few people are still joining, so I'll let them join and we will keep going. What we're talking about today are such big topics, and we recognize the diverse group of people here from across those three different communities: the AES, the ACFID MEL CoP group, and the Research for Development Impact Network. We really wanted to share with you what international development is, thinking around what has happened in the past and what the current and future trends are. Following this brief presentation, we'll go into a panel discussion and discuss these in more detail. So really, international development is a big term, and many of us have been using it for a long time. It's really about a program of work, or an intent, to address poverty, social injustice and discrimination. Most recently, this international development program is framed under the Sustainable Development Goals, which speak to the connection between people, prosperity and planet. And it's really an agenda about leaving no one behind, recognizing that in all countries of the world there are marginalized and discriminated people and social inequality; we know even in our own country inequality is growing drastically. So it's really about ensuring that there are better well-being outcomes for all communities in all places. What was ID evaluation in the past, and why? International development evaluation has a long history.
Evaluation has been a key part of international development for many decades, and there's a big body of work, both in terms of theory and practice, in that space. One of the things this slide talks about is the OECD criteria. The OECD, as many of you know, comprises the largest, wealthiest countries in the world, and DAC is its Development Assistance Committee. In our geopolitical landscape, many rich countries provide development assistance: aid, overseas development assistance, these kinds of words describe this funding from countries like Australia, Canada or the US to lower- or middle-income countries. And the OECD DAC criteria really define what evaluation is in that space. They talk about a systematic and objective process, so they're framing this idea of the independent external evaluator coming in and assessing the quality of programs, looking at efficiency, effectiveness, sustainability and impact. You can Google "OECD DAC evaluation criteria" and you'll see all of these dimensions. Evaluators in the international development field are guided, are required, to look at programs through this lens. So there are these key criteria required for international development evaluation, and they create a clear direction and clear guidance to follow. We can talk a little more about that as we go along. So: efficiency, effectiveness, sustainability, relevance, coherence, impact. I can reel them off because I've had to use them in most of the evaluations I've done. There are other perspectives around accountability and learning agendas, and this has been a debate for a long time, but it's really about, well, who is the evaluation for? Is the evaluation for the donor?
Is the evaluation for the funder of that program? That's really about thinking: they've defined those criteria, and they're saying these are the criteria by which we want to assess this project. And then there's also this learning agenda. So thinking about, well, who is the evaluation for, who's learning, and where are the evaluation findings going to inform the future? In some instances, is the accountability and learning for the donor, or is the accountability and learning for the community? There's bottom-up accountability, there's top-down accountability, all of those things. Before I go on, I will stop sharing my screen. I did put it up the top, and do you think I remembered to put it in the chat box? One of the things we wanted to do is make this process very interactive. So you can do it. Great. Thank you. So I did put it up the top and I still forgot. Did I copy an image? What's that? I think I copied it as an image. Oh, right. Here you go. Sorry, Karen, before we forget, Jeff asked a question on what participatory evaluation is, so if you can make a note for your presentation. Okay. Great. Has anyone put that in the chat box? Yes. Great. Thank you. I'll go back to sharing my screen. Oh, we're real professionals at this, aren't we? Okay. Can everyone see that? Yes. Great. Thanks, Flo. Okay. So I'll speak to those. If you have questions, comments or ideas flowing as you hear me speak, or hear the panellists speak, then please jump into that Jamboard and add any thoughts as well. We've already got a question around what participatory evaluation is; please keep those questions flowing, and Edith will keep track of the Jamboard and the comments there. So there's been this kind of tension, or positionality, between accountability and learning, and recognizing, well, who is the evaluation for?
Are we being accountable to the donors funding it, or are we supporting learning for donors, such that they take evidence from evaluation and use it in future policy? And learning for those communities, governments or local stakeholders who have engaged in that development program: where does the evaluation support their learning and their own future agendas? So what are the current and future trends in ID evaluation, and why? The question around participatory evaluation is really asking: is it the model where the evaluator comes in as an external evaluator, consults, extracts information from all those different stakeholders, goes away, writes a report and delivers it to the donor? A participatory evaluation is much more focused on that learning agenda and on ensuring that the local stakeholders, the local actors who are engaged in that program, are engaged in the evaluation. They're active participants in the evaluation, but more importantly, they're active designers of the evaluation, so participation can really cut across all aspects of the project cycle. Who's designing the evaluation questions? Who's leading the evaluation interviews, focus group discussions, et cetera? Who's sense-making the evaluation findings, and who's engaging in the communication and dissemination of the evaluation? So we can think about the evaluation cycle and the different points of participation in it, and a participatory process really recognises an ownership quality: evaluation is not done on people but done with people, and they fundamentally own the process, own the purpose and own the outcomes. We're going to talk more about that as well, but that's just a brief start.
One of the things that's really interesting in ID evaluation, and it's very similar across other fields of evaluation, is the blurring of the divide between monitoring and evaluation. So again, having that participation in those evaluative questions along the way, not just at the midpoint and endpoint of the process; that's been a big trend in evaluation. There's also the questioning of evaluation for whom, and this idea of benefits. It's really about recognising evaluation use, and we'll talk a little more about evaluation use as we go along, but really ensuring that there's evaluation use by local actors themselves, and that they can engage in the learning, action and response to an evaluation, not just send it up to a donor. So questions there around reporting: are evaluations always reported in an English report format? I'm engaged in an evaluation at the moment where we're creating a short video in the local language, which will communicate the evaluation findings to the local stakeholders. So it's really thinking about how we communicate these evaluations. Two topics we're going to talk more about today: the first is this localization agenda. Again, it's questioning that idea of the independent external evaluator: can't there be local ownership and local leadership of evaluation practice? Particularly in international development work, there are a lot of people who look like me, white, middle-class, educated people, who go overseas and do evaluation in other country contexts. That's not appropriate, and this localization agenda is really recognising that we as outsiders don't know that local context, and we need to work with local actors who lead and own the process. And it's really about ensuring that we invest in that way of working, rather than us coming from outside and working in.
So that's really about taking a strengths-based approach as well, and recognising that there are assets, resources and strengths in the community. There are local consultants, local researchers and local experts who can do this work. It's not just about external people coming in and doing evaluations in contexts with which they're not familiar. That localization agenda came through the humanitarian response. If you Google the Grand Bargain, you'll see that, as an international community, there was a recognition that in humanitarian responses to climate emergencies, extreme weather events and so on, huge agencies were coming in from outside to do the work of humanitarian response. The Grand Bargain and that localization perspective said: no, we need to invest in local actors to do that work, we need to build their capacity and their skills, and we actually need to fund them to do that work, not bring others in from outside. The Grand Bargain has been around for some years now and is still not completely realised, but it's a very strong agenda in international development, and it's gone from the humanitarian space right through to broader long-term development activities. So that's really something that is top of mind. In fact, I didn't introduce myself at the start, but I work at the University of Technology Sydney, and I do a lot of international development research and evaluation, and we don't go anywhere without working with local consultants and local researchers. We do everything together. It's not just, oh, they know the language, we can get them to do the interviews and we'll do everything else. No: we design the evaluation together, we do the sense-making together, we write together.
So it's really about us working together as one team, and that localisation agenda is really important because we work with people in a way that strengthens their skills and their evaluation practice, and then they go off and do other evaluations themselves without us, which is great. Decolonising mindsets and methodologies. I don't know how I'm going for time, but probably doing pretty well. This one, again, is top of mind for international development, and it's really about us recognising the colonial nature of international development: what was once described as developed countries and developing countries. Now that language is not quite right, and the SDGs recognise that we all have development issues; the SDGs are for everyone. Decolonising mindsets and methodologies is really recognising that there was a hegemony of knowledge, this Western knowledge from the so-called developed countries: Western knowledge, English, our ways of thinking and doing, imposed on others. We've got some resources to share with you later, but it's really about recognising that there are other ways of thinking and knowing, and it's not that our way is the best. Decolonising mindsets and methodologies is really recognising that there are different ways of working. For example, in Fiji the practice of talanoa is a really valued practice of learning and inquiry, and many practices in evaluation are using those local practices rather than assuming we know the best way to do it, and recognising the importance of relationship, different time perspectives, trust, et cetera.
So we really have to rethink the way we do our work in international development and international development evaluation, and you can see the contrast between that idea of the independent external evaluator and really recognising local traditions, local knowledges, and different ways of working and operating. The idea is that we don't just change our practice: we have to change our mindset, our perspective and our identity as practitioners. So I'm going to pause there and invite our panellists to share, and then we're going to have a bit of a conversation with each other. Again, I encourage you to go into that Jamboard and write and reflect; we've got about 20 minutes or so to talk together on some of these issues. Can I just jump in there quickly? In the Jamboard there's a page for each conversation the panel is going to be having. If you look at the very top, you can go to the next one. The first one is all about localization, the next one is around decolonising methods, and the next Jamboard is around shoestring evaluation. So you'll want to flip between each of those as your questions and comments are relevant. And since I'm on the screen, I'll quickly introduce myself. I am Lindsay Riley and I work at World Vision Australia as the impact reporting and strategy manager. And I'll hand it over to David or Sophie or Karen, whoever wants to go next. I'm happy to jump in there. My name is Sophie Jenkins; I'm the associate director of strategy and performance within the impact department at Caritas Australia. And I will say, for those of you who came expecting Delvin on the panel, unfortunately he had to pull out, so any Delvin fans out there, I apologize for the disappointment, but I'm looking forward to sharing some insights nonetheless. Thanks.
And hi, all. My name is David Keegan; I'm the CEO of HOST International. We work primarily with refugees in humanitarian displacement contexts in Southeast Asia, but we also work in Australia and New Zealand on more settlement-type work and community integration. So I'm bringing the perspective of a CEO, an operational perspective, to this conversation, not an evaluator's. Great. And hi, everyone. I'm Karen, a research director at the Institute for Sustainable Futures at the University of Technology Sydney. I just put in the chat box our first topic for conversation, which is about localization: how do we as panel members define localization? Just unmute yourselves, guys, and be ready to talk. Sophie, over to you. Great. So yes, from my perspective it's a big question to answer in a very short amount of time, but to me localization at its core is about shifting the power within our global humanitarian and development system to more genuinely enable the leadership, autonomy and decision-making of local actors and local communities. And I think, Karen, as you were saying, this is not necessarily a new thing to the sector by any means, but it's really positive to see the increased focus on localization over the last few years, particularly the impact of COVID in highlighting the capability of local actors and humanitarian systems to respond to local needs. So I think it's an opportunity for us to continually interrogate the system itself, what our roles are as donors and intermediaries, and, for those of us who are evaluators, how our role as evaluators sits within that system and how we can shift that power in the work that we do. Great. What do you think, Lindsay?
I think, like Sophie said, I'm not going to say anything new here: localization is inherently about power. It's about the ability to say yes or no. It's about determining the methodology, the data, how it feeds back into the community, how much the community owns the overall process. So I really see it as a complete shift in power. I think the questions we need to ask, and we're going to start to go into these in the other areas of decolonizing methods and mindsets, are that sometimes our systems are not set up to do localization. So how do we, as a support office, which is what we're known as at World Vision in Australia, encourage these localization methods at all levels of evaluation too? And like Sophie pointed out, and we'll discuss a little more, the evaluator almost becomes a champion for doing localization. It starts at the terms of reference; it starts with pushing back on whoever's funding the evaluation to say, we need to do it this way, and if we do it this way it's going to take more time, more money and more energy. So evaluators really need to be equipped to do that type of change management, to be the champion for localization, and to negotiate quite a bit. But I'm eager to hear, Karen, what you think, and David, what you think. Right, I'll let you go, David, because I've been talking a lot.
Thank you. Look, I obviously agree with what my colleagues are saying. What I tend to find is that localization is sometimes simplified as the devolving of resources to local actors, whereas it's much more complicated than that. I certainly agree that it's about repositioning power, but I think it's also about reconceptualizing expertise. The whole concept of localization for me is very entwined with the whole concept of decolonization, in that evaluation in itself is potentially colonial in its base: it's built on this idea of us all as evaluators having an expertise and coming in and evaluating something. One of the challenges, and we'll get on to this, I guess, is the implications: what does it mean to reposition power and reposition expertise, and where does that place the role of the evaluator? The other thing that's complex, that we've found from an organizational perspective, is that who owns the data and who drives the evaluation often impacts on that. I think we all agree that it's about that repositioning, but in practice the system is still very much set up around evaluation being an externally driven or donor-driven thing. One of the ways we've tried to reconcile that is by really embracing concepts of co-design: trying not to ignore the role of the INGO in that process, or completely devolve all responsibility and power, but to rebalance, if you like, the relationship and the power, and to enter into a co-design process. But again, that is incredibly difficult to fund and resource, and to do within the way things currently work. Everyone talks about co-design, but actually doing it properly is really difficult. I don't want to be a complete downer on that, but I would summarize the principle in terms of reconceptualizing expertise and reconceptualizing where and how power is engaged through the
process. I had notes to prepare for this session, and I think you've all said exactly what I was going to say. For us, that co-design process is really important, as is recognizing the unique value of all stakeholders in the process of an international development evaluation. From a university perspective, we come with certain research or evaluation frameworks, with perspectives around this idea of rigor, but even rigor is a Western concept. So we have to challenge ourselves about that: whose knowledge counts, and why do we even use that language? But really working in that co-design process. And I think the implications for localization are that it does take more time and it does take more money, but the learning is so much richer because of that process. So at the Institute we now make choices about doing work where we can work with local researchers or local consultants, or where we have some familiarity with that context, because otherwise it just doesn't make sense for us to do that work. And so we make sure that we work with local actors rather than working outside that process. We might move to the next topic, which is thinking through decolonizing. Sorry, Karen, I'm going to interrupt you there before we move on. I just want to echo the comments from the Jamboard; they basically echo what you mentioned about involving local actors. There are a few comments about that, and about the resource-intensive nature of localization and how that could work for evaluators. There was an interesting comment as well about the discussion of localization also being relevant to work in remote communities in Australia, which is interesting; there are a few resources I'm pretty sure you're going to share later that refer to that specific component as well. Great, thanks so much. That's really good, and good to see everyone's using the Jamboard. Okay, we might just do the
following: the same order, going around each of us and riffing off each other. So the next topic is more about decolonizing methodologies and mindsets. Sophie, how do you define this? Yeah, look, another curly one. Again, I think it's inherently linked to localization, and to me decolonization is really about understanding and acknowledging the colonial ideologies that shape our work and how they show up in our day-to-day practice, but also recognizing in ourselves as practitioners how our own attitudes, values and the norms within which we've been shaped influence our decision-making and the lens we bring to these processes. In thinking about this session, I was reflecting on some of the colonial practices and mindsets that are still really perpetuated in evaluation approaches today. And again, this is not across the board; there are definitely examples of really strong evaluations that are challenging those colonial mindsets. But there's this value we place on the external, objective lens, and when we say external, read: potentially someone from a Western background, an educated expert in evaluation, with that objectivity bringing a sense of validity and rigor to the findings, and the assumption that that lens is in itself neutral. That's been critiqued as the white gaze on development, the assumed neutrality of Western worldviews. We do still see it quite a lot in our work, and it comes out in the use of our language: we still hear people talk about "going into the field" or being "on the ground", and there are these colonial carryovers in that type of language, which is really othering and really demonstrative of those mindsets that still exist. And more broadly, there's the capacity-building agenda; it's not terminology we typically use at Caritas, where we try to talk more about mutual learning and capability sharing. But I think often we think
about, okay, we're going to have a localized approach to evaluation, so we'll work alongside local evaluators and train them up in our methodologies, our toolkit and the way we do evaluation. So I think decolonization really requires us to reprioritize and revalue different knowledge systems and different ways of understanding impact and change. At an individual level, that requires giving ourselves space for critical self-reflection, which is often very uncomfortable and difficult; as organizations, we need to create space where staff can have those reflections. The ACFID discussion paper on decolonization and locally led development gives some really great questions for that self-reflection and learning. But yeah, there's a lot of opportunity to really interrogate those colonial mindsets that still exist, though as I said, there are great examples of where they're being challenged, and of ensuring, as you said earlier, Karen, that communities are involved in actually determining the scope and the questions of the evaluation, and that that locally led mindset is built in from the very start of a program, because once you get to an evaluation it's very difficult to try to instill a locally led approach. Great. Over to you, Lindsay. I mean, I can be Sophie number two. Sorry, you all want to be Sophie. I feel like I just want to say ditto to Sophie, but I did want to make the conversation a little more pointed, which is: who funds evaluations? I think that's one of the areas we need to be quite critical about. More often than not, it's really pushed by donors, and that doesn't necessarily create room for, like Sophie was saying, mutual learning and capability sharing, or looking at impact and change. I'm also thinking about board decisions, and how boards decide when to pull in and out of programs. There's so much around where money is put in for evaluations, and
it just feels like a good starting point. And I hate to bring this up, but with accreditation, where are the criteria on how INGOs are doing localization? Should that be a criterion? That could be a can of worms that I just opened up, so in that sense I will stop and hand it over to David. Thank you. Look, I agree. I'm a little bit pragmatic when it comes to this topic. I kind of think everything we do as an INGO is colonial; we can't actually get away from that. I have that view because I think we can't undo our privilege, but we can use it in different ways. So the approach I've adopted over time is to try to adopt a curious mindset: one that acknowledges that we have knowledge and expertise that can feed into a particular process, whether programmatic or evaluation, but that expertise also exists in the local context. Going back to my previous comment, my view is that this has to do with reframing expertise. One of the things we've had to work through as an organization is what matters in terms of evaluation to different audiences. Does the purity of results and reliability of data really matter to people on the ground, or are they more interested in the learning, and what is enough to actually facilitate learning? This may get on to the next topic, but in terms of decolonizing, I think we've got to undo our thinking around how things should work, and we have to be curious about alternative interpretations and alternative ways of doing things. Because essentially, if you think about colonialism as being about the expert going in and trying to implement a solution based on that expertise, then ultimately we need to reposition that expertise. And I think that brings into question the role of the evaluator, for me, because often the evaluator is tasked with putting together something that will produce a reliable result. But actually, if we're repositioning
that expertise is the role of the evaluator perhaps more to be a broker or a negotiator or an educator or whatever might be needed in that particular context but I also think and this has been acknowledged I think by some of the comments that there are multiple stakeholders and you know we need to still remember that donors are often funding it with particular outcomes one of my biggest frustrations in this is that many donors are not really funding it well either not funding it evaluation adequately and secondly not funding about well not approaching evaluation in the way that we might like it to be so if evaluations meant to about learning and improving practice often for donors it's about taking a box or justifying efficiency in the way that it's going to be and I don't think that means we can't do it better but I think we have to acknowledge that organizations are actually in that tension between the two spots and I think for me as a manager there's often attention that evaluators need to help me work through which is this you know this idea of how do we do it better and involve local communities and do something that's meaningful for them how do we also learn and benefit from that but also how do we take a box for a donor third of what it actually costs to do it properly so I don't know if that fixes anything but I think ultimately to start doing this better I think we need to redistribute power and expertise and but I think the other thing I'll make a comment on is reframing part of this is about reframing how we view participants and so to be a little bit controversial I think sometimes we're guilty of seeing people as victims of their circumstance or as unskilled rather than seeing them as survivors or local experts or whatever however you want to frame that and and I think there's a lot of bias and assumption that we still need to work through but part of that's actually about creating a community I think amongst ourselves where we challenge and have those 
curious conversations, rather than ones where we just tell each other, "Yeah, great, this is amazing."

I don't know if everyone else is enjoying this conversation, but I very much am. For me, I just wrote notes around that idea of the assumptions we bring. International development has been set up with the idea that there's a problem to be fixed — a deficit in a situation — and that we need to come in with an international development program and, through that program, fix it. I think the fundamentals of international development need to change. So, exactly as David was saying around reframing, we actually need to position ourselves as facilitators of local change agendas, facilitators of local learning agendas, facilitators of aspirations in that local context — reframing our role not as the experts and the owners of knowledge, but as facilitators, champions and supporters of that process. From an evaluation perspective, it really raises the question of who an evaluator is, then. They're not necessarily a content expert or a framework expert or a rubric expert — evaluators love their rubrics. What does it mean for us to facilitate a process of inquiry and learning in that local context? For me, it comes back to the fundamentals of international development, and to switching to a strengths-based approach: assuming that there is knowledge, expertise, curiosity and interest in that local context, and that our role is to facilitate a process of learning and to amplify that agenda, rather than seeing ourselves as the experts, the ones with knowledge, coming in and transplanting that onto people. So this evaluation work is really connected to that broader perspective of what international development is. Reframing, and taking that perspective of recognising strengths, assets and knowledge in the local context — and recognising those in-country mindsets and methodologies — is really important. We've got some resources to share with you on that as well.

Okay, we've run out of time. Could you each talk briefly — just one minute — about what shoestring evaluation means to you, and the trade-offs? I'm going to give you both 60 seconds: shoestring evaluation — what does it mean, and what are the trade-offs and benefits?

Great. From my perspective, when we talk about shoestring evaluation, we're talking about evaluations done under very short time frames, with limited budget and limited existing baseline or ongoing monitoring data — so we really have to bring creativity into doing our evaluations. They may end up with a much smaller scope or sample; we might be relying on internal staff, or relying a lot on monitoring data or proxy data. But I think there can be real benefits. Being able to shift away from a big, formal, tick-off-all-the-OECD-criteria type of evaluation can actually open space for a much more genuine, outcome-harvesting type of process — understanding meaningful change from the perspective of participants. A key trade-off, to my mind, is the participation burden. A lot of the resources around shoestring evaluation talk about working with your in-country or program staff, or with volunteers or community members. What is the participation cost involved in that? Are we saving money by not sending over a very high-fee external evaluator, but at the cost of those communities or local staff who are leading the process? So there are trade-offs, but the critical questions to think about are: what is the actual purpose of the evaluation, and is an
external evaluator required, or can this be completed within our existing resources? What are the size, scale and complexity of the program, and what existing data is there? And then there's that critical analysis of what the benefits and trade-offs might be in that context, working with local partners and communities to understand what the needs are for that evaluation.

Great. Lindsay, I'm going to skip you and go straight to David, just for one minute, because I really want people to get into the breakout rooms.

David: I'm not an expert on shoestring evaluations, but I would argue that most evaluations are shoestring. The critical thing is to always focus on why the evaluation is happening, and to remember that you're always going to have people like me trying to encourage you to save money. From my perspective, it's not always critical that the data is ironclad; sometimes what matters more is what the evaluation is trying to communicate, or what we're trying to learn — there are lots of different factors at play. So try to focus mainly on the why, and don't forget to educate internally. We also try to push back a lot to donors and to renegotiate, but sometimes you have to find a compromise between their colonial requirements and what you'd prefer, and work out what works best.

Great, excellent. All right, that's our panellists — well done, everyone. I'm going to put you into breakout rooms now, and on the Jamboard, on page four I think, I've put the guidance for the breakout room session. Basically, introduce yourselves — we'll have about 20 minutes — share a little bit of background, but then get into those questions: What experience have you had with localisation? What experience might you have with shoestring evaluation? What has been your experience, or what lessons have you learned, from decolonising mindsets and methodologies — where are you in that process? Is this language completely new to you today, or are you in that process of curiosity and self-reflection that Sophie and David described? And how does evaluation get prioritised through localisation? Then, looking ahead: what more could we be doing, as a group and as a sector, to advance these areas?

Karen, there are also some questions on the whiteboard about the last topic — perhaps the people who asked those questions could raise them in their breakout rooms. Great, perfect, excellent. Enjoy your breakout rooms, everyone.

Welcome back, everyone. Some of you had very intimate breakout rooms with just two people — very nice, deep conversations for those people. We've got a bit of time for some conversation: any highlights or questions remaining from your breakout rooms. We won't do anything formal, because there's a smaller group of us now, but just put your hand up or unmute yourself and share a highlight, a remaining question or a takeaway from your breakout room.

Oh good, you go — I feel like I talked too much. No, no — you were a panellist, so that means you have to talk first. Okay — decolonising the webinar! My name's Jen, hello. I was in a small group with Eva — sorry, we got cut off before we could say goodbye — and Karen. One of the things I was sharing was around being very committed to that localisation process, but then also finding that I was working with people at the provincial and local level who were falsifying data — and how to deal with that as the external evaluator. So I was really trying to have those curious conversations to
understand why, and what pressures were on people to falsify their data. For context, I work in infectious diseases — HIV and TB — so that sort of epi data is really critical, but not always well collected.

Great, thanks Jenny. Any other reflections on the breakout room conversations?

It's a reflection — Jeff from New Zealand here. On the breakout, but really the whole conversation: that kind of hidden thing — it's all about power, isn't it? It's about brokering power. I reckon we as evaluators are among the very few who can move between the different spaces where there's more or less power — we move around, while most other people are locked into one space — and that gives us, I think, a sacred task.

Wonderful, thanks so much, Jeff — that was very powerful. Barbara?

Thank you. I just wanted to share that we started to talk about the use of arts-based and culturally appropriate tools when working in communities, with communities — of course bringing in whatever you know might work, things you've used before, arts-based and so on, but also trying to find out some of the local ways that people express themselves: making drawings, perhaps, or creating a song, things like that. It's as much the process of doing that together that you learn from as what you collect. I shouldn't say the royal "we" — I felt that in our conversation we were really finding ways to engage and learn from each other, but also to take with us, with permission, artefacts and things that show that local story. We can still knock off a survey with all the tables — and I wonder which bit people will really read. I know when I flick through the tables I go "yeah, yeah", but if I can read some of those cultural presentations of learning, I want to read more of that person's work, that community's engagement.

Great, thanks so much. Any other reflections on the breakout rooms, or the conversation more broadly?

Hi everyone, I'm Lindsay, and Brenda and I think what was really great about our conversation was talking about the opportunities of a shoestring evaluation rather than the constraints. So yeah, that was awesome.

Great — what were some of those opportunities? Yeah, we were talking about how, when it is shoestring, it can open up a very innovative mindset, precisely because you have all these constraints. It should be more participatory, where you look to the community to see the ways they are already doing M&E, because I believe that's already built into different cultural frameworks. I think it brings out that curiosity David was talking about, and because you have the constraints of time and money, you're going to look at it differently — out of necessity, really. And I think you used the words "hyper focus", Lindsay, which was really good: you really focus on the added value and the priorities of the project.

Hmm, yes. I think it's also about recognising that with every question we ask, we're engaging in a change process as well. While an evaluation might be seen as a point in time — for the purpose of the funder or donor — it's actually part of what that community, or that government organisation, is doing in that time and space. The evaluation should be contributing to whatever their preferred future is. So that's really important too: recognising the generative value of the evaluation to those people in that place, rather than seeing it as something done at a point in time before you shift on to the next thing — because the questions you ask at that point are going to be hugely influential, informing how they engage with each other and seek to generate change going forward.
Barbara's nodding extensively — thanks, Barbara, that's great. Any other thoughts from the breakout rooms? We've just got a little time.

I just wanted to add, about shoestring evaluation, that I found it a very interesting conversation, because my first reaction was that I just don't like the term "shoestring evaluation" — but as we unpacked it, I came to understand why. Just a short background: I came to Australia a couple of years ago, and most of my life I've been working in Peru — I am Peruvian. So for most of my professional life I've been trying to do this work with very, very tiny budgets, being very creative about how we can use our very limited resources. So I was thinking: if there are important questions that need to be answered with a better-resourced evaluation, then we should actually fund those. But then we can go back to the conversation of: okay, what is this evaluation trying to answer, and to whom is this evaluation answerable?

Yes — and also, what does it mean to have participatory evaluation? Are we actually valuing people's time? As we evaluators go in-country, as Sophie said, to the field, we get paid for five days' work or whatever, and then we're holding all these focus group discussions and interviews — is those individuals' time actually valued? Are we paying the farmers, the people who have to squeeze everything into their own lives? Increasingly we're really thinking about that as well: what is appropriate compensation for people's time, given the cost–benefit, and the costs that might be associated with participating?
Tough questions. Okay — anyone else? Yep, one last point. Great, thank you, Linda. Sorry, I couldn't find the little yellow hand. Oh, good one — we can still do that, can't we?

I just wanted to share a technique, because I work on gender equality, and you want to find out the understanding of gender equality issues — what the issues are in the community, the group or the organisation. We found, going across different cultures and different countries, that you could always ask people to identify, and even sing, a song that has a woman's name in it, or is about a woman, or about love, or about a man, or some alternative. Then you get the whole group talking about and analysing that song, and it would be culturally specific to that particular group. It was a really lovely experience — the way they would immediately pick up on popular songs, a bit like champions in their country — so you could build the whole questioning and discussion around something that belonged to them. A familiar technique.

Wonderful, thanks so much for sharing — that sounds very positive. I would like to have heard one of those songs. I'm going to share my screen again — are you seeing the right side? Yep. Great. So, these are some resources we've put together. Feel free to contact us, or take a screenshot or a photo. I have read three out of the four, and they're really wonderful — particularly Decolonizing Methodologies: Research and Indigenous Peoples. If you haven't read that, it's a fantastic read, and quite confronting, really. I think we also talked about that today: the need for our own reflection, and to feel a bit uncomfortable, in our continued practice of decolonising mindsets and methodologies. So I really do encourage that one, and there are some others there for you as well, which really cover those issues.

We've had our breakout rooms, we've had our report back, so the last task is just to say a huge thank you to everyone for coming, and for your participation. We do have a Mentimeter, which I think Flo will put in the chat box, that we asked