Whakataka te hau ki te uru, whakataka te hau ki te tonga. Kia mākinakina ki uta, kia mātaratara ki tai. E hī ake ana te atakura. He tio, he huka, he hau hū. Tīhei mauri ora. And for those of you, I love knowing the translation, so I'd like to just read to you what it translates to, because it's lovely. Cease the winds from the west, cease the winds from the south, let the breeze blow over the land, let the breeze blow over the ocean. Let the red-tipped dawn come with a sharpened air, a touch of frost, a promise of a glorious day, and a glorious session in this instance. Duane, I would like to hand over to you and let you do an acknowledgement of country, and then we shall kick off. Thank you. Yeah, I'd just like to acknowledge all First Nations people and the countries that we're all sitting on. Where I am in Perth here, I'm on what they call Whadjuk country, which is part of the Noongar Nation; there are 14 clan groups that are part of the Noongar Nation. So I'd like to acknowledge all First Nations people right across where we're holding this meeting, and to acknowledge elders past and present, and also emerging. Thank you. Thank you. Well, it's my great privilege to join you from Wānaka in the South Island of New Zealand, Te Waipounamu, which, for those of you who don't know what that translates to, is the land of the greenstone. Pounamu is greenstone, as I think many of you know. And I've got the glorious Southern Alps with their beautiful sunshine out my window today. Just a couple of brief housekeeping matters on that point around where you're joining us from. If you have a moment, we would encourage you to use the chat function and just say kia ora, hello. And if you so choose, you can pop in where you're dialling in from; it's always nice to know the geographic regions. And if you happen to know the country that you're on, include that as well in your own acknowledgement. We will crack on swiftly.
The only other two key things I want to let you know: if you have any issues, try using the chat function to privately message Sunita or Marie. Sunita is AES New Zealand and Marie is Manus, so they can wave to you. And if you get really stuck, then you can email Michelle, but that would be a last resort. In terms of the session, we've got the panel and they're going to each speak for about 10 minutes. What we're going to ask is that you hold your questions, note any questions you have down on a piece of paper, and we'll then ask you to start popping them into the chat once the three speakers have spoken. It just means that your questions don't get lost further back in the chat function. And the other final thing is that we've got the slides being driven centrally, so there will just be the odd delay sometimes as they're advanced on behalf of our speakers; given we've got three speakers, we felt that was going to work best. So I'm going to shush up now and hand over. The first speaker is Duane Radcliffe from Community First Development, and I'm very much looking forward to your presentation, Duane. Okay, thank you, Kara. Yeah, just to let everybody know who I am: I'm a Yamatji Naaguja Wajarri man. My country is about 350 km north of Perth. Yamatji country is a generic name for a collective group of people, and there are about 16 language groups involved in the Yamatji Nation; Naaguja is who I am. And the other side is Wajarri, which is another language group. So I'm part of two groups up in Yamatji country. I'm also a director with the AES, a current director and former vice president, and currently the regional manager for WA with Community First Development. We've just undertaken a change of name; we were formerly known as Indigenous Community Volunteers. And we predominantly do a lot of work with First Nations communities nationally.
Basically, I look after all of WA, working with communities urban, regional and very remote. Okay, if you flip through to that second slide. So, we're a national community development and research organisation: we promote the skills, talents and culture of our people and facilitate community activities that lead to positive change for their communities nationally. Click on the next slide. How we work: we provide practical support, and we have a network of skilled volunteers that we match up with communities. All the work that we do with communities is 100% community driven. So we don't go in with our agenda; everything we do, we work towards the community's vision. And we only go in by invitation. We don't do things for the community, we actually work alongside communities. And every project that we do, we monitor and evaluate as well, to prove that we have an impact. I'll get you to move to the next slide. This is our community development framework, and this is basically how we work with communities. So our approach involves working by invitation, which I mentioned earlier. When we sit down with communities, we do some active listening and deep listening, and we share our learnings as we work with communities on activities. We spend a lot of time understanding the community's challenges and vision. So when we work with communities, we don't just breeze in and breeze out. We take time to understand the circumstances of the community, all the background, all the issues. And we tend to work on average about five years with communities; we've worked with communities for longer, up to seven years at some stages. We believe that's the best approach. We build on local strengths, knowledge and resources, and we also match our skilled volunteers to the activities, which are co-designed and directed by the communities. And we acknowledge that we work at the community's pace.
We don't go in and just force an agenda at a pace that the community can't keep up with. So we work at the community's pace. Embedded in our project work and community development activity is our monitoring and evaluation; everything is embedded into our everyday work. I'll get you to flip across to the next slide. So basically, when we work with communities, like I said, we sit down and listen and work with the communities to understand their vision and work towards that vision. And we plan things together and match the volunteers. Sometimes what seems impossible is always possible, because we're always very solution-focused about what we do and our approach. As you can see, this is feedback that's come back from a community that I've worked with in the past, out at Pijara. We'll flip along to the next slide. When COVID-19 came up, it kind of threw us into a bit of disarray. Actually, it kind of didn't, because the community resilience shone through. We anticipated that things would slow down, but they actually never did. If you flip along to the next slide. With COVID-19, First Nations people were at particular risk of infection, so it left many of our communities isolated and vulnerable. We're a pretty agile organisation, so we took swift action when the restrictions were put in place to keep communities safe, yet we still remained connected to those crucial projects. And the communities responded just as quickly. Together we came up with innovative and different ways to keep projects moving forward. Obviously, even with the innovation, there's nothing more powerful than face-to-face meetings. What slide are we on? Sorry. On the innovation slide, Dylan. OK. So normally we have face-to-face meetings, that's generally how we operate, but with COVID-19 we then had an opportunity to have a yarn with communities through video conferencing, interactive whiteboard sessions and screen-sharing.
And we got on with our job to keep progressing critical community projects. In April alone, during the harshest shutdown here in Australia, we were still able to keep 57 active Community First Development projects going across the country with different communities in remote, urban and regional settings. I'll get you to click on to the next one. Sorry. So, this is one of the examples where we were able to keep on working. This is Jimmy working with Angie on a project to support digital literacy work planning. Even though we were working online instead of meeting face-to-face, we were still able to maintain that relationship. And the critical part of our work is keeping that relationship going within the communities we work with, to keep activities going and to progress things. That's very important. Some of our communities had less experience using online platforms, whereas others were very experienced. I'll get you to click on to the next slide. This is a particular project I've been working on, with the Aboriginal Males Healing Centre. As you can see, that's me there with the beanie, working with Devon Cuimara, the CEO of the Aboriginal Males Healing Centre, and architects and engineers, putting together a plan to build a healing centre in Newman in WA, in the Pilbara. We were already very familiar with using online platforms, but if I get you to click on the next... Some of the communities that we work with had less experience. So it meant building the capacity of the community to use video conferencing prior to doing evaluation reviews of CD activities. And once the communities were comfortable with the technology, they were able to progress the CD activities and do evaluation.
It took a bit of investment of our own time to do this work, but once the communities got on board and learned how to use online platforms, it became a lot easier to do our community development activities and our monitoring and evaluation, as you can see. This particular project here is in Yuendumu in the Northern Territory. People became very confident and, yeah, we got things moving. I'll get you to click on to the next slide. Now, sorry about this. Another thing that we've done: we developed a yarning tool in response to our learning from the first two stages of our Action Research Project. The round one interviews used a more formal, semi-structured interview approach, and it kind of felt a bit foreign using that approach. We found that yarning was the best way of communicating with communities, particularly online, where the way we used to meet face-to-face wasn't applicable. So yarning became the most effective tool that we use to talk deeply with First Nations people about this particular project, which is about understanding governance practices in First Nations communities. And as you can see there, you'll see one of the tools that we use: the seed-to-tree approach. We adapted it into our project work to gauge where a community was up to. It could be developing a strategic plan or a business plan; it gave us an indication of where the community was, and helped us capture some data and information. I'll get you to flick across to the other part. As part of the first round of interviews, we checked in with communities on the emerging things that came out of the first report. It was more challenging to do this online instead of face-to-face, but it worked. I'll get you on to the coding analysis.
I suppose, out of this action research work that we're doing, over the last three months our community development team have been applying the knowledge gained through our community of practice sessions. So we're always doing reflective learning processes and activities, which is really embedded in our workflow. The M&E doesn't sit outside; it's actually embedded right into our workflow. So we're learning. We've got a learning culture within Community First Development, and we do community of practice sessions to analyse our work. Part of that was to code the Action Research Project into themes, and we have done it collaboratively as a team, but also within smaller regional teams and working groups, using specifically prepared data. If you click on to the next screen, with the working group interviews, one of the things we've done with this learning is use a virtual whiteboard, where we're able to organise the themes into collective groups, colour code them, and code all the themes from the research into groups. And then we're going to further analyse that information. We found the interactive whiteboard was very good for us, particularly given that we're a national organisation and we have different people from different areas. We used Miro in this case, and we were able to shift these sticky notes around, clump them into groups, and then analyse the key themes that have come out of this research project. On to the next. I think that's about it. Basically, for Community First Development, it's been a learning curve. We were able to act very quickly, we're quite agile, and we also used our existing relationships with the communities that we're working with to get in there and start building the capacity of communities to use online platforms, to continue our community development work but also to monitor and evaluate the activities that we're doing on the ground.
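For readers curious what that sticky-note clustering amounts to in data terms, here is a minimal sketch, in Python, of grouping coded interview excerpts into theme groups the way notes get clumped on a virtual whiteboard. The notes and codes below are invented for illustration; none of them come from the actual project.

```python
from collections import defaultdict

# Hypothetical coded excerpts from round-one interviews: each note carries
# the raw text plus the code a team member assigned to it on the board.
notes = [
    {"text": "Elders want a say in project timing", "code": "community pace"},
    {"text": "Video dropouts during review sessions", "code": "connectivity"},
    {"text": "Yarning felt more natural than formal Q&A", "code": "yarning"},
    {"text": "Patchy internet in remote sites", "code": "connectivity"},
]

# Clump the sticky notes into theme groups, ready for further analysis.
themes = defaultdict(list)
for note in notes:
    themes[note["code"]].append(note["text"])

for code, texts in sorted(themes.items()):
    print(f"{code} ({len(texts)} notes)")
```

The same grouping step scales from a handful of notes to a full coding frame; the analysis then happens per theme rather than per interview.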
We're always learning. Just because we've developed one way of doing something doesn't mean we stop there; we're always learning and we're prepared to try new things in different circumstances to keep projects happening and going on the ground. I hope I've made myself clear, but if there are any questions at the end, I'm prepared to answer them. Thank you very much, Duane, that was really great. There was just a little technical glitch with that slide set, which is why we could only show it to you that way. John Fargher is going to be up next. He's going to be sharing his experience of working remotely in some of his international development work in Indonesia. I'll give Sunita a couple of moments to find the slides and pull them up. Thanks to those of you who've sent messages through; do get in touch if you've got any questions or issues. And just double-checking, I haven't had a chance to ask, but if the other speakers are happy for us to share slides, we'll get those organised as well. Sunita, there we go. Perfect. I'll put myself on mute and invite John to step up to the plate. Thank you very much, everybody. Welcome. Thank you, Duane, for an interesting kick-off; interesting tools and methods there. My name's John Fargher. I'm an evaluator who's been working internationally for 40-odd years. I've worked in war zones, in natural disaster areas during the immediate aftermath, and in post-conflict zones, doing evaluations on the ground. I've also recently started doing some remote evaluations, and I think when things are not very good, remote is better than actually being there, particularly in war zones. It's been a learning curve, like Duane said, but I think we've learnt a lot. So if we go to the first slide, please, Sunita. Or the first one with anything on it. So this is in eastern Indonesia. It's a project. If you click one more time, please, Sunita. Thank you.
It's a very large Australian and World Bank financed agribusiness program in the six provinces to the east of Indonesia. So it's a very big area, including Central and East Java, which are quite middle income, right through to Papua, which is very poor and in many ways has similarities to parts of remote Aboriginal and Torres Strait Islander Australia. And in fact, down in the southeast of Papua, it is Australian: they have kangaroos and cockatoos and melaleucas and eucalypts, and it smells and sounds like Australia, but people speak Papuan languages and Bahasa Indonesia. We're working with a lot of private sector business partners in this program, and it's into a second phase after a first phase of four years; we're 18 months into that second phase. One of the ways they assess progress in this program is to have a six-monthly independent formative evaluation. So it's a learning evaluation, and there are just two of us who implement it, Rob Hitchins and me. We've been doing this for nearly six years now, so we know the people, we've got existing relationships, we know the country and the different locations, and we know many of the interventions. That helps with doing things remotely. Normally we would have 14 days, once every six months, to do fieldwork, interviews and document review: the fairly standard sort of progress and quality evaluation. But going forward, if we go to the next slide, please, Sunita, there are a number of opportunities in working remotely, some challenges, and then I'll tell you our approach. To start with the opportunities: because we have existing relationships and time series data, it was much easier for us, with document review and some targeted interviews and questions, to update information and get a sense of progress and quality. It would have been very difficult if we'd never worked with these people before and it was a sort of greenfields evaluation.
Because we did less fieldwork, there was a lot less distraction for the implementation team, and we used that saving to create an opportunity to deep dive into selected themes, because you have more analytical time when you're not planning field trips and bouncing around in Land Cruisers in the middle of nowhere. And we were much more focused in our discussions. Because we know each other, when we get together there's always banter about how the family's going and what's happened with this and that. That's part of being human, and so it's important, but we were very much more efficient remotely than when we were together. So, some of the challenges. If you go to the next one please, Sunita. Just click it one more time. That's it, thank you. The decision to conduct this most recent evaluation, in March, remotely was made three days before our planned start. I was actually in an airport on the way to Jakarta and Surabaya when the decision was made. So this was a shock, and the logistics required a lot of change. The purpose of the evaluation was exactly the same, it did not change, but clearly the schedule and the approach did change. There was a challenge with less informal reflection and behind-the-scenes talk. As you all know, when you're conducting an evaluation there are formal processes, including the sorts of processes and tools that Duane talked about, yarning tools and whiteboards and so on, which are very useful, and formal discussions, which are very important. And that includes body language and informal reactions that you can't see remotely: things you would pick up in a room or under a tree with somebody that you don't pick up virtually. We had a big challenge with variable bandwidth. We had some participants in Geneva who were fine, and we had some participants in areas of Papua, like Wamena, which have trouble with electricity, let alone bandwidth. That'll be familiar to our Aboriginal colleagues in Australia. So it was a challenge.
Not everyone was familiar with the various voice-over-internet platforms, like the one we're using now. And we were working across 13 time zones, so that restricted when we could realistically operate, to be fair to everybody. And so, the approach we adopted. If you go one more time please, Sunita. Thank you. We used document review and semi-structured interview methods. We decided to use just two languages, English and Bahasa Indonesia. Normally when we're working in Papua we would also use Papuan languages, but because of the delays that involves, we decided, and our Papuan colleagues agreed, that we'd use Bahasa Indonesia with them. Normally when we conduct these evaluations we'd be working in four or five languages, again something that's quite familiar to many colleagues in Australia at least, but we agreed on two. We did conduct a deep dive, and the theme that our partners wanted to look at was sustainability, particularly around the private sector. And we deliberately included time for feedback and reaction from participants, because we weren't able to pick up those informal things; we wanted to give people time to reflect and come back, not immediately after a discussion, but later that day or the next day. And so, a number of lessons. If we go to the next slide, please, Sunita. Thank you. Just click once for the first one. These are things that we found from this experience. Having a very clear purpose and understanding of the needs: we want everybody involved in the remote evaluation to understand why we're doing the work and what we're going to do in the particular meeting or interview, so it's efficient. Thanks, Sunita. We had a very practical schedule of interviews: six one-hour interviews per day, with about half an hour between interviews to allow the evaluators to confer, and also time for food and other breaks. People need to go to the bathroom, they need to eat, they need to have a break. Next one, Sunita, please.
We learnt that you need human resources to schedule and manage the interview logistics: setting up meetings in Zoom or Teams or whatever platform you're using, making sure documents are available at least a day in advance, making sure the right people are coming. And in our case, we learnt on the fly that the language and protocols for each interview have to be agreed up front. We're about to repeat this in August, and we're going to send round the protocols and language and so on for the interviews well in advance, so that people are prepared; that will make things more efficient. We used video at the beginning so that people could see each other, but then it was off, because many people, myself included in Yundi on Ramindjeri country, just don't have the bandwidth. The NBN, the National Broadband Network, is very weak here, it's a satellite link, again similar to many remote communities in Australia. So: video off, microphones muted as we're doing now, use the hand-up function and the chat functions. And we managed the time very tightly, as Kara is about to do and wave her finger at me; we kept things very tight so that it worked efficiently. And then at the end of all of that, as we do anyway, but normally face to face, we had a very carefully facilitated feedback session for implementers and funders, so that after Rob and I had done the data collation, analysis and reporting, there was time for clarification and reaction before the final report went in. So a fairly typical evaluative process, just giving people a bit more time so that they could do that. So there's a few bits and pieces in there that hopefully help people as they prepare for their own remote evaluations. I'll hand back to you, Kara, thank you. Thank you very much, John. And I'm going to pass on an advance apology from John, who has to leave right on the dot of 3.30.
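The scheduling constraint John describes, being fair to participants spread across many time zones, can be made concrete with a short sketch: given each participant's time zone and a band of acceptable local hours, find the UTC start hours that work for everyone. The participant locations and the 7 am to 9 pm window below are illustrative assumptions, not details from the evaluation.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical participant locations (the real evaluation spanned 13 zones).
participants = ["Australia/Adelaide", "Asia/Jakarta", "Asia/Jayapura", "Europe/Zurich"]

def fair_utc_hours(zones, earliest=7, latest=21, day=datetime(2020, 8, 3)):
    """Return the UTC start hours at which every participant's local time
    falls between `earliest` and `latest` o'clock on the given day."""
    fair = []
    for hour in range(24):
        start_utc = day.replace(hour=hour, tzinfo=timezone.utc)
        if all(earliest <= start_utc.astimezone(ZoneInfo(z)).hour < latest
               for z in zones):
            fair.append(hour)
    return fair

print(fair_utc_hours(participants))
```

With these four zones the workable window shrinks to a few mid-morning UTC hours, which is exactly why the schedule capped out at six interviews a day: the shared window, minus breaks, is simply not very long.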
So if you have any questions for him, you might like to send them through now so he has a chance to respond briefly via the chat function, but I don't want to cause a distraction from our final presenter. Donna, if you wouldn't mind getting yourself ready; here are your slides. Donna's got a beautifully colourful slide deck, and many of us have inadvertently dressed to match. I want more colour in these remote connections. So Donna, I'll hand over to you. Thank you, Kara. And hello, everybody, from Kaurna land in South Australia, beautiful Port Noarlunga South at the Onkaparinga River mouth. It's really great to have an opportunity to share some of the emerging thinking and lessons that my team and colleagues have had as we've been struggling to undertake an interim evaluation of what is actually an extraordinarily complex global multi-stakeholder partnership called the PRIDE program. I myself am an international development evaluator, and I'm working on this program with an organisation called Edge Effect, which works with LGBTI communities. It's based out of Melbourne and looks particularly at how to include those communities within development efforts. The program itself is delivered by an organisation called COC, which is based in the Netherlands. So this is an evaluation partnership for a very complex program. Sunita, if I could have the next slide, please. This is just a really brief overview of the PRIDE program. Don't look at the theory of change too much; it's just showing the complexity of the many, many pathways. This is an 18.5 million euro program that runs over four years. It's part of the Dutch Ministry of Foreign Affairs' Dialogue and Dissent program, a human rights program that works with particularly vulnerable and marginalised groups worldwide. PRIDE works in 16 countries, in five regions.
It has 125 direct partners delivering over 300 different activities, and it also has approximately 30 advocacy and technical partners that support the delivery of its global advocacy efforts. I think one of the really important things to recognise about this program is that it is a human rights program. It is about dialogue and dissent, and it is working in extremely narrow spaces. Over 80% of the countries in which it's working are countries such as Iraq, Ghana, Pakistan and Tanzania. So you can imagine that working with LGBTI communities in those sorts of spaces presents significant risks to protection and security. There are five pillars to the program; I'm not going to go through them all. As well as those five pillars, there are also six cross-cutting issues that work across all of the pillars, and they're listed there in my presentation. I really don't want to go into exactly what it's doing, because that will cut into our time, which is really for thinking about how we have addressed the shift in this evaluation given the COVID context. I guess the first thing to note is that when we actually applied for this evaluation and won it, COVID wasn't a thing yet. We were working on the tender back in December and January. We were notified in late February about the work, and at that point in time COVID was just coming into play. We took quite a lot of time, I think, to work with the partner on recognising that our intended fieldwork would not happen. They were really holding on right up until the end of March, and even into April, in the hope that fieldwork would happen at some point. The big issue was really trying to rethink, and to help the partner rethink, what would be possible. Obviously, the first thing we think about when we're looking at a shift in context is changing the timeframe.
That wasn't possible, because this evaluation actually needed to be delivered within a set period of time. The money was going to run out; the partner decided that they still wanted the evaluation to go ahead, and wouldn't make a budget available beyond the funding period. So we looked at the scope of the evaluation, and we really worked with the partner to focus on the purpose and the utility of what this evaluation was about. It was initially designed as a summative evaluation, but at the time we were designing the evaluation itself, the organisation, COC, was notified that they had an additional four years of funding for a new stage of the program. So in fact this evaluation became almost formative in some ways. We agreed with them to focus on learning, on the effectiveness of the approaches they've used over time, and on exploration of risks, and as a result we also decided to maintain a focus on using really highly participatory outcome mapping and outcome harvesting processes. The graphic on the bottom is really just to show how we looked at our stakeholder analysis while also defining the utility and the scope of the evaluation. We focused on the red and yellow area, which is where the program had the greatest influence and control over its activities. So we're really focusing on the key actors within the global LGBTI community, the methods that the program used to engage with them, and then the methods that those actors used to engage with their communities downstream. We're not so much looking at impacts, et cetera, which I think we would have been if we'd been able to do the fieldwork. Next slide, please, Sunita. Okay. So, the plan is on the left; it was pretty straightforward. I'm just going to talk through some of the changes that we made. We decided to go to a revised process where we continued to use outcome mapping and outcome harvesting as our key methods.
But instead of doing what were five deep dives in the field, we moved to doing nine remotely delivered case studies, and I'll talk a little bit about those case studies later. What that meant was that our team of two evaluators, backed up with a bit of extra support for quality assurance, ended up as two lead evaluators (an evaluation director and a lead researcher), eight local and international researchers, and also a data analyst. We added a fairly significant component to the program around data mapping, to really try and help us understand what was happening in the evaluation. We decided to move towards doing summary country reports. We ended up with a survey, and we brought a partnership health process into the evaluation as well. That was new, because the partner knew that they'd received that additional funding, so they really felt the need to explore how the partnership worked. Something that was really significant for us was a big shift in the way that resources were used. Where we originally had 30% of our funds for field costs, we still ended up using the entire budget, but only 5% of funds went to field costs. I'll talk about that in a moment. So that's sort of what the evaluation looked like. Next slide please, Sunita. Okay, so these are just a couple of the real challenges that we found as we've planned out and are now implementing this evaluation. The first one is just dealing with the complexity shown within that theory of change and within the different cross-cutting issues within the program. We're really looking at a very diverse movement and a diverse set of activities and pillars.
So it's been a challenge for myself as the evaluation director and for my colleague Clare, who's the lead researcher, in terms of thinking: how do we bring these eight different researchers together into one mechanism so that we can get that meta-analysis and help with our high-level assessments of effectiveness, efficiency and so forth? We ended up running the case studies concurrently over a set period of time. At the same time we ran a survey, and at the same time we managed a whole series of high-level key informant interviews, so that as we were meeting really regularly with the research teams, we've been able to bring data in from those different pieces of work. So that's been a real challenge for us. We also engaged a data analyst, who spent time accessing the COC database and was able to very heavily mine the data, coming up with clear descriptions of activities, maps of activities, actual spends, and analytics that helped to define those case studies. We met regularly with the researchers and really needed to take time to create a safe environment for them. The second key issue in this evaluation is the security and protection risks associated with working with LGBTI communities in very narrow spaces. One of the ways we managed that was to look to the actors for solutions. We recognised that the brave people doing this work in very narrow spaces are used to working in those spaces and have workarounds and solutions, so that was something that was really useful for us. We used consent, obviously, as a first principle, but also opting out of any line of inquiry if different actors didn't feel confident to respond.
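To make the data-mapping step Donna describes a little more tangible, here is a minimal sketch of one way an analyst might mine per-partner records to shortlist case-study candidates: rank partners by a simple weight of activity count and spend. Every name, figure and the weighting itself is invented for illustration; the actual analyst's method and the COC database structure are not described in the talk.

```python
# Hypothetical per-partner records of the kind a data analyst might pull
# from a program database (all names and figures invented).
partners = [
    {"name": "Partner A", "country": "Ghana",    "activities": 24, "spend_eur": 410_000},
    {"name": "Partner B", "country": "Pakistan", "activities": 6,  "spend_eur": 95_000},
    {"name": "Partner C", "country": "Tanzania", "activities": 18, "spend_eur": 280_000},
    {"name": "Partner D", "country": "Iraq",     "activities": 31, "spend_eur": 520_000},
]

def shortlist(records, n=2):
    """Rank partners by a crude footprint score (activities plus spend)
    and return the top n as case-study candidates."""
    def footprint(r):
        return r["activities"] * 10_000 + r["spend_eur"]
    return [r["name"] for r in sorted(records, key=footprint, reverse=True)[:n]]

print(shortlist(partners))
```

The point is not the particular score but the workflow: descriptive statistics from the program's own database, rather than field visits, drove the choice of where to look deeply.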
I found that identifying researchers from the LGBTI community itself was really, really important, and in some cases using local researchers who were aware of the work being done, who understood the communities themselves and the challenges those communities were experiencing, was really fundamental. That was a really key piece of work, and it helped to build trust and relationships. We have very, very clear security protocols: headphone use, video opt-in and opt-out, informant-driven times and locations, informants determining what sort of media they want to use. We managed to negotiate access to the partner's server, so our information is actually held on their server, because they were able to provide a higher level of internet security than we could, and our data analyst is also a data security expert, so we're able to draw heavily on her. Next slide, please. Another challenge is that verification still remains an issue as we're moving through things, in particular getting access to verification of secondary and tertiary evidence, particularly around allies: so not necessarily actors in the movement who are part of the activities, but the government service providers they're connecting with, the police, the authorities and so forth. That's something we're just trying to work around at the moment. We have clearly agreed limitations with our partner, and we maintain really clear lines of communication around that. The other area, obviously, is just location and relationships: we've got researchers in about eight different countries. I'm the evaluation director, based in South Australia.
The lead researcher is based in Canada and the partner is based in the Netherlands, so just getting all of us into a room means that somebody's up at one o'clock in the morning and somebody else is up at six o'clock in the morning. Those sorts of things have been real challenges, so we rotate our meetings, and I've found that's been really great because it's keeping everybody quite well informed. We also felt that we needed a lot of time to focus on process, not just with the client but also within the team, and to build a high degree of trust, so that we could get the contestability that we wanted for the robustness of the evaluation. I'm conscious of time, so I'll just run through a couple of last points. For me, these are the things that I think were the most valuable in the approach we've been taking. We've used a partnering approach. We've built engagement and internal contestability with the partner themselves, so we've really asked them to develop their internal narrative. They co-created the analytical framework, which was a definition saying: this is what we think success for this program would look like. We've consistently engaged them in outcome harvesting type sessions, so that we're working through with them: why was this change created? What do you think the pathways were, et cetera? That's taken a lot of time and a lot of really important partnership and trust building. Mining the data has been absolutely critical; the role of the data analyst has been very, very fundamental to getting the case studies off the ground and to understanding where the stories are within a program of this size. Bringing that person in right up front, to give us a really strong picture of the program, was absolutely critical, and we've now included the data analyst in the lead evaluators' team.
So she sits with myself and with the lead researcher to make sure that we're able to cross-verify information and so forth. Case studies have been fantastic. They've enabled us to really dig deep into the theory of change, into understanding how change happens, and to articulate that really well within the report itself. So I'm actually really happy, and I would say, arguably, that we've got a better product than we would have had by running traditional in-field sessions in country. Again, as I've said before, working with local and international researchers from the community itself was absolutely fundamental, both in terms of informants and actors being comfortable to speak, and in terms of researchers being able to listen and really understand the complex dynamics of what's happened in each of those areas. From my perspective, and in full disclosure, I'm also a partnership broker, and we brought a partnership health check process into this. That has been really interesting, because through that process we've been able to identify a whole sphere of work within this evaluation around the way the donor and the funded organisation, the COC, have worked at political levels with embassies in countries, with the UN, et cetera, to raise critical human rights issues around the LGBTI community. So that has been very valuable. And finally, Sineta, one more slide please. Funding flexibility was absolutely fundamental. We have used all of the budget, but we've really had to invest in researchers, and we've had to invest more time in processes and so forth. In the end (excuse me, my dog's just popped up) remote evaluation is actually not cheap. So that was a key issue. And we also had to take care not to set up perverse incentives in teams around people's participation in the process.
And then I'll just talk quickly to the last one: the focus on internal team relationships and agility. For us, building that really strong rapport, not just with the client but also within the research team, has been so important in terms of addressing contestability and so forth, and in making sure that different researchers can connect with each other and talk about where the intersectionality between their different pieces of research is falling. That was also tested just recently, when the lead researcher's grandparent died and they needed to leave the process for a few weeks; we were actually able to have other researchers come in and lead research meetings and so forth. So I felt that what that crisis taught us was that maintaining those strong relationships is really, really important for dealing with the stresses that are going to continue to come through working in a context like this. So there we go. Thank you, Donna. You had me lulled in there. I've just sent a chat out to everyone to say that we know the session is for an hour and a half, but we also know that a lot of you can only stay for an hour, so I'm going to give those of you who need to leave a few moments to do so. But I also wanted to now open the floor to questions, and to extend apologies from John again that he hasn't been able to stay for the question session. While people are thinking about questions (and feel free to add anything into the chat), it's been interesting listening to all three of you.
And I think, Donna, you summed it up nicely at the end. I was thinking about what's maybe the most significant change to my own practice that I've had to make in the last couple of months, and it's the points that everyone's made around the importance of communication and relationships: how we've needed to change the ways that we're either establishing or maintaining those connections and those relationships, and the mental burden that sometimes puts on us, knowing that we need to do that while also getting up to speed with new tools to try and do it. But it still boils down to much the same themes that we already know, which is that relationships are really key; it's just that how we build them is different. Okay, we've got questions flowing through now. Let's have a quick look. Okay, Marie's got a question for you, Donna: you mentioned that remote evaluation is expensive, can you elaborate? Me too. Sure, look, when the evaluation was designed, 30% of the budget was for fieldwork. That included certain members of the team engaging local researchers to work in the field, the travel costs, all of that type of thing. And while we're now not doing any travel, what we have had to do is really invest in the quality of the researchers that we're getting, and also in the supervision of those researchers. So that's something that has cost us, simply in terms of time. We've gone from a team of two and a bit, with a couple of days of translators and local context advisors in those initially planned five field visits, to now having nine researchers in place for a period of about four weeks to do really, really detailed analytics. So that, for us, has been a really big investment. And then the second thing is, I just can't stress enough how valuable bringing the data analyst into the process has been.
In a program of this size, that has taken 20 days of her work looking at the data analytics, because we are talking about 125 partners, plus additional partners. And she's now putting together country synthesis reports of everything that has happened, because those reports are what's actually helped us to determine where the case studies are and where to draw the examples out of the different countries to put into those case studies, the case studies being thematic, country-focused, et cetera. So I think that's where a lot of the cost has actually gone. Yeah. Dwyanna, I was wondering if you might like to reflect from your point of view; you've actually got a really interesting perspective, because as you said in one of our early open spaces, way back in the thick of lockdown, you guys have been having to work remotely as part of your business as usual anyway. So I'm curious to know: do you feel that there is an added cost, or is that cost spread in different ways, when this is your normal way of working? Oh, you need to unmute yourself. I suppose, cost-wise, it's the time that you spend in building the capacity of communities to participate and to communicate; you've got to prepare a lot more, which I think Donna spoke a fair bit about, and was it John? Yeah. It's just a lot more time consuming. But also, interestingly, what I've seen coming out of the two presentations is that we adapt and create innovative ways to still do our work. Just looking at Donna's and John's presentations, I've learnt a lot, picked up a lot about that. I think the key to success is not only building capacity but maintaining those relationships and trust with the stakeholders that you're working with. I think that's an important element of all of our work; without that, you're not going to succeed.
The other thing is: don't be frightened to try new things and come up with different ways of doing stuff. In my current work, I'm not an evaluator in any sense, I'm a community development practitioner, but evaluation is embedded in our work, and you've got to come up with different ways of engaging and gathering the information. It still comes down to working with communities or stakeholders to get the outcomes that you want to achieve. Jade has asked a classic lessons-learned question: based on your experience, and reflecting particularly on the use of online modes, what would you do differently? Donna, can I ask you: what would you do differently if you had your time again? A lot of my reflection now, as I'm looking at designing new pieces of work moving forward, is really questioning my role: do I need to be that person that's in the field all the time? Really coming to terms with my utility, I think, is something that's really, really important. The talent that the Edge Effect team has been able to get in terms of the researchers that we've used, both international and local, has been absolutely extraordinary. We've got amazing human rights lawyers from Egypt and extraordinary actors, people who have been exiled from their countries and have worked in this movement for 20 years. Just working with them has been incredibly rich for me, and I think for the whole team. So really thinking through what skills we need, who can do these pieces of research, and how I can support them, using my skills and my 30 years of practice to support them around what good evaluative research is and how to work with people on telling their stories: that, for me, is a really, really key takeaway. And something which I usually do, but am now really working on, is the outcomes, really working with the program logic, really the change that's happening, from them and for them. So those would be the things that I want to take forward. So would it be fair to say (I'm just looking at Ken's question around what changes arising out of COVID-19 do you think will sustain), it sounds like, Donna, you're suggesting that in future we should really try to encourage, in a strengths-based way, doing it differently, so that you're leveraging that strength? Yes. And I think it's that real focus on process; it's really forced us to think about process. When we go and do consultations and stuff like that, as John indicated, when you're spending three weeks in the back of your Land Cruiser doing 12 hours a day driving from village to village to village, when you walk into those meetings and you sit out there under the trees, sometimes you're just so tired that the process goes out the window and you're just trying to get through your data collection. We all know that can happen to us. So it has been beautiful, and I really think about what John said about what is it that we really want to know, and what I heard from Dwyanna about the yarning tool is really thinking about: we're here to listen, and what are the different ways that we can listen? So, Dwyanna, on that, and particularly because that process piece is something that you have really stressed, getting that process part right: have you been experimenting with any new tools recently, and what would you keep and what would you chuck? I suppose a tool that you can use both online and face to face, I think that's the key thing. If you're able to have a tool in your toolbox like the cedar tree, that's something I can actually pin up behind me and do an evaluation process with (I've done it before), but you'll also be able to take it into the field and lay it down in front of people and do it that way. It's something that you
can use both ways. I think the key learning is finding the tool that works both online and in the field, face to face. That's a really interesting point. Okay, Nolan's got a question about reporting findings: I'm interested in how we report on evaluations and implement findings; does this change during COVID? It also seems to be difficult in remote evaluations. Do you care to start or comment? Would you like to repeat that question again? So Nolan is wondering, when it comes to reporting and implementing findings from an evaluation, does that change in a COVID environment? He was noting that it seems to be more difficult in remote evaluations: how do you actually take the learnings from this remote piece of work and implement them? I think possibly it's sometimes because people might be thinking this is some exceptional event and then we're going to go back to BAU, so maybe the transferability of findings isn't always immediately apparent. Nolan, if you want to clarify further on any of that, please do; that's what I was thinking. No, I think you've got it pretty much. It's just that we do struggle with reporting back. I mean, the way we've done it is we've had workshops and things like that, but in this environment it's a bit difficult to do that, to get feedback on the learnings back to stakeholders, and then to really implement those findings and see how we improve what we're doing. You know, we're pretty good at doing the evaluation bit, but I feel that in this environment, for my job for example, it's like 100 times harder to get around to doing this: to get out there and come up with strategies to change how we're working, to improve how we're doing things. Donna, do you want to add anything? Yeah, I do. Okay, I was just checking that I wasn't on mute. I guess that in the field in which I work, we have to recognise that even implementing
partners are often remote partners, and I think it's acknowledging that, and really making time to think about it, that's really, really important. Just because we're working remotely doesn't mean we can't run workshops. We might not be able to run one-day workshops, but we might be able to run a series, a session a week for four weeks, to talk about different parts of the evaluation. That's what we're trying to do with COC at the moment: really engage them, as we're coming up with our meta-analysis, in talking through results and talking around findings. So I think taking some of those action evaluation sorts of processes, or developmental processes, into your evaluation, if you can, is good, provided that's not going to create an issue around independence or objectivity, et cetera, and I think those things can be managed. The other thing is really trying to think about what your communication strategy is. That's one of those downward accountability things that we as evaluators always need to think about: is this just going to go and sit on somebody's desk in the Department of Foreign Affairs and Trade, or is it actually going to go back to the people implementing the program, and downstream? So I think it's upon us to think about how we communicate beyond the delivery of the report. And just going back to the question about reporting: we had to be very, very clear, or we did become very clear, with the COC evaluation, the Pride evaluation, about what we could do and what we couldn't, and we made that explicit within the evaluation plan. As we're reporting out, we're reporting exceptions consistently, so there are a lot more footnotes at the bottom to say, well, we've been able to verify that with X, but we can't actually get a
secondary verification, right? So we're having to do a lot more of that, but I think that's just going to be our reality for a long period of time, certainly in the international development space, because we're in this first wave globally, and this is going to be the reality for a very long time in many of the countries in which we're working. So I think this is the new normal. There are a couple of other questions that have come through, but Dwyanna, I wanted to ask one question that I'm sure others will have been thinking about, given your comments about relationships and time, and about getting people up to speed with Zoom and those sorts of things: how did you find the person in that community in the first instance to provide that support to get them online, or were you doing it over the phone? We just initially start off slowly with the community and find someone who's able to access the IT. Some of the relationships were pre-existing relationships in the community, so that made it quite easy. We have come across one project community where we started off brand new and we didn't have any existing relationship, but it was just a matter of talking and communicating, building that trust over the phone, and then slowly working towards an online platform such as Zoom or Teams or Skype, whatever everyone uses. Yeah. So you keep asking and asking and asking? Yeah, you just keep on working, because if people are keen to move forward, then they'll communicate and participate, and it makes it a lot easier, as opposed to trying to force someone to do something. Donna, Marie's asking you a curly question around RCTs; have you seen it in the chat? In a post COVID-19 environment, how would you measure impact in a non-RCT way? Do you think that in general remote M&E influences the results of what works and its impact? How do we
address the attribution gap? And I'll just get you off mute... there we go. Yeah, Marie, thanks for that question, that's awesome. Look, I think I can answer that question in relation to this piece of work very specifically. We're not working with randomised control methods and so forth. What we are working with in the Pride program in particular is the particular lived experience of a very defined group of individuals, and so we have been able to do the impact analysis and the attribution analysis through very, very detailed storytelling amongst those individuals, but also by overlapping the outcome harvesting process, where we're really able to look at the outcomes that the program has achieved (which we're looking at through our analytics and through our case studies) and then really trying to map back: how has that change happened? What was the role of specific regions of the program, or of different actors, or what was the implication of a particular context shift for those individuals' lives? So for this program we're really looking at this work largely at that individual level. And then there's actually another piece of research at the moment which is looking at how that goes up to the community level, meaning the LGBTI or diverse sexual orientation and gender identity community within a particular context; so we're looking at this movement-building concept as well. I might need to come back to you in a couple of weeks once we've looked at that one in a bit more detail. Thank you. I just want to acknowledge Ken: I think we've addressed your question, but if you've got any other follow-ups, feel free to pop them in the chat in the last five minutes. Just a quick follow-up question, Donna: you mentioned the use of outcome harvesting as one of your tools, and I forget from your slide whether that had been part of your original plan. It's something that I've seen a few people
seeming to use more in this kind of context, and I'm interested in you commenting on how useful you've found it in a remote setting. Yeah, it's a really great question, Kara, and this was the big challenge for us right at the very beginning, because the whole evaluation was actually planned around an outcome harvesting process; it was very much around the storytelling of the partners and so forth. When we started to plan this out and decided we couldn't do the field trips, COC was most concerned that we wouldn't be able to do outcome harvesting at all. Basically, what we've done is work with the researchers to help them use the sorts of questioning and processes that you go through in a harvesting process with individuals, and with small groups of individuals as well. So while we're really used to outcome harvesting being that cacophony of noise and people mapping things out and contesting things, you can still use those forms to get there. For me, it's always about the quality of the inquiry and the journey that you take people on through an evaluation; that's the most important thing, and that comes back to building a relationship and having trust. Something I just want to add to that: if you think about trying to have, for example, a conversation with an LGBTI activist somewhere like Egypt, that is an extraordinarily narrow space. The ability to use a local researcher, who they may already have a relationship with, and to help that researcher walk them through the process, is actually really quite powerful, and it's something that I couldn't have done; even though I'm used to using those methods, I would never have got that trust from those people. So I think you can do it; it's about your quality of inquiry. We've got only a couple of minutes to go, so I'm going to wrap up the questions and answers, but I just want to reflect back some of the things that I've taken away from this, and
I think what this recent experience has taught me, and listening to your presentations as well: I know that relationships are important, but this has really highlighted it for us in different ways. We know about trust, and the comments about being really careful and purposeful in the process, and taking time to plan, aren't new things, but I feel like the COVID-19 environment has really sharpened our thinking around them, because we've not been able to take certain things for granted. And despite my frustration at having to plan my sense-making sessions more carefully, I know that it's actually useful and it adds value, so for me it's been reassuring to listen to your feedback. I want to thank our three presenters again, Dwyanna, Donna and John. I want to thank the AES New Zealand committee and our partners in South Australia and in Western Australia; it's a complete delight, and this joined-up approach probably wouldn't have happened for another few years if COVID-19 hadn't happened, so COVID-19 has many silver linings. I'm going to close with a karakia and farewell you all, wishing you a very good rest of your day. Kia hora te marino, kia whakapapa pounamu te moana, hei huarahi mā tātou i te rangi nei. Aroha atu, aroha mai, tātou i a tātou katoa. Hui e! Tāiki e! (May peace be widespread, may the sea be as smooth as greenstone, a pathway for us all this day. Give love, receive love, let us show respect for one another. Draw together, it is done!) Kia ora everybody, have a fantastic rest of your afternoon, and I hope to see you all again online soon.