Okay, good afternoon everyone and welcome. My name is Kat Street and I'm one of the co-conveners of the NT branch of the Australian Evaluation Society, along with Christabel Darcy, who is sitting here with me today, so a big welcome to you all. Before we get going I would firstly like to acknowledge the traditional custodians of the land on which we are meeting here today, the Larrakia people, and to pay my respects to Larrakia people past, present and emerging, as well as to the traditional custodians of the lands from which people all around Australia are joining us today. So again, a big welcome to everyone, and also a big hello to the people from the Northern Territory who are joining us today, because we ourselves have been a bit quiet in terms of organising events. With the opportunities that COVID has opened up, as you've probably all seen, there's been a huge number of events, seminars and workshops offered online, which has been awesome to see, and I hope that you've all had an opportunity to access some of those. Just to let you know, we are getting some plans together for what we're going to hold next year from an NT branch perspective, and that will include both face-to-face and online events, so we will be resuming face-to-face events, which is really exciting.

So today I'm really pleased to introduce Sam Abardo and Margie McGregor, who are presenting on Digital Evaluation Story. Sam is a consultant with Visual Insights, a consultancy that specialises in the use of digital technologies for engaging people throughout the evaluation process, and she's been working with Margie at Catholic Care NT for a number of years now to strengthen the way that they use evaluation within their organisation. Margie is also going to be talking about some of her experiences in being part of that process, so I think it's going to be a really good presentation. Just briefly, can I ask everyone to remain on mute, and if you do have questions that come to you throughout the presentation, feel free to type them into the chat box; we should have about 20 minutes towards the end of the presentation to go through some of those. Also just to note that this presentation is being recorded. Okay, I will hand over to Sam.

Thank you very much, Kat. Welcome everyone. I'd first like to acknowledge the traditional owners of the lands on which we're all meeting, particularly the Larrakia people of Darwin and the Jagera and Turrbal people of Brisbane, where I'm speaking from today. I'd like to pay my respects to Elders past, present and emerging and to welcome all First Nations people who might be joining us today. I was happy to see some familiar faces until everyone turned their videos off, which is fine, and it's great to see quite a few participants from the Northern Territory. I'd also like to welcome Rosie Pretorius; Rosie is the visual digital lead for Visual Insights and she's joining us today as well. So let's get started. The plan is a 30-minute presentation and about 20 minutes for questions at the end. It's in two parts, and the first part is from my perspective as an external evaluator.
So I'm going to provide a little bit of an introduction to digital evaluation story as a methodology and how we're using it, and then provide the context for this case study, looking at Catholic Care NT and the broader evaluation context in which we're using the method. Then I'm going to hand over to Margie, who'll provide more of an internal evaluator perspective. She will talk about the, I've got "final" in brackets, the almost final digital evaluation story method for monitoring and evaluation at Catholic Care NT; as a participatory approach it's never going to be final, but it's pretty close. She'll also talk about some of the benefits of the approach and the many challenges that have been overcome by the organisation and our collaboration over several years, and give you some pearls at the end for those looking at incorporating story, and particularly digital evaluation story, into your monitoring and evaluation, to save you a little bit of the time and learning that we've gone through over the years.

So I'll start with some definitions. If we look at evaluation story, this is a definition from 2010 from Richard Krueger, from his chapter in the Handbook of Practical Program Evaluation. He defines evaluation story as a brief narrative of someone's experience with a program, or something else such as a project or an initiative, that is collected using sound research methods. His definition comes about two decades after Rick Davies and Jess Dart developed and published the Most Significant Change method, which was another story method for looking at the impacts of programs through the eyes of stakeholders and, in doing so, increasing stakeholder engagement in evaluation. So story is definitely not new to evaluation, but it has been sitting around the fringes.

Digital story is another method: a short, usually personal, narrative presented as a short movie for television, computer monitor or screen. This has also developed over the last 20 or so years, particularly from the education and community sectors, and has largely been used to demonstrate participant experience of programs and also for program promotion. It has run alongside the rapid increase in the availability of digital tools at our fingertips on our phones and iPads, and the increase in simple online editing and animation tools. So finally, with these two ingredients and definitions, we get to digital evaluation story, which is a short movie of someone's experience with a program that's collected using sound research methods. The key part from evaluation story is the sound research methods, and the key part from digital story is that it's a visual story.

So why use digital evaluation story for monitoring, evaluation and learning? I'll just note that I'm going to shorten monitoring, evaluation and learning, because we're aiming for 30 minutes and any abbreviation is going to help us get there, so it'll be MEL that I'll be talking about. I've got four sets of reasons here, and the first set is that pictures and stories are impactful and memorable. Our brains are wired for story, and even if you think about the way that you work on a personal level, we think in story, we remember in story, and we also change our experiences into stories to recall them and portray them to others.
When we look at the visual side, our brain processes visuals around 60,000 times faster than it processes words or verbal communication, and about 90% of what we store in our brains is visual as well. So the strength of pictures and stories in engaging people is really something to think about as a real ingredient for engagement, and you'll see the little guy waving in the middle. It's also a great way of supporting a participant-centred approach, because it features the voice of participants; it really platforms their voice, in contrast to other methods such as surveys, for example.

Okay, the second set of reasons is that stories are one of the main ways of developing evaluation models such as theory of change. If you look at Funnell and Rogers' work and their text on program theory, you'll see that using story, the inductive way of developing theory of change, is one of the three key ways of developing a theory of change. And on the flip side, in terms of engaging staff and other stakeholders in theory of change, or in evaluation models generally, a theory of change with story woven through it is a really good way of engaging people.

The third reason is that there's a real recognition now that most evaluation benefits from a mixed-methods approach, and I think a lot of evaluators out there are using mixed methods. In Michael Patton's book on principles-focused evaluation a couple of years back, he notes that it's great that we're using both kinds of methods, but one of the issues is that they're siloed; we're not combining them well. So although we're acknowledging the value of qualitative methods, the question is how we actually integrate them with quantitative methods. There's a gap in MEL, or M&E, when it comes to incorporating qualitative methods; the textbooks and the evaluation literature are still very quantitatively focused in terms of monitoring and evaluation. So it's time we started to incorporate more narrative and more qualitative methods into monitoring and evaluation.

The final good reason for using digital evaluation story for MEL, or M&E, is all of the tools that are increasing in their accessibility to us. Not only their accessibility but their quality is ever increasing, and it's all at our fingertips: really good quality cameras on phones, easily accessible recording equipment, virtual reality equipment that you can attach your phone to, and also editing programs, from Apple programs to PC programs. The availability is ever increasing, and it's also increasing as a medium for communication.

So now, moving to the context of the case study we're presenting today. Catholic Care NT is a medium-sized not-for-profit in the Northern Territory. It is a complex organisation that covers a wide and very diverse geographic region and a wide diversity of participants in terms of cultural and linguistic background. Around 30% of the 250 staff across Catholic Care NT are Indigenous, and there are 30 programs across 19 sites in the Northern Territory, and as I said, very diverse programs; so a very diverse organisation covering a wide region. So why digital evaluation story for Catholic Care NT, for their MEL initiative?
Well, about five years ago, before we started, Catholic Care NT were routinely collecting good news stories for several programs as part of reporting to government funders. So they were familiar with a story approach, even though these stories might have been more from the perspective of the staff member than the participant. They were also, as many organisations still are, reporting largely on outputs. At the same time, they were early adopters of visual platforms that engaged participants, like social media platforms, using videos and images to engage participants in the organisation. But there was a gap that the leadership saw, for several years before 2015 when we partnered with Catholic Care NT, in being able to demonstrate the difference they made. Around 2015 they saw that this visual, pictures-and-stories approach is really compatible with what they were doing, and is also very compatible with a participant-centred approach and engaging participant narratives. So they were really looking for a method, alongside standard quantitative methods of outcome surveys and output monitoring, to demonstrate the difference they were making; really seeing the need for a participant-centred approach.

Okay, so I'm going to cover five years of work in just a slide here, to give you a taste of where the digital evaluation story method fits within the whole monitoring and evaluation initiative for Catholic Care NT. To start with, it was a five-year approach and, from the beginning, a participatory action approach; it was about responding to where the organisation was at at that time rather than presenting a whole lot of new methods. The focus was on demonstrating the difference that the organisation is making and looking at the outcomes that linked to their strategic plan and their vision, which were participant wellbeing, or family wellbeing, and social participation, and also looking at satisfaction and program and service integration. So those were really the four key areas for providing evidence on, and they were linked directly with the vision of the organisation at the time. The other thing to point out here is that, because the work was very participatory action focused, each year's work was negotiated the previous year. We really reflected as a group, Visual Insights together with the Catholic Care NT leadership team of around 15 leaders, at the end of each year: we looked at what we had achieved, what we needed to achieve over the next year, what had worked, what didn't work and what was going to work. So this wasn't mapped out at the beginning; it was a very responsive approach all the way along. The next thing to note is that we are here, in the final stretch, partway through year five.

The first thing to note is that the whole initiative started with organisational capacity building. So when we're looking at the digital evaluation story method, it really started with building the skills of the leadership in evaluation broadly and also in the methods, but bringing the staff along from the very beginning and doing evaluation training that included not only evaluation models, theory of change and quantitative methods but, in the context of this presentation, a lot of work around story: qualitative interviewing, sound interview skills, sampling and analysis.
So, lots of capacity building. The development of the methods came at the beginning of the second year and, in some ways, although I've got it ending here, it's still developing, but it's more like tweaks now. This was really the solidification of what those key methods were, in terms of outcome surveys, output monitoring and the evaluation story interview, or interview for story. The important thing to note underneath the evaluation methods, at points four and five, is what needed to follow: the establishment of infrastructure and the formal documentation of the MEL method, or the M&E method. I just want to emphasise that there was a lot of work and a lot of energy in that, and it was really important, because it needed to support the methods. Particularly when you're looking at a new method like interview for story, there was a lot of development of both infrastructure and documentation, which Margie will touch on, and that was really important. Then we have digital training, which included not only training in the digital kits, which Margie will show in her slides documenting the method itself, but also training for staff in storyboarding for an effective story and learning some of those editing skills in-house; there were a certain number of champions within the organisation who really took this on and mentored others. And finally, the stage that we're in now is completing a visual evaluation toolkit and, alongside that, doing internal mentor training and support, to ensure that there are visual tools that will live on to ensure the rigour of the methods, and that staff across the organisation in key positions have that training and knowledge and are supported by the continued evaluation capacity building of the leadership, which you'll see goes across the five years. So that's five years, or four and a half years, in a nutshell. I'm going to pass on to Margie now, who's going to talk about the digital evaluation story method, which is a large component of this. She's going to focus on the method itself, and particularly on the evolution of the analysis of the digital evaluation story, and she'll talk about some of the benefits and also how the organisation met some of the challenges along the way. So over to you, Margie.

Thank you, Sam. So I'm going to describe Catholic Care NT's integrated method of digital evaluation story within the monitoring, evaluation and learning initiative. This method was developed and refined over a four-year partnership with the leadership and staff across the organisation. All 30 programs are guided by a sampling frame and quota. This ensures that each program has sufficient interview for story data for each six-month period and that there's proportionate representation of participants from the service delivery demographics. The interview process for each program is supported by a tailored interview guide, which follows four main topic areas: personal context and engagement in activities, early changes as a result of program participation, longer-term changes, and program improvements; there's also a stakeholder interview guide. The recommended interview duration is 10 minutes, and each interview is recorded using a digital kit, then fully transcribed in preparation for analysis at the relevant regional evaluation analysis forum. Regional evaluation analysis forums, or REAFs, are convened quarterly in each of the five regions.
All regional staff are invited to participate in the REAF, with the goal of analysing each interview with reference to the relevant program logic and theory of change. This process is guided by an interview analysis form, which assists the content and thematic analysis of the interviews. Quotes from the interview provide the evidence of outcomes. The REAF first analyses the transcript for evidence of program outcomes, then for evidence of the four organisational outcomes of wellbeing, social capital, participant satisfaction, and program and service integration. After the REAF provides its analysis and feedback for each interview, the information is provided to the organisational Evaluation Analysis Panel; I refer to it as the EAP, it's easier. It comprises the evaluation systems manager, which is myself, and members of the executive. The panel considers the information provided by the interviewee in order to ascertain opportunities for organisational and program improvement. Interview rigour and accuracy are then assessed and, if the interview provides sufficient evidence of outcomes, the clinical practice coordinator conducts a case review prior to the interview being edited into a digital evaluation story. The content of the interview analysis forms the basis for storyboarding and editing. Each completed story is displayed on an online platform, ordered by program, region and outcome. Key to process completion is the provision of systematic feedback to inform the participant about how their interview has provided evidence for organisational and program improvement; a letter detailing this information is sent from the director to each participant. Feedback is also provided to individual staff members, which includes both the results of the analysis and feedback on technique, for reflective practice. Each of these stages of data collection, data management and analysis is contained within established systems, processes and infrastructure that have the ethical collection and use of data at their heart.

I will now talk about some of the benefits of the approach, how the organisation overcame the challenges, and what we have learnt about the attributes that can support an effective MEL process that incorporates story. While the REAFs are a relatively recent innovation for Catholic Care NT, having been launched just over a year ago, they've proven essential for staff to gain an understanding of how their interviews give their participants and stakeholders a voice and provide insights for program review and development. Surprisingly, some of the areas where staff thought participants would be reluctant to be interviewed, such as the Men's Behaviour Change program, the intensive family parenting support program and participants from non-English-speaking backgrounds, have actually had the most enthusiastic participation. Initially the REAFs were conceptualised as comprising the regional manager, lead practitioners and team leaders. However, over time it became clear that opening the REAFs to all staff was key to linking the theory of change and program logic to their daily practice. When staff attend the REAFs and use their program theories of change and program logics to analyse the transcripts, there's a clear connection between program theory and their work on the ground.
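The workflow described above, from recorded interview through REAF and EAP analysis to a published story and participant feedback, can be summarised as a simple pipeline. The sketch below is purely illustrative: the stage names, fields and example values are assumptions made for this summary, not Catholic Care NT's actual system or data model.

```python
# Illustrative only: a hypothetical model of the interview-for-story pipeline
# described above. Stage names and fields are assumptions, not Catholic Care
# NT's actual system.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class Stage(Enum):
    RECORDED = 1        # captured with the digital kit (audio or video)
    TRANSCRIBED = 2     # fully transcribed in preparation for analysis
    REAF_ANALYSED = 3   # content/thematic analysis at the regional forum
    EAP_REVIEWED = 4    # organisational Evaluation Analysis Panel review
    CASE_REVIEWED = 5   # clinical practice coordinator case review
    STORY_EDITED = 6    # storyboarded and edited into a digital evaluation story
    FEEDBACK_SENT = 7   # letter from the director back to the participant


@dataclass
class InterviewRecord:
    program: str
    region: str
    participant_id: str
    stage: Stage = Stage.RECORDED
    outcome_evidence: List[str] = field(default_factory=list)  # quotes mapped to outcomes

    def advance(self, evidence: Optional[List[str]] = None) -> None:
        """Move to the next pipeline stage, attaching any new evidence of outcomes."""
        if evidence:
            self.outcome_evidence.extend(evidence)
        stages = list(Stage)
        i = stages.index(self.stage)
        if i + 1 < len(stages):
            self.stage = stages[i + 1]


# Example: one hypothetical interview moving through the first few stages.
record = InterviewRecord(program="Counselling", region="Katherine", participant_id="P-001")
record.advance()                                           # transcribed
record.advance(evidence=["wellbeing", "social capital"])   # REAF analysis attaches outcomes
print(record.stage.name, record.outcome_evidence)
```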
An unexpected benefit of this engagement in the REAFs was that staff are now proposing changes to their program logics and theories of change, where they can see that particular outcomes occur earlier or later in the process, or may need to be redefined. In highlighting that each staff member brings their own unique and valid perspective to the process, we've recently been moving away from analysing the interviews as a group and are instead requesting that staff conduct their own analysis prior to the meeting. This shift in process has enabled us to avoid groupthink and to harness the extensive and diverse experience, knowledge and skills of our staff base. Similarly, the Catholic Care NT Evaluation Analysis Panel has also been evolving over this time. The commitment from the leadership group is such that the panel has recently made the decision to move from one half-day meeting per quarter to a full-day meeting every two months. Hearing from our program participants in this way means that the panel has been able to identify themes occurring across the organisation and develop appropriate strategies in a more time-effective manner than previously.

Many of the challenges we experienced will be familiar to organisations which work across a variety of urban and remote locations. First and foremost, ensuring that staff had access to the necessary technology and connectivity required a substantial shift for our IT department, who were used to planning for desktop applications. There was a significant investment in iPads, digital hubs and a range of necessary accessories to facilitate the interview process. Uploading large media files was problematic, as they competed for bandwidth with the case management activities of the organisation, so strategies were implemented to upload the files outside of business hours. Another challenge was that initially staff were unsure why they were receiving this digital training, and it took some time for their interviews to reach the standard where they could be used for evaluation purposes. Digital skills were developed through staff stories and program team support of participant and stakeholder interviews, and now all staff are required to record their own staff story as part of their induction process.

These challenges were mitigated in a variety of ways. Key to the initiative was the director of Catholic Care NT's clear vision for the organisation's evaluation capacity building, and this was shared with the leadership team over the first couple of years of the initiative. The composition of the leadership team has remained relatively stable over the past five years, with new members appointed who are familiar with the purpose and benefits of reflective practice and program evaluation. Program participants have welcomed the opportunity to be heard, and this means that staff have been able to connect with their participants in a different way, building a relationship between equals. The REAFs and the panel are constantly evolving as the process responds to staff feedback and insights. Consequently, a process of co-creation of quality evaluation systems has been an unexpected outcome of the initiative. Having a strengths-based philosophy means that Catholic Care NT staff seek the learning in all of our interactions with participants and use their feedback to improve program and service delivery. So it's been quite a journey. Integrating digital evaluation story within a MEL approach has taken both a shared vision and considerable courage.
We're reaping rewards from the process as much as from the products as time goes on. So, some advice for people thinking about embarking upon this kind of journey. First, it's really important to prepare for the incorporation of digital evaluation story into a MEL approach by seeing it as a major organisational change that needs to be managed appropriately. It's really important to ensure that you have a solid, committed leadership team and that they're equipped with the vision, knowledge and skills to facilitate the process. And it's also really important to make sure that, right from the start, you're diagnosing and keeping track of readiness along the way, whether that be staff readiness, the readiness of different systems, or the infrastructure; that needs to be continually revisited and acknowledged as well. So I hope that's been helpful. Thank you. Sam and I would welcome any questions you may have.

Thank you, Margie and Sam. Please feel free to begin typing some questions into the chat box; I've got a couple to get us started. So you talked a little bit about infrastructure in terms of internet and iPads. Just from an NTG perspective, I think I've seen that that can be a challenge that comes up a lot. So I just wondered if you could go into a little bit more detail and list the exact things that you need to have in place before you get started.

Sure. Okay. So as I said, we needed to have a look at how we were going to be uploading the information, the digital files, because they're large digital files, and so we had to have some quite detailed conversations with our IT department about how that was going to be managed. So we brought in digital hubs. Each of our major sites now has a digital hub where staff can access their recordings, so they can upload them to the digital hubs. Then that information goes through a process called CentreStack, where the files are uploaded through to another system that's held by a systems compliance administrator, and he's the one who makes sure that all the information is there and is complete and accurate. So we have the hubs in place, and one of the things I'm looking at at the moment is upgrading the amount of memory that's available on those hubs as well, because as more of our stories get recorded, we're needing more memory. Having the iPads available to all staff has been really important, so we have digital kits at each of our sites. The kits contain the iPad, a microphone with an extension lead, a stand, because you don't want that wobbly effect you get from people just holding the iPad, and a bracket. Staff are taught how to make sure that the iPad is anchored securely into the bracket and how to set up the iPad as well; there are particular guides around using a grid on the iPad to make sure that you've got your participant in the right part of the frame, and so forth. So it's quite extensive, and staff respond to it really well; they enjoy doing it.

Thank you. And the other question that I had, and this might be more for you, Sam: as a novice, for someone wanting to give this method of evaluation a go, what advice would you give them in terms of how to get the most out of these videos that they're capturing? Are there any resources or anything that they might be able to refer to that you can recommend? You've got to unmute; is that going to work? Yep, there she goes.
Yeah, I'd probably take it back a step, based on our experience, and really look first at the organisational readiness, as Margie talked about at the end. So looking at the leadership readiness for this, because what was really important surrounding all of this was a leadership with a vision, a good partnership and a sturdy leadership, and also staff readiness and the capacity of the organisation. Because, as Margie mentioned in that response as well, each area of the data collection for interview for story, or digital evaluation story, requires processes and systems that most organisations don't have to start with, and they need to run in parallel alongside your quantitative database; it's a different system. So what I would recommend for organisations thinking about this approach is looking at what they're wanting to achieve with monitoring and evaluation, what their readiness is for this approach, and how digital evaluation story might fit within it. It was very clear from the beginning that Catholic Care NT saw digital evaluation story as a great fit for what they wanted and where they were going.

Thank you. Another question: you said you've got a sampling frame and quota, so can you talk us through who you choose to interview as part of this?

Do you want to start on that one, Margie, or would you like me to?

No, I can talk to this. So we started off with a sampling frame for each program and then quickly realised that that was going to get quite unwieldy, so we looked at areas of commonality across programs and ended up with three sampling frames that would suit those broad areas. Our sampling frames, for example for our remote programs, would be looking for broad representation. So we divide the sampling frame up into age groups; we're looking at, let's say, 18 to 35, 36 to 54, and then 55 plus, and then we're looking for males and females from within each of those age groups as well. Where we're talking about remote programs, we don't specify that people need to be Aboriginal or non-Aboriginal, because all of our participants in our remote areas are Aboriginal, so there's no need for that. So when staff are looking to ensure that the interviews they conduct are representative, we need to check: okay, if we look at the 18 to 35 bracket, do we have an interview with a male and do we have an interview with a female? And in the next bracket, do we have an interview with a male and an interview with a female? That's how we're making sure that it's representative. Where we have programs such as our Elder Strengthening program or Aged Care Advocacy, clearly the age ranges are going to be a bit different. And we also have sampling frames for children and young people; we've broken those up into age groups under 18, and further into Indigenous and non-Indigenous. Yep, is there anything else that was part of that question?

I'm going to add to it, if that's okay, because I think it's a really good example of the responsive and participatory approach that we went through.
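The quota logic Margie describes, age brackets of 18 to 35, 36 to 54 and 55 plus, with males and females sought in each bracket, can be illustrated with a small sketch. The code below is hypothetical; the helper names, categories and counts are invented for illustration and are not Catholic Care NT's actual tooling.

```python
# Hypothetical illustration of a quota check against a sampling frame like the
# one described above (age brackets by gender). Data and names are invented.
from collections import Counter
from typing import Dict, List, Tuple

AGE_BRACKETS = [(18, 35), (36, 54), (55, 120)]
GENDERS = ["male", "female"]


def bracket_of(age: int) -> Tuple[int, int]:
    """Return the age bracket a participant falls into."""
    for low, high in AGE_BRACKETS:
        if low <= age <= high:
            return (low, high)
    raise ValueError(f"No bracket for age {age}")


def quota_gaps(interviews: List[Dict], required_per_cell: int = 1) -> List[str]:
    """List sampling-frame cells that still need interviews for a program."""
    counts = Counter((bracket_of(i["age"]), i["gender"]) for i in interviews)
    gaps = []
    for bracket in AGE_BRACKETS:
        for gender in GENDERS:
            if counts[(bracket, gender)] < required_per_cell:
                gaps.append(f"{bracket[0]}-{bracket[1]} {gender}")
    return gaps


# Example: two interviews collected so far for a hypothetical program.
collected = [
    {"age": 27, "gender": "female"},
    {"age": 61, "gender": "male"},
]
print(quota_gaps(collected))
# -> ['18-35 male', '36-54 male', '36-54 female', '55-120 female']
```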
What was important about the sampling was that, from the beginning, it was part of the evaluation capacity building of the leadership, with the regional leadership involved in workshops, because Catholic Care NT has got such diverse communities and 30 very diverse programs that people saw the sampling differently for different programs and different regions. So what was really important was a fairly long process of consultation to come up with a joint sampling frame, which possibly, in retrospect, we could have come up with beforehand. It's not, I have to say, brain science to have a sampling frame that represents age, cultural diversity and gender. But what was important was the process of developing that sampling frame and involving staff in learning about the process to begin with, so that they really owned the sampling frame and really got it. So that sampling frame took about 18 months to develop, and now it's gone a stage further, where the different regions have actually got a sampling frame within their regional evaluation plans. So it's further ownership. I think that's a big takeaway: to really embed this process, it's really important to build that capacity, and the process is key. And the process is what takes time, because we could just think up a sampling frame like that today and then present it tomorrow, but it just wouldn't work because there wouldn't be the ownership.

Thank you, Sam. The next question: you've said a number of times that you've got staff based in all different parts of the Territory, so could you make a comment on the additional resourcing needed to achieve capacity building for staff who might have needed more support in various locations?

This is where having a hands-on approach is really, really important, because a number of our staff have limited English language literacy and we really needed to make sure that what we were doing was practical and that they could see value in it for them. So we had quite a number of workshops at a wide range of locations. Sam, do you remember what the total number of workshops was in the end? The staff workshops?

Yes, I do: 24. We had six workshops over about a three-year period in each of four different regions; six full-day staff workshops, working with staff from where they were at. So looking at their good news stories and shifting them from where they were at, building in knowledge of the different aspects of qualitative method, from sampling through to data collection and analysis.

And of course that's an ongoing process now. Just this week I was in Katherine, basically hosting the REAF for Katherine, and one of the staff members came up to me afterwards. She said, you know, I feel all at sea here; I've been part of this process for a number of years but I don't quite get it. And I think that's fine. So we're going to sit down together. We've got some new staff starting at our Katherine office, so I'm going to run a bit of an evaluation overview session for everyone who's there, including this person, to bring them all up to a level of competence where they feel comfortable engaging with the process. So it is an ongoing process for us.

Someone has asked, are you able to make a comment on the approximate cost of undertaking a method such as this as part of your overall evaluation?

That's a bit tricky. I don't actually have those financial records.
That kind of query would actually need to be referred to our director; I don't have that information, I'm afraid.

Someone has asked: you mentioned that a letter is sent back from the director to participants informing them how their interviews have contributed to the program or organisational learnings. Do you use any digital or visual methods in this feedback loop, or is the letter a standard, formal written letter? And do you have any advice about closing the feedback loop in sharing evaluation and learnings back with participants?

So the letter is the first stage of the feedback that we provide. It's really important to us that we give feedback to the participant as soon as possible after they've had their interview, and again that is a process we're getting better at over time. It's really important that we detail exactly what we're going to do with the information they've given us. If their interview has been identified as one which needs to proceed to case review and possibly be converted into a digital story, then when that letter is sent out, I send a consent form as well. They'll have already signed a consent form prior to their interview being conducted; with the letter, we send an additional consent form saying, if any of your consents have changed, please let us know, and this is our process at this point. Then we know whether we should be proceeding to editing into a digital story. Once we have received the consent for case review and editing, we can produce the digital story, and then we invite the participant to view the digital story and make sure that they're happy with it before we proceed to publishing it. So there are a number of different consent stages and a number of different kinds of feedback which we provide.

The next question, I think, is a really good one, and it's about the strengths-based philosophy. That came out really strongly in this presentation, in that the whole approach is really built on that philosophy. So was that specifically integrated when you delivered training as part of the process?

That's such an interesting question. I think it's just part of what we do. I've been working with Catholic Care now for 14 years, and so for me a strengths-based approach is just part of the ethos of the organisation. We have a really strong values framework and it's very well articulated. One of the things we've been doing recently is mapping the values to our theories of change as well and seeing how they correspond, and that's been a great process; it's been really enjoyable and really affirming to see how the values are very much embedded in our theories of change. So I guess we didn't actually have formal strengths-based training as such; it's just part of how we approach things as an organisation. Sam, as an external person to the organisation, would you like to comment further on that?

Yeah, absolutely. I totally agree, Margie. And I think, because it was a responsive approach and very much a partnership approach between us as external evaluators and yourselves, we were really guided by you in the way that you wanted it to proceed. I'm thinking even of the staff workshops through to the leadership workshops: it was very much a listening approach, very much looking at where people were at and bringing them forward, and not being negative about anyone who was further behind.
Particularly if you look at the regions, one of the challenges was that they were developing at different stages, going forward in this process at different rates. But that wasn't a negative in many ways. It was a challenge, but it wasn't seen as "you've done badly because you're not going at the rate that someone else is going"; it was really about looking at where people were at and the strengths that they had. So that meant that we went beyond digital evaluation story as well and incorporated Aboriginal art, for example. I know some of you went to the presentation in Sydney at the AES conference last year, where Leone Young presented her artwork, which was a story but also a theory of change. So I think that strengths-based approach really allowed that flexibility for people to bring their own strengths to it, rather than it being a rigid approach. I think what made our approach a fit was that it was a very listening approach, looking at where different people and the organisation were at, and saying, okay, we'll go with that; let's go with art. Having that flexibility was really important in making it work.

Thanks, Sam. Another question, probably for you again, Sam: have you been able to use digital methods on other evaluations that you've done, or is this just a one-off? As in, please comment on your experiences of applying the method in other contexts, if you have.

Yeah, this is definitely the biggest initiative; we haven't used it elsewhere yet for monitoring and evaluation in the sense of really developing, or assisting the organisation to develop, a whole system to support it. But we're trying to incorporate it into most of our evaluations now, at least some visual components, whether it's bringing in a few stories that participants develop. There are a few COVID stories here, of things that could have and should have happened this year with that method. One of them was a participatory workshop approach with students and their parents to document their story of their experience with the program using storyboarding, very participatory storyboarding, and making some virtual reality films out of that. That was all set to go, but then COVID happened. But we have used it over the last few years in varying degrees. So I think it comes back to making sure organisations are ready and thinking about what level you can bring the evaluation story method into your evaluation at; there are degrees to which you can bring it in. So yes, we have, but not to the extent we have with Catholic Care NT, and we definitely intend to do more in the future, because there are so many benefits from that visual process, particularly for staff engagement, staff learning and process improvement, as well as for evidence of outcomes from programs and the organisation. So I hope I answered that, in a kind of roundabout way, as we all have to nowadays.

Thank you. Someone has requested that you bring up the final slide about the lessons from the journey. And perhaps while people are reading that, there's also a question: are there examples on the website we can view of any of these stories?

Not yet, not yet.
We're just finalising the editing for two stories at the moment, and then we're looking forward to having a launch. Once we've had the launch, they'll be available. But yeah, they're just in those final editing stages; they need title slides and some music put to them, and so on. So they're nearly there, which is exciting.

I look forward to seeing them. All right, just a final question: did some people not want to participate, or not articulate their thoughts, because it isn't possible to do it anonymously? Did anyone not want to participate? Participants in the programs in the videos, I think.

So we do find that there have been Aboriginal participants who have been quite reluctant to be filmed; however, they're quite happy for an audio recording to be taken. So what we do in that case is just an audio recording, because this is about evaluation. We actually don't need the film; we just need a story, so we can analyse it and see whether there's evidence of the outcomes that we would expect for the program. So that's been a really valuable method. The other thing is that with our Men's Behaviour Change participants, all of those interviews are audio rather than video, and that's understandable. They're quite happy for their stories to be told, but they'd rather not be filmed, and that's fine; and that's given us great evidence of outcomes for the Men's Behaviour Change program.

We also did some animated virtual reality stories with some of the participants who wanted to be anonymous, and there was the potential to use their own voice, an actor's voice or a staff member's voice. So we did a couple of stories like that: visual stories that went for about three minutes, told in the first person, with the participant telling their story. One was a drug and alcohol program and one was a counselling program, and both participants responded really well to the product and were able to have a copy to show their family and friends. So they were able to step back from it and be anonymous, but it was still their story. That's definitely a way of the future, with the capacity of virtual reality to be used for digital evaluation stories as well; I think there's a lot of potential there, which we have played with.

Awesome. Well, thank you, Sam and Margie. That's been really great, and especially for me: I've known you for a while now, Sam, and it was great to hear that deep and really rich account of the work that you have been doing together, and to hear about the process that you've been through and the strengths and the challenges that have come up through that journey for you. I thought that was really awesome. And I just want to make a comment: I think it's so great to see that you've made sure there's been a focus on that timely feedback as part of that reflective process, which I think is really important for us all to do. The other thing that struck me, and you've said it, is that this method is really relevant to us all, not just people who are culturally and linguistically diverse. I also thought that was an interesting point you made about the brain processing visual information a lot quicker than it does words, which makes a whole lot of sense. So thank you so much. The recording will be sent out to people who have registered. So thanks to all who have dialled in, and have a lovely afternoon.