Kia ora everyone. Welcome to this AES lunchtime seminar, Conversations Over the Ditch, with Dr Jess Dart. I'm Marina Sanker, the co-convener of AES Aotearoa New Zealand. Before I begin, I'd like to acknowledge the traditional custodians of the lands from which we come. I'm speaking from Wellington, Aotearoa New Zealand, and I acknowledge our elders, past, present and emerging. This seminar is recorded. Please feel free to introduce yourself in the chat and type in any questions you may have throughout the presentation. Our presenter is Dr Jess Dart, an AES Fellow and founding CEO of Clear Horizon Consulting. She is a recipient of the 2018 Award for Outstanding Contribution to Evaluation. Her specialities include evaluating large, complex and emerging programs, human-centered design, developmental evaluation, facilitating organizational effectiveness, and theory of change. Her doctoral research involved adapting and testing a story-based participatory monitoring tool, the Most Significant Change technique, which I have used in so many evaluations. So, welcome Jess, thank you, and over to you. Hello, and thank you Marina, and welcome everybody. I'm coming to you today from the lands of the Gani Perni people. Actually, I just moved, so my tag's wrong; this is a gift land in Victoria. I'd like to acknowledge the traditional owners of the land on which I'm standing today, pay my respects to the elders, past, present and emerging, and also acknowledge any First Nations people who might be on the call today. It's really lovely to be here with you all, and I'm going to be talking to you mostly about a case study today. These are conversations over the ditch, and the ditch, if you're not from New Zealand or Australia, is the body of water between Australia and New Zealand.
And it was mostly to have a conversation about how evaluators are increasingly working at the front end of initiative design, human-centered design or co-design, what role evaluators are playing there, and to have a chat with you about that. But before I dive into the presentation, I'd love to find out just a little bit about who we have here today. So I'm wondering if we can have a little go at the chat function and do something called a chat slam. I don't know if you've ever done a chat slam before, but the idea is you type, but you don't press send until I tell you, and then we can all see everybody's comments. The first thing I'd just love to know is what sector you work in. For me, I'm a bit of a cross-sector person, I guess. I'm a consultant, so I work across sectors: social justice, a little bit of international development. Those are my sectors, but I'd love to know what sector, or how you describe the space that you work in, whatever feels natural to you. So if you don't mind starting to type, I'll ask you all to press send at the same time, and then we can see who we've got here. The spaces that you work in, however you'd like to define that. I'll give you a second. All right, press send. Let's see who we've got. Okay, let's see. So we have health, education, education, education, government, evaluation, legal space, gender auditing, social housing and urban, local government, university mental health, health, health, health, lots of health, social housing. Morning to you too, everybody. Aboriginal health. Fantastic. So I would say the social sector is dominating today. That'd be about right. Second question. In my career, I've been doing evaluation for, gosh, about 30 years now, I'm showing my age. But before that, my really early career was actually in participatory planning.
So I had a bit of a background in what you would call co-design today, only it was framed differently then. And I'd love to know whether you see yourself more as an evaluator, or whether you also do design work. Are you working at the front end now? Maybe that's the question I really want to ask: do you work right at the front end of policy or program design now? The answer to this is yes, no, or don't know; maybe you're not sure. So the question is: do you currently work at the front end of design, programming, human-centered design, co-design? Go for it. You can press yes, no, maybe. Let's see what we've got, just to get a sense of who we are. Yes, no, yes, no. I'm doing a very quick quantitative analysis here. What do you reckon? I think the yeses might just have it, but it's fairly even, 50-50, I would say. Roughly, roughly. If anybody can add them up, go for it. Maybe a few maybes. Okay, excellent. It just helps me know who we've got here today. So it's lovely to know where you're all from, and there are quite a few people working in the early spaces. I think it's been on the increase, so evaluators are being asked to come in. I don't know about you, but I first started to work at the front end probably just because I was doing a lot of theory of change and program logic work. I'd often get asked in as part of an evaluation, maybe after something had been running for quite a while, and they'd ask you to come and do an evaluation. And I used to say, well, I just need to clarify what you're trying to do first, like clarify the theory of change, clarify the program logic. And then they'd go, oh God, why didn't we do this at the beginning? This is terrible that we're doing it now, because we realize that things didn't make sense at the beginning.
And so gradually I got invited in more and more at the beginning rather than at the end. I guess that's fairly normal now, but over the 30 years it wasn't. I remember back 30 years ago, people really thought that you only brought the evaluators in when the program was over. So things are changing in that regard now, and certainly we do evaluation planning much earlier. I get invited in before there's even a program, really at the very earliest concept stage, and that's what I wanted to chat with you about today with this case study. So on that note, I might start to share my slides. Just one second and bear with me. I'm going to try and be a bit fancy today; I've got a video to share with you. Hopefully you can see that now. Yes? Yes. Great. I can't see you, unfortunately. Let me just see if I can. I'll trust that you're with me. Okay. Can you see it correctly now? Yes. Yeah. Not in slideshow. Okay. So, we've just done the acknowledgments, and I'm going to share this case study. And the thing that's really surprising and interesting about it is that it's a 10-year journey. It's the first time ever, and maybe it'll be the only time, that I get a 10-year evaluation contract. And all of the people involved knew from the beginning that the initiative was going to be a 10-year thing. It was funded by philanthropy, which is often where you get these longer horizons. I'm going to tell you firstly a little bit about the initiative itself, which is very interesting. It's in South Australia, and it's a community-led social innovation context. I'll tell you a little bit about how it began, and then we can get to your questions and so forth. But I'm really interested in this question for you, and maybe you can think about it as I'm presenting: how does this differ from, or how is it similar to, what you do in your evaluation practice?
And what does it mean for evaluation practice, for evaluators to be working in this sort of space? That's the question I'm going to come back to at the end of this presentation. Okay. So the name of the initiative is Our Town. And I often think it's best described by the people themselves, so I'm going to try and do something quite fancy now if I can, which is to play a video. So let's see. I just need to make sure I have the right settings to be able to share. One second. Advanced sharing. Now, can you hear that? Not yet, Jess. I've just got to get to the advanced settings. One second. Advanced sharing. I'm really trying to be fancy today. It popped up last time. Let me just try once more and I'll give it up if not. Can you hear that? Not yet. There's just a square in the middle of the screen. Okay, let me just stop sharing and start again. I'm going to give it one crack and then we can give up if it doesn't work. Right. This is what happens when you're trying to be fancy, right? So, share from there. You should see the screen. Yeah. It worked fine before. It could be a bandwidth issue as well. Okay, I'm going to give up. It's all right. There's a comment in the chat: you could open the video in your browser and share that. That's not a bad idea. I think I'm going to give up, though. It's okay. It was just a really nice little video. I'm going to tell you about the initiative instead. The initiative is focused in South Australia. It's a community-led initiative focused on mental health, and it is a partnership. This is important: a foundation wanted to do things differently, and they provided the funding. It's a small foundation, actually, and they put all their core funding into this one initiative.
And I got involved right at the beginning, when the foundation first decided that they wanted to do this work: before they put tenders out, before they'd got anybody on board, before there were any towns. That's how early I was invited in as an evaluator. So let me just share, now, just my slides. Can you see the screen? Okay. Perfect. So it started off, as it often does, with some research around health needs. The health people will be glad to hear that: research about health needs and priorities in South Australia. And they looked at what was going on, and there were some serious, serious issues, particularly around mental health and wellbeing. And so this initiative was developed with the idea that it would be community led and owned. Of course it didn't start like that, because there were no communities on board, and I'll come to that later. But it had this long-term 10 years of funding, and it was going to be social innovation and evaluation together all the way through, with a focus on prevention and promotion. So while it was about mental health, it was very much going to be led by the towns, and it became more about wellbeing as time went on. In terms of who was involved, I'll explain these acronyms. There are the community teams: eventually, when expressions of interest went out, there was a process to apply, and six towns ended up getting involved. The initiative is three years in now. The towns are Ceduna, Cummins, Berri, Kimba, Mid Murray and Kangaroo Island. There's also an advisory group of the FFF, the Fay Fuller Foundation; that's the foundation. The South Australian government are involved. And there are community members, including people with lived experience, on the advisory committee.
There are also people on the advisory committee from the organization that's supporting the implementation, which is TACSI. TACSI stands for The Australian Centre for Social Innovation. So the support team, as we call them, is a combination of the Fay Fuller Foundation, Clear Horizon (the evaluators) and TACSI (the social innovators). And there is also a network of all the towns, which increasingly plays a role. The model of Our Town, I guess, is that you've got distributed governance, a lot of capability building, participatory decision making, and a very strong focus on networked communities. And it does aim to have policy and systems influence beyond these towns. It is based on four key principles. The principles were really important, actually. We started off with a few more, but we ended up co-developing these principles with the communities, and there were only four of them: community led and owned, leading our way through change, seeing and acting on the bigger picture, and modeling mentally healthy practice. So where we are now: we're about into year three. The towns are on board. They have employees in the towns. They are running the initiative themselves. They are networking together, and we have been building their capability to do their own evaluations. That's where we are now. And it's very much a networked approach, and the network is something that's being really worked on now. There's a network, for example, of evaluation champions from each town who come together and share their practice, and there are other types of networks as well across the whole initiative. But I guess I'm here today really to talk about the evaluation, not the initiative, so let me focus on that. I like to say the evaluation was slow cooked. One thing that I've really learned about working in social innovation is that you can't go too fast.
There's no point in developing a big evaluation framework; you might as well throw it away, because there is nothing there yet to develop a framework around. So it started with developmental evaluation in the very pure sense of the term, because I started right off with the foundation when they were first thinking of the concept. And we started to think about what the outcomes were, and what the principles were that might hold this initiative. Then we moved into the contracting phase, where the evaluators and the social innovators were brought on board. But there were still no towns at this point. So the first year or so was really about trying to work out what the initiative would be, how we would bring towns on board, and how we would ensure that it would be community led. What sort of selection process could we use? How could we involve people with lived experience? And eventually, what were the outcomes? So the role of the evaluator right back in those early phases was very much helping to develop the early stages of the initiative. But I also played a critical friend role. Once we'd agreed on those principles, we very much used the principles to check in on ourselves, to check that we were following them, because it was all pretty wild. And so the principles sort of anchored us in action and helped us capture learning. And we did process evaluations about every six months to capture how things were going, to get feedback and to feed it back to the initiative. So those were the early developmental stages. Sometimes I like to think about the roles of the developmental evaluator at this stage. There's this critical friend role I played, which is very much about challenging people: like, I'm watching, and I'm seeing a pattern we said we wouldn't fall into; we seem to be falling back into that trap again. And helping people stay anchored, particularly in the principles.
There was also facilitating pieces, such as the theory of change development and the principles. And there's what I call the scientist role: there was definitely an evaluator role where we all began to design the longer-term impact evaluation framework and started to think about how we would measure the impact and what that would look like in the long term, as well as capturing notes on the process of what was happening and what the key learnings were. Because it's over a 10-year horizon, and I'd never done a 10-year horizon before, it was really interesting, quite illuminating. So I'm going to talk about two phases really. The first phase went from nothing, if you like, through to selecting towns and getting them on board and getting people employed. Then it shifts into a more community-led phase, which is phase two. So phase one is before the towns, or as the towns were coming on board. This was supporting and defining and helping to build, if you like, what Our Town was: building the model, or codifying it. That meant developing the principles and the theory of change, and developing what we called an umbrella evaluation strategy. It had to be fairly loose, because we didn't yet know what the thing was going to be. You couldn't develop a typical evaluation framework, we had to hold off, but we could have a bit of a strategy and think about what we'd need later on. The second component was helping with the shortlisting and the selection process. We worked right alongside the foundation and the design team: observing, facilitating feedback, checking in that we were following the principles in how selection was done, and that we were acting with integrity in the selection process. And they really wanted to select towns not based on the written word, not based on the typical written applications.
They wanted to let different sorts of applicants be successful. And so there were different ways that the towns were evaluated, for growth and capability and potential impact, very different ways of selection than I've ever seen before. So we were feeding in and supporting that, doing process evaluations of how it went, getting feedback, and so forth. And in the end we also assessed the capability growth of the towns, and that fed into the selection process, because the idea was: were town teams maturing, were they learning, and were they building their capability? That was one of the things that we looked at in selection. There were 20 towns that applied originally, and six towns eventually went through, with four of them getting the 10-year funding. That was phase one. And phase two is the phase that we're in now; it started about two and a half years through. At this phase we really switched focus, because the towns were now in place and the whole idea was that they would do their own evaluation work. So we stopped doing developmental evaluation, and we started building the capability of towns to do their own evaluation. At Clear Horizon we have an academy, so we took people through the online learning academy and coached in between, and they built their own theories of change, built their own measurement frameworks and started to collect data themselves. That took longer than I initially anticipated; it's been about a year of a really strong capability-building phase, with the emergence of some champions. And now we're also beginning to think about how we develop the impact measurement framework for the whole initiative.
And that goes right from policy influence work through to actual changes in wellbeing and mental health, but it's very much grounded in the work of the towns and draws from their own theories of change. So there you go, that's phase two. I thought you might be interested in the overarching evaluation framework. There are four components to it. First, there's measuring against the theory of change, and there are three levels that we measure at: the shared goals that the towns are aiming to achieve, the systems change focus, and the activities and engagement. So, three levels of measurement. Second, there are periodic evaluations, which started off as process evaluations and are moving into picking up emergent impacts. Third, there's the developmental evaluation, which was very strong at the beginning, capturing learnings, and I guess it's still strong, it's just spread across a bigger group of people now, with a very strong learning focus. And then there is this new part, which is capacity building, because we are not the ones who will do the evaluation for the towns anymore. There's the theory of change that was developed, and it's fascinating to develop a theory of change over 10 years. You'll see here, and I won't go into details, that years eight to 10 are goals set by the community, for the community. They're not even named, because it's community led, but there are also intentions to influence policy and create this regenerative network of towns working on their own mental health and wellbeing. And this flow helped us to focus on where evaluation would look at different phases. We have a concept cube: evaluation happens at the town level, but it also happens at the network level, against three levels, three theories of change, and we also have a set of learning questions, triple-loop learning, that goes alongside it.
So that's the summary of the conceptual framework. And these are the levels again: measuring impact, measuring systems change, and measuring engagement and reach. It's quite hard to conceptualize a 10-year evaluation with all these different parts to it, so we tried different ways of visualizing it. Some people really like these concept cubes to describe how different sorts of results would arrive at different times. That was to manage our expectations of when we would see change, but also where the evaluation should focus at different times. But always we were looking at what we were learning, and checking in on our principles, whether we were holding true to them; that's appropriateness, if you like, the way we were working. And we were beginning to gather impact and outcomes measurement as things moved forward, and then progress markers: were we at the stage we expected to be at? So these are the key evaluation questions. There are three main ones. What's the overall impact? To what extent is Our Town shifting systems and conditions for transformation? And what did we do and how well did we do it? That's a simpler framing question. Pushing this across time: the first phase, which I talked about, was basically developmental and lasted about two years. The next phase, which started at about two years and runs through to about three and a half years, is basically setting up the baseline for the long-term impact measurement while beginning to see emergent changes. But there's also a shift at this point, because we're no longer holding all the evaluation work; it's being done by the towns. And the next phase, which will probably start in three or four years, is looking at impact measurement and learning properly. Okay, so I might stop sharing at this point. I would love to talk to you all now about what you've heard, and first of all maybe take some questions.
Because there are quite a few of us on the call, I suggest you put any questions you've got into the chat function, and then I can go through them. So first of all, before we get into what this means for evaluation, I'll just start by answering any questions about this case study that you might have. Feel free to ask questions; I've got the chat function up. I don't see any questions yet. Anybody want to verbally ask a question? Go for it. I have a question, Jess. I was wondering how big the sample size is, and can you give a rough estimation of the ages and gender distribution in the sample? Sample size. I mean, what do you mean by sample size? Do you mean how many people are involved in the initiative? Yeah, that's a good question. So there are six towns, and in each town team there are up to six or seven people who are employed or working in a voluntary capacity, and then they engage with all the people in their town. There will be thousands of people who are engaged, but indirectly, through the town teams who then engage with their communities. But we haven't done broad-scale surveys or anything like that; it's just not where we're at yet. We're right at the front end, forming this thing. It's like we're building the house: we're nearly ready to let the guests in, but we haven't quite finished building the house yet. It's at that stage of moving towards what might look more like programming, but it's way before that. So it's evaluating while working at building, helping build and shape and set things up. It's quite different from a typical evaluation. So I've got some questions coming through the chat now. How do you achieve this if the organisation does not involve you from the beginning? I mean, that's typical, isn't it?
How often do you get involved at the beginning? This is unusual for me. I guess if you're not involved from the beginning, you have to work with what you've got, don't you? That's the normal situation, but I'm really curious, with this example, about what happens when evaluators get brought in right from the beginning. I have to say, though, this is my successful case. I've done a fair bit of developmental evaluation and had some really tricky experiences as well, and I've had to close a couple of contracts because it just wasn't possible to do the developmental evaluation work; the conditions weren't set up for it. So it doesn't always run smoothly. How do you get involved as an evaluator in social initiatives so early? Well, I had a partnership with this innovation group; we've had a partnership with them for many years. I guess it started about six or seven years ago. They wanted to find evaluators they could work with, because they were doing a number of social innovation initiatives and they wanted to evaluate them, and their first experience of working with evaluators wasn't very good. The evaluators were insisting on things being stabilised and having a rigorous evaluation plan, which prevented the innovation occurring, so it failed. So they were trying to find evaluators who could be more flexible, more adaptive and less rigid, and they came to us, I guess, because we've got a history of working in participatory settings. Although we had to learn: we had to really change the way we worked to work in with them, and then we built that partnership. So the partnership was already there, but it happened over years. As evaluators, we had to learn how to show up differently, so there are some implications for our practice here.
A lot of the things that I was taught in my early days as an evaluator, like having a strong evaluation plan, I had to throw out the window. It was quite terrifying to do evaluation work without an evaluation framework. I don't know if any of you have done that, but it was almost like you had to use your innate evaluative thinking skills on your feet, as opposed to the planning part that we normally do first. Because you can't really do that when the thing's not built yet. The plan gets in the way, in fact. I learned that the hard way: I did one developmental evaluation where I went in thinking I would do an evaluation plan, but it was too early, and then I had to redo the plan, as you might expect, and then redo it again; I think I redid the plan about eight times and used all the budget up just doing the planning. Total disaster. I had to learn the hard way to go, okay, I'm just going to hold off. That's what I mean by slow cooking. Just hold off. Don't go in too fast. Find out what's going on. Slow the whole process down. Be there to be useful. Serve the social innovators. The orientation and the positioning of the evaluator is really different. I've got some more questions; let me have a look at them. Okay. Was there gender diversity consideration or analysis? The support team seems to be mostly women. Oh, no, there are men on the support team. In fact, the evaluation team is three of us, one of whom is male, and there are men on the support team as well. I think that was just the photos I showed. Remember that the towns are all diverse as well, and each town team has thought about diversity and who needs to be on the team, and they're all very different. But I guess that's more a design question, and it would be the social innovators and the town teams who have those discussions.
How do you specifically measure the impact of the collaboration on shifting systems, learning and policy advocacy? Yeah, that's a great question. We have methodology in place to do that. We're using a combination of tracking changes in the policy space and then finding out what's happened, and then using contribution analysis, which is the way we often work: using contribution analysis to find out what happened, what role different people played, and whether you can plausibly link that back. We haven't done too much of that yet, although there have been policy ripples and we need to get onto it. But we do have an impact log to track changes that we see. We use eyes-and-ears methods: everybody's got an impact log to record what's happening, and then we follow up to find out whether there's something in it. So it's inductive, if you like: you have to find out where the ripples are, track them down, and see if they relate back to you or not, as the case may be. That's how we usually go about tracking policy change. Let's see. What would be a key learning or a key success for you in embedding capability building within the MEL framework? Key learnings, yeah. I mean, I do lots of training in evaluation; I've done that all my life really, I'm a trainer. So I figured I'd just keep doing it like we'd always done it, but was I wrong. The people that we've been training in this initiative have never done evaluation before. They may not have done programming before. They've come from all walks of life: we've got a mechanic and a hairdresser and a counselor. They're people from the local communities; they're not service providers. It might be what their job is in the daytime, but that's not why they're there.
Most people involved have lived experience of mental health, through a family member or themselves, in some form or other. And some of the people haven't been to university, I guess. So it really got us to slow down, but it worked. It still worked; the MEL capacity building approach still worked, but we ended up slowing it down and coaching into it. So we slowed down, we did one piece at a time, and then everybody did it. Then we brought everybody back together, and we shared it and discussed it and moved on. So we took what we would normally do through the Clear Horizon Academy as a 12-week course, three hours a week, learning how to do MEL, and we slowed it right down and did it across the year, building as we went. And that worked quite well. But I guess the key learning was that you have to go slowly, and it's got to be practical: you've got to actually apply it straight away and have a chance to share. And the sharing between the towns worked really, really well. They loved seeing each other's work. And that's what we want: we want the relationships between them, so that they'll go on after we depart. How have you navigated and negotiated with the client your role as an evaluator, rather than being contracted in a different role? It's interesting, because it doesn't feel like the normal client-evaluator relationship; we don't call them the client. The foundation provide the funding, and they see themselves as a partner, and we the evaluators see ourselves as a partner, and the social innovators are a partner, and the towns are a partner. Everybody's actively involved, and everybody's playing their role slightly differently than what they would be used to. So it's a whole different ball game. This is why I'm so fascinated by this case study.
It just feels like it's starting really differently. Yeah. So navigating in terms of evaluation, it is shifting, though, and there have been transitions. At the beginning, the support team were all chatting together about the findings and interpreting the recommendations for what we'd do differently from each evaluation round. But now that the towns are all there, we've had to rethink our role as evaluators and we're actually stepping into a slightly more independent role. We talk to everybody, do some surveys of everybody, everybody comments on everybody, and we pull it back and produce a report for everybody. And each group then develops its own, what we call intentions, for how to respond to the findings. So we've pulled back a tiny bit. We can do that now because there are so many different actors and we need to represent all of them. Yeah. It's sort of interesting, the movements of the evaluator through this time span. How do you ensure the mindset of the community is willing to engage with research and evaluation? It's funny, because these guys are really committed to addressing mental health in their community. So there's no question of commitment and engagement around research and data. I've never even really thought about that. It's not about me trying to ensure that they're engaged with it. The challenge is more about how to build capacity in a way that's not overwhelming, and to step things out and do things at the right pace. We keep having to slow down. When you've got 10 years, you can go slowly, but somehow there's something wrong with us that we keep pushing too fast, and then we have to take a few steps back and slow it down. Given the flexibility in the evaluation approach, how do you balance and manage the risk of not capturing the baseline that you'll need for the evaluation? Yeah.
I'm fairly relaxed about the baseline, to be honest, because at the end of the day we'll probably be using some form of public data, and that data exists already. So I'm not in a hurry to decide what measures to use, because they have to be decided by the towns. And there's a whole issue in Australia about data sovereignty, and we take that really seriously. We've got Aboriginal towns, First Nations towns, and they want a say in how their towns are described. They want a say in what sort of data will be used, and they want to hold that data. So we cannot rush into deciding what the baseline will be, because we have to wait for the towns to be ready to determine what it is they're doing and then develop the baseline. So the baseline will be decided about three years into the ten. But we can go retrospectively and look at data from earlier, public data at least. Funnily enough, I just don't think the baseline is the most important thing. I used to think it was, but what matters is making sure data sovereignty is respected, the right data is collected, and evaluation doesn't prevent this great work occurring. Something that I've learned over the last five to ten years is that evaluation itself can be the problem, can be one of the big problems. It can be colonial. It can get in the way, it can prevent this good work, and we have to really think about how we show up in these spaces. Systems change work and transformation is all about thinking about how you show up. You have to think about how we show up. Love contribution analysis and impact logs; somebody would like to hear more about that. Actually, if anybody's interested in any of the methods that I talk about, I don't think there'd be time to go through things like contribution analysis here. But Clear Horizon, you know, I'm the founder of a company called Clear Horizon and we're evaluation specialists.
We also teach and run a whole lot of things through our academy, and there's also what's called a community, which is a free space for evaluators to come together and ask questions, and there's a whole lot of resources on there, and you're welcome. It's no cost. If anybody wants to go to the Clear Horizon community page, there's a whole lot of resources on contribution analysis and impact logs and things like that, and if you can't find something, just ask, because that's what people do: how do I do this, how do I do that? So that's emerging. We've got about 350 people. It's fairly active and fairly new, about a year old. Was the 10-year theory of change created in partnership with the towns? Could you talk through how this theory of change came about with such a long-term project and so many stakeholders? Yeah, we didn't start with the theory of change. We started with a set of principles, and they were co-developed. Well, we started before the towns were there. We had six. And then when the towns came on board, we redid them with the towns. So the principles were developed with the towns; it began with the set of principles. The theory of change, there were some rumblings in the background and we did a bit before the towns came on board. Then we kept moving that forward and we shared it with the towns. But what happened next is that the towns have all got their own theories of change, because they've got their own plans and their own goals. So that box in the theory of change just says the towns achieve their goals as set out in their plans. So there are theories of change at different levels: there's an overarching theory of change that describes the whole initiative and how the towns are situated in it, but it's also got how policy change is supposed to happen and how long-term systems change should occur. So there was some feedback. There was a lot of feedback from the towns.
But it started to be assembled before the towns came on board. They've fed back on it, but meanwhile they've been developing their own, and we're shifting the theory of change to come into alignment with what they've done. So essentially, yes, the towns have influenced it, but not straight away, because they weren't on board straight away. Yeah. Although there were community members on the advisory committee, not with the same power that the towns have now. Okay. Does involving community work as smoothly as it's explained, or are there latent conflicts and disagreements? How do you work with that? I mean, this is life. There are always disagreements and conflicts. It's wiggly, wobbly, messy, all sorts of things happening at once. But I do believe that when you set things up well from the beginning, things go much more smoothly. And this is the smoothest I've ever seen it. I think one of the secrets, and I don't know how politically comfortable this is going to be for some of you, is that government weren't involved at the beginning. There was no government involvement in the selection of the towns. We have an election tomorrow in Australia; there were no ministerial announcements about which towns would get funding or not. There were no towns involved who didn't want to be involved; they had to compete to get the funding. And it was all done without government involvement. Government do sit on the advisory committee now, but they weren't there at the beginning, and that has made so much difference. So I think government's got a really important seat at the table in anything, but it sort of helps if they're not there at the beginning. And same with academia, actually.
I mean, there's a really important role for research and academics, but it was great just to work at the beginning with really solid social innovation people, people with community development experience, to set things up, to set the culture, to set the ways of working, because it needed to be mentally safe. You know, these are people with lived experience. It needed to be a safe space. So it was sacred. It was really important that we create this atmosphere where people could speak and so forth. Yeah. Yes, the slides, the videos are linked on the slides, and all the links are available for you. I've made a PDF and given it to Maria and I'm sure Bill will make them available after this, so you can have them. No pork barrelling, no. It's so weird talking about this on the eve of an election. I'm feeling a bit nervous over here, and I don't know how many Aussies we've got in the forum today, but it's been quite a tense election. So let's move to the next part of this. I'd love to get you to think about what this means for evaluation. What if we did more work like this? What would it mean for your practice and so forth? So I actually had two questions, didn't I? I'd better go back to them. My first questions were: are you seeing work like this? Are you doing work like this? Is this similar to what you're doing, or is it really different? So let's do a chat slam. If you don't mind writing into the chat function, we'll press the button together like we did before. Is this similar to what you're doing? Is it really different? Are you finding yourselves being invited into the front end like this? Let's find out if we're having similar experiences. Okay, I'm hoping that you're all typing now. Typing, typing, typing. Okay, let's press go. Yeah, very different, really different, on a much smaller scale, yeah. It's exciting, isn't it?
And I know it's a real privilege. I know this isn't normal. I feel really, really privileged to have this opportunity, but I hope that we can all learn from it and evaluate it. It's sort of like a vision of the future, of what could be. Never worked without some kind of plan; yeah, thank you for being so honest about that. So the general consensus is that, no, people aren't doing a lot of this. I mean, I have a team; Clear Horizon is a medium-sized company. We have about 50 employees and we're an evaluation company, so I think we're one of the biggest dedicated evaluation groups in this region. My team's about 12, no, 16 people, and we work in the social policy space, like you guys, mostly. And we're mostly doing developmental evaluation now. So I think it is on the rise, but nothing quite so pure, if you like, as this. But it does get me excited about the roles of evaluators into the future, and about forming partnerships with social innovators. And they love having us on board. That's the nice thing about it: they love having us and we love working with them. I'll tell you why I love social innovators. They love feedback. They don't get defensive. They're like, oh, fantastic, you've got some negative findings, how good is that? Let me have a look. They smile when I give them negative findings. If only everyone did. So there's a lovely possibility of a relationship between social innovators and evaluators, because we sort of need each other, and you can go places together that I don't think you can go alone. Evaluation can de-risk this sort of social innovation work. And this type of really cutting-edge social innovation work, I believe, has the potential to really address wicked societal problems. And we've got plenty of them, haven't we? So if we can learn to work with this stuff, we can actually work really constructively to make a difference.
And that gets me excited. It gets me so excited that we can actually make a difference. So I think the world needs it right now, hey? Yeah. Okay. So any final comments? We're at ten to two, so we won't be going much longer. I invite you to turn your cameras on and have a bit of a chat about any implications for evaluation. If we were to do more of this sort of work, what would be the implications for us? Has anybody got anything they'd like to say? What do you reckon? What if you all got to do this sort of work? What would it mean if this became more normal or more widespread? What would we have to shift? Yeah. I think the biggest thing from my experience is just the cost implications. The twists and turns, the exploration, the time that you need to spend actually collaborating with whoever the partners are, that has massive implications. Sort of. Because at the same time, I think it puts you in a phenomenal position to do a much better job, because of that semi-embeddedness. And I just wanted to echo your point that people who are doing innovation are so receptive to this work. Working with social enterprises and things like that is, as you said, such a privilege, because you get to work with people who are so engaged in what they're doing, so driven, often really charismatic, interesting people. And it's just such an energising space to be in, compared to a couple of other examples. Yeah. And I love it. Catherine, it is, you're right. And I think about this a lot. It's also a different sort of contracting. It's taken me a while. I'm a bit of a businesswoman, you know, being the founder of Clear Horizon; I think about contracts and business and how it works, and the contracts have to be different.
You know, it's not really milestone-based; you can't go, oh, we'll pay this much for the plan and then this much for the interviews and this much for the report, because you're actually being paid for a service. So it's more like an hourly or daily rate: how many hours, how many days is it? And then how do you manage that? And you can actually bring your rates down. The rates can come down a bit because you get paid hour for hour, which doesn't normally happen, does it, when you do an evaluation and you say, oh, that'll take me 10 hours. So you can be more real about how long it takes, and more flexible in a way. Yeah. And it's more liveable. Yeah. And that's exactly, I mean, I only have myself to be dealing with; I'm an independent consultant. And so that flexibility is a big part of what I'm able to offer, I guess. And very much things are done on, well, this is how many hours we're calculating these different activities will take. And, yeah, exactly what you said, basically. Yeah. But I mean, how do we show that this work is really worth it? Because what I see is that when you work like this, you get amazing outcomes. And to be honest, as an evaluator, how many evaluations have you done where there just weren't good outcomes at the end? I've evaluated and overseen many evaluations, and mostly they're disappointing. And when you work like this, you can start to see the power of what change can achieve if you do it properly. And you're like, oh my God, all that money wasted on things that didn't work. Why not fund this instead? Because it actually seems promising. It's early days, but I'd love to see what this does in 10 years. Watch this space. Yeah. And yet it all adds up to about $10 million, not for the evaluator, for the whole initiative.
And that sounds like a lot of money, but I've seen a lot more money than that wasted on things that weren't successful. Yeah. Well, we're nearly, we're running out of time. Is there anybody else who'd like to comment about the implications for your practice? Yes, Lillian. Well, I think it actually gives the results of the evaluation a lot more credibility, because people are actually invested in it. When you have that early buy-in, you know, I think there's an underlying fear of what the evaluation results will be when it's unknown, when you're working with groups that have not been involved with evaluators before, and they think, oh my gosh, I wonder how all this is going to go. Whereas I think you get that confidence built in at the very, very beginning, because you're travelling along with the evaluators. And I think that's really important, because it gives the whole project so much; people have real faith because they've been able to put in their perspectives and be heard at the same time. So I think that's really important. Yeah, there are very strong trusting relationships with the evaluators. In fact, the community really responds well to the evaluators. And sometimes we've got to watch that, because sometimes they like to have a little word in our ear, like, oh, can I give you feedback about anything I want? So you've got to be careful sometimes. But there is certainly not this feeling that the evaluators are outsiders just coming in to extract and test us; there's none of that. I think we're definitely on the inner, and known well, and have relationships, and get invited to people's houses, and they know us. So it's really a different world. It feels so mentally healthy as well.
I mean, one of the intentions of the initiative is to always model mentally healthy practice. But what does modelling mentally healthy practice in evaluation look like? You know, now that's an interesting concept. Yeah. Thank you, Doki. Any more questions? We might wrap it up soon. Anything else? Yeah, all right, we'll call it a day. Hey, I hope you all have a beautiful day and that this was interesting for you. Yeah.