Welcome everyone. I'd like to thank Tomás for making himself available to speak to us today, and also Colin for introducing us to Tomás and Ventura and facilitating today's talk. I'd like to acknowledge the lands that we're meeting on today. We're meeting on Kaurna land in South Australia. I'd like to acknowledge Elders past, present and emerging, and pay respects to their use of the land and their relationship with the land and water. And I know others here are not necessarily on Kaurna land but are also on First Nations land in Australia. I'll hand over to Colin and he will take over as MC for today. So thank you to everyone who's attended, and for those watching the recording, please feel free to reach out to us. We're always here and happy to talk about evaluation. Over to you, Colin.

Hello. Thanks very much, Mark. And welcome, Tomás. And I can see Ventura's come on board now. Welcome, Ventura. I'm glad you could make it. Mark has mentioned that in Australia we honour our Aboriginal heritage, and I'm coming from Ngarrindjeri land, down south of the peninsula, at the moment. But I want to say a few words of introduction on the context. I've been involved in evaluation since 1983. In fact, my psychology department colleague and PhD colleague, Anona Armstrong, who had already started the first national evaluation conference in Australia in 1982, met me in a coffee lounge one day and said, do you want to help me out running this thing? And so in 1983, when I became the first Research and Evaluation Manager for the Commonwealth of Australia, setting up a pilot program budgeting system, I got involved, and she got involved with the Victorian Government's program indicators. So back in '83 we started developing the second national evaluation conference for 1984, and then two years later the 1986 conference.
And from that conference, we passed a resolution at the end to form the Australasian Evaluation Society. Anona was elected as the first Foundation President, and I was a Foundation Committee member, and eventually went on to co-edit the journal and set up the Committee on Ethics and Standards. So there's a bit of history that I've been able to impart, and hopefully it's relevant for our discussions today about the formation of international and local evaluation associations. I'm just going to share a screen to give us a bit more context. So if we can share a screen... I've lost it now, where has it gone? There's the share screen. Can you see the shared screen of the slides? How's that? Yeah, no problems. Okay. So I've called this a journey. I think it's been a real journey, hasn't it, Tomás? Ups and downs and bumpy roads. It reminds me of the Osibisa song: where are we going? When will we get there? So I guess one of the things that happened is that Tomás contacted me through LinkedIn. And I think it's important to realise that these international linked networks have been facilitated by various means. I think the very first thing you could say started the internationalisation of evaluation was in 1975. You can see the Program Evaluation Standards; I got this second edition in 1994. And it was set up by, can you see it on my screen where my mouse is moving around, the Joint Committee on Standards for Educational Evaluation. Yes. And it's really important to realise that the American Psychological Association, the American Evaluation Association, various other educational associations and the Canadian Society for the Study of Education realised that there was benefit in sharing evaluation standards and the professionalisation of evaluation across borders and across professions.
So the psychologists, educators, education managers, psychometricians and statisticians set up this joint committee on evaluation, based in the domain of education and evaluation. And I was privileged to be part of some of that through my formation of the first Committee on Ethical Standards in Australia. I was also invited to the American Evaluation Association's first collaboration with the Canadian Evaluation Society: a joint meeting held in Vancouver in 1995. There I met the AEA President, Eleanor Chelimsky. She was the head of evaluation at the US General Accounting Office, and she was the most genial host and benefactor. I was struggling to find where to go in the conference venue. She saw a poor lost little Aussie, came over and said, hello, are you interested in the evaluation conference? I said, yes, I'm the president of the evaluation society from Australia, come to participate on behalf of my colleagues. And she said, ah, come along. So she followed me up and put me on stage with all the other presidents and other venerable people in evaluation. And I actually introduced her to Elliot Stern, who had just got the Evaluation Society going in the UK. So that was amazing, and I was able to form good relationships. I asked Eleanor to come out in '97 to be our keynote speaker for the evaluation conference in Adelaide, which I convened. Unfortunately she broke her knee and wasn't able to come. But there's been a long association between the American Evaluation Association and the Australasian Evaluation Society. And the Canadians too, of course, because we're Commonwealth countries, and there are Commonwealth countries in Africa that obviously have kindred relations with the Canadians. But the point I'm making here is that we now have this, I guess, evaluation association register.
Unfortunately, AMMA is not on the list yet, Tomás; we should get back to them to get that done. Where are you coming in from? Well, Tomás is in Maputo, and that's where the conference was held. You've had a bit of bad weather associated with the Indian Ocean, I suppose. But going back now to 2020: I got a LinkedIn message from you, Tomás, and you asked me, here's the message: "I'm from Mozambique and now part of the team working on the process of creating an M&E association. Would you have any availability to discuss a bit and see how we can help each other?" And I thought that was a really important outreach, which the Australasian Evaluation Society should reciprocate, because we were given similar outreach by the Americans and the Canadians. And so I accepted the invitation and I sent a whole pile of stuff: my papers on the history of evaluation, and the guidelines on the ethical conduct of evaluation, which has been through a number of editions now. The Canadians have actually collaborated with us, because we've had very similar approaches to ethics and standards. I also referred Tomás and Ventura to our CEO, Bill Wallace, and the two presidents, John Stoney and Kiri Paratha, who were very hospitable and communicated with you, Tomás and Ventura, and sent along a whole lot of material about our strategic priorities and our reconciliation plan for Australian Indigenous recognition. So we have a guide to ethics and standards, a professional learning competency framework, and also a statement on evaluation in COVID. These are themes we'll be following up now. First of all, I think it is important to come to AMMA's strategic objectives, Tomás, and this is something I felt was relevant for us too. You actually learn quite a bit in terms of strategic planning and working out where to go.
So your objectives were: first, promote interaction, learning and the professionalisation of monitoring and evaluation in the public and private sectors. Well, as you've learned and done, you formalised that through conferences, as we did in the Australasian Evaluation Society. Your second objective: facilitate the development of capabilities and professional networks; we developed a similar thing, the Evaluators' Professional Learning Competency Framework. Objective three: the development and documentation of quality monitoring and evaluation theory and practice. In fact, that was always one of the things the evaluation society was about, the theory and practice, not just the professional discipline, and the evaluation journal is now quite well respected. And then the ethics guidelines I mentioned. These are the two objectives that really got me interested in your conference. One was: advocate for the development of evidence-based policies and programs to enhance transparency and social accountability. A very laudable objective, which we could still learn from in this country. But also the other interesting one: promote new evaluators, attracting students and recent graduates, and professionalisation, something which the AES has always tiptoed around. We've always tried to be a professional association for the discipline of evaluation rather than a guild or a union or collective of evaluators, which the American evaluation associations have tended to be. So, last but not least, we decided to share that through the LinkedIn network. And so you registered your organisation in November 2020; is that right, Tomás? Yes. And from then on we've had a vibrant network going. I'd like you to now carry on the story that I've started, if you'd like to say a few more words about how that came to be and what happened with the conference and the formation that you had. Thank you so much for this opportunity.
First, I'd like to thank the Australasian Evaluation Society for bringing us into this conversation and giving us this opportunity to share our experience. Well, it's a long, long story, which didn't even start in 2020; it started before. We've been struggling, we've been trying, with Ventura leading most of the actions. Ventura and I met after we had worked in the same company, and that's when we said, okay, let's try to go with this; we want to make sure that this happens in Mozambique. And I remember when I started, I was in a discussion group with Jinra Sekan, and we had some discussion; that's when she said, okay, you can talk with Doctor Colin and maybe he can help you in the direction you want to take. So there were a lot of discussions in 2019, and in 2020 we just started creating this movement. How did we create the movement? We created it through WhatsApp, through communication, through emails with our colleagues. When we started, it was mostly people from NGOs, and we started communicating and brought up this idea of creating the association. It was a challenge, because when we started trying to bring together everything we needed to register the association, then came COVID. And we were sitting in different places in Mozambique: there were people from the northern side, people from the central side, and everyone was far away. How could we make sure that this process was democratic, that we could include everyone in the whole process? So it was a big challenge, because things were shut down here because of COVID, even communication centres. So we decided, okay, let's just make sure this idea doesn't go away.
Let's just create a WhatsApp group where we will organise weekly meetings. That's when we started weekly meetings where we would bring a speaker to share experience, to share a vision of monitoring and evaluation, and things like that. And we went on like that for almost six months, weekly. And even if we didn't have a speaker, we would have at least one of the key members of the association sharing a status update on the legal registration and so on. In November, we managed to collect some money from our members just to formalise the registration, and that's when we said, okay, this is done. We started contacting organisations like UNICEF, the African Development Bank, the World Bank, USAID and many others, bringing this idea. And we didn't bring just an idea; we brought a concept note. What is our idea? What are we thinking? What do we want to reach? We looked at our market. I have been working in monitoring and evaluation for more than 15 years, and Ventura as well has been working in monitoring and evaluation for a long time. So we said, okay, what is the biggest problem we have in our area of monitoring and evaluation? We identified some of the points, and then we said, okay, if we're going to create an organisation, what are we going to be doing? What are we envisioning? What support is it going to bring to people in Mozambique? So we said, let's focus not on a lot of strategic objectives, but maybe four to five, to make sure that we focus. We want to help people; we want to make sure that the people in this area feel there are people supporting them, and that they can get support internally. And the other thing which seconded this idea was this: most of the time we would get consultant services, we would get M&E activities, but we would not be the principal people.
We would be the second people, and NGOs would always bring international consultants to run things, so we would be, okay, data collectors. And then we said, okay, it's not bad, but we need to look at why NGOs see us as just data collectors. We need to understand that; it's not just about complaining. And we said, okay, I think we lack professional skills, right? We lack high-level professional skills; we lack that experience and so on. That's where all of these ideas, all of these strategic objectives, came in. And the idea was: if we want to make sure that we are recognised, then let's be professional. Let's be good professionals. Let's not just ask someone to hire you and then, when you're there, you're not even able to perform your job. So let's create these things to make sure we can provide capacity building to people, to help them improve their skills, and also create a means which can guide evaluations in Mozambique. That was the idea, and yes, we brought this into our strategic planning. We also decided to do the first conference. It was a challenge to do the first conference. We secured funding from UNICEF for 2021. Unfortunately, COVID-19 struck again, and we hadn't foreseen doing it as an online event; we wanted to have people at least sitting together, discussing ideas, bringing ideas and things like that. So we said, okay, let's postpone it and wait for the right opportunity. And UNICEF said, okay, we're able to fund you 100%. We went to other organisations: the African Development Bank said, okay, we will fund half of it if there's another funder for the rest. And we went to other organisations like Clusa, and they said they would help us, and so on.
So in the end, when we started, people were like, oh, okay, maybe it's something which will come and then disappear. But when they saw that the idea was there and was staying, organisations started wanting to be part of M&E. And I remember, for example, in 2022 there was a new phase of a monitoring service from USAID. They launched the opportunity, and all the organisations applying for it came to us and asked for a letter of support for their applications. That was not a requirement from USAID, but that's when we saw that, okay, we are bringing this importance into the country, and we want to move on from there. Yes, currently we are almost 500 members. We still have our struggles, but we are getting there. And when we organised this conference, we reached a lot of people outside, international guests, and we managed to have them. For example, our guest speakers included Dr Colin and Dr David. We also had people from the Canadian Evaluation Society and the American Evaluation Association, and we had the Vice President of the European Evaluation Society. So we had all these kinds of people, and all of them were interested in joining, sharing their experience and giving support so that we could boost our presence in the country. And there was a lot of good discussion, and Dr Colin can attest to that. We appreciate it, because it was late at night for Dr Colin; initially he said he wouldn't participate in the whole conference, but since the discussion was very active, very good, he managed to stay for all the sessions, both days. So that was a very good opportunity for us. And that discussion brought us ideas: what are the next steps, what do we want to do, what do we want to create, what kind of movement do we want to bring to Mozambique?
So, in short, this is the experience I could bring from Mozambique for now, in terms of monitoring and evaluation. One of the challenges, as you will have seen, is that we are just starting, because this thing of monitoring and evaluation was brought mostly by NGOs. And most of the material is in English, and we're a Portuguese-speaking country. So imagine what kind of challenge we go through just to make sure that we get the right material, read it, learn something from it, and can perform our activities. It's a big challenge, but we're working on it. For example, this year we also had one opportunity, a request (Ventura, if he's still here, can share this more clearly) from the University of Antwerp, where they said, okay, we want to support maybe two to three professionals from Mozambique to come and have a two-week training in monitoring and evaluation. And the people who went, went because we gave them a letter of support, and the university knows these people are coming from AMMA. So that is very, very good for us, and these people will bring that experience back and share it with the rest of the team; that is our expectation. That's why we're thinking we need to build this capacity. We're also working with CLEAR LAB, making sure we have sessions where we advise and provide mentoring to new monitoring and evaluation professionals. And we also see the opportunity to offer links with capacity-building institutions. And the fun thing is that even though we were not yet registered with AfrEA when we organised our conference, we managed to bring the president of AfrEA to participate in person. Yeah, so interesting.
Sorry, Tomás, can you explain, for those of us who don't know these acronyms, AfrEA is the African Evaluation Association, yes? Exactly, yes. And also, we neglected to actually say what AMMA stands for; can you give it for us in Portuguese? Okay, thank you. So AMMA is the Associação Moçambicana de Monitoria e Avaliação: the Mozambican Monitoring and Evaluation Association. So this is something I wanted to bring up, because one of your keynote speakers was Dr David Ameyaw. I may not have pronounced that correctly, but he spelled it A-M-E-Y-A-W. Dr Ameyaw was a keynote speaker at your conference, and he said he had visited Australia in 2018 and was impressed with the government's use of monitoring and evaluation. Now, we don't usually mention monitoring and evaluation as the key foundations of the Australasian Evaluation Society. But in fact, monitoring and evaluation is obviously an important factor for NGOs: the reporting framework for accountability and regulatory involvement for independent funders and donors. So can you say a little about the importance of that, in terms of how those donors make those regulatory requirements of monitoring as well as evaluation? Okay, thank you. So in Mozambique, I think all of the NGOs have this requirement of monitoring and evaluation in their systems, but it is different with the government. The government is only now starting to require monitoring and evaluation in all ministries, and we have someone sitting in the presidential office for monitoring and evaluation. But they are starting. The idea is to bring in this monitoring piece just to make sure they know where they are going, that they understand what they are doing, rather than just doing things and finding at the end that they were going in the wrong direction. So that is what is driving it.
It has come to a point where it's not just a requirement from the donors, but something they see as useful for their program management. So obviously it's tied to the funding arrangements: the monitoring and evaluation is funded as a percentage of the budget you've been given. That's how the Americans created an industry of evaluators back in the 1960s: the federal government required at least 1% of the budget provided to a program, such as the Head Start education program for socio-economically disadvantaged children, to be spent on evaluation. So the millions and millions of dollars provided by the federal government to the states came with the string attached that at least 1% of each grant to each state had to be spent on independent evaluation, along with the monitoring of the regular reporting, budgetary and other accountability arrangements. Then there's the additional aspect of independent evaluation. Is that a similar arrangement you have now in Mozambique, through the various development banks and donor arrangements? Yes, it's similar. That's something which happens not only in Mozambique but in the region. I have had the opportunity of working with countries in the region here, and I have seen that that is how things happen; there is this requirement. But now it doesn't just come as a requirement someone is asking you to meet; it comes as something which is important to make sure that we know where we are going. So one other thing that came out of the conference very strongly, and as I said it was the sixth of your strategic objectives, is the idea of the professionalisation of evaluators. I was pleased that Dr David Ameyaw talked about Australia, but I wasn't so pleased when he said that every one of us here in the conference should be certified as qualified in evaluation. I think that grated a little with our tradition here.
So for example, I started out as a psychologist involved in evaluating as a program manager, and I've always seen evaluation as an important management tool. I've also always seen it as an independent accountability requirement. Ironically, I then ended up being the first person to set up program budgeting for that evaluative arrangement for the federal government. But the AES has always promoted the idea of professionalisation and professionalism without requiring people to be certified as practitioners or certified as members. It's a trend that's been discussed for quite some time. The Americans and Canadians have toyed with it, but it hasn't really taken off as it has in some other professional groups, like the Ergonomics Society and the Australian Psychological Society, of which I was a member, where you have to be certified to be a practitioner. How is that working now in the African countries, and what was the upshot of that discussion at the Mozambique conference? Thank you. Even in Mozambique, even in our neighbouring countries, it's not mandatory for you to have a certification in monitoring and evaluation to work in it. Even myself, when I started working in monitoring and evaluation, I didn't have a certification. I was an informatics person, a technology person, but I went to work as a monitoring and evaluation officer. Even now, I would say in Mozambique at least, maybe more than 95% of people don't have a certification. The idea behind certification is that this area attracts a lot of funds, and people just come thinking it's a simple thing to do. What we have seen in most organisations is that there are people who don't understand very well about numbers, about indicators and things like that, but since they think it's a very simple thing to do, they say, let me just go there. So that's where this idea came from.
It's not just about a university certification; what we are looking for is to make sure that someone has at least minimal capacity building in monitoring and evaluation: they at least understand indicators, understand frameworks, and understand all of those things when they go in. Because most of the time we hire someone who just went to YouTube, went to the internet and read a lot, and is able to answer the questions, but when it comes to doing the work, is not able to do it. So that is the idea behind it. Well, I think we should be opening up for questions fairly soon. Is there anything else you'd like to say to wrap up before we open for questions? Yes, maybe what I will say is that our vision at AMMA is to make sure that our professionals are of high quality, that organisations and NGOs are able to trust professionals from Mozambique. The other point is to make sure that the government recognises this movement and understands the use of evaluation; that when we're doing evaluation, it's not just for the sake of doing something which is planned, but doing it to use it. That's been Michael Patton's theme for many years, and I was fortunate to work a little with Michael and attend various of his workshops: utilisation-focused evaluation. Yeah. Okay, so can we open up for questions? Mark, do you want to organise that, or should we go through the chat and see what we've got? In the chat we've got John asking a question. Partly I think some of it might have been covered, but I'll repeat it so you can clarify anything. What are the main drivers and motivations for evaluation? Is evaluation a condition of funding agreements, and/or is evaluation being used to improve policy decisions? So I guess we come back to use. I know you've touched on bits of this, but just to add to John's question:
Have you seen any evaluation findings change decisions or improve programs so far? Yes, but it's very difficult to see, especially when we talk about the government. They're still not really willing to take it up: they will approve you to conduct the evaluation, but after you bring the report, they just put that report under the desk and it's done; it's just to make sure they follow the procedures as agreed with the funder, not to use it. But recently we have seen some government departments, not using it 100%, but at least relying a little on the main findings from the evaluations, and that is an open door for us to push them more and make sure they understand the importance of not just doing the evaluation, but also using it to inform policy making. I'll just give a quick example. I participated in a movement where we worked on some policy development through USAID, and we brought some good findings from an evaluation we did on the ground, and the team sat down and decided to think again about what they were bringing to that policy making. And that was a little bit of change; even the government officials agreed to change based on those evaluation findings. So it's not a lot, but it's something which shows that maybe we can work in that direction and it can improve. Thank you for that, Tomás. John, you've got a raised hand, so if you'd like to ask a question. Yes, thanks Mark, and thanks Tomás for your presentation. Here in Australia we have, I suppose, seen several examples of government departments and even NGOs where they've tried to build evaluation in as a management tool, as Colin was alluding to: not only an accountability mechanism, but a learning process. But many of those examples, I'd say, have failed, and then we find that evaluation quickly becomes just an accountability exercise.
I suppose my question is: are you seeing examples in Mozambique, whether a government department, an NGO or a private organisation, where they're trying to build evaluation into their organisational capability not as an accountability mechanism, but doing evaluation for program development purposes, which is basically how Colin said he started out with evaluation? Or is evaluation still largely just an accountability mechanism? Yeah, it's a very good question. From my experience, from what I've seen, I know maybe only one company which works in a different direction. Most NGOs do evaluation and monitoring as part of accountability, not, as you mentioned, as a direction they decided to follow, but to make sure they are accountable for what they are doing. As I say, I have seen one organisation doing that, only one. Maybe there are others, but I know only one where they built evaluation in as a culture of the organisation, not just as an accountability process. But they are few and far between. Sorry, I think I have to reply to John's comment that internal evaluation involvement hasn't really worked, having been involved in helping to set up internal evaluation processes. I think there is an element where John's correct: a lot of these processes in Australia haven't been as effective as you would have hoped, because the senior executives haven't actually developed that culture of being evidence-based in their decision making, or haven't focused on the strategic value of evaluation as an important part of decision making and strategy. And so, yes, quite often lip service is paid internally. But there have been executives involved who have hired me to help develop people internally.
And at the same time, there has been some serious uptake of evaluation and improvement. But as usual, when the CEO changes, it can all dissolve, and when governments change, it can all dissolve. I didn't want to hold things up, Tomás. Mark, do you want to go to the next person with a hand up? Yes. Good morning, good day everyone. Thank you for these discussions. I wanted to get back to what was raised earlier on certification. Thank you for sharing the experience you have in Australia, where certification is not a professional requirement. On the other hand, in our experience we are seeing a movement of training institutions being created, all of them issuing certificates to professionals. And we raise this question: who certified these institutions? Who created them? How strong are they, really, to provide this certification to these individuals as professionals? Because we know that at the end of the day, if I'm recruiting a professional, I would rather take the one with a certificate compared to the other person. So how do we ensure that there is real validity in the certification process being followed in the country? I also wanted to ask about the role of the Australasian Evaluation Society, and see from your experience what role it has played in ensuring there is real credibility in this certification process. Thank you. I believe there are two things I could say about that. For some time there were at least three universities that had strong academic staff and a curriculum running degrees that included evaluation. For example, I ran a master's program in the Flinders Institute of Public Policy and Management for ten years, which had internal audit and internal evaluation as an elective in the management courses in the master's program.
Melbourne University had the Centre for Evaluation, which offered master's degrees in evaluation. Curtin University and others in Sydney, a long time ago, had training and education qualifications registered with the Australian Qualifications Framework through universities. Unfortunately, that has dissolved quite a bit: after I left Flinders in 2004, the program carried on for a short while and then stopped in 2006. The last university standing with a strong staff contingent in evaluation is Melbourne University, and they're focusing more on educational evaluation. The AES has for a long time run workshops associated with each of its national conferences, has had regional groups running seminars, and has set up the evaluators' professional learning competency framework. People such as myself and John and others who run workshops as part of the broader AES developmental program have been required to comply with that framework in the training courses and other offerings they provide. These are short courses for people's professional development rather than certification. So we really haven't developed the capacity for certification in a regulatory manner. The Canadians have, but they've made it voluntary: their competency and certification process is a voluntary one, and like the Australasian Evaluation Society they provide regular professional development workshops associated with their conferences and their regional groups. This regional group in South Australia has hosted a number of conferences and regular seminars, but these are not strictly certification processes. I hope that answers your question. Do you want to comment on it?
Well, that's what we said, and I remember it being said during the conference that it's not just about creating institutions to certify people, but asking who certified those institutions. We need to set up a framework to make sure the certifying bodies themselves are accredited, not just offering these courses without any certification behind them. So yes, you're right. I think John's probably got something to add.

Yeah, interestingly, there's one pocket of certification in Australia. In the Victorian Government, when individual departments want to put up a business case for a large investment, they have to include what we as evaluators would call program logics and theory of change. They don't use that language, but in the State of Victoria, when a government department makes an application to Treasury for substantial investment in a new program or infrastructure, part of the requirement is to include what we would call a logic model and theory of change. That part has to be undertaken by a person who's certified by the Victorian Government. So they don't call it evaluation, they don't use the terms logic model or theory of change, but their business requirement is that when you're putting up a business case you have to include that element, and it has to be done by a person who's certified to do that type of technical work. It's interesting how the Victorian Government has formalised that requirement and set up a training program for people to become certified. People pay money to become certified in that particular methodology, which allows them to be the certified person who contributes to the business case.
Just to finish off on that, John: the Department of Agriculture and the Department of Commerce in Victoria hired me years ago to help run workshops on capacity building and to set up a maturity model of evaluation as part of their career development and their internal capacity development program in evaluation. As I mentioned, Anona Armstrong was hired by the Victorian Government back in the 1980s to set up performance indicators for accountability requirements and the improvement of evaluation, and the existing master's degree in evaluation is in Victoria. Victoria has had a strong influence on the culture of evaluation in Australia and has continued to require that competency and capacity. But it doesn't always work: the program I was involved in dissolved immediately after my contract finished, the senior people involved left and the executive changed, so it lasted maybe nine months beyond my training program. So even where the best intentions, the best culture and the strongest commitment sit in Victorian departments, these things don't always have complete traction and longevity. Sorry, David, I keep interrupting; over to you.

Yeah, well, you actually raised two more points to talk about. My first point is that one of the areas in Australia, at least, where there is a strong culture of continuous improvement, reflecting on practice and looking at evaluations is the emergency management space. I wonder if Mozambique might have that; you could try to engage with some of those people, see what's happening, and there may be some synergies with the work you're doing in Mozambique. I do a little bit of work with the National Emergency Management Authority here in Australia, and they're very interested in making change and learning from mistakes. They may well provide a model you might be able to use; we'll come back to that.
The other thing I should say is that the investment logic accreditation that's just been talked about is largely a commercial operation. It's an international thing, and the training courses are basically handled by a commercial company, or they may license some training. And the third thing has gone out of my head; sorry, David, that's my fault. That's probably enough, we haven't got much time left. I wondered if you'd thought about the emergency management space.

How much interest, commitment and capacity building in emergency management for M&E is there in African countries, or in Mozambique? There is, but as I mentioned, not from the government; it comes mostly from NGOs and from evaluation companies. That's also why we think this is the way to go: to make sure first that we have these people understanding what they are doing and professionalising their work before we claim anything. So there is this movement here.

I should say there are two other things I'd credit in Australia. In 1994 the Australian Youth Foundation hired me to set up an evaluation framework and provide training courses, and I was involved with them in developing their field youth workers to evaluate their own programs using a do-it-yourself evaluation framework manual they developed. They used it for ten years; then the government decided to merge the association, my contract ended, and that was the end of the program after ten years, with not much to show for it. The other was at an evaluation conference back in 1996 or 1998, when the Defence Department talked to me about helping set up a VET qualification. In Australia we have higher education through the universities, but we also have a very strong vocational education and training capacity through our TAFEs, with a separate regulatory body that regulates vocational education and training. And the Department of Defence, which
itself is largely involved in a lot of training, decided to set up a VET diploma in evaluation. I was instrumental in getting the competency statements and the training program curriculum sorted out to the point where we had an outline, and then the various officers left and that was the end of that program after nine months; we never actually got to run the training. But there was an intention there to run it.

I've always said, as an internal evaluator, that both sides are important, as in auditing. Auditing is a good model: internal audit is important to help establish the frameworks of controls within an organisation, an absolutely essential backbone of a lot of operations, while external audit is there for the regulatory requirements of accountability, transparency and the financial viability of the organisation, and the two work together. I've always seen evaluation as a similar process. Where I worked as an external consultant, I was able to work better with the organisation when there was someone internally who took the ball and was able to set up and work with internal audit, or with monitoring and evaluation. When I was working as an internal evaluator, I found it difficult when external evaluators were imposed with a separate framework and a separate agenda, bent on being totally independent without consultation with the internal staff. Internal auditors and evaluators didn't always work well when there was that blinkered, separate commitment to total independence. John could probably comment on that, having done a lot more external evaluation work than I have.

I'm just conscious of time. I think that's a big discussion for another day; something I'm working on at the moment resonates with what you're saying, but in the interest of time I won't make any more comments on that. That's probably a good spot to stop, given we've reached the hour mark. I'd like to thank everyone who's attended, but especially
I'd like to thank Tamash for making himself available to talk to us about his experience in Mozambique, and also Colin for again arranging it and for sharing his wise words as well. On that note, I'd like to wish everyone in Mozambique or other parts of Africa a good morning, and those in Australia a good evening.