by the Queensland Regional Committee of the AES. I'd like to start by acknowledging the Turrbal and the Yagara people as the First Nations owners of the lands where we meet, for those of us who are zooming in from Meanjin, Brisbane. We recognise that these lands have always been places of teaching, of research and of learning, and we pay our respects to their Elders past and present. I'd also like to extend that respect to any other First Nations people joining us today, including anyone who might be zooming in from my home of Aotearoa New Zealand. So welcome, everyone.

My name is Rebecca. I'm on the AES Regional Committee, and I work for the YMCA of Brisbane as an Impact and Innovation Lead. I really enjoy being on the committee. We've got a fantastic committee of people here in Queensland, and I know that around Australia we have some great AES committees. So I very much encourage people to join the AES if they're not a member already, and I also very much encourage people to think about getting involved with their regional committee. If you are interested, you can go to the AES website and find the contact details for your particular state or territory.

I've just got a little bit of AES news to share before we get on to today's presentation. Many of the people here today may have participated in the festival which was held last month, and we had a really great turnout, so thanks very much to those of you who came along. We did circulate a SurveyMonkey link via email, so if you have received that, I'd really appreciate it if you could give us some feedback on your festival experience via that survey. I'd also like to remind people that invitations are open to contribute to the AES blog and also to the EJA, the Evaluation Journal of Australasia. They're looking for people who would like to write something about their festival experience, or anything that the festival got you thinking about.
The working group would love to hear from you if you'd like to contribute to the blog: you can email your festival story to blog@aes.asn.au. You can also turn your ideas for a blog post into an article, and the Evaluation Journal of Australasia would likewise love to hear from you, or from anyone else who's interested in making a contribution to the journal, so do please get in touch with them as well.

We've also got our AES annual report available on the AES website, and anyone who has gone onto the website lately will have seen that it has undergone a facelift and looks really great. So if you haven't been to our new website, please have a look at that as well. And lucky last, I think today is the last day to register for a workshop, Introduction to Program Logic, presented by Rick Cummings. No doubt you would have received some information about that by email, so just a reminder that today is the last day to register.

Okay, on to housekeeping for today's session. Everyone has popped themselves on mute, so please stay on mute unless you've actually got something you want to say or a question you want to ask; you can unmute yourself at any time if you wish. As for the process for asking questions, we'll have about 10 minutes of question time at the end of the presentation, so you can put your questions into the chat box as they come to mind, and at the end of the presentation we'll select a few for the presenters to answer. Alternatively, you can ask questions directly of the presenters when we invite questions at the end of the session. If you'd like to do that, you can raise your little virtual hand and we will ask you to direct your question to the presenters. That icon can be found under the Participants tab on the right-hand side of your screen, if you're not familiar with it.
Okay, so the webinar, like Nicole said, is going to be recorded today, so please let her know if you've got any concerns about that. Alrighty, on to our presentation and our lovely speakers. Today we have Charity Davies, who's standing in at the very last minute for Georgina Roberts, who's unwell. Charity is an Associate Director with Grosvenor and has extensive experience in a wide range of organisational review and improvement disciplines, including leading large-scale change and transformation projects, so welcome, Charity. And we've also got Evie Cuthbertson. Evie has nearly 20 years of experience consulting to the public sector. She is a highly experienced and practical evaluator, which is always good to hear. She has a demonstrated ability to build strong rapport with clients and stakeholders, and works to uplift capability and leading practice in monitoring and evaluation. So welcome, Evie.

Okay, so as you're all aware, today's topic is: are you evaluation ready? Improving the evaluation culture and capability of your team. As we all know, evaluation is a unique skill set and way of thinking. To have an effective evaluative culture, it's important to focus not only on the skills of the individual evaluator but also on the capacity and readiness of the area or organisation as a whole. In this session, Evie and Charity will introduce us to the key features of an evaluation culture, and by applying their evaluation maturity model they will walk us through how to assess and enhance evaluation capability at both an individual and an organisational level. I'm very much looking forward to hearing more, and I will now hand over to Evie and Charity. Thanks very much.

Thanks, Rebecca. Thanks, Nicole, and thank you everyone for coming today, and thank you AES for putting on these events and giving us the opportunity to present today.
My name's Evie Cuthbertson, and I'd like to introduce you to my colleague and co-presenter, Charity Davies, but I can't seem to see her on the screen at the moment. I don't know if that's a problem at my end; it often is.

Evie, I think I have to be speaking, because while you're sharing, people will only see the person who's actually speaking. So hello everybody, I'm Charity Davies, and you'll be hearing from me later in this presentation.

Thanks, Charity. Sorry about that, everybody; as I was saying before, I'm an MS Teams aficionado, but Zoom often gets the better of me. But look, thank you again. Charity and I work with Grosvenor. We're a professional services firm, and amongst other things we undertake a lot of review and evaluation related work with our clients, including work related to the development of evaluation capability at both the individual and the organisational level, and, to that extent, the implications from a workforce capability perspective. That's why we have Charity along today, because she has a particular sweet spot in that area.

So we thought today, in terms of unpicking what we mean by 'are you evaluation ready', it would be good to look at it through three lenses: the organisational level; the policy, program or project level; and the individual level. The framework that you utilise to understand the maturity of evaluation capability is same same but different according to which lens you are applying. And then, rolling all of that together, what overall are the implications at the workforce capability level?
So that's the order of events for today's discussion. I'm going to be talking primarily about the organisational-level and individual-level maturity frameworks, and Charity will be talking in more detail about 'is my project, program or policy evaluation ready', and then tying it all together nicely at the end with respect to workforce capability.

First of all, let's talk about evaluation capability maturity through the organisational lens. Before we do that, we probably need to define what we mean by evaluation capability, and, as with all things related to evaluation, there are about 50 different ways of describing it. Some people call this a performance culture, evidence-based practice, or a results-oriented culture, but for the definition that we'll use today we've gone back to John Mayne, and I'll just read it out so that we've got it front of mind. He's defined an evaluative culture as an organisational culture that deliberately seeks out information on its performance in order to use that information to learn how to better manage and deliver its programs and services, and thereby improve its performance.
Now, that's a bit of a mouthful to drop into conversation at any one point, but if you really boil it down to brass tacks, I see an evaluative culture as an organisation that's clear on what performance it seeks to achieve and what the requisite outcomes are, and that then has a correspondingly strong set of evidence and an understanding of whether it is tracking towards achievement of those outcomes. I know that's a really no-frills rendition of what an evaluation culture is, and we'll unpick it more over the next couple of slides, but if you really get to the brass tacks of what John Mayne is saying there: the organisation deliberately seeks out information, it knows what it wants to understand by way of performance, and it's going to use that information to improve how we do things around here and make it better. That's really what we're talking about today, and what Charity and I want to share with you are the maturity models and frameworks that we've developed along the way as part of our work with our clients, which have supported them to better understand where their organisation sits according to the different capability domains.

If I move now to the next slide, you can see that when you actually start to unpick what an evaluative culture, or evaluation readiness, looks like, there are lots of different related domains. I'm sure all of you out there in zoomland have been exposed to organisations that do evaluation really well, or do aspects of evaluation really well, and when they're doing that you see these kinds of features or signs. I won't go through all of them now, for the benefit of time, but let's just unpick a few. In a high-performing organisation you'll see that continuous learning and improvement is normal practice, and from my experience those are the organisations where learning and sharing of information is valued: there are regular learning events, and people are not afraid to share mistakes. They're not happy about finding out about the mistakes, but people are pleased that when these mistakes are brought out into the light they can learn from them, respond to them, and make things better. Those are the organisations that have really good systems and supporting processes that enable you to collect good data, which then supports the empirical evidence that helps you understand what's happening with your program's or your policy's performance. And the other thing I would observe about a high-performing organisation is that those are the organisations where evaluation is just the way we do things around here: there's a constant culture or environment where you're questioning things, you're using evidence to push things forward, and you're constantly reviewing where you want to get to and where you sit along that road map.

Again, I won't go through all of these verbatim, because you can access the slides later, but sometimes you can see these different features in evidence, and then, if we're up in our helicopter looking down on these key features of success, sometimes you'll observe in an organisation that there are gaps or deficits. The models and frameworks we've developed are a structured approach that you can use to identify where you're going well with respect to your evaluation capability, and where there is room for improvement. These are the models that I'll take you through now: how they were derived and how they were applied. Just by way of context, before I move into the evaluation capability maturity models we've developed, by way of a definition of what
a maturity model is: I'm not trying to teach you all to suck eggs, but we all call things by different names. A model, often referred to as a framework, is a structured approach that enables you to assess, at either the organisational or the individual level, where you are at a certain point in time against a particular capability domain, and that then serves as the jumping-off point for where you want to action improvements or where you want to head next.

Before the model on the next slide, it's probably important that I step back to explain how we designed it, just so that you know it wasn't something we pulled out of our heads; it was evidence-based. I'm sure a lot of you have already come across a range of capability frameworks, either in your workplace (your department probably has one, for example) or driven by outcomes-based reporting and legislative requirements. In developing this model, we reviewed a range of existing frameworks and from that generated 10 evaluation capability domains. As part of that, we then developed descriptors which enable you to peg where you sit on each particular domain according to maturity levels of beginning, developing, embedded and leading. So you can imagine your x-axis is your domains and your y-axis is the maturity levels. To undertake the assessment, we developed corresponding interview guides and documentation guides, and then tested and refined the model. That's essentially how we derived it, how we ended up with this model, and how we apply it in the practical context of working with our clients.

I know this is a very busy slide, and I won't break down each of the domains in sequence. Suffice to say we ended up with 10 domains, and you can see that if you move around the wheel in a clockwise direction they relate to four core groupings, if you like. The culture, leadership, governance, collaboration and engagement, and people domains are a combination of soft and hard enablers, related to people and decision making. Then you have the systems domain, which is about how our systems tie together to support the delivery of program evaluation: are we collecting the right data, can the different data sets talk to each other, are they consistent, can we aggregate them, can we derive bigger and better insights by using different data sets together? Then you have the orange, pink and purple group, the mauve group, relating to the nuts and bolts of planning, conducting and using the actual evaluation data, information and recommendations derived. And the last domain is focused on evaluation capability itself: how are we performing the program evaluation in and of itself?

That's essentially the breakdown of the maturity model, and we apply it in situations where our clients want to look at things from an organisational perspective. For example, we've worked with clients who developed a three-year evaluation strategy, and at the 18-month mark they wanted to understand how they were travelling with respect to delivery of that strategy and whether they were actually gaining any traction according to these different domains. In that instance we had a baseline at the starting point, so we were able to see how far they'd come within an 18-month timeframe and to recalibrate where they needed to adjust their strategies and planning for the remaining 18 months of the strategy.

I've briefly explained how we apply the framework in the organisational context, but basically, from those domains we derived a range of data collection tools, undertook a range of interviews and a documentation review, pulled the evidence together, determined the ratings using those broad-brush levels of beginning, developing and so on, identified where the areas for improvement lay moving forward, and then helped the client to prioritise which of those opportunities or recommendations were a priority and to plan that out. There's a link here (and sorry everybody, I've put the links in the chat if you do want to access any of these details) to an article we developed on driving evaluation readiness in your organisation, and that explains a little more of the application of this particular model if you want to explore it further.

Just a few tips and traps before I wrap up. One of the things we've learned along the way is that it's really important, when you're undertaking this kind of maturity assessment, to be clear about what level you actually want to aspire to. It's just not practical for each and every organisation to be leading across all 10 capability domains. I'm sure some organisations can and do achieve that, but often an organisation may not have the capacity or the resourcing to enable it, so it's really important to pick your battles. As part of the prioritisation step that I talked about on the last slide, that's when you want to work out which capabilities are really going to enable you to attain the most impact moving forward. The other tip I would put out there is that when you start to talk about evaluation capability, people can often get caught up on that train track of key concepts, terminology and what opportunities you need to take up, and I would recommend thinking flexibly when you're out there trying to move things forward.
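The assessment and prioritisation logic described here, rating each domain on the beginning-to-leading scale and then picking one or two battles, can be sketched in a few lines of code. This is only an illustrative sketch, not the presenters' actual tool; the domain names and the simple "levels short of target" gap measure are assumptions for the example:

```python
# Ordered maturity levels from the model described in the talk.
LEVELS = ["beginning", "developing", "embedded", "leading"]

def maturity_gaps(ratings: dict[str, str], target: str) -> dict[str, int]:
    """For each domain, how many maturity levels short of the target the rating is."""
    t = LEVELS.index(target)
    return {domain: max(t - LEVELS.index(level), 0)
            for domain, level in ratings.items()}

# Illustrative baseline assessment (domain names assumed; the model has 10).
baseline = {
    "culture": "developing",
    "leadership": "embedded",
    "systems": "beginning",
    "planning evaluations": "developing",
    "using evaluation findings": "beginning",
}

gaps = maturity_gaps(baseline, target="embedded")
# "Pick your battles": prioritise the domains furthest from the target level.
priorities = sorted(gaps, key=gaps.get, reverse=True)[:2]
print(priorities)  # ['systems', 'using evaluation findings']
```

An 18-month re-assessment would simply produce a second ratings dictionary, and comparing the two gap maps shows where traction has been gained.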
There are lots of really creative ways you can piggyback on existing projects and other opportunities that can carry your capability forward, but you don't necessarily have to call it evaluation capability per se; it doesn't need to be wrapped up in that nice evaluation capability wrapping paper. The other thing I would recommend is that when you've undertaken an assessment like this and come up with a laundry list of recommendations, it's really important to prioritise, because if you've got 10 key things that you're supposed to achieve before Christmas, often people freeze like rabbits in the headlights, the whole thing falls over, your project loses credibility, and nothing gets done. So my recommendation would be to choose one or two things where you think you're going to be able to gain the best traction and just do those things well; ipso facto, once you start to move things along, all of a sudden you've built up momentum and you can manage the rest of the recommendations in due course. I think those are my two or three key tips, and that sort of wraps up our summary of what we've done by way of understanding evaluation readiness and using these frameworks to support that kind of assessment.

If we think back to when we started, we were looking at things through three lenses: the organisational level, the project, program or policy level, and the individual level. So I'm just going to hand over to Charity now to talk through the different kinds of frameworks or checklists that you can use when you want to understand capability at the program level. Thanks, Charity.

Thanks, Evie, that's a great start. I think that's a really nice way of prefacing this: having those layers, the organisational layer, the program and project layer, and the individual layer, and what that means from a workforce point of view, is really important. So, moving from that org layer down to
the project, program or policy layer: when you're looking at your policy, program or project, you're going to be looking through a much narrower lens. You're going to go narrower and deeper, and that requires a differently oriented tool to what we were using at the org layer. On the slide you can see in front of you now, we have a downloadable checklist which is designed to help you navigate your way through planning the evaluation of your policy, program or project. The checklist summarises and organises the steps and various aspects that should be considered when you're establishing a sound and structured evaluation plan. The actions and considerations suggested in the checklist are structured around the four basic questions you can see on the screen.

First of all, what: what is it that we're doing in this program, project or policy, and how will we use the information that comes out of the evaluation; what are the intended purposes of doing it? Which leads to the why: why are we conducting the evaluation, and, more importantly, why are we conducting it now? Then how are we planning on doing it: is this something we're going to do internally, or are we going to seek assistance from other agencies or from private providers? There's also a question around what resourcing is required to get this done in a way where we're able to pursue the right type of evaluation methodology, get the right information and the right data, and do the right consultation, so that we're collecting and analysing information that's going to achieve the purposes we intended for the evaluation in the beginning, while also identifying at this point what the time and resourcing constraints might be with regard to getting that information in a usable way. And then finally, when: when are we going to undertake the evaluation activities, when are we looking for data packs, when are we doing consultations, and what's the timing of key milestones and deliverables?

Following the general sequence of tasks outlined in the checklist will ensure, amongst other things, that you're identifying what key decisions will need to be made throughout the evaluation and when they'll need to be made, that appropriate resources are allocated, and that the evaluation is proportional to the program you're evaluating. I think we've all seen great examples of fantastic overkill in program or policy evaluations, where the cost of the evaluation probably outweighed any risk associated with the program going wrong; but the opposite happens too, where the evaluation is planned in too light a fashion or the methodology is not fit for purpose. Following this checklist will mean that the effort associated with the evaluation is proportional. It will ensure that sufficient support is provided, most importantly from leadership, from management and from key stakeholders; that the right touch points are in place so the right level of information is provided at the right times; and that it's generated in an effective way so it can be used effectively for the purposes intended. It also means that through every cycle, every evaluation of every policy, program and project, we're embedding and reinforcing that evaluation culture, fostering and encouraging it at every turn.

Something you said before, Evie, tweaked something in my mind about that domain around culture. What we see, and why this part is so important, is that if you set out in your plan to use your evaluation as a stick for policing people and for identifying major weaknesses and gaps, then that's the culture you'll embed along the way. If, however, you use your evaluation as an opportunity for capability building, for instituting a growth mindset within the organisation, and linking that to the
strategic objectives of the organisation at every turn, then each time you do an evaluation at that policy, program or project level, you are reinforcing that positive evaluative culture along the way. Thanks, Evie, if you could just go to the next slide, thank you. So what you'll see here... have I lost you? No, I can see you. Sorry, I just lost things on my screen; there we go. What you'll see on this slide is just an extract, or a snapshot, of our checklist. The checklist is broken down into two key columns: it provides tasks on one side, as well as a brief supporting explanation on the right-hand side. I'll post this link into the chat now.

Yes, and actually, Charity, if people do want to go to that article, I've posted it at the very top of the chat, hopefully.

Fantastic, thanks, Evie. So that's all for the policy, program and project layer. Evie, I'll hand it back to you for the individual layer.

Thanks, Charity. This is the final piece, the final lens that we wanted to walk you through today in terms of unpicking evaluation capability at the individual level. It's from the lens of: how am I going as an evaluator, what areas am I showing signs of success in, and where are my deficiencies? I don't know if anyone out there on the participant list was in the recent trans eval series, but there was actually a session on the AES evaluators' capability tool. It was derived from the learning competency framework, which I'm sure all of you AES members will be very familiar with, with domains relating to attitude, professional practice, skills, interpersonal skills and so on. On the evaluation-specific tool I've just referred to, I've been back in touch with the AES to see where it's up to in its evolution and lifecycle. I'm sorry I can't provide you the link at the moment, but it's currently being updated by the AES, and an actual self-assessment tool is currently being developed. So I don't really have anything more to share on that, except that my understanding is that it will be accessible on the website soon once it's been completed. Ahead of its finalisation, I imagine it's going to be something like the other frameworks: different ratings, if you will, across these different domains. As evaluators, I think this is also a really interesting assessment exercise for us to undertake, to keep us honest and up to speed in terms of our own professional development, and in terms of planning where we need to pick up our skills and improve according to the competency domains.

So, I'm sorry I can't share the tool with you today, but where does that leave us with respect to workforce capability? We did want to close on that, looking at evaluation capability within the context of the overall workforce framework, because if you haven't closed that loop, then you're really not looking at evaluation capability or evaluation readiness in context. So I wanted to hand back to Charity now to close us out and step us through just that.

Thanks, Evie. Look, rather than being evaluation specific, workforce capability frameworks are designed to be applied to all of the functions that are undertaken within your organisation. The slide you can see now is just a very short and snappy construct for looking at how we typically build up a workforce capability matrix, model or framework, noting that those terms are used interchangeably depending on the vernacular of the organisation. Typically, a workforce capability model is made up of, at the very top, enterprise capability sets, and what I mean by that is these are the capability sets
that are required for your organisation to achieve its strategic objectives. These might be things like financial management, people management, policy design or service delivery: high-level, enterprise-level capabilities required by your organisation to achieve its strategic objectives. That's typically the pointy end, or the top end, of any workforce capability model. Now, it would be remiss, in my view, without taking too much of a purist stance on this, for organisations, particularly in thinking about how we become future ready, how we prepare for the future of work and start building a workforce for the future, not to be thinking in that enterprise capability set space about where evaluation fits in. There are lots of ways to do this, and no one right way, but it might be that an organisation has, in its enterprise capability sets at that very top level, something like a research and evidence enterprise capability, or an audit and evaluation capability. There are lots of different ways to cut it, but through the design of workforce capability models or frameworks we should be thinking about where evaluation fits in.

The next layer down is that each enterprise-level set includes the capabilities, at a functional level, that are required to achieve that enterprise purpose. For example, if an enterprise capability set was, say, people management, the individual functional capabilities included in that set might be things like recruitment, development plans, performance management and workforce planning. Or if an enterprise set was something like communications, then the functional capabilities that sit within that set might include digital comms, social media management, internal communications, and editing and publishing. You get the drift. We go from that enterprise high level, what's needed for the organisation to achieve its remit, down to what that looks like when we connect it to the functions we actually undertake within the organisation.

Usually, each capability within a set is described based on the high-level elements involved in undertaking that capability. So let's look at evaluation: imagine there's an enterprise capability set called research and evidence, and evaluation is a functional capability that sits within that enterprise-level set. That evaluation capability would typically be described using half a dozen or eight high-level elements. If you think back to what Evie had in those 10 domains, and what we know is in the AES capability framework, these might include things like understanding evaluation theory and its application within the organisation's context, identifying appropriate evaluation methods, planning and undertaking research or inquiry, developing key result areas or measures, planning and undertaking detailed consultations, and analysing large data sets; there might also be something in there about evaluative attitude and professional practice. So that's how a functional-level capability is described, using a series of elements that talk about how that capability is applied in practice.

But then, what does that mean in terms of our workforce, and how do we measure our maturity in undertaking that capability? Typically, the way we do that is to build proficiency descriptors, which describe the increasing skill level associated with each capability, and you can actually marry that up with the increasing maturity levels at the organisational level. Usually what we encourage organisations to do in that proficiency space is to build proficiency layers that might start at the emerging level and run right through to the expert level, so that an individual, or even a team, is able to assess what proficiency level they're sitting at on that capability.
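The layered construct just described, an enterprise set containing a functional capability, the capability described by high-level elements, and an ordered ladder of proficiency descriptors, can be sketched as a small data model. This is a toy illustration under assumed names (the element wording and the four-level proficiency scale are examples, not a published framework):

```python
from dataclasses import dataclass

# Ordered proficiency descriptors, from lowest to highest skill level.
PROFICIENCY = ["emerging", "developing", "proficient", "expert"]

@dataclass
class Capability:
    name: str            # functional capability, e.g. "evaluation"
    set_name: str        # the enterprise capability set it belongs to
    elements: list[str]  # high-level elements describing the capability

def meets_level(self_assessed: str, target: str) -> bool:
    """True if a self-assessed proficiency is at or above the target level."""
    return PROFICIENCY.index(self_assessed) >= PROFICIENCY.index(target)

evaluation = Capability(
    name="evaluation",
    set_name="research and evidence",
    elements=[
        "understand evaluation theory and its application in context",
        "identify appropriate evaluation methods",
        "plan and undertake research or inquiry",
        "develop key result areas or measures",
        "analyse large data sets",
    ],
)

print(meets_level("proficient", "developing"))  # True
print(meets_level("emerging", "proficient"))    # False
```

Mapping capabilities and required proficiency levels to roles, as discussed next, then becomes a lookup from a role to a set of (capability, target level) pairs checked with the same comparison.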
Clever organizations go further and map capabilities and proficiency levels to roles. They say: Evie, you're in role X; here are the capabilities mapped to your role, the ones you need in order to undertake that role at the level required; and here are the proficiency levels, from emerging through to expert, attached to each of those capabilities. A lot of the time, particularly in the public sector context, we see that this lines up nicely with the classification structure, so there's a way of blending the two together. What does that mean at the individual level? The benefit is that the workforce can be baselined against the capabilities and proficiency levels mapped to the roles they currently occupy. Another nice thing is that individuals can also baseline themselves against capabilities that are not mapped to their roles but that they feel they have some proficiency in, whether from a previous role they've held or from particular subject matter expertise they've picked up through training, development or qualifications. There are lots of ways to baseline the workforce against that kind of construct, including developing assessment questions for each proficiency level so that you can validate and verify where an individual sits on that increasing skill scale, if you like. But as I said before, this is a broader construct for capabilities across all the functions that exist in an organization: typically there is a core set of capabilities mapped to every role, and then specific capabilities mapped to only some roles. Evaluation may be one of those specialist capabilities, mapped to only some roles in
the organization. And by no means are we saying you need to go ahead and develop a workforce capability matrix that covers everything the organization does. As Evie said before, you can do this type of thing in bite-sized chunks, so there is absolutely nothing stopping you starting with the evaluation capability: thinking about where evaluation fits at the enterprise level, what the enterprise-level set would look like, how we would describe the evaluation function vis-a-vis the domains in the AES capability framework, how we describe those key elements, and then what an increasing scale of skill or proficiency, from emerging through to expert, would look like within that capability. So that's really just a high-level construct for how we build capability frameworks, and how we would advocate that evaluation capability at the individual level is also built. Next slide for me please, Evie, thank you. This slide is really just to say that the best capability models we've built and been involved in sit at the center of all workforce and capability management practices; they form the fulcrum around which the rest of the people management ecosystem operates. You can see up the top there a range of factors: future of work, structure and design, role design, job and skills design, career pathways. Your capability model should be informed by those, but those items should also be informed by your capability model once you have it in place. Your capability model should be helping drive your structure, because done properly you will have a very, very clear view of where your capability gaps are at the enterprise level, at the team level and at the individual development plan level. So it can inform career pathways, job design and org design, and it also becomes a very useful data set for succession and talent management, and for the ability to stand up teams quickly
and for mobility: standing up tiger teams, cross-functional teams, multidisciplinary teams and so on. It also supports individuals in taking an active development mindset to their own career development and progression, driving that culture and linking active development mindsets into everyday thinking within the organization. Thanks, Evie. Thanks, Charity, that's excellent, I really appreciate you walking us through that. I've got some other links I'll post to the chat in a second, which take you through to further articles if you're interested in expanding a little more on the slides Charity has just walked us through. There is one more slide there, which I'm... Sorry. No, no, that's okay; I think it just ties things together nicely, not just the workforce capability model but everything we've talked about today: that evaluative culture and practice, the communications that go along with it, how you effect good change management in a shift to a good evaluative culture. And that is to start with the value propositions this brings for your organization. In building capability models we always ask ourselves: what's in it for the individual, what's in it at the team level, and what's in it for the organization? The big takeaway for us at the individual level is really that empowerment, or ownership: the visibility of being able to say, I can see what the increasing proficiency or skill levels are in this evaluation capability, or in the suite of capabilities associated with evaluation, and I feel empowered to improve my skill set and proficiency level in this space. It drives an active development culture, as opposed to a hard-wired performance management culture that defaults to weaknesses and gaps; it's a shift in mindset. At the team level it really is about the ability to be much more flexible with your
workload allocation, with shifting work to where the capability and the strengths are, and being able to be very agile in how you construct teams and bring people together to get the work done quickly. And at the org level, the big takeaway we get over and over when we look at the outcomes of building capability frameworks is the ability to do true strategic workforce planning that's over the horizon, that's future-ready: looking at what skills we need to enable our workforce to have, and at the enterprise level, what capabilities we need to really harden up to still be a going concern in five and ten years' time. So I just think that's a nice slide that pulls all of that together. But that's it; now I promise I'll stop talking. I'm sorry, that was the kicker, and I'm sorry I cut you off with more talk. There are some other great articles Charity has written in relation to this that I'll post into the chat in a sec. But look, really, thank you all. I guess that concludes Charity's and my session today; we've really enjoyed it, thank you again for having us. And Rebecca, I think we've got ten minutes now for any questions or further discussion, so I'll hand over to you to facilitate. Thanks, Evie, and thank you, Charity, that was really great. I actually had a question percolating around in my brain that Charity ended up answering, which was fantastic: how do you work to have people understand how evaluation can be part of their role even if it's not a core part of their role? I thought the discussion of the workforce capability model was really helpful, so thank you very much for that. So we do have a question. I invite people to put questions in the chat, or to put their hand up using the little raise-your-hand function. We'll start with the question that P Main has put here, asking: how would you suggest going about trying to change the
culture of an organization where senior management do not want to be evaluated unless the result makes them look good? Would you like me to answer that one, Charity? Yeah, absolutely, go for it. Oh, that is probably one of the most difficult challenges to overcome, and it knocks off a lot of good evaluators. Some of my biggest disappointments have been great evaluation reports with great recommendations that ended up in the bottom drawer because of that wanting only to see what's good and not to learn from mistakes. Where I have seen that turn around has been in a range of instances. One: Charity was talking before about that shift from identifying your gaps and deficits to seeing opportunities and harnessing your strengths; it's the flip side of the coin, if you will. An example I experienced once: we had a really tough evaluation, the results were really damning, and they backed not only the whole program but the organization into a corner. What was really interesting was that the change was driven not necessarily by the decision makers but by the people at the coalface of program delivery, who were passionate about making it work and making it work better. Their reframe was: thank god you brought this into the light; thank god we've learned this now, before the train actually tips right off the tracks and takes more lives with it; and hey, good on you, leadership, thank god you are creating this environment where you are now in control of bringing this train back onto the tracks and steering it safely back into the station. Now, that might make it sound like we were all
very manipulative and complicit in that reframe, but I found that even though the people who took charge of that situation weren't necessarily the decision makers, and weren't necessarily empowered from a delegation point of view, from a grassroots level they really made a huge change in turning that mindset around. The other thing I've seen work in turnarounds is finding champions. If you can find one or two champions in the field, and you have identified through your maturity assessment that this is a problem in leadership, you can bring those champions into the fold to extol the virtues of evaluation and to be, I suppose, a flagship for how using difficult results to improve things has actually been a win-win for everybody; I've observed that on a number of occasions. The other thing is that the maturity assessment in and of itself can bring the problem to light. For example, if you've had an independent assessment which says, hey, you've got all the skills, great, the data is fantastic, but leadership, you're falling short, then having that documented can be quite confronting; it highlights the issue, and because it's based on a robust methodology it can't easily be contested as subjective. So those are three examples where I've seen it turn around, but I'm not saying for a minute that in any of those situations it was easy; it was stressful. Rebecca, I'd be interested if you could ask the participant whether that answers her question sufficiently. Does that answer your question specifically? Yes, thank you, that was very helpful.
That's awesome. Okay, another question here; it's great that I can choose the questions, because this is something that's been on my mind a lot lately: do you see a link between building evaluation capability and building monitoring and data capability in an organization? Yes. Do you want me to take that one, Evie? Yes, that'd be great, Charity. Absolutely, there's no doubt about that. When you think about any capability model, whether it's whole-of-workforce or just the enterprise set related to evaluation, you'll see there's quite a bit of crossover. The trick in building up your capability model is to make sure there are linkages and sequencing between the capabilities, but not overlap, so that when your workforce is building capability, people aren't being measured twice for the one capability or the one task or skill, if that makes sense. Yes, and I would often say too, Charity, that depending on how the organization is structured, that can also cause de-linking, if that makes sense. For example, you'd normally want your data management, data generation and analysis bubbling along nicely over here, and your evaluation capability in sync with it; and if, for whatever reason, because of structural issues, this part of the organization isn't talking to that part, that can derail things. But yes, I agree with you, Charity, they're totally interlinked, and you can't optimize either if both aren't working in unison and to a high standard. And look, one thing I'd point out here is that we have a vast library of capability and proficiency descriptors, and teasing out how they sequence together but don't overlap, there's a bit of
an art to that, but we do have a vast library of those descriptors. So if you're looking for a bit of a read on how to do that, just sing out and we can give you some guidance over email. Rebecca, I noticed there's a question in there around behavioral frameworks to drive change, and I just wanted to quickly say that one of the things that's really important in building up capability is looking at the aptitudes that go along with it. Related to behaviors, but probably at an easier level to assess, are the aptitudes that go along with good evaluative practice, and oftentimes we will work with organizations to build an aptitude framework that sits alongside the capability framework. So there are ways of doing that; there are particular behavioral aptitudes that relate to change. Thanks so much for that, Charity. So look, we've got a few more questions in here, but I think we're going to have to leave it there, mindful of people's lunch times. Thank you to everyone who submitted a question, and sorry if we weren't able to get to it. So I'd like to do two things: I'd very much like to thank Charity and Evie for their time, and I'd also welcome people to stick around for just one or two minutes, because we do have a quick poll, which is what we use to evaluate these sessions. Thank you so much, Evie and Charity; I know I certainly took a lot out of that, and I'll be poring back over your slides and sending you an email to say, can I get some more information on that as well. So thanks, Evie and Charity, and thank you on behalf of everyone here as well. Participants, please stay online for another