 working in the monitoring and evaluation space and using customer relationship management systems to help you in that, if that's been your experience. So thank you for spending Wednesday evening with me to hear me share my reflections on working in Bushfire Recovery Victoria earlier this year. Just a little bit about me: I work in the outcomes and evidence branch, as Ian mentioned. The branch is responsible for leading work on whole-of-government outcomes and evidence reforms, to embed a stronger focus on how we use both outcomes and evidence in decision making and to embed continuous program improvement. More recently, our branch has been doing program-level evaluations and capacity-building work across the Victorian public service to support how we generate, synthesize and use evidence in our day-to-day operations. So it was really timely for me to get the opportunity to work in Bushfire Recovery Victoria earlier this year and help implement these reforms in a very contemporary setting. In this seminar, there are a few things I would like you to take from the presentation. The first is how we can generate evidence to understand client experience, particularly in an emergency management context. The second is how we can use evidence to tell a story about client impact. And the third is how we can embed a demand for the evidence we are generating, to promote continuous learning and improvement. We'll finish off the evening, as Ian flagged, with a small group activity to give you a chance to network with fellow AES members, and to share any examples you've seen of this being done well in practice. I'll start with an introduction to Bushfire Recovery Victoria to set the context before I go into the body of the presentation. 
So Bushfire Recovery Victoria, or BRV, was established as a permanent agency in January this year in response to the devastating bushfires in North East Victoria and East Gippsland. BRV has a very clear remit around leading the government's bushfire recovery efforts in those regions by partnering with communities to lead community action. And BRV provides a really broad range of services. Given the scale of the disaster, the response has been equally massive and is something that will continue for years to come. A lot of BRV's effort has been focused on the immediate work of getting communities back on their feet, such as clean-up efforts. BRV has also delivered a number of services linking people up with case management support. It has established one-stop hubs, in the form of community recovery hubs, in these two regions, to give people a safe space where they can come to get the support they need. There is also a longer-term focus on environmental recovery, economic recovery, et cetera. So it's a broad breadth of services that BRV has been providing or coordinating. Now, what is a CRM, a customer relationship management system? At the most basic level, it is a technology platform for managing a company's relationships and interactions with customers in a central place. And what this image illustrates is that CRMs are ubiquitous. If you've interacted with a bank, if you have a bank account, if you've been to a GP, if you have a loyalty program with a retail store, if you've bought travel insurance, if you've bought concert tickets, you've probably interacted with a CRM. 
So all it does is allow the company or agency you're interacting with to collect details about you and about their interactions with you, including the services they've provided you and anything else that may be needed in that particular organizational context. From what I've seen, a CRM can be used at varying degrees of sophistication. At the most basic level, it can be used just to collect information. At a more sophisticated level, organizations can use it to do detailed data analytics to start to understand and unpack who their clients are, where the demand for services is, what their interactions with the organization look like, and how they can improve profitability or impact. We see a lot of companies in the commercial retail sector using CRMs really well to achieve some of these things. In government, however, I think we are sometimes lagging behind. So what I hope to do is share some reflections on how we can better use the data we collect to tell a story about the impact we're having. For anyone who has worked in the monitoring and evaluation space, particularly in emergency management, you will understand that there are a lot of challenges that limit our ability to plan ahead, especially for monitoring and evaluation. One of them is that we are working in a highly reactive environment. I often felt like this running-person icon while I was in BRV, because you're constantly reacting to things, and as an organization you're working under a high degree of scrutiny from the media, the community and ministers. You're responding to issues, you're making really quick decisions, and you're often working under a lot of uncertainty or without the information that you need. 
This can leave very little room for planning, particularly when it comes to setting up data collection systems, planning how you will monitor and evaluate how your organization is performing, or working out how you will incorporate the learnings from your journey, or from previous experiences, into what you are currently doing and the decisions you're making. Another challenge is that there are often limitations in data collection systems and in data sharing, which means organizations in the emergency management space do not always have access to data that tells them what the client's experience has been, what is working well and what improvements need to be made. Moreover, successive inquiries into bushfires, particularly the Royal Commission, have found that monitoring and evaluation has a really important role in helping us to understand our progress, adjust implementation and learn from the past. One of my reflections from my time in Bushfire Recovery Victoria is the importance of starting early in setting up the foundations for monitoring and evaluation, and that's something BRV did well. BRV set aside dedicated space, time and effort to set up a monitoring and evaluation framework early, which clearly articulated the things we would measure, when we would measure them and why we needed to measure them. Another thing we did in BRV, which links to the overarching outcomes reform work we're driving, was specifying the outcomes the organization wants to achieve, including the outcomes we want to see for our clients. 
On the outcomes work especially, one thing to note is that while the intent was set in terms of the outcomes, the work is still at an early stage and a while away from being finalized; but the intention was set at the very start. The benefit of doing that was that it provided a really strong starting point: any subsequent work we did around designing data collection systems was anchored in the M&E framework and the outcomes we were trying to achieve for our clients. So the CRM, the customer relationship management system, for BRV was launched in the community recovery hubs. When I was in BRV there were about 10 to 15 hubs, and the CRM was launched in each of them back in May this year. So the CRM has been in operation for about three months, and I returned to my substantive role in DPC about a month ago. That's given me some time to really think about my experience in BRV, including the process for designing the CRM and the things that worked really well. As I unpacked those things, I was able to group them into these three themes. So if you are ever in a position where you have to design a CRM, these are some of the things I found really helpful. The most fundamental thing for me was starting with the vision for what the organization is trying to achieve through a CRM. Really, why does the organization need a CRM? This is a fundamental question to answer because it helps you to define the scope and scale of the system, and it helps you answer important questions like: who is going to use the system? Why are they going to use it? And how are they going to use it? In BRV's case, it was really about having a CRM that provides a single official record of all client interactions with BRV across its several intake points. 
So BRV has intake through phone, through its website and email, and through physical locations such as the hubs. This was really about being able to provide clients a seamless service experience when they interact with us, by having that central place where all our interactions are logged. The second thing I found really useful was understanding the service system and the client experience. This is really about how we can ensure that the technology platform we design is user-centered. And this is really important because, especially when you're working in the emergency management context, people are highly traumatized by what they've experienced, and you want the technology to be an enabler for good practice, not something that impedes practice or re-traumatizes people. A tool I found quite powerful for this was client journey mapping. What client journey mapping allows you to do is unpack a client's journey through the service system to really understand what the lay of the land looks like. It also helps you to understand what a client's touch points are at various points in the service system, and what data needs to be collected from a client at each point and why. The third thing I found quite important was ensuring that the design of the CRM, in terms of the data we would be collecting, was aligned to organizational needs. This is really about what BRV needs to know and why, and you can think about it at two levels. The first is the programmatic level: where is the CRM going to be implemented, and what does this program area need to know about a client in order to provide them the services they need? The second level is the organization itself. In BRV's case, that meant looking at the organization's existing reporting obligations. 
For us, that meant looking at the weekly state recovery report, which provided key metrics to stakeholders about how we were faring in our recovery efforts, looking at reporting obligations to the Commonwealth, and thinking about where the CRM fits into that picture and how it can add value to BRV's existing reporting obligations. Another thing that was particularly helpful was aligning our data collection to best practice in other jurisdictions. New South Wales was one of the jurisdictions we looked to, because their system was being designed at the same time as ours, or in fact just before ours. So we looked at what New South Wales was doing to learn from best practice elsewhere. These were my three key takeaways in terms of key considerations for designing a system that collects client experience data. Going back to BRV's CRM, this infographic gives you an overview of the data we're currently collecting. It's broadly grouped into four categories. We collect client demographic information, which is just the basic information about a client along with details of any additional family members, so we get an understanding of who our clients are. Another category of information we collect is permanent and temporary accommodation details, which helps give us a sense of the level of damage sustained to people's houses, including what level of support they need in order to rebuild. A third category of information helps us understand a client's service history. Once a disaster strikes, the immediate phase is the response phase, which is then followed by the recovery phase, which is where Bushfire Recovery Victoria is more active. 
The service history is really for us to understand who the client has interacted with immediately following a bushfire, when organizations like Red Cross and the Department of Health and Human Services are more active in the space. And the fourth category of information we collect is about the information and services clients will require in future. That could range anywhere from mental health support to housing to financial counseling, et cetera. The purpose of this fourth category is really for us to be able to plan and coordinate services for our clients. The key takeaway for me, and the point I really want to highlight for you, is that the current state of data collection is not, and should not be, the permanent state. The current state of data collection is suitable and appropriate for where BRV is at right now. However, as BRV evolves as an organization, it is really important that the data collection also continues to evolve with it, so that we are constantly collecting the information we need, and re-evaluating so we stop collecting the data we no longer need, which may be imposing an unfair burden on clients. As you'll see at the bottom, I've put down some ideas for where future data collection could go, to give us a better sense of what client impact might look like. One thing you could look at embedding into a CRM is a client satisfaction survey. At its most basic level, a client satisfaction survey could just be a very short questionnaire that a client can respond to after any interaction with BRV, to tell us what their service experience was like. We could also look at having a more comprehensive needs assessment, which goes into the detail of a client's situation to help us unpack the current state of where the client is at. 
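To make the four categories above a little more concrete, they could be sketched as a simple client record structure. This is a hypothetical illustration only, not BRV's actual Salesforce schema; every field name here is an assumption.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ClientRecord:
    """Hypothetical sketch of the four data categories described above.
    Not BRV's actual schema; all field names are illustrative assumptions."""
    # 1. Client demographics, plus any additional family members
    client_id: str
    name: str
    family_members: list = field(default_factory=list)
    # 2. Permanent and temporary accommodation details
    permanent_address: Optional[str] = None
    temporary_address: Optional[str] = None
    damage_level: Optional[str] = None  # e.g. "destroyed", "damaged"
    # 3. Service history: who the client has already interacted with
    service_history: list = field(default_factory=list)  # e.g. ["Red Cross", "DHHS"]
    # 4. Information and services required in future (for planning/coordination)
    future_needs: list = field(default_factory=list)  # e.g. ["mental health", "housing"]

record = ClientRecord(
    client_id="C001",
    name="Example Client",
    damage_level="destroyed",
    service_history=["Red Cross"],
    future_needs=["housing", "financial counselling"],
)
print(record.future_needs)  # ['housing', 'financial counselling']
```

Keeping the four categories as distinct groups of fields makes it easier to evolve each one independently, which matches the point above that data collection should not be treated as permanent.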
At the more sophisticated end of the scale, we could embed a pre- and post-client outcomes questionnaire into the CRM. That's quite different to the satisfaction survey, in that the outcomes questionnaire is linked to the organizational outcomes framework; it's very much outcomes focused and thinks about the impacts of our interventions over time. The three key messages I want to leave you with on this slide are these. The first is that data collection must be pragmatic and fit for purpose. You may start off with a really grand plan for all the things you would like to collect, but if it doesn't align with what is currently happening at the program level, it just won't get any buy-in. So it's really important that you prioritize what you need right now, with an eye to what you will need in the future, and keep building on those things. The second, as I said before, is that data collection is evolving. It is important that data collection, and the design of data collection systems, particularly in a CRM, is not seen as a one-off exercise. It's thinking about how we can keep building on it: not just to meet program needs, but also to meet the needs of an evaluator. One thing I found quite useful was to go back to the monitoring and evaluation framework and start thinking about what the key evaluation questions might be for this particular program, and to break that down to the data level: what data do I need to tell a story? What data can I embed into this CRM that will be valuable for me later on in a subsequent program evaluation? The third important thing was having a CRM that is flexible and can be scaled up over time. BRV is using the Salesforce platform, which is cloud-based software. 
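A pre- and post-outcomes questionnaire of the kind described above ultimately boils down to comparing scores against the outcomes framework at intake and at follow-up. A minimal sketch, assuming hypothetical outcome names and a 1-5 rating scale (neither is from BRV's actual framework):

```python
def outcome_change(pre: dict, post: dict) -> dict:
    """Change in each outcome score between intake (pre) and follow-up (post).
    Outcomes present in only one questionnaire are skipped."""
    return {outcome: post[outcome] - pre[outcome]
            for outcome in pre if outcome in post}

# Illustrative scores on an assumed 1-5 scale
pre = {"housing_stability": 2, "financial_security": 1, "wellbeing": 3}
post = {"housing_stability": 4, "financial_security": 3, "wellbeing": 3}

print(outcome_change(pre, post))
# {'housing_stability': 2, 'financial_security': 2, 'wellbeing': 0}
```

The point of linking the questionnaire to the outcomes framework is exactly this: each score maps to a named outcome, so the deltas tell a story about impact over time rather than just satisfaction at a single interaction.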
And one of the things cloud-based software allows you to do is keep building it up and adding more things as you go along, and it even allows you to scale it back if you need to at a later point in time. It's important that you're not stuck with something that cannot change, because the CRM, and the data collection, will inevitably keep growing. So that's an important consideration when designing the CRM. In terms of the insights the CRM can provide, it's still very early days, as it's only been three months since the CRM became operational, but it can start to tell you a story. Firstly, at the programmatic level, it can tell you how the CRM is being used. Secondly, it can start to tell you who your clients are and why they're coming into contact with you in the service system. And over time, you would expect the CRM to start providing insights that will help inform or improve your future emergency response and client outcomes. So at the start it tells you a small story, sometimes just process-level insights, but over time you can keep building the CRM with more outcomes-focused data to start telling your story about impact, both at the individual and potentially even at the community level. I want to finish with a slide about using the CRM data for monitoring and evaluation. I've spoken about how we can use the CRM to generate the data; now I want to talk a bit about how we can use it to tell a story with the data, and how we can create a demand for the data it's generating. The key message here, before I go into these three points, is that CRM design is not, and cannot be, done in isolation. 
I worked in a team that was responsible for leading CRM design, development and implementation, leading the organization's monitoring and evaluation plan, doing the outcomes work, and also leading a huge piece of work around data and analytics. And there were constant conversations with my colleagues to really think about how the data the CRM collects will benefit the wider organization. It's really important to have those conversations to make sure the data you're collecting is contemporary and important. Outcomes, I find, are a powerful tool that can help you use the data you're generating to tell a story about the impact you're having over time. So it's really important to have your outcomes in mind not only when you are designing the CRM and its data collection, but also when you are monitoring and evaluating a particular program area, ensuring the outcomes the program is trying to achieve are front and center, so that whatever data you're collecting from the system can tell a story that resonates with the organization and with the community as well. The second thing is that the adoption of new technology takes time. This is common across all organizations when you're introducing a new system, but with BRV in particular, the CRM was implemented across 10 to 15 sites that were geographically spread out, plus the added complexity of COVID. It meant the adoption of this technology was going to take time. What I mean by that is that consistent practice in the way data is collected is not going to happen overnight. You can have a perfect system, but the practice element is what takes time. 
So as someone doing monitoring and evaluation, one of the things to be mindful of is that in the early days there may be practice issues and there may be data issues, but what you'd be looking for is how those issues are being addressed so things can improve over time. One of the things I did to address some of the issues that came up was to use the data the CRM was generating to create a dashboard for senior executives, as a communication tool to show them how the system was being used and where there were emerging risks or issues that needed to be addressed. The third thing is about how we can use the CRM data to create routine feedback loops to synthesize findings and act on lessons learned. Again, it's very early days, but one thing you could think about as an organization is developing an evidence strategy. At the most fundamental level, that would be about articulating what evidence we are collecting as an organization and how we can start using that evidence by embedding it into our decision-making processes or our governance. One of the things we started doing in BRV, for example, was creating fact sheets and podcasts that were very relevant and tailored to the work we were doing. So when it comes to the CRM, or evaluation in general, and particularly when you're working in an emergency management space where it's really important to keep past learnings in mind so you don't repeat the same mistakes, think about developing an evidence strategy to create those continuous feedback loops of information and get broader organizational buy-in for the work you're doing. On that note, I've finished my presentation, and we'll go into the activity now. It's a 20-minute activity, and it will also give me an opportunity to read the questions that have come through; I will respond to them following the end of the activity. 
And like I said, everyone's come from such a diverse range of industries, and I'm really keen for the activity to be an opportunity for people to network. But I'm also keen for you to share with the group if you've come across any examples of where you've seen CRMs being used to monitor client outcomes and impact in a really effective way, or being used in quite a sophisticated way, particularly in the emergency management context. Ian is going to help me with this, but shortly you'll break out into groups of five, and I'll leave these questions up here. Hopefully you'll still be able to see them in your breakout rooms, but if not: consider yourself to be a member of a team responsible for evaluating a bushfire recovery program. For example, it could be clean-up, which is where you go and clean up a site a bushfire has gone through, making the block clean and safe so the household is able to rebuild their house. You could think about evaluating a grants and payments program, or the recovery hubs, which are a one-stop shop for people to get the help they need, or it could be anything else. Identify one key evaluation question you would use to evaluate this program, and think about what data you could draw on from the CRM to effectively respond to that key evaluation question. You'll have 20 minutes to discuss this in your group. When we come back, it would be great if you can share one or two reflections from doing this activity, including, like I said, any examples of organizations doing this well. Welcome back everybody to the main session; I hope you had good discussions in your breakout groups. I did broadcast a message asking people to nominate one person to report on behalf of their group. 
So if you could just, and hopefully we won't get too many people at once, unmute and please mention your name and give us the report from your group, that would be good. Who'd like to go first? I'm happy to report on group two, if you want to start with two instead of one. Okay, group two, yep. Mike McSteven here. So we took two examples. The first was the clean-up, so the GROCON clean-up of houses after the last bushfires. Some of the simple process measures would be things like the number of houses cleared. If you want to get a bit more sophisticated, you could look at the number of alterations to plans based on customer requests. When GROCON came in, they would have a basic plan, and then they'd talk to the customers about things they wanted left and things they wanted protected. You could even then talk about the number of specifically requested items actually protected, so you get an idea of how well GROCON were engaging with the customers, rather than just how quickly they were clearing the blocks. Another example was a previous program on whole-farm planning after disasters: looking at whether people used the whole-farm planning program and whether they found it useful, so just a satisfaction survey; whether they actually changed paddock layouts on the basis of it, so did it actually influence their behavior; and whether there were some soil conservation outcomes as well. As for the data you would draw on from a CRM, well, it depends a little on what goes into it, of course, but you would certainly want to capture the services those people have accessed. Things like the date they applied for a service and the date it was delivered would give you an idea of how efficiently things are being done and how long people are waiting. 
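The efficiency measure Mike describes, the gap between the date a service was applied for and the date it was delivered, is straightforward to compute from CRM records. A minimal sketch, with illustrative dates and assumed field names rather than real data:

```python
from datetime import date

# Stand-in for CRM service records; dates and field names are
# illustrative assumptions.
requests = [
    {"applied": date(2020, 5, 1), "delivered": date(2020, 5, 8)},
    {"applied": date(2020, 5, 3), "delivered": date(2020, 5, 6)},
]

# Waiting time in days for each request, plus a simple average.
waits = [(r["delivered"] - r["applied"]).days for r in requests]
avg_wait = sum(waits) / len(waits)

print(waits)     # [7, 3]
print(avg_wait)  # 5.0
```

In practice you might also report the maximum wait or the share of requests delivered within a target window, since an average alone can hide a long tail of people waiting much longer.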
Other sources of data could be things like rates databases from councils, to check ownership, land use, overlays and various other bits and pieces that might exist. Most of the councils in Victoria, I think, use a system called Crisis Works, which is used for capturing information, including case capturing. It's like a CRM; not a fully functional CRM, but it's used like one. So there may be some cross-referencing required from something like Crisis Works back to the CRM, to make sure that people receiving services recorded in one context aren't then being picked up separately through the BRV hubs and so on. Well, thank you very much, Mike. You've given us a good view of what you discussed in your group, and I'll ask another person to report. What about group number one? Do we have a representative from group number one? That's Janet. I think that was us. Janet, I'll just ask you to confine it to the one or two main points, because we haven't got too much time. Sure. I suppose we chose as an overarching question: how successful was BRV in setting up or supporting people to get temporary or emergency accommodation if they needed it? Some of the ideas for data from the CRM would be things like when they made the request for emergency accommodation; then you would know how quickly they were able to get into it, through the entry and exit data on that emergency accommodation. But we also realised it can't just be timing; it has to be suitability as well, because the accommodation may not necessarily suit their needs. We had a suggestion that you might also have a register of complaints, which seems negative, but that might also be useful data where there were complaints about issues with some of the accommodation. They're some of the main points, really. And we talked about our experiences. Thank you very much, Janet. You're welcome. What about group number three? 
This is Wayne. Can you hear me? Yeah, we can. Oh, great. We didn't get too far with it; like the others, we felt a bit green around the bushfire stuff. What we looked at was grants programs, and we got as far as talking about the different touch points. We settled on the initial touch point, which was how people first heard about the grants, because we're not all exposed to natural disasters very frequently. We didn't get too much further than that, apart from identifying where we would want to go, what the evaluation question would be. That said, listening to other people, I guess there'd be things like: how did you find out about the grants program? Was it through social media, bulk texts out to numbers in bushfire-affected areas, posters at recovery hubs, et cetera? But yeah, that was the extent of it. Okay. Thanks, Wayne. That's quite informative. And what about group number four? Hi, Verena here. I'd like to report on behalf of Nicole, Addy and Alison. We talked about the example of the hubs, and our key evaluation question started off being formulated around efficacy. We discussed some of the data we would be collecting, obviously our demographics, and the queries that would come in, where there were perhaps multiple queries coming through a channel. We developed a little flag to follow up with our clients, to ensure that if they were referred to a federal service, they were actually able to hook into that service. But Alison and I came up with an idea that I particularly like, which was: let's look at the barriers, at why people didn't use the service. 
And obviously we could perhaps even use the CRM to map against other records of people who were affected, to find out why people didn't use the service. Did they not know about it? Did they not want to reach out? Were they too afraid? So that was really something where we thought perhaps we can use the CRM to identify people who weren't in the CRM, and identify the barriers they were facing. Excellent. Okay, that's a great idea and a good complement to the others. That was group number? Four. Four. Let's try group number five. Who's reporting on behalf of group number five? There were five groups. Nobody to report on behalf of group number five? Okay, in that case we'll leave it. Thanks very much for those reports; they added a lot of detail, and I'm glad everybody had some good discussions. What we're going to do now is return to your questions. If anybody has any further questions, we'll see what we can get through in the time remaining. This session can go till seven, my time, Victorian time. We've already got some questions listed. So, Christine, you've been looking at them; perhaps you could say who's asked each question and what the question is, and then give your answer. Yeah, thanks Ian. While I do that, I might share details for how you can get in contact with me. I don't know if you can see this. You can see it? Yeah. All right, the reflections you all shared were really interesting, and I really liked how engaged you were in the activities. Okay, I will go to the questions now. The first one is from Mike: has the data been used, and did it identify any shortcomings in BRV's activities? So, I was in BRV for two months after the CRM was launched, and because we launched in May, the volume of clients wasn't as high as it would have been had we launched the CRM in, say, February. 
So because it was so early on and the volume of data wasn't that high, the way we used the data was very much around operational improvements. Early on, the data was telling us that perhaps not all the users were using the system, or that they weren't all using it in the way we felt it should be used, the way it was designed to be used. What we did to address the operational side of things was to create dashboards for the executives in relevant areas, but we also used examples from the data to bring the hub coordinators together every week for the first month, to start unpacking some of the shortcomings in data collection, to really get a sense of what the practice issues were and how we could support coordinators in transitioning to the new system. So yes, the data was used, but not so much for identifying shortcomings in BRV's activities. I guess another thing the data, or the CRM design, highlighted was that we needed more. This is why I said earlier that it's great to have a grand vision for data collection, but if you dump it on people when they're not ready for it, it's not going to work. The benefit of starting small and then building up was that people could see something tangible, and they also started to see that, hey, we actually need a client satisfaction survey to tell us how clients are faring and whether they're actually happy with the services they're receiving, and we also need a comprehensive, detailed needs assessment. So I guess in some ways that's how the data was being used. Christine, you might like to stop sharing your screen now, just so we can see more of you. Cool. Yeah, that's great. There was another question from Nicole about how you approached client journey mapping in a crisis context. That's a really good question. There were a couple of ways around that. 
The first was that we engaged heavily with the local councils, with Red Cross and with other agencies working in the space, to really get a sense of the physical environment they were operating in and also the programmatic environment. We spoke to them to unpack who the clients were, who was coming to them in the immediate response phase, what their needs were, what the level of damage was, et cetera. We also got a sense of the technology they were using and any shortcomings in it. And we had a lot of consultants in the early days who offered their services to us, and I think that happens a lot in the disaster space, from what I heard. So that freed up our capacity in some ways, because there were opportunities for people, even though it was a crisis context, to go out into the regions and shadow people to observe what the temporary hubs looked like and what the day-to-day working environment was like. So the client journey mapping was done through phone calls, through existing networks, and also by going out to the sites when people were able to, while being very mindful that people were traumatized and didn't want to be engaging with us. It was about finding that balance.

We have a question here from Helen: was it used for quick-turnaround adaptive management with real-time data? Helen, I think I may have answered this through Mike's question earlier, about how we used the data in real time. But again, it was more on the operational day-to-day management of how the CRM was used. And if we think about adaptive management, I think the way we used the data actually got executives on board and engaged. We also started to see a real change in how coordinators were using the CRM and engaging with it.
They went from being scared of using the CRM to being active co-creators with us of what the CRM should look like going forward, which was a really great change to see in a very short time, actually. So I hope I've answered your question there, Helen.

Melissa, you have a question: we have considered CRMs in the past, and the concern with Salesforce's data security and privacy policy is that the clouds are not within Australian jurisdiction. So there were a couple of things about that. Salesforce has some sort of enterprise deal with the VPS, and as part of that there are certain information security obligations that must be met, and they have been met. We did a comprehensive privacy impact assessment as part of this. I think our cloud is actually hosted in Sydney, and we also added some extra security features to strengthen the security of the system. So yeah, we addressed data security and privacy concerns through legal advice, the privacy impact assessment, and... oh, there, I saw a thumbs up from you. Cool.

We have a question from Pamela: do all state government agencies have access to the data? The short answer is no, we can't share the data, because it's identifiable personal information. But one of the things that BRV as an organization is looking at is putting in place information sharing arrangements with relevant agencies to share more aggregate-level data, from what I remember, to help with some of our business, but not so much individual data. We're definitely doing a lot of extensive work around privacy and putting in place information sharing arrangements. I'm not fully across the detail of that, unfortunately, but happy to put you in touch with someone who can help you with that, Pamela, if you're interested.

Catherine has a question: how expensive is Salesforce? Is it expensive to use and access? Salesforce has an enterprise agreement with the VPS. I think it's called an enterprise agreement.
There's some sort of commercial arrangement, and it means we do get a slightly discounted product. But I don't know how it compares to other products out there in the market, unfortunately. Salesforce is a product that gave us what we needed for BRV.

Question from Claire: at my organization, the CRM admin providing an evidence summary and returning results is a helpful insight; any more details on what worked in this space for users would be great. So Claire, I wouldn't mind following up with you after the meeting, Ian, if that's possible. But basically, around the time I was leaving, someone else came in to take over from me (I don't know if she's in the meeting), and she had the idea of creating some sort of dashboard for workers to look at as soon as they entered the system, which would give them a daily summary of outstanding tasks, that sort of thing. So it personalized the system to their circumstances and would help them from a task management perspective.

A question from Nicole: the Victorian government is a leader in the use of tech and social media; were there any existing government initiatives or infrastructure you could capitalize on? Well, it's not something we have done yet, but one idea is that for engaging with communities, we have a big engagement platform. It's a really great platform for engaging the community on any issue and helping them to co-design things with us. So if there's any existing infrastructure we could capitalize on, I think that would be a great platform for BRV to use.

I think that's the last written question. We just have a couple of minutes left, so I'll give anybody else a chance to unmute and ask a question if they'd like to, if they haven't asked one yet. You know, I wouldn't mind getting in touch with Claire later on, if that's possible.
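The daily-summary dashboard described in response to Claire's question could, in spirit, look something like the sketch below. The field names and filter logic are hypothetical, not the actual BRV build; in Salesforce itself this would be configured as a report or list view rather than hand-written code.

```python
# Hedged sketch of the "daily summary" idea: filtering a shared task list down
# to one worker's outstanding items (open, and due today or overdue).
from datetime import date

def daily_summary(tasks, worker, today):
    """Return the worker's open tasks that are due today or overdue."""
    return [
        t for t in tasks
        if t["owner"] == worker and t["status"] == "open" and t["due"] <= today
    ]

tasks = [
    {"owner": "ck", "status": "open", "due": date(2020, 7, 1)},
    {"owner": "ck", "status": "done", "due": date(2020, 7, 1)},
    {"owner": "jb", "status": "open", "due": date(2020, 7, 2)},
]
print(len(daily_summary(tasks, "ck", date(2020, 7, 2))))  # 1
```

The point of the design, as described, is personalization: each worker lands on a view scoped to their own circumstances rather than the whole system's data.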
There was a question, sorry, that just came from Melissa: does the CRM have the ability to measure outcomes with a social determinants lens? Yes. If, in a future state, we did have the outcomes questionnaire, there would definitely be the ability to monitor any changes or impacts over time, and to overlay that with demographic data to see if outcomes vary depending on people's demographics.

There's a question from Eddie: having a data platform is good, but who will be responsible for following up with the data and constantly drawing insight from it, to fully utilize the data platform? So, the great thing about the way we were set up is that the team I was in was responsible for data and analytics, the CRM, monitoring and evaluation, and outcomes, and we all worked together constantly on this. The way we envisage the CRM working is, in the first instance, regular reporting of CRM data to program areas, because program areas need to report to the minister about progress being made; so program areas, by extension, are looking at how BRV is making progress. The second way we envisage the data being monitored is that aggregate data from the CRM could be drawn out by our data and analytics team as part of their day-to-day functions to provide those insights. So there's a clear role for everyone to play in this space, is what I'm trying to say.

Nicole, you have a question: do you have a sense of any particular cohorts who have not engaged with BRV, or who struggled to access the services they need? Nicole, unfortunately I don't have an answer to that question. But in terms of people struggling to access the services they need, the hubs were set up in physical locations, but there are also mobile hubs.
And mobile hubs are really people getting into a convoy of cars and driving out to very remote locations where people may not be able to access help and support, or where it's too far for them to drive to the nearest hub. So that's something the organization kept in mind in terms of addressing that gap. But I'm not sure about the specifics of anyone who hasn't engaged with BRV.

I think at that point we've reached the end of our written questions, and as far as I can see there are none we've missed. We've got a couple of minutes to go, so if anybody else would like to ask a question, I'll give you the chance now to unmute yourself and ask directly. We've had a lot of really great questions so far, so I think we've done well. So what I'll do then is thank Christine very much for a really good presentation. It was very clear and concise, and you drew out the connections with monitoring and evaluation very well. It was great to do the group exercise so people could meet each other; we've had a couple of comments back in the chat saying people did enjoy meeting people from other areas and from other states, and that's always a good additional feature of our seminars. And thank you also for answering the many questions that came up. So thanks very much, Christine, for a very good seminar.

Thank you very much for attending and for your participation. I really enjoyed hearing the discussion and responding to your great questions. Thank you.

And it's very generous of you to share your email address so that people can get in touch with you if they have anything. I'd be happy to hear from you.