Good morning and welcome to the 11th meeting of the committee in 2015. If you wish to use tablets or mobile phones during the meeting, please switch them to flight mode, as they may otherwise affect the broadcasting system. Some committee members may refer to tablets during the meeting because we provide meeting papers in digital format. Agenda item 1 is to agree to take agenda item 4 in private. Our next agenda item today is consideration of a negative SSI, the Local Government Pension Scheme (Scotland) Amendment Regulations 2015. That is SSI 2015/87. Members have a cover note from the clerk explaining the instrument. As it notes, the Delegated Powers and Law Reform Committee has pointed out that this instrument contains nine drafting errors, so the Scottish Government has agreed to lay an amending instrument after the start of the new financial year to correct those defects. Do members have any comments? For the Official Report, I would like to record our disappointment at the continuing high level of inaccuracies contained within the statutory instruments that we consider, given that statutory instruments are law. I would also like to note our thanks to the members of the Delegated Powers and Law Reform Committee, their staff and the parliamentary lawyers for their good work in addressing those on-going issues. That being said, we have agreed not to make any recommendation to the Parliament on this instrument. Agenda item 3 is our local government benchmarking framework annual review. We will have an oral evidence session with witnesses from Solace Scotland and the Improvement Service on the on-going progress of the local government benchmarking framework. As members will know, we have taken a keen interest in the progress of the framework and have held annual evidence sessions on its development for the last few years. 
I would like to welcome Angela Leitch, who is the current chair of Solace and is also chief executive of East Lothian Council; Colin Mair, the chief executive of the Improvement Service; and Emily Lynch, senior project manager at the Improvement Service. Before we move to questions, do you have any opening remarks that you would like to make? Thank you very much. We will be very brief, because I hope that the paper is reasonably self-explanatory and raises the questions. First of all, I want to thank the committee for its continuing interest and support in developing the benchmarking framework, and to record my own interest in the comments that you got back from your online survey of people's responses to the framework. What the paper does is refresh the point that the purpose of the framework is to create a range of high-level comparable measures across the 32 councils. We have explicitly, across time, used the language of can-openers: what those indicators do is pose questions for councils rather than answer them, and the language of drill-down has been used recently by us in respect of that. If a council looks off-trend on a particular indicator, that poses the question that you drill down within your services and engage with your communities around why that would be the case and how improvement can be made. We hope that it is one contribution to a whole range of improvement tools that councils are using. Just to reassure some of the people who fed back on your online survey, it fits in with, for example, self-assessment using EFQM or variants of EFQM across the 32 councils and community planning partnerships. Spell out what the acronym actually means, Mr Mair. It is the European Foundation for Quality Management. It is a self-assessment model used in the private sector, and there is an adapted version called the public service improvement framework, which is used in Scotland in the public sector. 
The benchmarking framework continues to be in development, and there are areas that still need to be substantially strengthened, including our understanding of children's learning, growth and development across the period of preschool and then throughout their primary and secondary schooling; work is on-going to try to get to comparable measures in that. A final point from me would be that we are still operating within the framework of an Accounts Commission direction. As I think the committee has discussed in the past, this framework replaced the statutory performance indicators that were previously laid down by the Accounts Commission. However, the Accounts Commission does place a direction on councils annually that they must report on the local government benchmarking framework data and that they must put it into their own local public performance reporting. The reporting part of that is happening at council level and down to communities. The framework is there to support councils in having the data that they need to report to communities. If I might also introduce Angela, just to say a bit about how the framework is actually being used at council level and with communities. Good morning, everyone. At a local level, I took the opportunity to speak to some of my colleagues in advance of coming here. I think that it is fair to say that the benchmarking framework is now firmly embedded in the public performance reporting that local authorities do. All of the indicators appear to some degree in each of the 32 reports that are produced on an annual basis. Beneath that, there is a further range of measures that support the benchmarking indicators. That allows us to drill down either into more thematic groups or to geographical areas. I have some examples that I can explain further into the meeting. As Colin said, on a practical basis, benchmarking is used very much as part of the improvement toolkit that we have. 
Again, we have a variety of examples where the benchmarking data is used to develop improvement plans and future service plans, which in turn are monitored through the benchmarking indicators. Equally, now that we have three years' analysis and three years' data, we are seeing trends. That trend data is becoming very useful in further engagement with our communities, in discussion with communities as to priorities, how we might change some policy decisions or whether there are efficiencies that could be made. That is where the family groupings are increasingly important. We are looking at comparisons elsewhere in Scotland and the learning that we can individually take back to our own local authority area. As part of the exercise, we went out to the public and asked their views on the framework, and we have had various responses. I will start with one from a Fife environmentalist, which will come as no surprise, because we want to hear your views on the benchmarking system. His immediate response was: "My view is that it is mince. On a good day, I might be generous and give it 2 out of 10." How could you persuade the Fife environmentalist that the framework is working and that it is making a difference compared to the previous performance indicators that were used? First, I would take one of the points raised very forcibly and succinctly by the Fife environmentalist: if this was being presented as a measure of environmental outcomes in Fife, including carbon emissions and so on, it is not that. It is a benchmarking framework of what councils call their environmental services. That means that we are looking at waste collections, street management and so on. I think that a bit of what the Fife environmentalist wants is a statement of outcomes. 
There is a parallel piece of work going on with community planning partnerships, which represent all the public agencies in an area, to get to a set of fairly standard outcome statements that will be published annually and that will allow the public to look at how things are changing in their area. This is a benchmarking framework to help people who run services to compare themselves with other services elsewhere. The Fife environmentalist is entirely right: it is therefore not measuring the type of environmental outcome for Fife that he would wish to see, but I would reassure him that there is a parallel piece of work going on there. What I can also say, in terms of how this works, and Angela will talk more to this as well, is that where people seem to be significantly out of kilter with how other councils like them are performing, it is now being used to drill down quite hard to ask why that is. Is it a fault in our systems? Is it something to do with our mechanisms for delivery and so on? That then leads into improvement planning and improvement delivery. I think that we could give examples, and Angela could talk a bit more to this, of where this is being used in quite practical ways to drive forward change in services, but the core point about outcomes that the Fife environmentalist raised is a valid one, and it is being addressed in a parallel stream of work. It is really the point that he makes about what it is. I think that what he is really saying is that you are not necessarily measuring the right things. I suppose that is the point. I remember when the benchmark came out a couple of years ago; if you looked at it, the comparison was the cost of children in care. If you looked at Fife and, in comparison, a similar-size authority, South Lanarkshire, Fife's costs were way higher, but it was then what sat behind that. If I remember correctly, the Fife homes were much smaller homes. 
The South Lanarkshire homes were much larger homes, but what then sat behind that? Did it mean that, in the Fife homes, children in care had more chance of succeeding? Was it better quality care? What I am trying to get to is: how much does the benchmarking exercise tell you what it is that you are measuring, and is it meaningful? Are you measuring like for like between authorities? It is very much a can-opener. The example that you give about children's services, we use that extensively. The key thing for me is that services are self-aware; they know why the differences exist. I think that we should expect differences, because each of our local authority areas is very different, and we have different practice. Elected members are elected on a manifesto, and they determine local policies that are different, and it is on the basis of those policies that some of the practices within local authorities are undertaken. In that example, though, I would expect that there would be deeper analysis. In having smaller homes and that higher staff ratio, are the outcomes for those young people any better? If not, what can we do? Is there a policy change? Is there a practice change that needs to be undertaken? That is where it comes into the wider improvement agenda, so that you are looking at it not just in terms of the raw data but probably engaging with other professionals. Asking what we could do differently, particularly on the attainment side, is crucial. We have a lot of examples where, on the face of it, the raw data would look as if an individual authority is not perhaps performing as well as either the national average or other comparable local authorities. However, when you look beneath that, you can then understand why. It is then up to individual councils to determine whether they want to continue with that practice. If I take my own area, our roads services: we currently spend more on roads than the national average. 
However, one of the reasons for that is that, at this point in time, our investment is not just in resurfacing; when we are doing the resurfacing, we are looking at drainage and we are looking at kerbing, particularly on our rural roads, because, in the longer term, we know that that will help the resurfacing to last for a longer period of time and make that investment better value. It is the self-awareness that is crucial. We heard in previous sessions that all those metrics were to be caveated by authorities, as you have just done in the case of East Lothian's roads. Is that happening? Have the general public got an understanding that there are caveats in place because local decision making has made some authorities do things in certain ways compared to others? Can I say something about that? We worked with local councils recently to look at how they would provide that information to the local public and how they would provide those caveats or, if you like, that local context. What was identified as being critical was not just to put the data out there without some supporting narrative to help people to understand what the local priorities were, what the starting position, if you like, was for that particular council, and what policy objectives that council was pursuing. Councils are currently working to improve the way that they include that information in their reporting in order to provide those caveats and that narrative. We have an event with councils next week to look at the good practice that is emerging across councils in how that has been reported to the public and what feedback we are getting from the public in relation to that reporting, and at how we continue to improve that. We are working with Audit Scotland as well, because it is currently undertaking its reviews of public performance reporting, to build on the findings from that review so that it can shape how councils address that. Do you want to come back? 
I think that it might be useful, following that event, if the committee was supplied with information on how councils are using that information and how they are reporting it to the public. I think that that would be useful for a further discussion. On the general numbers being published and discussed, as they always are: in an ideal world, the reasons why those numbers are different in Fife, for example, would be spread around authorities, because they may choose to make the same decisions. We will probably come back to outcomes in a second. In terms of the metrics that we discussed at length in the run-up to the formation of the new framework, one of the old indicators that were used, how many library books were borrowed per thousand population, really did not show what services libraries provide in today's world. We have from Elma Murray, the chief executive of North Ayrshire Council, a wee statement about the correct metrics or indicators. She says that there are some indicators that would be more appropriately measured by alternative means. For example, one of the indicators is the cost of parks and open spaces per thousand population. Would that metric be better suited to acres or hectares of parks and open spaces? Are you continuing to look at each of the indicators and trying to modernise, if you like, the actual measure itself because something has become a little bit irrelevant over sometimes short spaces of time? Ms Lynch? Yes, absolutely we are. It has been a key priority for the programme to this point and it continues to be a priority. We have identified that there are limitations in some of the measures, and there are still gaps in some of the measures that we have within the framework. Again, we have been working with all 32 authorities to identify where the limitations exist, where they are still causing some concern in terms of their robustness and where the key gaps are. 
Some of the priorities to address in the period ahead are improving some of the guidance around the financial measures to ensure consistency. We want to strengthen the indicator that we have on gender equality, because at the moment we just focus on women in the top 5 per cent of positions, but we are interested in gender equality across the workforce, so that is an area that we are looking at. We are also looking to strengthen the measures that we have around outcomes for children. Colin mentioned that earlier in terms of preschool and primary school education: at the moment we have cost measures for those, but we do not actually have an outcome measure, so we are working with ADES and other professional and educational authorities to look at how we address that. We are also strengthening the measures that we have for older people and adult social care, because again we recognise that this is an area within the framework that requires to be strengthened. A specific example that we are also working with councils on, in terms of sport, culture and leisure, which you mentioned, is to look at whether or not a net measure of cost in that aspect would be more relevant for authorities than the gross measure that is currently there. That has been identified by the directors of finance as a piece of work that they would like to take forward. There are certainly some areas that we are looking at improving. There are a number of issues that I would like to pick up on in terms of the written submissions and the issues raised by the call for opinions from the public. What is the engagement of COSLA in this process? What I have heard so far is that it all seems to be officer led, and Ms Lynch made a reference to an event taking place next week. Who has been invited to that event? Is it purely officers, or are elected members invited to those events as well? We run a number of events across the year for a range of audiences. 
The event next week is specifically for officers. However, we also run events for elected members. We ran four different events last year for elected members, and we have also scheduled four regional events for elected members this year. We recognise the importance of elected members. COSLA is on our project board; it is the vice-chair of the project board, so it absolutely provides an on-going steer and involvement in the development of the programme. We presented at the recent leaders meeting in January to all the elected member leaders in order to ensure that they were up to date with the programme developments and understood the key themes that were emerging from the project at this stage. We certainly do prioritise that area in the work programme. I think that most local authorities will embed that within their development programme for elected members. Only yesterday, our benchmarking information went up to our policy and performance committee. At that level, the elected members must have spent about an hour scrutinising the data that came from it. That is where the narrative is. At a local level, having it embedded in our public performance reporting in a variety of different ways allows members to give the scrutiny and make the challenge to officers that they would expect. The council committee spent an hour scrutinising that data. How long does it take the officials to scrutinise the same information that has been presented? I am sure that it is longer than an hour. It is part and parcel of the way that we look at improving services. We use that with a range of other measures. Part of the benchmarking is satisfaction responses. Most local authorities take that to a much more refined level. In Fife, there is the people's panel, and a lot of other local authorities have a citizens' panel. The engagement through that is crucial. The time spent is becoming much more part of the whole improvement agenda. 
Increasingly, we are looking at aligning the feedback from complaints or compliments back into some of those areas where we think either that we want to improve or that there are policy issues that we need to be thinking about as a council. It is not just about engagement with people who are interested in that type of engagement. It is about using the passive responses from individuals on our services and tying them back into performance information in a way that helps us to think about how we can improve. That is what it is all about. It is about how we can make better use of our resources and how we can provide a better service to the people of our communities. John Finch-East, can I ask the panel, are you confident that all local government elected members are aware of the benchmarking process and the criteria used within their own local authority for measuring service delivery? Ms Leitch, you mentioned earlier that councillors are elected on manifestos. How does that tie in with the benchmarking criteria and performance of a local authority? How does it stand against what might be the manifesto that the leading party that takes administration is elected on? If I can give you a practical example from my own area, our elected members have prioritised the environment of East Lothian, the parks and open spaces that were mentioned earlier, and that has been one of the responses. We spend the most of all the local authority areas on parks and open spaces. That is a policy decision. Having said that, we are now looking at whether or not that level of expenditure is something that we want to continue with. That is a process of engagement with our elected members, first and foremost, but increasingly it is an engagement process with the electorate, the people of East Lothian. One of the things that we are now doing as a result of the benchmarking information is that we have taken a selection of people from our citizens panel. 
We are now looking at a citizen-led review of parks and open spaces to help inform that policy decision. That is an example of where policy, benchmarking and engagement are starting to come together. I would be very confident that councillors across the council are familiar with their own performance framework and with how the benchmarking framework is embedded in it. If you are asking me whether all 1,200 councillors in Scotland are entirely aware of the benchmarking framework as it appears in our overview report or on the website, very possibly not, but that would not bother me very much, because the point of the benchmarking framework is to support their own local performance scrutiny, not to be treated separately from what they are doing as a council by way of both scrutiny and improvement. I would be confident about your latter point; I could not honestly give you an accurate figure for the first point. Ms Lynch, do you want to add anything? I will add two more technical points. One of the things that the project board overseeing the programme looks at on a regular basis is the extent to which individual councils are including that information within reports to elected members. It is an on-going focus. We were asked by councils last year to deliver a programme of training for officers to support them in developing awareness sessions and approaches with their local elected members, to support elected members to engage with that information and to interpret it. We delivered a programme on that earlier this year. One of the issues that I have is that, yes, you can deliver training to officers to then take that to the elected members; it is whether or not the elected members feel that there is any value in participating in those training events. 
I know from the authority that I sat on, where I attended some of the training events, and I still get information regarding some of the training events that are held in the local authority, that there may be only half a dozen elected members who turn up to particular training events. It is about trying to understand, because benchmarking is not just about what happens within the local authority. My understanding is that the benchmarking process and framework were established so that local authorities could compare against each other and within the families that had been identified. How do we make sure that that type of measurement against the delivery of the family groups, or of other local authorities of a similar nature or size, is taking place, so that local authorities understand how they could improve? Part of it, as I understood it, was to try to deliver things better and use the examples from other best-practice authorities. I can say a little bit about the programme of family group work that is on-going, because, quite rightly, it is of particular interest to elected members also, in terms of when we present or share the information at a high level and you are then able to share further, richer detail about what is emerging from the family groups. I think that that makes the data far more relevant. Our family groups are currently being established in areas such as services for looked-after children, which we talked about earlier, and also in waste management, council tax and sports services. All 32 councils are currently working within those families in order to come together to use the high-level data as can-openers and then to drill down into that data to try to understand better what is behind the differences between those councils and what opportunities there are to learn from each other, because some of the differences, as we have talked about, are absolutely about policy priorities, local decisions that have been taken and local context factors. 
However, some of the differences in performance are absolutely about new or innovative practice or about different ways of working, so that work is on-going. We continue to work with councils to roll that out across other areas of the framework. Already, there are examples of good practice being generated and highlighted in those groups. I would like to ask a thousand more questions, but I think that I have taken up enough time at the present moment. You may get the opportunity later. Clare Adamson, please. Thank you very much, convener. Having also served as a local government councillor, I was interested in the differences between the areas that are benchmarked. I can quite easily see how you can do a financial comparison, or a delivery comparison with uplifts, lights being on and off, and things like that. When we get to the issues that my colleague Alex Rowley raised regarding areas where it is more intrinsic and more about the outcomes, I draw your attention to the evidence from Museums Galleries Scotland about the focus being on the financial indicators and visitor numbers, rather than anything to do with the importance of and contribution to wellbeing. If I put it in the context of the Government's strategic objectives from 2007, of a Scotland that is wealthier and fairer, safer and stronger, healthier, greener and smarter: if we are having to look to other reports and other pieces of work to get the full picture, is there not something fundamentally wrong with how we are approaching the benchmarking process? Should we not be able to do all of that within that process? On the point about outcomes that our colleagues from museums and galleries raised, benchmarks are measures that pose questions for people, pure and simple. The financial measures, we are satisfied, are standardised and accurate. The footfall measures are taken on exactly the same basis as the national galleries take their measures, and so on. 
If they were wrong for local government, they are certainly wrong for everybody else as well; if they were wrong, a lot of the figures that we talk about for attendance at galleries and museums would simply emerge to be wrong in that model. We are satisfied that the measurement is right. We see the point that is being made about the measurement. If some people have, for example, very significantly managed to increase footfall to their museums, because most of the change here is increased use rather than decreased expenditure, the unit cost per person has come down because of that. That is a success story, and it is important. What have they done to get there? How have they managed that? In some cases, that will emerge to be that they have begun to run free bus services for certain communities so that those communities can access the art resources that we have, because it would cost them quite a lot on low incomes if they had to get buses there themselves, and they have started to look at the whole transport connection. In a way, we see the benchmark as posing a question for people. There would be a small number of museums where that question is posed quite starkly: there is a genuine issue of footfall declining quite sharply. Why is that? How can that be reversed? On the outcomes of museums and galleries, I have to say that I am working on a completely separate thing with the directors of culture and leisure in Scotland just now, which is entirely about how we begin to better demonstrate the value that we believe comes from participation in music, sports, arts and so on, which I do believe in. I would have to say, even sitting down and looking at the value studies and so on, that we are still quite clunky around defining outcomes here and we are still quite clunky about measuring them. There is a lot of assertion, but if you poke a stick at the assertion it often dissolves fairly quickly. 
There is quite a lot of work to be done in some of those areas to get to a much clearer understanding of outcomes, but I think that this committee has raised that with us before: outcomes for whom? If we are running, absolutely and rightly, uncharged art galleries and museums in Scotland, presumably we want to make sure that the whole community benefits from that and not just some sections. We would also start to need to know more about the segmentation of our audience. Are we disproportionately getting use from some communities and disproportionately not getting use from others? If that is the case, what are we going to do? That is the drill-down bit that Angela referred to earlier on. We know that some councils are looking at this matter in some detail. Who is using the services? Who is not? What could we do to get the people who are not using them to use them? I would axiomatically assume the value of arts and museums, but we then have to demonstrate our value by asking: are we getting the footfall that you would expect if you are publicly funding absolutely and universally free access? That is the question that the benchmark poses. We then need to go on and answer that question. I absolutely accept the point from the galleries and museums. Ms Leitch? I think that this is where the can-opener phrase really comes into its own, because from that, and I was interested in the observations made in the responses, in my own authority we would then take that back into our improvement framework. The improvement framework is the assessment of how well the service is performing, and it is done by staff themselves, not managers. They gather the evidence, benchmarking is part of that, and they then do the comparisons with others to ask: how do we compare? 
That whole process is then scrutinised, to mixed extents, with different publics, because people engage because they are interested in a particular subject area, so taking that to the individuals who are particularly interested in that topic and that service is the next stage. In addition to that, in terms of the whole value of improvement and the improvement plan, we do further scrutiny on that with our local area network, which is all the scrutiny partners, who then help us to look at the improvement journey, and benchmarking is very much a part of that. I could also cite an example of where housing across the country is really embracing this. This is our own performance report that we put out to tenants and residents associations. The benchmarking data is there, but it drills down to far, far more detail. The detail that is in those reports has been developed by tenants and residents, so they have told local authorities the types of information that they feel are of value in demonstrating the worth or the performance of those particular services. Housing has probably done an awful lot more than some services, but it is actually a good model that we can adopt. Again, it links into the committee's question on that engagement process and the journey that we are on. My concern with all of this is that you can go through all that process in the area of housing. You may well have done all that process and be comparable with the other authorities that are in the group that your authority is in. However, if you went to the families involved and said that your housing has been improved, where is the evidence that the outcome has been changed? Even if the financial targets are similar and the process and delivery are similar, you are not materially changing the outcome for the people that it affects. Where is that information captured? In some of those areas, and housing is not the only example, the benchmarking families are actually doing that. 
I can use our own illustration of that, where we have compared our own performance with others. Where we do not perform, that is the question that we then go out and ask: why? Why is it that there are others who, on the face of those indicators, look as if they are working more effectively? They have a better outcome for people. That is where the analysis is. That is where the narrative is important, and the whole improvement journey. As Emily was saying, it is about taking ideas from others and adapting them to your own circumstances so that that improvement becomes much more embedded. Over the piece, in local authorities, the staff—this is where the workforce is so crucial, because it is not just about a certain few who understand those figures; it is about the workforce being committed to that journey of improvement and linking everything back to better outcomes for local people. I am a former member of the Public Audit Committee and was also a quality assurance manager at some point in my past career. For many years at the Audit Committee, convener, we were always interested in the question about follow-up from the many good recommendations that often come out of reports, benchmarks and documents. It is part of the circle that is not completed very often. Who does the follow-up? I would imagine that the benchmark reports tell you as much about what you have done and also look forward to what you might want to do. Who does that? Who does the reporting on whether your good recommendations, advice and so on are taken up by the authorities across the board, and how does the public see that? Who is going to take a crack at that first? Ms Leitch? On an individual basis, it goes back to embedding improvement as part of the culture. When the data comes out on an annual basis from the benchmarking, it is certainly something that local authorities will sit down with, with their administration and various scrutiny committees, and start to look at where we need to look.
We need to tie it back into each of the priorities for the local authority. At different stages, there will be particular priorities that they have committed to through their SOA or through their council plan. That is really where that level of, well, let's look at how others are doing, comes in. Attainment is a big one for my own local authority. There is a big piece of work looking at how others are improving attainment, particularly for people in disadvantaged areas, and West Dunbartonshire has done particularly well on that just now. We are looking to learn from them, not exclusively, but from the practices that they have recently introduced to try to equalise the attainment levels across their local authority. It happens very much at a local authority level. I suppose that people picked up last year's report, or two years ago or something, and said, oh, those are great recommendations. Were they done? How would they find out that information? Did they do it? One answer is that, in a way, if you look at our overview report, it is a report on data and what questions and issues the data poses. That is it. It does not actually make recommendations at all, but it would be improvement plans at council level that would follow from that. Improvement plans are tracked through the improvement process itself. They are scrutinised by audit and scrutiny committees within councils. There is a process whereby, if an improvement was agreed, are we delivering the improvement, and that would then be tracked over time. I am working with a council which, very similar to Angela's, is looking to see how it can improve and is prioritising the improvement of education attainment for kids from the most disadvantaged backgrounds in its area. That started out from the benchmarking framework. They were not improving as fast as others, and that led them to engage with a range of other councils.
What they have now got in place is a set of improvement plans with schools, with their community learning and development people, with their home-school link people and with their employability people to say, we are now going to shift this on sharply. That will be built into the performance appraisal of headteachers. They now have targets about what is expected of them with respect to the composition of their school. It will be used to judge the education department, and that will be routinely reported on. There are mechanisms to reassure you that, as you move from the high-level benchmark comparison to the question that it poses to the improvement action, that then gets built into what are really quite formalised processes for taking those improvements forward and reporting on them. We have heard from Ms Leitch about embedding improvement. We have all heard time and time again about continuous improvement. You have talked about putting things into appraisal systems for headteachers. Often what we have found is that front-line staff who are delivering the services, who often know what the improvements should be, are often the ones who are least involved in trying to get the outcomes that we all require. What is front-line staff's involvement in those benchmarks? Are they communicated to front-line staff? Are they asked for their opinions to try to improve? All we are hearing here is top-level stuff. I think that it does vary again, as you would expect, but, if I can point to the Western Isles, for example, they have a fairly innovative practice whereby they produce an annual report of their performance, again based on their... Ms Leitch, the Western Isles are kind of different in some regards because it is a very small council. Quite frankly, the chief executive of the Western Isles is likely to know the vast bulk of the staff in the Western Isles and is particularly approachable.
We have seen that for ourselves when we have been into smaller local authorities where the chief executive knows everyone and everyone knows the chief executive. What is the situation in the North Lanarkshires, Glasgows or Aberdeens in that regard? One of the key features that Colin touched on earlier is embedding the self-assessment process, the self-evaluation process, and where it works particularly well is where staff at all levels in the organisation are involved. That is where an understanding of performance is at the heart of what it does. We would certainly encourage that type of approach. We obviously do a lot of feedback to staff. Various local authorities adopt different approaches, such as Lean Six Sigma and the Vanguard approach. There is a variety of different mechanisms at an improvement level. There is equally a number of measures that are put into place to engage people. Glasgow, for example, has just gone through—I do not know if they have completed it yet—a programme of engagement that has included all the front-line staff. It has been a two-year programme. That is about trying to explain what the corporate objectives are but also to give people at the front line an opportunity to feed back on improvements and on how they think that things could be done differently. There is a variety of techniques being used. Willie? Sorry, I interrupted your question. No, thank you very much. I was very interested in that response as well. Can I just talk about the framework in general, Emily? In one of your answers, you talked about modernising elements of it in relation to the question from the convener. How do you see that developing with the community empowerment bill coming soon? Do you see the framework itself evolving significantly as a result of that? Do you think that the public at large will be able to influence and determine, in fact, what is in the framework? Also, what are the meaningful measures to them?
Will they be able to shape the frameworks rather than have the frameworks done to them by the local authorities? We are going to start with Ms Lynch. Yes, absolutely. I guess that there are two elements to that. One is that you talked about closing the loop on us, and we are very much focusing on how we use that information to understand what is happening in local communities and use that information to engage more effectively with local communities. As we are more successful in doing that, that will help us to shape and refine the measures and the way that we present them. We very much see that as a loop in terms of, once we have managed to effect more effective engagement, that will help us to understand what measures are important and how they should be shared. The other thing is to reiterate the point that Colin made earlier about the development of the community planning approach, because that very much has at its heart ensuring that that information helps us to understand what is happening with local communities, understand, for example, where significant inequalities exist across or within local communities, and engage more effectively with those local communities to understand what is behind that and also to help to ensure that they are shaping the solutions to that. Again, across the year ahead, as we develop that work, we plan to work with community planning partners and local communities to shape and develop that approach. The intention would be that local communities shape what measures are important to them and how that information should be shared and reported to them in a way that they can engage with and make sense of, so that is certainly an intention for the year ahead. Very briefly, I worry that we may have overhyped this framework to you, because some of the questions are implying that it does things that it does not do. It is no substitute for all the other good things that Angela talked about.
We need robust self-assessment built into the whole way that we run our councils. We need robust and properly resourced community engagement and development as part of what we do as councils. I think that what this framework does is what it says on the tin. It does not do anything more than what it says on the tin. It is one tool, but it is no substitute at all and needs to be linked to all the other improvement tools and mechanisms that councils use. I think that your points are utterly germane. Is the underlying improvement planning sufficiently robust? Is it sufficiently engaged with the communities on whose behalf we are trying to improve things? Do they understand what we are trying to do, and so on? Just to reassure you, again, every council has other mechanisms for collecting data from communities as part of service reviews and so on. A final point, in line with our colleagues in Audit Scotland: clearly, councils publish this data and have to, under a direction from the Accounts Commission for Scotland, and their local auditors will look at whether they respond to the data that they have published. If you are off the mark with your family in this respect, the auditors are perfectly entitled to ask what you are going to do about it. There is also the statutory audit function, which is not just a free-floating, voluntary thing; it is within a framework where councils are statutorily audited for best value and improvement as a routine part of how they are dealt with. I think that we understand the linkages between this framework and all the other improvement bits and pieces that need to go on. The vast bulk of the committee, bar Mr Buchanan, have been councillors at some, and very recent, times, and some still, if I remember rightly. I am corrected. We do understand that. There was, I have to say, a lot of hype about this framework. That is a fact. One of the key things for us is to make sure that hype becomes improvement.
That is one of the reasons why we are going to come back year on year to this framework. It is about the issue of families. When I was a local councillor, I can well remember the family grouping that Searshire was part of. From time to time, I did wonder why we could not get a comparator between an activity in Searshire and an activity in Glasgow. We were just never part of the Glasgow family because of its size. Is this system developed enough now to allow either elected members or officials or the public to choose what comparator families they would wish to group? I know that you would need some software to do that rather than a paper report. Can we do that kind of thing? Can we look at different deprivation indexes around Scotland and group them together, for example, and explore comparators for ourselves? Ms Lynch? Absolutely. That is one of the priorities that we have identified in terms of how to improve the framework, if you like, going forward. It is about refining the family groups. We had to make a start, and I think that the original groups were agreed as a starting point in terms of providing a practical structure, but also in terms of providing some similarity in the challenges that those groups of councils faced. However, as families have started to work together, we have identified that the family groups are not always right for all the subjects that we are looking at. Colin might want to say a bit more about education, for example, because that is one that we have been looking at. We would certainly be keen to work again with the councils to identify what better groupings or ways of arranging the groupings would work. Ultimately, it is simply a structure to support councils to come together, to share and to learn. Therefore, I think that there is an opportunity to make sure that the groups are refined so that they are more appropriate.
Very briefly, there is a visualisation tool—I think that I am using the right language here, I might not be—that would allow you to go on and explore and make comparisons between any councils that you would like to, across the whole range of indicators. If one wanted to make up one's own sense of what a family should be for certain purposes, Emily's point about education is an interesting one. It is almost the point that the chair made at the beginning. Are we benchmarking our past or are we trying to move to our future? If you put every council with quite a high level of deprivation together and say, well, we know that affects education, are you building an element of a self-fulfilling prophecy into the future, in that we are saying that we do not really expect that people from deprived backgrounds will perform as well as others? Now the challenge is how we help them to, but we should not simply create families and stick with them in a way that implies that, if you have deprivation, your education results should necessarily be worse than others'. I think that one of the interesting things that has happened—Angela Leitch alluded to West Dunbartonshire—is that there are other councils that have made spectacular improvements over the trend time that we have, the four years, in the performance of kids from disadvantaged backgrounds, not all of which would be regarded as very disadvantaged councils, but they are doing very well with disadvantaged communities. There are things there for other bigger councils with a lot of disadvantage to learn from them, so you are absolutely right that we should not be too rigid about the family boundary. I should say that I am a big fan of this approach. When the first benchmarking information came out, I was a councillor, and I remember being so excited trying to plough away through what was there then.
I think that, to John Wilson's point, benchmarking information can empower councillors to an extent that I had never felt empowered before, by being able to look right across the country. One question is about how it was in the early days, when a lot of the data was raw. I was interpreting it one way and I was being told that it was different, but that information, I assume, has improved, so it would be good to get an update on that. My second point is that, if you look at the view that was received from Maryhill and Summerston Community Council, they say that councils offer a wide range of performance indicators in comparison, on successful planning appeals and so on, and they go on to highlight the indicator of how clean is my street. They really say that they would need to get more of that detail, and ask how they could get that detail down to the community council level. Back to what you said, Colin: I accept that it is exactly what it says on the tin. It is not about the Improvement Service pulling this together for every community, but have you done analysis of how councils are using the benchmarking, both in terms of how they are using it to improve services and, to Willie Coffey's point about the community empowerment bill, how they are using it to get that information down to a meaningful level within communities, so that communities can compare between each other but also more widely against how Scotland is performing? Ms Leitch. Thank you for that. I think that that is absolutely key. There is a real push towards what we are calling place-based approaches to service delivery, rather than just assuming that one size fits all regardless of the size of the local authority. I and others across the country have really been using the benchmarking data, expanding it and really looking at what it means in local areas.
Most local authorities, I think, have some type of fora, whether it is a partnership, an area committee or the like. On the basis of that, the detail is now broken down much more, perhaps not to community council level, I have to say, but certainly to ward or area level. This is an example of one that we do, and our area partnership has been furnished with this. It is on the basis of this data that they are now setting the priorities for their local area. It is taking the benchmarking, using it as the can-opener and then, with communities, really distilling it down so that they can make sound judgments on where they would like to see our services being focused. It would be good to get a copy of that, convener. Sorry. It would be good to get a copy of that. If we could, that would be brilliant. I think that you are absolutely right. We are presenting data at a very high level for the whole of a council. Clearly, for example, if you look at the performance of children in education, whether it is S4 or S5, there is an average there for the council as a whole. The variation around the average will be staggering. You need to take that right down to community level. We have been doing some work with community councils across the last year. I do not think that they are well supported presently, but I think that there is a lot that we can do with routine public domain statistics to get them down to that level. We have created a thing, to which we shall happily send you the link, called Viewstat, which would allow you to take any public statistics down to the level of communities of 600 to 1,000 people. The difficulty is that that will not always tidily correspond to a community council's sense of identity as to what a community is. Those are geographies that are used for public statistics, but at least it would allow them to take all the educational results in an authority and say, okay, what happened in the two streets next door to me, then?
You can actually pull that up and look at that pattern over time. We put 10 years' worth of data into this thing and said, look, if you want to look at trends, you can look at trends. It is by no means perfect. Its design standard was that it needed to work for someone who was at least able to book a Ryanair ticket. If you can book a Ryanair ticket, you can use the damn thing. It is relatively simple, but it does at least allow much more ready access to public data. That sits below the benchmarking data. I think that your point is a valid one. That drilled-down bit really does matter, because the real action is happening in quite small communities. Lives are varying; they are getting better, they are getting worse, whatever. We need to be able to get our analysis down to that level. Angela is right as well. Most councils now have mechanisms for community planning, increasingly down to neighbourhood level, and Edinburgh would be a good example, and even to sub-neighbourhood level in some cases. We need to be linking the benchmarking at one end to that pattern of engagement and working with communities at the other end to get value out of the community empowerment bill. It will be helpful, though, I think, because it will give communities a right to challenge us on whether we furnish them with enough information. In a way, one of the rights that the bill confers on communities is to say that we are not achieving the outcomes that we want in terms of how informed they are. Public authorities, whether they are health boards, councils or whoever, will have to look to make sure that they are satisfying communities in terms of what they feel they need to know and what they feel they want to know. There is an important role for the bill in driving forward an agenda about an informed public alongside empowered communities. Alex, do you want to come back? No, that's fine. Thank you. Cara Hilton, please. Thank you, convener.
It ties into this work about how we better meet the needs of every community, and goes back to the evidence that we received. I am looking at the evidence from Clear Fife. They have said that there is a democratic deficit in a lot of communities. There just isn't really the opportunity for residents to be consulted. They are wondering whether it would be a good idea to develop an indicator on the levels of consultation and the quality of local authority consultation and, indeed, the activity of local councillors. I wonder if the panel would like to comment on that. They are right that we use the household survey data to get a measure of residents' experience of local services. The Scottish household survey is an all-Scotland survey, so it doesn't give us that level of local detail, and it's thin. In any given year, the chances of you being involved in that survey are utterly negligible. If that's what we mean by a democratic deficit, there's no question that it exists. Councils themselves then have things like citizens panels, resident surveys and so on. However, I would accept that we could and should do better here. We've tried to outline costs just now, because we need to get board approval to do this. If we wanted to power up the household survey to allow a lot more people to be involved (it is a very high-quality survey, but it is quite a narrow base of about 12,000 people across Scotland) and get it up to a decent level from a community point of view, so that we can disaggregate the data a bit, we're talking probably £1 million to £1.5 million. We're just never having that. That's the problem. Many councils, many local authorities, community planning partnerships and other bodies are carrying out very similar surveys, and pretty comprehensive ones, on a regular basis. Why can there not be co-operation in bringing those things together rather than reinventing the wheel and creating something much bigger?
There are councils with citizens juries that are regularly in contact with 1,000-plus people in their particular area. Why can there not be co-operation in terms of that? I was picking up on the point that was raised about getting down to community level. If you have a citizens jury of 1,000 people for Glasgow, that does not get me to community level. That's a statement about Glasgow as a whole; it's not a statement about any community in Glasgow, and indeed the jury will be weighted to represent Glasgow's population as a whole, not the population of any community in Glasgow. I took the question that I was being asked to be, how do we get this much closer to actual communities and allow many more of them to express their views about public services? A sample of 1,000 annually, which is perfectly decent from a representative statistical point of view, is a pretty thin engagement with the almost 500,000 people who live in Glasgow. If that was all you were doing—I take your point—there's a range of things happening. Our problem is that they're not happening on any standard basis, so if you said to me, could I benchmark that across Scotland? No, I couldn't, because people are using completely different instruments; they've evolved locally and they suit local purposes, members' priorities and so on. The merit of the household survey is that it's done on a standard basis across the whole of Scotland and is already part of the Government's commissioning. The idea was, could we piggyback on that? We have looked at linking up the work that people already do at local level. One of the kickbacks that we get there is that, if people have got an instrument that works particularly well for their communities, why sacrifice that so that you can get a standard instrument that then allows you to take a measurement across Scotland? They are saying, don't interfere with local practice.
If the local practice is working to engage communities, we're not sacrificing that so that you can get a better measurement. In a way, there's always a tension between the ways in which you could get to that, but my view is sympathetic. I do think that we need to get much closer to communities, both in terms of engagement and information and in making sure that we're aware of the different views of different communities. Thank you, Cara. That's extremely helpful. I'm going to take some quickfire questions, because there's some ground that we haven't covered. Quickfire questions and quickfire answers, hopefully. John Wilson, please. I welcome the point about the household surveys. The difficulty is that I remember, over 10 years ago, having the same debate with Scottish Government officials regarding extending the household survey, so best of luck with that one. The issue is the visualisation tool, Mr Mair, that you referred to, and the Viewstat system. I know that, when I have tried to get on to some local government websites, I have found it very difficult to navigate. I think that I can book an air ticket (a Ryanair ticket or an easyJet ticket), but when I go on to some local government websites, they are tortuous in terms of trying to get some information. How can we get improvements in that area so that people can view the stats that you're referring to at a local level? Thank you, Mr Wilson. A very quickfire answer there, please, Mr Mair. Very quickfire. If you go on to the benchmarking website, we've done dashboards for each council, which are dead simple but allow you to make comparisons over time and comparisons between councils. It is literally a dashboard system, and we hope that that makes it more accessible. What we would welcome, though, is your feedback. We get feedback from members of the public who use it telling us what they like and, quite forcefully, what they don't like about the way that we've designed things, but we'll welcome your contribution as well.
All councils also have similar dashboards now. We're asking them to put the dashboards out on their websites, so that it's easier for the public to get the information quickly. You know that the committee spent some time on community empowerment, and we've submitted a report about trying to engage with local communities. The responses that we got in our consultation with the public clearly showed that community councils felt that they were not engaged with and that they were not aware of the benchmarking process. How can we do this better? I think that there's perhaps a need to differentiate between the terminology and the information. The notion of benchmarking is a bit like community planning; I'm not sure that all community groups really associate themselves with that, but when you talk about trying to get engagement or giving them information, that's quite a different matter. I think that there is a groundswell of trying to get relevant information out to appropriate individuals without swamping people, particularly community councils. The feedback that I have is that it's the easiest thing in the world for some of our services to throw things out to community councils. They can be swamped with information, and it's really difficult for them to differentiate between what's really important and of value to them and what they can ignore. There is a bit of work that we need to do to make sure that we push relevant information to them at appropriate points. We've heard previously about improvements to Scottish Water after it brought in a new benchmarking regime and was comparing itself with other bodies elsewhere. To what extent are you now using external comparators in the rest of these islands or in the rest of the world? Ms Lynch? Again, it's something that I think that you'll probably be able to give more detail on, Colin, but in terms of the international comparisons that we look at— He looked at you.
I know—I'm just realising that I do know—I remember speaking to a couple of colleagues before about what, particularly for Glasgow, for example, they would like to do within this approach, which is to include more detail on cities across the UK. We have a steering group, and something that we're working on is to look at what information is available across the cities in the UK that we can actually include. I think that it will be at the drill-down stage and the family group stage rather than within the framework itself, but it would stimulate and support those discussions, so that would be broader across the UK. I hope that that will be looked at because, certainly, the evidence that we've heard at the committee previously showed that the vast bulk of the improvement that took place in Scottish Water came when it started comparing what it was doing with other bodies outwith the UK. I hope that that will be done. It's certainly very difficult, as we've heard before, to put certain councils into family groupings because of what they are, so I hope that you will look at that. To what extent do you feel that the general public and stakeholders are using the framework to challenge local authorities? Ms Leitch, do you think that it's being used to challenge local authorities? I think that it's variable. We now have three years' data, as you can see, and four for some, where we're starting to see trends. Those trends are now starting to become much more evident in terms of, is it practice, is it a one-off and what could we do differently. I do think that putting the information out through the public performance reports is one thing, and there will be certain groups that will be particularly interested in that, but, if we really want true engagement, we need to distil it a bit more so that it's relevant for the particular groups who want to engage with us on different subject matters.
Obviously, the committee has been looking a great deal at community planning over the past few years, and we've heard previously that certain targets that are put in place at councils may, in terms of the single outcome agreement, come at a completely different angle from targets that are put in place in the health service. A number of folk have said in response that we should be looking at the outcomes via the single outcome agreement frameworks, to include council and health authority outcomes, rather than just taking measures on local authority business alone. Is there a view on that? There is. There is a view that we need to do both. So, we need to have a framework within which performance against outcomes in SOAs is consistently measured and publicly available. I think that that doesn't mean that we still wouldn't want to do service-level benchmarking on the cost efficiency and effectiveness of the way in which different councils deliver services as well. The two things are related, and they will allow us to explore that relationship. So, there is a significant bit of work going on, which Emily is leading and which is developing that outcome approach, because I think that the point is well made by many of your correspondents. That work is on-going, and there are major areas where we are, if you like, shamefully short of clarity about outcomes at all. So, when we talk about outcomes for older people, we are still struggling to put any coherent sense around what we imagine those outcomes to be. We use words like dignity, choice and so on, but what would that mean and how would you show that it was happening, if you like? There is a lot of work going on in that arena, and we will report back to you, if you would welcome that, on progress on that dimension of the work as well as the dimension that we have reported on today.
Obviously, we will be back next year anyway, but it would be really interesting for us to continue to be apprised of any changes that are made. Obviously, things like health and social care integration may mean that you will put different measures in place once that jigsaw is complete. As the community empowerment bill kicks in after it is passed, I think that the level of public engagement around certain of those areas will grow. Again, you may look at the different measures that you are using at that point. So, if you could keep us apprised of developments rather than us waiting until next year, that would be useful. Obviously, there were a number of requests for information that we would be grateful to receive. I thank you very much for your evidence today. I suspend the meeting, and we now move into private session.