Welcome to the 22nd meeting of the committee in 2014. Everyone present is asked to switch off mobile phones and other electronic devices, as they affect the broadcasting system. Some committee members will refer to tablets during the course of the meeting, as meeting papers are provided in digital format. Agenda item 1 is to consider whether to take item 4 in private. Are we all agreed? Thank you. Agenda item 2 is an oral evidence session on the local government benchmarking framework. We have three panels giving evidence at this morning's session. I would like to welcome our first panel: David Martin, who is giving evidence on behalf of Solace Scotland, and Mark McIntyre, director of governance and performance management at the Improvement Service. Welcome and good morning, gentlemen. Do you have any opening remarks that you would like to make?

Just very briefly, convener, thank you again for the opportunity to come and talk to the committee about its interest in the local government benchmarking and improvement project. I think that, when we last spoke, with Ronnie Hines, about 18 months or so ago, the committee were particularly interested in the progress that had been made. I think that a critical test for you was whether benchmarking information was being used to promote improvement across Scottish councils and whether senior people like myself and politicians at local government level were actively interested in it. I hope that what you see today, through your three sessions, convinces you that we are embedding improvement through the local government benchmarking framework. It is beginning to get significant traction in driving change for the better in local government.

Is the framework fully embedded in all 32 local authorities now?

It is clearly fully embedded in all local authorities. Every council is participating actively in family groups in various ways to look at improvement.
All 32 chief executives receive regular reports at Solace on progress on the framework. The Accounts Commission, as you know, is very actively interested in how we are using it and continually challenges and scrutinises progress. To my knowledge, most, if not all, councils are regularly reporting the key data on performance in their authority to scrutiny committees or full councils. Importantly, that is getting beyond narrow league tabling and trying to understand what is behind the information, so that service improvements can be driven forward.

I would like to echo what David Martin said. Over the past year or so, since we were last in speaking to you, we have continued to progress some of the technical issues about improving the data, and all 32 councils have taken part in that. All 32 councils have also been full participants in the family groups, which we may go into in a bit more detail; over 100 officers, over a sequence of meetings, have been part of that exercise. There is ongoing work within the programme around, for example, how we strengthen public accountability. Again, all 32 councils have taken part in developing a common reporting tool around that, which will feature in their public performance reporting towards the end of the year. The level of engagement, and the sustaining of that engagement, has been, I think, quite significant from local government, and not just from the corporate performance colleagues with whom we began the project but now including service colleagues, who have been having the family group meetings and so forth.

Traction has really taken hold: all 32 councils are using it, reporting and improving. How are we ensuring that best practice is being exported? Obviously, one of the key reasons for all of that is to ensure that best practice can be exported from one authority to another. Is that working?

There are a couple of things that are happening around that.
There are the formal elements, if you like, within the benchmarking work itself. As you know, we have set up family groups; we agreed those family groups last year with the councils. We decided to pilot two areas with the councils. The project board that we now have overseeing the project agreed the themes: we looked at positive destinations for children and road maintenance. We pulled all 32 councils together in their family groups, and they interrogated the data for their family group. Is that data accurate? The second stage was then to ask what supplementary information we needed to make sense of that data. Then they went into the improvement exchange: who is doing what? A range of things came through from that. We have captured all of that, and we will be publishing a report on both of those groups in the next couple of weeks. We now also have a forward programme of activities over the next two years for those groups. That is the formal side.

Again, as you will hear over the course of this morning, individual councils are also doing work over and above that. For example, I have been doing some work with Dumfries and Galloway Council supporting a strategic service review of roads and infrastructure services. They used the benchmarking information to guide some best practice visits that they took forward themselves as part of their own internal service improvement. Those informal activities are also happening quite widely across councils, and the benchmarking information feeds into that.

You talked about an improvement exchange there. Do you want to expand on that? What is this improvement exchange?

Again, these are the kinds of things that we picked up through the family groups, and that you will see in the final report. When they were looking at, for example, positive destinations, some councils take their positive destination data and are now starting to pursue it beyond the first year after school.
When a young person leaves school, we follow them for a year: are they going into higher education, university, employment and so forth? Some councils have started to try to track beyond that and to tie it back into their strategy for dealing with youth unemployment. Other councils were not doing that, so there was a lot of discussion around how they can use that information for broader purposes, to keep those kids active going forward. It was that kind of day-to-day practice exchange that was taking place. There was also some good discussion by some of the councils that gave specific support for positive destinations to children in the schools themselves, dedicated staff to that and brought parents into the school as part of the discussion about where the child wished to go. Again, that kind of exchange was there within the family groups, and a number of councils have said that they are going to pick that up and look to replicate that type of practice within their own authority going forward. It is that type of thing that has been happening as a consequence. The benchmarking data is the bit that is driving them towards that conversation and that exchange.

A supplementary from Stuart McMillan, first of all, please.

Thank you. Actually, it was on the previous issue regarding the councils, if that is okay. I just want to seek some clarification on what you mentioned about the engagement with councils. Is that all councils, including the opposition councillors within local authorities?

Absolutely. You will see that the overview report that was issued a few months ago was signed by, obviously, the president of COSLA and Solace. In practice, there is interest at the strategic level, through COSLA leaders, in how the process is going. The richness of it, as the committee has said before, is in how you drill into performance at a local level in an authority. Can you then compare, learn and improve as a result of that?
Certainly, in my own authority, and it is typical of all councils, the information would of course go to either a leadership board or one or other form of scrutiny committee. It would be taken to a cabinet, if that was the form of governance in a particular local authority. In some authorities, there is an annual review or annual development seminar drilling into some of the information. I know that one or two other colleagues have done a similar thing and tried to use some of the information in their members' development days. I think that there is a range of formal and informal opportunities, but the question is: is council performance scrutinised cross-party and cross-council on a regular basis? Absolutely. That is good governance, and I think that the data is used in that way.

Mark McDonald, do you have a supplementary?

I was just looking through the national benchmarking overview report, and I seem to recall that, when we last discussed this, I was a substitute on the committee at the time. I raised a point about what was being measured and what the balance was between input measurement and outcome measurement, because inputs tell us how much is being spent, but that does not always tell the story of what quality of service is being delivered off the back of it. While it may look good on paper to be spending a large sum of money on a service, it may be that the quality of that service is not delivering bang for the buck. Have we got the balance right between looking at the inputs and the outcomes, and not just looking at them but drawing the correlation between the two to inform what councils are doing with their money?

I think that that was one of the main drivers behind the whole benchmarking agenda and journey.
I think that the Accounts Commission, in moving from the SPIs, which were predominantly throughput or input measures, to embracing the idea of a mix of input, activity and outcome measures, has taken a real step forward, and it has helped us to drive forward that agenda. You will see from the range of indicators that we look at that there is a mix, because both are relevant. It is still important to know the cost per particular activity and to compare that; I think that you need to know what you are getting for your money. There are a number of examples in there on social care and education, where we can compare cost information on primary and secondary education but also look at the attainment and achievement of young people coming through. When you put the two together, you get a much richer and deeper understanding of what is actually going on in a particular community and in a particular range of services.

That is an evolving journey, because one of the things that we are clearly learning as we go forward is that the world of outcomes and prevention is what we want to get to. The data and indicators that we collect are reasonable but not a full fit for that, so we are continuing to evolve the information so that it goes right across the range of local government functions. A good example of that is in the current overview report: we now have information on economic development, the reach of local government economic development services and the impact that they have on employment and unemployment. We are trying to get both of those things. I think that part of the reason why politicians at local government level are very interested in this is that they are concerned about outcomes. They want to know what difference services make to communities, so we will continue on that journey. To answer your question directly, there is a balance between input, throughput and outcome measures.
As you know, we said last year, certainly the first time that we met you, that there would be an ongoing process of continual improvement of the indicators themselves, and that has taken place. Last year, we used for the first time a net costing for waste and recycling services, because it was felt that that was a more accurate way of capturing the interrelationship between waste management and collection and the recycling agenda that all councils are seeking to progress. We adopted that, and it gives us new insights into some of the better practices in recycling across councils. For all of those indicators, we have a knowledge hub, which is a private website, for want of a better term, with something in the region of 350 members from across all 32 councils. We undertake the analytical work that underpins the report, and that is shared with all 32 councils, so we look at the relationships across the various indicators. For example, if we are looking at children's services, high costs and high spend do not on their own indicate good performance, so we will look for the relationships within the data itself and supplement that with additional analysis to try to shine a light on that. That is shared with all 32 councils and then feeds into those discussions at the family groups. It is really starting to take hold at that level, and there is a role for us and the councils to continue to evolve that.

I certainly remember, from my time on Aberdeen City Council, that there were your statutory performance indicators, your key performance indicators, and then there would be ad hoc measurements that would be requested by councillors. To what extent do you analyse or get feedback from councils about the things that their members are looking to measure?
Also, what assessment do you make of the value attached to some of the things that are being measured, and what recommendations do you offer on that? Finally, how do we ensure that what is being put in front of councillors goes beyond their own council? We talk a lot about the cohorts in terms of the benchmarking, and often the reports that come before councillors look purely at their own council, for the valid reason that councillors are accountable for what their council is doing. However, how often are councillors being given reports that say, "This is how our council is performing, these are the performance measures for our council, and here are the performance measures for our cohort and how we compare on the benchmarking"? How often is that information being put in front of councillors, not just the information that is relevant solely to their council?

I will try to answer the last question first, if I may, convener. The annual public performance report ensures that benchmarking is included as part of that report, so it is not just the elected members who are getting it; the whole of the community can access the information for comparison purposes. Indeed, the Accounts Commission and Audit Scotland recently produced information to all local authorities on assessing their ability and competence in public performance reporting, and the extent to which benchmarking is out there in public was a major indicator that was being measured. I think that elected members clearly want to know what is going on in their own communities, but my own council in Renfrewshire is part of a city region with eight other authorities. My leader and my members are just as interested in knowing what is going on in the city of Glasgow as they are in Renfrewshire, because clearly members are driven to try to ensure that their local services are performing well.
My own view is that the benchmarking project and the approach that we have taken have raised the bar for us all. We are now much more aware of what is going on elsewhere and, indeed, much more interested in trying to, frankly, pinch good ideas if they can be transferred into a particular local authority context. You are seeing that evolving, as Mark mentioned, in the best practice approach on school leaver destinations and on roads. There is another tranche of activity now under way on, for example, sport: how do we improve the participation of young girls in sport? On waste management, how do we use education and awareness to improve recycling rates? On looked-after children, stability of placements has been a concern in local government for some time; how do we spread best practice there? That has come from the confidence that has been built in the first round of benchmarking activity. I hope that you can see that we are beginning to get a lot more traction on this and to use it for improvement.

Again, I cannot speak in detail about what has happened in all 32 councils; you will hear some of that practice later today from some of the authorities that you have coming in. What we have been more involved in is general support. For example, just before the summer we ran two master classes for elected members across councils. I think that over 50 members attended, from different councils and, again, both opposition and administration members, to talk through with them what this information provides them with as elected members and how they might engage with it and use it internally within councils. Following on from that, I was invited to both Moray Council and Perth and Kinross Council to talk to their members exclusively about how they could equally use that information within the council. We frequently get those types of requests coming in, and we have a couple scheduled for the autumn.
There are a couple of other authorities where we will go in and have that internal conversation with the full council, to say that this information is now there and here is what we think adds value for you as an elected member, and then have that discussion with them. We are more at the end of general support, but I think that you will hear over the course of this morning, in a bit more detail and directly from them, how some councils are using it.

John Wilson, please.

One of the issues about benchmarking is the data that is collected, who collects it and whether the data that is being inputted into the system is accurate, up to date and relevant. How assured are you that the data that is being collected is comparable across the local authorities, particularly within the families and the structures that have been established? Is the same data being collected, so that we can get an accurate comparison across the families of what is being delivered and how it is being delivered?

We are as assured as we can be that the data is accurate and comparable across all 32 councils. We have spoken to you in the past about some of the mechanisms in play to do that. We have had ongoing work with directors of finance and Scottish Government colleagues around the finance data, for example, to ensure its comparability and better standardisation within the local financial return. We are pretty confident that that information is accurate. What we then do at family group level is to draw from that general source, so we do not gather information additional to the core information that we have. What we did see across the family groups over the course of the last few months was that they wanted some additional information with which to interrogate that data. Because my team played a role within the family groups, we ensured that that was consistent across all four groups, which were looking at looked-after children and roads.
That was agreed with the participants at each of the four family groups that were looking at each of those two themes. We were there to make sure that, if one group thought that it would be good to look at X, the other groups were also looking at X simultaneously; we were there to support that process. Again, as we have said in the past, it is an ongoing process with the data itself: keeping it relevant and keeping it up to date remains a challenge. For example, this year we have agreed that, in order to sync better with the local public performance reporting cycle in councils, we are going to publish the report early. The last two reports were published in February or March of each year; this year, we are going to publish in November. We have work ongoing with directors of finance to ensure that we get the financial data to help to populate the indicators, but there are going to be other areas where we are going to be behind on that, because we are not the data owners. Where the councils are the data owners, we are getting good access. That is just an ongoing challenge; it is part of having to draw from such a wide range of data sources for the project that you end up in those scenarios. It is never perfect, but I think that it is as comparable and as strong as we can get it at this point in time.

I think that the question is very pertinent, because you have to have assurance that the data is accurate. Certainly, the Accounts Commission has worked hard on that, and we have external auditing of all the data. I think that there is also a maturity point. As members will recall, we spent the first year of the local government benchmarking approach in particular really cleaning up the data and making sure that, if there were difficulties with it, they were at the margins and did not stop you drawing conclusions and drilling into improvement.
Of course, improvement is what the local government benchmarking framework is essentially about. I made the point about maturity. The conversations that take place in family groups, and that take place in my council and, I know, in others, are about what we can do to improve services. We are not hung up on the comparability of the data; it is reasonable for members, to be absolutely clear, to assume that the information is accurate and correct. I have seen the debate move on much more to: how are we doing in our local authority? Why, when we have opened up this particular issue, is our council's performance not where we would want it to be? Who can we talk to to try to address some of those challenges? That is as opposed to asking what is different here because we cannot compare apples and pears. The committee can take comfort that the data, as Mark says, is good enough, and we will continue to improve the quality of it as we move forward.

That is an interesting response in relation to the apples and pears comment. In relation to the benchmarking, you have mentioned looked-after children and roads as the two areas that have been examined in detail. However, what we are being asked to do is look at a benchmarking process across a wide range of services. Can a like-for-like comparison be made between local authorities in the same family, or are we going to get council officials, through the financial accounting process, saying to elected members or to the public that it is different in Glasgow from how it is in Edinburgh, because although we spend an equivalent amount of money on that service, we do it in a different way, we account for it in a way that is slightly different, but we do it better?
How do elected members then home in on that and understand what is being delivered, how it is being delivered and why it is being delivered by a particular local authority, if they are being asked to compare it with other family group members?

Through you, convener, I think that it is the other way round. I think that the data is comparable, and we have taken steps to try to ensure that it is comparable. Members rightly might ask why performance differs, and the answer to that should not be that the information is not comparable and that we are not comparing like with like. The answer might be that we have different socioeconomic circumstances, or that the council is pursuing a different priority or a different level of policy in our area, and that is why the information draws those conclusions. I think that it actually improves transparency, if that is the case, because elected members are then able to decide whether their relative priorities ought to change, whether there is genuinely a lack of resource in a particular area, whether there are some inefficiencies, or some variation of any of those things. What we cannot do with the local government benchmarking framework information is to suggest that somehow it is different here because our approach to collecting the information justifies that. It is about standardisation; it is about being clear and transparent about why levels of service might be different and priorities might be different in local authority areas, and about allowing elected members, and the public for that matter, to take a view on that. I think that we have passed the stage in the benchmarking project where the data is the problem. It is now a question of what the relative policy priorities are and how we can learn from, as the convener said in his introductory remarks, good practice and best practice across Scotland. It makes it much harder to hide.
No. I would echo, I think, what David was saying there. In the first year or so of the project, we did spend a lot of time, as David said, trying to make sure that the data was clean, tidy and comparable. Over the last year, there has been virtually no discussion about that. The discussion has been around other supplementary pieces of information that we can use, now that we have standardised data, to start to understand performance. Again, as David said, that gets towards issues of policy choice and priority for councils, and differences in their socioeconomic make-up, and then into the performance agenda itself. It has not featured much as a discussion over the last period of time. We are relatively confident, and councils are relatively confident, that the data is as good as we need it to be for the purposes that we identified, which are always about improvement.

The next issue that I want to raise is about elected members and their understanding of this whole process and why it has been done. Seventeen months ago to the day, I asked Ronnie Hines about the situation with elected members, and I mean all elected members, understanding that. What we have had this morning has been discussions with either executive committees, the cabinet or senior councillors within local authorities. Mr McIntyre referred to a master class with 50 members. We have 1,223 elected members in Scotland. Not all of those members are members of a group or a party; some councils, in fact, have significant numbers of independent members, and some of the more rural authorities are made up of independent councillors. How do we ensure that all elected members understand this process? If we cannot get all 1,223 elected members to understand it, how do we expect the public to understand it?

When we publish, for example, the overview report, a link is sent to all 1,223 councillors in Scotland to alert them and to try to draw them to the website itself.
We also have a range of other communications with elected members as an organisation. Again, elements coming out of the benchmarking work, when they are relevant, will be broadcast to councillors electronically. In addition to that, we have a continuous professional development programme with which we support councils; I think that some 20 to 23 councils are using the CPD framework.

Can I stop you there? We talked about CPD last week with the Accounts Commission, and it clearly indicated that, in many cases, there was no continuous professional development going on.

I can get you the details of where that takes place through the programme with which we support councils. Again, within that, information is made available for benchmarking purposes.

The point is that it was made quite clear by the Accounts Commission last week that a number of elected members, by the sounds of it a fairly substantial number, are not taking part in any CPD programmes.

That may well be true but, again, what we have tried to do as an organisation working with the councils is to make it available. You cannot force politicians to take training, but we certainly make it available and we certainly encourage members to do so. We work with member services colleagues in councils to support elected members and to try to build that culture of engagement, development and training with them. Opportunities are made available through that, including elements from the benchmarking work. As I say, specifically on this, we alert all councillors electronically that those things exist, and if anyone requests some support or information from ourselves, we will happily furnish that to them. We keep trying; I cannot guarantee success in all 1,223 cases.

I think, again, that Mr Wilson's question is about reach and trying to ensure that across all 1,223 councillors there is an active interest in this.
In addition to the points that Mark has made, my experience is that members are never reluctant to scrutinise if information is provided to them in a format that they can get a hold of and drill into. That is certainly my experience as a result of the local government benchmarking framework. About 70 or 80 per cent of local government spend is broadly covered by the range of indicators, which we have broken down into children's services, social work, environmental, culture and so on, as you know. Whether there is a committee structure or a cabinet structure in the local authority concerned, portfolio holders, opposition members who are involved in scrutiny, and committee chairs and their shadows or opposite numbers are getting information on their service area. Increasingly, the data is being reported in that way in local authorities, not just on an annual or periodic council-wide basis. The other aspect of it, of course, is that the press and the media are very interested in that information, understandably, and that gets members interested in the information. I think that there is a range of things happening that mean that all councillors are actively interested in what the information is telling them about their particular community and the services that they are either running through an administration or scrutinising on an opposition basis. I am pretty confident that that is going to continue. That is in addition to, of course, the council-wide scrutiny committee and audit committee activity that goes on. Although we can always improve, my sense is that all elected members are both aware of the benchmarking data and pretty actively interested in what it tells them about their constituency or their particular ward, and I expect to see that continue to grow.

John.
I thank the panel members for their responses. Could I, like the convener, draw your attention to the official report of last week's meeting, in which we raised concerns about the level of continuous professional development and the level of information being provided to individual councillors, either through training or through the processes for dissemination of information, just so that we are clear as a committee that elected members can and do understand what is being presented to them. Most importantly, they can then take that and convince the public that they know what is happening within their local authorities.

I was not part of this committee in March 2013, but I have read the report from 2013, and I wanted to ask you about the benchmarking families. You mentioned, Mr Martin, East Renfrewshire and Glasgow. What exactly do you mean by benchmarking families? I understand the term, but I am not sure how you have developed it.

The idea was that very disparate local authorities presented a challenge at the start of the project: Clackmannanshire, Highland, the City of Glasgow and Renfrewshire are all different. What we have tried to do is encourage each local authority to compare itself, first of all, with whomever it wishes, because good practice is not necessarily a function of scale, of rurality or of urban councils. It is not just within the benchmarking families that the benchmarking work goes on. As I have said to the committee before, if I see good practice on looked-after and accommodated children, for example, in Moray, I am going to talk to Moray Council about that, and you need to be reassured that that happens as a matter of routine. However, the idea of the benchmarking families was that councils can have common cause or a similar set of circumstances and that, therefore, comparison within those benchmarking families is a good thing to do in addition to the generality of engagement across local government.
We had some significant debate about that in COSLA and in Solace at the time, and the benchmarking families that we now have reflect that degree of common interest. Mark can say a little more, as we are getting more sophisticated about that. It is an option to work within a group of local authorities where you are going to learn more by comparing performance, for example, on roads or educational attainment, because the councils that are in that family are experiencing similar challenges. A couple of things to add to that. We did a piece of analysis just over a year, 18 months ago, with the councils, to look at, if we were going to group councils, on what basis you would group them. What the analysis showed was that, when we looked at people-focused services such as education services, the key factors that were of importance in understanding performance related to socio-economics and deprivation. We grouped councils that were close on socio-economics and deprivation together to have discussions about people-based services. For more physical services such as roads, population dispersal came through as one of the key factors in what seemed to be explaining the differences, so we used that as a basis for grouping those families. However, there are also other family group arrangements, if you like, across councils as well. Again, education had good practice already established in terms of sharing information across councils, and we again supported some of that work itself. As David said, the families are, if you like, the minimum, but there are additional factors around that, and there is nothing to preclude councils going outwith the family groups to exchange information and practice with other councils; in fact, we furnish that information across all 32 as well. You choose the families, or each council chooses its own family? 
We agreed collectively with the 32 authorities what the family groups would be, and they work on that basis, but they also exchange over and above that as well. So the families are not necessarily the same for education and street cleaning as they are for everything else. They can choose what they want. We used the analysis that we did to guide us on that and, as was David's point, to try and get councils that were broadly facing similar challenges together, because we thought there would be some relevance in the exchange around that, but they do exchange outwith those groups simultaneously. Has that worked? Has it been successful, in your opinion? We ran two pilots earlier this year, one that looked at positive destinations for children and one that looked at roads. We did an internal evaluation with the councils at the end of that exercise. We have tweaked it, and we have now launched a programme for the next two years using those same family groups. They have worked, but this time around the councils themselves are going to direct the family groups rather than my team, so we will stand back from the role that we played and leave it to the councils to run. We will help them to gather the information and so on as part of all that, but they are now taking the lead in the process, not ourselves. Alex Rowley, please. Can I maybe focus on how we use this information? I find the information here fascinating, and I think that, in terms of council leaderships looking at budgets and so on, it will be helpful. If we are saying that we really want to get into this information being more useful to councillors, then how is it presented and how do we use it? Maybe I can just pull out a few examples to look at that. Looked-after children were talked about, and if you do a comparison, let us say Fife and South Lanarkshire, Fife is spending a fair bit more. What does that actually tell us? 
Fife could be spending more on looked-after children, and Fife could be succeeding much more in terms of attainment and education. What does that tell us? How much detail do you go into? How are we saying that all those 1,500 councillors, or whatever, are having to try and drill down? You mentioned the website, but I have been told that it is a bit complex to actually get in there and understand it. I sometimes think that, if we are going to have those councillors trained in all those different things, then they will be able to sack all the highly paid officials, and the councillors can start running the councils because they will be so qualified to do so. What is it that the data actually tells us about looked-after children? Also, can you pick up on home care services? If we look again, Fife certainly looks more expensive than your own authority, David, in Renfrewshire. What does that tell us? I know that Fife has a much higher proportion of services delivered in-house than it has through the use of the private sector, and some would argue that, as a result of that, it actually has a better quality of service being delivered. I do not know. What does that tell us? How are you presenting it, what information lies below that, and how are we getting into the detail of that? I think that Mr Rowley has just demonstrated the value of the data, because what you are basically saying is that you have gone beyond the headline and you are then trying to understand the reasons for that. That is exactly what happens. The data allows you to start a conversation. It is not of itself a solution, nor can you draw simplistic conclusions from it. 
If you take both the examples that you have given, that is, the South Lanarkshire example and my own authority, we have had conversations in the Greater Glasgow area about the relative differences on looked-after children, and it boils down to all the factors that you mentioned and more: the extent to which looked-after children are accommodated at home or looked after at home, the extent of use of residential care, and all the issues associated with how education and attainment plug into that. What it allows you to do is to get behind the headlines. You might well conclude, as you just have on home care, that there is a policy choice that you want to make on the basis of differential costs, because members believe that the quality or the approach is, in fact, better suited to local circumstances in Fife than in my own council in Renfrewshire. What the data does is start allowing members to scrutinise policy, policy options and policy choices. The issue about care at home, for example, may depend on procurement practices. I will just give an example, if I may. When we recently re-tendered for care-at-home services in Renfrewshire, we did look at the benchmarking data, and we did have a sense about where we were in the Clyde valley, if you like, in terms of procurement and comparative costs. My politicians were very keen to ensure that, when we went out and tendered for services, we built in the living wage, we did something about zero-hours contracts, and we were keen to make sure that there was quality training and learning for employees. That conversation started because of discussions about what was going on in other authorities, which were flagged up by the benchmarking information. In a very real way, you can see how the benchmarking approach allows conversations to take place about what is going on elsewhere, which then leads to political dialogue and member-led approaches to how you might take forward different services. 
I think that it is good that you give that example, Mr Martin. The fact of it is that the benchmarking data just makes it more transparent and allows us to then have those kinds of comparative approaches and conversations with elected members about where they want to go with their policy priorities. Perhaps the improvement service can pick up on that, but let us say a member of the public gets this report, and it is fascinating. I welcome this; it is a step in the right direction, absolutely. But a member of the public gets this and goes to a councillor in Fife and says, well, you know, it costs quite a bit more for looked-after children in Fife than it does in South Lanarkshire, and the councillor in Fife says to them, well, that's because we're actually delivering a better service at the end of the day. How does a member of the public check whether that's right or wrong? Can they go on to the improvement service website? Will that tell them anything? How do they actually draw down, or is it not for them to draw down? Anyone can access the website, so there's not a problem with that. There is a tool on it for which, as you say, you don't quite need a PhD, but it's pretty sophisticated and it allows you to bring different data together. Again, I think that's something important to stress. It involves a lot of hovering, I believe. There's a bit of hovering you can do if you wish. There's also another tool in development, which we've been developing with the Welsh local government data unit and which we'll maybe get you a link to in a couple of weeks; it will be published or launched in tandem with the PPRs later in the year. The point I was going to make is that, within this, what we've constantly stressed is: don't simply look at one indicator in isolation from something else. Higher or lower cost in and of itself isn't the explanation, so, again, I think, as David said and as the point you've made shows, it allows you to start to raise questions. 
When you take the performance data and the cost data together, you ask questions: well, why is it different? I think the whole point of the process of benchmarking is to get answers to that why. Sometimes it's because we choose to be different as a council, and, if it's about a service weakness, then what are other councils doing that my council can learn from so that I can plug that weakness? That's the way the conversations, I think, are going within the councils, but this was deliberately constructed to do exactly what you have just done with it: ask questions. We'll continue to simplify it and make it easier for people to do that. I think, again, the other part to remember is that this is a national report; all 32 councils then report locally on their performance, and later this year, when we launch a revised version of the local end of this, the local reports will also include the improvements that are happening in each of the authorities off the back of the work that has been done through the benchmarking. Ours is just the national report, but it's the local bit that gives you the real detail about what's happening in Fife, and, again, I think we'll continue to see that improve for the next set of reports. I'll give you two quick points on best practice. If you take, for example, the library service and you look at somewhere like Argyll and Bute, its costs jump out at you as being high, and you think, well, okay, that's rural. That is until you go along and look at Highland, and their costs are significantly lower. I suppose it's how you then link it to best practice, because in some of these authorities, with some of these costs, it would be cheaper just telling the resident to go on to Amazon and get the book delivered, and let them keep it with the council paying for it, than it would be to hire the book out. Is there good practice? How has Highland got that down? I wonder if you could follow up on the good practice. 
My only other question, because it jumps out at you, really jumps out at you, is the direct payment spend, and the fact that Glasgow has completely shot up in terms of the use of direct payments. Again, the interesting thing for me there would be how is that operating in Glasgow? They've clearly gone out to promote it. Is it working well? Is it a better service, and, therefore, why are other authorities so far behind? Glasgow has been part of a national pilot to look at direct payments. That's why you've seen the spike in its performance. There has been particular work done in Glasgow, as part of that national project, to encourage direct payments. There is a process, I think, through that group, which is now looking at how we take the learning from what Glasgow and a couple of the other councils have done in the last year in order to improve that particular service. Your point about libraries is a useful one. Again, I think that at this stage all we've done is use this data to raise those questions. The family groups haven't looked in detail yet at libraries. That's scheduled in for, I think, next summer, when the family groups are going to start to look at library services. But I suspect, and you may hear more of that later today, that some of the individual councils, when they see the data, are already asking those types of questions and then making contact with authorities elsewhere to try and answer some of the questions for themselves. So I don't think they'll sit back and wait for us, through the family group process, to get round in a year's time to looking at libraries. They're already doing that themselves, is my understanding at the moment. Where it is, I don't know, because we're not involved in those conversations. That's between the councils themselves. Mr McMillan. 
It's just, very briefly, to reassure Mr Rowley that those are exactly the same questions that I'm getting asked by culture, sports and arts conveners and opposition members, to take Mr Wilson's point about what can we learn and what is this telling us, and that's what I meant by transparency: it fuels an inquiring mind. The kinds of questions and analysis and scrutiny that I think members are well capable of giving officers, and indeed daily do, will lead us to better services and, to take Mark's point earlier, sometimes the answer is actually that there's an inefficiency here that we need to iron out, which, in the current public sector finance climate, we need to be doing very proactively. Sometimes it's a question of policy choice. In the case of the Highland and Argyll issue, Argyll and Bute I don't know terribly well, but I do know that Highland have a real information access and technology approach to the use of their libraries; you see it in those community information hubs. It may or may not be the case that that's the position in Argyll and Bute. That might be an example where two rural authorities can have a conversation, and indeed are doing, about what we can learn from each other in this particular area, driven by the availability of the data. On the point that Alex Rowley raised about the baseline that local authorities are starting from, I think that Argyll and Bute and Highland library services are a good example to use. Not all local authorities are starting from the same baseline, so what calculations or what work has been done to try to establish or understand the baseline that local authorities are working from? There clearly have been decisions made by local authorities prior to the benchmarking exercise. I can think of Glasgow in terms of care services, which basically put its care service out to an arm's-length organisation, whereas other local authorities provide those in-house. 
There is also your own example, Mr Martin, where the elected members insisted that, in terms of care at home, you have a living wage, you have guaranteed working hours a week and things like that. Was any work done in terms of where local authorities were starting from in this process? They would have been starting from different decisions having been made, historic decisions, as to where they find themselves, particularly when you are calculating the financial aspects of what they are delivering. It has certainly been the case that the data is comparable and does immediately lead to those kinds of questions and those kinds of issues. It is not about suggesting that one set of policy choices or political choices in one part of Scotland was better than another. It merely makes it clear that that is what has happened. I think that the dialogue then leads to how much of that is transferrable between one authority and another. You then get into the richness of the debate, if you like, about how to try and improve public services. We are comparing apples with apples, to use the earlier metaphor, and I think that the issues that Mr Rowley raised make it very clear that members then get into the question, well, are we happy with what we are doing in our particular community compared with another one? I would not want to give you the impression that there was a kind of initial problem. It was merely about making sure that the data allowed those kinds of conversations to take place. I think that they are leading to a significant willingness to look at quite difficult and intractable problems in local government. 
We did start, as Mark said, with school-leaver destinations and roads, but very quickly, and with a lot of support from councils, we were touching on the areas that I mentioned earlier: museums, equalities issues, HR practices, the libraries that were mentioned and a variety of other areas in local government, all driven by an interest in trying to make sure that we are doing as well as we can and that we use the information that the benchmarking project has thrown up. I want to go back to some of the initial discussions that we first looked at. One of the things that was said was that local authorities would, of course, caveat the reasons why they were at a certain place, because some of them have made policy choices to spend more money in certain areas. Rightly so; that is what local democracy is all about. Have local authorities been caveatting the reasons why they are at a certain place by highlighting the policy decisions that they have made? Are others looking at what they have done and looking at the outcomes rather than necessarily the indicator itself? Absolutely. I think that caveatting them and explaining them is the key message for the committee and, in doing that explanation, perhaps sometimes revisiting the original rationale for the particular service and either reaffirming it or thinking that it is time that we changed our approach. I think that you are beginning to see evidence of that, and you will hear some of that later on this morning. I think that it would be useful for us, as well as having the national report, to see some of the local reports that have been put out there. Ann McTaggart. I am sorry to keep you waiting, Ann. No, no, that is fine. Good morning, panel. The comparators: we have talked about them in relation to the 32 local authorities, but have we thought outside the box and looked at the UK, or even further afield than that? Yes. The local government benchmarking project is important. 
It is not the only thing that local authorities are doing. For example, on employability and labour market programmes locally in Renfrewshire, we have spent a lot of time over the past 18 months comparing our performance with that of Manchester, Leeds and a variety of other major city regions. There is a city deal being launched today, which has a major labour market element to it as a partnership initiative. A lot of work has been done on looking at how city regions in England have dealt with the labour market agenda. There are lots of examples of that across local government services. I would stress that. Mark may want to say more about this, convener, if there is time, about rolling out the approach across community planning partners, because the approach to benchmarking is rich in local government but even richer when you start looking at a plan for place and how the health service and the other key partners in the community planning partnership operate collectively. That gives us an opportunity to push further towards the point that Mr Macdonald made about outcomes, because community planning partnerships are all about outcomes and single outcome agreements. We are on that journey just now, and it has just started in earnest. Before I pick up the point about CPPs, there are just a couple of other things on your point about working elsewhere in the UK. We have had some discussions with our colleagues in Wales and, as I say, they have been working with us on a piece of software that we will launch later this year. In return for that support, we have agreed that we will support them with some of the benchmarking work that we have done. They are going through their own reform process in Wales at the moment, restructuring local government, so they have asked whether, later in the year, we can open that dialogue in detail with them, and we will do that. 
Again, I have been over in Northern Ireland this year to talk to colleagues through their local government association over there, who are again going through their reform process, and they have asked for further work to be done with us next year once the new councils are up and running in Northern Ireland as well. We will be there to share our practice and offer some guidance from the kind of experience that we have had, and we will continue to offer that to other colleagues elsewhere across the UK. On the point about CPPs that David made, I think that it was something that we discussed with you when we were here the last time. We have now agreed a programme with the Scottish Government, which we will launch in the autumn, to take some of the insights about how you do benchmarking with local government into community planning partnerships. A project board has been put together; I think that it is scheduled to meet in early October for the first time to oversee the programme. What we intend to do is to publish a draft indicator framework in the autumn following that first board meeting, consult with the community planning partners themselves and come to agreement about what would be a core dataset to begin that dialogue and process of benchmarking across the CPPs. We will launch that in the autumn, and we are looking to then have, in effect, the equivalent of the overview report available probably sometime early next spring: March, April, something like that. That is the plan at the moment, but we will see how that goes. Hopefully, come the spring time, we will have something to say about the community planning process in a lot more detail. It will not be exactly the same, obviously, as what we are dealing with with councils, but it will be much more at the outcome end of the spectrum and will involve all community planning partners. 
Away back at the very onset of the committee looking at benchmarking, we also had a concern about local authorities having to give data to all different sources, in a sense. Has that changed at all, or has any of the stress been alleviated there? It remains a big issue, data access and data management for councils. Again, this is not the only work that is looking at that; there are other groups at Scottish level looking at some of those issues. There are two groups that the improvement service is certainly involved in: one called the improving evidence and data group, which again brings colleagues from across the Scottish Government and the whole public sector together to look at exactly those issues about making data easier to access for all public services, not just councils. In addition to that, something called the public service reform board is looking at the performance management frameworks across the public sector. Again, one of the rationales is to make it easier to access data and to harmonise the type of data that we are all providing across the public sector to the various different performance frameworks. It is a perennial problem. It is not perfect, but there are attempts to clean up a lot of that, not simply through the work that we have been doing but certainly elsewhere as well. However, it just remains an issue. It is not perfect, I am afraid. Thank you very much for all that information and your evidence today. Can we ask about your immediate or medium-term development of the framework from here on in? In the overview report, we set out some of the developments. The family groups are the biggest development over the next year, so we will have eight themes explored over the next year and then, a year after, a further eight themes. Again, as we said earlier, they are over and above what councils will do on their own, so that is going to be the big investment over the next period of time. 
Getting the local public performance reports strengthened towards the end of this year is a second major area for ourselves. Again, that is well under way as a piece of work; we will be working with the councils through to about November to finalise it. The last big area, I think, for us is going to be around customer satisfaction. We highlighted previously that we use the Scottish household survey as the basis for customer satisfaction within the framework, but it was never ideal. It is a good sample if you want to understand issues at Scotland level; once you get down to individual council level, the data samples become small and tend to be somewhat unreliable. Again, over the next 12 months, a big area is to strengthen that from the local authority perspective, so that we have stronger customer service data in future feeding into the benchmarking. Those are the three big areas, but there will be other things as well. Again, we can send you through, if you wish, a copy of the full development plan. You can see some of the other areas that we will be working on as well, but those are the major ones. That would be very useful for us. If you are very, very brief, Mr McMillan. Yes, thank you, convener. Last week, the COSLA commission produced its report on local government. Have you had any discussions with them, bearing in mind the recommendations that are in the report? With COSLA or with the commission? I was one of the advisers to the commission and, certainly, we made it aware of the work that we have been doing through benchmarking as part of its discussions. Given what it concluded around the need for stronger local accountability in terms of local democracy going forward, that chimes with the work that we have been doing. It gives us fertile ground, depending on what happens with the recommendations of that report, to continue to promote that kind of work. 
It chimes well with what they are saying, but it was certainly just there as background information to them. There was no real detailed discussion of it at their various events over the last six months or so. Finally, it was seen by many that this entire project and the data would be used as a stick to beat councils with. There seemed to be quite a fear about that at the very beginning of the process, and maybe that's why the process took so long, I don't know, but that doesn't seem to be the case. Would you like to comment on that, Mr Martin? I think that it's about local government being confident about performance and improvement, and certainly that's been very much the message that we've had from the COSLA leadership and from individual councils' leaderships and oppositions: that we need to know how we're performing in order to improve public services. We have a burning platform in terms of public finances. It would be fair to say that there was some nervousness, I guess, about the extent to which we would have unhelpful and uninformed league tables in the press. We tried, through the launch of the project, to get a more informed debate when it was launched last year, and that worked. The kind of feedback that we're now getting from the media is about interest in how public services are performing, as opposed to, if you like, naming and shaming and talk of postcode lotteries. That was part of the concern, and I think that it has built confidence, convener, in using the data. I think that what we've demonstrated about this is that local government is good and, I think, can be relied upon to self-evaluate and to use the information for improvement. Thank you very much for your evidence today, gentlemen. I'll suspend for a couple of minutes for a change of witnesses, please. I now welcome our second panel of the morning, Steve Grimmond, chief executive of Fife Council, and Elma Murray, chief executive of North Ayrshire Council. Welcome to you both. 
Would you like to make any opening remarks? I would, convener, please. Thank you very much to the committee for inviting me here today to give evidence. As I say, a few introductory remarks. North Ayrshire has worked with the improvement service and council colleagues across the council community over the past three years on this particular area of work, which is why I'm very happy to come here today and try to answer all of your questions. Benchmarking as a process, as opposed to the actual benchmarks themselves, is a fundamental and important part of the council's overall approach to performance management and performance improvement. The local government benchmarking framework is not the only framework that we use; an example of another area that we do a lot of work with is the Association for Public Service Excellence, which covers England, Wales, Scotland and Northern Ireland. It is very much about driving improvement, and it is about, for staff particularly, learning more about critical thinking. The process, as you will have gathered from your previous questions this morning and the answers to them, is very much still evolving, and I believe that that will continue. It involves elected members (I can talk more about that, if you like), chief officers and staff. Importantly, I mention staff because a lot of that is about the overall culture and the ethos of improvement within the totality of the organisation. You mentioned local reports earlier on, and if you would like some copies of the local reports that North Ayrshire produces for its cabinet and scrutiny committees, I'm very happy to provide those afterwards. That concludes my opening remarks. It would be useful to have those reports, Ms Murray. I have no opening remarks to make. In terms of that level of scrutiny, that level of overview by councillors, you mentioned cabinet and scrutiny committees. How much access do all of the elected members in North Ayrshire have to the data? 
How have you helped them to understand what it means, not only in general terms but in comparison with its family members? The way that it works in North Ayrshire is that this year we have provided reports to both our cabinet and scrutiny committees, and they are pretty much identical reports that they get. All of our elected members get copies of all of the cabinet reports as a matter of course, so they get a weekly delivery of reports, and each time that we have a cabinet, they will get their cabinet reports, including that one. All elected members get the information provided to them. Members who sit on cabinet are the members of the administration, and scrutiny is cross-party membership, including independents, of which North Ayrshire has a fair number. In total, that would be 14 out of our 30 elected members who would be sitting in a meeting and deliberating on the reports with officers. Last year, when we started to produce the report following the very first annual report from the improvement service, we took the reports to council, which meant that all council members got access to it the very first time that we considered it. This year, as I say, we have narrowed it down to cabinet and scrutiny to allow us more time to debate and to go into the detail in a lot more depth than we are able to at a full meeting of the council. We have been, as I said, evolving and developing our approach to performance management overall and performance improvement. We have been looking at the overall approach to benchmarking, including all the other different benchmarking forums that we get involved in. 
It is our intention, although we have not done this yet, that, once we have pulled more of that work together and can present it to members in a cogent way that allows them to see the overall work that we are doing, we would take that to all members again as a performance management and improvement seminar, and we expect that to take place later on this year. I will come to you in a second, Mr Grimmond, with the same questions. In terms of front-line staff, one of the things that I have said previously is that, in order to drive that improvement, front-line staff have to be aware of where the council is at. They are often the ones who come up with the best ideas for improving services. How do you relay that information to front-line staff, and what input do they have in terms of driving that improvement? First of all, some of our front-line staff will take part in some of the benchmarking activities. So, when we look at particular aspects of our performance and determine how we will engage with the improvement service and, indeed, through families or peer groups, which could be different from our family groups, front-line staff would be the staff who would get involved in doing that work. What would happen with that work is that, rather than cascading down, it would bubble up to senior management and to chief officers, as appropriate, to allow us to have a look at the recommendations that are coming from that work. I can give you an example of how we have done that this year. A piece of work that we did over the course of this year was to look at educational attainment for looked-after children. That is a family group piece of work that will take place through the improvement service this year across all authorities. However, we did an early piece of work on it this year because it was of particular importance to us. 
Obviously, we had the information from the first couple of years' worth of benchmarking to point us in that direction. We got together with another four local authorities. It was our staff, our service delivery staff, who got involved in that work across all the authorities. They looked at what issues were causing poorer performance for looked-after children and where it was better, because there are some particular areas where it is better. For example, where we take looked-after children into council or private accommodation (certainly in North Ayrshire, it is council accommodation, some of our children's units), their performance is generally much better than it would be if they were still staying at home. We also find that we get better performance if they are living with foster carers. What we did with that was to look at what the conditions for better performance were, so the environment that those children were living in and what support they were getting, but also what the similarities were between what we were doing and where there were differences. What we found from that piece of work was that there were lots of similarities in what we were doing, but there were some ideas that came out from the staff themselves about what they thought they could progress further. Spending a concentrated period of time looking at the conditions for those young people, and working with colleagues in other councils, helped them to develop those ideas. We had about four or five recommendations that came out of that piece of work that then came up to chief officers, to allow us to look at them and say, yes, we are happy for you to go on and progress with this work now to try to effect even more improvement in the outcomes for those children. I will take you back a wee bit. Before you gave the example, you said that some of your front-line staff had an input.
How can we ensure that all front-line staff play a part in that improvement and know what it is about? One of the reasons why I ask is that, quite often in my days in the council, you would get a member of staff coming to you and saying, Kevin, I have never been listened to on this particular matter, but we could improve this by doing x, y and z. It was often the case that x, y and z made a huge difference to service delivery and to folks' lives. How do we ensure that all staff, not just some, are involved in this process? First of all, I do not think that the particular question that you are asking necessarily relates directly to the local government benchmarking framework. I think that it is about having an improvement ethos and approach across the whole of the organisation. There are a number of other activities that North Ayrshire Council adopts to give all of our staff the facility to get involved in improvement activity. I could not, hand on heart, say to you that 100 per cent of the staff in North Ayrshire right now are involved in improvement activity, but there is opportunity for them all to be involved. The way in which we do that is through mechanisms such as suggestion schemes, which allow staff to bring forward their own suggestions for how they can make improvements; through very regular communication between senior managers, team leaders and staff members about what is happening in their service and how they can improve it; and through having a range of different projects and initiatives that we clearly communicate across all staff members, which allow them to see where improvements are being made and the fact that their peers and colleagues are getting involved in those improvements at different levels across the organisation. It is much, much broader and much more about the organisational development activities of the council that allow some of that to take place as well.
I may come back to that. Mr Grimmond, do you want to tell us how you are ensuring that councillors are able to scrutinise those benchmarks and also to drive improvement? I am also interested to hear about staff involvement and at what level staff are involved in those issues. In terms of elected member engagement, we have embedded the local government benchmarking framework in our council plan. At a very high level, we have recognised that suite of indicators as a good proxy for performance across the council and identified that as an improvement target for the council over the lifespan of the council plan. That was considered and agreed by the full council. Monitoring and progress in relation to the council plan returns to the full council on a regular basis. At a very high level, we have recognised that. In terms of a more forensic opportunity to interrogate the performance information, we have also embedded the LGBF in our service plans. Through that process, consideration is given annually through our scrutiny committee. There is an opportunity to challenge progress in relation to services and to look at that through the lens of the performance data that we have got through the LGBF, including a comparative analysis of cost data. That is in front of the scrutiny committee. The third element of that is through our executive committee and the administration's involvement in both setting policy and undertaking decision making in relation to budget strategy. We have used the LGBF data, particularly the cost comparative data, as one fairly rich seam of information that helps, at the very least as a can opener, to allow members to begin to scrutinise spend in particular areas and ask questions about the opportunities for efficiency in some areas where the performance data would suggest that Fife is an outlier in relation to both our family groups and more broadly across the board.
In relation to staff involvement, equally, in setting this as a high-level priority for improvement in the council plan, I have ensured that we have engaged with staff across the council, first of all to elevate the focus and importance of performance, with the LGBF data being a significant lens through which you can look at performance. It is not the only lens; there are a number of other ways in which we do that, but I highlight that it is important. We have shared information with all staff about our relative performance in relation to that data set and encouraged staff to engage with the dialogue around how we can improve over time. The ways in which that can then happen are various and probably mirror some of the points that Elma Murray made in terms of individual staff being encouraged to bring forward suggestions for improvement, more planned approaches to looking at particular areas and involving staff across the hierarchy in improvement programmes. We also have improvement boards, which are populated by senior staff of the council but also involve staff right down to the front line in looking at how we can make improvement. There is a range of ways in which we do that. The other thing to say, just going back to elected members, is that, in addition to the consideration that we provide through committees, we also provided the full benchmarking information to all members of the council at the point of publication. Okay, Ms McTaggart, please. Thanks, convener, and good morning, panel. Ms Murray, you mentioned at the very beginning of your remarks looking at the UK and further afield. Was it the Association for Public Service Excellence? Could I ask you to tell us a wee bit more about that? Does that create loads of extra work and loads of different data? We've been working with the Association for Public Service Excellence for longer than we've been doing the local government benchmarking framework.
Some of the indicators and data that we send to APSE, as it's called, are the same, and some are different. APSE has its roots in what I would say were the traditional blue collar operations of councils, but it has expanded from that now as well. Some of the data is the same, some of the data is in addition to that, but it's all data that we feel is relevant to the performance of our council. There are benchmarking groups that we are particularly interested in participating in. Some examples would be around refuse collection, building cleaning, highways and winter maintenance and so on. Some of them are the same and some of them are a wee bit different. Have you made changes by taking an example from elsewhere in the UK where they perhaps do things differently? Sometimes we make small changes and, to be honest, I wouldn't have the detail here today. I could get that for you at another time if you wanted it. Sometimes it's bigger changes. What it does, importantly, and I'm just picking up on one of the points that Mr Grimmond was making about the different lenses that we use to look at performance, is give us another lens and another suite of performance information that we can compare ourselves against, so it's looking outwith the council, to other areas, to get that broader perspective on what we're doing. How often do you, as a council, review the indicators that are put before members? Obviously, there are some that are statutory, but there are others that councils themselves choose to place before members. How often do you review that, Mr Grimmond? In terms of the council management team, we would review the corporate indicators on a quarterly cycle, so the defined basket of indicators, which would include reference to the LGBF suite but would have some additional indicators.
That's mirrored through the council scrutiny committees, where, again, there's a six-month cycle of review of service plans and performance in relation to that. At individual service level, there will be a more regular quarterly review of specific indicators in relation to an area of service. As a corporate management team, we would do that on a six-monthly basis, with some exception reporting between times, depending on what particular indicators or issues are telling us in areas where we're seeking to make a very specific piece of improvement. That, again, is mirrored through our cabinet and scrutiny committees. In addition to that six-monthly cycle, on an annual basis we also have our public performance report, which we take to the council. In essence, we are getting three opportunities at the elected member level and at least three opportunities at the corporate management team level over the course of the year. In terms of when reviews take place, there are obviously the statutory performance indicators as well. I am aware that there are some measurements that have been undertaken by councils for a very long period of time. Some of those will be relevant; some of those may have become less so. Do you think that there are some things that we continue to measure that we shouldn't be measuring any more? Are there some things that we ought to be measuring that we're not measuring at present? How do you feed into the process in terms of what SPIs are being measured? You might want to add to that, because there was discussion at the very beginning of the process of creating a uniformity in terms of all the measures that were going to be done by various bodies. Has that uniformity started? Is it complete? Will it ever come to fruition? You might want to add that in to your answer after Mr Grimmond. I think that the LGBF has been a significant tool in driving towards uniformity and comparability at its heart, and that has been a positive development.
The other thing that it has done, in answer to Mr MacDonald's question, is that, certainly from a Fife perspective, it has allowed us to review the wider range of performance data that we currently collate. What that has led to is a degree of culling of some of that performance data, a sense that we had a myriad of very forensic data that was not necessarily well aligned to the key priorities that we were trying to deliver as a council. We have been going through a process, almost at service level, of reinforcing the need for comparable uniformity, so that we have key indicators against which we can compare ourselves. As a consequence of that, we have been challenging services, and members have been challenging as well, around whether we can reduce or remove some of those indicators that are less relevant. Are there other areas that would additionally be useful? I suppose that one of the issues is locally set priorities and how you measure progress in relation to those. The LGBF data may well assist in shining a light on that, but there may be other things that are local priorities that we would seek to capture data on. The final point in relation to that is that the LGBF data, while providing a significant suite of data across local government, is not completely comprehensive. There are other areas where we might want to have more forensic attention, so in Fife we would wish to develop further intelligent measures in relation to our economic development activity as a priority. Also, across early years activity, how do we measure success in performance in that area? That would not necessarily be covered comprehensively in the LGBF data, so those are probably two examples. I might not say much more about uniformity than Mr Grimmond did, but I will reflect on the Accounts Commission's position in relation to its SPIs.
Its annual direction for two years has been that it will use the performance information that it gets through the local government benchmarking framework, so that has been extremely positive in terms of how councils, the Accounts Commission and Audit Scotland work in this area. From a council point of view, Mr Grimmond is right. I think that most councils will probably review, I would suspect on an annual basis, whether or not the range of indicators and measures that they are using to assess their performance is still relevant and still appropriate for the way in which services are delivered and the range of services that they deliver in the area. From my council's point of view, we have a much broader suite of measures than the ones that are contained within the local government benchmarking framework. We have measures that we have identified ourselves, but also measures or information that we provide to other regulatory bodies across Scotland, which we include where we feel that they are very important for our public to know about and that they accurately deal with particular aspects of service delivery in our area. We clearly mark out in our public performance report which are local government benchmarking ones, which are statutory performance indicators, which are additional measures that the council uses and which are ones that we send to other bodies, so that there is hopefully complete clarity in that. As well as that, one of the important aspects that shows that we are reviewing and reassessing is that this year in the local government benchmarking framework, just a year in, we added another measure around economic development to look at our employment activity. I see that as being very important to all councils in Scotland and an area that we would want to do further work on as well.
Earlier, at the very outset of the process, I focused on the issue of measuring inputs versus measuring outcomes. I get the feeling that we are still a little too keen on measuring inputs when the policy agenda has moved more towards the delivery of outcomes. When you are looking at the things that you are measuring, where you are measuring an input, are you taking steps to identify measurable outcomes so that you can then draw the narrative between what is going in, in terms of funding, and what is coming out, in terms of performance and quality of service? I am aware that data will not always tell you how well a service is performing, but it would give an indication, beyond simply saying that we are putting x amount of money into this service or that it is costing x amount per head to educate children, of what the outcome from that is. I know that in education we are very good in terms of the attainment data, but I do not think that other services have the same focus on outcomes rather than just the input data that gets put before councillors. I think that the input and output measures that we have are an important aspect of guiding us into where and how we should be asking questions. I might just touch on a piece of national work that I am doing just now, which Mr McIntyre referred to earlier on: work that the Public Service Reform Board is doing around the performance management framework. I am leading that particular piece of work for the Public Service Reform Board just now. What we are looking at is the national performance framework, which, as you know, has now been in place for some seven years, since the present Scottish Government came into office in 2007. It is very focused on outcomes. What we are looking at nationally is how we better demonstrate that those outcomes are being achieved on an on-going and progressive basis in a much more effective way than we have perhaps managed to so far.
I agree in part with you that we need to do more work to focus more on outcomes. However, going back to the point that I was making, when you look at the input and output measures that we use at the moment, what those do is allow us to start having those very important discussions with colleagues, either through the family groups or through peer groups, about what outcomes they are achieving with the measures that they have in front of them. That allows us then to look at how we do what we are doing, as opposed to just what the outputs are from what we are doing, and that is the link between the outcomes and the measures. I am going to use an example. I think that the benchmarking framework indicators are largely input and output indicators with a focus on cost. Certainly, how we would approach that in Fife is to see them alongside our council plan priorities, which are largely around outcomes, and connect the input and output data to those outcomes. If you took an area such as social care, for example, we have clear outcomes identified in relation to providing quality social care to residents in Fife, against a fairly challenging backdrop, not just in Fife but nationally. The input and output data that is within the local government benchmarking framework provides a useful set of data and an opportunity to challenge whether we are doing the right things and whether we are doing those things effectively towards delivering on those outcomes. That cost data is asking some fairly hard questions about whether we are organising the way that we deliver social care as effectively as we could to deliver on the outcome. That would be the way that I would pull together the connection there, but clearly those largely input and output indicators cannot be used in isolation, nor do we use them in isolation within Fife. Mr MacDonald, is that you? Yes. I have a number of people who want to come in, so brief questions and brief answers, please.
Stuart McMillan, please. Just regarding the composition of the family groups, how do you think that they have been working? Are you quite content with them? We had quite a lot of deliberation before the family groups were finalised, so nothing that we are doing was done in any kind of unconsidered way. I am pretty comfortable with the family groups, but the other reason that I am pretty comfortable with the family groups that my own council is part of is that, in addition to them, we have a whole range of peer groups for different measures. I do not know whether that was in the full report that the improvement service gave you, but I am trying to find it here in my notes. In some of the other bits of work that we do, we have peer groups that are themed, so we have about half a dozen peer groups that allow us to work with different councils on different aspects, which is not necessarily part of the formal pieces of work that we might do over the course of a year, but might be more localised. I will give you an example of that, again from this year. A piece of work that North Ayrshire decided to do was to look at non-domestic rates collection, and we decided to do that piece of work with Perth and Kinross, which would not normally be in a family group that we would be related to, given its quite different demographics, different geographical profile and so on. We decided to look at them because they were a really top performer, and we wanted to find out what they were doing and how they were going about it, to see whether there were things that we could do in North Ayrshire to try to improve. We have a couple of actions from that that we are going to implement within our non-domestic rates team, and it was staff who did that work. My view is that the groups are working quite well, but they are not exclusive.
You can dip in and out of them to do other pieces of work, as you think is particularly relevant to your council. Briefly, I broadly agree with Elma Murray's comments on that. I think that it is early days in relation to the family groups. One of the advantages is the potential for the structured approach through the pilots to further examine how effective those family groups can be. It is definitely a positive development and something that we are comfortable engaging with, but it is not the only thing that we would engage with. For example, when we have been looking at improvements around social work provision, which is prompted through the local government benchmarking framework data, the way that we are taking that forward is with our range of partners. We have a particular relationship with North Lanarkshire Council in relation to exploring what it is doing, because we think that that has a particular relevance to improvements that we want to make in Fife. North Lanarkshire is not in our family group, but we would also engage with it. Thank you very much. Good morning. How could the family groups be improved? You also mentioned peer groups. You are obviously dipping in and out of the family groups, going to other councils such as Perth and Kinross. How could they be improved, do you think? Is there any way? What are the peer groups? I have found my note on the peer groups, and I will include it in the material that I send in to you. It was part of a report that we had done internally to remind everyone what the peer groups were. We have peer groups that deal with issues around population, employment, size, young people, child poverty and rurality. Depending on the indicators or the service area that we want to look at, we might dip into doing some particular pieces of work with the local authorities in that particular peer group. I would not have a suggestion for improving the family groups at the moment.
I am quite happy with how the family groups work, as long as my authority and I can dip out to look at particular aspects in other areas, because there are occasions when I want the opportunity to compare what the council is doing with another council that is not part of one of the groups that we currently have. However, having all that information for 32 councils allows you that flexibility alongside the rigour that is associated with working as part of a family group. On the back of your comments regarding the peer groups, when you have staff go to speak to other local authority areas to learn about what they are doing with best practice, and if you then try to instil some of that in your own local authority area, do you give any consideration to including the third sector to potentially help deliver some of those improvements, if there is an opportunity for the third sector to get involved? I think that my quick answer to that is yes, but I think that Steve has a specific answer here. If you have an example of that, be brief. Yes, that is the short answer. An equally short answer would be that, going back to the example that I used of looking at our social work services and improvement arrangements there, looking at the role that the third sector can play in relation to that, and learning from experience elsewhere, is absolutely central to that agenda. I would like to raise one of the concerns that we heard in the early days of the benchmarking process. Local authorities were complaining about the number of agencies that they had to report different data to. I would like to know whether or not the benchmarking process has eased things up in relation to the amount of reporting that local authorities have to do to other agencies, or have we found a way of being able to provide and collate the information that is required for a range of agencies within that framework? I think that the framework has been a positive step.
One of the things that local government has to do is to get better at providing information that will service more than one client, so that we are efficient in how we are doing that. Has that resulted in a reduction in the level of other external scrutiny? It is less clear that it has led to a significant reduction. The positive benefits of the local government benchmarking framework have largely been for local government, in terms of having a more transparent approach to identifying performance and driving an improvement agenda, rather than it being a driver to successfully reduce a wider range of scrutiny. I agree with the points that Steve Grimmond made. Mark McIntyre mentioned to you earlier that the local government benchmarking framework information had not just local authorities as its owners but some other agencies as well. That is because some of the information that we pulled together into the local government benchmarking framework comes as a result of information that we perhaps put into education. That is the one that mainly comes to mind just now. We are trying to use data for a number of other agencies at the one time, but we have not streamlined it as much as any of us would wish. The national work that I mentioned earlier is looking at the performance management framework. A part of that work will be to establish whether there are performance measures, or input or output measures, that do not add real value to what we are doing in relation to the public sector or public services in Scotland, and whether there is scope for some of those to be removed from the suite of indicators that we currently provide, to allow us to focus very much on those that are absolutely adding value to what we do, but that is a separate piece of work and quite a big piece of work.
John Lylew, have there been additional resource implications for your local authorities in relation to providing the information that is required for the LGBF on top of the reporting mechanisms that were already in place and continue to stay in place at the same time as the LGBF? For the first year and probably the second year, I would say that we spent additional time checking and verifying the data. Mark McIntyre referred to that earlier on, in relation to making sure that we were measuring the same things and that the quality of the data was absolutely right. The staff from the different departments that are involved in this have now got that more regularised, if you like, into a process. My own sense, for my own council, is that it does not feel inappropriate in any way or like an additional burden to the council. I agree with that. From Fife's perspective, there is no additional burden. I think that we are using the staff who would have been engaged in performance data collection more intelligently against that balanced basket of indicators. We have reduced some other information that we would previously have been providing, because the council has decided that it is less important. One final question, convener, to Ms Murray. You made reference to North Ayrshire's membership of APSE and the value that you placed on that membership. You also indicated that there were reporting mechanisms in place as a duty of that membership. Do you think that North Ayrshire will continue long term to be a member of APSE? Or is there some indication that, because of the LGBF, some of the indicators that are being used by APSE would be better drawn out through the LGBF, rather than continuing to be members of APSE, which has a financial cost to the council, as I understand it? We do pay to be a part of APSE. What we do as a regular process is evaluate on an annual basis what we want to be part of and what is providing added value to the council.
We would not do it if it was not providing us with value. The value that it gives us is that broader comparison with what is happening in England and Wales particularly, but also the additional elements of service that we are comparing, which we are not necessarily comparing through the local government benchmarking framework. As well as APSE, we participate in a range of other benchmarking clubs. For example, for IT, we have the Socitm benchmarking club, and we are part of the Scottish community care benchmarking network as well. I picked APSE because it is a big national one that looks outside the Scottish local authority area. My point in illustrating it was not just about the wider lens that it gives us; it was also about demonstrating that the local government benchmarking framework is not the only way to drive performance improvement. We believe in using a range of other benchmarking frameworks and organisations to help us to do that. Ms Murray, are you saying that the LGBF is not sufficient at the present moment to cover all the areas that your local authority would wish to benchmark? That is correct, because it does not have the broadest range of indicators that we could use. It gives us what we require for statutory performance indicators and it gives us a lot of very, very good areas of performance. However, there are other areas that, as a council, we choose to compare ourselves on because we think that that is important to the services that we deliver in North Ayrshire. Finally, you heard my last question to the last panel. All of this was thought to possibly become a set of league tables that would give various folk a stick to beat local authorities with. Do you think that that has been the case? If not, how have we managed to alleviate that situation?
I have to say that, right at the very start of the process, I was never fearful, for the council or for myself, in taking it forward, because it was absolutely the right thing to do at the right time. The way in which we started to publish the information, and our openness in our approach to it, has served the local government community in Scotland particularly well. I hope that that will continue year on year. I think that it has been a huge benefit to Scotland already. It has been a huge benefit to a lot of our staff, as well as to our elected members, to take that broader look at performance improvement and understand what benchmarking as a process is all about. I was not fearful of that either at the outset, and I think that the way it has played out has confirmed that lack of fear in relation to the framework. I think that we need to be bullish about being transparent and open about performance, both within local government and with the public that it serves. I think that the framework provides a means to do that. If we do not know how we are performing, how can we possibly improve? It has been a tool that has been helpful in that regard. The other positive thing is that it has not played out as a single set of league tables. There has been a recognition of local circumstance, that Scotland is not homogeneous and that we need to deliver services that are responsive to our local needs and demands. The way that we have seen that information play out locally, both through our public performance report and any wider examination of it, has recognised that there are local factors. I think that it has been a positive force for good. I suspend the meeting for 15 minutes for a change of witnesses and a comfort break. Thank you very much. I now welcome our final panel for the day.
I welcome Councillor Elaine Greene, chair of the education committee at East Renfrewshire Council; Mary Shaw, director of education at East Renfrewshire Council; Councillor Stephen Curran, executive member for education and young people at Glasgow City Council; and Maureen McKenna, executive director of education services at Glasgow City Council. Would you like to make any opening remarks at all? Just to say, thank you very much. Oh, just thank you. In which case, we will move straight on to the questions. Can I ask you, overall, to what extent is the benchmarking framework used in education services to learn from others and drive forward improvement? Who wants to start? Thank you, convener. Obviously, it's a great opportunity for us to be here today. It's a really important facet and tool for us in the drive to raise attainment in Glasgow, as the largest authority but also the authority with the biggest issues in terms of disadvantage, and in giving every young person the best start in life. It's been a really helpful tool for us, particularly in its recognition of the Scottish index of multiple deprivation, because you could have a quite simple measure which perhaps doesn't take into account disadvantage and some of the difficult circumstances that young people are facing in different parts of the country. So, the important thing for us is that we can get a clear picture that's measured against our colleagues in other parts of Scotland, and we can also talk about ways of improving and share best practice; it's been a very helpful route for us in terms of getting that information through. But maybe more importantly, it's been helpful for us to show where we have made an improvement, sometimes a very dramatic improvement, in terms of raising attainment for some of the young people facing the most disadvantage. It's used regularly, but it's not used exclusively.
Yes, I would agree with what Councillor Curran has said. We use it extensively across the authority, with our family group of authorities, but also, I think, across the whole country. The elected members find it very valuable because it helps us to scrutinise what we're doing well and maybe what we could improve on. As I say, benchmarking is used, the same as in Glasgow, not exclusively, but it is used extensively in East Renfrewshire. Ms Shaw and Ms McKenna, do you want to add anything? I would perhaps add that, as Councillor Curran said, it's not the only set of statistics that we use. I view the benchmarking tool as quite high level, and, certainly from a director's perspective, we would want to be able to drill down much further, towards individual school and classroom levels. I know that Mary takes the same position; as directors, we do a lot of work below those levels of statistics, really getting down to look at what makes a difference for every child and young person. Obviously, East Renfrewshire has areas of social deprivation as well, and your attainment levels are extremely high. Do you drill down to look at the differences between socially deprived areas and attainment levels in East Renfrewshire, for example, and similar areas of social deprivation in Glasgow, and take lessons from that? It's probably worth looking at the scale in Glasgow: 42 per cent of the children in Glasgow schools are in the bottom SIMD decile, the bottom 10 per cent in terms of the deprivation statistics. Sometimes it's difficult to get a comparison that's similar enough. I would say that we have examples with colleagues in Fife, for example, where we've looked to share best practice in terms of some of the improvement there, because there are similar challenges in terms of the scale of deprivation. However, as you made the point, each council can find young people who are facing that disadvantage.
I think that the priority for us is showing that a difference can be made for the children who have in the past been deemed less likely to succeed. There's certainly a good conversation that we have with colleagues in neighbouring authorities like East Renfrewshire in terms of looking at some of that practice. I suppose that, in some of the school environments, we're able to look at what can really make a dramatic difference for young people, using the benchmarking to show where we can see a difference in the level of attainment, and raising expectations. It's not just raising expectations among staff and among elected members; it's communities, parents and families being able to see that that is something that can be evidenced on the ground. I've got confidence that it's not just a measure; it's actually showing progress that they can feel in their own community and in the school that their families are part of. I would say that we do punch above our weight in our areas of deprivation. We are very proud of our schools, particularly in the Barrhead area and in Eastwood. I think that it's all about aspiration for our young people; regardless of where they live in East Renfrewshire, we are looking for them to have the best educational experience that they can have. We are sure that they will do that, and the director can give you more details on how she drills down. I think that it's all about quality of teaching as well. Thank you, convener. In terms of the local government benchmarking framework, the family groupings are very helpful, as Councillor Curran has referred to. What we would say about our youngsters from areas of deprivation is that they perform very well, and that is to do with the quality of education that they get in our schools, but we also use the framework to measure ourselves against councils that are similar in characteristics or profile.
For instance, where another council performs better than East Renfrewshire in those areas, we would use that to work with our colleagues in that council to find out what it is that they are doing that we can learn from. That is a benefit of both the family groupings and the LGBF. In terms of using the benchmarks themselves, let's stick with the areas of social deprivation first. Do you put more resource into those areas than into some of the more well-off areas in East Renfrewshire? We have had great support from the council in addressing and raising the attainment of the lowest-performing 20 per cent, and over the past three years we have had additional support. That support, however, has gone across the authority. It is not focused only on areas of deprivation; we have youngsters across the authority, some of whom will be in schools that are more affluent, and we have addressed that by setting targets for specific groups. Those groups may be those who are entitled to free school meals, rather than being set on an area basis. It seems from the written evidence that you have provided that East Renfrewshire is pretty forensic in that drill-down and in then addressing any difficulties that it finds. Has that ethos taken a while to build up, or is it something quite recent? I would say that we have a pretty mature approach to using benchmarking information. Without taking any credit for it myself, my previous boss set up a unit within the quality improvement team to ensure that intelligent use of data was the basis on which we identified where there is room for improvement and where there is room for celebration. Our schools benefit from all that, right down to the individual child level, where we track the attainment of individuals from primary 1 all the way through. I am interested in the use of language this morning. Councillor Curran, you said expectation.
Councillor Greene, you talked about aspiration. Now we have celebration as well. Language is often important in driving forward a policy, and it can form attitudes from the bottom up and from the top down. In terms of reaching those levels of aspiration to get to the celebration point, how much input do your front-line staff have? Are they aware of those benchmarks? Are they the folks who are helping you to drive forward improvement? Ultimately, they are the most important people in ensuring that you can celebrate that success. It has to be taken into account that raising attainment and achievement is not as simple as some of the measures in the benchmarking framework. It is a bigger picture for some of the young people in challenging circumstances when, for example, we have a fifth of Scotland's looked-after and accommodated children. That can be a very difficult set of circumstances, where getting them to achieve as well as the next person in the class is possibly the target that you are aiming for initially, and that level of aspiration has to be developed beyond that. We share that bigger picture from the framework. The comparator schools and local authorities in the peer group are an important part of that, but sometimes very good informal relationships develop from that. They can become quite formalised, around close connections with schools that have similar issues. For example, from our perspective, it could be a school in the north-west of the city and a neighbouring school in West Dunbartonshire; they would have similar issues facing their communities, and they can share some of the best practice around that. As long as we can evidence that in terms of making a difference in meeting the needs of all the young people, the staff are the ones that we really trust to do that.
The leadership in the schools is very much trusted in taking that forward, involving parents at every point in making decisions and explaining why they are prioritising things in a particular manner. The key issue for me is that we have to make it very clear that that is part of the picture. The understanding of every young person and their circumstances is what we expect the staff in the schools, and the other organisations that they work with, to have uppermost in their minds in order to make that radical difference and to celebrate that dramatic change in attainment and other levels of achievement. As far as aspirations are concerned, everybody from the directorate, the elected members, headteachers, parents and staff are all aspirational for young people and children in East Renfrewshire. Benchmarking is very important, but I do not think that we see it at all as a league table. We see it as a way that we can challenge where necessary and celebrate where necessary. Our committee meets every six weeks, and the committee will challenge the directorate if it thinks that that is necessary, but we also celebrate the successes of headteachers and young people, so it is across the board. As the director said, the approach has been maturing since the inception of East Renfrewshire Council in 1996, and our elected members know just what questions they should be asking. You mentioned the drill-down beyond the council-wide data to locally available data. When you look at the locally available data, is it then made available both to elected members and to communities themselves, for example parent councils, so that they can see the data that is relevant to their individual school or the schools in their communities? I am seeing lots of nodding, so I will take that as a yes.
One of the traps that we fall into far too often, and a slight bugbear of mine, is firstly the distinction between attainment and achievement; academic attainment is not always the encapsulation of the child's experience through the school process. Obviously, it is very difficult within the sort of data that you are collecting here to capture that wider achievement, so is that something that you have looked at as councils? Also, another trap that we often fall into when we look at attainment levels is that we compare the previous year with the current year, when we are talking about two very different groups of children going through the process. The difficulty that you can often face is that children are being compared against those who have gone before them, not against their own progress. What steps do you take to track a child's progress through the school system so that, when their attainment is looked at, it is looked at not just against what previous years attained but against the expectations that might have been set for that year group itself? We look at wider achievement and gather that information. It would be fair to say that we are not as mature in the information that we gather, but our schools are gathering information about youngsters' involvement, as well as their achievements in activities, which we would always see as contributing to their attainment. The attainment would be the measure, essentially, of that achievement. You are not going to get youngsters attaining well if they do not have the confidence to achieve as well. We do lots of activities and measures in terms of the number of youngsters who go through, for instance, the Duke of Edinburgh's award. We report that through our standards and quality report, but also through our mid-year and end-of-year reports to the education committee.
I indicated earlier that we track youngsters' individual progress, and that starts in primary 1 with a baseline assessment. We have a standardised test that we administer in P3, P5, P7 and S2. Therefore, we are able to have expectations of how youngsters should progress and should attain at the later stages at which the LGBF gathers data. That information is used to predict the range of attainment results that youngsters should be achieving or having ambitions for, so that tracking is essential. It is available at individual pupil level, but it is also looked at in terms of school or stage performance, and it is shared with our headteachers. Essentially, that tracking information is available to all class teachers as well. Therefore, they can look up what the expectations are for youngsters, alongside the lots of other assessment information that they will use, for instance from looking at how well youngsters are performing in class. On some of the points that Mary Shaw was making, we look a lot at wider achievement. We feel particularly strongly that, for our young people in Glasgow, we need to raise their expectations by broadening their experiences. We use wider achievement opportunities to broaden those experiences and to develop the confidence and resilience that Mary Shaw was referring to, which is so important if young people are going to attain in exams. On achievements, we have been working particularly hard in the past two or three years on the Duke of Edinburgh's award and getting more successes there. It is a very challenging programme, and for our young people coming from difficult circumstances it can be a real challenge: the planning of it, finding the opportunities and the finance of it can be tricky. We have focused a lot on sports leadership. We are now the UK's leading local authority for sports leadership, and that has been a wonderful success for us, particularly in the run-up to the games.
There are lots of opportunities, and we report on those again through our standards and quality reports, similar to all the other local authorities in Scotland. I take your point about the one-year attainment statistics being about that particular cohort. As a local authority, we need to look at trends over time. Although we can allow for a little local variation between one cohort and another across any size of grouping, you should be looking at that trend over time to see whether you are getting that improvement coming through consistently. You allow for little fluctuations, but that needs to be watched carefully. Our schools track individual young people's progress. We are not as mature as East Renfrewshire in terms of the data about individuals, and some of that is to do with scale. We have 36,000-plus in our primary schools and 26,000-plus in our secondary schools. We have headteachers whom I consider to be senior officers of the authority, and it is their responsibility to track individual young people. Central staff go out and will sample that work and scrutinise some of the data. We engage in a lot of activities where we bring heads together to talk to and challenge each other about levels of attainment and about how we are monitoring and tracking. A gap that we have, without a shadow of a doubt, is data at primary school. It is a national gap. We have looked at diagnostic assessment. Can you explain what you mean by a national gap? There is no national attainment data from 3 to 15. The first national data appears in SQA examinations. There used to be the five-to-14 national assessments. Not that I would hasten to argue for a return to national assessments, but I think that something is needed. The standardised tests in East Renfrewshire serve a very strong purpose. There is a whole range of different types of assessment out there. Our staff gather a range of assessment information in primary school.
We have worked closely with them to ensure that they have data on each and every young person, to make sure that we are raising their expectations and that children are making appropriate progress. However, I do not have data that I can gather together to look at how Glasgow is performing in comparison with East Renfrewshire at the end of P4 or P6 or whatever. That data does not exist just now, and I think that that needs to be debated. I will come back to the point that Ms Shaw made about classroom teachers having access to the data. You mentioned head teachers, and you said that central staff go in and that head teachers and senior managers manage the data. What access do your classroom teachers have to the data? The data comes from classroom teachers. The head teacher will have the overall responsibility. The senior management team will link with individual departments, and it will go right down to classroom teachers looking at their young people's performance. I was, in a previous life, a principal teacher of mathematics in a secondary school. We used a lot of data, and it was in partnership with classroom teachers, who took the responsibility in their classrooms. That is the same in the primary school too, where there will be regular meetings between the depute, the head and classroom teachers, looking at and drilling down on the progress of individual children. That is part and parcel of a primary school or a secondary school nowadays. One of the perennial failings in this life is when sufficient data does not transfer. As kids move from year to year, teacher to teacher and possibly even school to school, which is often the case, how does the data follow on to make sure that we are getting it right for that child? In primary schools, there is always transition. I was also, in a previous life, a school inspector, and I looked at a lot of schools and a lot of the processes that they use. Transition is a very key part.
The transition for a child going from P3 into P4 is very critical for them, and equally for a young person going from P7 to S1. Over the years, a massive amount of effort has gone into transferring that information about the child. What is the most important information to transfer? They need to know about the child's progress, in particular in key curricular areas, and how that young person is as a learner. Some of the work that has been done through the new curricular reform on personal learning plans is going a long way towards improving the level of information that is transferred and held within schools. One of the things that I was remiss in not pointing out earlier was that, to overcome that year-on-year comparison between different cohorts of children, we have had, and continue to have, three-year target setting. Those targets are based on what has been achieved in the previous three years, and targets are then set in a way that helps to smooth out the spikes that might appear if we went down the sort of annual approach. Going back to what Ms McKenna said, I think that, under curriculum for excellence, we certainly do not now have opportunities to benchmark before S4. What we are doing is trying to build the skills of teachers in their professional judgments, both across sectors and across schools. I think that there is an opportunity to do that between local authorities, and some local authorities have started to do that. I am not sure that, as a country, we are able to rely on those assessments yet, but, on top of that, we have the Scottish survey of literacy and numeracy, which covers either literacy or numeracy on a yearly basis.
That gives a national picture, but Maureen McKenna is right to point out that we do not get information from that as an education authority to be able to say how well we are performing against the results on a national basis. The important point, convener, to touch on Mr McDonald's question a little more, is that the context for that is what happens after the young people leave school. The positive destinations figures that Skills Development Scotland works on are critical there, because they are broken down to school level as well as local authority level, and we can compare and contrast and share best practice in terms of some of the challenges that have been faced. The other point, around individual young people, is that I think that the onus is very much on the secondary level, and perhaps we need to see more national focus on primary and even the early years as well, in terms of tackling some of the worst disadvantage. For the young person individually, some of our schools have been doing exceptional work, and we are sharing that practice now in terms of lifting expectations; more importantly, for individuals, you could be on track, but you should be on target as well. The target has to be more ambitious than simply being on track, for young people as much as possible. I have seen very good examples in some of our secondary schools where we drill down to the level of individual pupils in one subject, and we have looked at that from an elected member perspective as well as with the professional focus, asking what works in that environment to get that young person to achieve better, even in one subject compared with others. That individual approach, which the staff lead on and collate, is critical in the process. Just to come back to Mr McDonald's question about achievement: achievement is celebrated in our authority in a big way. We have convener's awards for outstanding achievement for young people.
It's not all about educational attainment; it's about the rounded child and the learning experience, and very often they will be invited along to committee and their achievements are celebrated there with the elected members as well. I'm just looking here at the accounts commission, which states that council spending on education fell by 5 per cent in real terms between 2010-11 and 2012-13. You are operating in a fairly difficult financial environment right now, and all the signs are that it's certainly not going to get any easier. How should this information, and what other information, be brought together to influence policy makers in terms of directing funding, what funds should be used for and what they should be directed at, and what councillors and local authorities should be prioritising? How do we use this type of information? It's difficult for me to look at that. I commend both authorities that are here today for the progress that you are making; in Glasgow's case, given the levels of deprivation and poverty, the progress that is being made is great. But right across Scotland, in education, I think we need to do much better. What is it that we need to do, and how can we use this? For example, I came from Fife Council, and the advice that we were getting was on shifting significant amounts of money into early years, because we were being told that, by the time a child is four or five and coming into primary school, their future could be set out for them, and you've got to get in there much earlier. How do we prioritise, and how do we use all this information to make the case for where we should be directing resources? I think that it has to be targeted to where it is needed most.
Benchmarking certainly helps: where you see that there is room for improvement, we can put extra resources in. In East Renfrewshire, we are investing a substantial amount of money in early years in the Auchenback area, because it is seen as an area of deprivation. As you rightly say, we are working with our CHCP partners and Sir Harry Burns on getting in as early as possible, looking not at intervention but at prevention, so we are trying to get a lot of our resources in there. Across the board, spending the money where it is needed is very difficult in the climate just now. We are all having to pull in our horns and it is not easy, but the money must always go to where the need is greatest, in my opinion. I suppose that a good example of that would be the focus on early years that Mr Rowley mentioned. Ourselves and East Renfrewshire were the only two councils that delivered the 575 hours for a free nursery place, way ahead of the other local authorities in Scotland, because both local authorities prioritised that. Obviously, it was a Scottish Government objective and it was a wish that other people had, but the resourcing was not necessarily there. Both authorities made a conscious effort to prioritise it. You can see that in the measures that the improvement service puts out on the expenditure on each nursery place. That is largely around having better-qualified staff and, sometimes in our situation, having stand-alone establishments in particular areas of deprivation, where we know that that is what is needed in that area for pre-5 education to be delivered in a quality environment for young people facing the most difficult circumstances. From our perspective, certainly with the pressure on finance, we have a political commitment around that.
We know that there is national goodwill around focusing on early years at the moment, but to some extent we are ahead of the curve, because we always saw it as an important place to put our money where our mouth was. From a national perspective, and in working with other colleagues across the country and across councils, the improvement service benchmarking support that you see before your committee today is an important aspect in asking what we do, how we adapt to that pressure on resources and, bluntly, how we can make savings and continue to deliver the service when there are growing expectations around two-year-olds. We have already met the first year of expectation around vulnerable two-year-olds, but that growing pressure is something for which we will need to find extra resource. As Mr Rowley said, resources are not going to be rising overall over the next few years; we expect it to be a more difficult situation for us. That is an important aspect of it: the political commitment, followed up by the evidence that shows that you are actually putting your money where your mouth should be. There is a case of putting your money where your mouth should be in some regards, but also, in terms of early intervention, Glasgow City Council gave evidence to the finance committee not so long ago, and the finance committee said that the statements from Glasgow City Council about the lack of existing evidence raised serious questions about why such key delivery agents are not familiar with the available wealth of information on early intervention that is discussed throughout the report. Have there been improvements in the evidence gathering that Glasgow is now doing in that regard? It is a report that the finance committee produced not so long ago, and that was one of the conclusions in the finance committee's report. I cannot comment on it if I have not read it. We can allow you to come back and comment on that later.
It is worth commenting on, but it depends on what the early intervention point was about. If it was about early years specifically, there is obviously the early years collaborative, which is a new way for all the local authorities, the NHS and other partners, including the third sector, to work together. The focus on early intervention in the early years, in terms of some of the work that has been going on there, is quite new in terms of measuring the process and the outcomes. It could be that, because of the infancy of the early years collaborative and some of the work around it, we all still need to get to grips with how it really makes a dramatic difference. It was an inquiry into preventative spend, it was in 2011, and it was basically about your early intervention programme. I believe that Ms McKenna might have been a witness. We can come back to that, because that is a trickier question. I think that it is important that we manage to evidence these things. Sorry, Mr Rowley. I am puzzled as to where you pulled that out from, convener. I think that it is just in terms of the early intervention aspects and gathering up the evidence to ensure that the resource that is going in is getting the outcome that we require, and that we are actually evidencing that as well. Back to the point about how we influence policy: I certainly favour a debate in Scotland around education and where we go, and around the importance or not of early intervention and the evidence for it. I am interested in the educational point of view, and perhaps in how the directors see it, in terms of how we use evidence, what we have and what else we need. Some people say that, for example, if you are going to put a major investment into early years, it will be 10 or 15 years before you will be able to prove that that worked. Others say that that is not the case and that you should be able to show it sooner. I would be interested in that point of view.
My final question, again in that area, is about how we go forward in terms of education, training and skills. In terms of employment, it seems to me that education, training and skills are absolutely key. However, there are the links between vocational and academic routes, the links with colleges and the question of at what early stage they begin. To take my own constituency briefly, we have the aircraft carriers being pulled together, with parts that have been built in Glasgow. What I am finding with those employers is that they are recruiting all over Europe because they cannot recruit the skilled labour in the local area. Again, how do we measure what the links are with businesses? How much is education actually working with businesses to ensure that kids are able to get the qualifications that allow them to then develop and get the skills? How do we use all this information, as policy makers, to try to direct future priority and future spend? Shall we go to the educational professionals first and then the politicians? Ms Shaw? I will come back to the school leaver destinations and the links to employers; I think that that is an area that we can strengthen in East Renfrewshire. In terms of early intervention and being able to measure its impact, I think that Councillor Curran is right to point to the work of the early years collaborative. Youngsters' progress against developmental milestones is going to be measured at the 27-to-30-month assessment, again on entry to primary school and again in primary 4, so there are opportunities to gather evidence to show the impact of that early intervention. Early intervention is more about working with families. I know that, in Glasgow, they are working very hard on it, as we are in East Renfrewshire, to make sure that we make that difference at as early a stage as we can.
With youngsters, before they reach nursery age and become three, it is about identifying them and working with our colleagues in East Renfrewshire's CHCP to make sure that we identify them. There is a way to go to make sure that the measures that we gather are robust. I am not sure that we have a coherent set of assessments that is consistent across the country. Certainly, even within East Renfrewshire, it is not consistent, but we will be able to see results from that. We gathered a baseline last year in line with the national target, and we will see some impact of that this year, because of the family-friendly approaches that we have taken in our pre-5 centres. I am busy racking my brains on the 2011 inquiry; I have been to the Parliament a few times. I think that it was three years ago and we have moved forward. I think that, as Ms Shaw says, the evidence to support preventative spend, as Mr Rowley was saying, is long term, and there is still a lack of it. I do not think that Glasgow is any different from anywhere else regarding that coherent set of indicators that would allow you to say, am I making a difference? Our work is very much focused on the third sector, families and nurseries, and we work very closely with our health colleagues across the boundaries. There is shared learning because it is NHS Greater Glasgow and Clyde. I know that East Renfrewshire is also working along the same lines in terms of looking at family centres and the support that we provide there. Our focus in Glasgow, in my view, because of the scale of the challenge that we face, is that we work very much with the third sector and look to see where we can maximise the support from the third sector, which I believe is much better placed than the statutory services to make an impact in local communities. It is more than just what happens in a nursery; it goes beyond the doors of the nursery. 
The nursery can be a catalyst and can pull partners together, but if we are going to effect systemic long-term change, we need to look at how our communities function, how they are working as families and how we can support those families. Although there is no coherent agreement, we have been undertaking over the past three years a significant amount of research in partnership with the Centre for Population Health, looking at longitudinal evidence of what differences our interventions are making. One of the things that we started as long ago as 2011 was also stretching the age range from the early years collaborative, which, when it started, was very focused on under-fives. We always said from the outset in Glasgow that we needed to keep it at zero to eight, because our children continue to experience difficulties and families take time to build their capacity. In terms of policy making, one of the challenges that we have faced in the city with our college partners is being able to build the resilience of families, build their confidence, help them with their literacy levels and signpost them on to employment and further training, but being unable to get them to access college places, because the funding became focused on 18 to 24-year-olds and some of our vulnerable parents were 25 plus. That has been particularly challenging. I sit on the board for the Glasgow region and it is an area that we are looking at to see how we can assist with that. I do not know whether you want me to go on to the business partnerships that Mr Rowley mentioned. Vocational education is clearly a critical area for us and we have been making slow, steady progress in terms of our positive destinations. They have been hard-fought gains and we are making little steps every year to improve and to close the gap on the national picture. 
We have been particularly focusing on raising expectations and aspirations, so our biggest gains have been around higher education, which is delivered in both colleges and universities, and that is up again this year. In 2013, we increased by 2.5 per cent when nationally the figure dropped by 0.6 per cent, and we are particularly proud of that. We recognise that our business partnerships are absolutely critical. As an education service, we have spent a lot of time getting young people ready for businesses, looking at employability skills and lots of programmes. I think that what we have not done well is getting businesses ready for young people. Businesses, particularly small and medium-sized enterprises, which make up the biggest range of businesses in Scotland, are challenging to work with, because it is a big, big decision for a small business to take on a young person and for it to understand and be able to respond to that young person's needs. This year's challenge for us is working in partnership with the Chamber of Commerce in the city to look at how we can work better with our small and medium-sized businesses, but also to look at the senior phase programmes to see whether we can build better pathways, perhaps not using traditional attainment measures such as Highers but looking at national certificate pathways that would be delivered between schools, colleges and businesses, so that young people start to get business experience from a younger age, perhaps a day a week, and moving forward from there. I am aware that we are now straying into various realms of education policy, and I do not want to upset the education committee in that regard. In terms of the benchmarking aspects, could we try to stick to that aspect in the main, and could we temper the questions to that? I think that the point is that the benchmarking has to sit alongside the Wood commission's work on developing Scotland's young workforce. 
Obviously, the points that Mr Rowley made in terms of that are where we very much have to sit, because the work that Ms McKenna has done in terms of business partnerships, and also the relationship with the college sector, is important. I suppose that the critical issue for me is that we signed up to a single outcome agreement as a council. The benchmarking is based on a council perspective with partners; the colleges and universities are in a different environment and under different arrangements in terms of their remit. It is an important way for us to look at it in terms of answering the specific points around how we make young people ready for the job market that is out there, but also how we measure ourselves against our colleagues in other parts of the country. I am impressed that you got back on track there, and did it very well indeed. As far as vocational education is concerned, we certainly welcomed the Wood commission recommendations. We certainly need to link more with business. As far as early years and early intervention are concerned, Ms McKenna covered it very well. We need to get to the communities, work with families and ask what they need of us, not impose what we think they need; it is all about communication. On the positive destinations, what do you mean by positive destinations? It surely depends on where you are. As Mr Rowley mentioned, in Fife they had to look for engineers and shipbuilders. You have focused a lot on positive destinations, particularly Ms McKenna and the rates from Glasgow schools. What exactly do you mean by positive destinations? Directing pupils? Are you directing them? Are you giving them vocational guidance? How do you monitor it? Can we temper that slightly and say, how do we benchmark? Benchmark, that is what I really meant. Positive destinations are youngsters who are going on from school to higher or further education, employment or training. Those are the measurements that are shared. 
In terms of the work of Skills Development Scotland, that is published on an annual basis. That information is shared again through the family groups that we have been working on in the pilot drawn from the LGBF, with positive destinations or the school leaver destination return. In my own view, I think that that is the fairest measure of a school. In relation to where it comes from, you can get lots of schools in Glasgow, for instance, outperforming many schools in East Renfrewshire in terms of youngsters going on to positive destinations. They will not be going on to the same destinations. We undoubtedly have a large cohort of our youngsters who go on to higher education, but what matters is that youngsters are going to the correct destination and that that destination is sustained. The follow-up measure in March is often a better measure of how successful we have been in getting those youngsters into, or on to, the right pathway. Sharing information through that family group that we are already working with, really sharing best practice and looking behind the actual statistics that are published, is the way forward in making sure that we seek and indeed secure improvement. An example for you, convener, would be the number of pupils who gain level 5 or level 6 qualifications. You look at East Renfrewshire's statistics, for example, and it would appear to be streets ahead, and rightly so, in terms of its focus on that. In terms of what we are doing, we want to see ourselves measurably improving against other comparators, but that specific figure is vital and important for us to show that we have lifted the expectation. Those are two important benchmarks, but the positive destination measure goes back to Mr Buchanan's point. We need to see more of our young people seeing higher education and further education as where they would expect to be. That is something that we are very much focused on. 
When we can see that particular improvement over the past year, when we had growth in young people going into higher education while there was a slight dip in the Scottish figure, there is a warm feeling across Glasgow in terms of all the schools seeing that they have made a dramatic difference. Do you monitor after they have left school? Do you benchmark after they have left school? Is it just when they leave school? Skills Development Scotland contacts them a year after to ensure that the destination remains the same, but we can certainly look at more work on that. That is why it is important that the bigger picture, through the whole process of education, is vital in terms of the benchmarking and being part of that picture. Good afternoon. I am trying to drill down into the benchmarking framework, particularly for Glasgow, and there may be issues in terms of East Renfrewshire, but I know that in the area that I live in, in part of the constituency that I represent, there are approximately 450 secondary school pupils who are bussed from the east end of Glasgow every day into Coatbridge to a high school there. How do those attainment levels, achievement levels and positive destinations get measured in relation to the benchmarking framework? Those pupils are being educated not by Glasgow City Council but by North Lanarkshire Council. How do those figures and pupils get measured in relation particularly to Glasgow City Council's benchmark figures, given that they are not being educated in Glasgow? They are part of North Lanarkshire Council's secondary school system. The young people who travel into the schools in Coatbridge are attending there by right, because their primary schools are associated with that secondary school, in the same way that in West Dunbartonshire there is a primary school that sits in Glasgow and historically is associated with it. 
We had the same situation until recently with East Renfrewshire, which had to change because of the pressures on the building that it had with the house building growth down at the south-west of the city. Although the children are educated at primary level in Glasgow, the responsibility for the secondary school system is part of North Lanarkshire's. However, we work closely with North Lanarkshire and we would not consider removing them, and we continue to see them as Glasgow City Council children. I think that Ms McKenna has picked me up wrongly in terms of what I am trying to get at. In drilling down into the benchmarking framework, I am trying to find out in which benchmarking framework those children get measured: is it North Lanarkshire's? Bear with me, Ms McKenna, because some of those children are coming from the most deprived areas in Glasgow, and therefore some of the figures that are being produced for Glasgow in terms of the framework may be being skewed, because they are not accurately measuring residents of Glasgow, that is, children who live in Glasgow but are being educated in a neighbouring authority. I am just trying to find out whether there is any way that we can address that issue to ensure adequate resources and adequate measuring, because, as you have said in terms of attainment, achievement and positive destinations, there are potentially 450 children who live in Glasgow whose measurements are being placed against, or set alongside, another authority rather than Glasgow City Council. It is whether or not we are adequately measuring that in a way that takes full account of the potential long-term problems. I am talking not just about education but about beyond the education years for Glasgow city in terms of the measurements that are coming out of the benchmarking framework. Ms McKenna? 
I suppose that some Glasgow children, or their parents, use placing requests to go into different authorities all over, and I suppose that the challenge would be how you unpick that, because a number of our children are mobile, aren't they; families choose to go to different schools, and that is a challenge. I am not sure; I have to confess that I have not thought about it in those terms, and I am not sure how you would unpick or unpack those statistics. I mean, we have to work in partnership to have the assurance that the young people are getting the best possible opportunity, and certainly our psychological services work closely with North Lanarkshire for any young person transferring, but I have to say that I am struggling to think about that in terms of the benchmarking tool. I notice that others want to come in. I think, in some regards, from a tracking point of view, where you have kids who have gone to primary school in one local authority area and then suddenly go into another local authority area, that skews the yearly tracking in some regards, I am sure, if kids are coming from more deprived areas with traditionally lower levels of attainment. It may well skew some areas the other way and show that folks are doing better in secondary, because it may well be that kids from affluent areas are going to poorer secondaries. Ms Shaw? Thank you, convener. I think it would be fair to say that the LGBF is slightly out of kilter in this aspect, because we would always use information to help our schools to improve, as they would in Glasgow. It is more difficult to use that as a measure of an authority that is or is not educating a child, and I think that that is what Maureen is suggesting. I think that there is an opportunity for the LGBF to be brought into line with the other measures that we have in terms of the attainment of youngsters who attend our schools as opposed to where they reside. 
We would always say that we take responsibility for any children who are in our schools, regardless of where they stay. They are East Renfrewshire pupils and we teach them and hope that they learn and achieve in the same way as any of our other children who live in East Renfrewshire. It is a very important question that Mr Wilson has raised. I suppose that there are two aspects to it. One is the catchment area, where you can already plan for the young people in the anticipation that they go to that school. Of course, they can still have parental choice around denominational and non-denominational schools, and they can have parental choice around placing requests outwith the catchment area. I think that every year in Glasgow we have around 3,000 applications for placing requests into primary 1 and S1. That is almost entirely within Glasgow in terms of that number, but it makes things quite difficult to plan, so you need to know the young people very well. That is where the benchmark is important at a higher level, but drilling down into it is really significant. The other important point from a Glasgow perspective, and it is the same in other cities particularly across Scotland, is the number of young people who are coming to the city for the first time from outwith Scotland. We have about 15 to 20 per cent of young people who have English as an additional language, and 15 per cent from a black and minority ethnic background. Every year in Glasgow now, we see 2,000 new pupils presenting at school with a range of language needs. So how do you measure that in terms of the benchmarking and have a fair assessment of the needs that they have to have met in the school? It is not easy to do that, but understanding the wider picture is very significant. It is a really important question about understanding who the community is that you are serving. 
In a Scottish context, it is easier, but you are right that we need to have a better, clearer picture in terms of explaining who is where and how that impacts on the delivery of the service. In the paper that you submitted, you highlighted a very interesting point, and I paraphrase: it is important that Glasgow continues to actively benchmark against other suitable authorities within the wider national and UK context. I have asked some of the panel members this morning about their UK connections. Can I ask what Glasgow's are? We look at Manchester and London because of the scale and the numbers. As Councillor Curran said, 42 per cent of our children live in the 10 per cent most deprived postcodes, which is about 27,000 children and young people. There is not another authority that is close to that in terms of percentage and scale, so it is important that we keep that outward look. We linked with Manchester for a piece of work, particularly around the games, looking at synergies between the two councils. More recently, I have been looking very closely at some of the London challenge work and the impact that that initiative had in raising young people's attainment and their aspirations, given the level of deprivation that was associated with it. It is not just a one-way process. Yesterday, we had a headteacher from a London school up to look at the dramatic improvements that we have seen in terms of improving attendance at school and reducing the level of exclusions. That is fundamentally the most important thing to do if you want to raise attainment: to ensure that the young person is at school and is able to continue at school. We certainly have a very good understanding and a close partnership, particularly with London and Manchester, because of the challenge work that was conducted by the UK Government in earlier years. 
That is similar to the raising attainment agenda that we share with the Scottish Government and other colleagues across Scotland. A lot of resources went into that work, and we know that that resource is not there now. That is an important aspect in terms of measuring the relationship that we have with colleagues elsewhere in these islands. Do you look at any other European or world cities that are comparable to Glasgow? Yes. Obviously, the PISA results come out on a basis that is measured nationally, so what does that mean for us in the context of Glasgow? Obviously, the Commonwealth Games has been a fantastic opportunity for us to have international education as a big focus in our schools. Around the Commonwealth family of nations, there is a partnership with UNICEF around children's rights, and we have some very good developing work going on there. We are certainly looking at work that is going on, for example, in Canadian cities, where they have a different approach to early years and perhaps more focus on four-year-old full-time places rather than two-year-old and three-year-old places. We are developing a lot of relationships, certainly outwith the Scottish picture, but the benchmarking is our bread and butter, because that is how we will be measured and assessed in our communities. My next question was about how that has impacted on the budget settlement, but you have just answered it and given a fine example of that. You heard me ask the others, I am sure, about the fact that everybody thought that there would be pelters when the framework came into play. Some folk were talking about the possible press headlines. That has not happened, because your authorities are managing that particularly well and giving explanations for the differences that exist. 
I think that a number of journalists, although they like the headlines, recognise that benchmarking is very important, that the context of a local authority is particularly critical, and that we take the approach that statistics and benchmarking do not provide answers; all they do is raise questions that allow as many people as possible to engage, and it is that dialogue that brings about the improvements, not the benchmarking in itself. A Glasgow perspective on benchmarking is that there is a recognition that deprivation is a significant factor in determining the outcomes for our young people. Certainly from my own perspective, I feel that if you have a political leadership and direction that sees this as important, if you have a professional commitment, particularly from the staff in your schools, and if you have an openness and confidence around being able to explain why you are in a particular place, and also why you are perhaps not where people would anticipate you to be, for good or bad reasons, then that sharing of the information is obviously good in terms of being accountable to all the people whom we represent. I think that, because we have really been doing this from the inception, as I said earlier, in East Renfrewshire Council, it is something that we are comfortable with; we are comfortable with comparison. It is something that elected members really appreciate, and parents can see the information that is out there; they can see exactly where we are going in the authority, and if they do not like it, they would certainly let us know. It is so easy to do it when you are right at the top of the tree, of course, and that is one aspect of it. Michelle? I do not think that I have anything further to ask after what the convener has said. Well, thank you very much for your evidence today. I will suspend the meeting for just a brief moment to allow the witnesses to leave. 
We can now move on to agenda item 3, which is to consider petition PE1469 by Aileen Jackson, which calls on the Scottish Parliament to urge the Scottish Government to consider a change in planning regulations to enable an increase in the current neighbour notification distance of 20 metres in relation to wind turbine planning applications. As members will recall, we took evidence on this petition as part of our scrutiny of the third national planning framework. We have a paper before us today that sets out the actions that we have taken on the petition since it was referred to us in December last year. During our evidence taking on the petition, the Scottish Government indicated that it did not think it necessary to change the current neighbour notification distance, but the minister, Mr Mackay, informed us that the Government would look at issuing new best practice guidance on the notification system for wind farm applications. We have now received correspondence from the Scottish Government confirming that and setting out a timetable for the development, consultation and publication of such guidance; that is attached to the paper that we have before us. Can I ask members to look at the paper and say whether they have any comments on the issue or the petition? In which case, can we agree to write to the Scottish Government acknowledging the actions taken on PE1469 and the drawing up of the aforementioned guidance? Can we request that the Government ensures that the petitioner is specifically consulted on the proposed draft guidance and that any views that she expresses are taken into account by the Government before it finalises the guidance? Can we ask that a copy of the finalised guidance be provided directly to the petitioner and that we be notified by the Government when the guidance is published in spring 2015? 
Can we request that the Government ensures that the finalised guidance is properly publicised and brought to the attention of all planning authorities in Scotland, as well as all those making applications for the development of onshore wind farms and any other relevant persons or organisations whom the Scottish Government considers appropriate to notify? In light of the Government's decision to issue guidance on neighbour notifications as a result of PE1469, there would appear to be no further reasonable action that we can take in relation to the petition. Therefore, are we agreed to close petition PE1469 with immediate effect and to ask the clerks to write to the petitioner and the Public Petitions Committee to notify them of that decision? Before we finish, I would like to take this moment to thank the petitioner, Ms Aileen Jackson, on the record for her petition. I feel that this is an excellent example of how an individual, using the Parliament's petitions system, can effect meaningful change in important areas of public policy such as the planning system. I would also like to thank the Public Petitions Committee for the work that it undertook on this petition before it was referred to us. As agreed, we now move into private session.