Good morning, and can I welcome everyone to this 31st meeting of the Public Audit Committee in 2022. The first item on our agenda is for members of the committee to decide whether to take agenda items 4 and 5 in private. Are we all agreed? We are agreed. Thank you. Agenda item 2 is to consider whether to take our next meeting, which is to be held on Thursday 12 January, in private. Are we all agreed? We are agreed on that, too. The principal item of business for the committee this morning is to consider the Auditor General for Scotland's section 22 report on the 2021-22 audit of National Records of Scotland. Can I welcome our witnesses this morning? The last time you gave evidence to the committee, you were remote, so I am very pleased to welcome into the committee room Graeme Samson, who is a senior auditor with Audit Scotland; Dharshi Santhakumaran, who is an audit manager with Audit Scotland; and, of course, the Auditor General, Stephen Boyle. You are all very welcome. We have some questions to ask you on the section 22 report, but first of all, Auditor General, can I invite you to make an opening statement?

Many thanks, convener. Good morning, members. I am presenting today's report on the 2021-22 audit of National Records of Scotland under section 22 of the Public Finance and Accountability (Scotland) Act 2000. Convener, I have prepared this section 22 report to update Parliament on progress since my report on the 2020-21 audit, which highlighted the challenges that NRS was facing with delivery of the census programme and the financial impact of the decision to delay the census to March 2022, given the Covid-19 pandemic. Since I last reported, the census went live as planned in February but achieved a response rate lower than the 90 per cent target. NRS decided to extend the collection until the end of May, resulting in an overall response rate of 89 per cent.
NRS estimates that the extension will add a further £6 million to the lifetime costs of the census programme, and the Cabinet Secretary agreed to fund that additional expenditure. NRS is taking steps to mitigate the impact of the lower than expected response rate on the quality and robustness of census outputs, including the Registrar General's decision to establish an international steering group of experts to provide support and guidance. NRS is confident that it is still on track to produce high-quality census outputs, but it will now be more reliant than originally planned on the use of administrative data to support the production of its population estimates. It is important, therefore, that NRS continues to be transparent as it progresses the significant remaining work to conclude the census programme. As ever, my colleagues and I look forward to answering the committee's questions.

Thank you very much indeed for that helpful opening introduction to the report. Can I begin by asking a factual question, really? The report talks about something called the census coverage survey. First of all, could I ask you to explain to us what that is?

Yes, convener, I will start, and I am sure that Dharshi might want to come in and comment further. We refer to the census coverage survey in the report. It is, effectively, a survey exercise that takes place after the completion of the original census. Dharshi might want to say a bit more, but it looks to explore some of the reasons why people did not participate in the census exercise, in part to understand people's choices and some of the behaviours in the data. We highlight in the paper that it goes into exploring some of the breakdown of why people did not complete the census; the most significant explanation given was that people were too busy, along with a host of other reasons why they did not complete it.
Before I go into any more detail, convener, an interesting point on the results overall is that, although the overall return rate for the main census was below target and also below the return rate from the 2011 census, that is similar to what we have seen in other recent surveys, in which completion and engagement rates have also been lower. I may have said more than enough, but I will pause in case Dharshi wants to add anything.

Yes. Just to clarify, the census coverage survey is something that is used in Scotland, but the rest of the UK and other countries use a similar methodology to follow up from the census. It is essentially about filling in some of the gaps: identifying the number of households, and some of the characteristics of the households and individuals, that did not respond to the census itself. The sample size is roughly about 1.5 per cent of the population. There was a separate, smaller survey that NRS carried out that explored some of the reasons for people not responding. The coverage survey is part of the standard census methodology; it helps to provide more information about the population for the census estimates.

From memory, the return rates in England, Wales and Northern Ireland were about 97 per cent, were they not? Did the census coverage survey focus on the reasons why the 3 per cent did not respond, or did it have a wider role to play in giving more qualitative information about the census return?

The coverage survey itself is not so much about the reasons; it is part of the methodology to provide information about the households that did not complete the census, but it is not about the reasons, if that makes sense. There are three pillars of the census methodology, as the international steering group describes it: the census itself, the coverage survey and the use of administrative data.
Those three things combined are used to produce the population estimates and to ensure the quality of those estimates.

I turn now to administrative data. The committee took evidence from you almost a year ago, at the end of last year—that was evidence from Audit Scotland on the report at that time, which was in advance of the census being carried out. We then took evidence from National Records of Scotland in January of this year. We had exchanges with them about administrative data and how it would be used. We were told that it would not be used to fill data gaps, but we learned, after the census return rates came in, that administrative data will probably be used more heavily than we originally anticipated. Can you tell us what the implication of that is for the quality of the census data that we will have?

Yes, convener. It is fair to say that administrative data, as Dharshi describes, needs to be used more significantly in Scotland than had originally been planned, as a result of the census completion rate that has been settled on of 89 per cent in Scotland—marginally below the 90 per cent target, but within that there is a bigger story. As we set out in the report, Scotland has 32 local authorities. The completion rate target was met in 30 of those local authorities, but there is quite significant regional variation; for example, as we set out in the report, the completion rate in Glasgow was 83 per cent. Therefore, other approaches have to be brought in to give robust, reliable data. Administrative data forms one significant part of how to arrive at reliable population estimates, which is where the census programme needs to get to, for all the very important reasons that sit behind it. There is one last point that I would make about administrative data.
NRS has brought in advice and guidance to support the completion of the project to that reliable position, notably the international steering group led by Sir Ian Diamond. Sir Ian gave evidence to the constitution committee, and I think that his judgment was that the overall arrangements that are now in place provide a solid foundation for producing robust, reliable population estimates. That is a process that NRS needs to go through over the course of 2023. There is a very important role for its regulator, the Office for Statistics Regulation, which we know is working with NRS to arrive at that final point. However, to go back to your original point, convener: yes, administrative data will play a more significant part than had originally been planned.

Are there or are there not more risks associated with relying on less primary data and more administrative data?

I am not sure that we are in a position to give you a definitive judgment on that. Primarily, the regulator—the OSR—and the steering group have said that this is not an inappropriate methodology for the compilation of robust population estimates. It does not lead us to a point of saying that it is not an appropriate route for NRS to take. I think that it is fair to say that NRS did not plan to have it this way, but that does not invalidate the overall results. Ultimately, that will be the judgment of NRS and the OSR, if that is where they get to.

You referred to that a few minutes ago. Do you have a view on the fact that a target of a 90 per cent return rate was set against a backdrop of a 97 per cent return rate in the other parts of the UK? Does that lack ambition? If other parts of the UK, on a digital-first census, managed to achieve return rates of 97 per cent virtually 12 months before, why was the ambition only to get a 90 per cent return rate?

If I may, part of that is a question that NRS is probably better placed to answer.
From our analysis, drawing on the 2011 census results, Scotland's completion rate was lower than that of other parts of the UK—Scotland's return rate was 94 per cent then. Although you rightly say that England, Wales and Northern Ireland had a 97 per cent return rate for this census, drawing on other sources, we are seeing a reduction in household engagement with, and return of, surveys. Coming back to where NRS will go next for the next iteration of the census, I think that it is becoming much more likely that wider sources of data will need to be used—not just in Scotland but likely in other jurisdictions as well—to arrive at population estimates. Bear in mind that the origins of the census methodology go back hundreds of years, when there was no alternative other than door-to-door household surveys; we now have alternatives in a broader suite of data. It is for the statisticians, not for us, to make that judgment, but, if that is the judgment of the OSR along with the expert panel, it is likely to be the direction of travel. What matters is that, wherever we get to, that is transparently set out and that people who will use the data and scrutinise it in the Parliament can have confidence that the overall end position still produces reliable estimates.

Okay. Do we know at this stage whether the next census will be in 10 years' time or nine years' time?

We do not, convener. Ultimately, that would be a political choice, in conjunction with NRS, for the years to come.

Okay, well, let me steer away from politics, then, and ask a final question from me at this stage. You may recall that we had some quite detailed conversations with NRS about their access to administrative data. I think that it was the case that the ONS had much more extensive access to HMRC data, for example, and DWP data—there was a much broader suite, which was a result of agreements that they had entered into.
Now, we were told that those were not data sharing agreements entered into for the sole purpose of the census—they were entered into for other reasons—but, nonetheless, it meant that the ONS had much wider access to much more comprehensive data than National Records of Scotland had. My recollection is that, when we quizzed National Records of Scotland, they said that they were looking at Scottish data sources to try to improve, across the range of work that they do, the data access that they have, but that they were still at quite an early stage in that. I guess that what the committee would be interested to learn, if you can help us with this, is the extent to which any progress has been made on those data sharing agreements that were spoken about in January of this year.

I would perhaps like to say a bit about some of the gatekeeping governance arrangements that exist, and the progress that NRS is making to access those additional sources. At a high level, NRS has identified a range of administrative data that, in its view, is necessary, taking the guidance of its international steering group—be it electoral registers, NHS registers, student data and some school data. All of that suggests that there is a clear path from NRS on what data needs to be sourced in order to get, ultimately, to the end result of reliable population estimates. We are not detecting any barriers in place, but, ultimately, people need to have confidence that their data is secure, is being used properly and is being used consistently with the purposes that were originally intended. We can say a wee bit more—Dharshi has more of the detail—and NRS might also be in a position to update publicly and transparently, as it has committed to do.
NRS is working with the support of the steering group to get access, or extended access, to administrative data for the purposes of the census population estimates. There are two panels that applications for access to statistical data sets have to go to: the Public Benefit and Privacy Panel for Health and Social Care, and the Statistics Public Benefit and Privacy Panel, which controls access to Scottish Government data sets and census data. As I understand it, NRS is hoping to have secured all the necessary permissions by the end of the calendar year, and it will also have to come to data sharing agreements with the organisations that hold the data that it is looking to access. That includes electoral register data, NHS register data about people who have registered with GPs and the school pupil census, among others.

Sorry, Dharshi—just for the record, today is 15 December. Are you saying that that will be within the next two weeks, or is it the next calendar year that you are talking about?

It is this calendar year; that is what NRS reported to the census programme board in September. We have not had any further updates on that, so NRS would be best placed to update you on whether it has been able to secure those permissions or not.

Okay. As a committee, we will need to consider whether we want to pursue that with them. Can I invite Willie Coffey to put some questions to you?

Before I ask a couple of questions about the work that remains to be done, and some questions about the digital aspect of the census, is there a standard—an industry standard or otherwise—percentage return rate that you should look to reach to get a representative sample of the population in something like a census? What is the percentage figure, or is there none?

I am not sure, Mr Coffey. I will turn to colleagues to see whether we know the answer to that—whether there is a reliable number or a threshold that must be reached.
Given that there is variation between the ambitions set in different parts of the UK, we can assume, as with any statistics, that there is a margin of confidence on some of those numbers, but I will pause and see whether colleagues can add to that.

I am not aware of an industry standard, as it were, but we know that NRS considered an overall response rate of at least 90 per cent to be what it was aiming for. It was hoping for 94 per cent or above, because that is what it got in 2011, but it considered 90 per cent to be the target.

I am really curious about that, because we talk about target rates and then response rates, and they are entirely different things. As we all know, the UK target was not 97 per cent—it was 94 per cent. However, I am curious about why we think that we are significantly above or below something when we do not know what we are trying to reach for the survey to be valid. I note that, in your report, Auditor General, you say that the census target response rate for Scotland's local authorities was 85 per cent, and that that was exceeded. I also note, from the ONS report on the census carried out in England and Wales, that the target there was 80 per cent for local authorities in England and Wales. Why do you think that a lower target response rate was set for England and Wales compared with Scotland?

I fear that I may not be able to give you a satisfactory answer to that, Mr Coffey. The variations in the methodologies, and how robust they are, are probably questions that NRS and the ONS will be better able to answer. What we have looked to do is to take a view on whether NRS ultimately, for spending approaching £150 million of public money, achieved the outcomes intended in terms of a robust census that supports population estimates.
If I may broaden out to the local authority completion rates: although NRS achieved the target in 30 out of 32 of Scotland's local authorities, there are still gaps to fill, and NRS, together with the regulator, needs to be satisfied that, ultimately, there are robust results. Regrettably, I am not able to give you a helpful answer on the differences in methodology between Scotland and England and Wales.

The point that I am trying to make, convener, is that, if you set the target lower and you exceed it significantly in response rates, it looks as though it is a better performance than you might otherwise expect. The higher the target that you set, the more difficult that is to do. I move on to another question on your report. In one of your key messages, you talk about the significant work that remains to be done to ensure that the census delivers those robust population estimates and other outputs. Can you tell us a little more about what that work involves?

I am happy to do that; Dharshi may want to supplement any of my contributions. As we touched on briefly with the convener, overall, NRS has identified that it has a gap between the household census results and its targets. There are regional variations that require it to bring in other sources, primarily administrative data. That is the process that we understand it is going through at the moment. That is work that NRS will do over the course of 2023, Mr Coffey, with further reporting that year and then into 2024, before it completes the overall programme, validated with the judgment of the regulator that these are robust, reliable population estimates. The OSR, in its work earlier this year, called for that programme of work to be done transparently, so that the Parliament and the public have a clear understanding of the work that is being undertaken.
We have touched in the paper on the OSR's view that that is becoming a more transparent process from NRS. Effectively, the analysis of the results, together with the filling in of the remaining gaps through the use of administrative data, is a brief summary of what NRS is currently undertaking.

Thank you very much for that. Could I turn to the digital aspect of the census and ask for your views on whether we were hampered in Scotland by the public's access to digital devices to complete the online survey? Maybe you could start by telling us what the difference was between this census and the last one—there was a big digital, online component to this one that there was not last time. Has that been a significant factor?

There are a number of points in there. This was primarily an online survey, which households in Scotland were asked to complete, with appropriate guidance and signposting from NRS. There were still many people in Scotland who requested a paper copy—some 600,000 or so paper copies were issued, with half of those returned. It is absolutely the case that Scotland has moved to an almost entirely digital survey. On your question about whether people were hampered by access in their ability to complete the survey, we have not analysed that yet. Going back to one of our key recommendations in the paper, it is important that NRS, as part of its remaining work and evaluation, goes through that process to understand the choices that people made in completing the survey and why people did not complete it. We have spoken with the committee through much of our recent reporting about access inequalities and so forth. Whether that was one of the key factors behind Scotland having a lower than anticipated return rate, and what that means for the reliability of the data, is a key question, and it is a key responsibility for NRS to understand what barriers there were to household participation in the survey.
Will that work look at variations across the population in the level of access that people have to IT, skills and so on? I was well aware, when I ran the cross-party group on digital exclusion, that there were huge differences, even within Scotland, in people's access to digital technology to do that. Despite their willingness to participate in the online world, there is still an issue about whether they actually can do that. Will that further work try to investigate the portions of the population that could not participate as fully as they might have wished to?

I will start by answering that. I think that that is a really important process for NRS to go through now. I have mentioned Glasgow a couple of times already in my responses. I have not really reached a view on whether deprivation and digital exclusion directly parallel participation in a survey in an electronic context. However, there are some telling indicators about where deprivation exists in Scotland and the participation rates in a survey, digital or otherwise. There will ultimately be important lessons for NRS to learn, whether they are to be applied in however many years' time for a future census. Any population survey that is undertaken has to think about how people can most easily participate, in order to get reliable results.

If I may add to that, we do not know the exact details of what will be involved in NRS's evaluation work, but we would expect it to look at issues of digital exclusion and how it could build on that in future censuses. However, we do know that, during the collection period, and particularly in the extension period, NRS was targeting its field force efforts and support efforts on areas where it had identified that the response rate was lower, and that will hopefully have picked up areas where people were digitally excluded or were having issues with completing the census online.
Obviously, NRS had a helpline and other means of supporting people either to complete it online or to get access to paper copies of the census.

Okay. Thanks very much for that. Anecdotally, one of the things that I picked up around the time of the census was that people who did not complete the census in one go discovered, when they went to log back on, that they had to start right from the beginning. I do not know whether that was a feature common to the systems that were applied in England, Wales and Northern Ireland, or whether it was a deficiency in the Scottish system that perhaps drove down the completion rate. I do not know whether you have picked that up at all.

I do not know, convener. I can well understand why that would be a deterrent for people to re-engage with the survey. It feels like a really important area for NRS to investigate as part of the overall validation judgments that it makes on the programme.

Thank you. I am going to bring in Colin Beattie, who has some questions.

I would like to go back to something that you started to talk about previously, I think in the conversation with the convener. The rest of the UK—England, Wales and so on—had access to other databases that enabled them to boost the effective return levels of the census. Have you any idea what sort of percentage improvement was achieved because of that?

I am just checking my understanding of your question, Mr Beattie. I am not sure that there is a direct relationship between using administrative data and the overall survey return rates such that it boosts the overall return rate. Our understanding is that it is a parallel use of data to produce ultimately reliable population estimates, as opposed to administrative data equating to a percentage of the return rate in lieu of members of the public completing the survey. I will pause in case colleagues want to correct my understanding.

That is our understanding.
So that sort of improves the quality of the return, then, not the contrary?

Correct.

Okay. That is good to know. In your report, you state that, in order to improve the response rate, NRS field force staff offered support and assistance across Scotland and particularly focused on areas that had lower response rates. Can you tell us which areas had lower response rates and what the additional support that is referred to was in those areas? Was it just more of the same, or maybe something extra?

I am happy to do that, to the extent that we have that detail. Again, Dharshi or Graeme may wish to come in further. I have mentioned Glasgow a couple of times already this morning as being the council area with the lowest return rate in Scotland, at 83 per cent. What does that mean? There are connections with Mr Coffey's line of questioning as well. Where there is a lack of access to digital devices, or where it is the preference of a household, people are given the option to complete the census digitally or on paper. To support the completion rates, NRS employed a field force of temporary workers to go around people's doors, to remind them of the importance of completing the census, to offer assistance where necessary and to follow that up, as required, with additional visits. It is quite a traditional approach, as many of us will recall from censuses in previous decades, of actually going door to door to support, remind and encourage people to complete the survey.

Do we have any information on, for example, a place like Glasgow and whether any particular ethnic groups returned at lower rates than others? It is quite important that we get that outreach.

Dharshi might know. Before she comes in, I will say that it is an incredibly important part of the survey validation. It is not just about getting up to a certain percentage of confidence that the survey results are representative of society.
To get to robust population estimates, they have to be tested against a range of parameters in terms of protected characteristics groups, all of which will drive funding formulas and the delivery of public services over the course of the next decade. It is really important that all that thinking is part of the methodology, and equality impact assessments need to feature prominently as part of that assessment. I will pause there, Mr Beattie, and check with Dharshi what further we can add.

We do not have information to hand about response rates among specific population groups. NRS should be aware of that, and it will be part of its work, certainly during the extension to the collection period and subsequently. When it comes to the census coverage survey and the use of administrative data, the work is about filling in the gaps regarding the people who did not respond. That means that you have better data for those smaller geographical populations, ethnic groups and so on that are harder to count, if you will. When the OSR reports on NRS's efforts in the new year, part of what it will presumably be looking at will be checking that NRS has robust data for the smaller populations.

So what you are saying is that the data does exist but is just not available to you at the moment?

I assume that NRS will have that, but we do not currently. We do not know that yet. It is a reasonable assumption that, within the data and the analysis and evaluation that follow, NRS, together with the international steering group and the judgments made by the regulator, will go through that process. Ultimately, all those factors will be combined to reach a judgment towards the end of next year and into 2024 that Scotland, as a result of the census, has reliable population estimates, including all the analysis that you referred to in terms of ethnicity and other characteristics.

You referred to Glasgow as a particular area that has had a lower response rate.
Are there other areas of significance where similar patterns emerged?

At a headline level, Scotland met the target rate that NRS set in 30 out of 32 local authorities, the target being 85 per cent completion. However, I think that that tells only part of the story. We mentioned Glasgow, which is below that target, but—this probably follows on from your previous question—that does not of itself determine whether the census will produce reliable population estimates. It flows into what comes next in the evaluation: do they have reliable data for different parts of society, whether that is Scottish index of multiple deprivation data for funding formulas, different ethnicities and all the associated flows of funds? That takes us back to the point that the survey results, plus the administrative data, are the route through to robust, reliable data. At the moment, we are reporting what NRS has got at this stage. There is considerable work to be done over the course of the next year so that, by local authority and by different groups across Scotland, it can produce reliable population data.

Your report also states that NRS is continuing to investigate the reasons for the lower than expected response rate and why it was lower in comparison with other countries. Are you aware of how it is doing that, what progress it is making and whether there is a target date involved?

We are aware of that up to a point, and I can tell the committee a bit more. At an overall level, establishing the international steering group was a positive thing to do, because it has given NRS access to experts in the use of data, alongside expertise on the behaviours of the public as to why they will or will not participate in surveys. Sir Ian Diamond has, in his evidence to the Constitution, Europe, External Affairs and Culture Committee, talked about some of the changes in behaviours and the extent of the public's engagement with surveys being lower than it has historically been.
All of that is part of the work that NRS is currently undertaking. In our report, we encourage that to be reported publicly and transparently: here are the lessons from the 2022 census and what they mean, not just for the next census but for future surveys. I think that it is safe to say, Mr Beattie, that people's behaviours are changing in the extent to which they will willingly engage with survey activity. NRS and other survey organisations have a range of tools at their disposal—administrative data, online surveys and paper-based surveys—so that they can have as broad a reach as possible. All of that is part of NRS's work for the year to come.

You have not mentioned anything about timescales or actual progress.

Two things. The Office for Statistics Regulation is due to publish a report towards the end of 2023 on its work with NRS on population estimates, and NRS has publicly committed—I think that it was confirmed by the cabinet secretary—to produce a report in 2024 on some of the evaluation lessons learned from the census. Those are quite important milestones for satisfying the Parliament and the public about the assessment of the census results. I will stop in case there is anything that Dharshi wishes to add.

Just to say that the full evaluation report is due to be laid before Parliament as the census programme comes to an end; that should be in 2024. The evaluation is supposed to look at the entire scope of the census, including the design and operational elements of the collection period, along with all the subsequent statistical methodology, the processing of the data and the outputs. It should cover the whole programme.

I will bring in the deputy convener, Sharon Dowey.

Paragraph 12 explains that programme costs are estimated to increase by £6 million in 2022-23. The report goes on to state that the actual figure will not be known until the end of 2022-23 and that it will need to be carefully managed.
What are the risks if the costs are not carefully managed, and do you have any concerns? I will bring in Graeme in a moment to speak about the audit work that we have undertaken in the past year on the spending to date. As we say in the paragraph, costs increased by about £6 million this year. We analysed that a little more closely: £3 million in additional supplier costs, £1.7 million for further field staff and just over an additional £1 million for the census coverage survey. Risks remain of additional spending on top of the further costs that we discussed with the committee last year as a result of the extension, but I do not think that we are talking about the scale of additional spending that arose as a consequence of the planned delay, given the pandemic. Nonetheless, given that the project has cost around £27 million more than had originally been intended, it matters that those costs are still kept under close review and managed to keep further public spending on the project to a minimum. I will bring in Graeme in case he wants to share with the committee any of the arrangements that NRS has used. NRS monitors the census spend regularly and reports to its committees and board. It is something that we look at as part of the audit, to test the expenditure on the census and to confirm that it is being closely monitored and scrutinised during the year. NRS has managed to achieve that in recent years; it has kept very close to budget and is managing it well. As the Auditor General mentioned, the extra expenditure for 2022-23 was originally estimated at up to £9 million, but NRS has reduced that estimate down to £6 million. I would say that it is monitoring the spend closely and will need to continue to do so.
Moving on to paragraph 15, on page 6, the report states that, as the census programme progresses through its later stages, it is important that NRS has knowledge transfer plans in place to build on the skills that it has in-house. Do you know what action NRS is taking to ensure that that happens? Colleagues can update you if we have detailed insight into the specifics of what NRS is undertaking. Before bringing in Graeme or Dharshi, I note that we make this comment in a lot of reports when we talk about IT arrangements in the public sector, guarding against a particular risk. Public bodies absolutely need to bring in expert services where they are undertaking a project that would not require those services on a rolling day-to-day basis; they do not employ such people on a permanent contract per se. One of the risks that we have seen over the years is that, once the design or build phase ends, contractors leave the organisation, and the expertise and skills that the public sector has paid significant sums for go with the contractor. It is something that organisations need to guard against, but within reason. NRS is not going to be undertaking the census year after year, but it is a largely digital organisation. Seeing that those skills and the expertise that it has paid for remain in the organisation is really important. We expect that to be part of its evaluation of the census, but, on the steps that it is taking, I will ask Graeme whether there is anything further that he can share. I do not think that I have any further details on that. As the Auditor General says, there was a large use of temporary staff during the census project. It is something that we have commented on in our annual audit reports in recent years, but it is the nature of the work that staff are only needed for a short period of time. NRS will need to look into just how that knowledge is transferred, but I do not have any more detail on the arrangements.
I am now going to turn to Craig Hoy, who has some questions to put. Good morning, Mr Boyle. As we approach Christmas, we should be reminded that censuses are nothing new; the Romans were conducting them every five years, over 2,000 years ago. I want to reflect on what happened in the most recent censuses in England and in Scotland. Last week, there was a report from the ONS entitled "Maximising the quality of Census 2021 population estimates", relating to the census south of the border. One of the main conclusions that the ONS draws is that its "planned flexible approach to collection and well-tested response strategy enabled us to respond to changing circumstances such as the coronavirus pandemic". Focusing on the words that it used, a "planned flexible approach to collection" and a "well-tested response strategy", have you looked at what happened in England in relation to Scotland, to find out what was more flexible in the census there compared with Scotland's, and how the well-tested response strategy in England differed from what was taking place here in Scotland? We have not done that work, but I think that that is exactly the type of analysis and reflection that NRS will be doing, or ought to be doing, over the course of the next 12 months as part of its evaluation of the completion of Scotland's census for 2022. It is not just about how it went, but whether there were opportunities to apply different approaches next time round, and absolutely to learn from other jurisdictions. The fact that an international steering group has been appointed suggests that NRS is open and ready to bring that learning into future censuses in Scotland. I note that NRS is not represented on that group. Is there any particular reason why NRS is not formally represented on the group?
I think that that is potentially quite a helpful thing; rather than the body itself being seen to have undue influence, it can draw on the views of the experts independently. The group is led by the national statistician, Sir Ian Diamond, which gives it a reach and a credibility to support NRS. It is also important to say that we know that NRS is engaging well with its regulator on the overall population statistics. It all speaks to the fact that NRS has a range of sources of expertise to inform judgments about this census and where it goes next. Obviously, the census in England proceeded on the timetable as planned, whereas it was delayed in Scotland. Have you, as yet, been able to make any preliminary judgment as to whether the decision to delay meant that we improved the data capture rate in Scotland? Or might it, for some reason, have impeded the management of the census? Reflecting back on the previous report, NRS, together with the Scottish Government, took the view that it would not have been able to progress to a point at which it would have produced reliable population estimates by undertaking a census in 2021. The decision to delay was, in its view, absolutely necessary so that Scotland could produce reliable population estimates, albeit with the delay as a consequence. You will remember that the committee explored this with NRS in the evidence session, and you have already referred to the discussion that you had about administrative data and NRS's methodology compared with that of the ONS. At the risk of labouring the point, Mr Hoy, I think that that is effectively where NRS is at now: the work that it needs to undertake over the course of the next 12 months is to do that analysis to support learning and reflections from this census, and what that means for the methodology for the next census.
I am sure that this will absolutely be the case for NRS but, given that censuses take place every 10 or 11 years, and changes in behaviour and technology are moving at such a pace, whatever lessons are learned from this census will undoubtedly need to be refreshed in the intervening period, in anticipation of future significant public surveys in Scotland. On the use and impact of the Scottish census data, in your discussion with Mr Coffey, you looked at the digital issue and the issue in perhaps the hardest-to-reach areas, those with the highest incidence of deprivation. Given that those are the areas where Government bodies need that data to be able to work out what policy instruments are going to be deployed, is there now a risk that, unless we focus in on those areas, the hard-to-reach areas where there was a lower response rate, we may actually have a problem moving forward in addressing the social and economic need in those areas? There is a risk if that does not go well, but I think that it is a key focus. Certainly, we know that the OSR is very alert to that risk, and it is a key part of its discussions with NRS. As we have already touched on this morning, it is not just about getting to a particular number in terms of census returns; it is the analysis that sits beneath that, for different parts of society, that ultimately produces reliable population estimates. We are very clear that NRS, the regulator and the international steering group are very focused on that point. The report states that the international steering group concluded that extending the census coverage survey would have a negative impact on subsequent stages of the census programme. Do you know what those negative impacts were likely to have been? I think that it is just in terms of timescales and making sure that NRS is able to produce outputs; it is aiming to get the first outputs published a year from the end of the collection period.
Essentially, if NRS had extended the census coverage survey, that would have had a knock-on effect on its ability to produce outputs. On paragraph 15, you indicate that 42 per cent of overall staff costs consist of temporary staff. The explanation given for that is the census, which must be a major part, as well as specialist IT projects. Do you know what the split between those two is? I will check with Graeme whether we have that information at our disposal; if not, we can come back to the committee in writing. It is not something that I readily know, but we can check and see whether we have additional information about it. Okay, thank you very much indeed. There may be elements arising from the evidence that we have taken that we need to follow up, both with yourselves and with NRS. I am going to draw the public part of the meeting to a close, but, before I do that, can I thank our witnesses this morning: Dharshi Santhakumaran, Graeme Samson and the Auditor General, Stephen Boyle. Thank you very much for your evidence this morning, and I will now move the committee into private session.