Good morning and welcome to the 22nd meeting of the Education, Children and Young People Committee in 2023. I would like to welcome Liam Kerr and Michelle Thomson, who are joining us as new members of the committee, replacing Stephen Kerr and Bob Doris respectively. We thank Stephen Kerr and Bob Doris for their previous work and engagement as members of the committee. Our first item of business is to invite Mr Kerr and Ms Thomson to declare any relevant interests. Can we start with yourself, Liam? Thank you, convener. I have no registered interests to declare, but I think it important that the committee, any witnesses and anyone watching know that my wife is an ASN teacher in an Aberdeen school. Thank you, Mr Kerr. I have no relevant interests to declare. Okay, thank you very much. The second item on our agenda today is the election of a new deputy convener. The Parliament has agreed that only members of the Scottish National Party are eligible for nomination as deputy convener of this committee. I understand that Ruth Maguire is the Scottish National Party's nominee. Do we agree to choose Ruth Maguire as our deputy convener? We agree. Excellent. Thank you very much and congratulations, Ruth, on your appointment. The third item on our agenda is a decision to take business in private. Do members agree to take item 7 in private? Thank you. Now we can move on to the substantive items this morning. The next item on our agenda is an evidence session with the Scottish Qualifications Authority. I welcome Fiona Robertson, the chief executive of the SQA and Scotland's chief examining officer, and Dr Jill Stewart, the SQA's director of qualifications development. Thank you both for joining us today. We will begin with a short opening statement from Fiona Robertson. You have around two minutes. Thank you, convener, and I will try to keep my opening remarks brief.
First of all, thank you for the opportunity to appear before the committee today to reflect on national qualifications in 2023, to look ahead to next year and to discuss education reform. On results 2023, I would like to begin by paying tribute to the 141,089 learners who received their SQA certificates on Tuesday 8 August this year. Those learners, including thousands within your own constituencies and communities, can feel proud of their achievements across a wide range of national and vocational qualifications. In celebrating the remarkable resilience and commitment of learners, I also want to pay tribute to all our teachers and lecturers who have supported them, and to our partners across the education and skills system who have helped to shape and agree our approach to awarding. This year represents a further positive step on the path back to normal awarding, but the impact of pandemic disruption to learning and teaching continued to be felt. Recognising that, we put in place a wide-ranging package of support, including a sensitive approach to grading, to help learners to perform to the best of their abilities while maintaining the credibility of our qualifications. For awarding 2024, following extensive consultation with the education community, it was agreed that we would return to full-course assessment for most courses in the 2023-24 session. Our wider approach to assessment will draw carefully upon the experience and evidence from this year and the views of learners and partners. Finally, on education reform, the SQA has engaged positively with the Scottish Government's reform programme to replace the SQA with a new qualifications body. We have also contributed to a range of reviews over the past few years, most recently the independent reviews of qualifications and assessment and of the skills delivery landscape.
As a number of interconnected reviews have now come to a conclusion, great care will be needed to ensure that any change and reform that follows is coherent, well understood, aligned and deliverable. The timeline for the replacement of the SQA has changed, but we remain committed to delivering high-quality, credible qualifications, helping learners to achieve their ambitions and delivering the skills for the future. Thank you. I am happy to answer any questions. Thank you very much, Fiona Robertson. I will move on to questions from members. I will take the convener's privilege and ask the opening questions today. We know that we have had a very turbulent few years, with the pandemic impacting on results. From a benchmarking perspective, it is quite important to compare 2023 with 2019. At a national level, how comparable are those results? I think that I made that clear on results day, when we published information relating to the aggregate position for national courses. Over the past few years, particularly given the pandemic and the years since, we have had to adopt a slightly different approach to awarding each year, in common with other awarding bodies across the UK. On that basis, we need to be quite cautious about drawing comparisons, particularly in relation to any judgments about educational performance. Given that caveat, it is fair to say that we have seen some recovery, and that, combined with a sensitive approach to grading, has resulted in a strong set of results for this year. Learners have done well, they have shown great resilience and they have worked hard. I think that that is evident in the results that we have seen. In terms of where the results sit, they sit somewhere between 2019 and 2022 overall. Some of the data regarding attainment seems to indicate that, since 2017, there has actually not been much noticeable improvement, with overall attainment at A to C marginally down at all levels compared to 2017.
We can also see that in 2019, before the pandemic, attainment at A to C was starting to decline slightly. Where I am is that I want to be able to see attainment across the board improving. I am concerned that, since 2017 and even before the pandemic, there has not been what I would like to see, which is overall improvement in attainment. It seems that attainment is either levelling out or tapering downwards. As I highlighted, we need to be cautious about drawing conclusions, given the changes to awarding over the past few years. On a more granular level, you will see variability in results in individual subjects at individual levels for a variety of reasons. It is always important to look below the headline national 5, higher and advanced higher results and to consider the issues. We are in the midst of publishing course reports for every subject at national 5, higher and advanced higher. Those provide more information and reflection on performance in individual subjects, which can be helpful to practitioners and to learners as they approach further learning and teaching this year. However, I think that I have to advise caution around comparisons, which can vary for a variety of reasons. As I said, the 2023 results are pretty strong overall, particularly given the challenges of the past few years. However, we need to be cautious about drawing too many conclusions from the results. I am sure that we will pick up more on that thread as the session goes on. Can I move to questions now from Pam Duncan-Glancy, please? Thank you, convener, and good morning. Thank you for the information that you have submitted in advance and also for your opening statement. Just a quick point on attainment. How would you describe the attainment gap between the years 2016 and 2019? We have published an equalities impact analysis on results day only since 2020.
I think that, certainly in the 2016 to 2019 period, we saw an attainment gap that persisted over that period of time. As with overall results, we can see slight variability over time. When we published our equalities analysis this year, we saw slight changes to the attainment gap in 2023 compared with 2022, although I need to caution against drawing too much by way of conclusion. I think that it is an accepted fact in Scottish education that there is a persistent attainment gap in Scotland, and I know that the Government has made commitments to addressing that and investing in addressing that. Thank you for that. I want to move on now to the sensitive approach that was taken during this session. Sorry, I apologise, Pam; it is the first day back for me. The sensitive approach that was taken this year, as you have described it. Can you tell us the impact of that sensitive approach and how you know? The impact of the sensitive approach? I set that out in more detail in a methodology report, which I also appended to the papers that I sent to the committee in advance of this session. The sensitive approach is a step-by-step approach that we take in individual awarding meetings for individual subjects. Now, of course, colleagues, including members of the committee, are interested in the aggregate position, but Jill, myself and another colleague chaired a series of meetings with teachers who are senior markers and senior appointees this year to consider a range of evidence that would inform our grading decisions. That is the final stage of the approach that we have taken. I mentioned in my opening statement a package of support that included modifications to assessment, for example, which continued last year and which took out a range of assessment instruments to free up time for learning and teaching. What we did in our grading meetings was to consider a range of evidence, as we would normally do, about the performance of individual assessments.
How have young people performed in the exam? Have the assessment instruments performed as intended by the teachers who develop and design our qualifications and assessments? We would also consider any impacts around the modifications and the removal of revision support, because we had revision support in 2022. If necessary, we would consider whether there were any further adjustments that might be required or appropriate as part of that sensitive approach. It is important to highlight that the sensitive grading was designed to benefit learners. It was designed to be more generous if needed and if the evidence supported that. It is quite difficult for me to say that the impact was X, because we held 129 grading meetings across national 5, higher and advanced higher, and we took individualised decisions on that basis. We published our grade boundary adjustments on results day, and I published the median adjustment of grade boundaries as part of my chief examiner's report. I appreciate that. I found that report very helpful, including the detail on the sensitive approach, as you have set out. Are you able to set out what the results would have been had you not applied the sensitive approach? No. How do you know which parts of the sensitive approach helped, which parts were unhelpful, and how do you know whether they will be needed in the following year? What I have set out in summary, and what the methodology report sets out, is what we did. I appreciate that. For all the right reasons, we did not say that a particular percentage is the sensitive approach, because we took an evidence-based approach to grading and considered, with our senior appointees, whether adjustments were needed. Bearing in mind the performance of assessments, the impact of modifications, the impact of the removal of revision support and any further factors, there was effectively a holistic judgment made on the basis of an individual subject.
We have not and did not quantify what that sensitive approach would be. However, as my chief examiner's report highlights, in a normal year the median grade boundary adjustment is zero. If you go back to 2019 and the period before, the median grade boundary adjustment was zero. Last year, in 2022, it was minus 4 at C and minus 2 at A. This year, it was around minus 2 at C and minus 1 at A. That highlights, in broad terms, the fact that we took steps beyond what we would normally take in a normal year, for a variety of reasons. That demonstrates, in broad terms, the quantum of the adjustments that we made, which were larger than usual. Bear in mind that minus means that we lowered the boundary at C, and therefore more learners achieved A to C. Willie Rennie, do you have a supplementary? It's just a quick one. You said that we shouldn't draw comparisons between now and prior to the pandemic, but then you were confident that things have got better. Is that not a contradiction? No. I think that it goes back to those discussions that we had during June and early July, which Jill and I were part of, where we saw evidence of strong performance, we saw evidence of some improvements in performance, and the fact that the grade boundary adjustments were less significant this year than last highlights evidence of some recovery. There's that balance between recognising that we are on a path to recovery, but also recognising that we continue to take a sensitive approach to grading to benefit learners. I think that it's really important that we're not... I do have to stress that, in every year since 2019, circumstances have necessitated that we take a slightly different approach to awarding, and that has led to different sets of results in different years. Therefore, we need to be cautious about reading into that educational improvement overall, or not, as the case may be.
We have had to make some adjustments, but it's important to highlight that we saw some evidence of strong performance and, compared with 2022, a degree of recovery. I know that Mr Kerr, Liam Kerr, wants to come in on this topic, so we're going to jump ahead a little bit to him before we come back, okay? Very grateful, convener. Good morning, panel. I'd like to stay with this line of questioning, if I may, about grade boundaries and the sensitive approach. First of all, there'll be people watching this and hearing all these terms, hearing you talk about grade boundaries and minus two and so on. In your submission, you say that the grade boundaries that you set each year are not predetermined but based on evidence, and you talked about the median grade adjustment being zero. Are you able just to set out concisely what you mean by grade boundaries, what it means for them to be adjusted and what evidence you use to do that? Yes, I'll attempt to set that out clearly. I accept and recognise, and I hope that the committee also appreciates, that this is a technical process, but absolutely, we're very conscious of the fact that there are learners behind all of this, and we're very cognisant of that. We have a policy in relation to grade boundaries, and that's published, so there's transparency around all of that. Basically, a grade boundary is the mark at which you achieve a particular grade. So there's a grade boundary set at C, and we also set grade boundaries at A and upper A. It's effectively the marks at which you achieve a particular grade. Now, our assessments are set with the expectation that grade boundaries will be notional. Notional means that, on average, you would expect a C grade to be 50% and a grade A to be 70%.
So that would be an expectation, and the assessment would be set with that in mind. But setting assessments involves judgments, and my colleague Jill Stewart can talk about that in more detail... Jill has many years of experience in this particular area. Oftentimes, assessments perform as expected, but sometimes they don't. There can be modest grade boundary adjustments to allow for that. So, in very simple terms, sometimes an assessment is easier than expected and sometimes an assessment is harder than expected, and we can make modest adjustments to account for that. At the heart of it is the principle that a grade A in one year should be, broadly speaking, the same as a grade A in another year, so we make adjustments to ensure that. That's what grade boundary adjustments are. That's really helpful. On that adjustment, in your submission you said that, when you were setting grade boundaries, the SQA took into account the legacy impact of the pandemic upon learning and teaching. How did the SQA evaluate the impact of the pandemic and quantify it when you were setting grade boundaries? The methodology report sets that out in a bit more detail, around a staged approach. The normal awarding approach would be to look at a range of evidence about how assessments have performed. I think that in the last couple of years, with a slight adjustment to our grading, we have also looked at a range of other evidence, and we have taken a judgment about whether there are any further adjustments that we should make, which do not undermine the integrity of the qualification but recognise that there have been impacts.
I think that Jill might be able to give a couple of examples, for instance in the sciences and languages, and that might bring it to life a little bit, because I understand that it may feel rather dry, but it is an important part of our process. It is evidence based and it seeks to ensure that, year on year, there is fairness in our approach, and I think that that is really important. Dr Stewart, to give us some live examples, helpfully. I think that the best example I can think of is the modern languages one. We assess, as you would expect, the skills of reading, writing, talking and listening, and we assess those separately, so we can see how learners have done in each of those assessments. I think that it's important to realise the breadth of evidence that we consider. We look at statistical information, which helps us to determine whether or not the assessment performed as intended, but we also ask all of the markers, and there are over 8,000 teachers and lecturers involved in marking, to complete a marking report when they've completed their marking, asking them whether learners had performed as they would have expected relative to 2019 and relative to 2022. We also have teacher estimates as part of our panoply of information, and we also have the principal assessor and the deputy principal assessor, who have overseen all the setting and marking of the assessments and have looked at all of the markers' reports. We use all of that evidence to make those judgments. To come to the modern languages example. First of all, we would compare performance in each of reading, writing, talking and listening against 2019. That's the original standard that we're trying to get back to, in terms of what we expect our learners to have at national 5 or higher. That would tell us where we are relative to our original standards.
Then we would look at performance this year across those four assessment areas in relation to last year. What that told us was that there had been significant recovery in 2023 in comparison to 2022. We used the comparison between 2019 and 2022 to see what the impact of the pandemic had been upon learning and to make a judgment the previous year. Moving forward to this year, in 2023 we were able to compare how learners had performed in reading, writing, listening and talking this year compared to last year. It told us that there had been another step-wise recovery in learners' skills, but they were still a little bit off the 2019 standards. Does that help? I understand. On languages specifically, I think we saw an impact particularly around listening and talking, for example, because the disruption to learning had an impact. We saw a weaker performance in relation to those elements last year. It was still slightly weaker this year, but a bit of an improvement on last year. Last year, we made a significant adjustment to address that, because we felt that that was the right thing to do. That is part of the generous grading. This year, we still made an adjustment, but it was not as significant as last year's. The methodology report sets out that step-wise approach. Very much in line with... That makes sense. Consistently, but crucially involving principal assessors and deputy principal assessors, who are teachers, and bringing feedback from them as part of the judgment that we made. That makes it clear. It begs one final question. Is it your intention to take the sensitive approach next year and, if not, is that because you have concluded that there is no longer a legacy impact of the pandemic? I set that out in my opening statement and in all the documentation that I published on 8 August. I think we have a balance to strike, and it is not just a balance that we consider; it is a balance that other awarding bodies and other regulators are considering across the UK.
We need to make sure that we are recognising the challenges of the last few years and the impact that that has had on learning and teaching and on learners. We need to balance that with the on-going credibility of the qualifications and the fact that we are on a path back to normal. This year, we saw England substantially return to normal awarding, with some protections in place. There is an expectation that Wales and Northern Ireland will follow this coming year. We have not made a final decision in relation to our grading approach for this year. We have made, as I outlined in my opening statement, a decision around full-course assessment. That is bringing practical work back into the sciences and assignments back into a range of our courses, as those courses were designed. We will be engaging and discussing further our final awarding and grading approaches. I hope that we can substantially get back to normal. Given that we have returned to full-course assessment, we need to, as we would normally do with any changes to the assessment approach, be mindful of the impact that that might have. I am very aware of the fact that some teachers have expressed concern about the return to full-course assessment, particularly for those learners who have gone through S4 and S5 without having done some of that work. I think that we need to be very mindful of that in our grading approach and of where the evidence takes us on that. Thank you very much. Bill Kidd, can I come to you now for your questions? Thank you, convener, and thank you to the panel. On national 4, advanced higher and local authority centres, I have just a couple of quick questions. I do not think that they go very deep, but can the panel explain the rise in the number of entries at national 4 level that has taken place? We have seen centres make decisions in the best interests of their learners, and that includes decisions on entries. It is quite difficult for me to give a granular explanation for any changes.
What we have certainly seen, over the last couple of years and compared to pre-pandemic levels, is an increase in dual entries at national 4 and national 5. I think that there has probably been a bit of a shift from national 3 into national 4. Similarly, we have seen fairly stable entries across national... Sorry, we have seen a greater degree of stability at higher and advanced higher, but we have seen the number of national 5 entries go up, and a big part of that is dual entries between national 4 and national 5. I think that we have also seen a bit of a shift from national 3 to national 4. Modifications may form part of that story, because we removed the added value unit of the national 4 qualification in response to the pandemic and the disruption to learning. Again, that freed up time for learning and teaching, which could have had an impact on entry patterns and entry levels, but entry decisions, rightly, are for centres to take in the best interests of their learners. While I am sure that the focus in this discussion will remain fairly consistently on national 5, higher and advanced higher, we have also seen increases in our other awards, you know, PDAs, Skills for Work and other awards. We're seeing quite healthy numbers there, and of course we offer a wide range of other qualifications that we're seeing coming through, including, in the school sector, vocational qualifications, including HNCs, and modern apprenticeships, which substantially include SQA qualifications. So, we are also seeing a bit of diversification of entry patterns, and that should be seen as a positive thing, because it recognises that there are choices for learners to suit their interests and their pathways. Well, thank you very much for that.
On the back of that, then, to what degree is the SQA seeking a return to the previous achievement patterns, the ones that existed before the pandemic, or is it looking to see new patterns emerge? There is no predetermination in our resulting, so, you know, in terms of entry decisions that centres might take, they'll have a range of choices in relation to that, and, as I've highlighted to your colleagues, we take an evidence-based approach in terms of resulting and the outcomes of our qualifications. The responsibility that the SQA has is to make sure that we can judge the competence and the skills, knowledge and understanding that are required to achieve a qualification, and it's on that basis that we make our judgments. There's no predetermination around final outcomes. Would the SQA be happy, at least initially, to return to the achievement patterns that were seen before the pandemic? I know you're saying that you're not setting exact limits or rules, but those patterns that existed before the pandemic, were they good enough to want to go back to? Well, as I say, my focus is not really about achieving a particular pattern of achievement. My job is to ensure that we're awarding on merit and on the evidence that's in front of us, so I don't have a predetermined view about that. I think everyone involved in Scottish education wants to see improved outcomes and good pathways for young people. I think that goes without saying, but our role is very clear in relation to ensuring that we are making awards on the basis of demonstrated attainment and achievement, so I hope that answers your question. Yes, it does, and thank you very much for that. Can I bring in Ross Greer for a supplementary on this line of questions? Is the issue here not that the grading is ultimately relative?
By the way of the system that we operate, there can only really be so many As each year, so many Bs, so many Cs. For example, if the number of A grades in the first instance looks like it has increased significantly, is that interpreted as a question over the integrity of the data, and does the approach to grade boundaries ultimately set a cap on the number of A grades that there can actually be each year? No. I think that there is absolutely something in terms of... In the external assessments that we set, you expect to see a degree of differentiation in performance, and actually questions are set with that in mind. Some subjects more than others, but, to use maths as an obvious example, there are some questions that are set as perhaps A-type questions or B-type questions and so on. So, as Jill has highlighted, our markers will provide feedback on the performance of those assessments and how those particular assessment instruments have performed. If there was a cap or if there was predetermination, we would see much less variability than we do see in our awarding. If you look at individual subjects, there are around 140 national 5, higher and advanced higher courses, some with very small numbers of entries, where in fact there can be quite high degrees of variability year on year because of the impact of a particular cohort. For larger subjects, there is a spread and perhaps a bit more stability in outcomes overall, but every year you will see differences and changes. On the one hand, the assessments are set with a degree of differentiation in mind, but we do see variation in performance each year, and we take an evidence-based approach before we make a judgment about grading. I do not want to overemphasise or indeed underemphasise the importance of the grade boundary meetings, as I have highlighted.
In a normal year, the median adjustment is zero, so for the most part assessments work, we make no adjustments to grade boundaries and the results fall where they fall. Over the past couple of years, we have made more adjustments to grade boundaries as part of a more generous approach to grading that has benefited learners. We have provided the grade boundary information, and we publish it every year. That highlights that there were a number of courses where we made significant grade boundary adjustments and a large number of courses where we did not. It was very much on a case-by-case basis. I hope that that explains that there is no predetermination, because we do see variability year on year. You would expect to see that in courses that have a small number of entries. There may be years in which there is strong performance, and there may be years in which there is less strong performance. It is important that we reflect that in our results. I will carry on on the theme, if that is okay. You have spoken about median adjustments and results falling where they fall, and you specifically spoke about larger subjects, using maths as an example. Despite the sensitive approach to boundaries that we have been discussing at length so far, the results in national 5 maths and English were worse than, or the same as, those in 2019. Does the SQA have particular concerns around what are key qualifications? How are you feeding back to local authorities, schools and teachers to address some of the gaps in learning? Dr Stewart, do you want to come in on that? To add to what Fiona has said about grade boundaries, our job is to maintain standards from year to year in individual subjects. That is done through the setting of the assessments in a standardised way each year and then using the grade boundaries process to assess whether we have maintained those standards or whether the assessment was slightly more challenging or slightly easier than intended.
That is very much the purpose of grade boundaries. Fiona has said that we have made other adjustments to do with the pandemic in our generous and sensitive approach to grading, to take account of the impact of the pandemic upon learning. National 5 mathematics is an interesting area, because there is a lot going on there. There are a number of different factors at play, and we are carrying out further analysis of all the data to help us to understand and to have discussions with local authorities and teachers and so on. Some of the things that we see happening there are a larger proportion of dual entries for national 4 and national 5. We do not have just one course in mathematics; we have a course called national 5 applications of mathematics, and we have seen significant increases in the uptake of that qualification. From memory, it has gone up from 14,000 entries last year to something like 19,000 this year. We are seeing a shift in the use of the different mathematics courses, as well as the patterns of dual entry. The other factor that we have to take into account here is that national 5 mathematics is often required by learners to go into particular further courses of study, be that at college or at university. We see significant numbers of resits for national 5. If you take out the resit population and look at those in S4 who are sitting national 5 mathematics, the attainment rate is sitting at just over 70 per cent. If you then take out the dual entries for national 4 and national 5, the attainment rate goes up to around 80 per cent. There is a lot going on, and there are also different patterns happening across different local authorities and different schools, but we are doing a lot more analysis in order to understand that more fully, because we need to have a dialogue with Education Scotland and local authorities if we think that there are issues.
What you have explained there is the number of extra entries, not the results that are coming from those. I was wondering whether you could... I should also say that I asked about English as well, where I do not know whether there are quite as many factors. If you do not mind, I am going to direct the question to Dr Stewart again. The other course in mathematics is significant because, if you have a variable cohort, which you do have here, and the decisions are being made locally about which learners will go to national 5 applications of mathematics or national 5 mathematics, you do not know whether that is a uniform spread of learners across the ability range. For instance, it might be that the better learners are going to national 5 applications of mathematics; you do not know, because the cohort that previously all did national 5 mathematics is now split across two mathematics courses, and you also have significant numbers doing both courses. I am not providing answers; I am just saying that there are a lot of new variables in play that mean that we really need to carry out a much higher degree of analysis of what is happening at a local level, what is happening across different year groups and what is happening in national 5 applications of mathematics. There will be an opportunity for you to feed back... I will bring you in in a second, Fiona. When you have done that, which sounds like it will be quite a substantive piece of work, it would be superb if you could feed back to us on some of the findings. Fiona Robertson, can you respond briefly to my question? Thank you.
Yes. Just to add to what Jill said, obviously, in any year, local authorities and schools will look at their results, undertake analysis and reflect on learning and teaching, and also on entry patterns, because there is a focus here on percentages, while actually the percentages are to a certain extent defined by the entry decisions that local centres take, and there is only so much that we can say about that. Those are local decisions, and local authorities have statutory responsibilities in relation to improving education and considering those issues more generally. We will play our part in providing data and analysis to aid that thinking. I would say, though, that my chief examiner's report highlighted a weaker performance in maths. That is what we saw. However, as Jill highlighted, underneath all of that, there are some unusual patterns in entries, and the point about resits, I think, is important. There are not many courses in which there is a substantial number of resitting candidates; maths is an exception to that, and it has been previously, so it is not new. I think that this is an opportunity for the results that we publish and the data that we produce to aid further conversation in the system, including in local authorities, in schools and in Government, about some wider issues in the education system, and that is part of our role. Thanks. Can I also ask briefly about entries into the Gaelic medium qualifications, because they remain stubbornly low, relatively low? Have you undertaken any work to understand the reasons for that? Dr Stewart? We could come back to you with further information about that. We have our Gaelic language plan and we support Gaelic medium education, but I suppose that we are limited again by the provision at a local level. It is schools that make the entries and so on.
However, we do our bit to support Gaelic medium education and provide assessments in the medium of Gaelic in a small number of subject areas. We also have our Gaelic learners courses and our courses for native Gaelic speakers to help. It would be very helpful if you could provide that information when you have it. Excuse me. Mr Rennie. Willie Rennie. My question is in the same light: advanced highers in disadvantaged areas. Have we got on top of why there is a lower offer, less of an offer, in those schools, and what do you think can be done to improve it, given that it has not improved in recent years? I think that that is an area in which it is quite difficult for us to comment in any detail. The provision, the availability of courses within local authorities and centres more generally, is ultimately a matter for them. On advanced higher provision specifically, there has certainly been some movement, including the hub at Glasgow Caledonian, for example, and the Scholar resources for particular courses at advanced higher. There has been some movement there, but ultimately we offer a very wide range of courses, including at advanced higher level. Some of the advanced highers, I would acknowledge, have quite small entries. What we are finding is that learners are making a variety of choices, in particular in S6. That sometimes includes advanced highers, sometimes it includes other qualifications, and it can include a vocational offering. We are seeing a bit of diversification, but those decisions are made at a local level. You have a strategic overview across the country. You will have developed expertise and understanding about what works in some areas and what does not. Do you have regular discussions with local authorities about how to improve that offer in certain localities?
We engage with centres, with local authorities and with ADES, the directors of education, on a regular basis on those issues. Advanced higher provision is not something that comes up all the time on a regular basis. Do you not think that it should? There is quite a big gap between the wealthier areas and the poorer areas. Should that not be on your red risk register? As I say, our role is to offer qualifications. I think that we would all want to see choices being made available to learners wherever they live. Whether those same choices are made available to learners wherever they live is ultimately a decision that does not rest with the SQA; it rests with local authorities and with individual centres. As I say, there has been some movement, with schools working together, and at Glasgow Caledonian, around building volume so that some courses can be more viable. The availability of courses and curricular decisions are decisions that are made locally. I was just going to say that, not specifically on advanced higher, I know that Education Scotland is currently doing some work on showcasing best practice, particularly in relation to diversification of the curriculum. It is looking at the different models that are being used across different local authorities so that we can spread the word about some of that, and we are working with it in that space. We have also done some work with the SCQF partnership around pathways. There are things that we can do, but ultimately the availability of courses is a complex issue, and curriculum decisions are decisions that local decision makers, local authorities and individual schools make on the basis of the needs of their local school community. You have alluded to some of the other courses in response to Mr Rennie's questioning. Can I continue on that theme? Can we bring in Pam Duncan-Glancy, please? Thank you very much.
Can you explain the rise in entries for vocational skills for work courses and national progression awards? I was at Leith Academy last week, and I witnessed the value of the diversification of the choices that learners are making. While learners absolutely continue to make choices in relation to national 5, higher and advanced higher provision, there is also a range of other qualifications that they can take throughout the year and, indeed, be certificated for throughout the year, both in the senior phase and, in some cases, before the senior phase commences. I think that that is good to see. In national progression awards and PDAs, for example, we do lots of different awards, such as mental health awards and other things, and we have been seeing some quite healthy growth. Again, it boils down to individual decisions that are being taken within schools around the diversification of the curriculum and the opportunities that are being made available to learners. There is still some variability across the country in that context. It is absolutely something to be celebrated that individual schools are thinking about those things and about what will best suit the needs of learners. It also bears on the question that Mr Rennie just asked in relation to advanced highers. I think that there is also a debate to be had about whether there are entitlements for learners, and I know that that came up in some of the Hayward review discussions about whether there should be more consistency. There is a balance to be struck there, and that is a debate that the education system needs to have. In so far as the SQA is concerned, we offer a huge amount; our catalogue of qualifications is wide and deep, and there are lots of choices that can be made, so the offer is there. The issue is the choices and the availability of courses within schools, colleges and elsewhere.
Do you know anything about the demographics of the numbers of people who are going forward in those circumstances? That is not an analysis that we have undertaken. Is it an analysis that you could undertake? It feels like it would be useful for the committee to understand some of the demographics of that increase. That might be something that we could look at. As I mentioned, we publish the equalities impact report. We do not hold that information; we have to engage with the Scottish Government, which holds that information, in relation to SIMD. It is not just SIMD; it is other characteristics as well. We do not hold that information, but that is certainly something that we can look at as part of our wider equalities work. It is really positive to see schools diversifying their curriculum. I cannot remember which OECD report said it, but some young people can be motivated by thinking, "I want to go to university" or whatever, while other young people are not motivated by that and are looking for something that they are good at. Sometimes, a skills for work course in a vocational area or a national progression award in cyber security or whatever might really engage a young person, get them interested in learning and let them find out that they are good at something. That is really important in encouraging them in their learning. I think that it is a really positive move that we are seeing increases in the uptake of vocational qualifications in our schools, trying to engage young people in what will help them in their next steps in their careers. The work that Education Scotland is doing is trying to spread good practice in some areas to help schools to understand how they could do some of that as well, which is really good. Fiona mentioned the work that we are doing with the SCQF.
As she said, we have a very broad catalogue, but it is sometimes about trying to identify which bits of that broad catalogue might be appropriate for use in schools and highlighting some of that. For example, in computing, we know that there are some young people who are not particularly good at programming, which they need to be for national 5 computing, but who are interested in things to do with cyber security, gaming, data analysis and software development. We have lots of small qualifications in those areas that some schools are using with those learners, and they are really getting good engagement from those learners. National 5 computing is not for them; it is suitable for some young people, who want to go on and do programming, but not for others. I will use that example. Do you like examples on this committee? We do, and I think that that is a really good example. Thank you for sharing it. What we are seeing in those particular awards is helpful, and I think that diversification is really important, not least because the OECD picked it out, but also because we understand that that is what young people want. I think that it is also really important for us to know who is going forward and who is being presented for those awards, as opposed to who is being presented elsewhere, to check whether there are any patterns that we may need to look at further. Thank you very much. Stephanie Callaghan, thank you for your patience this morning. Thank you, convener, and thank you to the panel so far. Looking at the variation in results associated with SIMD areas, I wonder why different approaches to certification can lead to significantly different attainment gaps. Can you explain what is actually behind that? Is that perhaps a question more for Dr Stewart?
I think that it is very difficult to say definitively what might lie behind that, but what I can say is that the wider research evidence shows that, with teacher judgment, teachers are more likely to make more generous judgments on the performance of learners, particularly those from lower SIMD, or less advantaged, backgrounds, if you see what I mean. There is also research that shows differences between different protected characteristics. I am talking about broader research on teacher judgments, not specifically about research to do with Scotland and national courses and so on. There might be some natural bias towards that, but there could be all sorts of things going on. I do not think that you can say uniformly that one form of assessment, either teacher assessment or external assessment, will suit less advantaged learners or advantaged learners, because learners are a mixture. Some will respond better to teacher assessment, and some will respond better to examinations and coursework than others. It is not a uniform population. I am not explaining that very well. We all have our preferences about how we like to learn and, similarly, we all have our preferences about how we would like to be assessed. Some boys prefer to be assessed by an exam rather than by continuous assessment, but there will be some boys who prefer continuous assessment. There is a lot of variation in the population. I do not think that you can say that less advantaged people prefer teacher assessment and more advantaged people prefer external assessment. I know that you are not saying that. I suppose that it would be quite interesting, if any further work is to be carried out on this, to look at that in a little more detail. Can I carry on from that as well? Generally speaking, over the past five years or so, the higher the level of qualification, the smaller the attainment gap seems to be.
I wonder what the thinking is behind that. Is there any reasoning? There is no doubt that, with alternative certification in 2020 and 2021, we saw quite a different pattern of attainment. We saw teachers awarding higher grades overall. When you look at the composition of the attainment gap, if you move from the attainment pattern that we had in 2018 or 2019 to the more generous attainment pattern that was achieved in 2020 and 2021, one result of that is a narrowing of the gap. That was not just seen in Scotland; it was seen in the rest of the UK as well. It was a common feature of our alternative certification. Interestingly, on the return to exams last year, we published some research that compared teacher estimates with the results that were achieved through external assessment. I would certainly recommend that you have a look at that report, because we saw teacher estimates return, broadly speaking, to pre-pandemic levels, and we will produce some further analysis on the position this year. This year and last year, in the return to exams, we saw a similar pattern between the results from awarding through external assessment and the teacher estimates that came in. That is quite interesting, because over the past couple of years we have also seen a changing pattern in teacher estimates compared with 2020 and 2021. You mentioned research. We have undertaken some really significant research over the past couple of years, and we will do another evaluation of 2023 awarding. That has given rise to some really interesting observations about how people felt about alternative certification, about continuous assessment and about the return to exams. We surveyed learners, practitioners and parents and carers, as well as those who work most closely with us, particularly appointees. There are some really interesting conclusions. For example, learners absolutely trust the judgment of their teachers.
They absolutely trust that judgment. We also found, when we looked at the evaluation of the 2021 approach to alternative certification, that, although learners absolutely trusted the judgment of their teachers, there was a concern that the judgment of those who were not their teachers, or who were in another centre, was questioned. There is a distinction between the judgment of the individual and the relative judgment of others. That is why we have a national awarding system rather than a local awarding system. There are some really interesting observations and findings that absolutely should feed into what happens next. The final thing that I would say is that we have a pretty balanced assessment system. We have an exam system at present, but the courses have been designed to be balanced. Many courses have a significant amount of coursework. Some have teacher assessment and continuous assessment of some kind, or different components that are assessed during the year. There is some balance across our courses, and that continues to be the case. I would advocate for that balance. On the second question that was asked, are you able to explain why the attainment gap gets smaller the higher the level of qualification? For example, at advanced higher level, it is smaller than it is at national 5 and higher level. We would have to do more analysis on that. If you are asking for my thoughts rather than analysis, it would be that the numbers doing the qualification significantly reduce, so it is a much smaller population that does advanced higher. It will, by its nature, be a much more select group of young people that does advanced higher. We would have to do further analysis to really interrogate that and see what the patterns are and so on. We can identify what the patterns are, but we cannot necessarily explain why those patterns are thus.
I am just wondering why, up until this point, there has not been some curiosity about looking into that aspect of it. Is it not something that has stood out as perhaps needing to be looked at? In a positive way, yes, of course; we provided the equalities impact analysis. You are absolutely right. In broad terms, compared with 2022, albeit with the caveats around comparisons, which are important because there is a different awarding approach, we saw a slight widening of the gap between 2022 and 2023 and a narrowing of the gap between 2019 and 2023, but advanced higher looked a little different: we saw a slight narrowing of the gap this year. To be really clear, this has been over the past five years or so, so it has not just been during the Covid period? I think that the contributors to any changes in the attainment gap go back to Mr Rennie's question around entry levels and where those entries are coming from. That is the first thing. The second is the achievement of individual learners; for some advanced higher courses, you see greater variability in performance year on year because the cohort is smaller. That can impact on the attainment gap. There is a variety of factors that can contribute to it. It is important to highlight that the analysis that we are undertaking is a national analysis; we are not looking at individual local authorities or, indeed, individual schools. There is a Government programme, the attainment challenge, and it is important that those involved are looking at and interrogating that data. Education Scotland, of course, is very heavily involved in seeking to understand and inform what further approaches should be taken around learning and teaching, particularly support, or indeed in relation to entries. I think that our analysis is prompting some of those questions, and that can be a good sign. Thank you. Thank you, Stephanie.
Can I move to questions from Ross Greer now, please? Thank you, Ross. I would like to ask a few questions about the appeals system. It has changed quite a bit over the past few years for a variety of reasons, most obviously, but not entirely, due to the pandemic. The 2022 appeals system was probably the one that received the most positive welcome from young people and the organisations that represent them and their rights. We had an appeals system that gave direct access to young people, that was free and that considered evidence in the round; it was not just a script remarking service. Obviously, we have gone back from that, although maybe that is a subjective term; we have moved away from that for this year, and it has gone back to script remarking. Could you explain the rationale behind the decision to go back to script remarking? Specifically, what were the issues, presumably, with last year's appeals service, which was based on wider evidence of young people's work throughout the year? I will try, but there were a number of things that we looked at. First of all, on the 2022 appeals service that was introduced, there had been two years in which there had been no exams, and we felt that, as part of the package of support, for one year only, we would put an appeals process in place. It was much considered and discussed, with learners and with the wider community, that we would put in place an appeals service, unusually, that could be based on alternative evidence. It was not an appeals service that was based on the exam or the remarking of the script, or on whether we had marked it incorrectly, which would be a typical appeals service, not just in Scotland but everywhere else, but an appeals service that could look at alternative evidence from learning, teaching and assessment that was gathered during the course of the year.
We did that, as you have highlighted, again unusually, because it is not available elsewhere; there is certainly nowhere else in the UK with a direct appeals service, a direct appeals service that is also free. You are right that that was welcomed. We continued to have a free and direct appeals service this year. Although we had announced that the 2022 appeals service, which was based on alternative evidence, was for one year only, it was absolutely right and proper that we looked at it as part of our evaluation. We took forward a lot of discussion about the appeals service, about what people felt about it and what their views were, but also about what the evidence was. We drew our conclusions from the feedback that we got from learners, from practitioners and from the appointees who were involved in looking at the evidence. There were almost 60,000 appeals and, if you can imagine it, that involved collecting evidence from every centre that was submitting evidence, which was looked at by our appointees, who are themselves teachers. I think that the biggest issue was fairness. On the one hand, the appeals process based on alternative evidence was perceived to be fair and generous in the sense that, if you did not perform in your exam on the day, you could look back at your alternative work. That worked for some learners: about three out of ten learners who appealed got a higher grade as a result, but seven out of ten did not. Keep in mind that you could make an appeal only if your teacher estimate was above your exam grade; the expectation was that the evidence and the judgment of your teacher showed that you would have done better than you did in your exam. Those numbers, the three and the seven out of ten, how do they compare with what they would usually be with a script remarking service? It was a little more, but not much. Bearing in mind that a normal... It is not comparing apples with apples, in the sense that...
Presumably you get far fewer appeals with script remarking than you did last year. Yes, and it is on a different basis, in a sense. I think that we have to be careful about drawing comparisons between success rates or otherwise. The consistent feedback that we got from markers was that there were issues around the sufficiency of evidence from schools and, in some cases, issues around the judgments that schools had reached on the estimate itself. That goes to fairness. I dealt with a number of individual cases in which learners, through no fault of their own, had not been assessed properly or, indeed, in which an inappropriate judgment had been made about the standard that they were expected to achieve. So there was sufficiency of evidence, which means breadth of evidence, so that we had evidence that we could look at and say, yes, on balance, this learner could have got another award, but also the standard at which that evidence was judged. It was really on the basis of variability in the standard and breadth of evidence through that approach. There was also emerging evidence that, if you have an appeals process of the type that I am explaining, one that is based on alternative evidence, that in itself has effects, particularly over time. We had a similar appeals process in the pre-2014 period, before my time in this role, and over time what happens is that, for all the understandable reasons, schools start to collect evidence on the off chance that they need it for an appeal. That promotes assessment, potentially over-assessment, and creates workload for learners and for teachers, and we did get some practitioner feedback with concerns about workload for learners. We do not want learners to be over-assessed on the basis that they may need an appeal further down the line.
I completely understand the concern about over-assessment, and particularly the challenges that we saw in 2021 in trying to manage the lack of exams and the concern about over-assessment then. However, in the period between 2014 and the pandemic, the script remarking service that we moved towards, partly because of cost et cetera, was disproportionately used by independent schools. I get the concerns around fairness here, but the script remarking system that we used and that we have now returned to has its own evidence issues with fairness as well. Again, going back to apples and apples, the pre-pandemic service is not the service that we now have. You are right that there were issues with the pre-pandemic service; it predates me in this role, but there were issues around the fact that the service was determined by a school decision rather than by a learner. It was also in the knowledge that, if the appeal was unsuccessful, the local authority would usually be charged. The charging system presented a perception that cost could influence behaviour, including any judgment about independent schools and the ability to pay, et cetera, so the cost became part of the perception around that service. What we have now is a free service; it is an open service, any learner can appeal in any subject, it is a direct appeal and it takes a click of a button for learners to appeal, but it is an appeal on the basis of the assessment instrument that has been used, so it is an appeal on the basis on which their grade was determined.
I am just very conscious of the time. That is the core issue, because it comes back to the debate that we have had over the past couple of years and the discussions that I have had with you about those exceptional circumstances: the young people who had a family bereavement immediately before the exam, who had a panic attack during the exam, or whatever it might be. I have brought some of those cases to you as casework, and we have had wider policy discussions about them. How do we make sure that the young people who are in those exceptional circumstances, and there is a wide variety of them, get a fair opportunity? They still do. We have retained, and I perhaps should have said this at the start, although it is not an appeal, an exceptional circumstances service, which is precisely for individuals in those circumstances. Those learners get their results on results day, so it is not really part of a post-results service or appeal; in effect, we have retained alternative evidence for those who need it most. We will produce figures; I do not think that we have published them as yet, but, pre-pandemic and in the normal course of things, there are usually around 5,000 entries that would be taking up the exceptional circumstances service. For learners who are unable to take an exam, for personal reasons or because of illness, bereavement or other things, or where there has been disruption for some reason on the day of the exam, which again can be for a variety of reasons, the centre can make an exceptional circumstances request. Can you drill down on that, please? That covers some young people, but not all. For example, I have dealt with casework involving a young person whose parent died the day before the exam but who really felt that they wanted to go in and do the exam. They are having to make a choice there: do I think that I can perform well enough in the exam, or do I, before that, make a choice to take the exceptional circumstances route? No, they could still be covered by exceptional circumstances. Right. Can I just finally... Finally, please, Ross. I think that that is really important. It does not include those who simply did not take the exam. Did the young people on the learner panel support the change, and did the organisations that represent young people's rights, for example the Children and Young People's Commissioner, support the change to the script remarking service this year? I do not have a list of every stakeholder who agreed or disagreed, but what we did was undertake the evaluation, which highlighted the issue of fairness. Sorry, just for time, because I am conscious of taking up other members' time: did the learner panel support this? My recollection is that the learner panel had some concerns about the return to the review of scripts. However, we engaged with the breadth of stakeholder interests. We undertook survey work with 2,000 learners, 1,000 practitioners, 500 parents and carers, and appointees, and I have a responsibility not to ignore those issues of fairness. I have dealt with a number of difficult cases this year. You highlighted your constituency cases; I get representations from a number of MSPs and others during the course of the year. I found it quite difficult to deal with cases in which a learner had an expectation of an award from their school and the evidence did not support that expectation, through no fault of their own, because either they had not been assessed appropriately or the judgment had been made incorrectly. I have a responsibility around fairness as well. I realise that there may be some stakeholders who agree or disagree, but I also have a responsibility around fairness. I am not aware of any other country that has an appeals service on the basis of alternative evidence.
Most exam boards and most regulators determine an appeals service that is based on the assessment approach on which the appeal is based. I understand the concern. We've done as much as we can to ensure that those learners that need it most still have access to a service that can utilise some alternative evidence, and a wider range of alternative evidence, while maintaining fairness in the appeals process. Thank you very much, Fiona, for making that quite clear. Pam Duncan-Glancy, questions from yourself? I know that Michelle Thomson is also wanting in. Thank you, I appreciate that. My colleague Ross has covered this in quite some detail. Can I just check something that you said? When you said that three in 10 appeals are successful on alternative evidence, that's higher than the number of appeals that are successful on the script approach. Is that what you said? I think that what I said was that it wasn't appropriate to compare the two because of the differences in approach, given the pre-pandemic approach. I don't have the outcomes for this year's appeals service because we're still working on it, so we're working through nearly 40,000 appeals at this time, and we will do that, and we will publish information on those appeals outcomes. Again, I don't have a degree of predetermination about an appeals outcome. The appeals outcomes are based on evidence, and they will be what they will be. So there's no predetermination about one appeals service being better than another. The fact is that the appeals service from last year was based on an estimate being higher than the resulting grade. I would have expected, other things being equal, the success rate for appeals last year to have been higher than it was, because it was purely based on the estimate. Therefore, if there was integrity to the estimate, the appeal should have been successful.
The point is that when you looked at the evidence to support the estimate and you looked at the assessment evidence, both in terms of sufficiency and standard, only three in 10 were successful, and the reason that we did not continue with that appeals process, albeit that we had made no indication that we would continue, was on the basis of fairness. On that basis, earlier you said that the assessment was based on a balance of exams and coursework. Can you explain the variability of evidence that you've said has caused some of the concern around the alternative evidence approach? The alternative evidence approach for appeals was based on the assessment evidence that was held by the centre. Our appointees, again working teachers, were those that were involved in making the determination on whether an appeal was successful or not. As I've said, there were issues around the breadth of that evidence. Did the evidence cover an appropriate amount of course material, and was the judgment on that course material appropriate? There are two elements: the breadth and the standard. I'm going to move on to something else. Schools, colleges and centres will have utilised a number of different forms of evidence in making that determination. Okay, thank you. Can I move to Michelle Thomson and then Liam Kerr, please? Thank you, convener. Staying on this theme, I refer to an article from Tes Scotland from the 7th of July, which noted that attendance at school has historically been a problem, but Covid has exacerbated it. The long shadow of Covid still reaches particularly into certain socio-economic groupings. I wanted to know how you reflected on that in your decision making around removing alternative assessment evidence, given that there are still these significant pockets of children for whom attendance has fundamentally shifted.
I think that attendance per se did not feature in the decision around moving from one appeals process to another. In fact, attendance issues in isolation can impact on achievement and could also potentially impact on alternative evidence, too. That's what I'm saying. However, in terms of the first part of the evidence that we have provided at this session, when we've been talking about our grading approach, I think that what we have been very mindful of, both this year and last, in playing our part in providing support to learners and in doing what we can as a national awarding body, is that we recognised that we needed to take that generous approach last year and continue with that generosity through the sensitive approach this year. I think that what we are recognising is that learners have had very significant challenges over the past few years, and therefore it's been important that we've played our part in assisting and supporting that. That's why we've done what we've done. Absolutely, school attendance plays into that, as do the impacts of the pandemic and the legacy of the pandemic. One of the key fairness points for us was that we might have somebody in one school and somebody in another school making an appeal based on alternative evidence, and whether or not it was successful depended on how well the teacher understood the requirements of the course and the standards, and on the judgments that they had made. The appointees, the teachers and lecturers who looked at all the evidence in the alternative appeals service, were disheartened at some of the packages of evidence that they saw and how it didn't back up the estimates that had been made. That is not in the hands of the young person. Think of it like this: I'm at one school, and I'm dependent on my teacher making an estimate and having good evidence to back that up.
I might be sitting in a school where the teacher might be a great teacher, but they might not be good at that particular aspect, and I don't have a choice about that. That's where the lack of fairness for the learner comes in for us, because it was dependent upon the school and their understanding. That's where I'm a wee bit confused. This concept of fairness has a multitude of variables. I'm merely reflecting that perhaps another element of this fairness is the fact that there are still significant longitudinal effects of the pandemic in pockets of society, of which attendance, which Tes Scotland quoted at 90.9 per cent, is historically lower. Those were their own estimates, I should point out. I'm merely noting that, but I'll let other people come in. The only thing that I would add is that learning and teaching comes first, and the support that individual learners get, whether that's encouraging them to come to school or other things, is in the hands of the school and, where appropriate, the local authority. None of us is underestimating the challenges that the education system has experienced over the past few years, and therefore there's a system piece around the support that's put in place. I think that, as a national awarding body, it's actually very difficult for us to address differential disruption to learning. We had this conversation earlier in the year when there was industrial action by teachers in schools, in some cases targeted industrial action, which meant that some schools were closed and some schools were open. It's actually quite difficult for us to make adjustments to awarding to allow for that. I hope that there would be an acceptance by the committee that that would be very difficult for us to do. However, we can deploy flexibilities, and we do. For example, in that case, in the spring, we made adjustments to visiting verification, where we go out and undertake assessments in the school.
We were able to offer some flexibilities to schools to help address some of that. However, I think that, as a national awarding body, there are limits to what we can do to address that difference across Scotland. Learning and teaching comes first, and the support that learners get in their school, in their classroom, comes first. Liam Kerr, thank you for your patience. To close out on the appeals process, you pointed out that there was a pre-pandemic system, a during-pandemic system and a post-pandemic system. What's your early thinking on the future system going forward? Is this year's now the standard for appeals structures, or will there be further revisions? I think that I've highlighted my commitment, and that we have undertaken full evaluations of our awarding approach both last year and the year before. We will do so again this year. We have made adjustments to our appeals process: we've now got a direct appeals process for learners, so learners can make that decision themselves. It's a free service, and that's quite different to the pre-pandemic position, but it is based on the assessments that they've undertaken. We will obviously, as part of our evaluation, consider whether any further change is to be made, but I don't expect there to be a further full iteration in the coming year. I wouldn't expect there to be significant changes to our appeals process this year. We will reflect on the evidence that comes through from this year's resulting. As I say, we've delivered priority appeals for those learners who are going on to university or for whom the results are important for their next steps. We've delivered on that already. We're now going through standard appeals. We hope to result those before the end of next month, and we will publish the outcomes. We will see what the evidence tells us. We're going to take a different tack away from appeals and move on to some of the reform topics. Back to Michelle Thomson with this line of questioning.
I want to allow you the opportunity to get some stuff on the record. You'll be aware that I'm new to this committee, but I have had pretty extensive experience of large so-called transformation programmes in corporate life, and they are invariably difficult, time-consuming and expensive. I just wanted to reflect on where we are with that. The decision to abolish the SQA must have had a resultant impact on the morale of your staff. I wanted to get your reflections on that and to hear more about, from a leadership perspective, what you are doing to maintain morale in the organisation. Thank you for your question. As you have highlighted, organisational reform can be difficult, time-consuming and expensive. The then Cabinet Secretary for Education and Skills announced in June 2021, 27 months ago, that the SQA would be replaced. That has created uncertainty for staff. I think that it's been on the record, last year and the year before when I've been asked about this, that the SQA is full of colleagues who have great expertise and who operate with professionalism and integrity in all that they do. We've talked substantially in this session today about the work that the SQA has continued to do over this period to deliver for learners, this year and every year since the announcement. We will continue to do that. It's important to highlight that, with organisational reform, it will be absolutely critical that we can maintain continuity of delivery over that time. We can't stop what we're doing to allow for organisational reform to take place. We have to manage that process and, at the same time, continue to deliver and continue to improve our services. I've highlighted that our appeals service is now direct to learners and is free. That's a service that we have delivered in short order, for example. The uncertainty that the reform process has created has been difficult, and I will ask my colleague to comment on that.
I think that what we've sought to do as a leadership team, and in discussions with staff, is to keep close to our colleagues at this time, to be honest with them about what we know and what we do not know about what's happening next, and sometimes that's been quite difficult, and to keep our focus absolutely resolutely on the job at hand: to deliver with integrity and professionalism, as we have done over a period of time. We've sought to keep close to our colleagues in doing that. I'm sorry, but just to jump in there, when you say you sought to keep close, have you got specific regular communication sessions in place with your staff? If you could give an indication of what they are and how frequent they are, that would be helpful as well. Over the last couple of years, we've had regular executive team sessions with all staff. How often are they? At least on a monthly basis. Obviously, there's a series of directorates, and there's lots of engagement within directorates. During the pandemic, we had regular pulse surveys, annual people surveys, all those things. We obviously have a performance framework. We were looking not only at performance on qualifications delivery but at a range of other issues, keeping an eye on things like employee turnover, retention and so on. All those things, we've kept a very close eye on. Obviously, as part of our audit and risk responsibilities, we've been keeping a very close eye on ensuring that the balance is right around the risk appetite for the organisation as a whole. We've sought to do as much of that as possible. However, I acknowledge that it has been quite difficult, because it has not been possible to provide answers to staff at times about what's happening, or indeed what's happening next. The reform programme is a Scottish Government reform programme. This is an announcement that the Cabinet Secretary made in relation to SQA. SQA is a public body.
There was an expectation, of course, that there would be legislation to create a new qualifications body, which would be necessary because it would be a creature of statute. We have to have legislation in place to create a new qualifications body. The Cabinet Secretary made it clear in the summer that that legislation was going to be delayed, so there will not be a new qualifications body before 2025. That's a considerable period of uncertainty. I think that it's really important, in all the discussion and debate that has been had about the organisation and what we do and how we do it, that we continue to improve our services, and we continue to do so. We've absolutely had a big focus on our communications, including with learners; the executive team is meeting with the learner panel next week as a team to discuss how learners are feeling. We are doing much more direct engagement with learners, and my team and I are engaging with the wider system very proactively, and that will include closer engagement with practitioners in the period ahead, so we've continued to do that. In terms of legislation and the sort of operating framework, if you were in charge, is there one change you'd say, well, this is what I would like to see? In terms of the legislation, SQA has been consistent in its wish to move from a voluntary framework for accreditation and regulation to one in which there's an expectation, and this is to benefit learners, that all publicly funded qualifications in Scotland are regulated. That provides assurance that all qualifications, whether SQA qualifications, new qualifications body qualifications or qualifications provided by any other provider in Scotland, are of a high quality. At the moment, we have a voluntary system in which qualifications are regulated only on a voluntary basis.
That sounds like quite a technical issue, and it is, but this is about assurance that any qualification that is offered in our schools and colleges is of a high quality. With my other hat on, as chief regulator for qualifications in Scotland, I think that that's really important, and I was really heartened to hear the Minister for Higher and Further Education, in his statement in relation to the purpose and principles and as a result of the Withers review of the skills delivery landscape, say that his view was that the new qualifications body should have oversight of all publicly funded qualifications below degree level. I think that that's really important to give schools, colleges and any other centre surety about the quality of the qualifications that are being offered and delivered in Scotland, and it would put us in line with the rest of the UK. I know that other colleagues want to come in. Liam Kerr, a very brief supplementary before we move to Willie Rennie. Very grateful. Do you believe that the new body will be in place by the exam diet in 2026? The Cabinet Secretary set out in June her expectation that legislation would be forthcoming in this parliamentary session, in this coming parliamentary year. That's been set out in the programme for government, and she set out an expectation that there will be a new public body in place by the autumn of 2025. On that basis, there is an expectation that a new qualifications body will oversee the qualifications in 2026. The truthful answer to that question is that that depends on parliamentary process, doesn't it? It depends on the passage of the legislation through Parliament and the implementation of any legislation that follows. If all goes to plan as set out by the Cabinet Secretary, there is no reason for me to believe otherwise than that a new public body will be in place in the autumn of 2025. I was hoping for a brief response, but can I move to Willie Rennie now?
Michelle Thomson covered the impact of the structural reform and the impact of the delay. Are you losing good people from the organisation because of the further delay? Following the announcement by ministers, we certainly saw the uncertainty that that created, and I think it's important for me to highlight that it unfortunately took a number of months for ministers to confirm that there would be no redundancies and that jobs were safe. I think that that did have an impact: we did see an increase in turnover, and we did lose some people whom we would not have wished to lose from the organisation. Not only that, but in that context it can be harder to recruit; it can be harder to recruit to an organisation when the organisation is not going to exist, and that was the position for a period of time, albeit that it was resolved and ministers were able to confirm by the end of 2021 that there would be no compulsory redundancies. That is a difficult context for me as chief executive and for senior directors in the organisation, and in Scotland we want to ensure that our qualifications body has the best people possible. So we did lose some people, and we have in some circumstances found it harder to recruit. All that said, many fantastic colleagues have remained and remain committed and have continued to work hard to benefit learners, and we have many exceptional colleagues in the organisation who have remained committed to SQA and to the period ahead. This is really interesting, and quite disappointing in many ways. Did Government ministers respond to your pleas for some certainty about redundancies? Yes, they did. And how long did that take? It took a period of time. What, how long? I think a period of five or six months. Right, so five or six months of people potentially leaving the organisation. Are you able to quantify it?
I don't want to overplay or indeed underplay it, but there's no doubt that we found recruitment and retention in that environment, and to an extent continue to find recruitment and retention in this environment, more challenging than it might otherwise have been. Can you quantify it for us? I don't think so. It's hard to quantify the effect, but we certainly saw the effect. And then the further delay, so the decision in June that there would be no legislation for a period: have you seen an effect of that at all on recruitment, or is it just compounding the same issue? I'm not aware of a particular effect, no. It does affect staff morale. There are the prolonged periods of uncertainty in relation to what the shape of the new organisation will be; everybody will be asking, am I going to be part of that or not, and thinking about how their roles are going to change. We've also obviously had the publication of the Hayward review and the Withers review, and the question of what the Government response to those will be, so there's some frustration about when we are going to be able to make changes to qualifications. That frustration comes through particularly from my directorate, because they would like to make changes to improve the qualifications, but obviously we have to wait, so you get those sorts of frustrations coming through. In the interests of time, this is a really interesting topic when it comes to the people, but I'm sorry, I am going to have to move on from this subject. We have a lot of content still to cover, and I have now got my eye on the clock. Have you got anything further to ask in this space? Okay, thank you, Mr Rennie. Can we move to questions now from Ben Macpherson, please? Just briefly on that last point of consideration: has there been an impact on your international work? There's been a focus on the domestic, if you could be succinct. On our international work, I think that
it's an important point. There's a broad point in relation to reform, in the context of the brand of SQA having quite a lot of recognition overseas: SQA, the work of SQA and Scottish education are seen positively overseas, so moving to a new organisational name will potentially mean some things that we just need to manage. Our international work over the pandemic, and in the period since, has obviously been subject to some fluctuation, in a way that you might expect. We obviously had some assurance from the Government, should that be required, for our international partners around our organisational going concern, so that's been dealt with in the normal way, and we've continued to see some growth in some of our markets, but in some cases there's been a bit of a pandemic effect. Overall, though, I am conscious of the convener. Well, if there's anything further on that that you wanted to add, please follow up in writing. I'm happy to provide further information. Moving on to considerations around the delivery board, I'd be grateful if you could set out what the target operating model for the new qualifications body will be, or what it would be. As I highlighted in my answer to Ms Thomson in relation to the reform programme, we've been part of a Scottish Government reform programme and a number of pieces of work. There was a strategic group set up, and delivery boards: a delivery board for SQA to take forward work in relation to the new qualifications body, which includes a range of stakeholders, and there's been a similar board for the setting up of the new inspectorate. That's been the set-up of the governance arrangements in relation to reform. As part of that, we were asked to
develop a target operating model aligned with the design principles, which were broadly based on the design principles that were brought forward by Ken Muir in his report as part of the reform programme. The delivery board oversaw the production, at the Government's request, of a target operating model, otherwise known as a TOM, which was to be prepared and submitted to the Government in June of this year, which we did. I think that we made clear, in relation to Willie Rennie's question on reform generally, something really important in any organisational reform, which is that form follows function: you need function first. One of the uncertainties in relation to the new qualifications body was in relation to function, not least given the independent reviews by James Withers and Louise Hayward. Put bluntly, you need to know what a new organisation is going to do before you can fully understand how it is going to be structured, and so on and so forth. The target operating model has been submitted to Government on the basis of what we currently know, so I would consider the target operating model for the new qualifications body, as submitted to Government, to be work in progress on the basis of what we knew at the time. What we have done is align the target operating model with the design principles for reform, which are about being user centred, data focused and flexible to change coming from the operating environment; being a learning organisation as well as a digital space around learning, and digital by default; and being collaborative and operating in a sustainable way. We set out, through some work that includes looking at route maps in terms of the customers that we have and the products that we are currently responsible for, some thinking around what a target operating model might look like, but as at the point at which it was
submitted, there were frankly some missing pieces, because the Hayward review had only just reported. There has not yet been a conclusion to the Government's consideration of that report, or of James Withers's report. On the purpose and principles, some information came out in relation to that for the tertiary sector at the end of June, and we've obviously also had the national discussion, so there are a number of things that need to be considered before that can be concluded. In relation to those design principles, if the new organisations are truly to adhere to and align with them, there will need to be investment: investment in our systems and investment in our processes. I'm sure that that will similarly be the case for the new national education body and the new inspectorate, so there's quite a considerable amount of work still to do there. The Government has also set in train some thematic reviews, which are the kind of connector pieces for the reform programme, and those are still at a very early stage. Those absolutely feed into the target operating model, so things like digital and culture; there's a whole range of thematic projects, and those are still at a very early stage. I noticed in the SPICe report... I am going to have to ask you to cut down your response times to these questions. I understand the importance of this. I absolutely appreciate that, but I also have to highlight complex issues. Complex issues. No further questions from me. I saw that Michelle Thomson was looking to come in on this point. Is it a brief bit just now, in the interests of time? Okay, thank you very much, and can I now bring in Willie Rennie, please? Thank you. It's probably the most critical I've heard you speak of the Government. I can tell that there's a frustration about a lack of clarity, which you talked about in terms of employment but also in terms of clarity on the function. I would quite like your view, in terms of the Hayward review, on whether you think
that's headed in the right direction, because I noticed that your chairman at the time said that we must ensure that change can be delivered successfully, with a hint that perhaps the current direction of travel was in danger of not achieving that. I'd be interested in your view on that. First of all, I would probably just say that I don't think that I have been critical of Government, but I think that I have a responsibility to be honest with the committee about where things currently are. To be fair to the Government, the Cabinet Secretary has acted on the basis of the reviews that have been completed and on the basis of the point that I have made, and have been advocating for a period of time since the announcements were made back in 2021: it is important to consider function first, and therefore it's important that that consideration takes place. In relation to the Hayward review, I think that it is a substantive report. It considers some potentially very considerable changes to our education and skills system, which need to be very carefully thought through, and organisational reform, and the order of it, needs to be considered quite carefully, in order, as I've highlighted to the committee, to ensure that we can continue to deliver successfully for today's learners at the same time as preparing the ground for tomorrow's learners and future learners. It's really important that we continue to deliver successfully at the same time as making any changes that may come from Hayward. For me, there are a number of important considerations that need to be taken forward in relation to Hayward, and we've made some points quite consistently through a period of OECD reviews and a range of other reviews that have taken place over the last few years. Firstly, any changes to assessment and qualifications should be seen
alongside consideration of matters of curriculum, design and pedagogy. Louise Hayward's report, if implemented, would change significantly the curriculum models that would be in place in our schools and would therefore give rise to quite significant consideration of the things that learners have experience of, and of issues that this committee and predecessor committees have spent quite a lot of time on: subject choice, numbers of subjects et cetera. If the Scottish Diploma of Achievement were accepted as something that Scotland wants to pursue, there would need to be, and I think that Louise has been very careful about looking at, the conditions for success and the investment in the system that would be needed. I think that it would be very important that such an SDA could benefit all learners, whatever their pathway, and that we consider very carefully any unintended consequences, particularly around equity and particularly around the personal pathway element of the SDA. I think that there is a need to promote further integration and choice to ensure that every school is offering a rich curriculum, and I think that that's at the heart of the breadth that is evident in the vision for assessment and qualifications that Louise sets out. As I've highlighted to the committee, clear models of change across the education system and clarity of roles and responsibilities are really important, so those conditions for success, the need for sufficient investment and being clear about what is required will be absolutely critical to ensure that there's capacity. There are already some concerns emerging around the workload implications of some of the recommendations, and it's not only capacity but the capabilities, systems and technology that would be required to truly make this a success. Something that I feel very strongly about, from the perspective of my role as chief examiner, is that there are really important issues around the principles of assessment,
around validity, reliability, practicability, equity and fairness, that need to be at the heart of any assessment and qualification system, and that's really important. So my view is... I am sitting here with my eye on the clock; four minutes to respond. I'm not going to ask any question. I think that the fact that you've got a long list there is an indication of your anxiety, and I think that's a clear message to the Government about the process of reform. Finished? Thank you, and can I move finally to questions from the new deputy convener, Ruth Maguire? Thank you, convener, and thanks for your evidence so far, panel. I'd like to ask about SQA's communication with the profession and with young people, which had previously been a bit of a theme of criticism; the committee has in the past heard the SQA described as an unlistening and distant organisation. I see that SQA's position is that it does involve teachers and that teachers are integral to its work, and I note from your submission that you are committed to, and I'll quote you here, incorporating perspectives and experiences of teaching professionals and learners in your decision-making processes, and you also say that you have a refreshed engagement programme. Could I ask you to give us a bit more detail on that? Specifically, I'd like to hear how, if I am a teacher or a learner involved in that process, I know that the SQA is listening and responding and reflecting back the diversity of views out there. Thanks very much. Firstly, on communication, there are two aspects: there's communication, where we are imparting messages about what's happening, and then there's engagement, which includes discussion and debate about decisions that we need to take. I'll start with communication. I would acknowledge that, when I started in this role, a lot of our communication with learners was through centres; we focused on communication with schools, with the expectation that those messages would be passed on, and indeed in many
cases they were, but there was very little direct communication between the SQA as an awarding body and individual learners. That is something that we have sought to substantially address, while maintaining the very important relationship that learners have with their school. We have done much more direct engagement and direct communication. That includes, for example, booklets this year on national qualifications and everything that you need to know about them, and I wrote directly to every learner in Scotland with information on our new appeals service, so there was no ambiguity and no uncertainty about what that was. That included key dates and a full explanation of what things were. We have also increased our communication across a range of social media channels to ensure that we reach learners directly, and we have engaged with learners themselves on what good channels might look like, to ensure that our communication hits the mark. We have had some good feedback, and we have some really good data analytics on the reach of our communications with learners this year, which is up by as much as 300 per cent in some cases.

Similarly, with practitioners, we have taken a range of approaches. There is a weekly centre news, which goes to every centre in Scotland and includes core messages, but I have not managed to...

I think that that sounds really positive. The convener's eyeballs are on me, so will we move to the engagement bit? You have given some good examples of direct communication.

We have a group of around 4,000 practitioners, learners, parents and carers whom we can survey at any time, so we do, and we have been using a variety of tools as part of our research to elicit views on a range of issues. Over the past three years, I have chaired the national qualifications and
the HNVQ groups, which include national stakeholders. In 2021, for example, the national qualifications group met every single week to provide advice on the decisions that we needed to take. That is combined with our communications group, so that we engage on how we communicate with learners. At a more granular level, Jill's directorate has a range of national qualifications subject teams, which are groups of subject practitioners, and we are currently looking at whether we can widen that approach.

Let me jump in again; sorry, I have been rude here. I am reflecting on a response that was given to Ross. I think that Ross heard about a decision being taken that one section, the young learners panel, I believe he said, was not content with. It is interesting how organisations feed back when they are not going to do what the people whom they have surveyed would like them to do. Could you speak to how that decision, and I cannot remember what the decision was now, apologies, to do something different was handled?

Yes. Engagement is important, and the quality and integrity of that engagement are important, but I am sure that the committee is aware that, in education, there is usually a variety of views on most things. We have to make decisions, and we have to make balanced decisions on the basis of the responsibilities that we have and the feedback that we hear; most organisations have to do that. We seek to collect views and we undertake evaluations. I appreciate that my answers are long, because there is a multi-layered approach.

I guess that I am asking you about a specific example so that you can give me a quite short answer back. It is not about the wider approach; it is just an example of how the organisation might do that.

We could give you an example, or perhaps follow up afterwards, on how we conducted the evaluations for 2022 and how we
will do it for 2023. It is a range of survey work, focus groups and engagement with a range of people. I think that I understand what you said: you are talking about how we feed back on the decision. Is that what you are asking about?

What I am looking for, while completely understanding the range of views and the range of people whom you speak to, is this: we had an example there of one of your stakeholders' views not being taken forward, presumably for completely understandable and legitimate reasons. As a reflection of how well you communicate with your stakeholders, how did you manage telling them why, and how did you tell them?

We tried quite hard to continue the conversation about the decisions that had been taken. Of course, a decision such as the appeals approach needs to be taken through our advisory council or qualifications committee, and it is then a decision for our board, so there will be formal communication on that decision, but there will also be on-going engagement with those groups. With the learner panel, for example, there is on-going engagement about the feedback that it has provided and the decisions that we have had to take. That is an on-going process in the normal way.

I accept that it is maybe the way in which I am asking the question, but I do not feel that I am getting an answer there, and I do not know whether that is reflective of the communication style of the organisation.

I am not sure that I understand your question; apologies. What I am trying to say is that we aim to make evidence-informed decisions that are the culmination of engagement and discussion. That feeds into a formal process, as we are obliged to follow as a public body, including through our board, and we feed back the decisions that we have taken in a variety of ways, both formally and informally, as part of our on-going engagement with different groups.

That is okay. Thank you.

Okay. Thank you very much, deputy convener, for
drilling down there. Can I now thank the panel for their time today. The public part of our meeting has concluded, and we will consider our final three agenda items in private. Thank you.