Good morning, and welcome to the fifth meeting of the Education and Skills Committee in 2019. I remind everyone to turn their mobile phones and other devices to silent for the duration of the meeting. Today we have received apologies from Iain Gray and Gordon MacDonald. The first item of business is our inquiry into Scottish national standardised assessments. This is the fifth panel of evidence in the inquiry, and I welcome this morning Lindsay Law, convener of Connect; James McEnaney, lecturer and journalist; Darren Northcott, national official for education at the NASUWT; and Susan Quinn, education committee convener of the EIS. Welcome to you all. This is a big committee and a big panel, so you may not want to contribute to every question that is asked, but if you would like to respond to a question, please indicate to myself or the clerks and I'll try to pick that up. For the benefit of those watching, I want to explain that the committee had an informal meeting with teachers on the topic this morning, points from which may be raised during evidence, and I would like again to put on record my thanks to all those who attended the informal session. I'd like to open by asking each of you to briefly outline your experience as it relates to the inquiry, and I'll go to Ms Law first.

I'm the convener of Connect. Connect is a parents' group and a registered charity, and we provide support to parents and carers all over Scotland. We have a membership model, so parent councils and PTAs are members of Connect and can access additional services. I've been on the board of Connect since, I think, 2016, so three years. Prior to that, I was on the education committee as a parental representative for the City of Edinburgh Council, and I've been involved as a parent helper or in parent councils since my daughter started nursery school in 2007.

I'm the education convener for the EIS, which is an elected position within the institute.
I am a primary teacher by trade and was a primary head teacher up until the last few years. I've been involved in the work around the SNSAs and the wider assessment and curriculum work as a member of the curriculum for excellence management board and the subsequent follow-up boards and different groups in relation to that.

I was a secondary school English teacher, and I'm now an FE lecturer. I suppose that I'm here because, when the policy was first announced, I tried to investigate its origins through what at that point was my first FOI request. I spent a year dealing with that and trying to get the information so that I could publish it. Since then, I've had an interest in looking at the development of the policy and the way that it has interacted with issues such as government transparency and policymaking.

Good morning. My name is Darren Northcott. I'm a primary teacher by background. I'm currently the national official for education at the NASUWT. We represent members in Scotland who are engaging with the SNSAs directly. We also represent members across the UK, so there's a degree of compare and contrast: there are different approaches elsewhere in the UK that it might be interesting for the committee to consider. We're also involved internationally, so we have a great deal of experience of how other jurisdictions introduce nationwide and system-wide assessments. So there's hopefully something that we can share that's of use to the committee this morning.

Thank you. I'm going to move to Liz Smith.

Thank you, convener. I'm particularly interested in what I think is the central dilemma that the committee has been facing about the purpose of the standardised assessments. We've had, as you know, three evidence sessions prior to this.
I think that it's been put to us that there is a dilemma because, in some cases, the assessments are being used to measure individual performance by the child, but at the same time they're being used for a summative purpose, so that schools and local authorities can drill down on where there are issues about underperformance. I'd be very interested in your views, from your professional backgrounds, as to where you think we can go with this dilemma, because obviously there is an extremely important educational argument that the child and the best interests of that child matter most in the assessment, but on a national basis there is obviously the concern that there is some underperformance and a question about whether the assessments should be used from that perspective. So I'd be interested in how you feel we could address that dilemma.

Shortly before the announcement that the SNSAs would come into play, there was a national discussion about how we would gather information about where education was within Scotland. There had been the Scottish survey of literacy and numeracy in place for a number of years, and there was some challenge around it towards the end of its existence. Part of the work of the management board and the wider stakeholders was with the then cabinet secretary for education, to discuss what was needed going forward. What was generally agreed among the stakeholders at those meetings was that there was a body of evidence within our system that told us everything that we needed to know, and that the challenge was how that could be gathered to give you, as elected politicians, a national picture that was going to be easily understood or otherwise, but there was no real feeling in the room that there was any requirement to introduce an additional test into the system as a national test.
What has come forward from that is an assessment, a test, whatever we want to call it, because we're at cross-purposes around it, that has been developed taking account of some of the concerns that certainly the trade unions and parent groups raised at the start, in that we didn't want to return to what we had under 5 to 14 and the national tests there. What we've got is something that is trying to do all things for all people and potentially can't. The SNSA, as it has been developed, seeks to provide some diagnostic information across potentially only about 10 per cent of the curriculum base, so it's not providing the widest of information to schools, but it can be used to look at individuals and groups within classes. One of the challenges around that has been how it has been implemented in local authorities. Had there been a situation where it had been introduced and left to teachers and schools to decide if and when they required to use it to inform their professional judgment, it may well have provided a valid resource in addition to what's already there. Schools, as you will have heard, use a broad range of assessments, including a number of standardised tests and assessments for specific purposes, and those will continue regardless of what happens with the SNSA. In terms of what it provides for a national context, the question is how reliable that can be. It's going to be done at different times in the year, it's going to be done with different groups of young people, and different supports are going to be provided, so it's not standardised in the broadest definition, and that brings its problems.
We would suggest that the information that is now being gathered around achieving a level, which sits within the national improvement framework, is the kind of evidence that you should be looking at and that parents will be interested in, because it gives the broadest picture of the young person's learning across a period of time, not just 10 per cent of it. That in itself is becoming more reliable, because I know that the question at the time was how reliable it is. It is becoming more reliable as teachers and schools work together to moderate the information that's there and get a better understanding of the benchmarks, which were introduced only in the last year or two. We had been working with the curriculum and then the benchmarks were introduced after the fact, so there was a lot of work that needed to be done. That could have been resolved before the SNSA was introduced. I would argue that it should have been resolved: teachers should have been given the time, the training and the space that they needed to better understand the levels and the standards that were required, to make it a more reliable system for you, and then we might not have needed to go down this route.

Thank you very much for a very full answer there. Can I ask Lindsay Law whether she feels that parents understand the purpose of the new assessments as something that informs them about how well their child is doing at school? Do you feel that that purpose is clear?

No, that purpose isn't clear, partially because, although they are described as standardised tests, which would imply to parents that they would happen at a certain time and in a certain way, across each local authority and in fact across each school the way they've been described to parents, the way they've been communicated to parents, is different. Some parents have told us anecdotally of letters coming home telling them that this is a mandatory test, that they have no option and that they can't withdraw their child from it.
That is partially against a backdrop of a dysfunctional and difficult relationship between central government and local authorities. Parents and children are being used as playing pieces in that dysfunctional relationship, and that's not helpful to the individual child in the individual classroom. Just in terms of what parents get from schools now, there has been a complete cultural change in Scottish education over the last decade. I was reflecting on the reports that my parents would have got when I was in school, because my mum kept them, so I have them. There are literally lines: maths with a score, English with a good, better or best. My daughter's reports, especially in primary school, are much more descriptive: they talk about the broad general education, they talk about the outcomes in the curriculum for excellence, they talk about how children are becoming responsible citizens and effective contributors, and those things can't be measured once, in primary one, with a very narrow focus on literacy and numeracy. By suggesting to parents that they can be, you're confusing parents, because it's quite difficult when you go to school and get a report that has a whole load of descriptions of outcomes that you're not familiar with. It's difficult for teachers to get parents to a place that is removed from where they were when we were at school, which is simply: how are you doing, what level are you at, what progress are you making, against very easy-to-understand scores. Because of that, what parents and teachers need to do, and what we encourage at Connect, is to have a conversation: what is the potential of the child, how is the child getting on in school, and what can we do across that whole curriculum? This narrows the focus and somewhat undoes the work that has been done on the assessment and monitoring of progress in curriculum for excellence.
Finally, when curriculum for excellence was introduced, I recall that we had people from the SQA and the National Parent Forum of Scotland come to the consultative committee with parents, and they said to us: look, it's going to take some time for this to work through the system; it's going to take some time for schools and for teachers to become proficient at assessing where your children are. Parents don't have time. A parent has one shot at schooling, and their child has one opportunity to meet their potential. If those standardised assessments are aimed at improving the system over the long term, but at the same time are intended to improve the experience and the outcomes for children in the classroom at present, it's very difficult to see how they'll do either.

My last question, convener. I think that my question might be picked up by the other witnesses. Can I ask you about the comment that you made about where you feel there's a dysfunctional relationship between national government and local government? Do any of the other witnesses feel that that is because they are looking to these tests for different reasons and for a different way of assessing?

I'll respond to that briefly. Your opening statement said that one of the purposes of these tests could be to help teachers, in effect, to make effective teaching and learning decisions about pupils' next stages in their learning journey, and also for the system, if you like, to get information about where it is at a national level, at a local level and at a school level as well. So that is, if you like, a formative purpose and a summative purpose with one assessment, and that has never been achieved anywhere. I think that there's a prior step in analysing this: do we want these assessments to be formative, or do we want them to be summative? Because it is difficult for one assessment to achieve both ends.
I was quite taken by the evidence that you received from Education Scotland, where on one page it says that the tests are designed to be used formatively and not as summative assessments, and then on the very next page it talks about how the assessments have been used to form judgments about the effectiveness of a particular department in a school. If there's confusion at that strategic level, it's not surprising that teachers and parents question what the purpose of this assessment is and how it should be used. Formative and summative assessments are completely legitimate policy aims, but we have to be clear about what this assessment is trying to achieve, one or the other, because experience tells us that trying to achieve both is very difficult.

As various people have said, the word confusion keeps coming up. My view of this has been a bit different from other people's because of my interaction with it. But if you look back, as I did before coming here, through all the material that I've dug up in the last three and a half years or so, and you look at the email chains and documents and policies and everything that has shifted about all over the place, confusion is the word that sums up everything most effectively, I think. So we have a situation, as you say, where we claim to be trying to have a testing system that's going to perform a summative function, which is the thing about ensuring that teacher judgment is reliable, helping to ensure that that's something that we can trust, which is a separate debate anyway. That's a summative assessment that we're trying to do there. We're also apparently using the tests, as you say, for formative assessment, to tell teachers more about each individual child in the class. The notion that you're going to be able to combine those two things in a single assessment is optimistic, I think, at best.
I would agree, and I'd be surprised if there were any evidence to show that it's something that is likely to be possible. In trying to do both jobs at once, I think that what we're probably doing is doing neither of them well at all, and potentially doing quite a lot of damage while we're at it. And it goes back to the origins of it all. There's a testing system that, when it was first put forward, was quite clearly conceived as something about national-level data. This was going to be a national measurement, because that was what we needed. That is why, ultimately, it has been seen, incorrectly, as having replaced the SSLN. You then go from there into: right, well, we'll just use it to inform teacher judgments, and that will be the national picture, and the tests will be part of that teacher judgment as well. But when you're confusing the two test purposes, you're also in a position where you're essentially saying: we trust teachers to make these judgments and we're going to rely on teacher judgment as a national measure of an education system, but somehow we don't trust teachers to make those judgments unless they're using a standardised testing system that we think has got two or three different purposes. This goes on all the way through.
There's no level of this testing system that I've looked at in the last three and a half years where some screaming contradiction doesn't seem to come out of it, and it leaves me feeling that ultimately we're now in the situation that a lot of people three and a half years ago said we would be in, which is a committee sitting and having a discussion about what is a formative assessment, what is a summative assessment, and whether we are ever going to get any closer to having some sort of magical testing system that gives us national data. Actually, on the point about the summative and formative thing, because that confusion, and the argument that this is some sort of formative testing system, comes up all the time: it came up most recently when the Scottish Government, in response to an FOI request, cited two academics as having supported the policy, and when it turned out that they had both said, "No, that isn't true", the defence was, "Well, what we thought was that they supported formative assessment methods; we're really sorry if we misquoted them." I actually contacted one of those individuals, Professor Dylan Wiliam, and put the First Minister's response during First Minister's question time to him. That had actually been for a story, but there are some things going on right now making it hard to get stories into the press that aren't to do with Brexit. His response was quite clear that these tests, specifically at primary 1, do not provide useful formative information, and anybody who knows anything about education will probably understand why Professor Dylan Wiliam saying that is quite a big deal.
We're not going to get any further forward with any of this, and that's before we even get to the ideas about closing attainment gaps and dealing with poverty through schools, until we can actually nail down what this testing system is supposed to be about, because we're still no closer to doing that three and a half years after Nicola Sturgeon's "judge me on my record" speech in Wester Hailes.

I don't wish to trade in professors here, but since they've been mentioned, and some very important points have been made about the purpose of the assessment, I was interested in this idea of assessment potentially having more than one purpose. We've obviously heard quite some evidence on this in previous hearings. Professor Hargreaves pointed out that there's a general principle, which many but not all people accept, that data that is collected for one purpose should not be used for another, but that that does not mean that data should not be collected for two purposes. My question is really about the issue that some of you have pointed to, which is the need and the pressure on local authorities to try to close the attainment gap and deal with the inequalities that exist very early on in a child's life. If it's not this data that you want to be used, what data would you want to be used as a benchmark for local authorities and others to try to address those problems?

The benchmarks that schools and early years establishments have on young people from the very early stages: the widest possible assessments go on there. Young people will have assessments that are carried out by health visitors and in the early years establishments, and a number of approaches then go on into schools, where benchmarking and early intervention strategies are put in place from the beginning in primary schools. There will be very few schools that won't be assessing young people at the point of entry to primary one.
They will do that using a whole range of strategies, whether it's teachers' observations of the young people at play or their own formal approaches to assessment. There will be a whole host of pieces of information about the young people from the very earliest stage to see where they're at, and then you get to the fact that, at the end of primary one, teachers are making a professional judgment as to whether or not the young person has achieved the early level. That will be based on three years' worth of assessment of a young person. It won't be done, as we're generally told, in a single half-hour session assessing their literacy and a single half-hour session assessing their numeracy. That won't give you the information in terms of closing the attainment gap or otherwise. It will always be the position that the teacher's professional judgment and the informal assessments, the informal interactions that happen day in, day out between teachers and pupils, support workers and pupils, parents and teachers and everybody who is involved with the young person, will provide that data, which will then allow us to see whether or not the young person has achieved the level at the appropriate point in time, at the end of primary one, primary four and so on, taking into account any other specific diagnostic assessments that are identified as potentially being helpful to an individual young person because there's a conversation about whether or not they have additional support needs. These assessments won't identify whether a young person has additional support needs; they won't do it. It will require other assessments to take place, and those will only take place if there is a really broad basis for assessment.
So if you're asking whether these assessments are going to be the benchmark by which you determine whether or not the additional money for PEF has had its impact, then no, these single half-hour, or however long it takes to do them, assessments won't do it. It will be the broad assessment bank that goes on in schools day and daily, and the conversations between head teachers and their teachers about whether or not the young people are achieving the level, based on the benchmarks that have now been introduced and the Es and Os that they've been planning around.

Yes, I'm going to bring Ms Law in, and then I'll come back to Dr Allan.

I think that there is confusion among parents over the diagnostic nature of the test: parents might expect to get a diagnosis of dyslexia or something else that might need additional support. That was my first point, that there is confusion over that. My second point, in regard to gathering data and closing the attainment gap, is that, when young people leave school, we don't talk about their literacy and numeracy; we talk about positive destinations, and we talk about the people that they have become. Actually, schools are becoming much more creative and much broader in their understanding of the broad general education and the curriculum. They're undertaking work with colleges, work with local enterprises, bringing the community into the school and taking the school out into the community, all to drive positive destinations for our young people, which may or may not be academic when they leave school. By focusing at each stage on a very narrow measure of literacy and numeracy, you really are detracting from the message that you sent to the people of Scotland about that broad general education and what curriculum for excellence was supposed to attain.
My concern would be that attention will be drawn away from some of that really great work that's being done in schools on those positive destinations, by the pressure that schools get from local authorities and by the pressure that they get from official or unofficial league tables, which would spring up off the back of this data, towards a very narrow focus on literacy and numeracy, which is exactly what curriculum for excellence was supposed to draw us away from, in order to focus on the whole child, the whole young person, and how they move forward to be effective and happy responsible citizens.

Dr Allan? Just on the back of that, I absolutely accept what was said there about the importance of the conversations that take place throughout a child's school career between parents and the local authority, and the other sources of information that exist. I suppose my question is really whether that provides data that would be usable for local authorities to make policy interventions, because I appreciate that we're talking about two things at once here: we're talking about interventions in the life of a child but also policy interventions. Do those kinds of conversations, in your view, and I don't know the answer to this, provide the kind of information that would allow local authorities to make policy interventions?

They should be able to, certainly with schools talking more outwardly to each other and across local authorities and the like. The information that's there would be around those conversations about what kinds of interventions a particular school has been developing, whether that has made a difference to the achieving a level data that it has in its broadest sense, and considering whether or not those interventions might be transferable elsewhere, with an acceptance and an understanding that education isn't one-size-fits-all and that you have to look at the context and make a determination about it.
I'm absolutely clear that, if the conversations are going on properly, and if the time and space are there to do that, then this kind of evidence is based on much more than the very narrow approach of the standardised assessments, noting that the information around the standardised assessment that's been gathered nationally is there only to consider whether there are norms around particular areas. It's only going to tell you about 10 per cent of the curriculum at a point in time that is so very different, and we haven't got common approaches to things across the country, because the country isn't a common space in terms of what will work in one place and what will work in another. The wider educational information, I believe, does already make a difference to local policy, and has done in the past. I've sat in Glasgow City Council's assessment and curriculum group on a number of occasions over the years, and we use the information that comes from our schools, based on what headteachers are telling us, to drive policy at local level.

Just briefly to return to your first question about whether there is a need for national bodies and local authorities to have data on educational performance: I'd say absolutely yes, because they're public bodies, they're run by democratically accountable and elected people, and therefore there is a need for them to have good information in order to make national and local policy. The issue is that, if you try to make that policy on the basis of a very narrow range of indicators, or a single indicator, you may not make the best policy. So a national assessment may have a role to play in that, as long as you're clear that that's the role it has, but it's also important, as we've heard, to make sure that there are other sources of data and information as well. And I would just add that it's not only about assessments.
I think that inspection has a really important role to play in giving policymakers at a national and a local level some understanding of the progress that's being made in the education system, the impact of interventions generally and on different groups, and what policy implementation needs to be taken forward in order to address any problems that are identified. So it's about having a range of good-quality information and data on which to base policy decisions, not simply focusing on one standardised assessment across the system. That may have a small part to play, but we need to put it in perspective.

Two points. One goes back to that first thing about having national-level and council-level data. We did have something that gave us national-level data, which was the SSLN, and it didn't break down to council level, but I can't remember the name of the paper.

Assessment and Transition?

There's a paper that was done by Glasgow University. It looked at all sorts of different things, but one of the recommendations and ideas in it was that we could expand the SSLN to perhaps include local authority level breakdowns, which would have been an idea worth looking at that could have taken us some way towards having a national picture that we could also have looked at in a more localised fashion, without ever needing to go down to the question of how we can get school-level data about every single school, because the potential problems that that would create may very well outweigh any benefits that you would get from it. The other thing is the point that Susan was making, which kept coming up at the informal session as well, about time and space.
There seems to be this idea that, by introducing this test, some bit of data is going to arrive that will then allow a council or a government to make policy interventions; this bit of data is going to let us do this thing, which will make things better. Firstly, as people have said, it simply isn't going to be able to do that, and there's no point kidding ourselves that these kinds of tests will. In order to get that kind of information, to create those kinds of improvements and to have a system in which we can actually achieve that sort of progress, time and space is the key thing. You need to have a system in which professionals are able to be professionals, with the sorts of discussions that have been mentioned. There has to be time and space for them to happen, and not just at council level but at school level, and then across local authority and national levels. I know that it's not hugely convenient that there just isn't a single way to do it; there isn't a straight, easy answer, and there isn't a quick answer. The only way to get that kind of information, to have it reliable enough to make it worth acting on, and to get to a situation where we could start, as we say, to take a useful intervention in one place, decide whether it might work somewhere else and maybe transfer it over and try it, is a necessarily, I think, slow and careful process, as opposed to one in which we focus on getting data quickly and finding a use for it, or telling ourselves that we've got a use for it, because we feel that we need to be seen to be doing something.
There's a lot of damage done a lot of the time by that constant need to be seen to be active rather than carefully working through the kind of slow processes that people at every level of education keep consistently saying are the key thing that we need to focus on. Therefore, the risk with the SNSA debate, or the national testing debate, is that the more time we spend on that, the more time, frankly, we spend in chambers like this having this kind of conversation, the less time we spend doing the things that we should be doing, the things that could potentially make some sort of difference to addressing the impacts of poverty in schools, although that itself, I would argue, is also an issue that we don't really understand very well.

I just want to pick up on the point from Lindsay Law about literacy and numeracy and broad education. I understand that you don't want to focus entirely on a very narrow assessment, but is it not reasonable to say that one of the things that schools need to be doing is to give children the building blocks to access broader education, and that, in fact, being disadvantaged in terms of literacy and numeracy from early on then feeds its way through to exactly the point of what your positive destination is going to be, and some of them are not very positive at all? How do we get that balance right? I was a teacher in the late 1970s and early 1980s, and there was a bit of a thing then about not needing any of that rigour: we just learn through reading, or whatever it might be. How do you address the question that being confident in your reading and mathematical skills is a fundamental means of opening up access to education?
I don't think that Lindsay was suggesting that we don't have a focus on literacy, numeracy and health and wellbeing at the core of what we're doing across the BGE, and the transformational work that has been done in those areas as a result of the development of curriculum for excellence is there for people to see in schools. However, focusing a national test on those two areas draws attention away from much of the other work that's going on in schools. Lindsay will speak for herself, I'm sure, if I'm misrepresenting what she's saying. The fact is that those tests won't ensure any more rigour in the work on literacy and numeracy than before; the rigour is there and the work is on-going. The transformational work on literacy and numeracy, particularly at the early stages, the development of play-based pedagogy and the tackling of the gaps in learning that young people come to school with are there to be seen in our schools. All you need to do is visit them on a regular basis and you'll get sight of them. You will get information on the impact of that work through the assessments and the discussions around achieving a level; the standardised national assessments will give you no more than that. That is what our teachers are telling us after the first year of using them: even those who say that the assessments are okay tell us, "It has told me nothing much more than I already knew about the young people in my class." If they're telling you nothing much more than you already knew, why waste time doing them? Why not use your time to teach the kids and get on with the rest? If you're assessing, you're not teaching.
Even if you say that they take only half an hour, they don't, because of the way that they have to be worked on: teachers are having to take time away from addressing the teaching needs of the whole class to work with a couple of people on a computer, because the infrastructure isn't there, or otherwise. So there are all sorts of reasons why, if this information isn't giving added value to the system that's already in place, it's not worth doing. It's not that literacy and numeracy aren't the focus; they absolutely are. I would echo Lindsay's point, which is that we focus too much on narrow bits of what we're doing within the education system. One of the ways in which you will know whether the interventions that are happening at the early stage are working is by looking at the destinations of our young people. If we continue to improve on the positive destinations and the positive experiences there, that will show that we are making a difference all the way through the system. There has to be some rigour around what's put in place to make sure that no child is being missed out in the work that's there. However, literacy and numeracy are at the core of what we're doing; it's just that there are many more things that we could be looking at and promoting in our education system than just what the literacy and numeracy scores are at P1 or P4. Mr Northcott? Just briefly again: literacy and numeracy are foundational, and I think that that has to be recognised. I think that one of your many professors who has given evidence said that although the areas of assessment for the SNSAs are narrow, they're quite important, and that has to be recognised. I don't think that there is a challenge to policymakers at national or local level having a particular interest in literacy and numeracy; I think that that's legitimate. The problem comes when that is all that you end up focusing on.
I have to say that, if you take, for example, the experience from south of the border, that has been a serious shortcoming in the education system, and even the national inspectorate, Ofsted, now recognises that there has been a disproportionate emphasis on literacy and numeracy to the detriment of the rest of the curriculum. The inspection system in England is being recalibrated to say that it's not just about a narrow range of literacy and numeracy indicators; it's trying to get a broader balance. In the context of national policy here, I think that it's fine to have a particular focus on literacy and numeracy, understanding also that the SNSAs were supposed to be part of a broader assessment focused on CFE levels. I'd add the point about the critical role to be played by inspectors as well: inspectors are able to go into schools and form a judgment about the holistic nature of the education being provided. That balance is difficult to achieve. It is, and to some extent this is an iterative process. We're in the early days of the SNSAs, and I think that we have to work through some of these issues, but it's about getting that balance right. The key trap to avoid is ending up focusing, in a punitively high-stakes way, on a very narrow range of indicators, because that will impact on the breadth and balance of the curriculum that children experience. On this thing about becoming far too focused on very narrow indicators: as you say, you would expect people to have a focus on literacy and numeracy and health and wellbeing as foundational aspects of education, but it is very, very clear, and in Scotland it's very clear that we're going in the wrong direction on this compared with some other countries, that there is a really extreme focus on a relatively small number of things that we're going to try to target or that are going to form our data points. It happens for various reasons. Governments need good news stories.
Oppositions need things to batter them with in the press. Journalists want stories that can go out the next day, and everything. It's hugely frustrating. As somebody who writes in the Scottish media about education, it is massively frustrating for people like me as well. It's incredibly difficult to have the full discussion that we need to have about those kinds of issues, but I worry that, increasingly, and certainly over the past few years, Scotland is moving very obviously in the wrong direction on that. We're doing more and more of it, and, as I say, there are political and media reasons, and issues to do with the devolutionary structures of the Scottish Parliament, that feed into all of that. However, if one of the good things that comes from this is a recognition across the whole chamber, and beyond that across the whole of society, that we are actually making a mistake in the way that we are becoming far too focused on really narrow, atomised aspects of the education system, that would at least be something to come out of it. It would at least take us away from the direction of travel right now, which, as somebody who teaches in the education system, writes about it and has a four-year-old son, worries me. As I say, if a consequence of all of this, of these various mistakes and all this confusion, is maybe a discussion about actually stopping that, putting the brakes on a bit and actually thinking through the direction that we're going to go in, maybe that would be a positive to come out of it. Yeah, I'll bring in Rona Mackay. Do you want to go to your question? From what you were saying, Mr McEnany, I'm just wondering whether we're in danger of overcomplicating this. As Susan Quinn has said, this is one of a broad range of assessments that are made, so I'm just not sure why it has become such an issue. There are, I suppose, a few reasons for that.
One of them is a sort of principled thing: on the one hand, we're saying that we trust teacher judgment to be the metric by which we measure an education system, but at the same time we're saying, effectively, whether it's explicit or not, that we're only really going to trust it if it's based on a set of standardised tests that the Government has decided it wants. I mean, that's going to get a reaction, I'm afraid. The other thing, which has been touched on already and was mentioned several times in the informal session, is that there is an opportunity cost, not just to these tests themselves but to the culture that they are likely to lead to. Ultimately, it almost certainly ends in some version of the standardised test data becoming public. We've heard about schools sending out SNSA reports to parents and things like that, so part of the reason why there's a lot of concern around this is that it represents something of a direction of travel in and of itself. In other countries that have gone down this kind of road a few years before us, the tests themselves have had an impact well beyond what they were ever intended to do, but it's an impact that is, certainly at this stage, relatively predictable, so I'd direct people towards things like NAPLAN in Australia. When you see a lot of comment, particularly coming out of the teaching profession, I think that mostly it is coming from that. There is a genuine concern that there comes a point when we have a real focus on "We are going to introduce this kind of testing, and this kind of testing will tell us things." It's not just a question of whether it might give you a bit of information, positive or negative; once you start factoring in things like the opportunity cost of it all, there's potentially more damage down the line as well. I think that that's the thing that maybe needs to be borne in mind with a lot of it, beyond the simple point.
Your basic point is correct that it's a single assessment and we do lots and lots of assessment, but there's a particular effect of this kind of system and of the way in which this kind of system has been instituted. I think that it would be… Potentially. You could argue that, but ultimately, if you're going to have a politician stand up and give a big speech about "judge me on my record" and try to tie educational improvement to electoral cycles, you might need to take that as the starting point for how things then start showing up in the media. A roomful of journalists took the First Minister at her word, I suppose, in that kind of sense. I don't think that the First Minister was saying, "Judge me on the record of the SNSAs." No, it was on the record of educational improvement, but in the same speech she said, "I will introduce standardised tests." I mean, what do you think is going to happen? I appreciate that it's not intentional, but I think that a lot of where we are just now is where we were always going to be as a result of a lot of the things that happened earlier in the process, which is unfortunate. Ms Law? Why is it such a big deal? Because resources are limited, and resources that are placed here are not placed somewhere else in the system. What we've heard from parents… What do you mean by resources? In terms of teachers' time? I mean in terms of money, in terms of teachers' time, in terms of everything. We're in a finite system. If that is being introduced, it costs something that is not going elsewhere. Parents are already having these conversations. Schools are already tracking the young people in their schools. At my parent council at high school, now that my children are at high school, we've sat down with the headteacher and we've looked at the performance of each year group. We've looked at the virtual comparators. We've looked at the comparators within the local authority. That data exists at the school level.
The tension is between whether it exists at school and local authority level and whether it can be gathered at national level. That is not a tension that should be played out in children's lives. Primary 1 children have just arrived at school. We know what the data will tell you there: it will tell you that children from a lower socioeconomic background will not be as advanced as children from a higher socioeconomic background. It won't help teachers or parents to get anything out of that that they don't already know. What we encourage at Connect, and what we should all be encouraging, are good-quality conversations between teachers, as the primary route by which the education system is delivered to children, and parents, as the primary supporters of that. We know that parents and their engagement make a massive difference to young people's outcomes. The way for parents to get involved in school is not to receive a report with quite dense and complicated information in it that tells them about a point-in-time snapshot of a small part of the curriculum; it's to have good-quality conversations with teachers about the young person in the context of the classroom and how the parent can support that. That requires investment, smaller class sizes and all the things that the background of austerity is stopping us from having. It means that parent councils are no longer providing added value to a school; they are now providing basic infrastructure and basic needs for a school. It means that teachers no longer have the time to spend looking at a personal learning plan for a child but, instead, will be more focused on ensuring that their school doesn't come at the bottom or in the middle of a league table that will be prepared from a very narrow focus of the curriculum. We need to think about the young people. How will that affect their classroom learning, and what can we do to change that?
I don't think that it will do that in the short term, and I'm not convinced that it will do it in the long term. So how do you advocate that we assess children? No tests? No assessments? We've already told you that there are assessments going on. Can I just respond to that? In the evidence that I just gave you, I explained that, from my understanding and from parents' understanding, children are assessed constantly by teachers. They are assessed through local authority standardised tests. They were assessed through the SSLN. They are assessed constantly throughout the course of their school career. In fact, when they come to high school, you might argue that they are assessed so much that it takes away from teaching time, because there is a continued focus on numbers rather than on individual children. That is driven by a national government obsession with being able to say that Scotland is leading the world in X, Y or Z. Parents don't care about Scotland leading the world; parents care about their child's education and what it means today, tomorrow and the next day. If you test a primary 1 today, you will have to wait seven years for that to work through the system before you can get anything meaningful out of it that will help the next cohort coming through, and our children do not have that time. They need resources in the classroom now to help them now. The introduction of a national standardised test, as we've seen in the past, makes us lazy in our conversations about education attainment and achievement. When we had the 5-14 tests, all that we talked about was the percentages. I remember having those conversations as a class teacher: how can you get your class to be at 80 per cent when it's at 79 per cent? Not: what's the added value that's going to go in there for this young person? How are we going to get to that percentage point? A test is a lazy way to report.
It's an easy way, because you can make graphs and all sorts of pretty pictures out of it, but it's a lazy way to report what happens in our system as a whole. Assessment happens in every minute of every engagement that teachers have with young people. If, as a teacher, you see that something isn't working, you immediately look to the next step. It doesn't require a formal test for you to decide to change how you're going to work with that young person or that group of young people. Introducing something that is described as a standardised national assessment or test makes us, as a country, lazy about how we approach things, because it's easy for journalists to get the information and create league tables. It's easy for you in Parliament to say, "Oh, look at that, the SSLN has dropped by 0.06 per cent, so you must be failing as a Government and everything's wrong." It makes us lazy, because the focus gets directed to an individual thing, because it's easy to do that, rather than having the complicated conversations, which are the meaningful ones, that have happened every year in the 30 years that I've been teaching between parents and teachers about what's going on with their young person. It creates a narrative around the system that becomes negative and takes us away from the good work that's going on in our classes. Again, just to come back to your point, I don't think that there is a problem with having some kind of national form of assessment; I think that that is legitimate. The issue is the purposes to which the outcomes of that assessment are put. That is the issue.
I think that, to give some credit to the Scottish Government, its evidence to this committee makes clear that it recognises that there are dangers in high-stakes assessment and dangers in narrowing the curriculum, and we've seen the damage that has been done elsewhere. The trick is to make sure that, if we have a national system of assessment, we understand its limitations, we understand what it tells us and we absolutely understand what it doesn't tell us, and that we then act accordingly and put a proportionate amount of weight on what the assessment can tell us. The danger, given some of the evidence that we've heard, is that that message isn't getting through and that people are attaching high-stakes purposes to the assessment. If there's one recommendation that I would make to the Scottish Government, it's to double down on its commitment that this is not a high-stakes assessment and that it doesn't want to see teaching to the test. That's a really important commitment, and it has to be put into practice. To stick with this point about confusion over purpose, I was quite interested in the written submission to the committee on the union's initial success, in the discussions before the SNSAs were implemented, in shifting the Government's position. The Government started off very much with the understanding that the purpose was summative, and you managed to shift that. Do you believe that the confusion has come about because the Government shifted from one relatively clear position to another relatively clear position, and that in that shift the information did not cascade down properly, or is it still hedging its bets? Is the confusion because the Government is still sitting somewhere in the middle, or is it because the position changed and, inevitably, during that change, the message was not transferred down to local authority and school level consistently?
For each individual who is confused, the reason will be their own. Some of the confusion arises because, although we believe in the advice on implementation that was developed for local authorities within the groups around this, a significant number of local authorities chose not to follow that advice. I can't speak to why they did so; I can surmise that there was a whole host of reasons. I know from the previous evidence that it's partly because they had systems in place that they were happy with. To go back to my very opening comments, at the beginning of this process, no stakeholder in the system was saying to the then Cabinet Secretary that we needed to add any other form of assessment into the system; all we needed to do was look at how we gathered what was there to get a national picture. Lots of local authorities were not on board with the standardised assessments coming in, so, whether they came in the way that was originally designed or in the way that they have now ended up, those authorities were never going to approach them in that particular way. That then creates confusion about how you relay that to parents and how you share that information. Throughout the process of negotiation and development of the standardised assessments and the policies and advice notes that went around them, there were still mixed messages coming out of Parliament about the purpose of the assessments. As education convener, we have council five times a year, and I am on my feet talking about education for an hour, on average, at each of those five meetings. A substantial part of that time in the past two years has been about this, with members saying to me, "You told us that these were going to be this," or, "You said that they weren't going to be that, and now they are," because the messages coming out continued to be mixed. Why that is?
I don't know whether it was one part of government not talking to other parts of government, or a desire for the message to be something other than what had eventually been negotiated. If the assessments had been implemented in the manner set out in the advice notes and the details that were eventually developed, I don't believe that we would be here in the same way, because teachers and schools would be deciding if and when the assessments were going to support them in the work that they were already doing. You might decide that there would be a benefit in the diagnostic part for a group of young people, but that your own evidence already supported that another group had achieved a level and didn't require something else to be put in place. However, because we had this little bit of it whereby we were going to gather some information nationally to get trends, it became a case of everybody having to do it, and then it becomes, "Well, how are we going to put that in place?" Yes. Excuse me. You asked whether the problem is that we've gone from one relatively clear position to a different relatively clear position. I would probably take issue with the idea that we've reached a relatively clear position now. We're still in a situation in which we have tests that are meant to be formative and summative, and individual and local and national, and we have all sorts of different versions of how that information is being dealt with, so I don't think that even that is quite the explanation.
One of the things that's coming up is the idea that, had they been implemented a certain way, or had x, y or z happened, things wouldn't have been so bad. That, I suppose, is the unintended-consequences defence: "We didn't intend for this testing system to become this kind of beast." I have some sympathy for that, but it's limited by the fact that a lot of these apparently unintended consequences were also predicted at the start, and I think that that has to be taken into account as well. Yes, a lot of this has come from the confusion, and a lot of it has come from poor implementation, but a lot of it is actually what a lot of people said from day one was going to happen. In so many ways, we are where we thought we'd be: we are where the EIS said we would end up, and we are where Connect said we would end up. This committee session today, the fact that we are here and the fact that I'm here instead of in a college teaching is rather an indictment of the fact that the things that were said from the start, which were true, weren't really listened to, it would appear. So we've ended up in this situation, spending this time in here today.
I'm interested in why you think the Government's position shifted, or perhaps became more confused, or however you might want to characterise it. If we are to assume that the tests were not the end goal in themselves, the Government must have come at this with another end goal in mind. One particular end goal takes you in the direction of summative testing, but a different end goal would take you in the direction of more diagnostic and formative testing, and yet the Government seems to have embarked on a path in which, midway through the process, the position changed. From your experience of engaging with the Government, or, James, from your experience of investigating this after the fact, why do you think the position changed? Changing the potential purpose and the design of the tests so much fundamentally shifts the overall objective, but the Government must have started with an objective. I would argue that one of the key drivers was that the EIS indicated that we would ballot our members to boycott any system of tests that was put in place in the way that was described at the outset. We were clear about that, and we very quickly had our members behind us on the idea that a single window in which every single child in the country was tested at a particular point in time with a single test was unacceptable. We would have met the trade union thresholds; we would have smashed them, in our primary sector in particular. So we would argue that that was a significant driver of the negotiations that then took place and that allowed us to get to the situation now. The fact is that we were unable to shift the position that this overarching data was still going to be gathered nationally to look at trends, and I'm still not really clear how that's going to work, because it can't work in year 1 or year 2 anyway, given that trends require more than a single year's worth of data. It means that you have assessments that are trying to serve a multiple of masters. Thank you. Mr Scott. Thank you. I wonder whether the panel might want to clarify whether they believe that a government of whichever political persuasion genuinely needs to have some data about what is happening in schools. Absolutely, and it's there. So what would be the best way to achieve that? The best way to achieve it is what you currently have in place, which is that the levels across each school are gathered. We're told that that data is unreliable. Well, that data will become no more reliable with these tests in it. But that data is becoming more reliable. There are ways and means by which that can happen, and the conversation that we had back at the start was about how we can make sure that the data that we have in the system is more reliable than it is. In which year do you think we'll be able to compare ACEL data year on year to understand what's happening in our schools? Which year? When? Oh, I don't know. You'd need to have a conversation with Education Scotland and the directors of education about that. I believe that our teachers are working hard within schools and are doing incredible work to moderate… Well, wait a minute, wait a minute. I'm trying to establish what data a government of any political persuasion needs in order to understand what's happening, and therefore I'm asking when, in the EIS's view, and I'm very open to suggestions from the rest of the panel as well, we'll actually have that information. I believe that the data that's in the system at this point in time is reliable; I believe that it's there. That's absolutely not the evidence that this committee has had. Again, I can't comment on why other people think that it's unreliable, but at this point in time we see an increasing number of systems in place to make
sure that the data that's in the system from teachers' professional judgment is more and more reliable. Teachers are engaged in more moderation exercises, and the benchmarks are now in place. The benchmarks weren't brought forward by Education Scotland until a couple of years ago, so teachers were working blind on certain parts of it. That was again asked of them, and they continued to work to make sure that the best evidence was there. So I put it back to those groups as to why they believe that that evidence is not reliable, when it is based on the broadest of information and we now have systems in place in which teachers, headteachers, deputes, QIOs and Education Scotland are working not just within individual schools but across multiple schools and across local authorities. I believe that the system is reliable now. Thank you, but I think that it's the Government that is telling us that it's unreliable; it's the Government's own assessment, so that's all that the committee can possibly assess. I wonder, Mr McEnany, whether I could ask you about the specific points that you make in your submission about the SSLN. You've already mentioned the potential to expand it. If it was expanded, and you might want to elaborate on how that could best work, what could it tell policymakers, whether they be journalists, national policymakers or indeed the Government? Well, I'm always nervous, he says as the journalist, about what that kind of data could tell journalists. The advantage for governments of national data that is something more like the SSLN is that, because it is sample based, you don't go down the road of having issues around teaching to the test. I don't know whether any of the people here who were teachers ever did the SSLN, but I did when I was teaching in secondary schools, and you couldn't teach to the SSLN. The data that it gave was not just reliable in the sense of giving a national snapshot. I don't know whether any of the committee members have looked at it but, if you haven't, go back and look at the 2016 SSLN report; the level of data in it is remarkable at points. And not just the report: go back and look at things like the statistical tables. The 2016 one showed that something like 26 per cent of kids in primary 4, I think it was, reported that nobody ever read to them at home. That's the kind of national data that you need, and the SSLN gave us a wealth of it. The SSLN, for the record, is still technically available; the material for it is still sitting there. So, on the one hand, I would say that the SSLN and that sort of idea of a national sample is really, really valuable, and I don't think that a case was ever made for getting rid of it. I think that, ultimately, it was got rid of because, if you're going to institute a system of national standardised testing, it looks very difficult to justify also doing a national sample model, especially if you're making the case that you need the standardised testing system because there's something wrong with a sample model. On the point about teacher judgment and the question of whether individual teachers' judgment of their pupils is reliable, it's actually a two-part question. I think that you're asking whether you can trust teachers' judgment of the progress that their pupils are making, to which the answer is absolutely yes. I'll trust a teacher every day of the week before I'll trust a standardised test. I've got a wee boy who's four; he starts primary school next year. I'm not interested in seeing a standardised assessment report about how he did in 40 minutes one day, but I'd love to be able to sit with his teacher for an hour a couple of times throughout the course of the year. The issue comes, I suppose, with whether you
can use that at a national level to give you the same kind of thing that the SSLN did. My view would be that no, those are two very different things and, given the point that was made earlier about trying to be clear about what assessment is for and what data is for, I think that those are two different kinds of things. That is not to say that I do not trust teacher judgement data, or that achievement-of-a-level data is not useful or accurate. I do think that there are issues, though. We have heard that teachers are doing more and more moderation around what those levels look like, which is true, but I think that we are still a long way from where we would want to be with the time available for doing that. I remain unconvinced that, for want of a better term, the standards have yet been properly exemplified, which is something that comes up quite a lot when you discuss coming to judgements across more than one area.

Could you just help me with the other point that you make, where you say in your submission that there is no properly agreed standard of what the achievement of a level looks like? What do you mean by that?

If you go back and look at the SSLN, there is a very clear statement of what performing very well looks like, or what someone struggling looks like, and there is data given around that. I still do not think that what has ever been achieved is a clear-cut "this is what a level 3 writing assessment looks like". I would add, and perhaps I was not clear about this in the submission, that part of the reason for that is that that is not really what that kind of information is for. I do not think that it is necessarily very helpful to try to view level 3 writing as though there is a single thing that it is going to be. However, it is still the case that I think we are a long way from having a situation in which teachers actually have the time, the space, the professional autonomy and the trust from Government, from Parliament and, I would happily say, from journalists as well, to do that proper moderation job. As much as I trust the judgement of teachers, that is probably how you would get to the point, to go back to your original point, at which the Government could no longer say that it could not trust it.

Thank you. Mr Northcott, do you have a view about the SSLN? Are you comfortable with the suggestions, not just today but in previous committee sessions as well, that it should be changed, altered or enhanced in some way?

All I can do is make the obvious point that, if you are in the position of the Scottish Government, given the advice that it was given by the OECD, it was probably going to be difficult for it to continue with the SSLN in its existing form. The question is what you replace it with. Personally, I am not sure that I share the OECD's analysis; I think that there were important strengths in the previous arrangement. The SNSAs, I guess, might be able to fulfil that function to some extent, but I will go back to the point that was made right at the outset: you have to be clear about what they are for, and at the moment I do not think that we are clear about what they are for. They are described as formative assessments and then they are described as summative assessments. As I say, that is a legitimate policy debate to be had, but you cannot effectively try to get one assessment to fulfil both tasks. The other point that I would make is that, yes, it is important that national and local policymakers have good information and data about what is happening in the education system, but they have to be able to interpret that data and information in context. The danger always is that the data becomes everything; it becomes the only lens
through which you look at the education system, and that is dangerous. Experience from elsewhere in the UK and other parts of the world underlines that many times over, I would say.

Okay, thank you. Lindsay Law, could I just ask you one question, as a dad of a nine-year-old at a school in Scotland? In terms of information, and it is the point that was made earlier on, I get more if I just ask a lot of direct questions than I ever do by reading the report. I have looked at mine from all those years ago, because my mum kept all of mine as well, God help me, and I think that we told my parents' generation more than we tell parents today. What is your view about what needs to change about parental information, from whatever system is ultimately going on in schools?

I think that parents probably need more information, earlier, about school and about the system of school in Scotland, because it has fundamentally changed since we were at school. You arrive and there are lots of new terms and lots of words that you are assaulted with, and not every parent comes equipped to understand that. I loved school, but not every parent did. For a lot of parents, going back to school and going through the doors of that school reminds them of an unpleasant experience. Those old reports were very clear, but they were very clear that you were either succeeding or not, and that is a very stark thing to tell a nine-year-old. The great thing about the reports today is that they tell you loads of things about what your children are doing in class: how they are developing against the levels, whether they are secure and whether they are consolidating. It is not a pass-or-fail situation, but even those words, "secure" and "consolidating", are confusing for parents. So I think that we need a system of education, with an early years play-based curriculum, that introduces children to school, so that they learn how to be at school, but that also reintroduces parents
us that they are not learning anything significantly new that they did not already have from the other assessment work that they had done with the young people. The tracking of progress does go on within schools. One of the challenges is that we potentially have a multitude of tracking systems within schools across the country, but tracking of progress across the benchmarks, using the benchmarks, the Es and Os and the curriculum as a whole, goes on daily, and it has its trigger points around conversations within schools, so that teachers can use the information to improve what they do next with the young people. I go back to the point that the SNSAs are also not providing the national picture in the way that you had from the SSLN, so they are doing neither of the things that we suggested we would want for the system. We have a system in which there is a breadth of assessment strategies going on in schools that are working; we needed to find some way for that to be collected nationally, so that there could be a national picture around it. I think that the data around achieving a level provides that, albeit with the challenges that it faces in terms of time for moderation and the recognition and understanding of the standards, and schools are getting better at that as it goes on. The SSLN provided something nationally that the SNSAs are not going to provide. I understood the frustrations: I was the head teacher who removed the young people to administer it, and I was the class teacher who had pupils removed from the class and did not get the information back. However, just because it did not happen does not mean that it could not have happened. There were ways and means of using the information that came out of the SSLN, and there are potentially, as James has outlined, ways of developing that so that you get a genuine picture of where Scottish education is at a single point in time, but also with some of that really rich information below it about our young people and what we could be doing in terms of targeted interventions.

Okay. Can I pick up, then, on Tavish Scott's line of questioning with regard to the reliability of the data prior to the SNSAs? I was quite taken by what you said, that the EIS felt that it was reliable and efficient. We know that 28 out of the 32 local authorities used some form of standardised assessment prior to the SNSAs, but it was happening in very different ways, and I do not think that we are assured, certainly from the evidence that we have taken, that all of those assessments were benchmarked against curriculum for excellence, which is quite problematic if you want to look at the reliability of the data. If we are not benchmarking what we are testing the kids with against the actual national curriculum, what was the purpose of it? So, if we have greater standardisation of what is happening at a local level, is there not an opportunity to level the playing field? At one of the previous evidence sessions, we heard from, I think, Professor Sue Ellis, who spoke about unethical assessment approaches. She talked about, for example, taking groups of kids out of class, and that being quite unfair. Therefore, if we have a standardised approach, is that not a fairer system?
The SNSAs have not been introduced in a standardised manner, and the advice and agreement from the Scottish Government was that they would not be introduced in a standardised manner, because local authorities, schools and teachers should be deciding when the young people engage with the standardised assessments. So they are not being used in a standardised way; they are being used differently, in exactly the same way that the current raft of standardised assessments is being used in a variety of ways. We would add that those other assessments have not been stopped in many local authorities since the SNSAs were introduced, even though one of the pieces of guidance that came forward was that, as soon as these were introduced, everything else should stop. All local authorities are now well under way with the SNSAs; they should all be doing the same thing, and that is not what happened previously.

But it is standardised; it is simply up to the teacher at what time in the year they carry out the assessment. That empowers the teacher, surely, and at least we know that what is happening in schools is, to some extent, standardised. I have concerns that 28 local authorities previously doing very many different things created a pretty unfair playing field. It was not fair to the children, and surely it should be in the children's best interests that they all have the same opportunity. That is what this is about.

But assessment is not an opportunity for a young person; it should be about informing learning and teaching. As we have said, the previous standardised assessments were used in very different ways and for very different purposes. Some local authorities said that they used them, but it was not across the whole local authority; individual schools determined what they used, and when, to inform the learning and teaching of the young people in their care. This will not change that, in my view, because the reliability of the information that you will have around achieving a level is still based on teachers' professional judgement. This will not change that, or should not change that, unless you skew teachers' professional judgement to being simply whatever the scores on the doors are from the SNSA, which takes us all the way back to it then becoming a high-stakes test. The fact is that the way that you get a standardised, equitable approach across the country to learning and teaching, and to the assessment of learning and teaching, is to look at the moderation practices that are going on, and to look at how you inspect that, to see that the practices are equitable and that everybody is working to the same standards. The SNSA will not fix that; it will not do that. It deals with 10 per cent of the curriculum at a really narrow point in time.

Any assessment deals with a narrow point. That is the nature of assessment: we are looking at a snapshot in time. That is not to say that that assessment data is the only thing that a teacher would look at; they look at a broad range of different things that happen in their classroom.

But the whole point is that we are not just looking at a snapshot in time. The SNSAs are supposed to inform teachers' professional judgement; they are supposed to inform teachers when they come to have those conversations. They are not about confirming teachers' professional judgement; that is the language that has been agreed around them. So they inform only a tiny bit of the curriculum, at a point in time that it is not agreed will be one point in the year or another.

We will need to agree to disagree. I do not believe that these assessments are going to add to the system that we have in place now, and I do not believe that they are going to deal with the issues that you have
around whether the evidence that is in the system around achieving a level is more valid than any other. They could potentially be used in those conversations, but there is no guarantee that they will make anything in those conversations any better. The only way that you will make those conversations better, and understand whether the young people who are deemed to have achieved a level genuinely have, is if we train our teachers more effectively in moderation, and if we make sure that the senior managers and line managers who are having those conversations are able to do so in a consistent manner, so that there is a consistency of approach around the moderation of achieving a level. As James says, the understanding of those standards needs to be there. The introduction of this test has not done that and will not do that; it draws the focus away from that moderation exercise.

Mr Northcott, just quickly.

I did not want to lose the really important point that was made, because I think that it is an important part of the story that, before the SNSAs, 28 local authorities imposed standardised assessments on their schools. One element to consider is that, and I will not name any, if you look at those standardised assessments, some of them were incredibly narrow. There is an interesting contrast between those assessments and the SNSA, which has some technically attractive features, I think. It is an adaptive assessment, for example; it has the potential to be a better assessment than those that it is replacing in 28 local authorities. This comes back to the point, though, that if you have a reasonably good standardised assessment, the assessment itself is not the problem; the problem is what you do with the data that it generates. If that data is used for high-stakes purposes, you undermine the formative value that the assessment would otherwise have. But I think that it is an important part of the narrative here that what has happened is that 28 different approaches to standardised assessment have been replaced with one national approach. That creates challenges, there is no question about that, but we should not pretend that there was no imposed standardised assessment in schools before; there was an awful lot of it.

I was just going to observe, on the standardised nature of it, that teachers are supposed to administer it when they feel that the learners are ready, but what we are hearing from parents and teachers is that, in practice, that is not what is happening. In some local authorities it is happening; in others, they are administering the tests in the same way that they administered their own standardised tests, in a set window, for example in the summer term, so that they have that information for the transition year from P7 to S1. So it is not being administered in a standard way across Scotland; local authorities differ.

Mr McEnaney.

Essentially, yes. On this idea that we need to standardise something: if you want to standardise something, standardise the standards, not the way that you try to measure one tiny little bit of the system. I understand why it is attractive at Government level. As somebody who writes about Scottish education in the media, and who is eternally frustrated at never having been able to write the 10,000-word piece that he wants to write, I understand why this kind of thing is attractive, but ultimately it is not just the wrong road to go down; it is the same point as before about the opportunity cost. You spend so much time focusing on this obsession, this need to ask, "Can we standardise provision? Can we use this test to make sure that everyone is getting the same kind of bit of
data?", and all the while we are slipping further and further away from a situation in which we have a teaching and learning system in place that gives every single kid in Scotland the best possible chance in life.

Okay. Ms Lamont.

Thanks very much. I really want to focus on the practical elements of something that appears to be standardised but, what strikes me, is not even standardised in its purpose. I wonder whether James McEnaney would confirm his own findings, from his research, about where this all came from. Is my understanding right that, basically, it started with a decision to have testing, and then there was a kind of post hoc rationalisation of it? Or is there any genuine evidence that somebody sat down, said, "This is an issue", and was led to testing?

Taking Susan Quinn's evidence that, during the various meetings, nobody came forward saying, "We want to see a new set of tests", that would certainly align with the kind of information that I have found in looking at it. I would point out, of course, that it is very difficult to be sure, because those meetings were not minuted. However, I think that it is quite easy to see, at least in the very early stages, that it was at least partly driven by a political decision. Looking at the available information, at the material that was there and, for example, at the scarcity of written advice, what led to the implementation of the system was four emails. Given all of that, I think that you can understand why people would come to the conclusion that you have come to: that a decision was made to start with testing, and that everything proceeded from that. I was not in the room, so I do not know, but there is not an awful lot of evidence that would show the alternative.

And do you think that some of the confusion has come from a reluctance to pick a side in this argument? You can argue that there is a benefit from national testing, that it is really rigorous and that everybody has to understand that, or you can say that, no, it is actually a bit diagnostic and it is about the individual child, and, depending on your audience, you pick which of those you are going to emphasise. One of the things that we have been told, certainly in our own debates, is that it provides... well, why would you not want a system that would identify a child's developmental challenges, or a diagnosis that has just been missed? What is your view?

Sorry, but on that point, which came up earlier on, this idea that these tests are going to be used to pick out children who have dyslexia: I have seen people talking about whether these tests could be used to help us identify autistic children in schools, or kids with dyslexia, and that is incredibly dangerous and really irresponsible, because the tests absolutely will not do that. Any thinking that these are going to be some sort of diagnostic for additional support needs needs to be dealt with right now. That is absolutely not the case. We have a bad enough situation in Scotland just now with the way that additional support needs have been treated over the past few years, and we certainly do not need to make it any worse. Again, it is not intentional; it is not as if a minister has come forward and said that these kinds of tests will lead to kids with dyslexia being diagnosed earlier, or anything like that. But this line of thinking has continued, even if only through the omission of saying, "That is absolutely wrong", and I would regard that as being particularly irresponsible. I could not be absolutely certain, but my
sense is that there has been an argument made back to those who have concerns about the testing: surely you do not want to be in a situation in which the needs of a child with autism are not identified?

That is actually right; that case has been made.

Yes, and I would regard that as being very irresponsible.

Okay. There are two last points that I want to make on the question of whether, if you are going to do a standardised test, the notion makes sense; I have raised this in previous sessions, and I wonder whether your experience matches it. I was advised by those who presented to us on how the test worked that, basically, you could do it at any time in the year, which would mean, in primary 1, any time between a child being four and a half and a child being six. I was advised that you could rehearse with individual children, and that the test could make no distinction, in the information that it then provides, between a child who had to hear the word in order to identify the rhyme and a child who could read it. To what extent does that range of possibilities make the word "standardised" just nonsensical?
There was a point, a couple of years ago, at an event with Government officials and statisticians, when I mentioned this. I cannot remember the exact words that were used, but I can find them and provide them to the committee if necessary; it was recorded, and I have the video somewhere. I raised the point that the tests were being done at different points in the year, and the phrase that was used on more than one occasion was that the data was "not compatible". Of course, that is only an issue if you are going to try to use the data at a national level, which at that time was part of the conversation. If it is strictly going to be formative information about each individual child, it matters less that the data from one point in the year and a later point is not directly compatible, because you are not using it for a comparison between two points. But it speaks to the confusion around the entire system that all those issues are still unresolved.

Certainly. We will bring in Ms Quinn and then Ms Law.

Just on the point about the age range from four and a half to six: overwhelmingly, in the feedback that we got from parents and teachers, it was the primary 1 testing that was the major concern, and that is because P1s are engaging, or should be engaging, in a play-based curriculum. They are learning to be at school; they are learning how to be human. They do not really know how to take a test. Not only is that hugely variable (it is a fifth of their lives' difference between doing the test at the start of the year and at the end of the year), but the feedback that we were getting was that, in terms of resource and time, and this time I do mean teachers' time, it was hugely costly. If you have three P1 classes, those classes have three teachers; they would then have another three teachers taking children out one at a time, alongside two support-for-learning staff and another assistant. That is a huge amount of resource and a huge amount of FTE teaching time spent on a test that is simply to gather a baseline and that is not comparable, because teachers can pick when to do it between the ages of four and a half and six.

On whether the standardisation part is useful or otherwise, James is right: it only really matters if you are gathering the data for a national system, or indeed if local authorities are looking to use it in the way that they use their current information, which means that they are separating out a standardised test from the overall picture of achieving a level and using it as a single measure. That is the conversation that we keep having: if you are singularly using a single standardised test to determine interventions, or, dare I say, league tables or otherwise, it is a narrow approach and it leads down the road to ruin. If you use these assessments as part of the broader bank of assessments that a school chooses to use, then whether an assessment is used in August, September, October or wherever within the school year does not matter, because you are using it to inform your decisions, and the evidence then sits behind that, around the moderation and the understanding of the standards. However, if we are looking at something more than what I understand its national use to be, which is gathering information on trends across year groups, there is a potential difficulty with it. We would argue that that is not what we should be looking for within the system anyway; we should be looking for something that gives us improved opportunities for our young people in our classrooms.

Okay, so the issue about purpose is absolutely fundamental.

Absolutely.
And if you were not using it for national comparators, you would not use that kind of test.

Well, if you were not using it for national comparators, a school would choose what it wanted to use it for. Again, I was a primary head teacher, and my local authority has indicated that it is up to the school to determine when in the year it will use the assessments. I would always have added an "if", because then you could genuinely be using it as a diagnostic tool: you would look to see which individuals you were absolutely confident, from the evidence that you already had, were achieving the level, in which case you would not waste their time or yours on an additional assessment tool. It would be used genuinely as part of the assessment bank, to inform teachers in relation to what they did, and, as I say, in some cases I would argue that you would not need to use it at all, because your body of evidence would show that the children were achieving the level. James asked what a piece of writing at a level looks like. For second level, it will look like a primary 7's writing jotter that is that thick, along with a collection of evidence and all sorts of other things that show that they are achieving all the bits, not necessarily backed up or in any way supported by the need for anything additional.

Just a final point on that issue: would you agree that a diagnostic test that cannot make a distinction between a child who was able to read the word and one who had to listen to it, and had to learn to press the button to listen, is not even a diagnostic tool, because it is not giving you the information to be looked at?

Certainly, that came up earlier this week, and I think that it is problematic in terms of how teachers use the information that is generated.
I also think, to go back to the comments that were made, that we need to be really clear about what we mean by a diagnostic test. I think that there were those who may well have got on board with the SNSAs because they understood them to be diagnostic, but were misunderstanding what a single diagnostic test might do. Some understood that it was about diagnostics within the parameters of the questions that were there, but others certainly and very clearly felt that it was somehow going to diagnose other aspects of ASN, which these tests cannot possibly do, and for which there is a wide range of diagnostic assessments currently used in schools, alongside other partners in the health service and elsewhere, to make sure that our young people with additional support needs get their needs met.

My last question is on this issue of opportunity cost. Lindsay Law has already indicated some of that. I think that it is a reasonable test to do a cost benefit analysis. Would I be right in saying that there is very little benefit, but the cost is quite significant? Have you been able to quantify that? Anecdotally, folk that I know who are still teaching tell me of 50 hours in a primary school spent just delivering the assessment, and we have had some evidence of additional-support-for-learning teachers being taken to do the test. I know that the EIS did a survey, and I wonder, more broadly, whether there is substantial evidence about what that actually looks like, because if there is little benefit but significant cost, it becomes more of an issue.

Some of that is difficult to quantify, because it relies on the infrastructure within schools. Members tell us that, if you are already at an advanced stage of ICT development and have iPads, it takes less time to do than if you still have the old two computers in the back corner of the room, or kids have to be extracted.
What we do know is that a significant number of schools had their whole senior management team directed, for a two or three-week period, to administer the assessments. We had reports of teachers giving up their non-contact time — their employer therefore technically being in breach of their contract — so that they could deliver the assessments in the window that had been set by the local authority, because of the structures within the school. More worryingly, a good number of people suggested that teachers or support workers who were being paid for out of PEF money were redirected to administer the assessments, or to support the teachers in administering them in some way, in which case the money that is there to intervene in the poverty-related gap was not engaged in those interventions for that period of time, because the only way that the assessments could be delivered in the windows that were being set was to redirect staff in that way. I am very conscious of time, and I still have one member who has not been able to come in yet, so could you be a little more concise in your answers if possible? Mr Northcott, I know that you wanted to come in. Just on the value for money point: the cost benefit question is really important. We know that there are costs associated with SNSAs. It is difficult to say what the cost of the alternatives would be, but I think that it is important to try to bear those in mind. For example, if the alternative to SNSAs was going back to the system that we had before, one of the costs of that system was 28 local authorities each purchasing tests from different test providers, which is quite a substantial cost.
If you were to replace SNSAs with some form of moderated teacher assessment, which would have its merits, that could also be workload intensive, and it could distract from other parts of the system. So, if we want to think about the value for money element, or the effective use of resource dimension, of SNSAs, we have to think about what the costs of the alternatives would be when we decide whether this makes sense resource-wise. It may be a very old example, but when they did the sample testing and standardised testing in the schools that I worked in, one person ran them — you could do 30 kids at a time — so there was not this kind of working with an iPad, which I think has added problems. Optical mark readers and all the rest of it could be very straightforward in that sense; they are just not cheap to buy, and local authorities had to buy them and pay a full commercial rate for them, so they have a financial cost. If the alternative to SNSAs is going back to local authorities buying standardised tests and imposing them on schools, we would have to weigh that in the balance. Can I just make a point on that? I do not have the figures to hand, but I remember that, initially, one of the big defences of introducing the national system was that councils were spending money on standardised tests and this would save money. The cost of the national testing system increased a couple of times — I remember a particular back and forth with the press office over it, but I am sure that CommonSpace reported on it, for example, so it should be easy enough to check. I am not sure that it is in fact the case that the amount that the Scottish Government is spending is lower than the amount that councils were previously spending. It may be, but there is something in the back of my head that says that is worth checking.
If you are going to look at it as that kind of straight opportunity cost, it is absolutely true that you would have to look at what the previous spend was, but there has sometimes been, in the way that this has been framed — and it has happened in the press as well, I would concede — a quite direct assumption. If you are going to look at the cost benefit side of things fully, I think that it would be worth examining even the initial claim that the national system would cost less, because the Government made that claim. Yes, they are two different budgets, but the Government made that claim, so I think that it would be worth checking whether the amount of money that the Scottish Government is spending on this is actually less than councils were spending, just to be clear about that point — it is ringing a bell in my head just now. It is not really about how much the script costs but what it means in terms of staffing. Yes, and the opportunity cost is much broader than just the money itself. It is just that, when that point came up, there was something in my head saying that somebody — or I — should probably go and check that, so that we have a clear bit of data on it. Okay. Very patiently, Mr Mundell is going to come in. Thank you. It is also just on that point. Correct me if I am wrong, but not all local authorities have stopped purchasing other tests. I put the question to you: why, if the new assessments are so good, have councils continued to spend money on something else? I think that it is a fair question, but it would certainly be one for the councils. I would surmise that, for one thing, in the first year they were not confident in the new assessments and did not want to give up what they had — in their view, a year's worth of data.
Some are also in the position where they use standardised assessments in every single year of a young person's time at primary school, so they are not giving up the in-between years. Others, dare I say, have introduced assessments beyond that in relation to PEF, because they have started to use assessments that they never used before — paid for out of PEF money — to give them a benchmark for the start of the PEF work. So there will be a whole lot of reasons why individual local authorities have chosen not to, but it is absolutely clear that very, very few of them went, certainly in the first year, with removing all the other standardised assessments at the same time as doing the SNSAs. Now we hear that a good number of them are getting ready to do that as time goes on, but time will tell. You would probably expect, during a period of transition, some degree of conservatism, if you like, about departing from a well-established system, whatever its shortcomings. In the longer term, however, if we persist with SNSAs, those local authorities would need to be challenged because, as the point was made earlier, one of the disadvantages of those systems is that not only are they relatively expensive, they are also not at all aligned with curriculum for excellence. So it does call into question the value of local authorities spending public money on standardised tests that bear no relationship to the curriculum that schools are supposed to be pursuing. Maybe you can cut them a little bit of slack early on — they needed some kind of period of support or transition to SNSAs — but in the longer term it is probably going to be quite difficult to justify.
Maybe the other point there is that, in terms of PEF money and local authorities and schools having to be accountable for it, it is encouraging those to whom they are accountable to think carefully about the kind of indicators that they want to use to make those judgments, particularly indicators that are not linked to curriculum for excellence. Okay, thank you for that. The other question that I want to ask is about transparency. I think that I probably know what you will say about the development of policies — that it is unhelpful not to be transparent in those early stages and not to allow a broader conversation about the actual evidence base, rather than presenting policies and leaving journalists to... Well, yes, you will be shocked to hear me say that I think transparency would be a good thing, but not just in the simple terms that, obviously, more information is likely to be a better thing and less time is spent by journalists chasing everything, trying to find any bit of useful information. The lack of transparency around the development of the standardised testing system fed into the way in which the testing system then had to be defended as the process went on, which has made things quite difficult. I think that part of the reason why it has become very difficult to be clear about what the assessments are doing and what they are for is that all of that has got bound up together, but I would argue that it originates in things like spending a year — and taking it all the way to the Information Commissioner — trying to secure the release of the fact that there were four emails that formed the entirety of the written advice that the Government had received beforehand.
I would always argue that, in those sorts of situations, if a government is going to make the case that what it is trying to do is implement or introduce — or impose, to choose a word — a new policy that is supposed to be beneficial to the education system, and crucially if it is going to argue that it is something that will help teachers, those are big claims, and I think that you need to be very, very clear about the evidence that you have to support them and the process that you went through to get to that stage. So, yes, there needs to be more transparency, and this is actually a general point about policy formation, of which this policy forms a part. In our experience of consulting, what usually happens is that an idea is created and then stakeholders from Scottish society are brought in to consult on that idea, because that is part of the consultation process. But the idea already exists, so it sets up a naturally combative dynamic between the people who are bringing the idea and the people who are saying, "Well, actually, have you thought about this?" — and you do not give much time to those people. So it becomes an exercise in someone coming to defend an idea and people coming to knock it down, and that idea still goes through, leaving a whole load of stakeholders feeling somewhat disempowered and disenfranchised. In general, I would say: involve your stakeholders much earlier in the decision-making and policy formation process, and also look to understand the root cause before you start to develop policy, because this policy strikes me as something that was developed to try to understand something, whereas it probably should have been done in reverse. What are we trying to solve in the education system, and how would we best solve it, rather than looking straight to the measurement of something when we do not quite know yet what we want to solve or measure?
I agree with that — transparency in all ways. I think that what is even worse in this case is that we had a meeting where stakeholders were round a table — we are talking about as big a table as this, with all the seats filled — with the then cabinet secretary for education, at which we discussed what was needed in terms of an understanding of educational standards across the country. It was generally agreed around that table that there was a wealth of information within the system and that what we needed was something that would allow it to talk to each other, so that there was a national understanding. We discussed whether there was a need for a national standardised test, and the general viewpoint was that, no, what was in place — while it might differ from place to place — was what people were comfortable with and happy with, and that what we needed to find was a way to gather that information together. Then, two weeks later, we attended an event where the First Minister announced that she was introducing a standardised test.
I was at that meeting, and I had been at the one before, and I was sitting there going, "So, where between that meeting and that meeting were any of the people around that table spoken to again about what was going to be decided and what was becoming policy?" From my point of view, that then puts you on the back foot of, "All right, okay, so you think so", and then we have to have the negotiations around what it is going to look like and everything else. I think that a real problem with the introduction of these tests was the manner in which it was done, because there appeared to be an engagement exercise taking place around the national improvement framework, the development of the database around that and the development of how we would report on national standards. At that point in time, there had been a discussion around what that might look like, and then something else came from left field that had nothing at all to do with what the stakeholders who were there were discussing.
Obviously, just to add to that: if you take as the starting point that there had been these meetings where the people in the room said, "Well, we were there and we all said that there was no need for this", and then we ended up, two weeks later, having to watch the First Minister announce that it was happening anyway, then in that space somebody like me comes along and tries to say, "Right, well, there should be material to look at so that we can figure out what happened" — and it turns out that there is not any. It turns out that — I cannot remember exactly — there was a series of meetings, something like in the teens in number, and all that existed for all of them were agendas for three of them. There is no written material. Now, do not get me wrong: it may well be the case that something is out there somewhere, but according to the FOI response that I received, it does not exist. What kind of conclusion do you expect people to draw when you are left in that sort of situation? I think that it is an entirely fair point that Lindsay is making: how does government expect stakeholder organisations to feel at the end of that process, when you make the kind of point of "Come in and talk to us", have the conversations, get an answer that it seems you do not really like, and two weeks later have apparently changed your mind — and there is nothing to show where that change came from? There are some quite significant issues with transparency and policymaking at the Scottish political level.

Thank you for that. I wanted to follow up on two points from this morning as well, one around training for teachers: whether there was enough training in advance of the SNSAs coming in to help people understand the data that they were producing, and whether the training that was there was actually accessible to classroom teachers.

The difficulty with it was the timings. Teacher training, generally — if it is something that everybody within a school, or you as an individual within a school, has to do — needs to be part of your working time agreement or your in-service days. The timings of when the training became available meant that it was not getting put into working time agreements for the first year of the assessments, which meant that renegotiations had to take place around how people could get out for training. So there were challenges in certain local authorities and local areas around people being released; given the shortage of supply teachers across parts of the country, it becomes difficult for people to get to the training. The message around the quality of the training, again within our submissions, is mixed. The actual "how you do it" was fairly straightforward, but then there was a gap in time before the data literacy training took place, and in some cases it would be a senior manager who would attend the training and then cascade the information, which in and of itself can bring a dilution of understanding. If there had been a period of time in which the assessments were being developed and prepared, and people were being trained in the implementation of them, before the session in which they were to be implemented, you would have been in a much better position. That is why, in some local authorities, the windows were left until the very end of the summer term: teachers had not received the training, and they could not receive it because the training group could deliver only a certain quantity at a particular point in time. So we would argue, again, that if you are introducing a policy you need to have the resource behind it. You train people not just the day before they are about to use it, but in good enough time that they can digest it, ask questions and become really familiar with it, and then you introduce the system.

Okay, thank you. A final question. There was also a suggestion this morning that up to 25 local authorities are mandating a window in which the tests can take place. Why would they be doing that if it is in fact for teachers to decide, and to look at when individuals are ready to sit those tests?

Because they want to continue with the model of practice that they already have in place. They want to continue to use the test data in isolation from other assessment practices, to do whatever it is that they do to inform their local politicians with nice graphs around a particular thing, and to overcome the fact that it is not a standardised test in the way that people understand a standardised test unless it is done in a particular way. They will have all sorts of reasons to do it.

And do you think that that increases the stakes? You can talk about this being low stakes, but do you think that the tests then become medium or high stakes?

I think that the stakes certainly increase as local authorities do that, because what you then have are things that can be FOI'd, or gathered by journalists, in order to create league tables around schools and the like. That was never the intention and, in fairness to the Government, when we were in the discussions around it, that is what they tried to avoid by not gathering the national assessment data at national level — by gathering only the high-level stuff around the trends. That prohibits it at national level, but having windows of opportunity for testing at local authority level means only that there will be local league tables, and that in itself leads to the challenges that we raised right from the outset: that we should be focusing on the bigger picture — the whole thing — not just this wee tiny test.

Can I just add something there? It has just popped back into my head — this thing about local authorities being given windows. At the event I mentioned before, where one of the government officials had been talking about different aspects of the testing — again, I will go and check this for the committee and resubmit something for you, just to make sure that it is absolutely accurate — I am sure that there was a part of the discussion, for which I have the transcript, at which it seemed quite clear that what they were talking about as statisticians was that, even at that stage, there was an expectation that, over time, the tests would start to be done in relatively set windows. Whether or not it was ever set out as a policy intention, there seemed to have been an expectation that, ultimately, this is where it would end up. As I said, I will check the exact transcript for you, but I am pretty confident about that one.

Ms Law. My understanding is that a number of local authorities use standardised testing at the end of P7 to form part of the information that the cluster primary school sends to its local high school. So it would be logical to assume that, if local authorities are no longer allowed to spend, or are trying to reduce their spend, on standardised testing, and because this one takes place in P7, they would prefer it to be done in a standard window to gather that data for high schools — and high schools would prefer the same. But that is my assumption.

Mr Northcott. I was just going to say, very briefly, that if I am a teacher in a school in a local authority where the assessment window has been imposed on my school, and the assessment window is very narrow, I am going to be very sceptical about claims that this is a formative assessment to help me make professional judgments about the children I teach. So that lack of clarity about what these assessments are actually supposed to be for does not help.

Okay, thank you very much. Can I thank all of those who have given evidence to the committee today. It has been a very long session, and we are about to go into private session. I would not normally do this, but could you leave the room quickly, because we have an early start in the chamber this afternoon, which is unusual for members; that will let us continue straight into the private session. Thank you very much.