Welcome. This is the Education Committee of the Vermont House of Representatives, and this morning we have invited Patrick Halliday in from the Agency of Education to give us an update and a little background on the annual snapshot. So welcome back, Patrick Halliday. We appreciate your joining us today.

Thank you, my pleasure. For the record, Patrick Halliday, director of the Education Quality Division at the Agency of Education. So I'm going to share my screen. There we go. Okay, good. And as always, please interrupt me if there are any questions. I don't want to take up too much of your time or go too deep into the snapshot itself beyond this most recent release, but I'm happy to go back and fill in any gaps in understanding and give any information that folks need.

Could you just start with explaining where this comes from?

Absolutely. So the annual snapshot, as we talked about a couple weeks ago, really comes out of the Education Quality Standards: it's a way of looking at those standards and then figuring out how to understand how successful schools or LEAs are being at addressing those education quality standards and meeting the goals laid out there. And at the state level, excuse me, at the state level we have two different ways to help schools with that. One is the annual snapshot, and the annual snapshot is a series of indicators organized under the domains that you see there: academic proficiency, personalization, safe healthy schools, and so on. They're the same questions for all schools, though not all schools can necessarily answer all of those questions. For example, one of the academic proficiency indicators is graduation rate. If I'm a K through six school, I'm not going to have a graduation rate, just as an example, so not everything is applicable in every case.
And the information is reported out by the individual school, it's reported out by the SU or SD, and then it's also reported out at the state level. And it's the exact same indicators at all three of those levels; the organization of the snapshot is the same for each. And under each one of these five domains that you're seeing here, academic proficiency, personalization, safe healthy schools, et cetera, there are a handful of indicators that are used. And really this, in conjunction with integrated field reviews, is one of the two ways the state has to help schools get information on how they're doing in meeting their Education Quality Standards. And it's all in service of driving continuous improvement plans: schools can get information from that snapshot, get information from the integrated field reviews, and then better understand what their specific needs might be so that they can address those needs. The other thing to add is that this was designed with a really strong eye to equity. So schools, LEAs, and the state are able to look at specific student groups. You can look at all students, but you can also look at specific student groups to identify where achievement gaps and equity gaps are pervasive, and you can't address the gaps unless you actually know they're there.

So I'm going to move forward here. For this, we're looking at a state-level screenshot from the annual snapshot right here, and this is data from the 1819 school year.

1819, as opposed to 1920, right?

Right. I'm looking right now at the previous year; this is from the 2018-2019 snapshot, and in just a second we'll look at 2019-2020, because I want to show the difference between what we have in a typical year and what we have in the information that was just released.
And you can see in the 1819 snapshot, under each one of these domains there's a rating, and then, I'm not sure if you can see that, sorry, we can also see how that rating has changed compared to the previous year. And under each one of those domains, and I'll show this in just a second, we can drill down further to get more nuanced information.

But last week, I think it was Monday, we released data from the 1920 school year, which was obviously substantially affected by the COVID shutdown. And so instead of looking like this, the state-level snapshot for the data that was just released has a bunch of N/As on it, because many of the indicators that are used to populate these domains were not collected. And since they weren't collected, there was no way really to report them out. To go a little further on this: if you were to drill down, if you were to click on the actual website itself under this academic proficiency right here, you would go to a page that looks like this. Now again, this is from 1819, as opposed to the one that was just released last week. And not focusing so much on the outcomes in the data, just the presentation of it: you can see this is based on the SBAC for English language arts, the SBAC for math, the Vermont science assessment, and the Vermont physical education assessment. There's information both on current performance, on how that performance compares to last year, and on what sort of gaps we're seeing from year to year.

Yes, please. Just to understand how to look at this: the equity index. I'm assuming that means equity of opportunity, that's what that's looking at, and I'm just wondering how it's measured, the equity index.

So the equity index. What we're looking at right here is information for the entire state.
So these are all students in the state. And the equity index in this particular case, what you see under these two, reflects how big the gap in performance is on the English language arts SBAC in the 1819 school year between historically marginalized students and their historically privileged peers. And what this is simply showing is that there's a fairly large gap between those groups. "Historically marginalized" has several different student groups included in it: any student who is from a racial or ethnic minority, is eligible for free and reduced lunch, is eligible for special education services, or is an English learner. So anyone who fits any of those four categories; this is kind of an accumulation of those four categories. We could choose, say, free and reduced lunch, and I don't have that up here, but I'd be happy to show it to you. In that case the equity index would show, specifically for students who are eligible for free and reduced lunch, how big the performance gap is compared to students who are not eligible for free and reduced lunch. And this change over here represents how big this gap is compared to the previous year, so in this case compared to the 1718 school year. And what it shows is there's a gap, and it's about the same gap as it was the year before: the gap in 1819 is similar to the gap that we see in 1718.

I thought I saw another hand. Yeah, go ahead.

Does this align with the dashboard, the AOE dashboard data?

Yes. I am not an expert on the dashboard, so I don't spend as much time looking at it, but it's populated from the same collections.

Okay, great, thank you. James?

Thanks. This is so interesting. I was going to ask what groups that was disaggregated by, but you answered that question.
So now I want to ask, on the equity index: any gap, I mean, I assume the standard is there should be no gap, and so any kind of gap means you're not meeting?

It gets complicated. That is the intent of it; the way that it actually shows up, however, is a little bit different. You can select to look at just the historically privileged students, those students who are not in one of those four categories that I mentioned. And at that point, what we'd see is kind of a positive gap, which would show up not as not-meeting but as exceeding, because that group of students doesn't have a gap compared to, or is outperforming, their comparison group. So that's not necessarily the intent of looking at the equity gap, and if you do start poking around you can get some things that appear counterintuitive. But the default settings on it are to look at, in this case, historically marginalized students, with historically privileged as the comparison group it defaults to. But I can select all sorts of different groups, so if I were to select students who are not on an IEP, for example, who are not eligible for special education services, most likely I'm going to see kind of a positive gap, if you will, because the students who are not on an IEP are outperforming those students who are on an IEP.

And is this data that we can actually select and see?

Absolutely. Yeah, this is all information that's available on the website. And if you want, I'd be happy to navigate over here. I didn't want to spend too much time talking about the mechanics of the snapshot if that's not what the committee was interested in hearing, but I'm happy to go into that in more detail, to give more of a tour, if you'd be interested in that.
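The equity index arithmetic described above can be sketched roughly like this. This is an illustrative reconstruction, not the AOE's actual computation; the group names and rates are invented for the example:

```python
def equity_gap(rates, focus, comparison):
    """Percentage-point gap: the focus group's proficiency rate minus the
    comparison group's. Negative means the focus group is trailing."""
    return rates[focus] - rates[comparison]

# Hypothetical statewide ELA proficiency rates (percent proficient).
ela_1819 = {"historically_marginalized": 33.0, "historically_privileged": 55.0}
ela_1718 = {"historically_marginalized": 34.0, "historically_privileged": 55.5}

gap_1819 = equity_gap(ela_1819, "historically_marginalized", "historically_privileged")
gap_1718 = equity_gap(ela_1718, "historically_marginalized", "historically_privileged")
change = gap_1819 - gap_1718   # near zero: the gap is about the same as last year

# Flipping the focus group, as described in the testimony, produces a
# "positive" gap of the same size, which can read as exceeding.
flipped = equity_gap(ela_1819, "historically_privileged", "historically_marginalized")
```

The sign flip in `flipped` is the counterintuitive behavior the testimony mentions when the default comparison groups are swapped.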
I do have more questions, but I don't want to bog us down, so I can come back later or reach out to you to understand more.

Sure. Maybe if there's time we could do a tour, but I'm happy to do so. So really what I wanted to show here is that this is data from the 1819 school year, and then you'll see a stark change when we look at the 1920 school year, because we have no data: none of these assessments were given. So this is one of the tools that we're giving to schools to try to identify what their specific needs are, but none of these assessments, the English language arts SBAC, the math SBAC, the Vermont science assessment, the Vermont physical education assessment, took place. So there's no data that schools can be using to get that information, and there's nothing we can do about it, because the assessments didn't occur during COVID.

And just so we understand our language: English language arts, math, science, physical education, those are indicators, and there are in total, I think, 18 different indicators that show up. But these indicators are all ones that were not reported in the 1920 school year for some reason or another. The top four listed there, not the most important four, but the four listed there, are all assessments that didn't take place. School offerings for flexible pathways is looking at the specific number of flexible pathways that a school makes available to students. The agency made the decision not to report that out, because a lot of those flexible pathways were not offered. To use an example that's close to my own experience: I have a child right now at Burlington High School.
Last year they had a Year End Studies program, a kind of innovative project program at the end of the year where students really get some real control over their education, and those didn't happen last year. So that would have counted as a flexible pathway, but it's something that didn't happen in that particular year because everything went remote.

Disciplinary exclusion is another one that we chose not to report. We actually did have, I understand, I don't do this collection, but we did have data for disciplinary exclusion, but it would really be an apples-to-oranges comparison, because most disciplinary exclusions happen in the springtime, when students are just getting tired of school and that's when we start to see an uptick in behavior. And last year, when everyone went remote in March, there were no disciplinary exclusions, or very, very few, so any reporting out would be effectively meaningless in terms of rates compared to other years.

And then finally there's one that's not included in what was released last Monday, and that's just a timing issue: there's a per-pupil spending indicator on there that's required under the federal ESSA law, and that gets reported out in June, just because of when we get that information in and into the system; it's just on a little bit different reporting schedule.

That doesn't mean, however, that there was nothing of value. Like I said, there are about 18, 17 different indicators that show up, and there were plenty of indicators that still were reported, and I think they're still worth looking at. For example, one of them is on English language proficiency: the percentage of students who are demonstrating English proficiency. This assessment actually took place, was completed, before everything went remote in March of last year. So we do have data for that one.
And you can see from here that the performance was not as high as we would like: the number of English learners demonstrating proficiency is lower than what we would like to see. And the change suggests that not only is it low, but it's lower than it was a year ago, in the 1819 school year. And that's concerning, and something that we would want to try to make sense of going forward. So we're seeing that some of our most vulnerable students are not succeeding at the same level.

Another example: not the number of schools who are offering flexible pathways, but the percentage of students who are participating in flexible pathways. Now, you can see up here what I've done: we're looking just at students who are in one of those four groups we call historically marginalized; we're not looking at any students outside of that group. And what we're seeing is that there continues to be a gap in just simple participation in flexible pathways. So our historically marginalized students are not participating in flexible pathways at the same rate as their historically privileged peers. So if I'm on an IEP, if I'm from an ethnic or racial minority, if I'm eligible for free and reduced lunch, if I'm an English learner, it appears that there's a lesser chance that I am engaging in a flexible pathway than someone who is not a member of one of those groups. And that gives us some ideas about what we need to figure out. And again, we're looking just at state data here; every SU, every school can look at the same data to understand their local context as well.
But at the state level, it suggests that we need to be helping schools understand ways that they might be able to reach out more to those historically marginalized students to participate in flexible pathways.

A similar one: this is looking at the percentage of students who have demonstrated proficiency on a college or career readiness assessment. And so this would be a particular cut score on the SAT or ACT, a passing score on an AP exam, completing a dual enrollment course, or earning an industry-recognized credential through the CTE process; there are about six or seven different ones there. And again, it's a little hard to see here, but looking just at our historically marginalized students, we're seeing that their performance is low. It's very hard to see on this screen, but it's 31%: only 31% of those students in the state are demonstrating proficiency on one of those indicators of college and career readiness. And that is substantially smaller than the percentage of students from the historically privileged group who are demonstrating proficiency on one of those college and career readiness indicators. So the data that we do have from the 1920 school year shows that students who are poor, our most vulnerable students, are struggling in a few areas when compared to their more privileged peers. I've thrown a whole lot out there, so I want to pause for a second to see if anyone has questions or would like to go into anything in a little bit more detail.

I think one of the things that I'm getting is that it's not looking great. Is that a fair assessment?

So, certainly. This follows; I wish I could say it were a big surprise, but our students who come from less advantaged backgrounds haven't done as well in school. And that's not a Vermont issue. Well, it is a Vermont issue.
It's not solely a Vermont issue; it's an issue that we see repeated in basically every district in the state, in every district in the country, in every state in the country, and nationally: your background has a profound influence on your performance in school.

Thank you. When we met with VSAC this year, we were wondering, I mean, I was wondering, I asked about what was getting in the way of some students either applying to college or getting into college, or even the technical colleges, and again it's anecdotal, but they said math: that students were weak in math, in the math skills and knowledge they needed to go on to higher education. Again, that's an anecdote, I don't know what the data is on that, but they see a lot of kids, and I'm just wondering if you see that, and whether that would then translate to more of a focus on math instruction. So I'm curious how that would translate into action, and into getting the students the skills and knowledge they need.

Yeah, so that's an interesting one. We don't have data at the state level for math this year, because there was no SBAC, but we do have it from previous years, and I remember from looking at the 1819 snapshot that one of the things we saw is that math performance is at its highest, and this is looking at a summative assessment, the SBAC, this is not looking at what happens day to day in the classroom, but math performance on the SBAC is at its highest in third grade, and it steadily decreases from third to ninth grade. And that same pattern holds for every single student group: whether we're looking at students who are on free and reduced lunch or students who are not eligible for free and reduced lunch, we see that same pattern.
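The third-to-ninth-grade pattern just described can be illustrated with made-up numbers. These are not real Vermont SBAC results; the point is only that each group's proficiency rate falls steadily with grade:

```python
# Hypothetical percent-proficient in math by grade for two groups.
math_by_grade = {
    "frl_eligible":     {3: 40, 4: 38, 5: 35, 6: 32, 7: 30, 8: 28, 9: 26},
    "not_frl_eligible": {3: 58, 4: 56, 5: 53, 6: 50, 7: 47, 8: 45, 9: 43},
}

def third_to_ninth_drop(series):
    """Total percentage-point decline from grade 3 to grade 9."""
    return series[3] - series[9]

drops = {group: third_to_ninth_drop(s) for group, s in math_by_grade.items()}
# Both groups start at different levels but show the same steady decline,
# matching the pattern described in the testimony.
```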
The one place where we've seen something a little bit different is girls. Girls still have a decline, but they're not having as steep a decline from third grade to ninth grade as every other group. I don't know why that is. It would be our thinking, then, to see: is there something that we're doing that is either alienating other groups of students, or something about the instruction that is particularly engaging or helpful for girls, that is beating this trend? That would be something we would really want to be looking at, something we could look at as a signifier for other schools, to drive improvement. But then, and this is where it gets really complicated: is it a situation where there's a challenge with the curriculum? Is it a situation where there's a challenge in the way that the curriculum is being delivered, is there a particular way that curriculum is delivered that alienates some and engages others? We really don't have simple answers to those questions.

Yeah. Thank you.

So one other thing I want to mention: in the 1920 data that was just released last week, you'll see this blue bar at the top, which shows up all throughout the annual snapshot, that basically says that when you see these N/As, it's because we didn't collect that data due to COVID, just so people are aware of that.

And this is one particular indicator that we're really curious to be tracking over the course of the next couple years. This is looking at educator retention. Basically, it looks at superintendents, it looks at principals, and it looks at teachers, and it's saying what percent of teachers,
what percent of principals, and what percent of superintendents, have been in their current placements for at least three years: so, how much turnover we're seeing, or how successful we're being in retaining teachers. And in general, this is something that's not a concern right now; we have about the amount of natural turnover that we might expect. However, this is a really, really big concern locally, in the state, and nationally in the educator workforce because of COVID: there's a lot of thought that because of COVID we might see really, really big turnover. So this particular indicator is going to be something that we really want to watch over the course of the next couple of years, to see if we see a major decline, a major rise in the number of teachers who are leaving the profession. It's a concern; this has been a very, very hard year to be a teacher, so there is some concern that we might see that start to take place.

Could that also have anything to do with an aging population?

It certainly could, and it's also probably place-specific. Our hypothesis, and we're working to gather some data on this hypothesis, is that it's not the same issue in all parts of the state; certain parts of the state are seeing more of it than others. And it's not the same issue for all endorsement areas: for example, we think that we're seeing more turnover in special education than we are seeing in elementary teaching. It sounds like it would be a pretty straightforward data collection, but it's actually quite tricky to be able to say those things with certainty.

I think there's a question there. Representative James.

Yeah, thank you. Could you explain just a little bit more about the properly licensed factor?
This is just looking at what percentage of teachers are fully licensed for their teaching assignment, either a Level I or a Level II teacher, as opposed to someone who is teaching on an emergency or provisional license, or someone who is teaching outside of their licensed area: a middle school social studies teacher who is picking up a middle school English class, say, because there's no one else to cover it. But really, what we're seeing with that is that it's a pretty small problem in Vermont: a very high percentage of our teachers are properly licensed, fully endorsed to teach in the subject areas that they're assigned to.

I think there's another question there, maybe.

Yeah, excuse me: do you have a correlation between the turnover and the LEAs' pay scales? Do you see more turnover in lower-paying districts?

That's a really good question. It's not analysis that we've done. Like I mentioned, we're looking at state-level data right now, but we could look at that retention question at every single LEA, every single supervisory union or district, to see what that correlation might be. I can say this anecdotally, and I feel comfortable saying it, although I always caution when I'm putting words in someone else's mouth: I do hear from certain superintendents that they have a harder time retaining teachers, and it's a combination of pay and location.
So, for example, districts in Franklin County tend to have a hard time retaining teachers because of their proximity to Burlington. The anecdotal information is that they get a lot of teachers who will come for a few years, until they can get hired someplace closer; they continue to live in Burlington, in Chittenden County I should say, and it's close enough to be able to make that commute, but they're looking to work closer to home. Whether that's a quality-of-life issue or a pay issue would require further investigation, but it's not a question that we've really investigated specifically.

I just remember, in testimony, the superintendent for Kingdom East. I believe the number that she said was 50% every year, which would seem a little bit excessive.

Yes, certainly. It's not something that we would expect to be consistent across all LEAs; whether it's pay or quality of life or whatever, those get a little bit more difficult to see, but a correlation like that would be something that would be interesting to try to make sense of. I apologize, my dog is saying hi.

I did want to just show a little bit. I know that you've had some presentation, some testimony, on recovery plans, but I just wanted to show what the LEAs have been having to do in order to complete the recovery plans, real quickly. So these are looking at their identified needs along three different pillars: student engagement, academic achievement, and mental health and social emotional learning.
We are at the point where we've received their identified needs, and by the end of the month, every LEA is to submit their plans, which is very different from the actual recovery work, but it's the plan for getting to that recovery work. The challenge becomes, and this is going back to that very first graphic I kind of tried to recreate, that we have standards over here, and the standards haven't changed, but the data that we have to make sense of those standards has changed. We still have the annual snapshot, but the LEAs don't have the same level of data to look at in their annual snapshot as they would in different years. So instead, they need to rely on their local comprehensive assessment systems, LCAS for short, to really look at what their local data are saying. Which is good practice anyway: they should be looking at their local data, and they do look at their local data, so it's good practice, but they need to rely a bit more on their local data as opposed to these larger-scale summative data sources. And then they also need to look at the kind of daily stuff that happens in the classroom, where teachers are making split-second decisions by noticing, or by asking smart questions of students, and then using those constant decisions to really understand trends that they're seeing across the school.
And then, instead of the continuous improvement plans this year, in almost all cases it's the recovery plan, which is really, I don't want to say scaled down, because that makes it sound like it's not rigorous, and it remains rigorous, but it is a version of the continuous improvement plan focused specifically on the recovery efforts going forward. So the logic remains the same as we saw in the beginning, but it's a little bit different here in terms of the actual product that a school needs to put out. And the big difference is that schools are having to rely more on their local comprehensive assessments, their formative assessments, than they would in previous years, just because those other data sources are not available. And, you know, recovery plans, I don't want to go too far into this if it's not of interest to the committee, but they're identifying needs on all three of those recovery pillars, as I mentioned; they're having to look for diverse data sources; and really, it's focusing on multiple years of work. Schools are not going to be back to quote-unquote normal in September. They may look a little bit more normal, but there's been a big upset to the system, and it's going to take several years for schools to get back to where they were. And then there are a lot of different funding sources out there, and there are others who are a lot smarter about the details of those funding sources, but certainly the ESSER sources are big funding sources available to individual LEAs. And we're seeing this from schools, and we're having this conversation at the state level too: this is really a kind of generational opportunity for schools to be thinking deeply about what they want to look like three to five years from now.
Because it's an unprecedented level of funding that's coming into individual schools through the ESSER money, and a lot of schools are using that to really rethink previous assumptions, or to enact ideas that they wanted to in the past but just haven't had the funding to do.

Thank you. Other questions? We have about five more minutes. Representative Austin.

Much to the discomfort of my committee, I could probably ask a hundred more questions and spend two days talking to you. I don't know the granular stuff of data, but I do like evidence-based data. So I'm looking at the SBAC, kind of the snapshot here, and it looks like 50.2% for third graders in the state on the 2018 SBAC. And we just passed, or hopefully the governor will sign, this literacy bill, which hopefully, once it's passed, will drive a huge systems change in terms of addressing reading specifically in Vermont. And I'm wondering: can you take this kind of as a benchmark, as a starting point, look at it again, and let's say we'll look at it every year, but just say we're going to look at five years, and see whether this initiative that was passed by the legislature made a difference in increasing and improving? Is there a way to do that, to look at what we pass and tie it to outcomes?

Absolutely, you can look at multiple years of data. I'm going to stop sharing my screen, and real quickly, I know that you're running close on time, but I'm going to pull up the snapshot and give you a quick idea of how you can do that. Let me share my screen again. Okay, there we go. Okay, so I think you can see this. This is just the landing page for the snapshot that exists right here. And I'm going to go through this a little bit quickly, but we're looking just at Vermont data; I could put in, say,
Addison, and I can pull up Addison Central School, Addison anything; anything with Addison in it is going to show up, and I can select it. But for our purposes, let's just look at statewide data. And you can see this is looking at 1920 data, up here at the top (I'm not sure if you can see my cursor, but it's up at the top), so that's why we have all of these N/As. So I'm going to look at academic proficiency, for your question, and I'm going to look specifically at English language arts. I can click and look at English language arts; we have all of these N/As from this year, and, as I mentioned before, I can pick all sorts of different student groups to look at. It's probably really small for your eyes, but this is based on race and ethnicity, this is economic status, whether a student has an IEP; I could sort it in this way, or I could sort it by individual grade: third grade English language arts, you know, ELA, or whatever.

Yeah, that's what I would be curious about: how do we measure whether the bill that we just passed actually improved outcomes?

So what we can do is look at multiple years of data. I know it's always frustrating to watch other people navigate a website, like you're supposed to understand what they're doing, but what I've done is select three years of data. And so now what I'm looking at is three years, and I'm looking at all grades. Now I'm going to look at third graders in the state. All right.
And I'm not sure exactly, I'd have to dig to see why it's not showing up, but, actually, I know why it's not showing up for this. Let's just look at this box right here. This is how third graders are doing, all students, because I've picked the all-students group, over the course of those three years, what sort of trend there is. And what it shows is that third graders in the 2017-18 year are kind of meeting the expectations, and in the 2018-19 year are roughly nearly meeting or meeting our expectations. And then, looking out five years from now, I could see what sort of trend is going to exist. Or I could say, I'm really interested in looking just at our students who are eligible for special education, so, you probably can't see this, I pick IEP up here. So I can look at multiple years of data to see what sort of trend there is for third graders who are eligible for special education: am I making a bump in it? Now, obviously, in perpetuity we're going to have this big NA for the 2019-20 school year, because we don't have data from the 2019-20 school year. But we could certainly start from the 2020-21 school year and look out for the next five years, or however long we want, to see if we're actually changing the performance of either all students or particular student groups that are of interest to us. And again, you can look at an individual school district. Now, once you get to smaller and smaller groups of students, if I'm looking at third graders in South Burlington who are eligible for an IEP, I probably can see that data. If I'm looking at a really small district, that data is going to be suppressed from the public view because of, you know, FERPA laws and the ability to determine individual student performance from it, so we do run into some suppression issues as we start to get to smaller grain sizes. And I should say one thing about the suppression.
The suppression is for the public view, but if I'm in a district that has a small n-size, and I'm the principal of that school, I can see that data regardless; it's just that the public can't see it. So I'm looking at the SBAC results for the third grade in 2018 and it says 50.2%. So is that meeting? That's what I'm trying to figure out: what is that? That seems like it's not meeting to me. We're actually over time at the moment and we have counsel for the next bill, but Representative Austin, I think this would be a great thing for you. I'd like to have Patrick Halliday answer that question, but I'd love to have you spend some time on it and let us know more about that. So if you could just respond to that one, Mr. Halliday. I'm happy, happy to come back for as much time as you like. One thing, and I'm going to be real brief here because your question is a really good one but it's a complicated answer: we made the determination not to use proficiency reporting on this, and instead it's just showing where on the scale students are. The challenge with proficiency is that if a student is just above the proficient line and stays steady, or there's a school that has a lot of students above proficiency, and I'm just asking, are they proficient or not, I really don't know how well that school is doing at addressing the needs of students. Whereas if I'm showing their actual scale scores, which are aligned with proficiency, it's not exactly the same thing, if I'm showing the actual hard number scores, I can track progress a lot better for how a school is doing. So imagine I have a school that has 0% proficiency, and I'm going to simplify this: every student is scoring a one on a made-up one-to-ten scale, and proficiency is defined as a seven.
So if every one of those students who scores a one increases to a six, I'm still showing 0% proficiency. I could also have a school where 100% of students are at an eight on that made-up scale, and five years later 100% of those students are still at an eight: the school actually hasn't increased the performance of those students. School one looks really bad, because 0% of its students are proficient, and school two looks really good, because 100% of its students are at proficiency. Whereas if you dig deeper, and I know this is a simplistic example, what you really find is that school one has actually really moved the bar and school two has done very little to increase performance. And if you're looking at the Vermont Education Dashboard, they're still reporting proficiencies there. Thank you, that helps me understand how to read this data. Sure. It's a great project for you, Representative Austin, and I'd be happy to collaborate on that however much you need. Yeah, it's helpful. I want to thank you for this. This is helpful as we continue to look at how our students are doing; how we calculate and how we use data is very, very helpful. Let me just give one, yeah, ten-second parting note: we had 250 educators across the state join yesterday for our second of nine professional learning opportunities that we've partnered with WestEd on, just to look at how to get smarter about using data, and it was really encouraging to see that many educators interested in the conversation. It's going to be going on for the entirety of the calendar year. Thank you so much. The number that have actually completed their needs assessments at this point, is it complete?
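The two-school illustration from the testimony above can be sketched in a few lines of code. Everything here is hypothetical and taken from the spoken example, not from the actual SBAC scale or the Agency of Education's methodology: the one-to-ten scale, the cut score of seven, and the class sizes are all made-up numbers used only to show why a proficiency rate can hide growth that mean scale scores reveal.

```python
# Toy illustration (made-up 1-10 scale, proficiency cut of 7, as in the
# spoken example) of proficiency rate vs. mean scale score.

PROFICIENT_CUT = 7  # hypothetical cut score from the example


def proficiency_rate(scores):
    """Share of students at or above the cut score."""
    return sum(s >= PROFICIENT_CUT for s in scores) / len(scores)


def mean_scale_score(scores):
    """Average raw scale score."""
    return sum(scores) / len(scores)


# School 1: every student moves from a 1 to a 6 over five years.
school1_before, school1_after = [1] * 20, [6] * 20

# School 2: every student sits at an 8 and stays there.
school2_before, school2_after = [8] * 20, [8] * 20

# Proficiency reporting shows no change at either school...
assert proficiency_rate(school1_before) == proficiency_rate(school1_after) == 0.0
assert proficiency_rate(school2_before) == proficiency_rate(school2_after) == 1.0

# ...but mean scale scores show School 1 moved the bar and School 2 did not.
print(mean_scale_score(school1_after) - mean_scale_score(school1_before))  # 5.0
print(mean_scale_score(school2_after) - mean_scale_score(school2_before))  # 0.0
```

By the proficiency measure both schools look static, one "failing" and one "succeeding," while the scale-score view makes the actual growth visible, which is the rationale given for the snapshot's reporting choice.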
I think there are still a couple of districts that have not submitted their needs assessment, and they've asked for an extension because they're working specifically with WestEd on this data literacy work to identify what data sources they want to use to really figure out what those needs are. Thank you. Okay. Thank you. Thank you. I actually might want to stop this one and start another one, just because these are such completely different topics. So why don't we take a two-minute pause. We'll end here, we'll start another session, and start looking at 426.