And welcome to this, the second of our webinar series on work-based assessment. I'm delighted to be with you this morning. We've got a lot of people; I think we've got about 175 registered, so obviously the topic is really, really important. As you know, when we started the series in partnership with our colleagues from QQI, our aim was to try and get a national conversation going around work-based assessment, and in particular to look at the challenges and the opportunities and the critical issues for staff and students. We've worked towards, I think, a national understanding and a learning community so that we can actually work together to find some shared solutions to common problems. So just before I talk about the series as a whole and hand over to our presenters for today, I'd just like to ask that you please use the chat for any questions you have as we go through. As we have lots of people, it would be quite difficult to bring people in individually using video and voice. This is actually being recorded, so the recording of the webinar, the slides and the results of any of the activities, of which there are many for you to do in the next hour, will all be made available to you after the webinar today. So we started with the first webinar, which was shared challenges and opportunities. Today, we're exploring the challenge of consistency. And just to remind you, on November the 19th we have a much longer discussion, a symposium, getting to grips with policies and practice, and I'll talk a little bit more about the symposium and the plans for that at the end of the session. But why do we have exploring the challenge of consistency as the title of our webinar today? The reason is that when we asked you in the first webinar what one of the biggest challenges was, you said consistency. So I'd like to hand over to my colleague, Associate Professor Jordan O'Neill from UCD, who's going to lead this particular webinar.
And I'm delighted to say that Jordan is also one of Ireland's inaugural teaching and learning fellows, and her work for her fellowship is based in this area. Jordan, I hand over to you. I'm delighted to be able to speak to you here today. As Terry said, my research is going to be in this area, but I've also been involved in work-based assessment for a long time in my career. I was an occupational therapist many years back, so I have had some experience over the years of work-based assessment. So I wanted to start today with the results of the first webinar. In that webinar, we actually said to people: what are the challenges that you have, what opportunities could there be, and what do you want further discussion on? And what you came back with was that consistency and standards was one of the key issues that you really struggled with. The other things that you talked about, which were also linked with that, so the grey in the middle here is the challenges that you talked about, were authenticity and relevancy, and student engagement and feedback, so those were the key assessment challenges. And around that circle in the middle are non-assessment-related things that link with assessment. But we have followed your wishes, as Terry said, to actually do some more on consistency. The full data from that is available from the National Forum if you want to see it. So consistency is what we wanted to explore a bit today. Specifically, we want to look at what is meant by the term consistency, and what might be the causes of inconsistency, particularly across different contexts. We want to explore the students' experience of consistency, and we want to look at how the sector might address the area of consistency going forward. So those are just some initial thoughts. As for what we mean by consistency, the term is often linked with the term reliability.
And it's often associated with consistency of outcomes, as in consistency of scores. So one of the definitions of it is that consistency in assessment involves the achievement of comparable outcomes. For example, an assessment process would be considered to deliver consistent outcomes if assessors, assessing candidates against the same unit of competency in different contexts, made comparable assessment decisions. I know that might be hard for the interpreter, to get all those concepts, because I know we have an interpreter here today. But basically consistency is around scores and grades and how reliable they are, over time and over different assessors. QQI themselves very nicely put it as the same result under similar conditions. So, the same result under similar conditions. There are a few different types of consistency across many different types of assessment, and some of these are particularly challenging in a work-based learning context. The first one is consistency over time, often called test-retest. The next one is consistency between assessors, and this is the one in the literature that is often described as challenging in work-based learning contexts. The next one is consistency across contexts, so across different placement types, another challenging area in work-based learning, and then consistency across the assessment tasks themselves. So these three authors, Miller, Downing and Brennan, have talked a lot, and much of the literature talks, about the different types of consistency. The next activity we thought we would do is to ask you what aspects of consistency are a challenge specifically for you. And the way we wanted to look at this was to ask you to go to mentimeter.com. So I would suggest, to do this task, you might use your phone if you have it beside you and type in mentimeter.com, or open another tab here and go to mentimeter.com, and use the code 565131. So that's 565131.
So we'll give you a few minutes to do that task. We want you to answer the question: what aspects of consistency are a challenge for you? There were many different types: test-retest, across contexts, across tasks. So I'll give you a few minutes to do that. So 565131, www.menti.com. So Colin, could I ask you to start sharing that? Okay, great. So let's have a little look. So, developing reliable, well-trained assessors at the workplace. So assessors at the workplace, so work placement assessors, yeah. Consistency without homogenization, great one, yes. Inter-rater is one that people are particularly worried about, the assessors themselves, yes; across contexts and across assessors, yeah, across assessors. Interpretation of grades, what's a distinction? Yeah. Getting consistency across faculty; across contexts, so context is coming up a lot. Lecture materials; between assessors; across contexts; across tasks, that's the first time that one came up. Okay, yeah, interesting. Different types of placements across different companies for a program. Yes, again, across contexts. Yeah, so these two are coming up very strongly. Across grades; agreement between assessors. Subjectivity, probably of the person involved in it. Yeah. Assessors; between staff. Yeah, so really some nice results there: consistency between assessors, so it's the assessors one that's coming up a lot, actually. Great, so, course outline and lesson plans, yeah, so that's maybe the task itself. And what's coming through a lot, and it comes through in the literature a lot as well, is the workplace assessors, yes. One of the challenges is the assessors, so the practitioners, and people use different language on this: practitioners, employers, work placement staff; trying to get consistency with work placement staff in the assessment.
Okay, I think that gives us a flavor very much, so you can stop sharing that, Colin, that would be great, and people can still keep adding, because we will share this back with you afterwards. Now, some of the influences on consistency. Well, first of all, there is the work context. One of the challenges is that, unlike in the institutions or on campus, the work contexts vary hugely, from very supported to very unsupported practices, very different kinds of placement types, and these make assessment really challenging, as some of you have already alluded to. There's the unique student: students are unique, so they come with their own set of skills, knowledge and capacities; they're not at the same starting point. There's the assessor: their values, their knowledge, their skills. The standards against which we are measuring are really key in relation to trying to get things more consistent, and the clarity of these standards is really important. There's the level of staff training on assessment. Many of you have alluded to that, that the assessor and the context are two key challenges for you. On the assessors, Roberts and Moon and Van Loon have talked about the more assessors, the more reliable; and many times it is just the one assessor in workplace learning. And then you have the range of encounters or tasks, and some of you mentioned the task as actually being a challenge for assessment consistency. Colin, could you progress that for me? It's not progressing. Sorry. Okay, working now. And this one is mentioned a lot in the literature: having a shared understanding of the standards required. These are statements of what you want students to do, and having a shared understanding of these is really important. And we need to work, as Herbert and I would say, in communities of practice to try and achieve a strong understanding of the standards required. And I know some of our speakers are going to talk to that.
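As an aside for anyone who wants to quantify the inter-rater consistency just discussed ("the more assessors, the more reliable"), agreement between two assessors is often summarised with raw percent agreement and with Cohen's kappa, which corrects for chance agreement. Here is a minimal pure-Python sketch; the assessor names and the ten grades are entirely hypothetical illustrations, not data from the webinar or the research cited:

```python
from collections import Counter

def percent_agreement(a, b):
    """Share of candidates on which two assessors gave the same grade."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement between two assessors, corrected for chance agreement."""
    n = len(a)
    po = percent_agreement(a, b)                 # observed agreement
    ca, cb = Counter(a), Counter(b)
    # expected chance agreement from each assessor's grade frequencies
    pe = sum(ca[g] * cb[g] for g in set(a) | set(b)) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical grades from two workplace assessors for the same ten students
assessor_1 = ["pass", "pass", "fail", "pass", "distinction",
              "pass", "fail", "pass", "pass", "distinction"]
assessor_2 = ["pass", "fail", "fail", "pass", "pass",
              "pass", "fail", "pass", "distinction", "distinction"]

print(percent_agreement(assessor_1, assessor_2))       # 0.7
print(round(cohens_kappa(assessor_1, assessor_2), 3))  # 0.5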
So, some of the challenges of inconsistency. I'd like to think of it a little bit like shifting sands. I was thinking about this: work placements and work are never the same. The context is changing, the people are changing, the students are different. Constant in the middle, maybe, are our standards and how you grade. But even though they are fairly static and steady, they can have different challenges with them. And then there's the task, and the tools, and the staff involved. So many of these are changing, which makes the challenge of consistency quite difficult. What we thought we would do is look a little bit at consistency in relation to the context, starting with the context. And one of the ways I'm looking at this in relation to my own fellowship research is the context based on who is actually assessing. In workplace assessment, we have three key assessors, three key partners. In the middle is the student, who should be very much part of assessing and knowing and judging their own work. You have the higher or further education staff, who are on campus in the institutions. And then on the right you have the workplace staff, who many of you call clinicians, practitioners, mentors, preceptors, many different names for those staff. But if you look at these in relation to context, and it's a very busy slide I know, but I'll try and talk you through it, there are three broad contexts in relation to who assesses. On the left here you've got context A, which is on campus. The wider definition of work integrated learning includes this context. These are modules that are primarily based on campus, but are very connected with work. So they might be case-based learning projects, authentic projects, projects with industry partners, clinics, field trips. But the primary assessor is the higher or further education institution staff, the educators as we call them.
In the middle you've got on placement: they're out on placement, but the primary assessor is still possibly the educator. So they might do something on placement, and they come back, and they may write a reflective journal or project and present something that is assessed primarily by the higher education staff, or sorry, to include the further education staff, forgive me, my background is higher education. And then on the right you have the placement staff, who are clinicians or employers, who are the primary assessors. So there are different contexts. And then if you look up the vertical axis, you have student involvement in this process: are the students very involved in the assessment? Do they have any control of what they're assessed on, or how they're assessed? So, just from my own experience, and this is my view: sometimes internships and co-ops are often in this context B, whereas sometimes clinical placements and apprenticeships, but I will let you decide that, are in context C. So what I thought we would do is get you discussing your consistency challenges, thinking of these contexts. So the first activity we wanted to do in this is to ask you: where would you position your work integrated learning placement modules, based on context and who assesses? So where would you position yourself? We're going to get you to position yourself on a matrix which you'll see in a minute. The way you do it is to go to the top of your screen to view options. Go to annotate and choose a stamp, and then a star. Okay, so choose a stamp, and then a star, on the whiteboard that we are going to share with you now. View options, annotate, choose a stamp and then choose a star.
If you have more than one type of placement you can also use a cross. So you may have, for example, internships, but, you know, I'm also involved in project modules in my institution. So let's do this. So maybe you would share that whiteboard, Colin, and people can start to place themselves on it. Let's see, a nice little heart there, lovely. And we'll get a feel for the people in the room. So, the vertical axis is whether students are strongly involved; along the bottom, the horizontal axis is where the practitioner is primarily summatively involved. Okay, so we're getting a nice little scatter across, yeah. Interesting, here there are practitioners very involved but maybe the students not quite as involved. Yeah. If you have any trouble with using this you can add to the chat. So if anybody does have trouble technically doing this, or finds it too difficult to do, just write in the chat: I'm in context B with some student involvement, or context C. So nice to see some people have the on-campus stuff as well, with strong student involvement. Yeah. Many of you are in that top right-hand quadrant, actually. And one of the challenges with that is that it has a particular type of consistency challenge, because again you have more work placement, more practitioners involved. And sometimes it can be a little less challenging in context B, but we'll explore that a little bit further. Okay, so what we'll ask you to do now is this: we're going to get you into breakout groups for 10 minutes, and we want you to have a little discussion. I really want you to highlight your context, as some of you have already started to do: I'm on campus, or I'm on placement but in that context B with the institution assessing, or I'm in context C. And I want you to discuss for 10 minutes the causes of inconsistency in your context, teasing out, you know, and getting a sense of whether it is different in these different contexts.
And maybe one person would just summarize in the chat afterwards; you might just nominate somebody to do that. We'll share this slide with you in the breakout group as well; Colin is going to do that too. So we're going to get you now into breakout groups, so give us a minute or two to do that. There should be four of you in the breakout groups, and we'll give you 10 minutes, and we'll give you a countdown to your 10 minutes, to actually look at this question: what are the causes of inconsistency in your context? Okay, as people are starting to come back in, maybe I'd just remind you that one person might summarize in the chat just even one or two points that came up, so we have that and are again able to share it back with you, because it would be really interesting to get a feel for whether consistency is different across these contexts. So if one of you would do that, that would be great. It will probably take a few minutes, so we'll come back and look at that maybe towards the end. Can I hand back over now to Terry, as we have most people back in the room? Colin, do you think I can go with that? Everyone's back in the room. Okay. And thank you for that. And thank you for putting, I see somebody starting to put in: the main discussion was about governance, who makes a decision about a trainee, the local colleagues, an external academic consultant, okay. Thank you for that, Manuel. It's a very useful thing, because that is the challenge between the three different types of assessors. And one of the things in the list that comes through is that it's really a balance of those three assessors, trying to see who has the most weighting between the local colleague or the internal kind of institutional staff, and it's very much a tripartite solution to it. Another one coming back.
The three of us in the group seemed fairly new to work-based learning, so we're here just to listen with general interest. That's all right, okay, no problem. And: local interpretations of standards to be achieved, need more training. So, local interpretations, and that goes back, thank you for that, Nola, to understanding of the standards. In Siobhan's group: consistency is achievable, employers are time poor, instruments can be poor also. Yes, yes, so a real challenge of being time poor. Okay, I'm going to move on because we have some keynote speakers to introduce to you, so Terry, can I pass to you to introduce our speakers? Over to you, and we'll have a look back at the chat afterwards. Thanks. And welcome back, everybody. Last time, when we had our first webinar, I had my evil, my very evil horn. I was persuaded not to use the horn for our speakers today, but to actually use a much gentler noise from a phone. So we're delighted to be able now to invite two of our speakers, who are going to speak to you just for five minutes about some of the work that they're doing within their own contexts, their research and their practice. So it gives me great pleasure to introduce our first speaker, Dr. Anna Connor from the University of Limerick. And Anna is going to speak about strategies and solutions to inconsistency in performance-based assessments. Thank you. Great. Thanks, Terry. And Colin, I think I don't have control yet of the screen there. And again. No, I'm not moving them. Yeah, if you're in there now, just try. Okay. Yeah. Okay, I'm not moving it, Colin, but I can get started and I can tell you when to move it forward for me. Okay, so hi everyone. My name is Anna Connor. I am currently working in the University of Limerick, and I just completed my PhD last year on workplace assessment in physiotherapy practice education, in other words, the clinical placements that our students go on.
And the assessment process that occurs out there in the workplace, which essentially determines their readiness for independent practice as a clinician. And this morning I'm going to talk to you really just about the students' perceptions of that process. So the whole PhD research was around taking a stakeholder-centered approach to determining the challenges and facilitators of work-based assessment in the physiotherapy context. We had three stakeholder groups in total. It was a national study, so it invited all of the students of physiotherapy across the country, within the Republic of Ireland, to engage with the study. So we had students; we had practice educators, in other words, the clinicians who basically work in the workplace as clinicians but also voluntarily supervise physiotherapy students during their programs. We also had what's known as practice tutors, who are essentially qualified clinicians, but who are in dedicated educational roles in the workplace, and who were introduced around the mid-2000s to essentially support the students and the clinicians who supervise the students on clinical placements. So it's quite a specialized role, it's quite unique to health care, but it has certainly helped in terms of improving workplace assessment. So, like I said, you can move on. Oh yeah. Okay, so the student findings then were really encompassed within two themes. The first one was this whole aspect of looking at the inconsistencies that they perceived in the process, and then the second one was around strategies and solutions in order to improve that process. So students overall felt that there was a clear lack of standardization of grading across the sites. Now, our other stakeholders felt the same, so it wasn't an unusual finding.
They felt that variable grades could be awarded depending on, say, geographical location, or depending on the area of specialty that the students were involved in. They felt that there was a variation in terms of the assessors: some could be very hard markers, some could be very easy markers. And they also perceived quite a lack of knowledge, again between some sites who never took students and some sites who took them all the time. So they felt that there was a clear lack of knowledge among some sites in terms of filling out the forms, what learning meant, what they were expected to achieve. The impact of that, clearly, was that they felt there was a lack of transparency across the process, and as a result they didn't believe in the process and they didn't believe in the fairness of the process. So clearly those findings are quite significant in terms of moving forward. Okay, so as a result, they developed these strategies, and the strategies then were, I suppose, ways that they felt they could beat the system, in other words. So they felt that if you developed a good relationship with your educator, you were unlikely to get a low grade. They also believed that if there was any choice of, say, clinical sites that they could go to, they might actually pursue the one that was the easier marker and forgo the benefit of learning. So again, these are quite concerning little strategies that they had built up. And the other one that was quite concerning was that they felt that if they just copied the practice of the supervising therapists, rather than thinking independently, which would be our, say, educational ethos, they would actually just get better results, because if they just copy the supervisor, well then they were working towards what the supervisor worked towards.
So the solutions then: they obviously looked for more transparency in the system; they were actually looking for mandatory training of the educators who were involved in the assessment process; and they wanted more observation of students on placement, so that regardless of whether you were doing well or whether you were doing poorly, the educator could continue to see developing performance. So finally, just, I suppose, a commentary on all that: clearly their solutions are relevant, and certainly we saw similar things being brought up with the other stakeholder groups. But what we really saw was this failure to see their own role in terms of their own development of learning in the workplace process. There were similar views from other stakeholders regarding the inconsistency, and generally there was a sense that we need to encourage ownership of learning among students through, say, self-assessment and peer assessment. We also need to look at the number of assessments and, say, sharing the assessment process among other assessors in order to build a more comprehensive and reliable judgment of the student's performance. So certainly the dedicated educational roles were of huge support across those sites that had them. And finally, I suppose, in terms of just looking at, say, training educators, we had a conundrum then around: should we train fewer educators more stringently, or should we engage with all of them but risk that lack of consistency across assessors? So they're points to consider, I guess, and just really a little taste of that PhD research. Thanks, everyone. And many thanks, Anna, and you did it within time, just about, without that red thing. So it was the more subtle approach from here today, so don't worry. Can I just say to everybody that if you have questions for Anna, can you please put them in the chat, and she will answer the questions directly through the chat for you.
I'm now going to hand over to our second speaker, Siobhan Magner from the Mayo Sligo Leitrim Education and Training Board. And Siobhan is going to speak with us about ideas for work-based assessment consistency in the FET sector. Thanks very much, Siobhan; I'm looking forward to it. Thanks, Terry. Hello everybody, and thanks, Terry, for having me on here. My name is Siobhan Magner. I'm the national program manager for the sales apprenticeship, and I'm very privileged to be working in the area of apprenticeship in the FET sector, and to see also how my colleagues across the country, in 16 ETBs, actually manage consistency in the whole area of work-based assessment. For those of you that wouldn't have a knowledge of apprenticeships: back in the day, a few years ago, the apprenticeship council specified criteria that we needed to incorporate into our apprenticeship programs when developing them. Those included, first of all, a minimum of 50% on-the-job work experience, on-the-job learning. And the other was that we had to have the capacity to deliver the program nationally. So I'll use the sales apprenticeship as a case study here. We're currently the coordinating provider of the sales apprenticeship; we don't have collaborating providers, we don't have other ETBs engaged in the apprenticeship itself. We commenced the program in September, and in our current cohort we have 12 companies across eight counties, and these are all small and medium sized enterprises. We have a diverse cohort of learners, and as Geraldine mentioned earlier on, you know, there are different cohorts, different abilities; we have school leavers alongside people in sales for years, and we've returners, etc. So trying to keep consistency in assessment, you know, is very challenging. 84% of our program is delivered on the job, so workplace mentors are extremely important to the whole sales apprenticeship.
When talking about consistency, I used an Australian model from the Department of Education, Training and Youth Affairs in Australia, and they came up with these five P's that are important to ensure consistency. And what we do in apprenticeships across the ETBs is that we could actually align what we do with these five P's: people, process, products, perspective and policy. So the people are all of those involved with the assessment process itself. The process is how these assessments are planned, conducted and reviewed. The products are all of the documentation, etc., the items that are used in planning, conducting and reviewing. The perspective is taking into consideration the requirements of industry and employers. And then the policy area is all about how the assessment process will be managed and implemented. So, to look at what we're doing in MSLETB with the sales apprenticeship: first of all, the people involved. There's ourselves, the program team, so I'm the manager of the program team. We also have a lead tutor, we have our tutors, we have the TEL team, including our e-mentor, because we deliver some of our program online, and we have numerous specialists. Every apprentice that comes on board to the program has a workplace mentor from the company assigned to them, and they're there to act as the guide and the support over the two years. And there's a very strong relationship there, a tripartite relationship between ourselves in MSLETB, the company themselves and the workplace mentors, and the student or the learner, and, you know, there's quite a lot of support offered there to both the mentors and the students.
We have external authenticators involved, and my colleague Jenny Conroy has just completed a very big project actually, down in ETBI, in conjunction with our colleagues in the further education support unit, where they've established a national external authenticator panel. So they have over 750 external authenticators trained at the moment. They're trained nationally, and then if an ETB is looking for an EA, they go to this panel to find somebody that would be suitably qualified for their particular field of learning. So there are 750 of those at the moment that have gone through the exact same process of training, and then ourselves in MSLETB, we would meet them to brief them on the whole area in terms of learning outcomes, etc. We have collaborating providers; a lot of the ETBs have collaborating providers in their apprenticeships. We have a National Examination Board, and the National Examination Board considers the delivery and assessment of apprenticeship programs. We also then have our employers, the CSG, the consortium steering group. So these are the people involved in the assessment process. All of our assessments are generated and designed nationally. So once we have collaborating providers, other ETBs, on board, everybody will play a part in designing these assessments and developing these rubrics, etc. The second P is the process. And the process involves, you know, how assignments are planned, conducted and reviewed as such. So there's a formal process of designing assessments. We have, as I mentioned there, the National Examination Board. In terms of teaching and learning strategies as well, all of our tutors have to complete a 20-hour CPD program in TEL and blended learning, and they all also have to complete digital skills for the online classroom. And part and parcel of that is developing and designing assessments, giving feedback, etc.
And as part of that training, we also look at tools, particularly TEL tools for formative assessment. We also have sectoral quality assurance procedures. These were designed by ETBI and they were adopted by the different ETBs taking on apprenticeship programs. And those quality assurance procedures outline the roles and responsibilities of everybody involved. Okay, in terms of the products, we have things like the validation documentation and the professional award type descriptors. We have experience-exchange workshops with all of the tutors, national tutors. There's a specification for assessment, and rubrics that we have to adhere to. And then all of our skills-based assessments are recorded. The perspective is addressing industry: we've got to ensure that employers' needs are met. And then we go to the policy, lastly: again, we're looking at the policy, how the assessment is managed and implemented. We adhere to QQI guidelines and our sectoral QAPs, and ETBI, after a year, and do I not get one more of your minutes, no? No, I'm afraid not, Siobhan. After a year of the apprenticeship running, we have a monitoring and enhancement panel that comes around, and what they do is observe and give advice to see if we can improve, etc. So we do our level best at consistency across the FET sector. And we'll be talking about challenges later on, but the challenges, I suppose, that we would see would be things like the grading of assessments, and we would also have challenges with resources. Okay, good luck. Actually, Colin, if you can leave up the slide, that would be quite good. I'm just conscious, I'm sorry, that it's quite close to the end. No problem. At the end of the webinar. So these are the three challenges. And thank you. Thank you. I'm going to hand back to Jordan.
Okay, thanks, Siobhán. I know it's very tough to try and get it all in, but thank you for that — it gives a really nice flavour, and the five Ps went down very well with people. So you might have questions for Siobhán — stick them up in the chat for her. Thank you to both speakers. Just moving on for the last few minutes: grading approaches. This is coming up a lot, and Siobhán had it there in her last slide. This challenge has been there for as long as I've been working in this area. What grading scale do you use? Do you do pass/fail? Many of you who are in that C context, the top right-hand quadrant, will find it very hard sometimes in the shifting sands of the complexity of different placements — and Anne alluded to it as well when she talked about the geography and the different contexts the students are working in, where they feel they're not going to get consistent grades. Do you go with pass/fail? Do you go with pass/fail/distinction? And somebody mentioned in the chat: what does a distinction even mean? Because students can say a pass sounds like 40%, like a C. Do you use the language "competent / not yet competent"? That was a very nice one I've used in the past — it alludes to the idea that this is a developmental process. Or do you use graded percentages — grades, A, B, C, D, and percentages? And some of the challenges around this, again alluded to by our speakers, are on placement. This was something that came through in the research I had done in UCD. Some practitioners say, "I just don't give firsts, I don't give As, I just never do." They've decided they don't give As, so already those students are at a disadvantage. But then if you go pass/fail, some students don't like it — they feel it's unfair to them.
Students feel deflated when they get a bare pass after 30 weeks of work — and again, that came through in the internship research that we did. So there's not an easy answer to this, but it has a big impact, and it really needs to be part of your debate when you're looking at this in different contexts. Certainly in an Irish context — and it's common internationally as well — we work off norm-referenced thinking: in the institutional context, we feel we should have a spread of marks. But when it comes to practitioners grading, they don't necessarily think that way. So there's an underlying challenge in the grading and the value system around grades, and I think this really needs to be interrogated in your work-based assessment because it has a big impact on consistency. The other big challenge in consistency — and some of you mentioned it in the chat, I see it coming through — is standards and competencies. The language here is quite complicated, and I'll only take a few minutes to go through it, but it's really core to what we're measuring against: what is it that you're actually trying to measure? Here is the difference between the two. Standards are what students should know and be able to do in relation to established criteria, often set by policy makers and professional bodies; competencies are more how students apply their learning to new contexts and multiple situations. And there might be different sorts of competencies — this is where your assessment tools come in — and these are more unique to the individual assessor. One of the challenges here links with validity: are the tools measuring what you really want to measure? Another thing, and it comes from the general literature around assessment, is around well-designed measures — and some of you mentioned tools in the chat.
How good are your standards, and how specific are they? One thing to think about here is whether they are more holistic. The literature talks about holistic standards, which give a broad, wide picture. I think Anne mentioned earlier in our chat the number of statements you might use on a form — what's the optimal specificity of a form? The more holistic it is, the stronger it is on validity, but the more challenging on reliability and consistency; whereas if you have a lot of detail, you're more likely to get consistency, but is it valid? We could talk about this for hours, but there is something here about the nature and specificity of the statements and how they're measured. So for the last few minutes — which is why we cut you off, because we wanted to make sure we got to this, so sorry about that — we'd like you to do an activity in the chat. This is one Terry introduced me to: the idea of "hold and share". It's an interesting one to try. We want you to write in the chat, but don't press return yet. The question — and it's not quite a simple one, so do think about it — is: how do you use standards to support consistency? If we take standards as your statements of expected knowledge and skills, how do you use them to support consistency? So type in the chat, but don't click return yet — we'll all click it together at the end and share together. Just have a little think. And Siobhán and Anne, it would be good to get your thoughts on this as well. How do you use standards to support consistency? I'll give you a minute to do that, and we'll share all together in a minute. Pause for a second — a bit of quiet. Give Alison a rest for her hands. Okay, if you all click return now, we'll see what comes up. There we go.
Great stuff. "Make sure everyone's aware of them"; "refer to the national framework"; "use them as practice examples for students"; "ensure fair and equitable marks" — great, thanks for that. "A guide, a framework" — yeah, thank you. "Standards used as scaffolds around grading"; "communication"; "robust rubrics" — yeah, great, thank you. Here's one: "clearer statements of what I want students to be able to do in order to achieve the outcomes, but it can be challenging across varied areas of speciality" — yeah, thanks for that again. And do you allow students — this gets more into validity — to create their own outcomes, learning contracts, developing their own kind of unique measures? So thank you for those. I'm just going to offer a concluding thought here, and I suppose it's trying to put reliability in context. There's a really interesting author, Rola Ajjawi — I think that's how it's pronounced — who's going to be a speaker at our upcoming symposium, and I thought this might be a nice statement to nearly finish on: all assessment design requires compromise between contextualisation and standardisation. There's a tension with the contexts we've been talking about, whether it's geography or types of placements. The educational impact and validity of the assessment might not be worth sacrificing in the pursuit of reliability. I'm throwing a spanner in the works here a little bit, but it is about thinking of reliability and consistency as really, really important, yet carefully considered. It's really interesting to think about, in this work-based learning, how important reliability is versus validity, and I think that's an important thought to conclude with. This slide just pulls together some of the thoughts that have been talked about today.
We have standards in the middle — really, really important — and I know QQI in particular, along with national bodies and professional accreditation bodies, see it as really important that our standards are right. Are our grading schemes right? You really need to think about your grading schemes and go back to what the right type of grading is. And there are all the things that may help, which our two speakers talked to: work on rubrics, moderation policies, the five Ps approach that Siobhán was talking about — a really nice way of tying all that together — and research around grading. And an interesting point came up about students' belief in the system: the assessment doesn't feel credible if it's not reliable. So that's just to summarise it. Hopefully you have enjoyed the talk. I will hand over now to Terry to say what the next steps are in the last few minutes here. Terry, over to you. Thank you very much. So, first of all, can I thank Anne and thank Siobhán for their contributions — I think it's been a very interesting discussion. I'd also like to thank QQI, our partners in this particular series. And just to let you know what's happening in the next phase: we will of course, as I said, take the recording of the webinar, the summary of the chat and the slides and make them available to you. And our symposium — Colin, if you can go on to the next page — is on the 19th of November. We've already confirmed two of our speakers. Dr. Norah McRae will speak about the global work-integrated learning framework and understanding the context of assessment discussions. And Rola Ajjawi, who has just been quoted, will talk about the challenges of authentic and aligned assessment in complex work contexts: the way forward for policy and practice.
We'll also have speakers who will talk around professional competence. We're putting the programme together, and we'll send you the information on it as soon as we have it available. So the first webinar, and this webinar, will of course feed into the final symposium. And once again, thank you to everybody. Now, if any of you have five minutes to spare, could I ask you to wait on? One of the things we aimed to do across the series was to set up a community of practice, and the community hasn't been that active, so we need some help to work out how we can make it better. I'd like to hand over to my colleague Catherine, who is going to have some questions for you. The session will take maybe five to seven minutes; we just need a few inputs from you about how we can make this community work. Thank you very much, Terry, and good afternoon, everyone — thank you to those who can stay on for a few minutes, and thank you again for the great seminar we had today. One of the things we want to do is get some feedback from you on how we can make this a better community and how we can get more interaction, so I basically have three questions for you. Please put your answers into the chat so that we can capture them. The first is: what value do you see in being a member of this kind of community? You might reflect on other communities you're involved in — the ones that bring most value to you, and what types of value they bring. Why would you want to be a member of this community? What would you hope to get from it?
I can see a few people putting things in the chat there: "it's great to hear other people's views", "to extend practice", "to information-share" — and that is definitely one of the values of a community of practice. "Learning across the sectors" — that's an interesting one, because there's a diverse group here: you have many different clinical placements, internships, apprenticeships and so on, different types of practice. "Standardisation" — so a lot of the things Siobhán and Anne discussed earlier are coming up here. That's great. So maybe, Colin, if we just move on to the next question. The next question is this: we want to make this community serve you better — as Terry alluded to, there hasn't been a whole lot of traction in it — so what do you think would make it serve you better? You can just click on there, Colin. Think of it from your own perspective and from the National Forum's perspective: what input can you make, and what can the National Forum do too? So there's one — a request for volunteers — so you've jumped the gun there, and a request for volunteers is something we'd like to make. "Choose a theme to explore", "what quality means", "work together to create a national toolkit" — these are great ideas, and we'll certainly take those on board. And my final question — if you want to move on, Colin — is in terms of joining up. A lot of you here are in the network; some of you may not be. But it's about joining up and contributing.
So, coming to this: we need some leadership, some people to volunteer to manage and maintain a community of practice. These things don't operate in isolation; they need somebody to have an input, somebody who has a particular vision. Just to pick physiotherapy practice as an example, somebody might like to moderate a discussion on that. So what we're looking for here is volunteers from this group who would like to be more involved in this community of practice and to have a role in maintaining it. You can either volunteer here — and I see a few people doing that already, which is great — or you can see my email address there, or you can contact anyone in the Forum and let us know, and we'll get back to you shortly after this webinar with a plan for going forward. That's great — I see loads of people volunteering there, and thank you very much, that's super. Well, everybody, I really appreciate the few minutes you've taken. We'll take away all of your ideas and your insights, and most especially the names of those who have already volunteered, so that between now and the symposium in November we can actually start to get that community working. Once again, thank you very much to our partners, to our speakers, and to our interpreter, Alison, who's been working so hard for the last hour or so. I look forward to seeing you all at the symposium, where our discussions will continue. Thank you, everybody. Bye bye.