Okay, great. Thank you, Anna. Our third and final case study for this morning. I'd like to introduce Senior Lecturer Christina Doe from the University of, sorry, from Curtin University, and also Dr. Andrew Brennan, also a Senior Lecturer at Curtin University. Their case study is entitled Students as Partners, Co-Creation of Curriculum Model: Enhancing the Learning Experience through Assessment Rubric Design with Students and for Students. And I love the play of the prepositions there, with students and for students. Okay, without further ado, I turn it over to you.

Thank you so much, Ron, we're really pleased. And thank you, Anna. We know we're the last presentation between you and lunch, so we'll try to work through it as quickly as possible. But before we do that, we would like to begin by acknowledging the traditional owners of the lands on which Curtin University is based, and on which we have the privilege of living and learning and raising our respective families, the Whadjuk people of the Noongar nation, and we'd like to pay our respects to elders past, present and emerging. By the same token, we'd also like to acknowledge that this is stolen land that was never ceded.

Yeah, and I'd like to just say kaya wanjoo, which means hello and welcome in Whadjuk, the language of the Noongar people whose land this is. And just a quick side note, I'm from the northern suburbs of Perth, the City of Wanneroo, which means the place of the digging stick, and I love the history there, amazing. So wonderful.

Oh yes, and just to let you know, we're an interdisciplinary team, as you can see. We've got economics and law here. We're just two members of our team, but we also have, as you can see, nursing and marketing and international linkages. We got together in 2021, and thank you very much to the Curtin Academy for the grant funding attached to this.
And it was difficult times, you know, during COVID, those challenging times of having to meet online and so on, like we are now, but it was worse back then. It was mainly a mentoring relationship between staff as the primary focus, but then it evolved into a students as partners approach, which we'll talk about shortly. So that was the cool thing: it started with some focus groups and then it led into students as partners. And of course we received ethics approval for this.

We won't go into too much detail about the students as partners literature, as a lot of the presentations have already covered what students as partners is, and I'm sure a lot of you are already familiar with the concept. Our project initially started by focusing on mentoring, as Andrew said earlier, but with a particular focus on assessment rubrics and the design of assessment rubrics. What we found as the project evolved over time was that facilitating a students as partners arrangement is a meaningful way to create an inclusive space to empower student voices and to incorporate students' perspectives with respect to shaping the university experience. In particular, our case study highlights the benefits of incorporating students' perspectives and voice into assessment rubric design itself.

Yeah, and I forgot to mention our main love was for rubrics. So it's mentoring one another, encouraging one another, and our love for rubrics. That's where we began in 2021, as we mentioned. We then ran these focus groups where we had our economics staff and students, nursing staff and students, and law staff and students, and some students in Dubai too, to get them together, run through some theory on rubrics and how to utilise them, and then get them to critique and provide some insights on existing rubrics. But then it sort of ended there. We got the data, and we were happy with that.
So that was good, like a typical focus group. But one cool thing led to another: one particular student in the law focus group made a critical insight and comment. She wanted to be more a part of this as a partner, a students as partners type of thing. She really wanted to be consulted more on these sorts of things. How can it go further? How can we incorporate students and, in other words, see a real, tangible result? So that was our main driver: incorporating student feedback and seeing tangible results shortly thereafter. And so we identified two different first-year units, a law one and an economics one. And lastly, we're doing some dissemination, conferences, those sorts of things. So yeah, that's how it evolved, quite cool. And thank you to that student.

Oh, yeah. And so we started off by saying, hello, can we get you to come along to this workshop? Hence the call for interest. This is what it looked like, just a brief overview. In other words, the sequence here is quite important. We approached second- and third-year students who had done the first-year unit, and we approached them before the semester started. Then we went through the rubric and got their critical feedback, as it were. So it makes sense: they come in and give feedback on the rubric that was used in a past semester, one they're very familiar with, so we get the second and third years' opinions, perspectives and critical evaluation. And then we implemented change during that semester one. Tina's going to talk about her law unit, and I'll come back later on and talk about the economics one. So that's how it all worked out.

What we would like to note is that for this pilot project, particularly with the law students as partners workshop, we did manage to obtain student volunteers from a diverse group.
For example, there was diversity with respect to gender, age, academic performance, university engagement and life experience. We do acknowledge that while there was a degree of diversity in this particular students as partners workshop, it was not necessarily a result of our conscious doing. It just happened to be that way, as a result of those who volunteered and responded to the call for interest. In order to ensure diversity in a students as partners arrangement such as the model that we adopted, we believe that facilitators do need to be mindful to ensure that different student groups are represented. And this is something that we need to focus on and think more about in the future when we implement this model again.

But just to talk a little bit more about the model that we adopted: as Andrew said earlier, we sent out a call for interest and students volunteered. It was for a two-hour workshop. In the call, we said that it would be conducted by an academic facilitator, that it was catered, and that students would be given a voucher in exchange for their time. To start off the workshop, we really thought it was important that students understood what we were trying to achieve with this project. The first thing we did was open the workshop by telling the students about the human ethics clearance and declaration, and we obtained their formal consent to be involved in the project. Following that, we explained to the students the importance of ensuring that the graduate capabilities promoted by the university are indeed embedded into their courses and, through the process of constructive alignment, into the corresponding units, in particular as regulated by the Higher Education Standards Framework (Threshold Standards). We then spent a lot of time talking about the unit itself that we were creating the assessment rubric for, because it was fundamentally important that the students understood the task we were doing.
So we spoke about the unit, where it sat in the course, and the unit learning outcomes that this particular assessment was seeking to assess. And then after that we got straight into it. The formalities were only a very small portion of the workshop itself. What we really wanted to do was spend a lot of time facilitating the actual rubric design and having students engage in the process of designing the rubric itself.

Look, I can only speak for myself, but when I first started this workshop, I thought all we would be doing was changing the language in the descriptors and the different levels of performance within the assessment rubric itself. And while it certainly started off that way, as you can imagine, given that this was a workshop coordinated by a law academic and a lot of the students who volunteered were indeed law students, it became more of a fiery negotiation, and the students happily volunteered a lot of suggestions as to how this particular rubric could be improved. These are some of the suggestions that were made and that were ultimately implemented. This workshop was really interesting because a lot of the time the students would make suggestions and, as the facilitator, I would incorporate the suggestions immediately, in real time, so that the students could see them being implemented.

What we do have is a short video from a student, Ryan Kirby, a Bachelor of Arts, Bachelor of Commerce student, who is also the Faculty of Business and Law student union rep and who was involved in the workshop. He delivered a quick script to camera explaining his experience in the workshop.

This workshop required of us as students aspects of ideation and a sharing of perspectives. This setting allowed for us as students to challenge traditional structures of regulation in regards to assessment policy within the tertiary education environment.
Top-down regulation and regimented policies are normative in these settings, and often these conventional approaches, characterised by rigid regulations, fall short in promoting students' interests and fostering their critical engagement with both the discipline material and the assessment frameworks themselves. The workshop involved aspects of design thinking and radical collaboration. Of note, suggestions made by students as partners participants were incorporated in real time to encourage robust discussion and feedback. This structure allowed for an equal and democratic input. Nobody got all they wanted, but nobody got nothing. The redesign of an assessment rubric in collaboration with engaged and active students has distinct merit, as it enhances student engagement with the course material, incorporates student input in assessment design and in turn improves student receptivity, promotes student empowerment and champions diversity.

Perfect. We assure you, Ryan, we did not script that at all. He wrote it himself and delivered it to camera just last week, prior to him taking off on a study exchange in Morocco. So really powerful words from Ryan there about his experience.

And here, what you can see on the screen is the direct impact of incorporating student voice and perspective into assessment rubric design. What you'll notice on the screen, on the left-hand side, or my left at least, is the rubric that was used prior to the workshop, and then the rubric that has subsequently been used as a result of the students as partners workshop. You can see the suggestions that the students made, including the criteria descriptors within the rubric itself and using colour to demonstrate the levels of performance. One of the things the students really pushed for was having feedback targeted to each assessment criterion, instead of holistic feedback right at the end of the rubric.
Those are some of the suggestions that the students really pushed for and that we were able to facilitate. One of the other suggestions the students were really vocal about was that they really wanted an exemplar in the 80-plus range. A lot of them were pushing for this because, the students said, they want to know what they need to do in order to get a good mark, or the best mark. And I tend to be really hesitant about providing samples, just because I feel it diminishes critical thinking and stifles innovation and problem solving. But what came out of that students as partners workshop, that discussion, was that students just wanted to see the rubric in application. So as a compromise, as part of the negotiation process, we decided that instead of releasing a full example of the assessment, we would instead release bite-sized videos with examples of each criterion at each end of the rubric spectrum. What you can see on the screen here is the example for the presentation criterion, so presentation structure, criterion one, at the 50% mark, and an example at the 80% mark. Again, these resources were created really quickly. A week after the workshop, these short videos were scripted, recorded and implemented into the unit Blackboard site for the following semester, along with the rubric, so that the students could use the assessment rubric and apply it.

This all happened in semester one of this year, and we were then able to roll out the rubric and the supplementary video materials, and we were able to survey the students who were enrolled in the unit and subsequently had to use this assessment rubric. We had a response rate of 66.8% of students completing the survey. What was really interesting, and specifically of note, was that no students indicated that they did not understand the assessment rubric.
99.2% of students indicated that they understood the assessment rubric, admittedly to varying degrees: 12% indicated that they understood some of the rubric, 35.2% indicated that they understood the majority of the rubric, and 52% understood the whole rubric. There's a lot more data that we were able to collect, but given the nature of this presentation, we won't go into too much detail. This is just one example of how powerful incorporating student voice can be in assessment rubric design, because it allows students to actually better understand the task at hand. And now Andrew is going to talk about his experience implementing a similar model in economics.

I'll be relatively brief so you can have lunch soon. Thanks, Tina. I was so inspired by the law workshop they ran and the feedback I heard about what they did; I was really excited. So I thought of a unit I teach, that I'm involved in, which is a common core unit, which means all students in the Bachelor of Commerce have to do it, so they're always tricky. It's called Markets and Legal Framework, and it consists of economics, law and marketing all amalgamated into one, and I teach the economics portion. One of our assessments is the so-called Economics Principles Analysis, over there on the left; that's the original rubric. So we got the students in just before this assessment was released in semester two, this semester. We had some second- and third-year economics students who responded to the call for interest, and we got them together in a workshop to critique it. This is what they came up with; pretty interesting.

Just a quick side note: I thoroughly enjoyed interacting with our students and getting their insights, because you just realise one's age and one's gaps in knowledge. You think your rubric is the world's best, but then you realise it's not. So that's cool. Just a quick but important point.
The important point is that, as academics, we don't really get a chance to spend time with students to explain what we do and why we do it. So this is a great time to say: look, students, this is what I already do; is it good enough? For example, we have a video guide on this assessment. You can see on the right-hand side there my horrible writing and highlighting; I actually go through the assessment piece and record a video with annotations. They thought that was useful. And of course, when I'm doing that now, I'm linking back not to this rubric, but to the new rubric that they designed. So they watch the video and link it back; that's really important.

They were also happy with exemplars coming from the tutor: the tutor working through and explaining in class, here's a way of answering a particular economics problem, this is what you need to do to get a HD, a High Distinction, this is what characterises a good answer. So rather than providing a separate written exemplar, the tutor does it in class, keeping it more in-house to avoid, you know, leakage into AI, Gen AI taking over things. You've got the tutor sorting things out. So the tutor provides the example, and of course the lecturer does too.

Now, other key comments, more positives, and then we'll get to the negative at the end. They wanted it all in colour. So that was good, and we sorted that out; I'll show it in a second. But lastly, I thought this was a really insightful comment, and I'm radically changing things: TL;DR, too long, didn't read. I didn't realise I had everything in one document. It was just too much for the students. They said, look, it's just too much; we're not going to read it, we can't spend ages on it, not interested. So they asked me to separate things, divide it up, make it much cleaner and crisper. Look at that: there was just so much text that it turned them off. I didn't know. So that was good. And here's the end result.
On the left-hand side, you've got a lovely fruit salad rubric. How could you not want to attempt this assessment? It's just so delicious. And then on the right-hand side, you've got more specificity on how marks are allocated. I had this in my video guide and everything, so they can link into it. So students could clearly see what an exemplary HD mark is versus, say, one that meets expectations, and the corresponding overall judgment and assessment. Then they know exactly what they need to do to do well, or just to get through perhaps, and how to fail, but hopefully not. So that's the general gist.

We'll finish up now with a couple of things. I'll just name one, because Tina mentioned it beforehand. She talked about how, when you do a call for interest, you tend to attract the high-achieving students. We got gender diversity, that was really good; I made sure of different genders, and that was very useful. But we didn't really have students lower on the scale of achievement, because I knew those students personally from the economics side; they're very good students. So we got that perspective and those perceptions on the rubric redesign, but we didn't hear from other students. That's something we could improve in the future. That'd be my main comment. Excellent. Yeah. That's it from us.

Okay, great. So we can take some questions from the floor if there are any. Happy to do so. Great. Thank you so much. I love the fact that more work is being done on assessment. I'm a big David Boud fan, and I think that not only is summative assessment important, but formative too, and I love the involvement of students in this; it's very student centred. Are there any questions, either online or on the floor? So thank you to both Tina and Andy, sorry for the informality. Okay, well, while we're waiting, I have a question. One issue that might be raised, with student involvement in their own assessment, was there a trend, did you notice any, and I suppose subsequently in analytics, did you find that the assessment task was actually made easier so that students could get better marks?

No, not from the workshops that we conducted. A lot of the comments were more constructive, about how we can make the information in the assessment rubric more easily understood by students. I think that was the main focus, and I think the students adopted the same focus as well.

Yeah, it was definitely not about the assessment piece itself, but it kind of made it easier in an indirect sense, because we gave much more clarity, or at least I did in my economics one. In other words, there was less text for them to sort through. So in that respect it was somewhat easier, but not in a direct sense of making it easier to pass, as it were. The students knew they would still have to work hard; that's what came through in the economics workshop: I'm going to have to work hard to get my HD. That's still the case too. Yeah.

That's great. It sounds like it was made better as opposed to easier. Absolutely. Something that came out of the law workshop was that they wanted us to be a lot more prescriptive. So for instance, with content, they said, well, say how many cases we need to refer to, say how many journal articles, say how many pieces of legislation. They wanted that level of detail, and that's where I pushed back as the academic a little bit, saying no, again, we don't want to stifle innovation, and it's too hard to dictate exactly the number of references that need to be made with respect to a particular question, because each question was different for each student.
So it was a lot more of a negotiation process to ensure that, I guess, it was still academically rigorous.

Great. Are there any questions from the floor or from our online participants? Please feel free to type them in the chat or raise your hand for Anna. Thanks, Ray Jones, for the comment and the love heart. Love heart, that's great. I think it was David Boud himself who said something along the lines of: students can escape bad teaching, but they can't escape bad assessment. And so I think when we look at making assessment better, whether it be formative or summative, and when we make it more transparent and less holistic and subjective, as rubrics do, it's a great win-win for everyone. As a teacher and an academic in a past life, I know it's a lot easier to mark with rubrics, because everything's visible and everything's clear. And as a student as well, it's a lot easier to not have the guesswork involved. We have a question; I'll just read it out for Daniel. Did the university need to change their overall assessment guide to allow the new tasks?

No, no, we didn't. Before we embarked on this task, we did get human ethics clearance, and we also discussed it with our respective directors of learning and teaching within our schools. There was no requirement to change the overall assessment guide itself. This is also why we think assessment rubric design is a good place for students as partners: a lot of courses these days are really heavily regulated and prescribed, so it doesn't leave academics, I feel, with a lot of scope for innovation, including innovation that's led by students, whereas the way in which we provide assessment feedback doesn't tend to be overly prescribed.
So for that reason, we believe that students as partners initiatives can really be incorporated in this space. And of course, the benefits are that students can better understand what's required of them when they're completing the assessment task, and ultimately the feedback they receive after they complete the task, so that they can improve and implement that feedback in future assessments.

And I think we have a lot of flexibility, for some reason, to work on assessments and to modify them, as long as it's done before the due date, I suppose, or before the semester starts. I don't know, that could work in a negative way as well, but in terms of positives, we could implement things relatively quickly for the economics and law units; that was really a major benefit. So yeah, we could just do it. Thanks for the flexibility. Yeah, it's good.

I just have a quick question. So obviously you ran the focus groups, and from there you had some really interesting students who continued with your collaboration and co-creation. How did you close the feedback loop there? You had some students that were in the focus group stage; did they get to hear about where their feedback was implemented in the final result? And how did you go about that?

Absolutely, it happened in real time. We conducted the workshop in a seminar room, so we had the computer at the front and it was projected on the screen. Every time the students made a suggestion, as the facilitator I would say, oh, you mean like this? And I would implement it immediately, so we were able to see it in real time. The students who attended the workshop, once we finalised the rubric, were able to see it. And then, before we formally released it, students were given the offer to look over the rubric one last time to see if there was any further feedback that they would like to see incorporated.
So that's how we closed the loop: they were able to see it in real time, right then and there, and then obviously when we implemented it. And for the economics one, I just sent them my rubric adjustments, which hopefully incorporated all their feedback, to say, look, are you happy with this? And they were happy, so that was a bonus. Now we're happy to answer another question.

Some amazing questions and some equally excellent answers. I had one other question, if I might. Often in assessment there is, well, it's no longer a debate, but there's often a discussion between norm-based versus criteria-based assessment, and clearly a rubric would be in the criteria-based camp. Now, I'm not speaking about any university in particular, but in those discussions it is sometimes said that faculties like law tend to be bastions of the old norm-based assessment. Did you find in either of the trials that there was any sort of pushback against criteria-based as opposed to norm-based assessment?

No, not from these two particular case studies. We didn't notice any pushback with respect to the two cohorts. Can you think of anything? No, no major pushback or queries, those sorts of things. At least from the economics side, nothing. No. There you go.

Good. Okay, great. It sounds like criteria-based assessment is well entrenched, and I think a lot of people, myself included, would say that's a great thing. Okay, are there any other questions at all? I know we are running over time, so if there aren't any other questions, I would just like to thank Dr. Brennan and Ms. Stowe for their wonderful presentation on this very important topic. I know sometimes people think assessment is dry, and they've certainly proven that it isn't; it's vital and very important. Thank you again. Thank you, Ron. Thank you, everyone else. Excellent. Thank you, everyone.