[Welsh-language introduction, not transcribed.] ...part of the project, and this is actually part of the Moodle successes we've had within that. So, we came into this project aware of a couple of issues with Moodle. The first one is that people seem quite scared of the gradebook. It's quite time consuming to set up, and we also find that for a lot of people, what they're doing is quite monotonous; it's all the same structures all the time. We also find that people then pick up the gradebook and latch onto things like calculations because they understand Excel, and these are quite brittle features of Moodle. We wanted to try and make something much more robust, to prevent the gradebook becoming an obstacle to the teaching and learning that the staff are trying to deliver. On top of that, we have organisational issues. These are some examples of the different forms our institution uses to describe a module; there are at least six different varieties of paper form, and there's no consistency between them. So, for us as the central IT service, trying to communicate some of the information in there is quite difficult. At the start of the project, we took a random sample of the module descriptors, went through and identified the issues, and tried to cherry-pick the ones we really wanted to fix. We found lots of really, really wacky issues in the actual content people had put in there. So, students were given module descriptors that said: your assessment is 30% coursework.
Well, that's 30% of 100; where's the other 70% coming from? We had terminology such as "sufficient" or "normally", again with the students not left with any guidance about it, and then we had really complicated calculations given to students as "this is how it works". So, we tried to take a pragmatic view, and we ran a series of workshops where we tried to condense all of this down to a terminology that everybody could use. Going into these workshops felt a bit like being Julius Caesar; we thought everybody was going to be out to get us. But we broadly settled on four really broad categories that covered everybody. If you start thinking about your activities in these broad terms, we can do something to help communicate the assessment structure to the students. Now, you may recognise that some of these categorisations are fairly close to Unistats and the Key Information Sets that we had to do a couple of years ago, and again that's consistency with the student experience: trying to find ways that students can have a thread of understanding following through their course. So, once we settled on these four component categories, we went about coming up with a core gradebook structure. We recognise that there are different pieces of data that appear in the gradebook. We had come up with these broad topic headings, and the purists in our institution made a big to-do about the fact that these things aren't quite what they say they are. We basically had to turn round and say to them: we've co-opted these terms. They might not mean what you traditionally take them to mean, but in the context of our Moodle we're going to use them to mean certain things. So, summative activities contribute to the students' grades; formative ones don't contribute, but the students still go in and do them.
Administrative items are for information; they're there for, you know, tracking all of that information we've collected. We took this structure and then mapped our component parts onto it. Now, due to the quirks of Moodle and how we work with the gradebook, we had to do a little bit of reorganising in terms of creating some additional categories. We split it up into a description of standard activities, which tend to be for first-attempt students, and then resit activities, for students who are in that class but doing a slightly different assessment regime. The important part is that when we actually start showing the gradebook to the students, only the categories that are pertinent to their assessment regime are presented to them. So, if you have a 60-40 split between coursework and examination, you'll never see practical or project. And it's trying to tidy that up with the staff. So, this is our list of classes. We had to build a tool in Moodle to try and capture this data set. For each class, we ask for a really rudimentary breakdown of the assessment regime. We're not asking them to go down to "you're going to do six essays" and stuff like that; it's a really high-level description. And we ask them to explain really broadly how those component parts sit together. Once it's filled in, that's what we get. So, once they fill this in, we now hold this as data, and we found that in a lot of cases this is sufficient to get that broad description. This then allows us to put in the weightings. Most people, we found, tend to work in percentage mode, but we did find a couple of places where people were really adamant that they want thirds, and if you're doing Moodle gradebook calculations in thirds, it doesn't work. Once we've done this, we can now apply this to the Moodle gradebook. Now, that's normally fine if you've got a new class.
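To illustrate the thirds problem just mentioned, here is a minimal Python sketch (not from the talk; the weights are illustrative): a percentage field stores a decimal, and a third has no exact decimal representation, so three "equal" components never sum back to exactly 100.

```python
from fractions import Fraction

# Percentage weights are stored as decimals, so a third cannot be
# represented exactly: three "equal" components do not sum to 100.
decimal_weights = [33.33, 33.33, 33.33]
print(sum(decimal_weights))  # roughly 99.99, not 100

# Exact rational weights avoid the problem, but a percentage-based
# gradebook field has nowhere to put them.
exact_weights = [Fraction(1, 3)] * 3
print(sum(exact_weights) == 1)  # True
```

This is why a regime defined in thirds can't be faithfully expressed in a percentage-mode gradebook.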
We did look at migrating all the existing assessment regimes into this. We wrote a wizard tool to try and do it, and we spent a lot of time on it. It doesn't work; it's too dumb. For the simplistic cases it's really fine, but in the end we had to bite the bullet and say that the Moodle way of working with the gradebook is going to have to suffice for now. There were too many cases where people had set up fancy aggregations and things like that. But you hit that apply button and it automatically sets up the gradebook just as you'd like it, and this is what you get. We have a structure which is set up with the percentages at the top level. That data will also get synced on an automated basis as well, so you don't have to push the button. It goes and creates a number of top-level categories. In order to communicate that top-level class total, we've actually set the summative category to be weighted one and everything else to be zero. So, for all the students who are doing the subject for the first time, the class total is what they should expect to see, and we use grade items to display the other information, under the resit conditions. One of the issues we found is that people can edit things in the gradebook with a lot of freedom. So, to stop it turning into a comedy of errors, we basically disabled all the editing of the top-level categories. Unlike the standard Moodle gradebook where you can move them about, the categories that define the core structure, such as examination or practical, can't be moved, can't be reweighted, and can't have their aggregation modes changed in terms of how they sit together at the top level. Below that, we offer them all the Moodle functionality that they want. So, if they want the flexibility of putting six quizzes into a category and saying you're dropping the lowest two and then taking the average of the rest, they can do that.
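The weighting scheme described here (summative weighted one, everything else zero, with a drop-lowest rule inside a category) can be sketched roughly as follows. This is a hypothetical illustration of the arithmetic, not Moodle's actual internals; the function and category names are made up.

```python
# Hypothetical sketch: only the summative category contributes to the
# class total, and a sub-category can drop its lowest n scores before
# averaging, as Moodle's "drop lowest" aggregation setting does.

def category_mean(scores, drop_lowest=0):
    """Average a list of scores after dropping the lowest n."""
    kept = sorted(scores)[drop_lowest:]
    return sum(kept) / len(kept)

# Top-level weights: summative counts, formative/administrative don't.
weights = {"summative": 1.0, "formative": 0.0, "administrative": 0.0}

grades = {
    "summative": category_mean([55, 70, 40, 62, 48, 35], drop_lowest=2),
    "formative": 80.0,
    "administrative": 100.0,
}

class_total = sum(weights[c] * grades[c] for c in weights)
print(round(class_total, 2))  # 58.75: formative work never moves the total
```

The zero weights mean staff can record formative and administrative marks freely without ever disturbing the class total.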
But that will only contribute to a fixed part of the grade. Ideally, what happens is those module descriptors all get done, we migrate the data into our dataset, we apply it to the gradebook, and at some point the teaching happens. All we have to get the staff to do is, when you decide how you want to implement a particular bit of teaching, you put it in the right category, and that's all you have to do any more. So, we come back to this idea, and we'll see that the class total, as it fills out, gets worked out automatically at that top level. This works really well for the gradebook. We also wanted to make sure that the data we had in here fed into a marks return process. This actually led us to identify a bit of a Schrödinger's cat situation: some of our courses typically have multiple potential values that a student can hold at any one moment in time, which we can't resolve with the data that we hold. If they're first-attempt students, they get the standard total. If they're a research student, we don't hold that data in our VLE. And if there's an exemption criterion, some of the exemption criteria are so complicated that we can't actually work them out algorithmically in the gradebook. So, what we've had to do is present at least three different class totals to a member of staff and then ask them to decide. This takes us into the marks return processes that we do at the end of the academic year. We provide this report to them now, and it's meant to replace a lot of the spreadsheets that our staff have been using; we're getting it into the VLE. What we're asking them to do is not do any of the maths or the calculations, but just choose which conditions apply. Now, on this screen, if they so decide, they can override that course total. We have cases where, in some classes, if you get 39% you automatically get upgraded to 40, and they can do that on this screen.
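The "Schrödinger's cat" situation and the 39-to-40 override might be sketched like this. The rule and numbers below are illustrative assumptions only, not the institution's actual criteria or implementation.

```python
# Hypothetical sketch: a student has several candidate class totals,
# and a member of staff picks the condition that applies. The 39 -> 40
# borderline upgrade is modelled as an optional override rule.

def candidate_totals(standard, resit, exemption):
    """The totals that can't be resolved automatically from VLE data."""
    return {"first attempt": standard, "resit": resit, "exemption": exemption}

def resolve(candidates, condition, borderline_upgrade=False):
    """Staff choose the condition; an optional class rule may then apply."""
    total = candidates[condition]
    if borderline_upgrade and total == 39:
        total = 40  # illustrative class-specific rule: 39% upgrades to 40%
    return total

totals = candidate_totals(standard=39.0, resit=52.0, exemption=61.0)
print(resolve(totals, "first attempt", borderline_upgrade=True))  # 40
```

The point of the design is that the human choice is the only input requested; all the arithmetic stays in the system.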
We also had to bite the bullet and say that some of you are doing stuff that is so wacky we can't model it. So, we've had to offer the ability for them to take the data out of the screen, put it into a spreadsheet, and then re-import it back in. But once they do that, every mark that goes on there is validated against the corresponding marking scheme that's been set up for that course. Hitting the go button reformats the data and sends it to our student result system, and off it goes; any issues get reported back on that screen for them to deal with. So, that's what we've done to try and make our assessment regimes flow all the way from the design stage, through the teaching, and right through into the delivery, when we actually have to do the marks return. As I said, this is still a work in progress. We've had successes in terms of the Moodle development and the features in Moodle. One of them has been the discussion around how we describe assessment regimes; they are very complicated, but some of the information students want is actually quite simple, so we've been having a good discussion about bridging that gap. Having the ability to deploy a standard gradebook to every course, and also to redeploy it if the regime changes, or to say "apply that gradebook assessment regime to a different site that I'm going to use to replace that course", has been incredibly useful. We've managed to solve the bootstrapping of class gradebooks for our staff, and hopefully that's reducing the cognitive load on them in thinking about where things have to go and what they need to set up. The separation of the assessment structure from the course delivery has been helpful: when we separated them into different data structures, those data structures became more versatile. They can be reproduced and repurposed.
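The re-import validation step could look something like this in outline. The marking scheme shown is a hypothetical percentage scheme, not the actual implementation; it only illustrates the idea of checking every incoming mark before anything is sent on to the student result system.

```python
# Hypothetical sketch: every mark re-imported from a spreadsheet is
# checked against the marking scheme configured for the course, and
# any failures are reported back rather than passed downstream.

def validate_marks(marks, scheme):
    """Return (student, mark) pairs that fall outside the scheme."""
    issues = []
    for student, mark in marks.items():
        if not (scheme["min"] <= mark <= scheme["max"]):
            issues.append((student, mark))
    return issues

percentage_scheme = {"min": 0, "max": 100}

marks = {"s1": 67, "s2": 105, "s3": 40}
print(validate_marks(marks, percentage_scheme))  # [('s2', 105)]
```

Only a clean run would then be reformatted and forwarded; anything flagged goes back to the member of staff on the same screen.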
But we've managed to retain the flexibility within that assessment regime, or rather within the teaching delivery of that assessment regime, inside a defined envelope of practice that has to go through quality control and the policies at work. What wasn't so successful? The process isn't fully running yet. This isn't our fault: our student record upgrade got delayed by a year, so we are still sending it to the old student record process. It has created another data set in our institution; arguably, that data set is more useful than the one we worked with before, because it's no longer continuous prose, it's data. It hasn't entirely eliminated spreadsheets in a number of cases, especially where there's logic involved. On the upside, in the next version of Moodle, logical conditions come into the gradebook, and that's going to allow us to do some of those things for the people who use spreadsheets to make decisions before putting marks back into the gradebook; that's going to make a big difference to those processes. It has also highlighted a number of things that we would quite like to see, that we don't think are currently in Moodle, that really would help this process for us. The idea of a master template for a gradebook that can be applied and propagated throughout an institution would be really helpful for us; we find that a lot of that process is just repetitive make-work, because it's broadly speaking the same. The gradebook UI is still really cumbersome when trying to move things about; it's not drag and drop, it's click and point. The calculation editor: we'd love to see more work done on that. We're comparing it against Excel's formula bar in some cases, and almost everybody's happy with how that works; it gives you really clear cues. In Moodle, you're working with double square brackets and identifiers that aren't the identifier or the name of the activity, and it's really complicated.
The aggregation tools we found to be really useful; they work really well, and they solve a lot of the maths cases that we wanted to hide from the staff. What they fed back to us is that they want to see the working. They want to look at a particular student's value and see the breakdown of how that came about. With calculations, they're really comfortable with spreadsheets, because they can go in and see that information; within Moodle they're not so happy, so we'd love to see ways that that gets sorted. And again, gradebook calculations feel really brittle, and we still have to rely on them; we'd love to see features in Moodle that make them more obvious and more robust, because if you've predicated everything on that gradebook and it breaks, it's really hard to find out exactly where it's broken, especially if you're a teacher and you want to be teaching. So that's what we've tried to do with assessment management: integrate it into a single whole process and reduce the cognitive load on our staff, so they can actually concentrate on the teaching activities rather than working out how to add them up. That's it, thank you. Thank you. Questions? Over there, Bob. Thanks a lot for showing that to us. It's a really cool project, I think it's amazing. Can you talk a little bit about how the project was initiated, and who sponsored it? How did you make it happen? We have been working incrementally on bits of this pretty much since we started with Moodle. Three years ago we introduced some tweaks to Moodle that implemented our assessment and feedback policy, and that started to show our academic staff how we could ask for data in these things and start using and reusing it, and that caught the eye of more of our senior staff.
So this has actually been driven out of our learning enhancement committee as an explicit statement: this is something the institution wants the VLE to do. That statement on the very first slide was actually fed back from our vice-dean for learning enhancement, to present to our senate, saying this is what we want to do with the management of assessment: we should be building tools which allow our staff to teach and not administrate. So that's how we did it. But it took a long time to get to the point where we could go to those committees and say, we think you should drive this, not us; we don't want the technology driving it. OK. Hold on, can we just have the microphone so people can hear you at the back? [inaudible exchange about the microphone] Did you encounter any problems with Turnitin? No. We have integrated Turnitin using the plagiarism plug-in rather than using other tools, so as far as we are concerned, our students and staff interact with the Moodle assignment module, and the Turnitin processing happens in the background. From that perspective, they don't think about it. We have a default configuration that is set up. Again, it was about making... we have a policy that says work should go through Turnitin, so it's sort of compliance through convenience: we default the setup so they don't have to change it, and it's just there. So it hasn't impacted us from that perspective. We are basically using core Moodle features as the mainstay for all the teaching activities. OK, one more question. Yeah, just behind you. Behind you, Doug. It's great stuff, Michael. It's just a comment really: for institutions who are interested in learning analytics, getting consistent gradebooks is one of the challenges of pulling together an institutional overview. Yeah, absolutely.
We have quite a developed learning analytics programme, and part of that is again taking it back to our vice-deans and vice-principals and saying: if we can turn this into data, we can make it part of learning analytics. But it has to be data. It can't just be written down on the back of an envelope, because that's just liable to be, you know, broken. OK, thank you very much. Thank you, Michael.