So hello, I'm Jessica Grant from UCL, and I'm here today with Matt Smith, who works as a learning technologist in the Engineering Faculty, specifically in the School of Management. If you were at last year's Moodle Moot you may remember me talking about the My Feedback report, and we're here today to give you an update on how that project is going; Matt's going to speak about the pilot that we're currently running in Engineering.

If you've never heard of the My Feedback report before, it allows students and staff to easily view grades and feedback across Moodle courses. It's available on Moodle.org, so you can download it, and the latest version was released last Sunday. The project aims to raise the visibility of feedback to students by showing them feedback from across their Moodle courses. At the moment, without the report, assessment feedback is very siloed: students would have to go into every Moodle course individually to see all of it. We're hoping that this project will further encourage staff to move to using e-assessment tools within Moodle, because that can really increase the speed of return of feedback as well as its quality and consistency. The initial functionality of this tool is really aimed at students, so we're focusing predominantly on the student view, as well as allowing staff to view what the students can see.

This is what the report looks like at the moment. As you can see here, it shows the module that the assessment piece resides within, and then the grades and feedback for a variety of activities: Moodle assignments, Turnitin assignments, quiz feedback, workshops for peer assessment, as well as any manual grade items that you might have added into the grade book. We're currently showing feedback for Turnitin versions 1 and 2, but a slight change to the code will enable Turnitin Next to be shown in this as well.
Now, the feedback comments tab is where students can view all of their general feedback directly in the report. That is, except for Turnitin: Turnitin doesn't actually have an API that would allow us to draw out the general feedback and show it in this report, and they're not planning to release one. So the best we can do is let students link through to that feedback. Alongside that, you'll see that students can reflect on their feedback by adding notes here, which their personal tutor can see.

The report also shows when the student viewed their feedback. If you can see the pointer here on the screen (I'm not sure if it's coming up), in the final column there are dates showing when the student viewed the feedback, and if they haven't viewed it at all you see a red cross. So a personal tutor can, in their meeting with the student, talk to them about why they're not actually going in and having a look at that feedback.

There are also links that take you to the feedback in context. So if you're using, say, a Moodle assignment with a rubric, clicking on the link will take you into that rubric so the student can see how they're doing; the rubric elements that they achieved are even displayed here on this report. The links are particularly important for quizzes, because in the report you only get the quiz's general feedback, which is probably of limited use; students probably want to go in and see feedback against each of the quiz questions, so this report links to their last attempt.

To access the report, students can either go into their profile and click on the link from there, or use the HTML block we've added to their My Home page, so that they see it when they first log into Moodle and can click through to the report. Despite this, there are still a lot of students and staff out there who aren't aware of this report, and that's something we need to work further on.
So the benefits for those students who are using this are displayed on the screen here; we've mapped these against the NUS Assessment and Feedback Benchmarking Tool. The report allows students to quickly see how they're doing overall and identify common areas for improvement, so that they can sit down with their personal tutors, and personal tutors have this as a guide, a snapshot of how the student is doing. One of the things we hear from personal tutors a lot is that they're not quite sure where to start in their conversations with students in these personal tutor meetings, so this tool is intended to help them support the students. The benefit to module tutors is that you can actually go in and see whether a student has acted on feedback you've given them previously. So if in the last assignment you asked them to focus on improving a certain area and they still haven't done that in the next assignment, you can bring that up with the student. At the same time, you can see whether or not they've actually viewed that feedback, and it tells you when they viewed it. So I'm going to hand over to Matt now, who's going to talk about the pilot.

Thank you, Jess. So as Jess says, I'm just going to talk to you a little about how students have reacted to this since we began piloting it on two programmes at UCL; I've been involved with the pilot from day one. The first programme is slightly out of the ordinary, not your normal run-of-the-mill programme: it's an integrated programme across the Engineering Faculty with around 650 students on it, so it's a very large cohort for us to work with. The second programme is one in the School of Management, Management Science. So roughly we've had around 750 students piloting the report since we launched it. We chose the first-year students on these programmes as they're quite a neutral group.
They're new to UCL and don't have any expectations about feedback, or potential prejudices about the tools we're using or the feedback they're getting from academics. We've really been able to run this pilot in the way that we have because we've had a project team with time to dedicate to it, and I think that's a really important point. It's had a full-time developer, and at the beginning it had a business analyst gathering the needs of the School of Management and the Faculty of Engineering, and they've been able to see this pilot through and touch base with students as we've been trialling it with them.

We've had two releases so far, and to accompany these releases we've run a number of focus groups and questionnaires to gauge how students feel about the features we've implemented. The response rate on the questionnaires was slightly lower than we'd hoped for, at roughly 5%, and this is disappointing, but because we're asking 750 students it still represents a reasonable sample to look at.

So what do the students say? Overwhelmingly, students have found it useful. They say they would rather have it than not have it, which I feel is positive in some respects. Unfortunately, a number of students still find it difficult to find feedback, and we've found that there are two main reasons for this. Either it's not there, because the academics haven't provided it electronically (we hope they've instead provided it on paper or by other means), or it's in a place that the report doesn't pull through. An example of that would be the Moodle quiz: if an academic has fed back directly on a specific question, that feedback is not going to appear in the My Feedback report; it's only the general feedback that gets pulled across. So one thing we've found concerns students on more scientific or mathematical courses, where specific feedback on certain questions is more often used.
They've found the report less useful. During the focus groups, students asked for a number of new features. Firstly, they'd like to see previous years' feedback, so they can see progression between the different levels. This should be possible because we have Moodle snapshots, in effect archives, dating back a number of years. They'd also like to see how they perform against the class average, and they all have different ideas about how they'd like to see that: some would like to see a rank, some would like to see how they place within a percentile. When I come on to their comments you'll be able to see that this has its advantages and disadvantages. The other things they'd like to see are assessment weightings (again, something available within Moodle but not something we're currently showing) and a formative and summative filter to show them which assignments and projects are counting towards their actual grade. The last request is that they'd like notifications when feedback is given, because currently it may not be the last assignment they've handed in that gets the latest feedback, so students are predominantly relying on private WhatsApp and Facebook groups to know when feedback is released. What we'd like to do is introduce a notification via Moodle that will then trigger an email and let them know that a new piece of feedback has been entered.

As well as producing feature requests, having this report has actually highlighted some areas where students would like changes in academic practice. In general, they would like more feedback; this pilot has highlighted that for some students there is, as I say, a lack of electronic feedback. Secondly, they'd like to see grades and feedback for final exams, as well as their formal grades. Currently Moodle displays provisional grades; all the formal grades sit in the student information system.
They would like those final grades to be re-imported into Moodle at the end of the year when they're confirmed, so they can come in and have that as a record of their grades. The final thing they'd like to see is the digitisation of hard-copy assessment, and therefore the feedback that accompanies it: lab reports, architectural drawings. They'd like to be able to go in, click, and see scanned versions of that feedback within the report. So I'm just going to pass back over to Jess, who's going to give you some thoughts on the students' feature requests and comments.

So as Matt's already mentioned, the report does highlight an absence of electronic feedback, which may be an issue. Students already know when they're not receiving feedback electronically, but one of the problems is that in some cases they are receiving it and they just don't know where to look for it. So we do need to work on helping students understand that they need to go into their quizzes to see quiz feedback, and into Turnitin to see the feedback in there as well. We would also like to enable the reminders for feedback. This is something that students have asked for, and I'd love to have the Moodle event monitoring system turned on so that this was possible, but there are concerns at the moment about potential performance issues. So if anybody is using event monitoring in their own Moodle installation, I'd be really interested to talk to you in one of the breaks about how that is impacting overall Moodle performance. So what's next?
As Matt mentioned, students have asked to be able to compare past years' feedback with the current year's feedback. We're working on that at the moment, and in the next release the report will be able to look back into our other Moodle installations to see how students have progressed through the years that they've been at UCL. And although we already have personal tutor and module tutor views, these don't really provide a way of identifying students who are struggling at this point, because, as I said before, we have focused on the student view first. So we'd like to implement this as well, and there'll be a programme administrator view which will enable them to see all the students in the department and similarly highlight and support those students who may be struggling.

Under review are a number of things that the project board might find a bit contentious, and there is some discussion over them, so they may or may not be implemented in future. Some of these are things students have been requesting, such as the summative and formative assessment filter. We're thinking about creating categories within the Moodle grade book to enable staff to move items into a summative folder and a formative folder, so we can show this information to students in a way that is reliable, because at the moment we're concerned that there are unreliable weightings in the Moodle grade book, and we wouldn't want to show those to students and have them think they were accurate. The performance score is similarly a bit contentious: one student commented that it might demotivate students who aren't doing very well. We're thinking at the moment that if we do implement it, you'll be able to see whether you're above or below the class average. And what I'd really like to see implemented is an assessment calendar.
This may get implemented in future if we get further funding for the project. It would allow the programme administrators to see all of the assessment dates for a programme, and enable them to be spaced out across the term so they're not all clustered around the same time, because this is adding extra stress to the students.

Some takeaway points from today. Communication to staff and students, letting them know that this exists, is very difficult. You all know that a lot of people don't read their emails these days, so trying to highlight that this report exists has been tricky, which is probably one of the reasons why we only got a 5% response rate from the students. It's unlikely that the Turnitin issue is going to be resolved any time soon, and if a personal tutor is looking at the feedback report they can't actually access any of the information directly, so they can't click on a Turnitin assignment and see it unless they are also enrolled on that Moodle course. This can be an issue, because they get a snapshot of how the student's doing but they can't see the detail unless the student is there with them and logged into Moodle.

And just a technical issue that I wanted to raise, which has been surfaced in the Moodle.org discussions where this plugin is available: if you're setting this up, you need to make sure that the permissions are set at site level rather than at course level. Because the report looks across courses, it sits above the course level, and if you assign permissions for this report to student or teacher roles at course level, it's not going to give the correct permissions. So I just thought I'd highlight that.
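To illustrate the site-level point: in Moodle, a plugin's capabilities declare the context level at which they are checked, and a cross-course report is normally checked in the system context. The following is a minimal sketch of a `db/access.php` capability definition for such a plugin; the capability name and archetype choices here are illustrative assumptions, not the My Feedback plugin's actual code.

```php
<?php
// db/access.php -- sketch of a capability for a cross-course report.
// NOTE: 'report/myfeedback:view' is a hypothetical capability name.
defined('MOODLE_INTERNAL') || die();

$capabilities = [
    'report/myfeedback:view' => [
        'riskbitmask'  => RISK_PERSONAL,
        'captype'      => 'read',
        // CONTEXT_SYSTEM is the key part: the report aggregates data
        // from many courses, so the capability check happens above the
        // course level. Granting the permission to a student or teacher
        // role only within one course context would not satisfy a
        // system-context check, which matches the behaviour described
        // in the talk.
        'contextlevel' => CONTEXT_SYSTEM,
        'archetypes'   => [
            'user'    => CAP_ALLOW, // students view their own report
            'manager' => CAP_ALLOW,
        ],
    ],
];
```

In practice this means the role override or assignment has to be made from the site administration pages (system context), not from within an individual course.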
I'd just like to quickly acknowledge the people who initiated this project. It's based on work done at the IOE by Tim Neumann and Dr Gwyneth Hughes; we were actually able to take the code that they were previously running at the IOE and build upon it for this report. And locally at UCL, I'd like to acknowledge Professor John Mitchell and Dr Jason Davies, who are really the champions behind this project and one of the reasons why it's running. I'll just leave these useful links on the screen if you want to download the report or find out more, and I think we're going to be asking for questions now, so thank you.

Hi, Michael Hughes from the University of Strathclyde. We've built almost exactly the same system that you've built; we'll talk about it a bit later on. So I'm really interested in your approach to getting the data out in a performant way, because our experience is that it's actually quite difficult to get all of that data out of the Moodle database quickly and cost-effectively. So I'd be interested to know how you guys are dealing with that cost, and sort of keeping that data up to date in terms of...
Sorry, I don't think I understand.

So, in terms of extracting things like some of the stuff out of mod_assign feedback, you've got to go into the assignment and ask the assignment to provide that information. We found that to be a very costly operation; how do you do that for every single module? So I'd be interested...

OK, so as in the queries you're writing to draw those out? How do you manage that, in terms of accessibility for adding other modules in as well?

So we already had the code available to us to draw out all of that information. I think the problem we're facing instead is having staff add the feedback into the correct areas so that it displays through into the report. But the development of that SQL code, if that's what you're talking about, was already there, so we just took that code from the IOE, implemented it, and put it into a Moodle plugin. There's no actual cost in terms of running the report now and getting that information out of the database.

A question: you mentioned you asked the students whether it was useful. Did you measure how widely it is actually used, how many students are actually going and looking at it?

We haven't actually looked at that. I believe it's being logged, but we've only just released the latest version, so we need to go into the logs and see. I think we're actually probably going to have to go into the Moodle database, into the log table, and actually see how many times it's been accessed. I'm not sure how else we'd do that.
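For the usage question, one way this could be done is by querying Moodle's standard log store, which records a row per event with the component, action, user and timestamp. The sketch below is an assumption-laden illustration, not the plugin's actual code: it assumes the default `logstore_standard` is enabled, and the component name `'report_myfeedback'` and action `'viewed'` are hypothetical placeholders for whatever event the report really logs.

```php
<?php
// Sketch: counting report views from Moodle's standard log store.
// Requires a running Moodle environment; names below are illustrative.
defined('MOODLE_INTERNAL') || die();

global $DB;

// Distinct students who opened the report, plus the total view count,
// since an arbitrary cut-off (e.g. the latest release date).
$sql = "SELECT COUNT(DISTINCT userid) AS viewers, COUNT(*) AS views
          FROM {logstore_standard_log}
         WHERE component = :component
           AND action = :action
           AND timecreated > :since";

$stats = $DB->get_record_sql($sql, [
    'component' => 'report_myfeedback',   // hypothetical component name
    'action'    => 'viewed',              // hypothetical action name
    'since'     => strtotime('-30 days'),
]);

echo "{$stats->viewers} distinct students, {$stats->views} views\n";
```

A dedicated view event fired by the report page would make this kind of query straightforward, which is presumably what "going into the log table" would amount to in practice.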