Hi folks, my name is Rob, I'm a Senior Learning Technologist at Dublin City University in Ireland, and I just want to tell you a very short, nice story about how we used Moodle earlier this year to maximise student engagement with their feedback surveys. Out of interest, are many of you from the university or higher education sector here, or are you from different sectors? OK, there are some people here, so you'll probably be familiar with the idea of evaluating teaching: professors and lecturers collecting feedback from their students at the end of a semester, or at the end of a year, about their experience in the classroom. It's very, very widespread and it's used for a variety of reasons. Often it's mandated for quality assurance or regulatory reasons, and it probably serves a variety of purposes: to inform institutional review, program review and curriculum design, or indeed a lecturer's or professor's own professional development; they may collect student feedback to support a promotion or fellowship application. Lots of different instruments exist, but by and large I think issuing an anonymous survey is probably the dominant tool, although interestingly at DCU, whilst we do issue surveys to students, we also organise qualitative staff-student forums where matters and feedback can be discussed in a dialogic fashion. There's a healthy literature base out there around the problems with these types of survey instruments, not least of all the biases that often exist: it's been shown quite a lot that female educators, for example, receive harsher criticism than their white male counterparts, which is a shock, I know, but I won't get into that in this presentation.
At DCU we have a Student Survey of Teaching, or SSOT, which is a system we've had in place for a number of years. Each of our five faculties decides that X number of modules will participate in the SSOT process at the end of each semester. We run the SSOT using the good old trusty Moodle questionnaire plugin, set to anonymous, obviously, so student feedback is anonymous, and set to public mode as well so that the individual questionnaires on individual courses can all feed into a master questionnaire. We have a very useful plugin called Questionnaire Manager that actually deploys those questionnaires out to the dozens and dozens of courses participating in the SSOT: it takes the master questionnaire, places it in the top section of each of those Moodle courses, and is used to configure the opening and closing dates, the hiding and unhiding, and so on. On the left-hand side you can see the questionnaire manager block as it would appear on the master Moodle course, with the options there. I simply upload a CSV file of all of the Moodle courses that I want to receive the questionnaire, and I have some options there to hide and show, look at statistics and so on. And then on an actual Moodle course itself you can see a screenshot of section zero of one particular module, and the questionnaire just pops in there, so it's nice and easy for students to find. That's been the setup for quite a number of years, as I say. The tool is very simple: I upload a CSV file and the questionnaires appear on the courses. You can't make it much simpler than that. However, from time to time our staff complain about it, and they often deride or moan about the Moodle-based SSOT process.
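To make the deployment step concrete, here is a minimal sketch of what that kind of course-list upload could look like and how a tool might parse it. The column names, course shortnames and dates below are invented for illustration; they are not the Questionnaire Manager plugin's actual upload format.

```python
import csv
import io

# A hypothetical example of the kind of CSV a deployment tool might
# accept: one row per target Moodle course. Column names, shortnames
# and dates are illustrative assumptions, not the plugin's real schema.
SAMPLE_CSV = """courseshortname,opendate,closedate
NURS101-2023,2023-10-16,2023-10-30
HIST202-2023,2023-10-16,2023-10-30
EDUC310-2023,2023-10-23,2023-11-06
"""

def parse_deployment_list(csv_text):
    """Parse the upload into a list of per-course deployment settings."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        {
            "course": row["courseshortname"],
            "open": row["opendate"],
            "close": row["closedate"],
        }
        for row in reader
    ]

courses = parse_deployment_list(SAMPLE_CSV)
print(len(courses))          # 3 target courses
print(courses[0]["course"])  # NURS101-2023
```

The appeal of the CSV-driven approach is exactly what the talk describes: one flat file drives deployment, open/close scheduling and hiding across dozens of courses at once.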
Even though the tool itself technically works well, I think the process and the buy-in from staff and students around it probably aren't as simple, straightforward and reliable as the technical solution itself. One big issue is that the questionnaires are sometimes deployed after the teaching term finishes, and no student is going to take their time to fill in a questionnaire when they're off on their Christmas holidays or their summer holidays. Sometimes as well, students are maybe over-surveyed: they're completing surveys for multiple modules at once, and they don't really have any incentive, and it's not really promoted or marketed to them to complete it. So an opportunity arose this year, from my point of view, to try and consolidate and define some good practice around capturing student feedback on Moodle, as one of our faculties was undergoing a formal quality review and, as part of that, needed to capture students' experience of their teaching at program level within the faculty. I had a conversation with them and they were asking about different tools they could use. Would they use Qualtrics? Would they use Google Forms? At one point there was a conversation about distributing paper-based surveys and whether I could find a scanner that would scan in all the surveys, and I said, folks, no way. We have all the ingredients we need to make this work on Moodle, and here's how we're going to do it. So there were a couple of elements to this initiative. At the foundation of it was that Questionnaire Manager plugin I mentioned, which deploys the questionnaires on multiple Moodle courses. We're also very lucky that a few years ago the university moved towards creating program-level courses for all of our programs.
Now, these are Moodle courses like any other, but they're used by program chairs or program coordinators to deliver program-level information to students and help them build an affinity and an identity with their program, not just with the six or twelve courses or modules they're taking. Because the faculty wanted to collect student feedback at program level, the program pages were a useful vehicle for that. We also knew we needed to promote this well to students, to push and encourage them to complete it and let them know there'd be an incentive for doing so. We used another really useful plugin called audience message, which allows you to put a customisable message on all users' dashboards, so that when they log in they see a very media-rich or text-rich message targeted at them. And then lastly, we used some configurable reports to monitor the process as it was underway, so that the faculty knew which programs they might need to target, where they needed to push the survey a little bit more, which programs had a good response rate, and so on. Really importantly, this was very much led by the faculty: the associate dean of teaching and learning and the faculty manager were really invested in collecting the student feedback as part of their quality review, so I worked very closely with them on the roll-out. We rolled out the questionnaires in the middle of the semester, not after the semester when all the students were off on their holidays. They identified optimum dates and times to allow students some time to complete it in class, on their phones, on their laptops and so on. And our program chairs and program coordinators in the faculty really encouraged, pushed and spoke about this to their students, letting them know that their feedback was important and valued.
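The targeting idea behind the audience message approach can be sketched roughly as matching message rules against fields in a user's profile. The field names, rule structure and message text below are made-up illustrations, not the plugin's actual configuration format.

```python
# Rough sketch of profile-field-based message targeting: the first rule
# whose conditions all match the user's profile wins. Field names and
# values here are hypothetical examples, not real DCU data.

def message_for_user(profile, rules):
    """Return the first message whose conditions all match the profile."""
    for rule in rules:
        if all(profile.get(field) == value
               for field, value in rule["match"].items()):
            return rule["message"]
    return None  # no targeted message for this user

rules = [
    # Most specific rule first: target one program within a faculty.
    {"match": {"faculty": "Faculty A", "program": "MA Example Program"},
     "message": "Your program survey is open - we want your feedback!"},
    # Fallback rule: target everyone in the faculty.
    {"match": {"faculty": "Faculty A"},
     "message": "Faculty surveys are running this week."},
]

student = {"faculty": "Faculty A", "program": "MA Example Program"}
print(message_for_user(student, rules))
```

Ordering rules from most to least specific gives each program cohort its own dashboard message while still catching everyone else in the faculty with a general one.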
As I mentioned, we had targeted messaging, and there were raffle prizes for the students as well to encourage them, a bit of extrinsic motivation there. And then of course the progress reports were very useful for seeing which programs had a good response rate and which had a less-than-good response rate, and in some cases the dates for completion of the survey were extended as a result. This is what the audience message block looks like, as you can see there. It looks for certain fields in the user's profile in order to give them a targeted message; at DCU, for example, each of our students would have their faculty, their school and their course or program listed in their profile fields. So I was able to configure a number of these rules to target messages at the students on each of the specific programs we wanted to collect feedback from. And you can see then a very short piece of text encouraging them to participate in the survey. Really importantly, our associate dean recorded a little 30-second clip where she's really encouraging them, putting a face out there and saying, I want your feedback, this is really important, and that was part of the audience message as well. My colleague Matassem in the Teaching Enhancement Unit, who is a whiz on all things reporting with Moodle, then developed a really handy monitoring report using the configurable reports block to allow the associate dean and the faculty manager to see the level of completion as the process was underway. You might not be able to make out the full details, but you can download the slides later. For example, I'm looking here at the highest completion rates for the survey: the MA in Chaplaincy Studies with an 82 percent response rate, the Master in Education with a 66 percent response rate, the Bachelor of Religion with a 56 percent response rate, and a little progress bar there as well to give a visual indicator.
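The logic of that monitoring report can be sketched in a few lines: compute a per-program response rate and render a small text progress bar. The percentages match the figures mentioned in the talk, but the enrolment and response counts behind them are invented for illustration; the real report was built with the Configurable Reports block against Moodle's own data.

```python
# Minimal sketch of the monitoring-report logic: per-program response
# rates with a text progress bar, highest first. Enrolment/response
# counts are invented; only the resulting percentages echo the talk.

def response_rate(responses, enrolled):
    """Response rate as a whole-number percentage."""
    return round(100 * responses / enrolled)

def progress_bar(pct, width=20):
    """Render a percentage as a fixed-width text bar."""
    filled = pct * width // 100
    return "#" * filled + "-" * (width - filled)

programs = [
    ("MA in Chaplaincy Studies", 41, 50),
    ("Master in Education", 66, 100),
    ("Bachelor of Religion", 28, 50),
]

for name, responses, enrolled in sorted(
        programs, key=lambda p: response_rate(p[1], p[2]), reverse=True):
    pct = response_rate(responses, enrolled)
    print(f"{name:28s} {pct:3d}% [{progress_bar(pct)}]")
```

Sorting by rate is what makes the report actionable: the programs at the bottom of the list are exactly the cohorts that need another push from their program chairs.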
So this was really useful for the associate dean and the faculty manager to figure out where the problem areas were, where students weren't completing the feedback, and they were able to give a greater push and greater encouragement for those students to complete the survey. In the end, the associate dean and the faculty manager were very positive about the feedback they received. You can see a short little excerpt there from the associate dean: she said that the volume of data was really valuable and will be used very strategically within the faculty, both for the immediate quality review they were undergoing and, more importantly, for making improvement plans for the future. She also said that students found the process very straightforward, because of course they were just using Moodle, the platform they're used to day in, day out, that they know and dream of in their sleep; certainly I dream of Moodle in my sleep anyway, and I'm sure all the rest of you do as well. And you can see there the dean said she received no queries about the process, no issues arose, no confusion or anything, which I think is a testament to how smooth and how successful this initiative was.
So, just thinking about some of the strengths of this approach. The anonymity was important, obviously, and the questionnaire was set up as anonymous across all of the program pages it was deployed on, but we were still able to monitor progress. We weren't interested in what specific students were saying, we don't want that level of detail, but we did want to see whether each of the program cohorts was actually engaging and responding, and those configurable reports were really useful for doing that. Again, we were utilising all the existing tools: we didn't need to go and buy a new tool, didn't need to pay for anything extra, didn't need to worry about our data being hosted by some other tool or platform. It was all done safely and securely within our VLE, which we all know really well and are already very familiar with. And you can see as well that the response rates were very high: almost half of the undergraduates responded, and almost 30 percent of the postgraduates. Usually, I think, the regular module-level SSOTs might get a 0 or 1 percent response rate, so that's quite a significant increase, which is good. And a really useful range of feedback was then collected, because obviously if we're going out and really pushing, targeting, encouraging and maximising the number of responses we're going to get, we're going to get really useful feedback for the faculty to reflect on and to incorporate into their future strategic planning. So this initiative very much showcased an effective combination of our existing tools, at no extra cost, to facilitate, promote and monitor student engagement with their feedback surveys. It was fantastic that we had all those tools available, and it was fantastic that it all worked so well and so smoothly, but really the most important bit of this whole endeavour was the people involved in it, and the leadership that was shown
by the associate dean and the faculty manager to really push this and really engage with the students, and the students themselves, obviously, who bought in and gave really good-quality feedback. So even though we had a number of different Moodle technical pieces in play, the human dimension was the most important part of this, and I think that's really what led us to such a positive outcome. So thank you very much, I'll leave it at that.

Does anyone have questions for Rob?

Mine's a quick question: you mentioned the audience message block; is that available in the Moodle plugins database?

I don't think so, it's a Brickfield plugin, so if you pop down to Gavin and Karen they can explain a bit more about it, yeah.

Hi Rob, I was just wondering, do you do any feedback at module level then, or do you just do it at program level now? And if you do it at module level, how do you go about it?

So we do, and we've always done it at module level, just using the Questionnaire Manager block and deploying out the SSOTs at the end of each semester, so each faculty would nominate X number of modules to receive the module-level feedback. But there has been less-than-great engagement from the module leaders and from the students at module level, and the timing of when the module-level surveys are issued has always been problematic, so engagement at module level has not been good over the years. But I suppose we learned from that and took from it, and planned a really robust and well-thought-out program-level survey, which, as we can see, was successful.

Okay, thank you, yeah, we struggle at module level too.
Hi, thank you for that, really interesting. We run programs with 20 modules in there, and every module has a questionnaire at the end. The first four or five modules, great, and then you just see a drop-off, you know, because it's the same questions every time, but actually what we get back is invaluable. I'm just interested in two things. One, how do you shape a program survey that encompasses all of the modules? Because actually the feedback they give us, when we do get it on a per-module basis, is invaluable. And two, are you able to collate site-wide feedback information, so that you know that on these 15 programs, 80 percent have been rated excellent or good or poor or whatever it might be?

Yeah, so in this particular instance the survey was crafted by the associate dean and the faculty manager just for that one faculty, because they were looking for feedback on particular aspects to inform the quality review. So they defined some questions around teaching and learning, around facilities, around belonging, identity and affinity and so on, and they kept it broad because, as you say, across those 30 or so programs in the faculty there are hundreds and hundreds of modules. They didn't get down to nitty-gritty module-level detail; they wanted just a broad snapshot of students' program-level experiences. And even our module-level surveys are actually broad in nature: there's only a small number of questions in there, some open-ended, mostly to allow students the room to respond with whatever they need to. We don't try to make it too specific, just so it can be applied across multiple modules. But we don't currently aggregate everything together across every single survey that's been conducted, for certain internal political reasons, which I know is a shocker to everybody, that universities are political entities. But we could, essentially, because all the data is in Moodle, because it's
all there in the questionnaires, and because we have capabilities on the team for writing reports and so on, we could easily aggregate it all together.

Oh, you should be able to do that, shouldn't you? With configurable reports, because, I mean, it's built for that.

Yeah, yeah, yeah, so we could do it. We haven't yet, but we could do it, technically speaking. Yeah.

Any more questions? Thank you very much.