Greetings, and welcome! So, the presentation I'm going to go through today is a joint project I'm doing with Gavin, who has been a ball of stress walking around the Moot the last few days. The Twitter name is there, and of course feel free to use the hashtag as well. The reason why we came up with this particular project, or this idea, really stems from... well, everybody's interested in learning analytics, but it really stems from our president, or what you would call your vice chancellor. He's a really nice guy, by the way, for the record, but they go around saying that we must meet certain statistics. They put metrics in front of us to say our college is better than the college next door, our college shows X, Y and Z, and one of the statistics our vice chancellor set was that 80% of our courses will be blended by the year X, whatever X may have been. I'm sure you've all had similar targets set for you, but we had no way of measuring what a blended course is, because every blend is different: different cooks will cook different things with the same ingredients. So we wanted to measure that blend in some way. I was a little bit cheeky with them at the very start, not to his face, but I said, well, all of our courses are blended, because they all have a Moodle space available by default through our student record system. That excuse didn't go down too well at all, really.

Just to give you a little bit of background as to what my role is, and why I was landed with this task of measuring the blend: I head up the Teaching Enhancement Unit, quite a small team within DCU. There are five of us, and we have 16,000 students and over a thousand staff. We're responsible for professional development for our staff and for supporting online learning and blended learning, and that involves managing Moodle. So I am the Moodle administrator within DCU, along with all the other learning technologies that go with that.
But we also provide and run training courses and professional development programmes for staff, as well as supporting research into teaching and learning by issuing awards and grants and so on. We have, as I say, one Moodle instance, 16,000 students, five faculties and 21 schools. We'd have 300 plus, probably closer to 400 plus, lecturers across the variety of different programmes, and I put down X programmes because it's so hard to get a final number for how many programmes you're running in any particular year. But the challenge remained: how are people, both students and staff, engaging with the VLE, and is it making a difference? Is it worth the investment? So we looked at the engagement element, literally from a student's perspective: how many clicks are they making, how well do they interact with it? And we actually did a project to see whether that had some effect on their final grade. I was delighted to find that there was a correlation between engagement with Moodle and students' final grades. We measured, I think it was, 28 million different clicks over a period of a couple of years.

Excuse the blurred chart, it's purposely blurred because there are student IDs on it, but let me explain it. The students' names are on the left, and the course code or programme code they're with is in the beige colour. On the right-hand side of the screen you will see a colour-coded chart where students are given a number from one to nine depending on how active they are relative to their classmates. Red and low numbers mean low engagement relative to their classmates; high numbers in blue mean high engagement. By looking at these figures, we were able to see that the third student down started off with low engagement relative to his class, because it's a slightly red-shaded figure.
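The one-to-nine banding described above could be computed along these lines. This is a minimal sketch of one plausible approach, not DCU's actual code: the function name and the ranking rule are assumptions for illustration.

```python
# Hypothetical sketch: band each student's click count from 1 (low) to 9
# (high) relative to their classmates, as in the colour-coded chart.
def relative_engagement(clicks_by_student):
    """Map each student's click count to a 1-9 band within their class."""
    counts = sorted(clicks_by_student.values())
    n = len(counts)
    bands = {}
    for student, clicks in clicks_by_student.items():
        # How many classmates clicked less than this student did
        below = sum(1 for c in counts if c < clicks)
        # Integer arithmetic keeps the banding exact (no float rounding)
        bands[student] = min(9, 1 + (below * 9) // n)
    return bands
```

For a class of nine students with click counts 10, 20, ..., 90, the lowest student lands in band 1 and the highest in band 9, which matches the red-to-blue scale on the chart.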
When we started looking at this, we were able to track whether students were falling off at any particular time, and we then started to look at the final results. Did it make a difference? We had non-participants and participants in the study, and I can explain that at the questions-and-answers stage. But we were able to see improvements in grades for students who engaged with our Moodle content, and particularly if they engaged with that particular programme. We looked at these modules and, as you will see from the titles, we have everything from computer science to law to politics and so on. So what I wanted to do was look at the make-up of those courses: actually get inside the Moodle courses and see, well, are they all PowerPoint files, or are they all quizzes, or are they wikis? And again, see if there was some form of correlation between the increase in grades, where we noticed a benefit, and the course structure. And that's where this whole business of getting under the bonnet of people's courses came from.

But, unintentionally, we started to get a little bit personal, where we went to the lecturer of CA168 and said, well, you don't have as many quizzes as this guy does. The whole name-and-shame element came into it, and it took us down a very, very dark path. So we decided not to go there. I was quite disappointed that a lot of our lecturers were only using it in a dump-and-pump style, as a repository course. But also, following on from Michael's talk earlier on, it's something we didn't want to encourage: isolating people who aren't engaging with the VLE enough or, in my case, efficiently.
So what we decided to do instead, again recognising the breadth of activities that's available to you through Loop, and that how I would use a discussion forum could be different to how Davo would use it, and different again to how Baz would use it: it's quite difficult to make a judgement on interactions and so on. What we decided to do was look at courses at a category level, so we could take an umbrella view and avoid isolating individuals. Our Moodle is set up in such a way that there's a category for every faculty, then a subcategory for every school, and a sub-subcategory, if that's proper terminology, for the programmes. We wanted to generate a report that would say, well, here's where you're using it for content, just dumping resources up there, PDFs and all sorts of other file types; here's where you're using it to manage your assessments; here's where you're using it for interaction, giving students the opportunity to interact online; and indeed here's where there are opportunities for collaboration. Those are the four elements we wanted to judge.

So just to break that down. How am I doing for time, Tim? Great. To break that down, we wanted to look at all of the Moodle activities available to us and decide what we could tag as content transfer and what we could tag as assessment-based. In terms of content transfer, and I do appreciate there's a lot of numbers thrown up on the screen, we looked at pages, books, files, folders, IMS content, labels, URLs, lessons and so on. We decided to see, in this category, how many lessons do we have, how many pages do we have, how many books do we have, and so on. Now, when we looked at it, we were faced with a challenge: quite a lot of lecturers had loads of stuff up there, historical stuff which they don't show to students.
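The tagging of activity types into the four elements could be sketched as a simple lookup. The category names follow the talk; the module name strings and the exact assignment of each module to a category are assumptions for illustration, not an official Moodle taxonomy.

```python
# Hypothetical mapping of Moodle module types to the four blend categories
# described in the talk (content transfer, assessment, interaction,
# collaboration). The lists mirror the ones read out from the slides.
MODULE_CATEGORY = {
    'page': 'content', 'book': 'content', 'file': 'content',
    'folder': 'content', 'imscp': 'content', 'label': 'content',
    'url': 'content', 'lesson': 'content',
    'quiz': 'assessment', 'assignment': 'assessment',
    'workshop': 'assessment', 'scorm': 'assessment',
    'survey': 'interaction', 'choice': 'interaction',
    'feedback': 'interaction', 'questionnaire': 'interaction',
    'chat': 'collaboration', 'forum': 'collaboration',
    'wiki': 'collaboration', 'glossary': 'collaboration',
}

def categorise(course_modules):
    """Count a course's modules under each of the four blend categories."""
    counts = {'content': 0, 'assessment': 0,
              'interaction': 0, 'collaboration': 0}
    for mod in course_modules:
        cat = MODULE_CATEGORY.get(mod)
        if cat:  # skip module types outside the taxonomy
            counts[cat] += 1
    return counts
```

Rolling these per-course counts up through the faculty, school and programme subcategories gives the category-level view the speaker describes, without singling out any one lecturer.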
They may have hidden it, or they may just be too lazy to delete it, and have content, I won't say rubbish, that they no longer use on their courses. So we decided to look at the number of active items. An item is deemed active if it is accessed, clicked on, by a student within a certain time period. That's important, because when we generate this report we can define the time period we're looking at, and it also deals with the fact that a lot of lecturers, as I say, may have a whole load of historic stuff left up there. We looked again at the number of enrolments, but also at the number of active enrolments. In doing the student engagement study, we actually found that quite a large portion of our students weren't even clicking into the VLE at all, which was quite frightening. So we generated, essentially, a figure for each one of those.

Looking at assessments, we looked at the quiz, assignment, workshop and SCORM packages. Again, we accept the limitation of this, because you may use a discussion forum as an assignment type. You will see we looked at the number of activities and the number of active activities, same principle again, and then at whether submissions were made and how many were graded, because we could extract all of this information. The other thing we wanted to look at was the average grade for the assignments that were given, to give us more information. Then, in terms of interaction, this is one of the smaller elements: we had the survey, choice, feedback and questionnaire. I'm not going to go through each one of the columns, but it's the same sort of thing replicated. For collaboration, and I'm just putting those ones up there, we had the chat, the forum, the wiki and the glossary. The challenge we have, when we get all of these different stats and figures, is: how do we visualise this sort of data?
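The active-item and active-enrolment rules above amount to filtering click logs by a reporting window. A minimal sketch, assuming simplified log rows of (item id, user id, unix timestamp) as stand-ins for Moodle's real log tables:

```python
# Hypothetical sketch of the "active within a time period" rule.
def active_items(log_rows, start, end):
    """Item ids clicked by at least one student within [start, end]."""
    return {item for item, _user, ts in log_rows if start <= ts <= end}

def active_enrolments(log_rows, start, end):
    """User ids who clicked anything within [start, end]."""
    return {user for _item, user, ts in log_rows if start <= ts <= end}
```

Counting only the items returned by `active_items` keeps hidden or abandoned historic content out of the report, and comparing `active_enrolments` against total enrolments surfaces the students who never click into the VLE at all.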
How do we actually give it to the head of a school or the head of a programme, and so on? And really, I'd love suggestions, because we are at the development stage. What I have in my head at the moment, which is a very dark and dirty place to be, is some form of visualisation like a pie chart or a spider diagram. We want to be able to show a head of school: here's how blended your programmes are, here's what the university's overall blend is like, and here's how you fit relative to the university. I'm sure it's the same in your institution: when you make it a little bit competitive, heads of schools will start cracking the whip. So that's one potential visualisation. I got loads of help from the crowd last year when we were doing the relative engagement data and the relative assignment grades, so any tips or feedback you can give us, or anything you think might work, would be very welcome. The idea is that, where we are measuring all of these different activities, it's just going to be a report, and with it being a report it will be available to everybody. It's a category-based report, and we're going to be measuring content, assessment, interaction and collaboration. Thank you very much.
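The programme-versus-university comparison the speaker sketches could be fed to a pie or spider chart by normalising the four category counts into percentage shares. All numbers below are made up for illustration; only the four category names come from the talk.

```python
# Hypothetical sketch: turn raw category counts into percentage shares
# suitable for a pie chart or spider diagram.
def blend_profile(counts):
    """Convert category counts to percentages of the total (one decimal)."""
    total = sum(counts.values()) or 1  # avoid division by zero
    return {cat: round(100 * n / total, 1) for cat, n in counts.items()}

# Invented example figures for one programme and the whole university
programme = {'content': 120, 'assessment': 40,
             'interaction': 20, 'collaboration': 20}
university = {'content': 900, 'assessment': 300,
              'interaction': 200, 'collaboration': 100}

for label, counts in (('programme', programme), ('university', university)):
    print(label, blend_profile(counts))
```

Plotting the two profiles on the same spider diagram would show a head of school at a glance how their programmes' blend sits relative to the university-wide shape.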