So thanks again for joining us today. I'm just going to update you a little on some of the work we've been doing in the Forum lately, and then I'll talk a bit more about today. To anyone who's just joining us, I'd remind you please to mute your mics as you come in, and let you know that we are recording. So, student success is one of the four strategic priorities of the Forum, and this particular webinar comes in under the DESI project, the Data Enabled Student Success Initiative, which is a stream within the student success priority. It's under the DESI project that we've been working with institutions over the last year and a half, and I'll come back to that in a few minutes. Under the DESI project as well, I'd like to inform any of you who don't know, or remind any of you who do, about Orla, our online resource for learning analytics, which is available through the Teaching and Learning website. I'll be distributing these slides afterwards, so this will be available to you. Orla is a publicly available web library of resources covering everything from strategy development to data quality, making effective interventions, and so on. We also have a suite of ever-expanding case studies, with a few new case studies to add in there as well, on how institutions and teaching staff in Ireland are currently using data to enhance their practice and to support student success. And I've also given a URL, a connection to one of our recent publications, which is specifically designed to support institutions looking to develop strategies.
It runs through the key stages of strategy development and links to the relevant resources within Orla that will help support that, and that link is there. This is also the second in a series of webinars, and I've just put up the URL for the recording of the last one, which we hosted in May of last year, again with some really fantastic speakers and some really interesting things happening. As part of it, as we draw towards the end of this phase of Orla and DESI, one of the things I've been doing is going back to the institutions we've been working with, just to get a sense of the impact the project has had to date. So I've had semi-structured interviews with 13 of the institutions we've been involved with: universities, institutes of technology, and private colleges around the country. Of that 13 we've spoken to, we see that DESI has contributed to 16 strategies and policies within those institutions, and there are initiatives supported by DESI underway in nine of them. So we're really seeing a decent amount of activity at the national level. One of the things people have said about participating in DESI is that it's helped to drive a really student-centred approach to data use, and that, I think, is fundamental to what we've been trying to support: a student-centred, data-informed approach rather than a data-centred approach. It's critical that we remember why we're doing this at all times. We've had reports then, from the conversations, that within institutions we're seeing more and more conversations between people across the institution: between staff and students, between senior management and everybody else.
And the area of data is increasingly being seen less and less as the domain of the techies, which I think is fantastic; we can all benefit from an evidence-based approach. We're also seeing cooperation between institutions, which is absolutely fantastic; we're very lucky to work in such a collaborative sector. There are reports from each of the institutions that data culture is growing on the ground: we're seeing more and more teaching staff, for example, using the reports within the VLE. The last thing that's been reported across the conversations is that having a national-level focus from someone like the Forum has actually helped to support interest within institutions. So I'm really delighted to be able to share with you the impact that our work has had. Finally, a last quick plug. As I mentioned, DESI comes in under the student success stream, and you see here we have a number of new publications recently. On the left-hand side we have our report on student success, which we've been working on for most of this year. It looks at national policy, at institutions, and at student feedback to get a sense of what success is, or what we mean by success; it goes through a substantial literature review to look at the enablers of success; and it finally culminates in our national understanding of success. On the right-hand side you have the four-page insight for those of you who don't have the time to read the full report. And in the middle we have my colleague Alison's report, which looks at the data coming in from the Teaching Heroes awards, so it looks at a substantial volume of feedback from students about what they value in the teachers who support them. So, three hopefully useful reports; Alison's certainly is. A big thank you again to our speakers today.
So we have Hazel Farrell from Waterford Institute of Technology; I'm not going to spoil the surprise by telling you what everyone will be talking about. We have Peter Jan, who will be talking about the OfLA project, the Onwards from Learning Analytics project. And we have Mark Glenn and, in spirit, Shadi Karazi from Dublin City University. So without further ado, I'll pass on to the stars of the show and get out of the way. Hazel, I shall pass over to you.

Okay, thank you so much, Lee. And well done on all the amazing work you're doing; it's absolutely fantastic. So, just to give you a little bit of context. Oh, my slideshow. Sorry, was that you, Lee? It is; I'm staying away now. Can you put it back in slideshow for me? There we go, thank you very much. Okay, so, a little bit of context then. I lead the music degree program in WIT, and I'm very interested in technology for creative disciplines. I'm particularly interested in analytics because I feel they can really enhance the student experience, and also my own experience as an educator. So I began to use analytics across a range of my modules, initially to address student engagement concerns. Basically I wanted to get a clear picture of the level of engagement with module materials: I wanted to be able to check how often the students were logging in and engaging with materials, and how long they were spending on them. And then I was also having a look at their level of understanding of the different concepts I was presenting them with. I thought this was particularly necessary as I have one fully online module, and with that one, because I wasn't seeing the students at all, I felt it was really important to start there.
So that's the one I started with, but I do apply it to my classroom modules too. What I was doing then was using the data to identify at-risk students, in the context of retention and attrition. It provided me with an invaluable means of quickly identifying students who were not engaging or who were struggling with module content. No matter how comfortable or safe a classroom environment you create, there will always be students who are not really happy to ask for clarity on a topic, or to admit that they're not really with you or not getting what you're talking about. And I've learned through experience that when you ask your class, "do you understand, any questions?", everyone just agrees: "yeah, of course we do, no questions". Don't trust that; it can't be trusted. Sometimes you only discover the problem at a very late stage of the semester, so I thought, I need to actually address this. That was my second driver. And my final reason for using analytics was to learn how I was doing as an educator myself, in terms of engaging my students and actually consolidating their learning outcomes as well. I'm very passionate about participative learning, and I really strongly believe that students have to be actively involved in the learning process in order for it to be a meaningful learning experience. I was able to use that ethos as a driver behind what I was doing as well. You can operate in this sort of blissful, ignorant state where you're lecturing the students and feeling, oh, I'm doing a great job, I know what I'm talking about, but the students may not be with you at all. And I feel there is a strong responsibility on you to actually address that. So that's where I was coming from.
So what I did then: in my particular institute we use Moodle, and this is a very typical example of one of my topics, within the Western Art Music module. First of all, I'd have my presentations up there, so you can see there's a YouTube version, a PowerPoint version, a PDF; I also use Adobe Spark quite a lot. Then I'd have my resources. And at the end, actually going off the screen there, is the most important bit for this purpose: the quiz. At the end of every topic, I would have a quiz. Initially I used the quizzes just so the students could test themselves, to consolidate their own learning, and also for revision purposes coming up to assessments. But later I changed that, and I'll talk about that in a second. So this is a typical quiz for me: short quizzes, 10 questions, within Moodle, absolutely simple to set up, and I use a mix of true-or-false and multiple-choice questions. When I started doing this, I thought, oh my God, this is so obvious, everybody will know these answers. They don't know the answers. They don't at all. What you presume is extremely straightforward, that anybody will know, they don't know. So this really brought it to life for me. That's a typical layout of the start of one of my quizzes there. And these are really short; they take five minutes-ish in class. The students just take out their mobile phones, log into the Moodle page, and we start the quiz together. I'll just show you here on the next page the sort of data I'm collecting from it. You can see very clearly that there's an average of around five minutes; nobody hits six minutes, but it ranges from two-ish up as far as five and a half minutes. That's how long it takes.
So it's no great effort on their part, and you can easily integrate this into your class if you wish, or else they can do it in their own time; it's up to yourselves. Just by looking at this, if I take a horizontal view, I'm identifying individual students who have issues. And if I take a vertical view, I'm identifying an issue with something I have not effectively portrayed to the students. So I can say, for instance, with question 10 there, I'll have a look at that topic, and I'll say, well, actually, quite a number of students have not really grasped that concept. Then that's my problem: I need to go back and have a think about how I tried to get the material across and why I didn't achieve it to my satisfaction, and then I'll rethink my approach the next time. So just by looking at this one here, a quick analysis: two students very clearly in difficulty, because the majority of the questions were wrong for both of them. And then I need to work at enhancing understanding of questions 1 and 10, because quite a few students had fallen down on those ones. Just by glancing at it, I can immediately take this data and turn it into something positive. And I also feel that being able to do this quick analysis at a glance really works for larger groups as well. When you don't have time to look in detail at every single thing, you can just glance vertically: okay, this topic and this topic I'm going to have to look at; this student and this student are in difficulty. Very straightforward, very simple, nothing crazily complicated about this at all. Okay, so then the impact. I've had very positive feedback from the students, because I tend to incorporate the quizzes into the assessments and into the feedback.
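Hazel's horizontal and vertical reads of the quiz grid can be sketched as a few lines of code. This is a minimal illustration, not her actual Moodle export: the student names, the 0/1 answer grid, and the 50% thresholds are all invented for the example.

```python
# Sketch: reading a quiz results grid the way described above.
# Rows are students, columns are questions (1 = correct, 0 = wrong).
# Data and thresholds are made up for illustration only.

results = {
    "student_a": [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],
    "student_b": [0, 1, 0, 0, 1, 0, 0, 1, 0, 0],  # struggling
    "student_c": [0, 1, 1, 1, 1, 1, 1, 1, 1, 0],
    "student_d": [0, 0, 1, 0, 0, 1, 0, 0, 1, 0],  # struggling
}

# Horizontal view: flag students scoring below a threshold overall.
at_risk = [s for s, answers in results.items()
           if sum(answers) / len(answers) < 0.5]

# Vertical view: flag questions most of the class got wrong,
# i.e. topics the teaching didn't get across.
n_questions = len(next(iter(results.values())))
weak_topics = [
    q + 1  # report 1-based question numbers
    for q in range(n_questions)
    if sum(answers[q] for answers in results.values()) / len(results) < 0.5
]

print("Students to follow up with:", at_risk)   # student_b, student_d
print("Questions to re-teach:", weak_topics)    # questions 1 and 10
```

With this toy data the script flags two struggling students and questions 1 and 10, mirroring the two views described in the talk.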
I incorporate them into every aspect of the learning I possibly can, in order to reinforce that whole meaningful experience we were talking about. So they enjoy doing it; they actually enjoy testing themselves and seeing how they're getting on. And because it's so short, it's no great deal of effort for them: they can just take out the mobile phones, log in, do the quiz, submit, and get instant feedback. They appreciate that. For me, I found it gave me a heightened awareness of how effective my approach to different concepts was, and in that context I was able to adapt my teaching in the interest of ensuring clarity and also accessibility for my students. I feel that I'm engaging the students from a more informed perspective, and this personally gives me a greater sense of fulfillment as an educator. I feel like I'm actually doing a good job, as opposed to lecturing the students and just standing there, totally oblivious to whether they're actually getting it, whether they're involved, or whether it's benefiting them at all. So it's not about me, it's about the students, and I'm pretty happy with myself about this. The last point, I suppose, is the early detection. This actually helped me to identify very quickly, at an early stage, if students were in difficulty. Typically before this, it could have been week six, or in some cases week 12, before issues would have been highlighted, depending on how the assessments were laid out. I give quizzes very regularly, practically on a weekly basis with a trial group I'm working with, my first years, at the moment. And it's great, because I was able to identify students who needed extra learning support at a very early stage in the semester, and I'm pretty happy about that too. Okay, so in practice, what has this meant? I have increased the number of quizzes I'm using.
These are all short quizzes, and I use them across the whole range of my modules, set weekly or bi-weekly depending on how we're going with the materials. I've also started to allocate continuous assessment marks for the quizzes, after consultation with the students. Before, it was just for revision purposes or for themselves, so I had a conversation with them and asked, how would you feel about this? They actually felt it would take pressure off them, and they were happy to be able to complete assessments in this manner, so I now use them as continuous assessments throughout the semester. The other thing I have done is start to use quizzes across a broader range of modules. Originally I would have stuck to my survey modules, such as music histories and things like that, but this year with my year one students I started to apply it to practical subjects such as composition. So I'm testing basic music literacy in their quizzes here, because it's a really vital thing for our particular program: they have to have these music literacy skills, because they're transferable across a wide variety of other modules, and most students fall down on it actually. So I was able to identify students who had issues by week two and set up the learning support for them, in practical modules as well as the theoretical ones. So that's basically what I'm after doing. To sum up, everybody: quizzes rock. Thank you.

Fantastic, thank you very much Hazel. Okay, sorry, just turning this off; I'll be through in one second, talk among yourselves. So thank you very much Hazel, that's absolutely fantastic. I think one of the things that's really interesting with that is, you know, there's this perception that to really get value out of data...
...you need a swarm of data scientists, a budget of 400 grand, and a platform that does all the bells and whistles. And I think it's really terrific that you show something that, while clearly a lot of work went into planning and setting it up, is manageable by one person. You have a source of information for feedback-informed practice in real time, you have the early detection piece, and we see that having that on a module basis is really beneficial, and then using the data to employ assessment for learning, which I think Geraldine would be pleased I've hopefully managed to interpret correctly. So really, I think that's a fantastic achievement. Does anyone have any questions, either vocally or in the chat box? We've a couple of minutes for questions if anyone wants to.

Hi Hazel, that was great, thanks. Do you always have the students do the quizzes in class, or do you have out-of-class ones too? I have used the latter, but I worry about cheating.

Thanks. And yeah, do you know, I've done both. With the fully online module they obviously did them outside class, because I never met the students, and at that point I wasn't using it for assessment; it was really for their own benefit. So I felt that if they wanted to gain from it, they needed to just be honest about the thing. But I use them in class now, and it's simply because it's like a group activity: everyone takes out their phones and logs on, and then there's a little bit of competition over who submitted first. It creates a nice sort of dynamic, do you know what I mean? I think they enjoy doing it together as a group. So yeah, I've actually done both, and that's the answer to that one.
Fantastic, thanks. So then, hi, Sinead.

Hi, thanks. Have you been able to use this data to inform longer-term curriculum change, or modular program improvement, rather than in-module change to your practice? Or have you any plans for doing so?

Yeah, we do actually; we just went through school review, and we have a revised program coming up in 2020. What we've done with that is introduce a lot more flexible learning options, so we have more blended options and more online assessments going on in there. So absolutely, we've applied it to our program. And it's part of our Institute strategy now as well to get all of this moving, so I'm quite happy to be out there pushing it forward. We coordinate with the Students' Union as well, and they're quite happy about this too. So all in all it's very positive, and yes, absolutely, we have applied it to our new program.

Fantastic, thanks Hazel. And thanks for a note there from Mark as well: there's a lot more quiz statistics available within the quizzes, a hidden gem that people may not know about. The statistics give you information on how good or challenging your questions are. Under the admin window on the quiz there's a Results tab; just click on Statistics. Fantastic, thank you very much, Mark. And thanks again, Hazel, that was absolutely terrific; really appreciate you giving up your time.

No problem, happy to help. Thank you.

So, moving on, I'll pass over to Peter Jan, and I'll get out of the way.

Thank you for the opportunity to let us share some of our work on the OfLA project, Onwards from Learning Analytics. And thanks also to Hazel for her talk; you'll notice that some of the advice from our research project so far links to the small interventions she's doing, which are often quite valuable.
So our project name is Onwards from Learning Analytics, and the idea of the project is that we often have a lot of data available to us in an institution. Some institutions have spent a lot of money and time on making all the data communicate with each other, without having a focus on what we will do with the data and why we need all the data to correspond with each other. So the focus of our work is: with the data we have, and sometimes with the limited data we have, how can we use it in the best way possible, to affect student coaching, to affect curriculum design, et cetera? That's why we call it Onwards from Learning Analytics. We work with a model in three steps. The first one is prompts, and this links back to data: how can we use data to signal that we should do something, that there's a problem? That will also be the first part of my talk. The second one is linked to communication: how do we communicate this data to students, so that they are, one, informed, or two, take the step to see, for example, a counselor or somebody who can help them? And the third one is intervention: during the intervention, should we start from the data, should it be the starting point of our discussion, should we consider it as something shared, or something that we own as an institution, or something that belongs to the student? What will I be talking about? One of the outputs we're finalizing now from our first year is a literature review, done by snowballing through the literature. We started looking in the typical way to do a literature review, with keywords and searches, but we came up with very little in which learning analytics was linked to coaching, guidance, or advising. So we took another approach.
We took another approach, which was snowballing through the literature: we started from existing reviews of learning analytics and looked at which articles in those reviews would be interesting for us, which could be linked to coaching, to follow-up on the data, to use of the data. So we started with two reviews, and this led us to 39 articles that we selected for abstract screening. From those, we processed 23 articles that were still relevant, and they discussed a total of 48 cases in which learning analytics was used in one way or another and linked to coaching, to not just sitting on the data but really using it to support students. Of those 48 cases, 36 were unique; some articles discussed cases that we also found in other articles. I think the main clue is: work with what you've got, but use it effectively. Data has been around forever in education, if you look at basic things like scores on quizzes or scores on exams. It's also about how you communicate, how you start from these, what you do with them, whether you just offer them to students or not. So basic data are available, and often institutions have a lot of data that are very specific to their student population, but they don't always use it that well. The second general conclusion I want you to take from this short talk is that communicating concerns, and guiding students to support that often already exists in your institution, is often the best you can do, or the most efficient. We read about big interventions where students were informed that they were not doing well and had to get their act together, and others that set up extra exercises, learning platforms, et cetera, and they didn't see, for example, a difference between the two groups of students.
I'm not saying that the second one is worse than the first, or the first is better; it could depend on the situation. But communicating to your students that they might be at risk, from what you've seen, might be enough to change their behavior. The third point, if we talk about interventions, is that sharing the data that's available to lecturers, support staff, advisors, all those kinds of roles, and also to the students, can improve the time between student and mentor or advisor. Students have the feeling they don't have to repeat their life story over and over again, and you can really start a conversation with some basic knowledge of what happened before, and use that 20 minutes or half hour you have with a student as a next step, rather than spending 20 of the 30 minutes recapping what has happened from the mouth of the student. So, some other things I wanted to share, step by step, according to our model. On data: if you use data, I would advise that you use a combination of data, not just data on things that cannot be changed, like where a student is from, how old a student is, what gender the student indicated, et cetera, but also actionable and recent data. I would like to refer back to the quizzes done by Hazel: seeing that students don't do the quiz could be a sign that they're at risk, that they're not engaging with the material. Informing them, "you haven't done the quiz, what's happened?", might get the reply "oh, I was sick", or "I didn't know about it", or "I don't see the value of the quiz", and at least that's something actionable and recent; students can get their act together and do the following quizzes. So there has to be something they can change. If I told Lee that men with beards from Ireland are less likely to succeed in higher education, well, he could shave off his beard, but there's nothing else he could change.
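The "actionable and recent" signal described above, students who have stopped doing the quizzes, can be sketched as a small check. This is an illustrative sketch only: the student names, week labels, and the "missed the last two quizzes" rule are assumptions, not part of the OfLA project.

```python
# Sketch: turning "didn't do the quiz" into an actionable, recent signal.
# All names and structures here are invented for illustration.

quiz_submissions = {
    "week1": {"aoife", "brian", "carla"},
    "week2": {"aoife", "carla"},
    "week3": {"aoife"},
}
all_students = {"aoife", "brian", "carla"}

def missed_recent(submissions, students, last_n=2):
    """Students who missed every one of the last n quizzes."""
    recent = list(submissions.values())[-last_n:]
    return {s for s in students if all(s not in week for week in recent)}

flagged = missed_recent(quiz_submissions, all_students)
# brian missed weeks 2 and 3, so he is flagged for a "what's happened?"
# message; carla did week 2, so she is not.
print(sorted(flagged))
```

The point of keying the flag to the *recent* quizzes, rather than to fixed attributes like age or origin, is exactly the one made above: the student can still change the behavior being flagged.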
So the data has to be actionable. The second thing: if you have data and you want students to be self-guiding, to improve their own behavior, you should try to make it visually and continuously accessible for all stakeholders. I remember a good example of this. It's a platform somewhere in the Netherlands, and all they have is the students' success rate, so they know which courses the students passed or failed. What they've done is put that on a very visually attractive platform, and they also added the expected date of graduation. So a student who had failed a lot of classes this year would get a message like: your expected date of graduation is January 2025. It's a very clear message, based on actionable data, that they have to try harder or succeed more, and that they can see people to help them with that. But it is visually and continuously accessible: they clearly see that date, and it changes if they put in an effort in the next exams. These are some examples of very visual dashboards, with a lot of color coding, or where you can see how well you're doing compared to your peers. And the last one: data should lead to an intervention, preferably a targeted intervention for a specific group. I think the minimum is making students aware that what they are doing is not enough, or that what they're doing is good, so that they know: I should get my act together, I should put in more effort, and I can do it this and this way, because of course you used actionable data. Communication: communication should again be actionable. A student should be able to do something after the communication; it's not just being told that he's not doing well, but also getting some guidance on what he or she can do. The communication to students should also be, or come across as, personalized.
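The Dutch platform's "expected date of graduation" can be reconstructed as a simple projection. The formula below is an assumption made for illustration (credits remaining divided by the student's current pace, two six-month semesters per year); the actual platform's calculation is not described in the talk.

```python
# Sketch: projecting a graduation date from a student's current pace.
# The formula and the credit numbers are illustrative assumptions.
from datetime import date

def expected_graduation(credits_earned, credits_needed,
                        credits_per_semester, today):
    """Project a graduation date, assuming two 6-month semesters a year."""
    remaining = credits_needed - credits_earned
    semesters_left = -(-remaining // credits_per_semester)  # ceiling division
    months = semesters_left * 6
    total = today.month - 1 + months
    return date(today.year + total // 12, total % 12 + 1, 1)

# A student passing only half a normal load sees a much later date,
# which is the nudge the platform delivers:
on_track = expected_graduation(120, 180, 30, date(2019, 11, 1))
behind   = expected_graduation(120, 180, 15, date(2019, 11, 1))
print(on_track, behind)
```

Because the projection is driven by pace, the displayed date moves earlier as soon as the student passes more courses, which is what makes the signal actionable rather than a fixed label.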
We have the power to make communication look like it's personalized, even though it's mass generated, and it has a different effect on the students. And the last one: support staff should have the final say. They should be able to exclude a student from an automated message saying he's not doing so well, because they might know something has happened in the home situation. So people taking up, for example, a care role in their house when a parent is sick would be excluded from these emails saying, you're not doing a good job, get your act together. The last step is intervention. You should start the conversation from the data; it should be the starting point; it's very objective. You should also present it as shared between the student and the counselor or advisor or lecturer. It's not: "this is what I know, and I think because of this you're not doing so well." It could also be that the student, looking at these grades, says, "I'm actually very happy, because I was expecting to fail more classes, for this and this reason", rather than you saying, "okay, you failed half of the courses, you're not doing so well, you have to work harder." So, a shared aspect from the beginning of the conversation. And the last one: I forgot to put something in bold there, but the word in bold should definitely be timing. Early intervention is often very easy to do. If you put in those quizzes and you see students are not filling them in, you can connect with students from week two or three, saying: you are not doing the quizzes, have you missed how important they are for this course? Also, if you want to talk about exam results, you should do that after the first exam period, not after the second. So considering when you place these interventions is crucial. I think that was my five minutes; I've gone over it a bit, maybe. I don't know if there are questions.

Thanks so much, Peter Jan, a great overview of the project. Thank you, Sarah.
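The two communication points made above, mass-generated messages that read as personal, and support staff holding the final say via an exclusion list, can be sketched together. Everything here is invented for illustration: the template wording, the student records, and the exclusion mechanism are assumptions, not any institution's real system.

```python
# Sketch: personalized-looking mass messages, with support staff's
# final say implemented as an exclusion list. All names are invented.

TEMPLATE = (
    "Hi {name}, we noticed you haven't attempted the recent quizzes "
    "in {module}. They're short and they matter for this course. "
    "If something's getting in the way, you can book a chat with {advisor}."
)

def build_messages(flagged, excluded, module, advisor):
    """One message per flagged student, skipping staff-excluded students."""
    return {
        s["id"]: TEMPLATE.format(name=s["name"], module=module,
                                 advisor=advisor)
        for s in flagged
        if s["id"] not in excluded  # support staff's final say
    }

flagged = [
    {"id": "u101", "name": "Brian"},
    {"id": "u102", "name": "Carla"},  # known carer, excluded by her advisor
]
messages = build_messages(flagged, excluded={"u102"},
                          module="Western Art Music", advisor="Dr. Byrne")
print(messages)
```

The design point is that the human check sits between the automated flag and the send: the advisor edits the exclusion set, so a student with a difficult home situation never receives the "get your act together" mail.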
I believe it's really important that dashboards come with human interaction; who, or what role, is targeted as taking that task on?

I agree these dashboards are valuable and a way forward for sharing that information, but sometimes, in the communication, it could just be: look, we noticed this; could you let us know if you're all right or not, or if you need a conversation? So we have that step, allowing students to opt in, for example, to another talk.

And Mark asked: do students have the ability to turn off the dashboard, and is it displayed by default to students?

We don't have a dashboard ourselves. I think in Nottingham Trent the dashboard is on by default for students, and I would find it unfair not to share the data with the students, as it's their data, and just have it accessed by lecturers, support staff, advisors, or any of those roles. And coming back to Sarah's question about who or what role is targeted as taking that task on: I don't really understand the question, or maybe it's just a reflection, which is fine as well.

I wonder how you measure the impact of interventions?

Well, this is something we're doing now in our own project, where we're talking to students to see how the communication came across. We're also looking at attendance levels during graded evaluation classes, where we have quizzes like Hazel's, but they count towards the final mark of the exam. And we started emailing students who weren't in those classes to say: look, you're not in those classes, how come? Well, not "how come"; rather, these classes are important, and we see that students who do not participate in them have lower success rates, et cetera.

Mark agrees with sharing the data with students, but says if the dashboard greets them every time they log on, this may cause them harm. Yes, you're right; it's not something that should pop up when you log in. It's something you should be able to access any time.
Who is the person in the institution who makes contact with the students as a result of the dashboard output? In the Nottingham Trent case, those are advisors, support staff. In our university, each student is in a group, and that group has a sort of tutor, a lecturer responsible for that group, who sees them on a weekly basis. That would also be the person who can address it with the students, so they already have a personal relationship with them. That's what is personalized, not the machine that says "do work harder". Fantastic, thank you very, very much, Pieter-Jan. I think it's such a critical reminder, and something that we've very much emphasized, that analytics in and of itself doesn't actually achieve impact; it's what you do with it that's absolutely critical. And I love that phrase, "work with what you've got, but use it effectively"; I'm absolutely going to plagiarize that horribly from you. Giving students actionable guidance, making it easy for them to take effective steps afterwards, I think again is another really critical aspect. Thank you very, very much. The only thing I would question you on: we've done some very, very up-to-date research on the success of men with beards, and apparently we found that there is an outlier group, where they're outrageously handsome, in which it doesn't apply. You can't argue with the science, unfortunately. I'm going to mute my microphone before I play. Thank you very much, Pieter-Jan. So, sorry, if you'd just bear with me. And, with my gender confidence duly dented, I'm going to pass over now to Mark in just a moment; I'll just get rid of me first, and I'll see you afterwards. Okay, let me see, do I have control? Okay, can you hear me, first of all, I suppose?
Okay, so thank you very much, and two incredibly interesting talks with some great, clear points coming out from both of them; hopefully it's generated some ideas for the speakers and for the audience, and indeed I can try to follow it up. So, where have we got to... we're not going ahead here... there we go, right. I want to chat to you about how we use analytics with regard to student assessment, and I'm very conscious of Pieter-Jan's comments about having actionable analytics. This is what we wanted: to pull data, to model data, that we could action very clearly, or data that would address an existing problem. I've undertaken this project with my colleague Shadi Karazi, but it's actually connected to a wide variety of assessment-related projects that we've been involved in, so I would like to acknowledge the support of all of the team, past and present members, who have worked on projects connected to this one. What we did was listen to what the students were saying, whether through the student evaluations or feedback or all sorts of different sources of feedback that we pulled together, to hear what the students are saying and what the staff are saying. If I concentrate on the students, because obviously they're more important: they were complaining about being over-assessed, about being overwhelmed and exhausted by the assignments, and a lot of the complaints were that all of the assignments landed in one week. They weren't getting to spend the time on the assessments that they felt they needed to, or would have liked to in an ideal world. From the staff's perspective, they came back and said, I don't know what's going on in module X. So, for example, the chemistry lecturer had no idea what the biology lecturer was doing when they were assessing, and what they were assessing.
In some cases, and I'd like to say it was the majority, but it was rather the minority, we did get individuals, staff members, who took it upon themselves to ring their colleagues, collate all of the exam dates and assessment dates, and put them in a calendar as part of the program handbook. But that was a manual process for the most part. What they were able to do was get an overview of the assessments. If we're looking at this diagram, we have the number of assessments on the vertical axis and the week of the semester on the horizontal, so week one right through to week 13. And then they were able to plot graphs like this. So this was module A, and we could very clearly see that in week 12 there were two assessments, in week 13 there were three assessments, and another two in week 14 in this case. Then we had another module that looked like this one. Again, small assessments in this case; we knew they were being assessed but didn't quite know what the assessment was. And then we had another one like this. But these are all manually collected data, and the problem was that pretty much as soon as the lecturer hung up the phone and created that manual course handbook for the students, the data was out of date already. For whatever reason, genuine reasons, lecturers may choose to change their assessments: add in a second one, change the date, take it away, or split it in two, breaking it into draft submissions and final submissions, and so on. So the data was not live, and as any analytics person will tell you, bad data is sometimes, and I would say all the time, worse than no data at all. So what we wanted to do was look and see if we could get Moodle to actually automate that entire process. And the challenge that we had was we had to get the data into Moodle in the first place.
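The data behind the per-module bar chart described above is just a tally of assessment deadlines by week of semester. A minimal sketch of that tally, with hypothetical data, not the speaker's actual implementation, might look like this:

```python
# Tally assessment deadlines into per-week counts, i.e. the data
# behind the "number of assessments vs. week of semester" bar chart.
# The (module, week) pair format is a hypothetical illustration.
from collections import Counter

def assessments_per_week(deadlines):
    """deadlines: list of (module, week_of_semester) pairs.
    Returns {week: number of assessments due that week}."""
    return dict(Counter(week for _module, week in deadlines))

# Module A as described: two assessments in week 12,
# three in week 13, and two in week 14.
module_a = [("A", 12), ("A", 12), ("A", 13),
            ("A", 13), ("A", 13), ("A", 14), ("A", 14)]
```

Plotting `assessments_per_week(module_a)` as a bar chart reproduces the kind of overview the program coordinators were assembling by hand over the phone.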
What we did was a huge PR drive with our staff, explaining the advantages of putting the dates of their assessments into their Moodle course. And then we were able to generate this report. Just to explain what we're seeing on it: you see some tags up the top, the BA in Journalism and the BA in Languages, which means that the module English 101 is taught across two programs, Journalism and Languages. And what it's doing is highlighting, across the weeks, if we look at the bar chart: on the 14th of the 10th there is one assessment, on the 21st there is one assessment, and then you can see on the 25th of the 11th there are three assessments. It gives you this calendar overview of the assessment. This is what the student sees, and this is what is automatically entered into the student's phone, because it's in the app, or, because it's in Moodle, it can be synced automatically with your Google calendar, which comes as part of the DC account; we're a Google college, but obviously this would work for Microsoft colleges as well. We're able to pull the reports on a program basis. So if we want a report for just the Journalism or just the Languages courses, in this case where the module is taught across two different programs, you can, if you wish, hide the assessments due for just Languages and show only the ones for the Journalism students, or vice versa. You can also hide the assignments and just show the quizzes, or hide the quizzes and just show the Turnitin submissions, whatever the case may be. The whole concept is that all of the lecturers will have visibility of the other assessments that are going on and, more importantly, all of the students will have live visibility of the dates and of what each assessment is about. And because all of that data exists, everybody's a winner, and the only thing we require lecturers to do is to create the assignment.
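The program-level filtering described here, one cross-program module, with the report narrowed by program tag or by activity type, could be sketched as below. This is an illustrative sketch only; the field names and record format are assumptions, not the plugin's actual data model.

```python
# Sketch of filtering a cross-program assessment list the way the
# report does: by program tag and/or by activity type.
# Field names ('title', 'due', 'programs', 'kind') are hypothetical.

def program_calendar(assignments, program=None, kinds=None):
    """Return the assessments visible for one program view, sorted by date.

    assignments: list of dicts with 'title', 'due' (ISO date string),
                 'programs' (set of program tags), and 'kind'
    program:     if given, keep only assessments tagged with this program
    kinds:       if given, keep only these activity types (e.g. {'quiz'})
    """
    rows = assignments
    if program is not None:
        rows = [a for a in rows if program in a["programs"]]
    if kinds is not None:
        rows = [a for a in rows if a["kind"] in kinds]
    return sorted(rows, key=lambda a: a["due"])

# A module taught across two programs, plus a program-specific quiz.
sample = [
    {"title": "Essay", "due": "2021-10-14",
     "programs": {"journalism", "languages"}, "kind": "assignment"},
    {"title": "Quiz 1", "due": "2021-10-21",
     "programs": {"journalism"}, "kind": "quiz"},
]
```

So a Languages view would hide the Journalism-only quiz, while a `kinds={'quiz'}` view would hide the essay, mirroring the show/hide toggles described in the report.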
Create the assignment by simply logging on to their course page, adding the assignment, putting in the title and putting in the date. If that's all they do, this calendar is automatically generated. It's at that stage that the PR exercise kicks in, where my team will go around saying, well, seeing as you have the assignment on there, what else can you do? There's an advantage here if you correct online, or if you allow submissions online, or whatever other functionality we want to highlight. So we've taken a very simple step and then done a huge PR exercise to try to promote the further benefits of it. Another plugin that we used: we wanted to have reports for lecturers on how the students are interacting with them. So here on this module is an assignment breakdown. For this module, test assignment, there are two assessments in here and we can see there are four students on the course: four non-submissions, no late submissions, and obviously they haven't been graded, but it also gives you an indication as to how many of them scored below a particular point. And then following on from that one, sorry, I went forward too quickly, you can also have a breakdown by student: how many assignments has Kira submitted, how many was she late for, how many did she not submit, what grades they were, and again, is she at risk, how many of those are low-grade assignments? This is all available within Moodle. We use the My Feedback plugin, built by UCL, for this particular feature, and we built our own plugin for the assignment calendar. We listened to staff, we listened to students, we asked what is the problem that you're having, and we used the analytics to actually help us make informed decisions. And that's how we moved forward from there. It's all about using the data to inform proper pedagogy, not to steer the pedagogy but to inform it.
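The per-student breakdown described, submissions, late submissions, non-submissions, low grades, and a simple at-risk indication, can be sketched as a small aggregation. This is a hedged illustration of the idea, not the My Feedback plugin's actual logic; the field names, grade threshold, and at-risk rule are all assumptions.

```python
# Sketch of a per-student submission breakdown with a naive at-risk flag.
# 'status', 'grade', the low_grade threshold, and the at-risk rule are
# all hypothetical illustrations, not the My Feedback plugin's logic.

def student_breakdown(submissions, low_grade=40):
    """submissions: list of dicts with 'status' in
    {'submitted', 'late', 'missing'} and an optional numeric 'grade'.
    Returns counts plus a simple at-risk flag."""
    counts = {"submitted": 0, "late": 0, "missing": 0, "low_grade": 0}
    for s in submissions:
        counts[s["status"]] += 1
        if s.get("grade") is not None and s["grade"] < low_grade:
            counts["low_grade"] += 1
    # Crude rule of thumb: two or more warning signs flag the student.
    counts["at_risk"] = (counts["missing"] + counts["late"]
                         + counts["low_grade"]) >= 2
    return counts
```

Run over one student's records, this yields the kind of row described for Kira: how many submitted, how many late or missing, how many low grades, and whether the pattern suggests she is at risk.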
And then the very last slide I have for you: with one simple step we can generate a huge amount of data, by simply asking the lecturers to insert the deadlines. It will automatically create a calendar, and the calendar is viewable by anybody who has the appropriate permissions to view it. You can then see the online submissions from the students, you can analyze the submissions and the deadlines and the late and not-so-late students, and you can monitor student progress across the program. This is all just from that one simple step that we ask our lecturers to do. And because the program coordinator now no longer has to spend the start of the semester ringing up his or her colleagues to pull together this handbook, they are a big driver of getting this done, of encouraging their colleagues to do it, and the heads of school are also big drivers of doing this. So that's what we've got, and if you have any comments or questions, ask them now, or indeed email Shadi or myself at any stage that suits you. Hello? Fantastic, sorry, thank you very, very much, Mark. I think it's really, really interesting work. One of the things that I've seen from the work that we did on student success, you know, we looked at the enablers of success, and we found things like engagement, and assessment and feedback, and data-informed decisions, and institutional cultures, and all of this stuff is absolutely critical. But looking to students, without meaning to make 250,000 people sound like a single blob, the thing that kept coming up over and over again was these major logistical and infrastructural hurdles that they had to face. So you have lecturers putting all of this work into really effective, meaningful, pedagogically sound assessments, but then, just through either a lack of data or a lack of organization, it's the student that suffers.
I think it's absolutely fantastic to be able to take such a data-informed approach to it. And the other thing that I think is really interesting: we see the ability to target individual students as key, and the power of early warning systems as critical, but being able to use data in this way, so that you can make changes that improve the experience of a large number of students at once, again, I think, is absolutely essential. So thank you very, very much. Does anyone have any further questions there for Mark? I think you've covered everything, Mark. I think Pieter-Jan has them all still thinking about my beard. Well, presumably you're happy for anyone to follow up with you afterwards, Mark? Absolutely, that'd be my pleasure, thanks. Thanks, Angelica. "Great food for thought, Mark. We could very easily encourage course directors to create a program site in our VLE where all students could be added, and where they can access an aggregate calendar as well as generic resources. Very easily done." And again, I'd be delighted to show you how to do that. With the program plugin that we've developed, that tagging feature means you can circumvent that, because every student is enrolled in the modules, the course pages, on Moodle, and it collates them all and generates the calendar automatically for you. If you only use the calendar on a program page, yes, it will definitely work, but it wouldn't be the best way of using Moodle, or any VLE, just as a document repository and a calendar. Absolutely. So thank you very much. Given that we have just a couple of minutes left, I might ask Erki, if you wouldn't mind, sorry, do you want to unmute there, Erki? Erki is running a project in Finland, and we might ask Erki to come back to us properly for a future webinar.
But just, if you wouldn't mind, while we have a couple of minutes, telling us a little bit about your project? Okay, thanks, Lee, and thanks to everyone who has been presenting in this webinar, it has been very interesting. I'm Erki Herkanen from the University of Turku in Finland, and we are part of a national development and research program called Analytics AI, so artificial intelligence and learning analytics. It is coordinated by the University of Oulu in the northern part of Finland. We have been working now for almost a year, and we have one year to go, so it's a two-year project funded by the Ministry of Education and Culture in Finland; it's a national one. We have seven universities as partners in the project, and we are working on three work packages. One that is already finalized is that we have analyzed user needs and the data sources that are available: what we can use, and what kinds of data we can combine about student progress, for example. We have discussed indicators, so what are the indicators that we need to follow in order to help students to move forward and to stay on their plans, and all that. And we have taken a look at the different kinds of tools and methods that, for example, platforms like Moodle are presenting or providing for us to use. In this project, the University of Tampere is developing an application for a student dashboard, and we are going to pilot that dashboard, or application, in several courses at the University of Turku, but also in the other partner universities, so we get some information about what works and what doesn't work, and how to help students to progress better. But that will take some half a year or so, so we can come back with those results later, perhaps.
But what is really interesting is that we have tried to take a look at the legal and ethical aspects of learning analytics: what we can do in terms of the GDPR, for example, and other legal issues, and what is ethically correct. And we have decided that we need an institutional-level policy on learning analytics, setting out the responsibilities and roles of the different stakeholders, for example teachers, counselors, program leaders, the management of the university: what their roles and responsibilities in learning analytics are. That's something that we at the University of Turku are now working on, or starting to work on, on the basis of all the universities' experiences. So that's it, very quickly. Janne, who is also working with me on the process, might want to add something. Fantastic, thank you very, very much, Erki, and sorry for putting you on the spot. We look forward to it; you might come back to us next year, as we'll be running further webinars, and I'm really interested, as I know we all would be, to hear more about your work. Fantastic, thank you very much. So that's it, and we made it; I managed to not entirely break the computer. So thank you very, very much. Enormous thanks to our presenters; I think there's really, really interesting stuff going on, and it's great that we can get a sense not only of what's happening here in Ireland, but of what's happening across the continent as well. It's fantastic to be able to broaden those arms of collaboration, if that's not too poetic for this time of day. So thank you all very, very much. I'll make this available on our website, and the slides and recording will be there as well. So thank you all very, very much, and bon appétit. Take care. Bye now.