So, thank you all very, very much for making the time to join us here today. For those of you who haven't been along already, this is the third in a series of webinars about data use in institutions. We've looked at innovative practices, we've looked at the challenges of getting up and running with data, and this is very much the next in the sequence. What we're talking about here is the challenge of embedding data use: getting from the point where data is a novelty or a "wouldn't it be nice" to the point where using student data in this way becomes an ordinary, iterative resource for supporting our students, which is probably the inevitable direction we're all going. Just a couple of notes here. Student success is one of the four strategic priorities of the National Forum, as many of you probably know, and the data strand follows on from work that the Forum has been doing over the last couple of years around learning analytics and around using data to enable student success, so it fits under the student success priority. One of the most recent projects was DESSI, the Data-Enabled Student Success Initiative, through which we worked with institutions across the country, developing strategies and resources to support institutions as they make the best use of their data. You'll see that the next link there is ORLA, which is our online resource; you begin to see my acronym naming convention there. ORLA is a resource that we developed about two years ago and have been adding to since, which should hopefully help answer questions or provide guidance and advice about various aspects of getting up and running with data: what data types are available, GDPR, data quality, making effective interventions and so on. So you see we've got the link there to ORLA. And, as I mentioned, the last two webinars were also recorded, so we have the links to those as well for those of you who missed them. This presentation will all be available on the Forum website over the next day or two.

So, our speakers today, and significant thanks to all of them for taking time out of their hectic schedules at the moment. We have Ed Foster from Nottingham Trent University, and I'll go through the speakers and what they'll be talking about a little more as we go. We have Lee Richardson and Sarah Sharkey from DBS, and Jeremy Britton from UCD. So it promises to be an adventure-filled afternoon. Thank you all very, very much. We'll start with Ed, who was absolutely instrumental in the development and implementation of the NTU Student Dashboard, which, going back to my earliest days of reading about learning analytics, really seems to have been one of the pioneering, path-finding projects on learning analytics in the UK. Ed is indubitably one of the UK's leading lights on learning analytics and certainly one of its most exciting thinkers at the moment. We're very, very lucky to have him here, and without further ado, I'll pass it over to Ed.

Thanks, Lee. One second. Right. So can everyone see my screen? I think I'm just going to assume that's a yes. Right. Thank you ever so much for the huge big-up there, Lee.
Just to put it into proper context: at the point where Lee started speaking, my chair broke and I've now dropped a few inches, but I think I'm okay for angle. So if I get lower over the course of this presentation, I'll apologize now. Thank you very much for the invite and for letting me talk. I'm going to talk for ten minutes on our experience of embedding learning analytics and the NTU Student Dashboard at my institution, Nottingham Trent University. I'm going to show a couple of slides that people may have seen before, but I'm going to try to give as much time as possible at the end to what we're currently doing around COVID-19 and how we're attempting to use the data we have from learning analytics to target our support better at the needs of our students. So without further ado, I'm just going to run through a few slides for you. No, I'm not; yes, I am. Okay, right.

The first slide I want to pick up is some work we did as part of one of our Erasmus+ projects, with KU Leuven and Leiden University, finishing a couple of years ago. I wanted to stress, from the perspective of anyone looking to embed learning analytics, that there are these factors to take on board and take into consideration. The first is the overarching ethical and external environment. We're all governed by GDPR legislation. I've done some work recently with colleagues in other countries, looking to take a little bit of this work out to the UAE, and their assumptions and starting points around the use of data are very different. There's a really profound set of issues here about what is acceptable. We're currently in an environment of very profound change, with governments talking about using monitoring apps to trace whether or not we've got COVID-19, and whether we can share that data in exchange for relative freedom. So the first thing to stress is that learning analytics is a big data project. It has very profound ethical considerations that need building into whatever work is done. It's often interesting that when we talk to colleagues about this, staff are often more concerned than students are; but of course we should be very concerned about student views.

Building on this, once we get into the institution, the most important starting question is: what's the mission, what's the purpose of having this? That's a real statement of the bleeding obvious, but it's important. Why does any senior manager in an institution, or any lecturer or tutor or whomever, believe that having more data about an individual student, whether it's live or in aggregated reports, will change anything about the way the institution works? And then, once we have got past those strategic hurdles, there are profound issues around ongoing project management. We found that we spent longer than we would want on project management, because the notion of what learning analytics is evolves over time. People will often perceive learning analytics as a technology project. I think that's a mistake. At the core of what takes place around learning analytics is, of course, data: high-quality, accurate, up-to-date data about the students that you're working with or whatever it is you're looking to support. And then there's the technology wrapping around that.
Do your feeds work in a way that enables regular, safe updates of data that you can then act on? But the thing I want to stress more than anything else is that this is not a technology project. It's a technology-enhanced, or technology-enabled, project, but fundamentally it's about what happens within your institutions. And our real big issue right now is not whether we have the technology available for use; it's how staff make use of it in supporting our students through COVID-19.

So I'm just going to run through a couple of points quite quickly. Our vision for learning analytics, on the left-hand side, is a student success use. We're looking at this not from an efficiency perspective, and not from the perspective of changing our learning and teaching practices. We're primarily using learning analytics to support our students in coping with the transition into higher education and then through into further years. We're using the notion of engagement, taking the North American model of educationally purposeful activities, and that's the core underpinning of what we do. We try to frame it in the positive: when we use the word "high", students with high engagement are students who are doing a lot. We're not talking about high risk. And we think that's quite important, because one of the things we do is share all of our data with our students. That of course throws up all sorts of ethical challenges for us, but we feel it's an ethically appropriate thing to do. And it is a proxy: we're not measuring students' actual engagement, we're measuring the things we can touch upon. We also made the decision very early in this process that we would not include socioeconomic disadvantage as part of our algorithm. The work we do purely focuses on student activity. Now, we do draw from that and produce reports looking at the differences by socioeconomic background, but we made a conscious decision that we didn't want to build socioeconomic disadvantage into the very calculations describing what students do. It's really important to stress that the data and the information we provide are available to staff and students. Students, of course, can only see their own data, with an aggregate average line comparing them to the other students on their course, but we do think it's important that students have a sense of the data that's available about them. And I should stress that we work with a company called Solutionpath and we use their StREAM tool to do our learning analytics.

There are three broad ways in which we work. Number one is student-managed success. We give information to students, live and up to date, so that they can see for themselves how they're engaging. We know that around 40% of our students are using the dashboard on a regular basis, and these are the students who tend to do better in terms of progression and attainment. I'd love to prove cause and effect, but I think it's that they are already engaged students and it gives them a nice buzz. The second part of our strategy is staff-supported success, and we're primarily talking about the role of tutors here: tutors offering support to students on an ongoing basis, drawing data from the dashboard to help in those conversations, and from within the tool we can make referrals through to professional services.
And the final area we're working on quite a lot at the moment is institution-level initiatives. For example, we've just launched a Black Leadership Program, using data from the dashboard to target students for coaching and other positive activities to help build their confidence and their self-belief.

Just very quickly, I'm going to run through how the dashboard works. We draw data in from the seven data sources on the left-hand side: our attendance monitoring, logins to our VLE, logins to our learning rooms, use of e-resources, card swipes, et cetera. Some of these are virtual, some are physical, which is important for the next slide. We add some contextual information about the students, to help give tutors in particular some understanding, and overnight the dashboard generates five engagement ratings: high, good, partial, low and very low. It also generates alerts. At the moment the alerts are based on students doing nothing at all for 10 or 14 days, so it's quite a high bar. And then we expect staff to make use of it. Many people have seen this, so I'm just going to skip to the next slide. Oops, there we go.

So this is where we are at the moment. There's a lot in this graph, so just bear with me a second and I'll run you through it. We are currently in the middle of a campaign contacting students just to check that they're okay and to offer some additional support. The original proposal, and I put forward the original proposal, was: let's not worry about targeting; can we just ask our tutors to contact all of their students and say, essentially, are you okay, what can we do to offer support? But the pushback from our schools was: hang on a second, it's just not practical, there are too many students, it would take too long, we're not sure we can cope with this. So what we've done instead is put in place a targeted activity, where we are communicating with our students using the dashboard. We have asked all of our tutors to contact students who have low and very low average engagement in the dashboard. And we hit our first challenge straight away. The criterion we've picked is low and very low average engagement, but the challenging question is when: when is it that they have low and very low average engagement? The reason this matters is that, for all the awfulness of COVID-19, the one good thing is that it hit us quite late in the academic year. Most of our students have got a reasonable way through; they've done many of the activities they need to do. But we have a real problem around the data that we're using for targeting. The way the dashboard works is that it needs to do some smoothing of the data. Instead of giving each day a completely different engagement score based on what a student is doing, we need some smoothing, because if we don't, a student might have high engagement on a Monday and very low engagement on a Tuesday, and from a tutor's perspective, logging in, there's almost no meaningful data because it's completely spiking. So here's our problem: when do we target students with low engagement?
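[Editor's aside: to make the smoothing and banding Ed describes a little more concrete, here is a minimal illustrative sketch in Python. It is not Solutionpath's StREAM algorithm; the activity figures, the seven-day window, the band thresholds and the function names are all assumptions chosen purely for the example.]

    from datetime import date, timedelta

    # Hypothetical daily activity counts for one student, keyed by date.
    # Each value is the number of logged events across all sources
    # (attendance tags, VLE logins, library visits, e-resource use, etc.).
    activity = {date(2020, 3, 1) + timedelta(days=i): n
                for i, n in enumerate([4, 0, 6, 1, 0, 0, 3, 2, 0, 0, 0, 5, 0, 1])}

    def smoothed_score(day, window=7):
        """Average daily activity over the preceding `window` days,
        so a single quiet day does not swing the rating wildly."""
        days = [day - timedelta(days=i) for i in range(window)]
        return sum(activity.get(d, 0) for d in days) / window

    def engagement_band(score):
        """Map a smoothed score onto five illustrative bands."""
        if score >= 4:
            return "high"
        if score >= 3:
            return "good"
        if score >= 2:
            return "partial"
        if score >= 1:
            return "low"
        return "very low"

    def inactivity_alert(day, quiet_days=14):
        """Alert only when there has been no activity at all for
        `quiet_days` consecutive days: deliberately a high bar."""
        days = [day - timedelta(days=i) for i in range(quiet_days)]
        return all(activity.get(d, 0) == 0 for d in days)

    today = date(2020, 3, 14)
    print(engagement_band(smoothed_score(today)), inactivity_alert(today))

[Run as written, this prints "low False": the smoothed score falls into the low band, but the student has not been completely silent for fourteen days, so no alert is raised.]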
So if we look at this (and I'm using made-up data here, although if you look at the real data on the real graph it's quite similar, so this is just my plausible deniability), what we can see is that the black line with blue diamonds is engagement in a normal year. In the few weeks before Easter it's quite good, it drops away somewhat in that last week, which you would expect: students have handed in their coursework in that last week and they're enjoying themselves. And then we see it falling away over the Easter break. The red and orange line is this academic year, and I'm going to need to speed up. This academic year, we have a week of normal teaching which is pretty much identical to the notional normal year, the hybrid week where it drops away, and then the two weeks of online teaching in which engagement is lower than it was in the comparator year. And then over the Easter break, it's okay. So what do we target? What do we do? In the end, we made the decision to target students with low and very low engagement in the first two weeks of online teaching, because that was a fixed point. That was a point where we knew students were shocked, where we knew they would be struggling in some respects to cope. That's the data we've pulled together and shared with our tutors, and they're using it now as part of a call campaign. The other thing to stress is that initially our plan was that the personal tutors would do all the calling, and what we're now having to do is support those calls with staff in a central department. So I guess the point I'm trying to drive home is that having data about our students, having a dashboard that shows student engagement data, is useful and valuable, but it's meaningless until you get to the point of having a mechanism in place, a support structure in place, to act upon it. And even though we're happy with the quality of data that we've got, we've still had to make management and strategic targeting decisions based upon the context of the data. The meaning of engagement data almost always needs interpreting, and the context needs pulling out from it. So that's the end of my very quick overview of both the big picture and where we are now. I'm very happy to take questions, but I'll follow Lee's suggestion. Lee, you tell me, do we take them now, or should we lump all the questions together at the end? I'm very happy to do that too.

Well, Ed, thanks a million first of all, that was absolutely fantastic. If anyone has any questions particularly for Ed at this point, which obviously they will be for Ed because you haven't seen the other presentations yet, we can do questions about each of the individual presentations at this stage, and then at the end we'll take any overarching questions that apply to all of them. Ed, listen, thank you so much for that. It's really, really interesting. I really like the focus that data and technology are critical tools, but that ultimately, fundamentally, from what you're saying, it comes down to people: it's about being able to support the students, about getting staff to engage with it, getting actual use out of it.
And I think it's really interesting as well that, now that you have the platform embedded, in a circumstance that probably none of us would have predicted, when the world has changed quite comprehensively, you have a tool there that is already fit for purpose in a vastly changed world. And just one other piece that I found really interesting: the fact that you're not using socioeconomic group, which personally I would always have had some ethical questions about. On the contrary, you're actually using the data to empower underrepresented student groups, which I think is really, really interesting. And to move over to questions.

Shall I stop sharing my screen, or do you want to... what do you want?

Your screen is a lot more attractive than mine, so maybe we'll keep it up. Sorry, just looking through. So, from Sinead: thanks, Ed. You've stressed the role of tutors. In NTU, are personal tutors academic staff, or staff in learning support or similar centers?

At our place they are primarily, they're all, academic tutors, so they're attached to the course in some way. One of our schools has a slightly different model, where the academic mentors teach on a particular module but may be connected to students across the whole of the school. But for the most part they are embedded; not, I think, like the sort of North American model. And obviously this isn't the only way you can do this. I'm very aware of colleagues in Coventry who use a different model, where the main users of the dashboard, of their learning analytics tool, for contacting students are their student support services center, and I can see the real benefit of that. One of the questions we're asking as a consequence of this particular piece of work is: if we need to do some of the basics of calling and contacting students centrally, to back up and support tutors, what is the added value of a tutor? For me, it's that having a meaningful conversation is where a tutor really stands out. It may be that the time they spend just trying to get in touch with students has less value, and so there may be things we can do there to take that off tutors so that they can then do more of that added-value work, if you like. One thing I would just add, which I meant to say at the very start: a wiser colleague than me made the point that at the moment it feels like we have a sort of hierarchy of needs in terms of how our students are adapting to online learning and teaching. Almost the first set of principles is: have you got Wi-Fi? (Sorry, Virgin Media.) Have you got a computer that you can actually access it on? And if you have a computer at home, how many brothers and sisters are you sharing it with who also need it to do their homework? I think universities across the sector have responded very well to those fundamentals of whether, technically, a student can get access to learning. But that next step up is motivation: hearing that the requirements at my institution are different to the ones next door, who am I paying attention to, actually there's no job out there anyway, and I've finished anyway. I think we're finding some real challenges there around students' motivation.
And that's where I feel this next step up the hierarchy of needs comes in: reassuring students, making it clear that we understand how discombobulating it is, and offering support to them at the same time. So I do think there's something we can really gain from having a little bit of learning analytics data around to help focus our attention.

Fantastic. And yeah, there's a nice juxtaposition there: using binary data as a means of conveying human empathy.

Yeah, yeah, absolutely.

So, from Kevin: just wondering who owns the project. Is it student services, learning and teaching, or another division within the institution?

It's moved around a bit, because it's moved around with me, so it has at various points been in different parts of the institution. We're currently based in our widening participation team, so for the most part it's a department that does quite a lot of work with outreach and pre-entry students, but we're also offering a range of different support initiatives. I think I straddle the learning and teaching world and the student support world. My gut feeling is that, depending on how you're using learning analytics, it needs to stay outside of IT. And that's no disrespect to the IT department, but I think it needs to be focused on the end users, and in our instance that's mostly tutors and the students. So I would always keep it away from IT. The technical people need to be there to deliver, to do things right and to ask the right questions, but I would always put it into the Academy, as such.

Perfect, thank you very much. Sorry to everyone else who's asked questions; we're a little bit tight for time, so we might just push on for the moment and then come back at the end if we have more time, if that's okay. So, bear with me. Our next speakers are Lee and Sarah from Dublin Business School. I'd like to express my personal thanks to them: they've both been not only very good friends to the Forum, but particularly to the data and learning analytics projects; they've been involved in one way or another since the very beginning and throughout. What they're going to talk about, without wanting to steal their thunder, is the early alert system that they've developed within DBS. What's really interesting for me about this is that it's been totally developed in-house: they're pulling data from multiple sources, and all of the APIs, everything, has been developed within the institution. And I think it's probably fair to say at this point that Lee and Sarah right now have more experience in Ireland of intervening directly with students based on data-driven flags than probably anyone else. So I shall pass over to Lee and Sarah now. Thank you very, very much.

Thank you. Just to let you know, Lee, I need to be able to share my screen.

Can you not? You should be able to...

I can now. Oh, okay, sorry. So hopefully everyone can see my screen; it should be a nice blue PowerPoint slide. And now I'm going to be quiet as Sarah talks.

And just checking, can everybody hear me? Okay, perfect. So I'll begin. Good afternoon from a sunny Dublin, and I hope everyone's keeping well. I'd like to start by thanking Lee O'Farrell for asking Lee and myself to be part of today's webinar. It's actually given us an opportunity to reflect on our current practices and also to do a quick review of the academic year.
For today's presentation, we're focusing on what we do to reach out to students who may have non-academic difficulties, based on our data, and I'll also mention a few of the responses we've received over the past twelve months. For those of you who haven't heard us present before: we created the CZU unit just over three years ago. CZU is a dedicated team responsible for student engagement and success. There are just four members: Lee as team lead, myself as engagement officer, Deborah Zorzi, a dedicated first-year librarian, and Roy Devlin from the Students' Union, whose title is Vice President for Education. Our main aim is to find students with little or no engagement data, and we do this as early as what we like to call week zero, or, borrowed from another National Forum event, our first critical moment. And how do we do this? Well, we've lots and lots of data. This includes our custom-built early alert reports; two dashboards, one for staff and one for students, both launched in 2019; two engagement scores that are currently in testing; and four years of research on our own data. This has been extremely beneficial in targeting this group of students.

So, as I've said, we've lots of data. On the screen coming up, the top-left image shows our early alert report; upon review, we actually discovered 64 different columns of data on each student, and we're continuing to add to this yearly. The bottom-left image shows the report filtered down to the key metrics for us, which are Moodle login, average attendance, fees paid, and average marks from the previous year. The top-right snippet shows the student dashboard with a student's library attendance and Moodle data, and below this is the student's attendance line graph for the year.

For today's webinar, we decided to focus on what we do to identify non-academic issues from the data. To begin, we look at students with, funnily enough, no data. As discussed in other National Forum webinars, early detection is key. Dublin Business School is a private college where students can apply through the CAO or directly to us through our admissions process; they've made a conscious decision to come and study with us, and therefore to pay fees. We commence all courses with a two-day orientation and induction program, which includes key IT information, a library induction and program talks, and we complement this with a number of student events. So to start, we target students who miss this, or who attend only a small part of the two days. We ask: why would a student miss the opening few days of college? Why miss crucial information, from passwords and logins to timetable details, campus tours and course content? Even when that student then comes in for week one, lecture one, they've no student card and potentially haven't received any emails. How do they know their timetable? How can they log into Moodle? And, as our research tells us, they're already at risk. This is where we start. We actually question the students: where was the excitement, the energy, the enthusiasm that goes with starting college? We're constantly looking for these answers. Next on the list is low engagement. In the opening days and weeks of term, Lee and myself are constantly reviewing it: low attendance, low Moodle interaction, little if any library engagement. And again we ask: why are students going to half their classes or fewer? Why are they not active on Moodle, sometimes with days between being online? And again, we reach out.
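[Editor's aside: the filtering Sarah describes, cutting a 64-column report down to Moodle login, attendance, fees and previous marks and pulling out the "week zero" students, can be pictured with a small sketch like the one below. The column names, thresholds and sample rows are invented for illustration and are not DBS's actual report schema.]

    import pandas as pd

    # Invented extract from an early-alert export; the real report carries
    # 64+ columns per student, but these are the key metrics mentioned above.
    students = pd.DataFrame([
        {"student_id": "S001", "days_since_moodle_login": 1,
         "avg_attendance_pct": 82, "fees_paid": True,  "prev_year_avg_mark": 61},
        {"student_id": "S002", "days_since_moodle_login": 9,
         "avg_attendance_pct": 35, "fees_paid": False, "prev_year_avg_mark": 48},
        {"student_id": "S003", "days_since_moodle_login": 30,
         "avg_attendance_pct": 0,  "fees_paid": True,  "prev_year_avg_mark": None},
    ])

    # Flag students with little or no engagement data for a "drop in" contact.
    flags = students[
        (students["days_since_moodle_login"] >= 7)   # not on Moodle for a week
        | (students["avg_attendance_pct"] < 40)      # low or no attendance
        | (~students["fees_paid"])                   # fees outstanding
    ]
    print(flags["student_id"].tolist())  # ['S002', 'S003']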
We have an initiative called Don't Drop Out, Drop In, and we contact these students by a variety of methods. One recurring reason, of course, is that the course isn't suitable. Responses from our year one students last October and November as to why they have low engagement include: no friends, they're lonely; the course wasn't their decision, parents or guardians pushed them to do the program. We can see, and we hear, that their self-esteem is gone and motivation is non-existent, but there's pressure to stay on the course and the students force themselves to come to college. Other reasons for low engagement include financial stress, and this is evident early in the course when attendance dips and students respond that they're working and need the money for fees, rent and, in some cases, survival. So we actually have students paying for a course they're not attending. More examples include personal and family-related reasons: issues such as marriage break-ups, pregnancy, domestic abuse and major illnesses have all been discussed, and obviously college is not a priority. This year we made more contact than we ever had before, and as a result the response has exploded. So just move on to slide five there, Lee.

As we know, one of the key metrics all universities review is attendance. The image presented there is a final-year law student who has attendance of 54% over this year. For our undergraduate students, this is the typical attendance graph. Just to note, the academic calendar is broken into four blocks of six weeks, with reading weeks and Christmas identified as the dips there at roughly week seven and weeks 15 to 16. So hopefully you can see what looks like a good, strong start with attendance, decreasing coming up to reading week, an improvement on her return, but then a downward trend again up to the holidays. The second half of the year is up and down, with little or no attendance in the last few weeks but a spike for an in-class test. Then, towards the end of last year, Lee and myself began to review attendance data from our 18/19 records and our new dashboards. As well as working on CZU, Lee is a lecturer on year one psychology, teaching a module called Research Techniques and Analysis. So we began by reviewing and comparing the data from the current year two psychology students against their previous year's engagement scores. The student profile on the screen shows both her full 18/19 and 19/20 details, with 19/20 on top. We actually picked this student to start our review as we knew she was strong academically as well as having strong engagement data from year one. And as shown, her attendance was an incredible 79%, where the average for her class was just 54% in 18/19. She was also class rep and an active member of the SU. So in November '19, with six weeks of data available, we were both intrigued and slightly concerned at her incredible slide in attendance, with no classes actually attended in week four. Could it be as simple as she hadn't tagged into her classes, or had even lost her student card and so couldn't tag in? A simple check and email led to a response of "I was waiting for someone to notice." Big mental health issues were discovered, and her engagement remained inconsistent throughout the year as she finished with a total of 39% attendance, where the group's average actually increased by one to 55% overall. We continued to review different courses as the year went on, and other stories emerged.
We have found this type of attendance pattern is reflective of someone with mental health problems, where the student actually forces themselves to come in. Unfortunately, we're only beginning to look into this level of detail, as our focus has been on academic issues over the past three years, but we are beginning to build up some patterns and profiles that will hopefully identify students quicker, and not just on a pot-luck basis. And I know Lee is just going to finish with our final slide.

So, yeah, what we've been talking about there really was, I suppose, pre-COVID-19, and then obviously COVID-19 hit and we went into lockdown. We were fortunate, I suppose, in one way, in that we had about three weeks left of the semester, so we were into wrap-up. That didn't mean we stopped monitoring students; we were still contacting students. But although teaching finished, we took the decision, unlike many other HEIs, that we were actually going to continue with examinations, and we were going to move those examinations online. We chose to administer them through Moodle. We're now in our second week and it's actually going really, really well. From the data we've got coming in, it's looking like we've got a higher attendance rate administering exams this way than the traditional way we would have run them for the past few years. Moving the exams into Moodle has also allowed us to get some really interesting data. One of the really good reports in Moodle which we've found is the live logs. So prior to the start of an exam, we go into the live logs and look at the activity on the Moodle page. Then we look again shortly afterwards to see who has accessed the paper. And then, 30 to 60 minutes after the exam, we look to see who has submitted the paper. What's key is that anybody who hasn't submitted the paper, we follow up with within 24 hours, and that's weekends as well. And we're getting a lot of interesting comments coming back from the students; Ed referred to some of those 10 or 15 minutes ago. If I just summarize them briefly: Wi-Fi is a big one. Not everybody has good Wi-Fi connectivity, and it's not just a rural-urban divide; there's bad Wi-Fi within cities and towns as well. It's the same across different countries; in France, for some reason, there seem to be particular regions where the Wi-Fi is very bad. PCs and laptops: again, Ed mentioned this. We all assume that our students have PCs or laptops at home, and we're finding that quite a few don't. What's also coming across is that some students who do have quite old devices. They might be okay for Word or Excel, but they're not going to run SPSS or NVivo. And there might be a PC or laptop at home, but now there are mothers and fathers at home, there are siblings at home, so there's competition for the laptop, and the student isn't always the one that wins. We're finding a lot of mental health issues being raised by students, and it's not just the distress caused by the isolation and the loneliness. What we're also finding, and I wouldn't use the word many, is that some of our students are in therapy and, because of the lockdown, they can't continue with it, so that's causing problems as well. Financial and employment issues are coming up, and there were two sides to this.
We've got a cohort of students who have been laid off work, or I think the term we're using now is furloughed, and that's causing distress. But the other side of it is that we have students who have to work more than ever: they're doing 12-hour days and covering for colleagues who are out sick, because they're essential workers. Then what's also coming up, and this is a new one as well, is that there were some families where the relationship going into COVID-19 might not have been great. Now they're basically quarantined, forced into isolation, and those relationships are starting to break down. Take all of that together and it is starting to have an impact on students' performance in college. So our challenge now, I suppose, is how do we support those students? And that's what myself and Sarah are working on. So that's the end of that piece. I'll try and stop sharing and hopefully hand back to Lee, and then myself and Sarah can take any questions.

Fantastic, thank you so much guys, that's really, really interesting. Again, I'm really struck, and it follows on from Ed as well: we think of data as cold binary facts, small, solid statements. Because we associate this sort of work so closely with data and with technology, it's so easy to forget that behind each of those patterns is a person, a human being, with everything that they're going through. And I think there is this underlying idea in learning analytics that what you do is find students who are disengaged, essentially give them a kick in the ass, and then they start being engaged, which completely overlooks and underestimates both the academic and non-academic challenges that students are facing. I think that "I was waiting for someone to notice" is an incredibly powerful statement and really a rallying cry to all of us. So thank you very, very much. Again, I'm afraid I'm cognizant of the time, so we have one question that came in while you were talking; we might just do that one for the moment, and then if we've time at the end we may come back, if that's okay. So Eileen asked: how do you monitor attendance? Is it a swipe-card system?

Yeah, I'll jump in there. So attendance is not compulsory unless you're an international student, when you've got to meet your visa requirements. However, we don't tell students that; we monitor and encourage attendance. All of our students have a student ID card with an RFID chip buried within the card, and there's a scanner in each room, so we ask students to scan in. The software system we use to monitor attendance is called CELCAT, and there's a piece of software within that called CELCAT Live, so the lecturer in the classroom can go into CELCAT Live and see who's tagged in. That works really well until students don't have their ID card or forget to tag in. And that's what Sarah was saying at the beginning: that's what we need to do really within the first week, which is why we're contacting the students to say, we've noticed your attendance isn't great, is everything okay? And in the first week, invariably, what comes back is, yeah, I don't have my student ID card.

Fantastic, thank you very much. And there are just a couple of comments coming through on that. On then to the next challenge.
So you start with the data and you identify the issues, the challenges that the students are actually facing, but supporting students through that is, of course, a further challenge. We'll thank you both very, very much and move on, as swiftly as my ability will allow, to Jeremy from UCD. Jeremy has considerable experience as the former lead of the Student Desk in UCD Registry. Jeremy has always had a very strong, comprehensively student-centered focus and is very cognizant of the challenges students face and the supports they need. He's going to talk us through UCD's USM, which uses data to ensure a consistent experience for students and efficiency in ensuring their queries and issues are resolved. Now that I've done that, Jeremy, I don't know if you need to present at all; I think I've covered everything there. So I shall stop sharing, and Jeremy, if you want to jump in there.

Yeah, thanks Lee. That's a wonderful introduction you've set me up with, and I hope I can live up to it. I'll share my screen here and get the presentation up. So, the Unified Support Model looks at the services that are delivered to students both centrally and in their local college and school offices, and really what we are trying to do with this is ensure the most consistent and highest quality level of service for students, whoever they may interact with during their time in UCD. To give you a little bit of background to the project: where this all started, really, was the Student Desk. In 2005 UCD Registry introduced and developed its Student Desk, and between 2005 and 2016 that's where they had been developing central supports for students. Students would go there for any kind of support to do with their fees or registration. And what they did from the beginning on that team, and I only came into it later, was capture statistics and information on the types of services they were delivering and the students who were accessing those services, from the very outset. The intention was always to capture the types of interactions that were coming in, with a view to ensuring that they knew the business they were delivering: they knew what students were looking for and when they were looking for it, with the intention then of using that information to improve online information and to improve the services and processes that students were looking for. Then in 2017 the Student and Academic Services Review took place, which was really focused on ensuring that UCD was delivering excellent program supports to students and making the best use of its resources to the best effect. An outcome of this was that they looked across the services delivered across the university, saw some real positives in what the Student Desk was doing centrally, and suggested that that approach be rolled out or used in other areas as well, which is where, in 2018, the Unified Support Model project came from. I think what's probably particularly valuable here is that, at an institutional level, we would all like to have some kind of blanket approach that would work across the board for all staff, whether that's administrative or faculty, whoever's supporting students in all offices, that we could just roll out tomorrow and it would all work extremely well for everyone.
But the big challenge, obviously, is the differences between the disciplines, the differences between the supports delivered in the different offices, and how you ensure that, if you are trying to roll out a model that's specifically targeting students and delivering high-quality services to them, you are cognizant and respectful of that local variation while also driving consistency. That's where our building blocks approach came from, which I'll talk about in a second, and then the various principles that drove us towards this idea of the three unifies, which I'll get to at the end. With the building blocks, the principle was to develop a model robust enough to adapt to local variation whilst also driving a consistent and high-quality student experience of services. What we've attempted to do is ensure that, regardless of which office we're talking about, whether it's an office with a team of people delivering services to students or an office with a single staff member, there are building blocks they can use to ensure the services they're delivering are consistent, and a way to get the data out of that which they need to consistently improve. These building blocks were philosophy, people, process, system and physical. The philosophy is really around the service idea: that we're all there to deliver the best possible service to students, and that's a driving force behind the rest. The people are the people involved in the interaction, who support the students. The system, just to jump straight to it: it's always tempting to look directly to the system, particularly when we're talking about gathering data. One of the biggest challenges we have is ensuring that staff consistently use the systems that are given to them, in order to collect robust data that can actually be used to capture what's happening across the board. The two systems that we use in UCD are UniShare, which is a CRM system that captures the interactions that take place between students and staff, and then Connector forms, or contact forms, which replace the generic email addresses that were in place in those offices. So that's where we're gathering the data that's being used to drive service improvements and service delivery. And effectively what we end up with is data that can be used to know what's happening across campus, but that can also be used by those individual staff members to ensure that they're delivering the highest quality of service to students. And I think something we all know from gathering data is that there's a temptation to believe that once we have the data, things will get better. We're all familiar with the data gnomes, whose project plan looks a bit like this: collect the data, go out there and find out what's happening, who students are interacting with, who they're going to for support, what kind of questions they're asking, how they're contacting them; we'll gather all of that and we'll have it; and then things will get better as a result. So the two questions we really needed to ask ourselves were: how do we ensure that we're not just collecting data for data's sake, and that the data we're collecting is actually being used to do something? And how do we ensure that it's used to make things better, as opposed to just highlighting what's happening?
And that's where this idea of the three unifies came in. The three unifies for us are, first, unified for the student. This is around the support that's delivered on a one-to-one basis: any staff member in any office across the university delivering academic or administrative support to students, where the student can expect the same consistent level of service wherever they go. We bolster that with the systems and with the data, so that when a student goes to an office, the staff member who's helping them has a fuller picture of the interactions they've had previously and the support they've been given previously. The system used by the staff member does the legwork for both people involved in that interaction: it gives them the information they need to support the student, and it gives them the opportunity to share that information with colleagues who might need to support the student as well, should the student go somewhere else. That means the student doesn't need to constantly restate the issues they're having wherever they go to seek support; there's a full picture of the student's interactions there for whoever needs it.

This leads nicely into the team focus. A few people have pointed out that there's a new focus for a lot of their projects in the COVID world, and I'll actually talk a little bit about that towards the end of the presentation, but just to make sure this team is being a responsible team, we'll social-distance them. What we've done with the team focus is ensure that the information and the data that's gathered supports the teams to do the business they need to do, which is delivering services to the students. And the data they collect on their own individual services also drives the continuous development of those services, so that they can see where they need to place their focus and what the busiest things they have going on at that particular moment are. And they can provide the context to actually improve the services, supported by the data. A lot of the challenge when you're gathering this kind of data is that you get a high-level picture of what's happening, but you don't necessarily know what to do with it, because you don't know what changes need to be made in order to make things better, or why students are contacting staff at particular times of the year. All of that context can be gathered from the staff who are closest to the students and supporting them through the services they deliver.

And then finally, the unified for the university piece, which is all about the overall picture we have as an institution of what supports we're delivering to students and where student issues are. What we've done with the Unified Support Model is ensure that all staff have access to the same system, sure, but also that they're all using consistent language in how they're supporting students, so that we can track students' interactions, both centrally and in the local offices, in the same manner. We can see individual interactions that form overall trends over time, and we can see whether the services being delivered are responsive to the changing environment or not; COVID-19 is actually a really good example of that. We can drive convergence in those services.
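[Editor's aside: a rough sketch of how those three views can fall out of one consistently categorised interaction log is below. The record fields, category names and offices are hypothetical, not the actual UniShare schema; the point is simply that one shared vocabulary can serve the student view, the team view and the institutional view.]

    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class Interaction:
        student_id: str
        office: str     # central desk or a local college/school office
        category: str   # drawn from one shared, consistent vocabulary
        channel: str    # e.g. contact form, in person, phone

    log = [
        Interaction("S001", "Student Desk",  "fees",         "contact form"),
        Interaction("S001", "School of Law", "extenuating",  "in person"),
        Interaction("S002", "School of Law", "fees",         "contact form"),
        Interaction("S003", "Student Desk",  "registration", "phone"),
    ]

    def student_history(student_id):
        """Unified for the student: one full picture of prior contacts,
        whichever office they turn up at next."""
        return [i for i in log if i.student_id == student_id]

    def office_trends(office):
        """Unified for the team: what is this office busiest with right now?"""
        return Counter(i.category for i in log if i.office == office)

    def institution_trends():
        """Unified for the university: the same categories rolled up across
        every office, so the big-ticket items stand out."""
        return Counter(i.category for i in log)

    print(student_history("S001"))
    print(office_trends("School of Law"))
    print(institution_trends())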
So what works for one office is very likely to work for another in terms of the supports and services they're delivering. And what we are also finding out of all of this is that, by having this fuller picture of students' engagement with their various support offices, we can use that information not only to drive decision-making but even to inform discussions. It's not always clear what you should be talking about as an institution in terms of the supports that are delivered; certainly that's what we found in UCD. So having access to this data about where students are going and who they're interacting with gives us the ability to decide what we should be looking at, even as a first step. It's not always clear where we should place our focus, and this gives us an opportunity to see the big-ticket items that students in every program area are going into their offices about all the time, so that we can use that to decide what we should be focusing on and improving as an institution. But I think that three-tiered approach, where the person who's actually driving the data that's going to be used at an institutional level to make decisions is the person who's closest to the student, is probably the most important output from this project and the most important lesson learned. It really relies on that individual, one-to-one support interaction, and on whoever's giving it, to inform the context for how things improve for the overall student population. So that's it. I probably rushed through that with the intention of keeping things on time, but I might hand back to Lee then, if there are any questions.

That's fantastic. Thank you so much, Jeremy, it's really, really interesting. First of all, the data gnomes, I think, are such a key image, a metaphor for what we're doing. I sincerely hope the term "data gnomes" now becomes a catchphrase within the hierarchy: this idea that we'll just do data, we'll do analytics, we'll buy a platform, and then everything will get better. And then the fact that you're using data, using evidence, to direct actions and supports: I think we've all seen a long history of somebody having an idea on a given day, getting it in the right ear of the right person, and everything changing in a way that hasn't necessarily been based on evidence. I think as well, to get around the data gnomes, that idea of the three unifies, having a coherent, consistent plan, and particularly, I really like how easy it is to represent: "three unifies" will stick around for a long time. I assume it's played a key role in carrying the project on and embedding it.

It has, and I think particularly if you look at the current situation. Since we've all started working at a distance, so much of the support that happens in UCD relies on the interactions that take place between staff on a daily basis: what's happening in your office, what's happening in my office. Even within an office, they don't have that connection any more in terms of the supports, and it's not always clear, when they do connect, what they should be talking about. So being able to pull this information out at that level, in terms of what's happening to me, what's happening in my interactions, forms the next picture for the team, and that forms the institutional picture.
And when we go and interrogate them as a project team and say, we're seeing a spike in this, we get really constructive feedback that allows us to frame how we should be looking at it at an institutional level. And that has helped us to overcome big challenges as well, in terms of the variation in services that might be delivered in different offices, and viewing that variation as a strength rather than a challenge. There's a definite understanding that people deliver the best level of support to their students that they can, and they don't always want to budge; but when the focus is on how it's going to support you in delivering your service to a student, that tends to allow the conversation to flow a lot better.

Yeah, I think that's a further really interesting piece coming out of all three presentations, and I really like the fact that all three of you tied it into COVID-19. Again, the point we made to Ed at the start: part of the evidence that these approaches have been effectively embedded is that they've ended up being of enormous benefit in an environment that, as I say, nobody foresaw and nobody would have expected. That's something I hadn't thought of coming into the webinar, but it's really a consistent theme. So we might just go for a question or two again. I'm very sorry that our time is nearly up, but we'll just go with Colin there, if that's okay. Hi Jeremy, with really large class groups, how does attendance tracking work in UCD, or do you use this data?

So attendance tracking isn't used in UCD. There are some projects under way at the moment, which we'd be tying in with, to look at how that kind of data might be used. But it's not currently there, unfortunately. I do think that, particularly when you look at the data we do have, information on who is accessing supports and, obviously, which supports are being accessed and how, if you were able to tie attendance into that, it would be massively beneficial. So it's definitely something we're looking at.

I think the attendance data is possibly very valuable for learning spaces and so on. I guess the engagement data, fundamentally, that you are using is more around the Connector forms and UniShare, so you see the staff they're engaging with. Would that be more the measure, if you like, of engagement in this context?

Yeah, and a buzzword at the moment in UCD is "trusted person": the idea that a student is going to go and seek support from whoever they know, rather than where they're supposed to go institutionally. We would all like it if they went where they were supposed to go, but that's not always the way it works, and it's not necessarily the way it should work. The person they end up speaking to should be provided with the tools and resources they need to give the student the best possible level of support. That's where these tools come in as well, particularly where you can see a history of where the student's been and what they've been told previously.

Fantastic. Sorry, I just see one there from Lee. Lee says he previously shared the student guide to learning analytics, which was written by DBS, and that if we're happy for him to share it again, he can. What I'll do is ask if we can make it available. So enormous thanks again to our speakers: Ed, Lee, Sarah and Jeremy.
I think this has been a really, really interesting, educational and engaging hour, and it's certainly given me a lot to think about, particularly in the challenges of the current environment. So thank you all very, very much, and to all of you who've joined us out in bedroom land or kitchen land or wherever you are, thank you very, very much. The recording and presentations will be posted on the teaching and learning website over the coming days.