Yes, hello. My name is Anu. I work at Kaplan Open Learning as a learning technology manager, and I'm talking about learning analytics and how we use data to improve student success. We work in partnership with the University of Essex to provide University of Essex Online, delivering 100% online undergraduate and postgraduate degree programs. We strive to give the highest quality experience to our students, and this is evidenced by our very high NSS scores, which are 90-plus every year. We also have a TEF Gold, and I think we're the only alternative provider to have one.

The key challenge for everyone who does online learning is, of course, how to keep students engaged and motivated. I'm using the word "challenge" instead of "problem" because a problem is quite negative; a challenge is a positive thing. Our solution has been to create a bespoke learner analytics system called LAS. If you imagine Sean Connery saying that, it sounds much better than when I say it. LAS uses engagement data recorded through Moodle to automate interventions and to support students at scale. The way it differs from other learner analytics packages is that we designed it in collaboration with our student support team, and they've been the driving force behind it, because they are the end users: it's about what they need rather than what the company generally needs in terms of learning analytics. It's also customizable for each of the end users, i.e. the student advisors.

So the student is very much at the heart of what we do. Through our VLE, which is Moodle, they get access to all the learning resources they need to complete their programs. The second tier of that is the student support advisors, who support students across all the resources and platforms they need access to to successfully complete their programs.
Behind that, the learning analytics then support the student advisors so they are able to deliver that support in a targeted way. The key features of our programs are small class sizes, a maximum of 20 per group; plenty of synchronous learning opportunities; and regular teacher and student advisor contact for all our students. Each of our students has a named student advisor as their first point of contact. LAS has enhanced this provision by allowing individualized support at key points in time.

So how does it work? Well, we have a bespoke student management system, which pulls live data from Moodle, and LAS is an extension of that system. It does two things. Firstly, it feeds the information into a customizable dashboard, which the student advisors use to see what is happening with their students and which allows them to intervene. Secondly, it sends automated emails to students at predefined points in time, based on individual student actions.

This is a demo copy of the dashboard without any student data on it. You can see there's a color-coded tracker against each student, which gives you the number of interactions per day. The boxes on the right show the six most inactive students for that particular student advisor. The next section is about discussions, how many posts and replies the students have made, because discussions feature quite heavily in our assessment. And this is our email engine, which allows us to build emails based on criteria, or amend existing ones, and we can do this ourselves on site.

The benefit of this for us has been that it predicts learners at risk very early on, so we, or rather the student advisors, can do interventions.
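The dashboard behaviour described above, ranking students by recent Moodle activity to surface the six most inactive, plus a colour-coded daily interaction tracker, could be sketched roughly as follows. This is a minimal illustration, not the real LAS implementation: the record shape, thresholds, and function names are all assumptions.

```python
from collections import Counter
from datetime import date, timedelta

def least_active(log_events, students, days=7, top_n=6, today=None):
    """Rank students by interaction count over a recent window and
    return the top_n least active, as on the demo dashboard.

    log_events: iterable of (student_id, event_date) pairs (assumed shape).
    students:   all student ids for this advisor, so that students with
                zero logged events still surface at the top of the list.
    """
    today = today or date.today()
    cutoff = today - timedelta(days=days)
    counts = Counter(s for s, d in log_events if d >= cutoff)
    # Sort ascending by activity; students with no events count as 0.
    ranked = sorted(students, key=lambda s: counts.get(s, 0))
    return ranked[:top_n]

def rag_status(interactions_per_day, red_below=1, amber_below=3):
    """Colour-code a daily interaction count for the tracker row.
    The red/amber cut-offs here are illustrative, not the real ones."""
    if interactions_per_day < red_below:
        return "red"
    if interactions_per_day < amber_below:
        return "amber"
    return "green"
```

In a real deployment the event pairs would come from the Moodle log store via the student management system; the point of the sketch is just that the "six most inactive" box is a sort over windowed counts, with absent students defaulting to zero.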
It helps us understand what the learners' needs are, based on their actions or inactions in their programs or modules, and it allows us to give individualized, tailored and timely support to each particular student, with precise feedback at those critical moments, perhaps around assessment times, or earlier in the module where the dropout rate is higher. It's also low cost now that it's built. It was fairly low cost to build to start off with, and now that it's there, as you saw with the email dashboard, it's no-code for us: we can build more emails, we can amend existing ones, and we don't need a developer on site to do any of that. We can do it all ourselves. So the result has been the ability for us to build that functionality without coding, which is obviously a massive cost saving. It has also allowed us, now that the core elements are set up, to build in new features quite quickly. For example, we've now integrated our module student feedback into the system, so we can track, against the tutor or perhaps a particular question, what students are feeling at any given time. And overall it has had a very positive effect on learner outcomes, especially retention. That's all from me. Thank you very much.

Okay, before we hear from Greg, does anyone have any questions for Anu? Oh, we'll go on here again. Thank you. What are the trigger points that send out the emails?

Sorry if I missed it. Well, there are several. They're usually based on the starting point of the modules. On the first day there would be a general welcome message. Then, if it's an induction, there might be a bit more information about the program in general, et cetera. But generally for modules, it's the first day: welcome to the module, here is what you need to do, here is who you need to contact, and all that. Then it's about assessments, assessment reminders.
And those are the trigger points. Obviously, if a student has already submitted their assignment, we wouldn't want to send them a reminder to submit it, so it's individualized at that point. We also do a lot of discussion forums; we use the HSU forum for assessment, and students have to do a certain number of posts per assessment, usually 10 within a roughly two-week period. So we're counting those posts, and if they haven't done a certain amount by a certain point in time, that triggers another email to say, you know, you're falling behind on this assessment, you probably want to contribute a bit more.

Thank you. Another question over here from Mark. It may sound like a silly question, but do you know that the students read the emails?

Well, that's the thing, isn't it? You can't guarantee that they will be reading their emails. I mean, most students are very happy to receive the emails and actually receive that customized support. And we have a feature in the platform as well: if there's a student who says they absolutely don't want any emails, we'll switch them off, or we can switch off individual emails so they only get certain ones if they want. So we can customize it.

If you could measure in some way whether they've opened the email, whether or not they then pay attention to it, I think your measured effectiveness would dramatically increase. By that I mean you'd be able to say: excellent, the students improved or retention has improved, and these ones were, quite bluntly, lost causes anyway, because they weren't paying attention to our intervention, they didn't even open the email. When they don't open their email, that's reducing your apparent effectiveness.

Yeah, that's... As somebody who sends out a weekly newsletter, this is...
I see where it comes from, Mark, but it's actually really problematic, because iOS devices block tracking pixels and the like by default. So then you're making the assumption that someone hasn't opened the email when actually they have, but their device blocked the tracking. It's a bit of a dark art sometimes as well.

Any more questions for Anu? There's one over here, from the gentleman here, before we move on to Greg.

Sorry if I missed this, but is it pulling in any data other than the Moodle activity, just out of interest? Is it gauging any other type of engagement?

At the moment it's just the Moodle data, because we do everything through Moodle. We're looking at integrating our eBooks, we use VitalSource, so we're looking at integrating data from there. We also use Zoom and BigBlueButton as our synchronous platforms, so we're looking at integrating data from there as well, so we can see how students interact with other sorts of engaging activities.

And a quick follow-on question, if you don't mind: are you finding this triggers flags against the same students as other attendance monitoring does?

I think there are always students who won't necessarily engage, and that's not necessarily a bad thing. This is where the individual student advisor comes in: they'll know that student, and they'll be able to see that, okay, just because the tracker is red all the way along doesn't mean they're not actually performing well in their modules. That's where the individualized support comes in. Thank you.

Thank you again.
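The two trigger rules Anu describes in the Q&A above, suppressing the assignment reminder once a student has submitted, and nudging a student whose forum posting falls behind the roughly ten-posts-in-two-weeks pace, could be sketched like this. Every name, threshold, and signature here is an illustrative assumption, not the real LAS schema.

```python
from datetime import date

def assignment_reminder_due(student_id, deadline, submitted_ids, today=None):
    """Send a reminder only to students who have NOT yet submitted,
    within a few days of the deadline (the 3-day window is assumed)."""
    today = today or date.today()
    days_left = (deadline - today).days
    return student_id not in submitted_ids and 0 <= days_left <= 3

def forum_nudge_due(post_count, window_start, required=10, window_days=14,
                    today=None):
    """Nudge a student whose post count is behind the expected pace
    for the assessment window (e.g. 10 posts over two weeks)."""
    today = today or date.today()
    elapsed = (today - window_start).days
    # Pro-rata target: how many posts we'd expect by now at a steady pace.
    expected = required * min(elapsed, window_days) / window_days
    return post_count < expected
```

The engine would evaluate rules like these for each student daily and feed matches to the email templates; the individualization the talk mentions is simply that each check is gated on that student's own submission and posting records.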