Good morning and good afternoon. Let's make a start. This is the first session on Emerging Technology and Behaviour. In this session we have four speakers, and our first speaker, Colin Loughlin, will talk about the relationship between virtual learning environment engagement and attainment and outcomes. So please welcome him.

Thank you very much, and thank you all for coming. Big crowd. Excellent. I haven't done this for a few years, so I'm quite nervous, and I like to pace up and down, so you'll have to excuse me. My colleague Ben is a statistician. Are there any statisticians here? Good, that's what I like to hear, so I can bluff away quite happily in that knowledge.

I started getting interested in VLE analytics a few years ago at my previous institution. I noticed a trend that I didn't get a chance to investigate, so I've been investigating it this year. It's all to do with the amount of time that students spend in the VLE. The big buzzword in HE at the moment is engagement: if we can just get the students to engage, everything will be all right. But will it? Is there such a thing as too much engagement? The spoiler is: yes, there probably is. The other thing is that I haven't looked at this for a couple of weeks, so I don't actually know what's on the slides.

What does student engagement look like? In our terms, it's how long they spend in the VLE, and my particular project was how long they spend in the content in the VLE, not in any other area. So not on the homepage, not in the discussion forum, not in any of those other ancillary things: just purely the content areas.
Up until a few years ago, it was very difficult to demonstrate any correlation between face-to-face teaching and engagement with the VLE, generally because the material on the VLE was so poor that whether students engaged with it or not was not an accurate reflection of what was happening in the course. That seems to have changed in the last five years, and we have been able to show correlations: there is now generally a correlation between engagement and the amount of time they spend in the VLE. But the other thing I've noticed, particularly since the pandemic, is far more lecturers coming in and saying, where are the students? They're not engaging. And these are the same lecturers who are posting pictures of their empty lecture rooms. I was wondering if it's the same with the VLE. As I say, the focus is on engagement, and it's seen as something of a holy grail: if we can just get them to engage, everything will be okay.

The definitions of student engagement come down to the time and effort students devote to educationally purposeful activities, which includes things like attending class, submitting assignments, and interacting with lecturers and peers. The authors identify different dimensions of engagement, including behavioural, emotional and cognitive. I'm looking at the behavioural: how they behave with the VLE. Unlike some institutions, we haven't got a great deal of money or a great deal of time, so I'm looking for relatively simple measures of engagement. The measures remain complex, and they are often differentiated by discipline: different disciplines have different levels of engagement with the VLE and different ways of engaging. And individual students' psychology means that they engage with their learning in different ways.
So you cannot always predict from an individual student's engagement what their outcomes are likely to be. The other thing is that the correlations with the VLE are often weak. There is generally a positive correlation between the amount of time they spend in the VLE and their academic outcomes as measured by exam results, but the correlations are generally weak. So in an effort to refine the relationship, a lot of people with more time and money than we have start building really complex models, including student demographics and all that sort of thing. Other people have pointed out that some students combine high activity with high grades while others combine high activity with low grades, but in the literature the latter have generally been dismissed as outliers, really. So, as I say, student engagement with the VLE remains complex, but it can be a proxy. There isn't a direct relationship between time in the VLE and academic outcomes, but it is a reasonable proxy. The most basic unit of learning data in virtual learning environments is the interaction, and there's no consensus yet on which interactions are meaningful for demonstrating effective learning.

So I want to talk about the data I was looking at. Because the correlations were weak, which I already knew, I was only looking at large cohorts: cohorts with at least 200 students enrolled, or thereabouts. I was taking the amount of time they spent in the VLE, measured in seconds, and this is just the VLE content, no other areas, and then their exam scores. And as we can see, in this particular one there is a correlation, and it's 0.46, which is actually quite strong for a VLE correlation.
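The basic measure described here, a correlation between seconds spent in VLE content and exam score, can be sketched in plain Python. The numbers below are illustrative only, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented example: seconds in VLE content vs exam score (%)
seconds = [10_000, 50_000, 120_000, 200_000, 300_000]
scores = [25, 40, 55, 60, 70]
r = pearson_r(seconds, scores)  # positive: more time, higher score
```

The r here is what the slide's 0.46 refers to: a single number summarising how tightly time and score move together across the cohort.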
They're generally a bit weaker than that. The correlation line here, for anyone who doesn't know, is the best-fit line for all the students: if you take all the dots, which represent individual students, it's the neatest line you can draw through them. The steepness of the line indicates the slope of the relationship, and the spread of the dots indicates how strong the correlation is: if it were a stronger correlation, the dots would be closer to the line.

All the modules I looked at had a correlation. It varied, but it was a similar sort of pattern. These are then the averages. Don't worry too much about the fact that the mean score was only 42.8%; that so many students failed is a different conversation. And that is the mean time spent in the VLE, which was 60 hours in this case. All these modules are 15-credit modules, which is equivalent to 150 hours of study, and in this case the students spent 60 of those hours looking at content in the VLE.

So the students in this quadrant are the students who don't engage very much with the VLE and don't score very well, which is probably not a massive surprise. These students spend more than average in the VLE and also score higher than average, which again is probably not a massive surprise. These are the lucky ones who spend not very much time in the VLE but still score pretty well. And these are the students I've always been concerned with: the students who spend more than average in the VLE but score less than average in the exam. As I say, in the literature they've previously been dismissed as outliers and not really discussed in any meaningful way. But actually, there are quite a number of them.
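The quadrant split described here is just two mean comparisons: above or below mean time, crossed with above or below mean score. A minimal sketch, with function and label names of my own invention:

```python
def quadrant(hours, score, mean_hours, mean_score):
    """Classify a student into one of the four mean-split quadrants."""
    if hours >= mean_hours:
        return "high-time/high-score" if score >= mean_score else "high-time/low-score"
    return "low-time/high-score" if score >= mean_score else "low-time/low-score"

# Illustrative: mean time 60 h, mean score 42.8 % (figures from the talk)
label = quadrant(217, 42, 60.0, 42.8)  # the 217-hour student described later
```

The group of concern is everything that lands in "high-time/low-score": above-average effort, below-average outcome.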
My view has always been that, in terms of engagement, these are people who are willing to put in the time but are not getting the rewards out of it. So the purpose of this, really, is to try and pick up on these people and give them some extra support. In this particular case, this individual here has spent 217 hours in the VLE. Now, I know they could have put their computer on and left it running multiple times, but you'd have to do that a hell of a lot to get up to 217 hours, bearing in mind that the entire study time allocated for this module is only 150 hours. They've done more than that just on the VLE, without doing anything else. And clearly, as they scored 42%, their time could probably have been better spent.

So I've enlarged the selection to include the whole group that are spending far more than average in the VLE but achieving less. The interesting thing is that this regression line applies to the whole cohort, and people don't generally look beyond that. But because I was interested in this group here, I did. And when I ran a regression on just that group, it becomes inverse, which means the more time you spend in the VLE, the worse you're likely to do, which is obviously counterintuitive. I'm definitely not suggesting that the VLE is causing them to do worse, just that the people in this area are spinning their wheels: no matter how long they spend in the VLE, they're not going to improve their scores. That comes to an inverse correlation of 0.135, which is not huge, but it is inverse. That's when I brought in the statistician, because I thought, well, I've got something more complex than a simple linear regression here. So, where are you, Ben? Yeah, so that's it: for these students, the more time they spend in the VLE, the worse their exam scores.
Now, it's important to say that this is very limited, in that I'm only looking at the VLE and only looking at exam scores, and we all know that learning is far more complex than that. But I've reduced it down to a very simple measure. The interesting thing I found, and I've not had time to explore this further yet, is that if you take the Panopto views, just the views of the videos of the lectures, you don't get the same thing. If I look at that, the relationship stays positive; in fact it even strengthens the more time they spend in that content. So there's something peculiar going on, and the next part of my project is to understand what it is they're doing, or not doing, in the VLE, because it doesn't stack up with the Panopto lecture-recording views.

This is the summary of the modules I looked at, with the cohort sizes. One had 170; the rest were all over 200. What I'm looking for is patterns, something I can hook on to in order to provide an intervention. Across six modules I covered four different disciplines, and when I looked at this before, I covered yet another discipline: same pattern in all of them. Interestingly, though, the mean time in the VLE is 12 hours on this one and 73 hours on that one. So across the disciplines there are obviously very different requirements and different levels of content; the learning design is clearly very different in those modules. That's understandable, because they're different disciplines. I have no idea what the learning design is in any of these modules; I can't even remember the disciplines now. But there's no pattern there, nothing to really hook on to purely from the number of hours. There's also the R-squared number, which is just the strength of the correlation.
As I say, the one we were looking at was 0.456, I think, which is actually quite strong for a VLE correlation; they're not normally that strong. The weakest one was down to 0.19, which is very weak. What that means, for the uninitiated, is that in the 0.19 one, their interaction with the VLE can account for about 19% of the variation in the results. But again, there's no real pattern there.

And there's no real pattern here either. I was looking at the point at which they inverted, the point at which the correlation went from a positive to a negative relationship. Again, very wide variation: on one of them the relationship inverted at just over 100% of the mean; on another it was nearly 400%, nearly four times the average, before the relationship inverted. So there was nothing really for me to hook into there.

But this was interesting: double the mean, double the average time spent. On this one the average was 12 hours, and once you get up to 24 hours spent, the positive relationship between time spent in the VLE and exam results disappears. Once you get to that point, no matter how long they spend in the VLE, they're unlikely to improve their outcomes. And for me, that's the goal, because it's a really simple measure, and it doesn't rely on anything else. Most of the models for picking up struggling students rely on some kind of feedback, some kind of exam or formative assessment, where you can measure their scores and see how they're doing. This doesn't need any of that, because all I'm looking at is how much time they're spending in the VLE. I can tell after about three weeks who's spending double the time of everybody else, and that gives me a cue to start an intervention.
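The early-warning rule the talk settles on, flag anyone whose time in the VLE content exceeds double the cohort mean, needs nothing beyond running time totals. A sketch, with invented names and data:

```python
from statistics import mean

def flag_struggling(hours_by_student, threshold=2.0):
    """Return student IDs whose VLE-content time exceeds
    `threshold` times the cohort mean (default: double the mean)."""
    m = mean(hours_by_student.values())
    return sorted(sid for sid, h in hours_by_student.items()
                  if h > threshold * m)

# Invented week-3 running totals (hours)
week3 = {"s01": 2, "s02": 3, "s03": 4, "s04": 5, "s05": 16}
at_risk = flag_struggling(week3)  # ["s05"]
```

Because it compares students only to their own cohort's mean, the rule works a few weeks into term, long before any exam or formative assessment exists.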
So this is just generalising the pattern I've been describing. There's the positive correlation; that's double the mean, double the average time they spend in the VLE; this is the point at which there is no significant difference; and then at some point after that, it actually inverts.

When I first discovered this, I had a conversation with ChatGPT, and it assured me that change point analysis was what I was looking at. And I liked the name; I thought, that's cool, we'll go with that. Change point analysis using Bayesian methods is a statistical method that aims to identify the point at which a significant change occurs in the parameters of a model; a Bayesian framework is used to estimate the location of the change point. At the time, I was thinking that would be useful. But as I've discovered, the point at which the relationship changes is so variable that it's actually not going to be that useful. It's interesting, and Ben can use it to predict all sorts of interesting things, but only really with hindsight. We can't say, using this method, at what point students are going to start struggling; we can only really analyse it once our courses are completed. And when I went to Ben, he said I was completely wrong to use Bayesian change point analysis, for reasons I can't remember now, and he wants to use locally estimated scatterplot smoothing, which is a form of polynomial regression (actually, I think Bayesian change point is as well, but it's a different type of approach). Basically, instead of taking the linear regression as one long line, it breaks the data down into a much smaller series of local straight-line fits. So you still get a regression, but it's a series of linear regressions rather than one.
What we end up with is a kind of U shape, really, at the end of the day. I thought I'd maybe come back to that, because I think he's given me an estimation of what it looks like, but basically all the patterns have a horseshoe shape. I rather lost interest in the stats, though, when I realised it's not going to tell me anything useful in real time. It's nice for analysing afterwards, but in real time all I need to know is that double-the-average point. That's when we need to think about an intervention.

So the next steps for this project. We need to do more work to understand the relationships from the statistical perspective, which is Ben's side of things. We'll also be looking at other things we can plug in relatively easily, like some demographic data, because I'm wondering about the students who are up here, for instance, versus the ones who are down here, and whether there's a demographic feature that could refine and target our intervention more closely. I'm going to do some focus groups with the students who fall into that category, to try and understand what they're doing for 230 hours. What could they possibly be doing for 230 hours? We're also going to relatively quickly generate a predictive model, which, as I say, can pick up struggling students after three weeks instead of nine or ten weeks. And I'm working with the skills centre in the library to set up a special programme, because these students, I think, have really been underserved: the ones who are actually prepared to put in the time and effort but are really struggling. Because at the end of the year they're just going to be dismissed as failures, and the effort they've been putting in will go largely unrecognised. At its simplest level, it's just identifying students spending double the mean time in the VLE.
We don't have to wait for exam scores; we can use this after two or three weeks. So yes, we may have discovered a relatively simple correlation. The thing is that this is a relatively small number of students, but it's not an insignificant number: somewhere between 5% and 10% of the cohort fall into this category. I rather rattled through that because the last time I did this we had a really good conversation afterwards, so I'm hoping there are some questions.

[Chair] Very interesting, particularly from the data analytics perspective, quantifying the relationship between engagement and attainment. So, are there any questions?

[Reply to an inaudible question] No. If we'd done this study five or six years ago, the basic correlation probably wouldn't have existed in most cases. I think it was heading this way anyway, but the pandemic has pushed it on: most academics are now aware of the VLE and how to use it a bit more effectively than they were before the pandemic. For a lot of academics it's no longer just a repository; it's used in a much more deliberate way.

[Reply to an inaudible question] No, and to be honest, I don't know. I do know that our university, like most others, has struggled with physical attendance. But that was the purpose of doing this: I didn't want to know anything about anything else. I just wanted to know, given just some raw data, is there anything I can say about this group of students? How can I identify them early? That's why I found it interesting to cover a number of disciplines. I didn't even look at the VLE pages for the modules I used, so I have literally no idea what the learning design was in them or how many content items there were.
I know there were some videos only because I went to check Panopto to see if it showed the same correlation.

[Reply to an inaudible question] Well, exactly. It's a blunt instrument; it absolutely is. You can't tell if they go into a page and then just leave it open and go and make a cup of coffee; you absolutely can't. And obviously they could be downloading the content and then switching off; they could absolutely be doing that. So no, it is absolutely a blunt instrument, but it does measure the number of seconds they spend with that page open and with it as the primary window. As soon as they click out of that window, the timing stops. But the thing is, as I said before, once you get up to double, three or four times the average, the odd cup of coffee is not going to account for it, so there's still something to be said there.

[Question, partly inaudible, about what else can be read from the access data] Yeah, interesting, really interesting. That's another thing to plug in. Yes, we can find that out. For the purposes of this I didn't, but we absolutely can, and that's something else to consider when we're looking at plugging in demographics, you know, which route they came to university by and all that sort of stuff.

[Chair] Okay, I just have one question. You were talking at the start about splitting engagement into behavioural, emotional and cognitive. I wonder what your thoughts are on how you might bring the emotional and cognitive back into this.

Well, certainly some of the more complex models do involve the cognitive. But at my institution I just don't have the resource. I'd love to be able to do it. I'd love to be able to look at how they're engaging with specific items. Are there specific items that this group picks up on that the others don't? What are the really successful students in this area doing? How are they engaging with it?
Yeah, there are so many great questions. But with really limited resource, I have to use a relatively blunt instrument.

[Chair] Okay, so let us thank our speaker. Thank you very much.