I'm John Whitmer with Blackboard, often called Dr. John in the halls of Blackboard. I do the large-scale analytics work and manage our data science team. I've been at Blackboard a little over three years; before that I was an active part of the Moodle community and worked quite a bit with many of you, and with Martin and Tom as well, so it's nice to be back in this community and I'm happy to be here. I'm going to speak exactly to the point about people using the LMS as a repository today, and how frequently that is the case. I won't make you say this out loud, but think in your mind and come up with a number: what percentage of courses actually don't use the higher-level functionality within the LMS? What percentage of courses are using the LMS as a glorified file repository? I'll have some numbers from large-scale research, so you can think about it for a little bit. My colleague John Fritz from the University of Maryland, Baltimore County and I actually wrote up some of the results that I'll share today in an EDUCAUSE Review article, thinking about how we can use learning analytics for course design. I've always worked in large-scale distributed educational technology, primarily in higher ed, and I've always thought: we're designing these systems, we're putting so much time, so much money, so many resources into them, and we know they're recording the transactions and interactions. Wouldn't it be nice if we could wrangle that data into something useful and meaningful, so we had a sense of what people did and how much those efforts, in a sense, paid off, but also to give us insights into what we could do better? That's really what's driving me at the end of the day: knowing how much work goes in and how much we believe in what we do, but also knowing there's so much we can learn from actually studying human behavior.
Also, as I know, you can get a couple of people to fill out a survey, right? Do those survey responses represent everybody? No, they don't. They represent the people who are the most engaged, and the big complainers. And I'm advancing slides without meaning to, which is not a good thing. In thinking about this, I have to credit the idea for this slide to Richard Burroughs, who's one of the sales executives in the UK. Thinking about learning analytics for course design, there are two ways to approach health, right? You can wait for the heart attack to happen, the approach on the left: those are the paddles they put on your chest after you've had a heart attack. (Is somebody doing that to my slides? This is going to keep on happening.) So you can either wait until you have a health problem, put somebody up on the table and take the paddles out, or you can have a healthy lifestyle. Two ways. And learning analytics for course design follows the latter: analyzing courses, thinking about the materials that are there. (And the slides are still advancing on their own, so you're going to get a preview of what's coming; it will be like you can see the future the whole time.) Most learning analytics, I think, is really the former: waiting until a student has a problem, identifying the students at risk, and acting before it's too late, but by then they're already at risk. What if we could prevent students from getting into risk in the first place, by identifying course design elements and behaviors we can improve, and raising all the boats? That's what I think we can do with learning analytics for instructional design and course design.
So I think, for those of you in the audience who are developing learning analytics or developing your systems, this is one of the greatest insights we have to offer. When I came to Blackboard three years ago, one of the things we wanted to do (and this slide deck is going to keep advancing on its own, which is going to be super annoying; you can watch John on the stage get annoyed over the next 10 minutes, I don't know if there's any way to stop it, but you'd better watch out, Tom's next and he's going to throw the laptop) was look at the relationship between student use of the LMS and their grade. The fundamental assumption behind many predictive analytics is that we can treat a student's frequency of LMS use as a proxy for effort, something that can help us predict what the student's grade will be at the end. It kind of makes sense: time on task, right? If students don't study, they probably won't do well. So we looked at this with the data we had, at very large scale: 1.2 million students, 35,000 courses, 788 institutions, and this is a scatter plot of the results. You can see there is a line; it looks like there's a significant effect. But look at the scale at the bottom: it covers a very large duration. There was a statistically significant relationship, but it explained less than a 1% effect. It worked out to something like a student spending an extra hour in a class leading to a 0.05% improvement in their final grade. And I'm not missing the decimal points there: one hour, 0.05%. It was only significant because of the scale. So we began teasing that out, thinking intuitively there must be some better result than that, some stronger relationship. And we looked at it at the course level, because we knew many courses, we thought, were not making effective use of the LMS.
They were using it, again, as a glorified repository. So we took the same results and relativized them to each class: maybe some classes have a stronger relationship and others a weaker one. It turns out that is the case. This is a distribution of the effect size of that relationship, looking again at the large scale of 35,000 courses. We found that in only 22% of those classes was there a relationship at all. And within those, this curve shows the effect size, the strength of the relationship between student use and the final grade. It was widely distributed: it went all the way up to explaining the entirety of the grade, but it hovered around 23 to 24%. Which leads us again to a question, and that's the great thing about research: you always come up with more questions than you actually answer, at least in my case. Maybe I'm thinking about it too much. But why? What makes for a stronger or weaker relationship? One of the things we looked at, and that I'll present today, is how courses are designed. The way we looked at course design was the amount of time students spend using different tools. Not what the instructor set up or intended for people to do, although that's interesting, but the student experience: what do students actually spend their time doing in classes? In my research I had found a number of ways people have talked about this, with really nice flow charts, but I hadn't been able to find large-scale empirical research on it. If anybody else knows of any, please let me know; I'd love to see something else to help validate, compare, or prove this wrong.
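To make the two analyses above concrete, here is a hypothetical sketch using simulated numbers (not the study's data; the distributions, slopes, and course counts are all assumptions for illustration): a pooled use-versus-grade fit, where even a weak relationship is highly significant at scale, and per-course fits, where only a minority of courses show a relationship at all.

```python
# Simulated illustration (NOT the study's data) of the two analyses:
# a pooled use-vs-grade fit, and the same fit relativized to each course.
import numpy as np
import pandas as pd
from scipy.stats import linregress

rng = np.random.default_rng(0)
frames = []
for course in range(400):                       # stand-in for 35,000 courses
    # Most simulated courses have no real effect; some have a positive slope
    slope = 0.0 if rng.random() < 0.78 else abs(rng.normal(0.5, 0.3))
    hours = rng.exponential(20.0, size=50)      # hypothetical LMS hours
    grade = 70 + slope * hours + rng.normal(0, 12, 50)
    frames.append(pd.DataFrame({"course": course, "hours": hours, "grade": grade}))
data = pd.concat(frames, ignore_index=True)

# Pooled fit: tiny p-value at this scale, even though the effect is weak
pooled = linregress(data["hours"], data["grade"])
print(f"pooled: p={pooled.pvalue:.1e}, R^2={pooled.rvalue ** 2:.3f}")

# Per-course fits: how many courses show a relationship, and how strong?
def fit_one(g):
    f = linregress(g["hours"], g["grade"])
    return pd.Series({"r2": f.rvalue ** 2, "p": f.pvalue})

per_course = data.groupby("course").apply(fit_one)
sig = per_course[per_course["p"] < 0.05]
print(f"{len(sig) / len(per_course):.0%} of courses significant; "
      f"median R^2 among them: {sig['r2'].median():.2f}")
```

The printed numbers come from the simulation and only demonstrate the mechanics of the two analyses, not the study's actual figures.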
I don't think they'll get more data than I do, but anyway, that's a different story. The point is, this hadn't been looked at empirically before. So we took the sample of 35,000 courses and looked at the tools used and the relative amount of time students spent in each tool, and this is what we came up with. Recall the number you thought of. We found that 53% of the courses (you can count up the totals here) were largely used as glorified file repositories, barely even using the announcement tools: almost all of the student time, and I'll show you a breakdown in a minute, was spent on content. The next category down we call complementary: those were file repositories plus quite a bit of communication from the instructor to the students, but nothing more than that. Together, that's about three quarters of the classes. From there, and this is where it gets interesting, courses bifurcated into two patterns. They were either social, meaning they made extensive use of discussion forums, so a lot of interaction there, or they made heavy use of assessments. One of the two, and those tools overtook content in the amount of student time spent in the course. And then finally, Nirvana maybe, approaching the highest level, we get what we call holistic (I'm from California, right? That's what we call things out there). The holistic courses, the most intensive uses of the LMS, really made large use of a multiplicity of tools. Just to note: we could not discriminate which courses were online and which were hybrid.
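The archetype classification described above can be sketched as a clustering problem over each course's mix of tool time. This is a hypothetical illustration only: the tool names, the simulated minutes, and the choice of five clusters (to mirror the five archetypes) are my assumptions, not the study's actual method or data.

```python
# Hypothetical sketch: cluster courses by the *proportion* of student
# time spent in each tool (the mix, not the total), as in the archetypes.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
tools = ["content", "announcements", "discussions", "assessments", "gradebook"]

# Simulated per-course student minutes in each tool (an assumption)
minutes = pd.DataFrame(rng.gamma(2.0, 50.0, size=(300, len(tools))), columns=tools)

# Normalize rows to proportions so courses are compared by mix of use
proportions = minutes.div(minutes.sum(axis=1), axis=0)

# Five clusters to mirror the five archetypes found in the study
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(proportions)
proportions["archetype"] = km.labels_

# Each cluster's mean mix suggests its label: content-heavy looks
# "supplemental", discussion-heavy "social", assessment-heavy
# "evaluative", an even spread "holistic", and so on.
print(proportions.groupby("archetype").mean().round(2))
```

Clustering on proportions rather than raw minutes is what separates "how a course is used" from "how much it is used", which is exactly the distinction drawn in the talk.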
There's definitely a little bit of noise in the sample, because some sites could have been used for non-instructional purposes, but we did some pre-filtering, so I think we largely have real, live courses. So, the interesting thing to me: we looked at the proportion of time spent, not the total amount of time. I'll quickly show this density plot of the time spent in each of those course types. What it shows is that when a course is largely content items, guess what? Students don't spend a lot of time in the LMS. That isn't rocket science or a huge surprise, but it's a validation that it happened. The next level up, the evaluative, I'm sorry, the complementary, was the next size up: a little more time. Then you see a wider hump with a larger variation in use; that's actually two overlapping curves, the discussion forums plus the assessments, and they overlap at a higher level of use. And the holistic classes have the most time. So the most complex courses, with the largest diversity of use, have the broadest and largest overall time. What I have from here is a breakdown of each of these types. Elizabeth's going to yank me off the stage pretty quickly, so I'll go fast. You can see supplemental: 58% of the time in course content, average student time 15 hours, 35 students on average. Go up to complementary: a little more, more tools, 25 hours average time, and much more interaction, at around the same class size. Evaluative: you can see assessments overtake in the amount of time students spend.
And there's more time spent there as well, the average amount of time as well as the total. And then this is one of the other interesting things: here is social, the discussion forums. Notice the class size went down. These classes making intensive use of discussion forums by and large have smaller sizes; facilitating discussion forums with large classes is quite complicated and difficult, and it's one of the things we've been focused on. And again, a little more time here. And then the California Nirvana, holistic: much more time spent across a greater diversity of use. One last thing I'll mention: across these types, what we saw is that the tools were additive. Courses didn't take away or stop using the LMS as a file repository for content; instead they added functions on top. So if you're looking at growing LMS adoption, the recommendation would be to start thinking about discussion forums or tests; those are the two main ways people are currently building up from the repository level. Whether that's recommended practice or not, I can't tell you; that's your call. But I think that is the way it's happening in practice. A couple of implications: I think getting beyond LMS use as a proxy for effort is really important. The work that Moodle is doing with Project Inspire, thinking about the community of inquiry framework and other ways to be a bit more sophisticated in our thinking, is really important, especially given that these relationships at scale are rather small. And there I am, and I made it with 15 seconds to spare, and I didn't throw the laptop, and it stopped advancing. A couple of minutes for questions. Any questions for John about this fabulous research? The question was, did the data come from Blackboard? No, that was his presentation time; he made it inside. What are the questions you're looking at lately?
Well, student open rates on learning analytics notifications. We have rules-based notifications in the Ultra experience of Learn that go out based largely on grade and activity. You can run these flags and send them to students, but the question is: do they care? And of course, if they care, you want to know whether it matters. We've been looking at the open rates on those, and they've been very high, and they've also been very patterned. We're finding that student responses and student interaction stay largely the same over time. One of the fun things we've found is that students in the top category not only stay in the top category, they get the same notification every time and they keep on opening it: ooh, look, I'm doing really well; hey, look, I'm still doing really well. It happens with students in the low categories as well, the ones who are not doing well: they do open those, but less frequently, probably as a proxy for overall activity. So we disaggregated: we clustered students by the kind of notification they were getting, as a proxy for student achievement. That's one of the biggest things we're looking at. One more question? The question was how early into a class you can make predictions. That depends, of course. What it depends on is whether you have LMS-only data. We've done models with both student information system data plus LMS data, and with LMS-only data. What we found doing modeling for X-Ray on Moodle data is that it's usually about week three; week two to week three, if you have intense use, tends to be the sweet spot. There are a couple of earlier indicators, like the lag time until first access of the LMS; that tends to be a pretty good one. I think you could do rules-based flags and reach those students immediately.
But what we found in the research with X-Ray, and in previous work I've done with San Diego State University, was that between week two and week three was the magic number with LMS-only data. And that's to get accuracy rates of 80 to 90%. If you're okay with 70%, better than a coin flip (I don't feel very good about that; it's a pretty high error rate), you can hit it at week one. Also, if you have SIS data, even SIS-only data, and that's what we do with Blackboard Predict, you can predict as of week zero, but your accuracy is in that 70% range. I prefer to look at behaviors; that's me.
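The week-cutoff tradeoff described in this answer can be illustrated with a toy model. Everything here is simulated and assumed (the activity distributions, the 30% risk rate, logistic regression as the classifier); this is not the X-Ray or Predict model, only a sketch of why accuracy tends to climb as more weeks of LMS data accumulate.

```python
# Toy illustration (NOT the X-Ray or Predict models): train a simple
# risk classifier on activity observed up to week k and watch accuracy
# improve as more weeks of LMS data become available.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_students, n_weeks = 2000, 10
at_risk = rng.random(n_students) < 0.3          # simulated risk labels

# Weekly activity counts: at-risk students start lower and trail off
base = np.where(at_risk, 2.0, 5.0)[:, None]
trend = np.where(at_risk, -0.15, 0.0)[:, None] * np.arange(n_weeks)
activity = rng.poisson(np.clip(base + trend, 0.1, None))

accs = {}
for week in (1, 2, 3, 5):
    X = activity[:, :week]                      # only data known by week k
    model = LogisticRegression(max_iter=1000)
    accs[week] = cross_val_score(model, X, at_risk, cv=5).mean()
    print(f"week {week}: cross-validated accuracy {accs[week]:.2f}")
```

The accuracies printed come from simulated data and only demonstrate the shape of the tradeoff, not the 70% or 80-90% figures from the real models.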