Until recently I was working, as I just explained a couple of minutes ago, at an adult-serving college and university, and a lot of what I have been working on over the last couple of years is trying to improve the learning experience by means of data: collecting data and providing it to people in time for them to take action to improve a learning process. I got very excited about that, decided to do it as a dissertation topic, and started presenting at MoodleMoots. Then Martin asked me if I'd like to do it full time, and I said, is this a trick question? So that's what I'm doing now, and I am pretty excited about it.

We would like you to get excited about a project we are launching to build stronger learning analytics into Moodle core, and our approach is different from what you've probably seen from other vendors. That's what I'm going to talk with you about today. Most of the learning analytics I've seen, most of the utilities and services that are available, come from the analytics side of learning analytics. They are very often based on business analytics or web analytics tools, and they carry over certain assumptions and metaphors into the learning environment that don't necessarily apply there as well as they did in the original context they came from. Learning analytics should be about learning first.

That raises a question: what is learning? I'd like to make you a little bit familiar with the work of Michael Schiro, who breaks down curriculum theory, educational theory, into some very common themes or strands. Now, if you were to look through your own curriculum and think about this, you would not find that you align with only one of these four categories. Nearly everybody I've met shifts around between them. You can be more aligned with one on a different day, let alone between different programs at your institution.
But it's very helpful to be consciously aware of what your assumptions are about learning. So, in these four models (you don't have to memorize these terms; the terms are not important, and while I'm going to use them in this presentation, we may find something better): we very commonly think about higher education, for example, as following the academic scholar model. The purpose is to develop scholarship, to develop scholars, to identify those who are most suited to continue on in scholarly research and to produce new knowledge. On the other hand, the social efficiency model is intended to prepare learners for something in the real world, for a job usually, or to be constructive, productive citizens, and it wants to do this in the most efficient, most effective way possible. Many of our K-12 schools will at least claim to adhere to a more learner-centered model, where what they are interested in doing is developing the individual strengths of learners, of children, to their full potential. And they may also have claims to social reconstruction, where the purpose of education and learning is to make the world a better place by enabling people to go out, improve things, and take action.

These are four general purposes for education, four general intents of learning. Your institution may adhere to all of these, or only one, or none, though I'd like to know what you're doing if that's the case. Generally speaking, think about your own institution and the programs that you offer, whatever kind of institution you are. First, how many corporate learning or training environments have we got? Please raise your hands if you would say that you offer at least some of those. Okay. K-12? Okay, not so many. But higher ed? Okay, that's most of who we have here, but Moodle is used in all of those environments and more. Community education, do we have anybody here doing, like, volunteer training? Yeah, that's another environment.
And that is definitely social reconstruction, right? Enabling people to go out and change something about the world. So this changes what we consider our known good. If we're going to talk about analytics where we're trying to improve learning, that means we're trying to get more of the learning experience to become a known good, whatever that is.

If you are in higher education, probably, you are expected to provide scholarly advancement, also known as retention, course completion, graduation, that sort of thing. And you want some percentage of your students to go on to the next level of academic pursuit, whatever that is: if you are in K-12, you want your students to go to college or university; if you're in university, you want them to go on to grad school; if you're in grad school, you want them to become full professors, tenured somewhere, producing knowledge. If you are in a social efficiency model, you are trying to help your learners get jobs, usually. You're trying to equip them with certain skills that you or your patrons have decided are critical for them to have, and these are measurable skills, and they're going to go out and be productive. If you are in a true learner-centered environment, you are trying to help your learners meet their own goals, and they are really the only ones who will know when they have done that. And if you are doing social reconstruction, you are trying to help your learners have increased influence in their community and be full participants in the communities in which they live.

How does this change what we're measuring in analytics? This is the important thing, and I have not spoken with any company that is taking it into account.
If you are following an academic scholar kind of model, you may need to show that your learners have spent a certain amount of time learning something, because more time spent means deeper reflection, and everyone can benefit from spending more time with an important subject. That could very well be true. If you are in a social efficiency model, less time is more efficient. Your purpose is to help someone gain skills in the most efficient way possible. You don't want them to spend a lot of time; you want to maximize the efficiency of your program and help your learners become capable and productive in the minimum amount of time possible. These two goals are in direct conflict with each other, and yet a great many of our institutions of higher education are trying to pursue both of them simultaneously. Does anybody here think they may have this conflict, this sort of dichotomy, within their own institution? Yeah, a couple of people are thinking about it, anyway. If you are in a learner-centered environment, only the learner knows if they've spent the right amount of time on whatever it is they're learning. And if you are in a social reconstruction environment, you don't really care how much time they spend learning it; you care whether they're then able to spend their time participating in their community of practice.

So how are we going to develop analytics that aim towards these different intents, these different purposes? I think we can do it, and I think it's very important that we do. I think it's very important that we start by asking: what is the kind of learning that your institution intends to provide, and how do we measure that kind of learning? What are the indicators that will help us see that that kind of learning is happening, and what kinds of notifications and supports can we provide to help support it? All right.
So there are a lot of different ways we can try to do this, and of course we're going to try to use machine learning principles, and we will generally be trying to make predictions based on previously collected data. The IT giants, your Google, Amazon, Facebook, et cetera, make a lot of use of this: telling what you're going to do next by what you've done before, and by what all the other people before you have done, and making predictions based on that. That will probably be helpful in cases where a large number of people follow a very similar pattern of activity through the same content. How many people here reuse exactly the same course, just resetting it and using groups in it over and over again, year over year, with maybe some edits? Okay. You may be able to make use of this. Do you have at least 100 students per semester, per term, going through your course? You may be able to develop enough volume of data to make good use of this. Is that all of your courses, or only some of them? Only some of them. What about your other courses? You want your learning analytics to help with those too, right? So again, these are some of the caveats of the machine learning model. We do want to make use of it, but we also know that we need to take a more abstract approach to be able to deliver on the promise of analytics for real learning, whatever your definition of real learning is.

Data is not information. This is an example of the Moodle log. You cannot just throw this into a machine learning engine and expect goodness to happen. It doesn't work that way. I suppose if it did, I wouldn't be able to get my dissertation, because it would be too easy, right? So what we need to do is identify some indicators that we think are worth tracking, and do analysis to say what those indicators predict in terms of the different learning goals we looked at earlier.
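To make the "data is not information" point concrete, here is a minimal sketch of the kind of aggregation step that turns raw log rows into per-learner indicators. The log format, event names, and indicator names are invented for illustration; real Moodle logs have a much richer schema.

```python
from collections import Counter, defaultdict

# Hypothetical, simplified log rows: (user, event_name, timestamp).
LOG = [
    ("alice", "course_module_viewed", 100),
    ("alice", "assessable_submitted", 160),
    ("bob",   "course_module_viewed", 110),
    ("alice", "discussion_post_created", 200),
    ("bob",   "course_module_viewed", 150),
]

# Map raw event names onto the indicator each one contributes to.
EVENT_TO_INDICATOR = {
    "course_module_viewed": "views",
    "assessable_submitted": "submissions",
    "discussion_post_created": "social_posts",
}

def indicators(log):
    """Collapse raw log events into per-user indicator counts."""
    per_user = defaultdict(Counter)
    for user, event, _timestamp in log:
        indicator = EVENT_TO_INDICATOR.get(event)
        if indicator is not None:
            per_user[user][indicator] += 1
    return {user: dict(counts) for user, counts in per_user.items()}

print(indicators(LOG))
# {'alice': {'views': 1, 'submissions': 1, 'social_posts': 1}, 'bob': {'views': 2}}
```

The interesting research question is not this mechanical step but which indicators are worth counting in the first place, and what they predict under each of the four learning intents.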
So this is the community of inquiry model, a fairly well-studied model that can apply to a couple of these different learning theories. It's a good place to start. It consists of cognitive presence, social presence, and teaching presence, and the ways in which they combine to create a good educational experience. This has been well validated in research, but only in an after-the-fact kind of manner: it's been validated by doing surveys with learners and instructors after a course is complete, and by doing discourse analysis of forum posts and other course content after the course is over. Generally speaking, when we talk about learning analytics, we want to know what's happening as it happens, or even before it happens, right? That's why we're doing this. So while that approach is good for validating a theory, it's not so good for learning analytics. What we need to do is find indicators that match these theoretical constructs, which have been pretty well established, and detect them as they happen in Moodle.

So we think, based on the data we already have, that teaching presence, which is to say the actions the teacher takes, including the design of the course before the start of the term, the facilitation of the learners, and direct instruction, perhaps in the form of feedback to the learners, will influence (though it is not the only influence on) the learner's sense of social presence: their feeling of connectedness with the other learners, their ability to work with other learners to form ideas. And ultimately that leads to greater cognitive presence, which is achieving some depth of understanding of the subject, depending on the criteria you're using in your learning model.

Here's an idea that I started to put together, where I set out to make a generalized flow chart of all the different activities of Moodle. This flow chart applies to every activity that Moodle provides. So we start at the top.
You see the activity: the first view of the activity content. Maybe you can submit something, or you can review some peer content. You have an attempt and then a submission. Then you receive a grade or some feedback from an instructor or a peer. You can view that feedback, so there could be a time gap between when you receive the feedback and when you view it. Or you could be viewing peer submissions, like in the forum. Then you could be providing a grade or feedback to your peers, as in forum or workshop, if you have student ratings turned on in forum. And then maybe you could even be revising or submitting multiple entries and going around through the cycle again. The further down you get in this flow, the more depth you're likely to be achieving in a cognitive sense. So my hypothesis is that this is what we're trying to encourage. If the activities you've used in Moodle don't support this level of depth, then your students can't reach that level of depth, at least not in a sense that Moodle can understand. So when we look at instructional design, we'll be looking at whether the course provides opportunities for this level of cognitive depth, and when we look at learner actions, we'll be looking at whether learners achieved the potential that the course offers.

We also look at social breadth. Here's an example of a sociogram, from work by Dietrichson in 2013: very interesting stuff about how people connect with each other in a discussion and how that influences their achievement in the course. These are probably going to be factors worth looking at as well. So these are the cognitive depth and the social presence, the social breadth, that we think will probably be worth tracking. We think that, from a course point of view, from the instructional design point of view, courses that offer regular opportunities for volleys of extended discourse and regular opportunities for deep cognitive engagement are probably going to promote better learning.
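The flow chart described above can be sketched as an ordered ladder of stages, with a learner's cognitive depth in an activity read off as the deepest stage reached. This is a hypothetical sketch of that hypothesis; the stage names are my own labels for the flow chart, not Moodle event or API names.

```python
# The talk's generalized activity flow as an ordered "depth ladder".
DEPTH_LADDER = [
    "viewed_content",      # first view of the activity content
    "attempted",           # an attempt at the activity
    "submitted",           # a submission
    "received_feedback",   # grade or feedback from instructor or peer
    "viewed_feedback",     # actually opened that feedback
    "gave_peer_feedback",  # rated or responded to peers (forum, workshop)
    "revised",             # resubmitted and went around the cycle again
]

def cognitive_depth(stages_reached):
    """Deepest stage a learner reached, as an index into the ladder (-1 if none)."""
    deepest = -1
    for stage in stages_reached:
        if stage in DEPTH_LADDER:
            deepest = max(deepest, DEPTH_LADDER.index(stage))
    return deepest

# A learner who viewed the content, submitted, and read the feedback:
print(cognitive_depth(["viewed_content", "submitted", "viewed_feedback"]))  # 4
```

The same ladder captures the instructional-design side of the hypothesis: the deepest stage an activity even offers is a ceiling on the depth any learner can reach in it, at least as far as Moodle can observe.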
We think that instructors who provide regular, prompt, detailed feedback, both publicly and privately, are probably going to promote better learning. And we think that students who actively and regularly participate, who are not just lurking, and who engage with a wide variety of content and co-participants, are probably going to have a better learning experience. That's our starting premise. But I don't want to assume that it's actually correct. I based this premise on data I've seen from my institution and a few other somewhat similar institutions. Very few people here probably work at the kind of institution I came from: 70% online, so mostly fully online, mostly adult learners. Does anybody here have a learning population and a program like that? Okay, we're not unique, but we're not the majority. What we need is to check these indicators against your data.

So here is what we propose to do. We admit one size does not fit all. We want to know what your curriculum priorities are, which of those four categories, and I have a little questionnaire you can answer that will tell you the extent to which you prioritize those four different kinds of learning. Are you offering fully online, hybrid, or face-to-face courses? That will affect what kind of data is available in Moodle for us to look at. If it doesn't happen in Moodle, we probably can't offer you any predictions about it, though maybe we could develop more ways to track what's happening in the classroom in Moodle. Do you use fixed terms or rolling enrollment? How many people here have fixed terms in the content that they offer, so the course has a start date and an end date? Hands up, please. Okay, and how many people have a rolling enrollment where students might start and stop anytime and are sort of self-paced? Okay. I think, I really rather strongly suspect, that the analytics these two groups need are going to be somewhat different.
Does anybody disagree with that? All right. Do you reset your course and use it over again, or do you import the content into a new course, a course copy? This is going to matter. If you are resetting one course, you build up a history in that course. If you create a new shell, import the content into that shell, and make edits, how do we connect that new course to the history built up in the previous course? Do we do it by activity? Do we have some kind of activity history, a provenance that goes back through all the courses it was copied forward from? Maybe. That's complicated, but it might be possible to do. That's one of the things we need to look at.

In a particular term, when you have 1,000 students and 10 different instructors, do they all go into one Moodle course with groups, or do you have 10 different Moodle shells, one for each instructor? How many people would put them all into one course with different groups? Okay, I know that at least some places do that. How many people would create different Moodle courses and put the students and instructors separately into those? Okay, not very many people put their hands up. How many people would do something else that I haven't thought of? A little bit of both? Okay, how many people do both? Okay, how many people are tired because it's after lunch? Okay, well, turn to the person next to you, hold up your hands, do a high ten, and say, wake up, now. All right, let's try that again. If you have a very large number of students and a large number of instructors, and some students go with some instructors, do you put them all in one giant course with groups or something, or do you make multiple separate Moodle courses, Moodle shells, one for each group of instructor and students? How many people would do the first one, one big course? Okay. How many people would make separate courses, one for each?
Okay, that's a little bit better audience participation, thank you. This matters. And then, within that course, how are we going to know what learner engagement looks like? Do you use completion tracking? If you do, we can probably make use of that. Do you have a grade book configured with weights? Then we probably know how much weight to give to each of the activities you have your students completing. Do you have competencies tied to things? Maybe competency achievement is all you care about; you're not even doing grades. We want to design a system that gives you that kind of flexibility.

So, this is about open research. We do not want to make some secret black-box product that you will not understand while we just assure you that it's all good. We want to know what you've got going on at your institution. What we are preparing is Project Inspire: initially a plugin, but eventually to be included in Moodle core. At the plugin stage, we are asking all of our partners to help us make this available to you. What this plugin will do is give you some initial reports and notifications that are a baseline of what we think will be helpful to most everybody. Then we will collect data and thoroughly anonymize it, and we will have a white paper you can present to your administration to explain the extent to which we are going to protect user identities and institution identities. We will fold all that data into a larger database that we can use to test these indicators and these outcomes. Then we will iterate and provide you with an updated plugin that is better than the first one, and we'll keep this up until you're all happy. Everybody can participate. What this also means is that individual instructors who want to try something new will be able to compare course performance against their previous performance.
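As one sketch of how completion tracking and gradebook weights could combine into a single engagement measure: the activity names and weights below are invented for illustration and this is not Project Inspire's actual formula, just the shape of the idea.

```python
# Hypothetical sketch: combine completion tracking with gradebook weights
# into a single engagement score in [0, 1]. Activity names and weights are
# made up for illustration, not taken from any real course or product.
ACTIVITY_WEIGHTS = {"quiz_1": 0.25, "forum_1": 0.25, "assignment_1": 0.5}

def engagement_score(completed, weights):
    """Weighted share of the course's tracked activities the learner completed."""
    total = sum(weights.values())
    done = sum(w for name, w in weights.items() if name in completed)
    return done / total if total else 0.0

print(engagement_score({"quiz_1", "assignment_1"}, ACTIVITY_WEIGHTS))  # 0.75
```

If a course tracks competencies instead of grades, the same shape works with competency achievements in place of completions, which is the kind of flexibility described above.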
And students who want to know which kinds of courses, which particular formats of online courses, they learn most effectively in, or what effect their study habits have on their course performance, will be able to check that for themselves. We want to open this up. No more hidden black boxes. I'm a little bit over time, but I think I did pretty well considering how many slides I had. Can I take questions for just a couple of minutes and then move on to the next person? I will be available all afternoon; please feel free to catch me. I'm happy to discuss this with anybody who has questions or is interested. Please contact me at Elizabeth at Moodle.com if you are interested in participating, if you need more information about what we're doing, or if you want to tell me I'm crazy and Martin should fire me. Please don't send that email to Martin; send it to me, and we'll talk. Thank you very much for your time.