Yes, I'm giving Elizabeth the panoramic view of this beautiful, lovely audience. Thank you so much for coming. Wave to Elizabeth, please. Fantastic. So, we're going to get started. Elizabeth Dalton will be walking us through this session; many of you know Elizabeth from her many years of wonderful contributions to the Moodle community. And I'll let her take it from here. Over to you, Elizabeth. Good afternoon, everyone. I hope you're all enjoying being at the Moot. I wish I could be there physically with you, but I'm really happy to be partnering with Bob to join you virtually, as it were. How's the Moot been so far? That was an overwhelming, deafening roar of approval. That's great. Okay, so we've got about an hour and a half before they come and kick us out for your closing ceremonies, so let's make use of this time. We have four slide decks that we're going to go through: the first two are presentations, but the second two are interactive and hopefully very useful to you as we get into learning analytics. Can we get a sense from the room of how many people have never heard of learning analytics before? This is a new thing, and you just hopped in because you wanted to find out what it was about? About 5% have their hands up. Okay, and how many people feel like they have lots of experience with learning analytics, and they're here to get some deep details or find out what's specific about learning analytics with Moodle? We'll go with 4.5%. Okay, so am I right in saying that most people know what learning analytics is, but want to know more about how they can use it in Moodle and make it really work for them? 99.9%. All right, excellent. So what we're going to do is I am going to go to the next slide set. Okay, you should see "Moodle Learning Analytics Overview." Yes. Excellent. I want to first point out that all of the slides that I'm using today are available online on our Learning Analytics Working Group page.
How many people have been to that page so far, ever? About a third. Excellent. I'm really happy. So that is a group and a page that we have been trying to ramp up, and there's going to be a lot more happening there now, because, as I'm about to show you, with Moodle 3.7 we have really taken off with the learning analytics functionality, and we expect this to be a very useful and dynamic working group. We have online workshops for you, which build on and go much beyond what we're able to do in an hour and a half today. And we also have a monthly conference call. How many people have ever participated in the conference call? Let me round up to 1%. Okay, so everybody is welcome. And if the time doesn't seem like a good time for you, although it's 2pm your time, you can vote for an alternative time, and I'm happy to run it more than once as long as I know there will be at least three people there. So check that out at moodle.org/analytics, and I hope to see you online and connect with you in the conferences in the future. In the meantime, these are the kinds of questions that we tend to want to answer with learning analytics. The big one: are any of my students at risk of failing? That's one of the most common questions we get asked, but some of these other things are also important. Are our online courses as rigorous as our face-to-face courses? Do our teachers need additional professional development? How can we know if our new initiative is working to help students? Learning analytics can help us answer some of these questions. It can inform academic research, and it can also inform action research that happens with individual teachers in individual courses. A lot of times when people think of learning analytics, they include reporting, and that is descriptive analytics.
It tells us what happened, but it takes a certain amount of experience to be able to interpret a report and come up with a prediction about what will happen next, or why it happened, or what we should do to improve. So with Moodle learning analytics, we are working on answering these other three questions, in particular what will happen next, and then eventually we hope to be offering some diagnostic information and prescriptive information, although we will need more information in Moodle to be able to make those kinds of analyses. As most of you know, there are several descriptive analytics tools already in Moodle, and there are a number of plugins that you can add to your Moodle site. Some of these could potentially be incorporated into predictive analytics, but these are really almost all descriptive still at this point. One thing: before we try to make predictions, or understand how well our courses are working or how our students are doing, we need to start by thinking about our definitions of education and what we're trying to do with our Moodle sites. We're going to get into this in detail in an exercise in part two, so give this some thought, and start thinking about some general ideas about what's important to you. We'll unpack this in detail in a little bit. In the third part of this workshop, we're going to actually step through how to construct a new learning analytics model, and there are some great tools I'm going to show you that will help you do that using Moodle, starting in 3.7. The Moodle learning analytics system has a set of components that make it very reusable. These are individual classes and components that can be put together in a number of different ways to make a learning analytics system that fits your site and your needs. And that's very important to us.
All of the learning analytics tools that we have are open and visible to you, despite the fact that I seem to have chosen to represent a model as a black box here. That's ironic, and I didn't do it on purpose. We don't do black-box models in learning analytics. We make everything visible so that you can see it, understand it, and be able to explain it to the people that you're working with. And we feel that that's very important. We want to point out that learning analytics looks at processes during the process of learning. It looks at lots of trace data and makes predictions in the moment, and we don't have to wait until between courses to be able to make these kinds of predictions. Some other systems on the market will make predictions about the quality of your course or the at-risk tendencies of your learners, but only between courses, because they can only make use of the grades at the end of the course and other data from your institution. Moodle learning analytics makes predictions and offers insights during a course, while it's happening. And this is a significant difference between the dedicated Moodle system and other systems that you might know about that use an external learning record store like Learning Locker. Using Moodle learning analytics, you can compare different models to one another, or different interventions to one another, by tweaking indicators and comparing the model statistics, and I'll show you a little bit more detail about this as we go on. We're hoping that this will help us improve research quality, because a lot of the learning analytics systems that are out there have actually never been tested or evaluated to see how much of a difference they're making to learners and teachers. And we have also made a real point of emphasizing data privacy and ownership of data in these systems as well. So we anonymize data when we collect data to make new models.
And we're trying to make sure that when we make predictions, you can see the details of why those predictions were made, answer questions about that, and get more information from participants, so that they can be part of this process and not have it just be something that's done to them. We think that with all the data that's available in a Moodle system, we have a certain obligation to act. The data is available, and if we don't use it to actually try to help our students, we're failing in our obligation to care. At the same time, we know that when we make predictions, they can become self-fulfilling prophecies, so we want to be very careful about how this happens. And we have some features that we are proposing for Moodle analytics, which won't be in 3.7 but I hope will be in 3.8, that will help us follow up on the outcomes of these predictions, so that we can see if we're making a difference, and if it's a positive difference. We would really hate to find out that making a prediction that a student is at risk causes that student not to be successful, so that's something that we're really trying to avoid. Most importantly, we're not just trying to predict what's going to happen; we're trying to change it. It's not enough to make a super accurate prediction about who's going to fail a course, or which course is not going to be effective, if that outcome still happens anyway. Our real purpose here is to be able to use this data in a constructive way to help improve the learning process. I think we can all agree with that. So that is the end of the quick overview. I just want to stop and see: does anybody have any questions about any of the stuff that I just presented? That was kind of a real fast overview of the concepts. Are there any questions? Okay. What was the question? Thank you. Thank you, Elizabeth. It might be a very obvious question, but what is the Pygmalion effect? What is the Pygmalion effect? Oh, the Pygmalion effect. Yes.
So the Pygmalion effect is named after the Greek myth, where you make somebody into what you see. If the system tells you that a student has lots of potential and should do well, you actually encourage them more. And if the system tells you a student is at risk or likely not to do well, a teacher may inadvertently cause that to happen by lowering expectations for that student. And that can also happen to students themselves: if they get a prediction that says "we see that you're struggling with the course," it can demotivate them. And we don't want that to happen. Does that answer the question? Yes. Yep. Thank you. Any other questions? No, I think we're good to move on. So next up, you should see "Enabling and Administering Models." We do. Excellent. Okay. So learning analytics has been available as a feature in Moodle since Moodle 3.4. There are some new features becoming available in 3.7 that we think you're going to find really exciting. At least I hope so; I'm very excited about them. And what I'm going to show you right now is an overview of how to set up Moodle analytics on your site, enable models, and train models on your data. To start with, when you install Moodle or upgrade to 3.4 or later, analytics becomes available on your site, but it does not automatically start generating predictions. To enable it, you need to go to the analytics settings. There are a couple of things just to check and make sure that they are set the way you want them to be. Generally, the defaults are good, but double-check them. We support two different machine learning backends, PHP and Python. The PHP backend uses the PHP-ML library, which we have added to in order to strengthen it a bit, and does not require any additional installation on your site. The Python backend is based on TensorFlow and does require some additional installation to get that TensorFlow library onto your site. It's a little bit more efficient and has some different features for predictions.
Our PHP backend currently supports logistic regression models, whereas the Python backend supports feedforward neural networks. And the API that supports these different backends is open, so if there are any developers in the room who are interested in extending this, we welcome that, and I have training materials for you online. Are there by any chance any developers in the room? It's a very modest group, but yes. Okay, great. Generally you'll leave analytics processes executing via command line only, and you can set the time limit per model, which will help throttle the performance load on your site. We also have some new data that we're collecting starting in 3.7, and the reason for this is that we strongly suspect, and I have evidence that supports this, that the way you're using your site will alter the way learning analytics performs on your site and which models can be used on your site. So we're starting to collect this information, and you can select more than one of these: what your institution's primary mode of instruction is; if your site offers blended or hybrid courses, what proportion of the work is conducted online in Moodle (if hardly any of the work is being conducted in or tracked by Moodle, it's very hard for us to make any predictions about it); and what type of institution, or what level of learning, you are conducting on your site. Having set that basic information up, you probably want to have some models configured and enabled. Starting in Moodle 3.7, you can create a new model in Moodle without having to program the whole thing in PHP. This is a big new feature. It was the number one feature that people were requesting of us.
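As a rough illustration of what the logistic-regression backend computes, here is a minimal sketch in Python (not Moodle's actual PHP-ML implementation): each indicator is calculated as a value, multiplied by a learned weight, and the weighted sum is passed through a sigmoid to yield a probability for the target outcome. The indicator names and numbers in the example are hypothetical.

```python
import math

def predict_at_risk(indicator_values, weights, bias, threshold=0.5):
    """Sketch of a binary logistic-regression prediction over indicators.

    indicator_values: calculated indicator values (e.g. in [-1, 1],
        where -1 means low activity and 1 means high activity)
    weights: one learned weight per indicator
    Returns (probability, prediction): prediction is 1 ("at risk") or 0.
    """
    z = bias + sum(w * x for w, x in zip(weights, indicator_values))
    probability = 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes z into (0, 1)
    return probability, 1 if probability >= threshold else 0

# Hypothetical example: three activity indicators (forum reads, write
# actions, quiz attempts), all low, with negative weights so that low
# activity pushes the "at risk" probability up.
prob, label = predict_at_risk([-0.8, -1.0, -0.5], [-0.9, -1.2, -0.4], bias=0.1)
```

With these made-up numbers the weighted sum is positive, so the sketch flags the student as at risk; with neutral indicators and zero weights the probability sits at exactly 0.5.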
When you go into the new model administration page, one of your options right off is "create model," and you can also import a model that has been created elsewhere; I'll show you some more details of that later. When you're creating a model, you can choose from any of the targets that are installed on your site, and additional new targets can be installed using plugins. Each of these targets (these ones are particularly about students, but they don't have to be) can be any kind of outcome that we want to predict, either because we want it to happen or because we don't want it to happen. So we don't want students at risk of dropping out; this could have been phrased the other way, as student likelihood to complete the course successfully. Either way. We could also have courses likely to promote engagement among students; that's something that we're looking at developing for the future. So any targets that you have are available here, and then, hidden somewhat behind this pull-down menu, you have the different indicators that you want to use in your model. You pick a time-splitting method, you pick the processor that you want to use, you save changes, and this becomes a new model on your site that you can enable and train. And if you think it turns out well, you can export it and share it with other users. The model administration page has been redesigned for 3.7, so instead of a huge list of indicators in the middle that kind of obscures other information, we show just the number of indicators. You can still click on this to see all the data. Then for each model, you'll see whether or not it's enabled and what time-splitting method it's using. You can choose to view insights here, and then there's an actions menu that gives you a number of different choices for each model.
Moodle will ship in 3.7 with three models: upcoming activities due; no teaching, which is a model that just indicates that there are no teachers or no students in a course a week before it's due to start, so probably no teaching is going to happen in that case; and our classic students at risk of dropping out, based on the Community of Inquiry theory. So those three exist, but I've added a few other models based on indicators that I have installed on the site. These are all models that you can create for yourself. If you have course completion conditions set and that's what you care about, you can build a model based on that; if you have competencies and that's what you're interested in tracking, you can build a model based on that. So we have a number of different choices here. Once a model has been created, you can go in and edit the indicators; these are all the pieces of information that will be used to generate the prediction, and some of them may not be applicable to your site, so you can remove them. And you can also add indicators that were not included in the original model, such as "any write action," which is actually quite a powerful indicator: if the student makes any write action, they are more likely to complete the course. And you can choose the time-splitting method and the default processor for that particular model. You can also enable or disable the model from here. You can't change the target: there is one target per model, and it almost is the identity of the model. But you can have the same target being used multiple times with different other criteria, so that you can compare models that way. Once a model has been created, it can be evaluated; you don't have to enable it first. For example here, the accuracy of the model is listed as 80.72%. That's pretty good accuracy, but in this particular case the evaluation results vary too much: the average accuracy is pretty high, but the variance is also a little bit too high.
So this is exceeding the recommended maximum standard deviation. The issue here is that there's not enough data on this site to generate a good model. You have a little bit of information here, and we're looking at including more information that will allow you to evaluate a model in more detail and determine how it's working and what you might do to make it more effective. You can also check to see if there's data on your site that your model is not able to use, because it's making predictions about students but there are no students in the course, or there's not enough course activity between the start and end dates. So it checks all of the different analyzable elements on your site. In this particular case, an analyzable element is a course, and it can't make predictions or train because of these reasons. It checks for "invalid to train a model" and "invalid to get predictions," and this just tells you if some information is being left out of your model training or your model predictions because of something that you've forgotten, like a course not having start and end dates set; that's one of the most common reasons. Once predictions have started to be made, you can access them from the model management page as a manager, or, if you are a teacher, you will start to get notifications for any model that your course is included in. If you're interested in trying this out and you don't want every teacher on your site to get these notifications, this is based on a capability. So you could remove that capability from the standard teacher and non-editing teacher roles, and then make a different role called, for example, "analytics pilot," and only assign that role to some teachers in some courses. That's a way to restrict this if you want to limit who is going to see these. But you get these different predictions, and you can click on them and see a list of all the predictions for a particular course.
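The evaluation check described above, where a high average accuracy is still rejected because the results vary too much between runs, amounts to something like this sketch. The deviation threshold and the sample accuracies are assumptions for illustration, not Moodle's actual settings.

```python
from statistics import mean, stdev

def evaluate_model(run_accuracies, max_deviation=0.05):
    """Summarize repeated evaluation runs and flag unstable models.

    run_accuracies: accuracy from each evaluation run, each in [0, 1].
    Returns (average accuracy, standard deviation, acceptable?).
    """
    avg = mean(run_accuracies)
    deviation = stdev(run_accuracies)
    # A high average is not enough: the runs must also agree with each
    # other, otherwise the model is unreliable on this site's data.
    return avg, deviation, deviation <= max_deviation

# Hypothetical runs: the average is about 0.81, but the spread is wide,
# so the model would be flagged as exceeding the maximum deviation.
avg, dev, ok = evaluate_model([0.93, 0.71, 0.88, 0.66, 0.85])
```

The usual cause of a result like this, as noted above, is simply not having enough historical data on the site for the model to converge consistently.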
And then you have some actions, including sending a message, viewing the outline report, viewing the prediction details, or "acknowledged" and "not useful." "Not useful" is your way to indicate that you think the model is making a mistake. At present we track this, but we don't use it yet to improve the models; that's something that we're hoping to add in an early future version of Moodle. "Acknowledged" is a way of saying, okay, I don't need to see this in my notifications anymore, but I acknowledge that this was a valid prediction. "View prediction details" lets you see, for a particular participant or a particular sample in a model, what the indicators are and what the calculated values are, and then again you can send a message, view the outline report, acknowledge, or mark the prediction as not useful. In Moodle 3.7 we have a new model shipping called "upcoming activities due," and this is another frequently requested feature. This just gives a notification to any participant on the site of upcoming events that are going to be due, and you can choose the time period that you want: whether this is going to be over the next fortnight, over the next week, or over the next three days. It's the same setting for the whole site at this point; we may be able to make that more granular in the future, but we wanted to get this out there. And unlike the insights from the other models, these events show up on the calendar page, so they become visible to everyone. This is the first model that we have that actually presents results to students rather than to teachers or site managers. So this just alerts people that items are coming due within the courses that they are enrolled in, just as a bit of a heads-up. When a model has been created, it can be exported, and you can export either the training data of the model or just the configuration.
If you export the configuration, you can include the weights of the trained model. I am often asked: suppose I train the model on my archive site or a backup site, and then I want that model to run on my production site. This is how you would do that: train the model on your backup site, export it including the weights of the trained model, and then import that to your production site. And, as I'll show you, you can test it to make sure that it's valid on your production data and go ahead and run it. We are also going to have a way to share these models and exports on moodle.org, so that people can create models, train them, and share the trained models. The difference between exporting the training data and exporting the configuration with weights is that the training data is actually one entry per sample. It's anonymized; it doesn't show any data that can be traced back to the original student, but it's a longer file that includes the data that you could use to retrain a model, or that you could report back to us so that we can combine data from lots of different sites and make more powerful models that way. The weights file is a much smaller file that just provides the information needed to make predictions. It's not the information to train a new model; it's just the weights that the model is using to make the predictions. And it is completely safe to share these two kinds of files, but especially the ones with the weights. There is no personalized information in this file, so this is something that you can share with other institutions, or with researchers at your own institution. There is no individual data, so it's not possible to trace this back to any particular individuals, only aggregated data. When you import a model, as I said, you can either evaluate a previously trained model on your current site, or you can evaluate a model that has not been trained and train it on your current site.
So you can take a model that someone else has shared and try it out on your site, with or without the existing training data. This is also new in 3.7. This is an example of what the model data looks like, just in case you were curious. The first four columns are there because this is a model that runs quarterly (the time-splitting method is quarterly), and they just tell you which quarter it was running in; then, for each of the indicators, it gives the calculated value, and then finally the prediction as a one or zero, because this is a binary logistic regression model. So there is one line per sample, but there is no sample identifier. There is no individually specific information in this file. All that's included in here are the indicator calculations and the final prediction. So this is again a very safe data set to share, and we encourage you to share these with moodle.org; we have a place for you to share them on that moodle.org/analytics site. The more of these data files we can collect, the more likely it is that we will be able to ship a pre-trained model that will work for more people. So if you can consider doing that, that would be great. And here is the information about sharing that. We have an online workshop that you can go to, and it will show you how to set up a test server. There's a sample course that you can create some simulated student activity with; I'm trying to get a sample course with that simulated activity already in there, I just have not had a chance to do it yet. Enable the model and allow it to train in your course, repeat some simulated student actions, and see your results. You can either earn the basic administrator badge if you upload test-site model data, or, to earn the Moodle Learning Analytics site administrator badge, upload production model data and earn my eternal gratitude.
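To make the exported training-data format just described more concrete, here is a sketch of reading such a file: time-splitting range columns first, then one column per indicator, then the binary target last, with one anonymized row per sample. The header names are hypothetical; only the column layout follows the description above.

```python
import csv
import io

# Hypothetical anonymized export: four time-range columns, two indicator
# columns (prefixed "ind_"), and the binary target as the last column.
sample_export = """range_start,range_end,range_index,range_count,ind_read,ind_write,target
0,604800,1,4,0.5,-1,0
604800,1209600,2,4,-0.5,1,1
"""

def load_training_rows(text):
    """Parse a training-data export into (indicator dict, target) pairs.

    Note there is no sample identifier column: nothing in a row can be
    traced back to an individual student.
    """
    rows = []
    for record in csv.DictReader(io.StringIO(text)):
        target = int(record.pop("target"))
        indicators = {name: float(value) for name, value in record.items()
                      if name.startswith("ind_")}  # skip time-range columns
        rows.append((indicators, target))
    return rows

rows = load_training_rows(sample_export)
```

A file in this shape is exactly what could be used to retrain a model elsewhere, or pooled with exports from other sites to build a more general pre-trained model.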
So that's a quick overview of the features that we have available in Moodle learning analytics, including the new features coming up in 3.7. Any questions? Yes, hold on one second. Sure. Do the models just work within a specific Moodle course, or is it possible to get them to work across a set of courses? I'm thinking our students will take multiple Moodle courses across their student lives, and we might be interested if there are any indicators in a first-year course that they might drop out in the second. Yes. So there are a couple of different ways of doing this, but you'll need some target like completion of a learning plan. The most common way of doing this currently in Moodle is to set up a learning plan for the students that includes completing all of these courses, or the competencies attached to these courses. You need a new target for "student completes the learning plan" or "student at risk of not completing the learning plan," and then attach all the courses to the learning plan, and that would allow you to generate a prediction of completing the learning plan based on how the student does in all of the courses that are attached to it. We don't have a model like that currently set up, and we haven't written that target yet, but if there is enough interest we'll prioritize that for the next version of Moodle. Targets can also be published in between versions of Moodle, so we don't have to wait. Thank you. Does that answer your question? Yes. Super. Other questions? We have a couple more. This might be my ignorance of learning analytics, but you showed up there a model accuracy value. How is that calculated? The model accuracy value is calculated by running the model. What we do is cross validation, so we hold out a certain amount of the data from your historical data on your site. We create a model based on the training data, and then we check that against the held-out data to see if the predictions work on the held-out data.
And I believe we actually do a k-fold cross validation, where we do that a couple of times, but I'm not sure if we have that implemented yet. But essentially the idea is we use some of the data to train the model, and then we test that model on the held-out data. Okay, maybe I don't understand what it means then. So if you have a model that's trying to predict whether a student is going to drop out or not, would 100% accuracy mean that the students it predicted would drop out did drop out? Like, I'm puzzled on that. Yes. Okay. So then that doesn't necessarily... huh. So, 100% accuracy. All right. But if you use the prediction, and the students didn't drop out because you used it to change the future... Yeah, then you get a 0% accuracy value, which would kind of make you think your model wasn't very good. I am so glad you asked that question. So I have a proposal, which I am really hoping to get into 3.8, but we'll see how it goes, where we follow up on false positives and false negatives. Because when we have a model that makes a negative prediction, like students at risk of dropping out, and then it turns out that they don't drop out, we had a false positive. We want to follow up and find out if the reason for that false positive was our model, the insights, and an intervention, or something completely external. And similarly, if we didn't predict that a student was going to drop out, but they did, we want to follow up with them as well and find out: was there something that we could have known about, some way that we could have predicted that? So following up on false positives and false negatives, and incorporating those into the model accuracy, is something that we want to do. Does that clarify things at all? Yes, thanks. Great question. My turn. Hello. You mentioned PHP or Python are used to process the model.
Is it possible to offload the processing of the model to another server, since maybe not everyone wants to run this on the same server where you run your Moodle sites? Yes. I don't have the details with me right now of how you set that up, but that is a question that has been asked on the moodle.org analytics site, and David Monllaó, who is the key developer and lead data scientist at Moodle, has the information about how you can set that up. Okay, I suppose he can be contacted. If you just check on moodle.org/analytics, that question has actually been answered there. I just don't have the answer on the tip of my tongue. But that is something that we do get requests about, and we understand the good reasons why people need to be able to do that. Thank you. Thank you. I'm just following up on the first question there a few seconds ago. If I understand it correctly, you're using maybe five years of historical data to make a prediction on the current crop of students. We will use all the data that is available on your site to make a prediction on your current crop of students; if you have five years, that's what we'll use. So when I make a prediction with my current crop of students, does that not mess up my predictive model? Because next year, when that becomes historical data, you've actually interfered with it. That's a really good point, and that's why we need the feature that I was just talking about. What we hope to include in future models is: one, if you mark a prediction as not useful, then we want to flag that as being the kind of data that we don't want to include in future predictions; and two, when we have false positives and false negatives, we want to follow up and figure out how to incorporate those into the models, so that the next year our models are still effective. So say, for example, there's a false positive: we predicted a student was going to drop out, and they didn't drop out.
And then we mark that they didn't drop out because of the prediction; then that still counts as a positive. We count that as the model still being strong. We don't have that implemented yet, but we do have that written up in issues on the Moodle tracker, and if you would like to vote for that, we would love you to do that. Or if you have comments about how you want it to work, we would love to hear those. All of the tracker issues that are related to Moodle learning analytics are listed at moodle.org/analytics in our working group, and we would love to discuss priorities with you. Thank you. We have time for one more question. Elizabeth, how are we doing on pacing? We're good. We're good. All right. And just a housekeeping notice: we asked the hotel to maybe turn up the air conditioning just a wee bit. Having a very warm room at three in the afternoon is probably not ideal. So we're working on that. All right. Hi there. Our Moodle has rolling enrollments. I was very excited by 3.4 and thought, wow, this is going to answer it, but apparently it's not supported. Is it likely to be in the future? Is the issue here that 3.4 has fallen out of support because of its age? No, no, sorry. I was very excited about the analytics functionality when it was introduced in 3.4, but was disappointed to find that it didn't address sites that have rolling enrollments. Oh, rolling enrollments, yes. Okay. Are there any plans to develop that? Yes. So there are two factors that we need for rolling enrollments; actually, the main one is a different kind of time-splitting method.
And again, I have a detailed proposal about this in the tracker, but briefly: instead of saying the student is at risk of not completing the course by the end date, which is sort of implied now, you need either "the student is at risk of not completing the course by the enrollment end date," if you have fixed enrollment periods, or, if you have completely open enrollments, "the student is at risk of never completing the course," based on how much time they've taken so far and how long it's taken other students in the past to complete the course. So there are two different, relatively complicated problems that we need to write code for. They both involve time-splitting methods, and to a certain extent they involve new targets as well. The first needs "the student will complete the course by the end of the student's enrollment" as opposed to "by the course end date." That requires a small variation on our current target, and we should change the name of the current target to make it more explicit. The second, "the student will not complete the course at all" in a completely open-ended course, needs a little more sophistication in how we make the calculations, but I believe we have specifications that will allow us to do that. And there again, if that's something that's really important to the Moodle community, we need to hear from you. We need help understanding what the biggest priorities are so that we make sure to appropriately assign resources. We have limited resources for each release, and we need to hear from folks who want to use Moodle Learning Analytics as to which features you need us to develop as a priority so that you can make use of it. Okay, great. Thanks. Did that answer your question? Yes, it did. Thank you. Super. Okay, are we all right to move on? A show of hands? Yes, move on. We're good. Yes. Excellent. All right. Okay, so. There's a meta moment for you.
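The per-enrolment time-splitting idea described above can be sketched in a few lines. This is my own illustration under stated assumptions, not the actual Moodle analytics API: it simply anchors each student's analysis windows to that student's own enrolment start and an expected duration, instead of to shared course start and end dates.

```python
from datetime import datetime, timedelta

def enrolment_windows(enrol_start, expected_duration_days, n_splits):
    """Per-student analysis windows for rolling enrolments (illustrative).

    Instead of splitting the shared course start..end range, split an
    expected duration measured from this student's own enrolment start.
    Returns (window_end, fraction_of_expected_time_elapsed) pairs.
    """
    step = timedelta(days=expected_duration_days / n_splits)
    return [(enrol_start + step * i, i / n_splits)
            for i in range(1, n_splits + 1)]

# A student who enrolled on 1 January, in a course that historically
# takes about 90 days, evaluated at three checkpoints:
windows = enrolment_windows(datetime(2024, 1, 1), 90, 3)
```

The `expected_duration_days` value stands in for the "how long it's taken other students in the past" statistic mentioned above; in a real model it would be estimated from historical completions.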
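The earlier point about keeping next year's training data honest can also be sketched. Everything here is hypothetical: the field names (`flagged_not_useful`, `intervention_happened`, and so on) are my own invention, not Moodle's schema. The sketch just drops teacher-flagged predictions and relabels a false positive as a positive when an intervention plausibly changed the outcome.

```python
def prepare_training_rows(rows):
    """Clean last year's prediction records before retraining (illustrative).

    - Skip rows a teacher flagged as 'not useful'.
    - If we predicted drop-out, the student did not drop out, and an
      intervention happened, treat it as a positive: the prediction
      likely caused the intervention that changed the outcome.
    - Otherwise, label from the observed outcome.
    """
    cleaned = []
    for row in rows:
        if row.get("flagged_not_useful"):
            continue
        if (row["predicted_at_risk"] and not row["dropped_out"]
                and row.get("intervention_happened")):
            label = 1  # intervention-adjusted: still counts for the model
        else:
            label = 1 if row["dropped_out"] else 0
        cleaned.append({**row, "label": label})
    return cleaned
```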
If you want, take a moment while I set up the next slide and stretch, standing up or sitting, whatever works for you. Just a good stretch. Get the kinks out. You're in the final stretch, right? It's the afternoon of the last day. We can do this. And while you're stretching, you may also want to think about coming to join our lovely group here at the front tables. We would like to have about four people at each table, so if you want to move, this would be a great opportunity, and we'll get started in a moment. Okay, you should have a screen share of "What is quality education?" We do. Excellent. Whoops. In Martin's keynotes, he's been making a real focus on how we want to support the UN Sustainable Development Goals, in particular goal number four, quality education, and we feel that Moodle learning analytics is going to be a key part of this, because it allows us to measure and predict whether quality education is happening and help change that future, as I've said. But it begs the question: what is quality education? It turns out that this is a very difficult question to answer, and in fact it doesn't have just one answer. There are a number of equally valid answers to this question, all of which can be aided with learning analytics, but you need to know what you're trying to measure before you can measure it. So we are going to do an exercise that will help you understand, in your own context, what priorities are important in your educational environment: what will be important for you to include as indicators and to incorporate as a target. There are no wrong answers in this exercise. You should have a giant piece of paper in the middle of your table and six different colors of sticky notes, and each of you is going to use one of each color. You'll mark your notes with your initials or any other symbol you can use to identify them, because they're all going onto the same page.
So at the end, when you look, you should be able to see how your notes are positioned. As we go through, I'm going to ask you six different questions with four answers each, and the answers are arranged by quadrants. You can place your note anywhere on the page to indicate your best estimate of the answer to that question. There are no wrong answers. If you feel split, you can tear your note in half and put half of it here and half of it there, or anything else that helps you represent your educational priorities. This is not an exercise for us; this is an exercise for you, to help clarify your thinking by asking some explicit questions about what you mean when you say learning and teaching and education. You can talk to each other at your tables about what the things mean. If you have any questions about the ideas, please feel free to bring them up, but you don't all have to agree at the table. You can put your notes wherever you need to put them. Everybody understand? I think so. So just to recap: you'll have your sheet of paper, and you want to divide it up into four quadrants. Two lines intersecting at a 90-degree angle, or, you know, close to that. Close to that. And that can be folded or drawn; that's fine. And then, just like this, Elizabeth is going to guide us through six questions, and for each question you'll use one of the sticky-note colors to answer in one of those quadrants. So everyone, if you can just put an initial or some kind of identifier on those so you can keep track of which one was yours, that would be great. It doesn't have to be your name; we want to be GDPR compliant with our sticky notes, so that's important. And that's it. So: six questions, a different color sticky note for each question, placed in a certain quadrant. Is that per person?
Well, each person will have the same color for each question. Wait a second, Elizabeth, is it one color per question, or one color per person? Yeah, let me put up the first example. The first example is about what an educational institution is, and we're going to use the pink sticky notes to answer this question. Does that help? Right, so one color per question, and they just put a little initial on it so you know which one was your sticky note. Right. So each person grab a pink sticky note. You each get one. If you feel like you need more than one because you can't make up your mind, you can tear it in half; I don't care how you represent your opinion. This is about your feelings about this question. Okay. Okay, everybody got their pink sticky notes in hand? I think we're good. Okay, so let's start with what we think an educational institution should be about. Should it be: A, a place where knowledge is transmitted to students; B, a stimulating environment organized around the developmental needs and interests of the students; C, a place where students are trained to function as constructive members of society; or D, a place that helps students to perceive problems in society, envision a better society, and act for social justice? So what's your sense of what your educational institution, or the one you're thinking about right now, should be doing? And Elizabeth, that can be shaded in terms of where you put it on the grid? Absolutely, you can put it anywhere. To show that you're half A, half B, stick it on the line, tear it in half and put it in two corners, make yourself a new box somewhere else because you don't like these answers. Thinking about the question and getting some kind of reaction of your own down on the paper is what's important here. I'm just going to share with Elizabeth, if you don't mind. Great. Are we ready to move on to the next question? Are we all set for question number two?
Okay, interesting. All right. Yes. Okay. Educators, whether you call them teachers or professors or whatever: what should they be? How do we think of them? Get your yellow sticky. Yellow. That's green. Okay. A, knowledgeable expert practitioners, transmitting that which is known to those who do not know it. B, mentors to students, helping them to learn by presenting them with experiences from which they can make meaning. C, supervisors of student learning, utilizing instructional strategies that will optimize student learning. D, partners with students, using the environments within which the student lives to help the student learn. Okay. So which of these resonates with you? I want to say that almost nobody answers A every time, or stays in any one of these quadrants all the time. Most people are kind of scattered; they're doing at least two of these things at once. Sometimes they're on A, sometimes they're on C, or whatever, and that is okay. And how you feel about this today, with the institution you have in mind right now, may not be the same as how you feel about it next week at a different institution, and that is also okay. But for right now, just think about one particular learning context that's important to you. If you need to think about two contexts, tear your note in half and do whatever you've got to do. How are we doing? Still some discussion. Anybody have any questions they want to put to the floor or ask me? Any questions? No, we're enjoying our revelations on the grid right now. Yeah. So let's go to the next one. All right. Learning happens best when: A, the instructor clearly and accurately presents to the student the knowledge which the student is to acquire. B, the students are motivated by actively engaging experiences which allow them to create their own knowledge and understanding of the world in which they live. C, the student is presented with the appropriate learning resources and positive reinforcement.
D, a student confronts a real-world problem and participates in the construction of a solution to that problem. Place your bets. It's almost like a card game: I see your quadrant B and I'll raise you a C. But remember, the winner is the one who comes out with the best understanding of their own needs. That's right, there are no wrong answers. One rule: you're not allowed to bully other people at your table to get them to agree with you. All right, I think we can go to the next one. This is your teal-colored sticky. Education should focus on: A, the structured knowledge and ways of thinking that have come to be valued by the culture over time, where "culture" could be a culture of scientists or a culture of historians or whatever. B, the personal meaning of each individual student and of the world, which comes from each student's direct experience in the world and the student's personal response to such experience. C, the specific skills and capabilities for action that allow an individual to live a constructive and productive life. D, a set of social ideals, a commitment to those ideals, and an understanding of how to implement those ideals. Elizabeth, what's the correct answer? The correct answer is all of the above, but today I feel like... Perfect. So, do we need clarifications on the descriptions? Okay, we need an option E. We need an option E. Does anybody want to tell me about option E? What would you like it to be? We'll know it when we see it. Okay. So write it in the corner of your big white paper. Stick your sticky, or part of your sticky, there. If there's anything that you at least partially agree with, put part of your sticky over there. Whatever works for you. Again, the purpose of this exercise is to help you unpack your own ideas. You'll need to know this in order to design a learning analytics system or choose a learning analytics model.
This is a purple-blue kind of sticky. Students should focus on: A, intellectual development, highlighted by growing reasoning ability and capacity for memory, resulting in ever greater absorption of cultural knowledge. B, self-exploration and unfolding according to their own innate natures, felt needs, organic impulses, and internal timetables, as they are during the learning process rather than as they might be after graduation. C, preparation for post-graduation employment, increasing the capacity to be a constructive, contributing member of society. D, practice in, and preparation for, acting upon society to improve both themselves and the nature of society. By the way, if these sound a little stilted or very formal, they come from a particular resource which I will get to in a few minutes. And it's B leading in the race. No, wait, now it's C pulling ahead. Oh my God, look at D coming up from the rear. Now it's D. D has tripped at the finish line and A is taking over. I have no idea; I can't see what you guys are doing. Okay, are we okay to go to the next question? Which question are we on? This is the last question. Do we have the stamina? We've gotten this far; we've got one more. Yes. Okay, this is your faded pink-purple, mauve kind of sticky. The purpose of assessment, testing, evaluation, whatever you want to call it, is: A, an objective measurement of the amount of knowledge students have acquired, allowing students to be ranked by intellectual achievement. B, a continuing diagnosis of students' needs and progress, so that further progress can be promoted by appropriate adjustment of their learning environment, primarily for the students' benefit. C, an objective measurement, for others, of whether or not students can perform specific skills, to certify students' competence to perform specific tasks. Or D, a subjective comparison of students' performance with their capabilities, to indicate to both the
students and others the extent to which they are living up to their capabilities. Do I hear a bingo? Do you want that clarification? So we have a question. Elizabeth, is this a question of what we want it to be, or what it is? If you want, you can tear your sticky in half and write "what it is" on one and "what I wish it was" on the other, or "what my manager says it should be" and "what I think it should be," where "manager" could be, you know, the parents, the dean, the employers in my district. All right, have we completed our task? It looks amazing. We've got some very different conversations at these charts, different in different groupings. So Elizabeth, what do you want to do next? Okay, so here are some general terms that can be used to help you find research, or models, or more information about these four areas. Quadrant A we usually call the academic. This is traditional teaching; usually cognitivism is the pedagogy involved, and it is more about an objective reality, focused on the source of knowledge: we're respecting and valuing expertise. Quadrant B is more of an individualist kind of thing: it's learner-centered, constructivism, open pedagogy, and we still value the source of knowledge, but it's a subjective question. Quadrant C, the pragmatic: sometimes we call this social efficiency, workplace training, competency-based learning, self-paced learning, and it's usually based on behaviorism, you know, what people can reproduce. It is also objective, but instead of being focused on the source of knowledge, it's focused on the use of knowledge; it's not so much where it came from but how you use it. And quadrant D, the idealist, is about social justice, social constructionism, transformative learning, and this is also where we get affective outcomes, like how much somebody cares about what we're teaching: not just whether they can recite back facts, but that they care. So that is, again, subjective, and about the use of knowledge, not the source of knowledge. So there are a
couple of questions I would like to ask about this of the room in general. Look for yourself and say: where are most of your stickies? Did they tend to cluster together, or vary depending on the question? So how many people felt like their stickies tended to land in the same places? About a little less than half, one half of the room, which is kind of interesting. Oh really, so they're physically together? Yes. Okay. And varied depending on the question? Did the other half of the room raise their hand, or do we have some holdouts? One half of the room, the correct side, raised their hands, and then the other side... no, don't you do that, don't you do that. Okay. Did people at the tables agree about definitions and priorities? Sometimes, yeah. I see some heads shaking no. Are you able to see the points of view of those with different priorities, even if you don't feel those are right for your own context? Correct, so we saw some feedback saying yes, they understood the perspective. And for the people who have different points of view, are you in really different contexts? Like, how many people here teach K through 12, if we call it that where you are? A few. Okay. And higher ed, post-secondary? And corporate training? One. So mostly post-high-school education. Anybody who's in another category? NGOs, community schools. Okay, so most of you are in higher ed, but you still have... oh yes, continuous professional development for midwives, post-qualification. Okay. And which quadrant did you tend to land on, or were you kind of split? Sorry? I've identified it as academic and workplace, but learner-centered. Sure, and I think that fits. Good, because I don't have an opinion about where your training fits; I just want you to know where you feel things fit, so that's great. So even though a lot of the other people... no, no, it's good. I thought for some of the questions, where I stood depended very much on the
context. And I think for the last one, the assessment one, for, say, my first degree, I combined humanities, and I was very much in the B quadrant. But if you talk about something that's safety-critical, say a driving test, or say you're preparing to teach biomedical scientists, some of the things they do, I think I'm very much in the C quadrant. Absolutely, absolutely. So one of the anecdotes I use for this is: you know, if I have to go in for surgery, I probably want somebody who's had an objective measurement of whether they can perform those surgical skills, right? But I still want somebody who cares. I want somebody who's able to be empathic about my particular needs and who cares about my health individually, and, you know, somebody who respects me as a person. And probably I want them to know the difference between a liver and a spleen. So there are places for all of this, but certainly the specific kind of education that we're talking about makes a big difference in where you want to focus your priorities, and that's why we can't have one learning analytics model that fits everybody. "Students at risk of dropping out" is actually a C: it assumes that everybody's going to go through a bunch of standardized training, that it's important that everybody can meet some basic skills, and that they are going to complete on time in an efficient way. And it's got a healthy dose of A, because the content involved tends to be traditional, passed-on content. But "students at risk of dropping out" doesn't even start to ask what students are going to do with the knowledge. It doesn't ask about students' internal lives. It doesn't ask, are they going to go on and become leaders in their field? It doesn't ask, are they going to go to graduate school? It just asks, are they going to meet the requirements, in this particular course, that have been externally set? And there's nothing wrong with that, if that's what your social contract with your students
says you're promising them. So just think through what your institution is like, what your students come to you for, and what you've said you're going to deliver to them. That's how you choose the learning analytics model that will best suit your institution. Any other thoughts on this? Anybody want to talk a little bit about what they uncovered? Was anybody surprised by their answers? Were you surprised by your colleagues' answers? Oh yeah, that's a good one: were you surprised by the answers of the table next to you? We weren't surprised to find out that D was the idealist, though. Did anybody have D as their overwhelming favorite? One more, one more, a couple more comments. Let me find my pathway here. Thank you so much. Sorry, we were rebelling and we weren't using the Post-its, we were just chatting about it, but I thought it was quite interesting that where we placed ourselves, in terms of how we thought about quality education, was in a different quadrant from the institution we work at. And that's quite interesting: when you have a vision of what you think education is, and then you start thinking about it from your university's perspective, and you go, actually, that doesn't quite match up with what I think. I just thought it was, yeah, quite interesting to reflect on, so thank you. I hope I didn't cause anybody to decide to quit their job or anything. You can sometimes bring your personal hopes back to your institution and try to encourage change there. Or, if you start this conversation at your own institution, sometimes the institution as a whole will say, you know, we've been doing one thing all along, but actually what we wish we could do is something more like this. So unpacking these ideas, and being explicit about your own motives and goals, can help you do what you intend to do better, and I hope you find that effect for yourselves. Was there another question or comment? There was. Hold on one second. Here we go.
Thank you. I'm a learning technologist at a university, and I was really struggling on a number of occasions because I kept thinking of internal contexts. So there's a subject-related context, where I thought, well, when I work with these academics, this is where they would fall, but other subjects, other disciplines, other schools, I even thought on a schools basis, they would tend to be in a different area, which is really interesting. And then also thinking about, at a much higher level, what are the policies telling us the university is doing, and how does that sometimes conflict with what is actually required, you know, at the ground level, working with the students? It's a really, really interesting activity, definitely worth much more reflection, and probably worth taking back to the university as well. Thank you. Excellent. These slides are available, and feel free to use this exercise or activity at your own universities. And I would like to make a pitch for Michael Schiro. Let me just bounce back... maybe I don't have that slide. Okay, I'll post this up for you. Michael Schiro, S-C-H-I-R-O, has a book called Curriculum Theory. I'm going to share this, and this is the book that I developed the questionnaire from. He has a version of this questionnaire; I've changed some of the wording to make it a little more applicable to a wider audience. But this is a really nice book for exploring these four different ideas in a very even-handed manner and helping you see where they fit. It is a little bit US-centric, but I think it would still be valuable to people in other countries; just the historical examples tend to come from the US, but you'll recognize things about your own institutions when you look through it. Okay, I have one last exercise, which we are not going to
have time to do completely, but that is certainly okay. I want to just walk people through how to do this, and then, if you want to, you can do it online. Okay, so we have a workshop online called An Overview for Designers, and we have a whole set of workshops aimed at both designers and developers. You are welcome to go through these; they are free, and they help you clarify your thinking and write new models: either write code, or write use cases in a way that somebody who is a developer can convert into code. So this is a way for you to formalize your thinking about what you want. Oh, here we go, here's the reference: Michael Schiro, Curriculum Theory: Conflicting Visions and Enduring Concerns, Thousand Oaks: Sage. So, we know that we want our analytics to align with the curriculum. If you have multiple priorities at your institution, you can have them all at once, but you need one analytics model for each priority, at least one. You'll have different targets depending on the priority, and you'll have different indicators that go with those targets. And if you have multiple values at your institution, you can have multiple analytics models running that warn you about the different values; that is perfectly supportable in Moodle learning analytics. Overall, the process goes like this. You think of what outcome you want to detect or predict, and how you're going to tell whether it happened. So if you are looking at final grades, your final grades need to be in Moodle, or you need to figure out how you're going to get them into Moodle so that you can detect that. If you are trying to detect student satisfaction, or student acquisition of lifelong learning habits, you're going to need to think about how you might detect that. I have some suggestions about that at moodle.org slash analytics, where we're having some lively discussions; some things are harder to detect than others. What clues do you think might
help predict that outcome? What should we do if that outcome is very likely or very unlikely? Who should be notified, what kind of notification should be sent, and what opportunities for action should be provided on notification? And those align with these pieces of the diagram here. Be careful in setting these up. Develop an institutional code of practice. Avoid indicators based on sensitive demographic data; you want, as much as possible, an unbiased model, so you don't want your model predicting differently based on something like gender or ethnicity when it shouldn't. The problem is that when you set up a model and train it on your prior data, if there were biases in your prior historical data, they will be replicated in the model. So you have to be careful about that, and we are developing some tools that will help you detect it; we just don't have those up quite yet. It's not a one-time thing: get your community involved in model development and testing, and ensure continuous review and improvement. Goodhart's paradox is when the thing you are measuring becomes the objective. So if you say, "I want my students to get good grades," and then people start doing things to make sure the students have good grades recorded, but they're not actually learning more, that's Goodhart's paradox. That's when your measure becomes useless, because people do things directly to change the measure, but they don't change what the measure is supposed to be measuring. So here's the general structure of a use case. We're not going to have time to go through this in detail, but in general, and you won't necessarily do these steps in this order, you're going to have: a learning question; at least one user story ("As a teacher, I want to know which of my students are at risk, because I want to be able to intervene," as an example); and your primary actors: who has the question, who needs to see the answer, and who are the other stakeholders? I will pause and talk
about that in a moment. Is this a question about a user, a course, a quiz question, all users, all courses? And, you know, details: whatever you're thinking about early on about what the targets could be, what the indicators could be, and some kind of link to a discussion. A use case should be developed out of a community discussion; it should be something that's validated by the people it's going to affect. And one of the things that is really important is: who are the stakeholders? I'd like you to think about these questions when you go through and think about developing or adopting a model. Who is requesting the model? Who will be monitored by the model? Who will receive notifications from the model? Who is supposed to benefit from the model, and who really will benefit from the model? Is there anyone who might object to this model, and how will you deal with that? Are there conversations you can have with people that will help them be more comfortable? Are there things you need to change about the model that will make it more acceptable? Elizabeth, we have two minutes. Absolutely. So if you would like to participate in this part of the workshop, please go online to moodle.org slash analytics, and there is a link there to the online workshop Introduction to Learning Analytics. There is a review of the content we've gone through in this workshop, and then there is an activity to write and discuss a use case based on a specific learning question; you also peer-review two other use cases, and you'll earn a badge, the beginning designer badge. Then there are more workshops online that you can go through to flesh out all of the details and eventually earn an overall Moodle analytics designer badge, or there's also a developer badge. So I really want to hear from you. I really hope as many of you as possible will go to moodle.org slash analytics, or return there if you've been but not been back lately, and participate in the conversations there, and help
us continue to develop analytics in a way that will meet your needs, and help us prioritize our work in the ways that will best serve you. I want to thank everybody for your participation today. You've been really engaged, and I appreciate it. I hope this has been helpful to you. Thank you, Elizabeth.
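As a footnote to the caution above about biased training data: one crude first check, sketched here as my own illustration rather than any existing Moodle tool, is to compare the model's at-risk rate across groups of a sensitive attribute and treat a large gap as a signal to audit the indicators and the historical data.

```python
def at_risk_rate_by_group(predictions, groups):
    """Fraction of students flagged at-risk, per group (illustrative).

    predictions: list of 0/1 model outputs.
    groups: parallel list of group labels for a sensitive attribute.
    """
    totals, flagged = {}, {}
    for pred, group in zip(predictions, groups):
        totals[group] = totals.get(group, 0) + 1
        flagged[group] = flagged.get(group, 0) + pred
    return {g: flagged[g] / totals[g] for g in totals}

def max_disparity(rates):
    """Largest gap in at-risk rate between any two groups."""
    values = list(rates.values())
    return max(values) - min(values)
```

A gap by itself doesn't prove the model is unfair, since base rates can genuinely differ between groups, but it tells you where to start looking.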