Good afternoon, ladies and gentlemen, dear participants. I think we can start. It is my great pleasure to open today's session, today's EDEN webinar, part of the EDEN NAP webinar series. I'm Igor Balaban. I am a professor at the Faculty of Organization and Informatics, University of Zagreb, and I'm also an EDEN NAP steering committee member. Today we have a webinar entitled "What do I do as an educator with learning analytics?". As you can see from the first slide, today we have five very interesting speakers, and I would like to thank them for taking the time to be with us and for preparing their presentations. The main moderating role today will be taken by my dear colleague, Professor Bart Rienties. Bart is a professor of learning analytics and programme lead of the Learning Analytics and Learning Design research programme at the Open University UK's Institute of Educational Technology. Before we start, what was the main motivation for today's webinar? As you probably know, we have been dealing extensively with learning management systems. At some point in time, we realized that these systems were recording a large amount of data: everything they record by default, but they are also capable of recording instructor-generated data. We then realized that this data can help us in a variety of ways, and we started to analyze it. Today's story is about how we can use the data that we now popularly call learning data to, let's say, optimize learners' preferences and learning experience, how to use learning data to help teachers, how to cluster students. What can we learn from the learning data that is already inside the system? We just need to dig it out and start to analyze it. So today we will talk exactly about that.
And now I would like to hand over to Professor Bart Rienties, who will do the main introduction and introduce today's speakers. Bart, please, the floor is yours. Thank you very much, Igor, and thanks to the organizers for inviting us to this evening's webinar. We are very excited to share our experiences, and we look forward to your questions, comments and insights. I can see from the chat that you've already started chatting, which is great. We've also opened a question box, so if you have any questions during the presentations, just put them into the question box and we will either try to answer them directly or towards the end. In this interactive webinar, we will share lived experiences of how four organizations from six countries are using learning analytics on a daily basis. Some of these organizations have implemented learning analytics at an institutional scale, like Nottingham Trent or the Open University, while others have created more specific and bespoke applications and practices. You will see today at least two organizations that provide rich learning analytics data directly to students, while all four organizations provide rich, aggregated data directly to educators. So, for example, Ed Foster from Nottingham Trent University will demonstrate how students can easily access their own data and how this helps them to see how engaged they are with their studies. Similarly, Dirk Tempelaar from Maastricht University will show how, in his own course, he is giving data directly to his students, in terms of their learning dispositions and their learning engagement data, and how he is helping students not only to learn statistics but also to reflect on their own practice.
Anna Gillespie from the Open University will share her experiences of how providing learning analytics to over 3,000 teachers helps them to give really useful advice to students using interactive dashboards, and she will highlight how predictive learning analytics systems can be used to support a teacher. And last but not least, Robert Bodily from Zappi will illustrate how, at Mountain High Academy, teachers are using a so-called action dashboard to understand how their students are learning, and how better data can help teachers to intervene with struggling students. In this webinar we will not do a lot of technical data analysis. We will mainly focus on the role of the educator and how you, as a teacher or educator, could potentially use learning analytics in your own classroom. We've agreed that each presenter has a maximum of seven minutes to share his or her amazing work, followed by a one-minute opportunity to address any urgent questions. If you add that up, that should be around 40 minutes of presenting and quick questions, which should leave us plenty of time afterwards to go much more into depth into the lived experiences. So we're really looking forward to your questions, and if we speak too fast or too slow, do let us know. Let's get this show on the road. At the Open University, we've been using learning analytics since 2013, and what Anna will show in a minute is that these tools are really powerful in terms of predicting students' behavior. What we also found is that it's really important to understand how teachers design online and blended courses. At the Open University, for the last 15 years, we've used the so-called Open University Learning Design Initiative. In this approach, we basically make a kind of blueprint, a brain scan, of how teachers design courses.
So what you're currently doing is listening to me, which is a kind of assimilative activity. Perhaps at the same time you're Googling who on earth Bart Rienties is, finding more information about the speakers. Some of you are already communicating in the chat and, hopefully, asking interesting questions. Hopefully later there will also be some opportunities for you to play with actual data from these various learning analytics platforms, so you get a kind of productive experience of what it's like to work with learning analytics data. And finally, all the way to the right, you see that we as teachers typically assess our students on the extent to which they have understood the main concepts we are discussing. So this blueprint of seven main learning activity types is a way of identifying the key activities that our teachers are using with their students. What you see in the next slide, for example, is a blueprint of one computer science course. In this computer science course, you see a kind of brain scan of how much time the teacher thinks each student should spend on each type of activity. So for example, in week one, a student is expected to spend 10 hours on reading materials and one and a half hours on finding information, for example what Java is or what HTML is, et cetera. And in this particular design, the teacher designed a really engaging programming task in week five, where students were expected to program a particular piece of code: a kind of experiential activity lasting 19.1 hours. That's the expectation of the teacher. And that was then also assessed, for on average 10.9 hours. And this teacher is a very smart guy: he decided to give the students the week beforehand free, so they would have plenty of time to work on this program. So you see the kind of expected workload that a teacher typically thinks would happen.
So now let's look at what students were actually doing. The next slide shows exactly the same information, but in a slightly different way. The bars show the workload expected by the teacher, and the red line indicates the actual average engagement of students. What you can immediately see is that in week four, students didn't go on holiday: they actually spent a lot of time working, most of it on that particular programming activity, when they were supposed to be free. Another thing you will probably see, if you look at the data with a keen eye, is that every time there is a blue activity, a so-called assessment activity, there is a lot of engagement by students. And the reason why we at the Open University think this is so important is that with any kind of learning analytics data, you will have peaks and troughs; but if you do not know what's behind those peaks and troughs, it's really difficult to make sense of them. In a large number of studies, we have been able to show that around 69% of what OU students are doing on a week-by-week basis is determined by how we as teachers design our courses, which is amazing if you think about it. So if you want to change your students, it's about changing you as an educator and thinking creatively about how you can optimally design your courses. In some big data studies, we looked at over 100 different courses at the Open University. What we found was that teachers who design a lot of individual learning activities, with lots of amazing materials and lots of interesting videos, tend on average to get lower engagement from students over time, while teachers who design more so-called social constructivist learning activities, and I'm sure Dirk Tempelaar will show an example of this later on, tend to get higher engagement over time.
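The week-by-week claim above, that most of the variation in student engagement follows directly from how the course was designed, can be illustrated with a toy correlation. All numbers below are invented, loosely mimicking the computer science blueprint; this is not the actual OU analysis:

```python
import numpy as np

# Hypothetical planned hours per week (from a learning-design blueprint)
# versus hypothetical average observed hours of student engagement.
designed = np.array([11.5, 9.0, 10.0, 2.0, 19.1, 10.9, 8.0, 12.0])
actual = np.array([10.0, 8.5, 9.0, 6.0, 17.0, 12.0, 7.5, 11.0])

# Correlate the designed workload with observed engagement.
r = np.corrcoef(designed, actual)[0, 1]
print(f"design-engagement correlation: r = {r:.2f}, r^2 = {r**2:.2f}")
# A high r^2 would mean most week-by-week variation in engagement
# is accounted for by the design, in the spirit of the 69% finding.
```

Note the "holiday" week (2.0 planned hours) is exactly where design and behaviour diverge most, which is why knowing the design behind the peaks and troughs matters.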
What we see in addition is that those teachers whose learning designs are more focused on individual learning tend to get higher learner satisfaction, while courses with a lot of interactive, social constructivist learning activities tend to get really low satisfaction. But at the same time, the number one predictor of whether or not our students pass courses at the Open University is whether we let students work together in groups with a teacher. This is really fascinating, because it's great to look at all this learning analytics data, but what we're basically trying to say is that we as educators have a tremendously important role in setting our students up for success. At the same time, by showing these dashboards back to teachers, and this is an example from the same computer science course, this teacher changed his learning design based on the information we were providing from the dashboard. So in week four there was no longer a break, and he made the decision to move the assignments around slightly. And then we can start to track whether or not this is a good approach. If you're interested, you can use this whole approach for free online; I'll post a link in a minute and you can play around with it in your own time. So, at exactly seven minutes, I'm really keen to hear if there are any questions or comments, and I will try to address them within one minute before we move to Anna. Is there any question or comment that we need to address? Sorry, I'm just trying to find the... So there's one question in the chat. Yes, Andrunda, this is indeed data from Moodle. And Marcus asked, have we had a big change in learning patterns during COVID?
I mean, because everything at the Open University is online, we basically continued as normal, so for us there wasn't a massive change. But we have also been working with a lot of "normal" universities, in quotation marks, and I'm sure they have probably had to change quite tremendously; learning design allows you to map this. So my time is up, and I'm going to give the floor to Anna. Anna, go ahead. Thank you. Can you hear me okay? Yeah. Oh, that's brilliant. Well, thank you very much, Bart. I won't spend a lot of time introducing myself, because Bart's already done that, but just to say that I work as an associate lecturer, so my role is direct teaching with students on particular modules, and I use what we call the early alert indicators dashboard at the Open University. Today I'm going to talk to you a little bit about supporting teachers and students who are working at a distance. We'll go to the next slide, please, Bart. Thank you. One of the things about being a tutor working at a distance is that sometimes it's quite difficult to know whether or not your students are actually still engaged in the module. Sometimes we may not even know that a student isn't engaging, because of course we're not in a classroom situation: we don't see students, we can't stop somebody in the corridor and say, how are you getting on? So very often the first time we get to know that a student has dropped off a little is when the submission deadline arrives and the assignment doesn't. That's quite worrying, of course, because a lot of time has gone by with no contact from that student, and we don't know what's happening. And with the best will in the world, students don't always communicate with tutors, for a variety of reasons, not necessarily negative ones.
Some students just prefer to get on with it, and may not really require an awful lot of tutor input. Some students have particular situations whereby communicating isn't what they want to do, and that's why they're doing online learning. So the early alert indicators dashboard becomes a way for us as tutors to offer support before it becomes too late for a particular student to catch up. That's something we're really trying to encourage tutors to use as much as possible, to make the best of the tools we have. As educators we've got lots of tools in our toolbox, and this is just an additional one. So can I have the next slide, please? Okay. What the early alert indicators dashboard does is provide us as tutors with manageable data. It's a visualization, updated every week, and it's formed of two types of data. First, the static data: the demographic information, the sort of things students tell us on enrollment. So we know bits about age, gender, previous education, whether that's within the Open University or elsewhere, and a little about their geographic information. That's the sort of data that is static. And then we've got the fluid data, which is student activity on, for example, the virtual learning environment. That's really important to us as tutors, because that's where we can actually change what's happening. We can't do an awful lot with the static data, but we can with the fluid data, and that's the focus of the case study I'm going to present to you now. Okay. The slides I'm going to show you now are an anonymized example of a small section of the dashboard.
Due to time, I only have the chance to show a small element of it, but I've chosen this bit because it's really where tutors can make a difference. So let me introduce Dagmar. It's not her real name; there's a long story behind how names are generated on this dashboard for anonymity. So this is a real student with a different name. She's 29 and she's studying at the Open University for the first time. It's her first module, in technology, and she is working towards a BSc in engineering. So, a new student. If we can go to the next slide. So basically, the brown trend line is Dagmar's activity on the VLE, matched against the blue line, which is the average student activity for that particular module. What we can see when we look at the trend lines is that Dagmar's doing okay. She's had a couple of dips, but generally speaking her trend is certainly on track, which is really reassuring. Now if we look at the columns, the brown one shows the results Dagmar has had so far in her assignments, matched against the blue one, which is the average student again. She's above average, and that's looking really positive. The green one is a small online piece of work that students submit, a small task compared to the assignment shown by the dark brown one. And again, when we move to week 10, what we can see is that Dagmar is still doing really well and her engagement is really positive. If we can move to the next slide. Now if we look at her trend line at week 14, what we can see is four weeks of non-activity, and only a very small amount of activity in week 18. So what we've got there is a worrying trend.
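A rule of thumb like the one that flags several silent weeks in a row can be sketched in a few lines. This is a hypothetical illustration, not the OU's actual alert algorithm; the threshold and the weekly click counts are invented:

```python
def weeks_inactive_alert(weekly_clicks, threshold=3):
    """Return True if the trailing run of zero-activity weeks reaches the threshold."""
    run = 0
    for clicks in weekly_clicks:
        # Reset the counter whenever the student is active; otherwise extend the run.
        run = run + 1 if clicks == 0 else 0
    return run >= threshold

# A Dagmar-like pattern: solid engagement, then four silent weeks.
dagmar = [42, 38, 51, 40, 35, 0, 0, 0, 0]
print(weeks_inactive_alert(dagmar))  # → True: an alert would be raised here
```

The point is not the code itself but that a very simple trigger, surfaced weekly on a dashboard, is enough to prompt a human tutor to pick up the phone.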
Now, the thing about Dagmar is that, as a tutor, I might not have been particularly worried about her unless I was able to see this data. But looking at weeks 14, 15, 16 and 17, it's a worrying trend. So the tutor who was tutoring Dagmar on this module contacted her to see whether there was any problem. What we found out was that Dagmar had had a baby. She hadn't told anybody that she was pregnant, partly because she's a woman on an engineering programme and she was concerned; she had never studied at university level before and she thought that she would be asked to leave the module. So she tried to keep it to herself that she'd had this baby. She also had some problems with postnatal depression, and she was at the point of thinking that maybe this was not for her. If we can go to the next slide, just to finish this off because I'm slightly over time. Finally, with the tutor picking this up and engaging with Dagmar, we can see that that was actually the point at which she peaked and completed the module. She had a big lift at the end and she finally did complete. What I think is important about this particular case is that the fact that she dipped in the middle and flatlined could quite easily have been missed without access to this particular data. Thank you very much. Great, Anna, that's a really lovely story. I'm going to give one minute for any urgent questions, so if you have any, post them in the question box or in the chat. And if you don't have any questions, that's also perfectly fine; store them in the back of your mind. So there's one question from Marcus: can you remind me what CMA and TMA stand for? Yeah, sure. They're very specific OU terms. CMA means computer-marked assignment, and TMA is tutor-marked assignment.
So we do have some unusual acronyms at the OU. At the same time, there are lots of questions appearing in the question box, which is great. Anna can perhaps look at those while I go to the next presenter, which will be Ed. So keep your questions coming and we will keep trying to answer them. Thank you. Ed, it's all yours. Thanks, Bart. Okay, so I'm also going to talk quickly, and I'm going to try to cover how we used learning analytics to support students during the start of the COVID-19 pandemic. So it's a very specific case study. Conceptually, the work that we do is very similar to what Anna just described, but I'm going to take a slightly different tone on it. Can we have the next slide, please? Okay, so at NTU, Nottingham Trent University, where I work, we started our journey in learning analytics in 2013-14, and I need to cover some of the basics quickly to give context to the case study. Our focus is on the students and on student success: completing the course, progressing to the next year. And we focus on student activity, so quite similar in many respects to the example Anna just gave. We're looking, as far as we can, at what we call educationally purposeful activities, that North American concept of engagement with activity that counts. All the way through, we try to frame this in the positive: when we talk about "high", we are talking about students who are highly engaged, not students who are highly at risk. Obviously we recognize that these data sources within our systems are only proxies. We know that we're not mind readers and we never will be. And I need to stress that we don't measure socioeconomic disadvantage; all that we're looking at in our system is the activities that a student is involved with.
We don't build in filters based on student background, because what we find is that, on the whole, the disadvantages that hit students before university are the ones that continue to affect them as they move through university. And I need to stress that it's a resource available to both students and staff: students can see their own engagement data in comparison with their peers, as can staff. Although the work we started is very much built around a partnership process, where we put in a lot of our intellectual property, we're using an external tool by a company called Solutionpath. The thing I want to stress is that, once we have the data, there are three ways that we expect it to be used. The first is by students themselves: students can see their own data and use it to regulate their own learning. What we found is that the students who are already quite good at learning are the ones who tend to use the resource the most. Secondly, we've got staff supporting success, so staff making an intervention. And the last one is the larger-scale, institutional-level work. So Bart, if you could press to the next screen, please. Just to re-emphasize, we've got students with high, good, partial, low and very low engagement. Next slide, please. I just want to take two things from this, and I apologize: it was a wide diagram and then it got rescaled and rescaled again, but the basic point is this. I would argue that there are two steps involved in how we use learning analytics to support students. The first is our data processes, and the second is the work that we do around supporting our students. The data process is the stuff that you may be more familiar with. For supporting students, using the model from one of our Erasmus projects, we start with a trigger: what's the thing that leads to an intervention? Then we look at the communication: how do we get to those students?
And finally, we look at the intervention itself, and I'm going to focus on an activity that we did in the summer of last year. Next slide, please, Bart. So this is the summer calling campaign, an activity where we ran a coaching call campaign for our students in recognition of the difficulties they were facing around the transition into the COVID-19 world. Next slide, please. Thank you. Okay. So in the UK, higher education teaching was locked down on Monday the 23rd of March, and we had a fortnight before then where things were somewhat or significantly disrupted. Our concern was that we knew there would be some students significantly affected by this, through issues of digital poverty, or probably also motivation and that slightly strange, disembodied sense of the world being turned upside down. We know that within our institution we cope with some of the digital poverty issues by providing computers and workspaces, but of course, if you're working from a full family home, that suddenly becomes inaccessible to you, and that is a really profound problem. We also know from our previous work that the students most at risk and most in need of support are also those least likely to seek help themselves. So what we did is set up a call campaign using data from the dashboard. We took the average engagement for the last two weeks of the spring term, and we made a commitment to contact all students in the institution who had low or very low average engagement for the two weeks immediately prior to the end of the spring term. Next slide, please. So we ran an intervention with 30 volunteers, who made just over 5,700 phone calls, and we spoke to 2,300 students. Within that, we made 780 referrals to professional services or on to personal tutors.
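The trigger step described above, averaging each student's engagement over the last two weeks of term and flagging the "low" and "very low" bands for a call, can be sketched as follows. The band cut-offs and scores here are invented for illustration and are not NTU's actual scale:

```python
# Hypothetical engagement scores per student for the last two weeks of term.
weekly_scores = {
    "student_a": [78, 82],
    "student_b": [12, 5],
    "student_c": [0, 3],
    "student_d": [55, 60],
}

def band(avg):
    """Map an average engagement score onto the five bands used in the talk."""
    if avg >= 70: return "high"
    if avg >= 50: return "good"
    if avg >= 30: return "partial"
    if avg >= 10: return "low"
    return "very low"

# Commit to calling everyone whose two-week average lands in the bottom bands.
call_list = sorted(
    sid for sid, scores in weekly_scores.items()
    if band(sum(scores) / len(scores)) in ("low", "very low")
)
print(call_list)  # → ['student_b', 'student_c']
```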
We don't use socioeconomic disadvantage as part of our system, and yet, when we looked at the students that we called, they were disproportionately from disadvantaged backgrounds. So those disadvantages that hit them prior to coming into higher education continued to disrupt and dislodge their learning processes. When we spoke to our students, overwhelmingly they appreciated the call. For most students it was just "thanks very much, I'm quite grateful for the call"; for others it actually led to a more significant change in behavior. But I also want to stress that, although our tutors were very grateful for the interventions, there was a lot of confusion at that time, and a lot more information was wanted by colleagues. So whatever we do, it needs to fit into the institutional framework. Next slide, please, and two quick quotes. This one I'll trot out forever; it was a piece of student feedback: "despite everything happening in the world, I wasn't forgotten about or abandoned by uni". And in many respects, that's what we're aiming for: just that point of contact, that point of reassurance. The next one is from a tutor: "It was really helpful to know that there was a safety net supporting you to help students engage who were not responding to you. It also made you feel like it wasn't all down to you, and took off some of the pressure." So this is our team of volunteers stepping in and supporting the personal tutors who would normally offer that support to students. Last slide, please, Bart, or last but one. Just a couple of quick things. Number one, it was a huge problem to fit this on top of existing systems. At our institution we have a process whereby personal tutors are the people who do most of the contacting of students where there's an issue.
Dropping this call centre on top of that did create some issues, and continues to do so, but we think there's something really important here about scaling up the work we do to offer support to students through the use of learning analytics. Secondly, identifying a student is nothing like the same as a successful intervention. A lot of these students were deeply wary and concerned about being contacted and didn't answer the phone; it took a number of attempts to get through to them. And I just want to reiterate something from our current piece of Erasmus+ research: three steps to intervention. What is it that sets off the intervention in the first place? How do you communicate in a way that gets through? And then, how do you structure the intervention? And that's the last of the presentation; does anyone have any questions? And if you want me to slow down, it's a bit late. No, this was perfect as always. So we have one and a half minutes left, so well done. There are some really good questions already in the chat as well as in the question box, so keep them coming, and vote them up and down. So Ed, do you just want to take any question you want? Okay, so I'll take the short one: how do we help students who are not so good at learning to use these insights and improve their study behaviour? That's a real problem for us, I think. I'm giving you a long answer to say: I'm not sure yet. The longer answer is that we survey our students and we ask them, what's your perceived average engagement in a normal week, and how do you respond to having your data in the dashboard?
Overwhelmingly, those students who say "my engagement in a normal week is normally good or high" are absolutely the students who say "seeing my data in the dashboard motivates me". The ones whose engagement is very low are much less likely to say they're motivated by it, and much more likely to say they find it stressful. So, if I'm honest, I think there's probably a limit to what we can do. I actually feel that, for us, the strategy of having a personal tutor involved, or having a different intervention such as the call centre, is a more realistic way of targeting and getting to those particular students. It may be that we can find ways of using nudge approaches to communicate to those students and really inspire them, but my honest feeling is that at the moment these students may not be confident enough to adapt or change their learning, and therefore what we need to do is offer staff interventions of some description. Okay, I'll stop there and I'll type short answers in a minute. Keep the questions coming; it's great to see so many questions appearing, and we will come back to some of them later on. So next, Dirk, you're up. Okay, thanks a lot. If you ask me, I can summarize my talk in seven seconds rather than seven minutes: just don't forget learner data. The previous speakers, as well as Igor, emphasized capturing learning data and trying to make models of that learning data. What I am doing next is combining that learning data with learner data, and that's different learner data than Ed indicated. What we are collecting is what we call dispositional data. Next, please. So that's mainstream learning analytics: it's strongly focused on students' learning activities. And from those learning activities, you can see two different things.
The one is what is called process data, how active students are, and then the outcomes of these activities, which is called product data; typically, product data is the data which you see, for instance, in formative assessments. Next, please. The difference between my case study and the previous ones is that I am a teacher in mathematics and statistics, so I am working at a micro level, within one course, not at an institutional level. And it's a typical introductory course in mathematics and statistics with two characteristics. First of all, it's a large course, with more than 1,000 students. Second, the students in my course are very heterogeneous, so rather than aiming at the middle of the class, you would like to really personalize the learning, like, for instance, Timothy McKay is advocating. Next, please. And the way we make it personalized is by using dispositions. Dispositions were introduced by David Perkins in the Harvard Project Zero, where he defined dispositions as everything you need for good thinking. Quite often we focus on the first component, which is called ability or skills, but David Perkins makes quite clear that beyond ability you need two further elements: the inclination, the motivation, but also the sensitivity, the ability to notice opportunities. That's the theory of David Perkins. In our learning analytics world, dispositions are well known from the work of Simon Buckingham Shum and Ruth Deakin Crick. Deakin Crick developed an instrument for effective lifelong learning, which is called the ELLI, and which is also a measure of dispositions, of characteristics of students. Next, please.
So in the Maastricht example, what we are using is a lot of learning data: we have e-tutorial systems, we see what students are doing in these e-tutorial systems, and next to that we have a lot of formative assessment, so we also see the products of it. But we also collect data about the dispositions of our students, and such data is based upon instruments for self-theories, effort beliefs, attitudes, learning approaches, learning emotions of different types, motivation, engagement, autonomous and controlled motivation. So these are all constructs developed in contemporary social and educational theories, which are based on self-reports of students. Students answer those instruments, which builds a personal database, and we are using that personal database in the sense that students are also analyzing those data themselves, doing a statistical analysis of them, but we are combining such data with the data we capture from the learning environments. Please. So a typical application of this is given over here. What we do, as most users of learning analytics do, is try to develop prediction models, and here we try to develop prediction models in a longitudinal way. So over time we collect more data and see how good that data is in predicting learning outcomes. In the top graph you see how good we are in predicting learning outcomes based on only activity data. In the second one we add to that activity data also product data, formative assessment, and what you see is a huge jump in the predictive power of our models, since that formative assessment data does quite well. However, not in the early start of the course, and as you see in the third and last graph, there we add our disposition data, and you see that the disposition data helps a lot if you are predicting early in the course what students are going to do in the final exam. Next one, please. So what are typically good instruments, good dispositions, to use?
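To make that longitudinal prediction exercise concrete, here is a minimal Python sketch on simulated data. Everything here is invented for illustration (student counts, effect sizes, column layouts are assumptions, not the actual Maastricht data): the same regression is refit as each week of activity and formative assessment data accrues, once without and once with the week-0 disposition scores, so you can see how dispositions help most early in the course.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # students (hypothetical)

# Dispositions are known at week 0; activity and quiz data accrue weekly.
dispositions = rng.normal(size=(n, 3))                 # e.g. motivation, anxiety, mindset scores
weekly_clicks = rng.poisson(20, size=(n, 8)).astype(float)  # process data: activity per week
weekly_quiz = rng.uniform(0, 10, size=(n, 8))          # product data: formative assessment per week
# Simulated final-exam score, driven partly by a disposition and partly by quiz work.
exam = 0.5 * dispositions[:, 0] + 0.1 * weekly_quiz.sum(axis=1) + rng.normal(size=n)

train, test = np.arange(0, 700), np.arange(700, n)

def r2(X, y):
    """Fit ordinary least squares on the training split, return R^2 on the held-out split."""
    Xb = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xb[train], y[train], rcond=None)
    pred = Xb[test] @ beta
    ss_res = ((y[test] - pred) ** 2).sum()
    ss_tot = ((y[test] - y[test].mean()) ** 2).sum()
    return 1 - ss_res / ss_tot

# Refit the model with the data available at weeks 1, 4 and 8.
for week in (1, 4, 8):
    activity = np.hstack([weekly_clicks[:, :week], weekly_quiz[:, :week]])
    print(f"week {week}: activity+product R2={r2(activity, exam):.2f}, "
          f"+dispositions R2={r2(np.hstack([activity, dispositions]), exam):.2f}")
```

In this toy setup the disposition columns lift the early-week R2 the most, because at week 1 hardly any activity or product data exists yet, which mirrors the pattern Dirk describes in the third graph.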
One of the most successful ones we are using is the Motivation and Engagement Wheel of Andrew Martin, which distinguishes cognitions versus behaviours, and focuses on adaptive versus maladaptive cognitions and behaviours, such as learning focus, planning, persistence, study management; you can see it in the graph. Next, please. One which is also very well known is Dweck's mindset theory, where you distinguish students on their self-theories, so either being an entity or an incremental thinker, and then how they see effort for learning, as a negative thing or a positive thing. And what we typically do is, for instance, apply cluster analysis to distinguish different groups of students and see how this grouping is related to their exam results. Next one, and I think it's the last one. So the big advantage of using dispositional learning analytics is that when you find, for instance, such typical clusters, it's much easier to connect your findings with an intervention. Because if your university offers a growth mindset type of intervention, which is based on Dweck's theory, then you would use Dweck's instrument as the disposition measure and offer students an intervention focusing on their mindsets. Or the other way: if you use Martin's instrument, you can offer students, for instance, interventions in terms of study management or addressing learning anxiety. So that's the big advantage: beyond getting better predictions, you also have better interventions, okay? That's it. Right, so we have 40 seconds left. So Alfredo asks, do learning styles play a role in your model? Yeah, not so much the learning styles as traditionally understood. So what we do is, as one of the instruments, we apply the different learning styles, but that's not the learning styles as most people see it. Okay, so mindful also of time.
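As an illustration of the cluster analysis step, here is a plain k-means pass in Python. All of it is simulated: the two "self-theory" and "effort belief" scores, the two planted student profiles, and the exam distributions are assumptions for the sketch, not data from the course. The point is simply that clusters found on disposition scores can then be related to exam results.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical disposition scores (Likert-style, 1-7): incremental self-theory, positive effort belief.
# Two planted profiles: "growth" thinkers (high/high) and "fixed" thinkers (low/low).
growth = rng.normal([5.5, 5.5], 0.7, size=(120, 2))
fixed_ = rng.normal([2.5, 3.0], 0.7, size=(80, 2))
X = np.vstack([growth, fixed_])
exam = np.concatenate([rng.normal(70, 8, 120), rng.normal(58, 8, 80)])  # simulated exam scores

# Plain k-means with k=2: assign each student to the nearest centroid, recompute, repeat.
centroids = X[rng.choice(len(X), 2, replace=False)]
for _ in range(50):
    labels = np.linalg.norm(X[:, None] - centroids, axis=2).argmin(axis=1)
    new = np.array([X[labels == k].mean(axis=0) for k in range(2)])
    if np.allclose(new, centroids):
        break
    centroids = new

# Relate the disposition clusters to exam results.
for k in range(2):
    print(f"cluster {k}: n={(labels == k).sum()}, mean exam={exam[labels == k].mean():.1f}")
```

If a cluster's mean exam score is clearly lower, that cluster is the natural target for a mindset-based intervention, which is exactly the prediction-to-intervention link Dirk highlights.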
So keep the questions coming for Dirk as well in the chat, and then we will, last but not least, go all the way to Utah, where we've heard that you can do amazing skiing. So Robert, over to you. Yeah, thank you so much. Really happy to be here and happy to talk to you all. I'm gonna talk about an action dashboard today, so moving from an actionable dashboard to an action dashboard. Go ahead and hit the next slide. So here's an assortment of what you might consider a traditional dashboard: lots of pretty charts and graphs, some line charts and pretty colors, all arranged in a nice visualization that is easy to look at. Bart, go ahead and advance maybe three or four slides. The main questions here that form the foundation of my thoughts on an action dashboard are questions like: what are you supposed to see in all of the data? Where are the calls to action? A call to action is like on a marketing website, where you have a button and they just want you to click the button. So it's buy this product, or it's join my course, or whatever the call to action is. Usually there aren't calls to action on dashboards, and especially maybe not in an educational dashboard for an educator. There isn't a button that says click here to contact the student or reach out to the student. How are you supposed to know what data you wanna focus on? How do you know what the data means, and what literacies are required to go from dashboard to action? You can imagine looking at a dashboard and needing to have some sort of graph literacy and information literacy, maybe even some instructional design literacies, in order to know what to focus on and then how to act based on what you see. So go ahead and hit the next slide. So this is a little summary of what I would call a traditional dashboard: you bring all your data into one place, you display data with charts and graphs, and you make data easily accessible. An action dashboard is gonna be a little bit more focused than that.
It's more like a data-driven to-do list. It's gonna tell you what to do, not just show you data; it's gonna make it easier to act. It's gonna call your attention to specific urgent items and it's gonna be extremely intuitive. So now I'm gonna give you an example of an action dashboard that we've been working on for the past year. Go ahead and go to the next slide. So this is our action dashboard. You can see along the left side there are a bunch of different groups, like trying but struggling. These are students that are highly active in your course but are still failing, so they have a grade of 60 or below. You can see there are other groups: inactive in all courses, inactive in your course. To get all of this data, we integrate with the LMS, so Moodle in this case, and then we also integrate with the student information system, and we pull all of that data together and display it here in this action dashboard. And I mentioned it's more like a data-driven to-do list because you can see the done column over on the right side where, once you reach out to a student, you can click done. Yeah, go ahead and hit the next. You can see that we have a clear call to action here. So we want teachers to reach out to students, and so we have this nice, darker button that teachers can just click. They can mark how they're gonna reach out to the student. Maybe it's a video call, maybe it's a phone call, maybe it's just a text or an email. And then when they're finished, they can mark done, and that checks off that student for the day. Go ahead and go to the next. You can see we have little progress indicators here. So like I mentioned, kind of like a to-do list, teachers can see how many at-risk students they have in each of these different groups. Like I mentioned before, there's that little box on the right-hand side, so when you've reached out to a particular student, it's easy to check them off.
Go to the next one, Bart. You can see a little bit of relevant context information here. So it says when the teacher last contacted the student. We're implementing this with an online high school here in Utah in the USA, and they're supposed to reach out to every student in all of their courses at least once every two weeks. And so we facilitate the tracking, so it's really easy for them to know: who haven't I contacted recently, and who do I need to reach out to? The next one: you can see at the top here we have a few different things to make it easier for teachers. There's a course and group filter in the upper left, and then in the upper right we have message templates. So for each group, a teacher can create their own custom template message. For the trying but struggling group, your template message might be: hey, I see that you've really been trying in my course. You have high activity but you're still struggling. Can you tell me what's going on? And each teacher can customize that for every group. So this is what we're calling an action dashboard. You can see that it's quite a bit different from a traditional dashboard. We don't have any pretty charts and graphs, but we're really focused on helping teachers reach out to students in the easiest way possible, and removing all of the graph and data literacies required to interpret the data and then act based on what you see. Yeah, go ahead. Okay, so this is my last slide, and I wanted to do a quick comparison of what we've been seeing so far with teachers. So you can see on the left-hand side, these are questions you might want to ask as an educator: determine which of my students are trying but struggling; see which students have been inactive in my course over the past seven days; see which student grades have gone up or down significantly since last week; see when I last contacted all of my students to know whether I should reach out again or not.
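The grouping, templating and contact-tracking logic described here can be sketched in a few lines of Python. This is purely illustrative: the thresholds (20 events as "highly active", grade 60, the two-week contact rule), field names and sample students are my assumptions, not the actual product's rules or data model.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Student:
    name: str
    grade: float             # current course grade, 0-100, from the SIS
    events_last_7_days: int  # LMS activity events in the past week
    last_contacted: date     # when the teacher last reached out

TODAY = date(2024, 3, 1)  # fixed "today" so the example is reproducible

# Hypothetical group rules in the spirit of the dashboard described above.
def trying_but_struggling(s):
    return s.events_last_7_days >= 20 and s.grade <= 60

def inactive_in_course(s):
    return s.events_last_7_days == 0

def contact_due(s):
    return TODAY - s.last_contacted > timedelta(days=14)

# Per-group template messages the teacher can customize.
TEMPLATES = {
    "trying_but_struggling": "Hi {name}, I can see you've really been trying in my course, "
                             "but you're still struggling. Can you tell me what's going on?",
    "inactive_in_course": "Hi {name}, I haven't seen you in the course this week. Everything OK?",
}

roster = [
    Student("Ana", 55, 34, date(2024, 2, 1)),
    Student("Ben", 82, 12, date(2024, 2, 25)),
    Student("Caro", 48, 0, date(2024, 2, 10)),
]

# The "data-driven to-do list": suggest a message per group and flag overdue contacts.
for s in roster:
    if trying_but_struggling(s):
        print(TEMPLATES["trying_but_struggling"].format(name=s.name))
    elif inactive_in_course(s):
        print(TEMPLATES["inactive_in_course"].format(name=s.name))
    if contact_due(s):
        print(f"-> {s.name}: contact overdue (last contacted {s.last_contacted})")
```

The design point is that each group is just a predicate over LMS plus SIS fields, which is also what makes the teacher-customized groups mentioned later in the Q&A straightforward to support.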
These are all questions educators might have. And in traditional dashboard tools, or even in the LMS or in the SIS, it's gonna take you five to 15 to 30 minutes, depending on where the data lives and how you get access to it. Sometimes you have to do it student by student. And in this action dashboard view, you can answer all of these questions in 10 seconds or less, because it's been designed in a way that facilitates answering those questions. So we've seen a lot of success in helping teachers use learning analytics data in their practice a lot more efficiently and effectively using this kind of action dashboard. Thanks, Robert. There are three really amazing questions already in the question box; pick any one you like. Yeah. Let's see: do these actions get tracked? So all teacher activity in the tool is tracked, meaning that when they log a communication, we send it back to the SIS and it's also logged in our platform. And so that's how those communications are tracked. Have you been considering working on student-centered dashboards? Yes, we would love to do a student version. We've actually been thinking about a student-parent version of the action dashboard, where you could imagine missing assignments or other kinds of things. We've only been working on it for the past year, so we'll get there eventually. Where can I get this action dashboard? We do integrate with Moodle. I mean, feel free to contact me and we can see if we can make something work if there's interest. Right. So I think we have 10 minutes left, Igor, right? For an open round robin, for any question that we haven't answered or any question that is in the back of your mind. Is it possible for people to speak, Linda or Igor? I mean, it's up to you what we wanna do in the last 10 minutes. Yeah, I think that, sure, we should encourage people to post questions. So please. If somebody wants to see the action dashboard again, here you go.
I mean, Bart, you were ultra-fast in replying to the questions; I've never seen such speed in an organization. I'm not sure if we said the right thing. And there was a question from Angelos in the chat for Anna: are the predictions of OU Analyse based on static demographic data, and are the insights from the OU or from other studies? Yeah. Sorry, yeah, I'm just trying to find the camera. So the static information comes from students at enrolment. So when they enrol on the modules, that data is collected already. So we collect that information to understand a little bit more about them. The fluid information is based on things such as demographics. So it might be related to, for example, where a student lives. It might be related to how their experiences compare with what we would expect, what we call the nearest student. So yeah, it's a variety. And it uses a lot of algorithms to be able to gather that fluid information as well. So, lots of things to help us try to find the predictions. One of the things that we use is looking specifically at demographics, at where people are in the country or where they are in the world, in relation to how that might impact on their education. Yeah. OK, there's a question from Ziva Kami: how do we trigger interest in a country like India, where diversity is huge? And I think this is a really good question. I've also seen the question from Paul Prinsloo about the global south, where there are of course different contexts in which learning analytics may or may not work. So does anyone want to chip in? How could your approach, your way of working, be scaled to different cultural contexts? Is anyone brave enough to answer this question? I mean, I may as well speak. Sounds good to me. I guess it comes back to... I think a lot of the work that we do in learning analytics is ultimately a kind of management case study or organizational case study.
So in so many respects, the question comes back to: what's the model of support that's on offer for students, and what's the resource base for that? And then I guess it comes back to, OK, does having data, or at-risk data, a week earlier, a month earlier, or however you want to define it, what difference does that actually make? And until that question's answered, I'm not sure that learning analytics, per se, actually adds very much. And that's speaking from the perspective of ours and the European universities' approach around kind of remedial support. If we're taking the work that Bart does around that kind of prediction or curriculum design, again, there's a different set of criteria or benefits that you can gain from that particular approach. So there are lots of good questions appearing in the chat. So Robert, there is one for you in the chat: can you tell whether teachers or learners ever feel that the diagnoses of the system, or the suggested actions, are appropriate or inappropriate? Robert? Yeah, that's a great question. In our case, we actually allow teachers to customize the groups. And so teachers can create their own, what they believe to be, at-risk group. So a teacher that's taught a course for a really long time might know that if a student doesn't do very well on the first exam, then they're not going to do very well on the second or third exams in the course. And so they could create a specific group for students that did poorly on the first exam and also had low activity in the second week of the course, or something like that. And by teachers creating their own groups, we're trusting the educator to know what the at-risk groups are. And then the teacher can reach out accordingly based on that. Does that answer the question? Yeah, brilliantly. And also coming back to some of the questions that were raised before in terms of the technicalities: for OEBAC, that's a program that is outside Moodle and basically scrapes data from Moodle.
Another question that was raised: how do you scale this? I've just posted the link, and I realized that I was posting it only to the panel members rather than to all attendees; apologies for that. The paper that I just posted was a relatively small administrative intervention that we did with 500 learners, because we didn't have sufficient finance to do it with all 10,000 students. But very similar to what Ed was saying, just having the opportunity to call this particular group at risk can have a massive influence on those students, not only in the week or the day itself, but we showed that there was a kind of longitudinal impact. The question which keeps me awake at night is: what happens to the other 9,500 students that we didn't call at that particular point in time? So it is a really difficult ethical and moral issue as well. So I'm mindful of time. We have two minutes left, and we've agreed with the panel members that we would stay longer online if needed. But I'm also mindful that you have busy lives. So is there any urgent question that we haven't answered that you want us to address before we give the floor back to Igor? I don't see any other question arising in the question and answer box. No, I think it's easy to find all our contact details, so just Google us and we're happy to talk more if needed. Thank you Igor, over to you. Good, perfect. Thank you Bart, thank you all speakers. Thank you, thank you participants, for sharing these afternoon hours with us. I hope it was as pleasant a journey for you as it was for me, because for me it passed in a heartbeat. These are very interesting topics. So thank you again for spending time with us. As Bart indicated, the contact details of our distinguished speakers can be very easily found, so please do not hesitate to send them an email if you have any other questions that still lack an answer in the chat box, or that we did not manage to address during this session.
So thank you very much again. Thanks to all of you. Thanks to Eden app for organizing this. Thanks to the speakers for accepting the call. Thank you very much and have a pleasant day.