Good afternoon everyone and welcome to this session on Learning Analytics at Scale. I am very pleased to introduce Professor Bart Rienties, who is Professor of Learning Analytics in the Institute of Educational Technology at the Open University. So Bart, I'm going to hand over to you now, and please, everyone, place your comments or questions in the chat box. Thank you.

Yeah, thank you so much, Emma. As Emma said, please post any questions, thoughts or ideas about what I'm going to present in a minute; I'm keen to hear your thoughts. I'm going to start off really weirdly, because I'm going to show you the end first, and I'm going to do a shameless plug as well, so apologies for that. In one month's time we will launch a book on open world learning, in which 18 amazing PhD students over the last seven years have looked at how we can actually make sense of data and the complexities of technology. That book will become freely available on the 9th of February, and you're more than welcome to join. During this seminar we will talk about 136 learning contexts across the globe, where we've looked at over 380,000 users. So this is really exciting; if you're interested — shameless plug, apologies for that.

I'm going to give you the summary first, so it's a bit strange, but this is the highlight: what have we learned at the Open University about implementing learning analytics and learning design? Change is slow, as we know with technology, but what we've learned comes down to six big lessons. The first lesson, if you're thinking about implementing learning analytics or learning design, and I know that many universities in the UK are thinking about this, is do it with clear senior management support. If you have clear senior management support, it makes a huge difference in allowing you to experiment with learning analytics approaches and learn from them over time. So top tip number one is clear management support.
Top tip number two: even if you have senior management support, it's essential to get buy-in from the teachers who are working on a day-to-day basis, and I would urge you to select teachers who are willing to take some risk, because once you are able to bring them in to your learning analytics approaches, they will be your future champions. Tip number three: when you're implementing all these innovations, think about how you can link them with some evidence-based research. I'm sure that your institution is very different from mine, but there will always be critical people in your organisation who may think learning analytics doesn't work the way they think it should. By combining the initial implementations with some good research, you can start to showcase what works, when it works, and under which conditions it works. One of the things we've learned in the last 10 years is that if one teacher has been able to provide really convincing evidence that it works in his or her context, that's much more powerful than me telling you that it works. The fourth thing I've learned in these large-scale implementations is that when you're focused on fine-tuning your learning analytics, you tend to forget all the successes you've made along the journey, so one of the lessons, which I tend to forget myself, is to celebrate all your small and big successes.
The fifth lesson is that large-scale innovation like learning analytics takes a substantial amount of time and effort, so don't expect big miracles in your first week of implementing learning analytics; it will take years to get it right. And last but not least, it sounds strange, but learning analytics is not about technology and it's not about algorithms: it's all about the people. One of the big lessons I've learned is to really think about how we can bring people along in these change processes.

So I've given you the highlights; let's talk a little bit more about what we have done at the Open University. And you're very quiet in the chat, so feel free to agree or disagree with me and to add any comments. The Open University, as many will know, is a strange but interesting university. We have a large, diverse group of students coming to the Open University, and as a result we really need to make sure that we support our students in the best way possible. Because we have such wide diversity, we were probably one of the earliest institutions to implement learning analytics, and apparently, according to Web of Science, the Open University is number one in the world in terms of research output on learning analytics. The reason why I think we're so active in this area is that we've realised that learning analytics can be a really powerful tool to help support our students. What I will discuss in these 20-odd minutes is two examples of large-scale implementations at the Open University over the last 10 years. Of course we do much more than this, so I can only talk about two cases, but I hope they will be useful to you. The first is predictive learning analytics, and the second element is learning design. So let's quickly talk about learning analytics. We've been doing learning analytics with the help of the Knowledge Media Institute since 2013. What you see here is a dashboard that we give to our
teachers, and in this dashboard teachers can see how students are progressing, how well they're doing on assignments, and which of their students are potentially doing well and which are at risk. How do we do this? I won't necessarily go into the technical details, but we basically map out all the kinds of activities that a particular student is doing in a particular course, and then, based on what so-called good students are doing, we can see that good students seem to take certain paths to Rome. We compare and contrast this with students who perhaps take slightly different paths, and who perhaps as a result do not submit their next assignment and eventually drop out. By mapping out these journeys we're able to identify the unique learning paths of our students, and that is then translated back into these dashboards, because it's quite difficult to interpret this Bayesian network directly.

So what have we learned from this? In 2013 we started, literally, with two teachers, and we worked very intensively with them on an initial version of the dashboard, trying to understand how we could improve it, what makes sense to them, and so on. We went to 10 teachers in 2014, as you can see here where we suddenly have data available; in 2015 we had 58 teachers, and we gradually moved up in terms of uptake, so that in 2018-19 over 3,000 associate lecturers had access to the data, and now nearly all our associate lecturers have access to OU Analyse if they want it. So on the one hand you see tremendous growth in the availability of these learning analytics resources, but at the same time, if you look at the percentage of teachers who make active use of OU Analyse, the story is slightly different: in 2015-16, 89 percent of teachers regularly logged into OU Analyse, while in 2018-19 only a third of teachers actively looked at it. So of
course there is an interesting paradox: we've grown tremendously over the last couple of years, but at the same time the number of teachers who actively look at this seems to decline. So what could be the potential reasons for this? As you will see throughout the slides, which I will also make available afterwards, we've done a lot of research to explore this. When we started to dig into the data, and when we looked at the qualitative experiences by interviewing lots of people to try to understand this interesting paradox, we found a range of factors that seem to influence the narrative. For example, if you look at the top-left graph, you see that in the Faculty of Business and Law 56 percent of teachers regularly used OU Analyse, while in other faculties usage seemed to be much lower. One of the reasons for this difference was that in certain faculties there was really active promotion of OU Analyse and of these predictive learning analytics systems: the dean was really supportive, and there were champions driving these things forward, and having these champions in place encouraged other teachers to also make use of OU Analyse. Another thing we found was that certain faculties were really keen to generate lots of evidence: does it work, does it not work, how can we make sure it fits with our contexts? Again, in Business and Law, but also in Science, there were really proactive groups that wanted to make use of these tools. A third factor that influenced uptake was whether teachers were digitally literate, in the sense of whether they felt comfortable working with these dashboards; in some faculties and in some groups of teachers they may not have felt comfortable using all this complex data, and that of course is something we have to carefully address. And last but not least, what we found was that the way teachers
saw their role substantially influenced whether or not they used these predictive learning analytics approaches. Some teachers basically said, "Well, it's not my role to continuously look at data; I want to have that personal one-to-one relationship with my student, I call him or her every day, so why do I need to look at these tools?", while other teachers said, "Well, I didn't know that so-and-so was struggling, and the predictive analytics showed us a certain pattern." What is really interesting from this data is that it's really complex: even if you have all these amazing systems in place, how do we ensure that our teachers are adequately prepared, and that we also support them appropriately in terms of financial means? One thing I haven't mentioned, for example, is that at this moment in time our associate lecturers are not being paid to look at tools like this, and I can imagine that if I'm not being paid to look at learning analytics tools, why should I? So there is of course some more work to be done on this.

I'm just briefly going to pause, and I forgot, Emma, to ask you to post a link to OU Analyse. Mindful of the time, I'm not going to be able to demo this, but there is a link, and you can play with OU Analyse yourself if you're interested.

I'm just posting that. Okay, fantastic. Right, so are there any questions before we move to the next part of what we've learned at the Open University? There's a little bit of a delay in the broadcast, so we'll leave a moment or two for questions. I don't see anything: lots of comments and lots of hellos, but no questions at the moment.

Just keep them coming in; if you have any questions, I'm happy to address them towards the end as well, and there should be enough time for that. So the next part that I think would be quite nice to think about is
how students then react to this. We've done a lot of work on predictive learning analytics by giving all this data to teachers, but what if we give it to students? How would students react? What you see here is some of our experimental work, and Yvonne poses a really nice question in the chat about the types of data we collect. We collect when people log in, what kinds of activities they engage with, and whether these activities are really important in terms of learning outcomes. We collect a lot of demographic data, but mostly what we find is that the kind of activities is extremely predictive. What you see on this screen, for example, is a kind of Amazon-style recommender: it recommends, for example, going to Block One Part Four of this course, or considering participating in the discussion. These recommendations are based on machine learning approaches where we see, for example, that participating in a particular discussion activity can increase the quality of your work and the likelihood that you pass your next assignment, and we build these machine learning models for each individual course, because each course is different. What we then did, in an experimental setting, was give 22 students access to their own data and their own dashboard, to see how they would react. What was really interesting was that the students found the study recommender really useful for two reasons: one, it allowed them to remind themselves of learning materials they may have missed; but also, quite interestingly, they started to use the study recommender to directly access content, so rather than going through the material linearly, they used the recommender to quickly move to the next element. But we also found that there was variation in terms of whether it was
perceived to be useful or not. Some students who were more trusting of the learning analytics dashboard were more likely to support this notion, while others were more sceptical towards learning analytics and thus more negative about the comparisons. We found that students who were academically performing better, and you could call them "able" students, although I'm careful in using that term, seemed much more confident about sharing data, for example about how they were doing and about their peer comparisons, while the perhaps not-so-good students were much more critical of this. I think this is a really important lesson for us, because we have such a wide range of students coming to the Open University, so we really need to think carefully about how we provide these dashboards in an appropriate manner. I think this also links very nicely to what Tara is saying: what does an engaged student look like, and could providing these kinds of dashboards perhaps upset really engaged students, or actually encourage students who are perhaps not so engaged, because it's easy for them to see what they have to do next? I think this requires a lot more thinking and work, so thank you very much for that comment.

So that whistle-stop tour of predictive learning analytics, on which we've written a lot of papers, is over; let's move on to learning design. Learning design has been a big piece of work within the Open University, lots of people are working on it, and I'm really grateful for all the work that has been done since 2005. A recent meta-review by Barbara Wasson and colleagues basically argues that there are very many different learning design approaches, but that one of the few institutions in the world that has actually linked learning design with what students are actually doing
is the Open University, and I'm really grateful that they're so positive about our work. What we do within the Open University is basically categorise the kinds of activities that a teacher thinks the student needs to do. For example, right now you're listening to me in this seminar, which is a kind of attending, and reading and watching materials; perhaps you're also googling "who on earth is this guy and why am I listening to him", which is finding more information; some of you are perhaps communicating in the chat and asking questions; perhaps some of you have now gone to OU Analyse and started to play with the dashboard, or experimented with some of its features; and, well, I'm not going to assess you today, but of course a lot of activities in education also revolve around assessment. These seven broad categories are a way to categorise what a teacher thinks a student is going to do, and the next question, obviously, is: do students actually do this? So this is a kind of blueprint, and apologies for the complex figure, of what the Open University looks like. What you see, across these 157 courses at the Open University, is that lots of courses have around 40 percent so-called assimilative activities and around 20 percent assessment activities, with everything else in the middle being more engaging kinds of activities; at the same time you see substantial variation in how teachers design courses. So the obvious question is: does that influence how students actually learn? What we have done in a range of studies is categorise the kinds of learning designs that our teachers create, and then look at how students actually engage with those learning designs. For example, we have the more traditional distance learning course with a strong focus on individual learning, which we call constructivist
learning designs, and perhaps the opposite, the social constructivist learning design, where students are expected to work a lot with peers and with the teacher to co-construct new knowledge. What we found, when we linked the way teachers design courses with students' actual engagement in those courses, was a positive relation between social constructivist learning designs and engagement, and a negative relation with constructivist learning designs. Quite interestingly, and this perhaps links back to Tara's point, we didn't necessarily find a link between high engagement and whether students were satisfied with the courses, or whether these courses led to better or worse student retention over time. What we did eventually find, and this was data from, I think, 110,000 students, was that students absolutely loved so-called constructivist learning designs, with lots of individual construction from learning materials and trying to make sense of them, while they particularly disliked working together with other students, as this had a negative predictive value for student satisfaction. But the single best predictor of whether or not OU students passed their module was whether or not teachers included communication in their design. This was aggregate data across entire courses, and there could be lots of potential biases in it, so in follow-up work we did the same analysis on a week-by-week basis, and we again found a similar pattern across 37 courses. The way teachers design courses seems to fundamentally influence how students work through the courses and, more importantly, whether they are satisfied with them and whether they pass them at the Open University. And in some of our latest work, looking at how teachers make all those complex decisions about how much time should we
spend in week one or week five or week eight on certain activities, we found that this really substantially influences how students engage over time on a week-by-week basis. For example, here you see the same visualisation from the teacher I showed you before, with the red line showing the average engagement by students, and some amazing modelling by Quan Nguyen showed, across several dozen modules, that around two-thirds of what our students at the Open University do on a week-by-week basis is determined by how we design our courses. I'm going to repeat that, because it is a staggering finding in a way: it basically says that, as teachers, we have a substantial influence on how our students progress through our courses. So if, say, half of your students are failing a course, you could say that 69 percent of the reason is down to the way the course was designed, because our research and practice shows that learning design is really important. Mindful of time, and wanting to keep some opportunity for further questions: our latest work is with a range of European institutions, examples being the Teach 4.0 and REPEAT projects, where we're translating all this work into broader tools that are fit for purpose not only for distance learning universities but also for, if you like, "normal" universities and colleges. If you're really interested in how you could implement this in your own context, I've posted some links here. I gave the summary at the beginning, so I'm not going to repeat it at the end. I'm really looking forward to hearing your thoughts and any questions you might have, so over to you, Emma.
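[Editor's note: the week-by-week analysis described above can be sketched as a simple regression of observed weekly engagement on the workload that teachers designed for each week. This is a minimal illustration with synthetic data, not the OU's actual model; the activity categories, coefficients, and noise levels are all assumptions made for the example.]

```python
# Sketch: how much of weekly engagement is "explained" by learning design?
# Synthetic data only -- numbers are invented to mimic the pattern described.
import numpy as np

rng = np.random.default_rng(42)
n_weeks = 30

# Hypothetical designed workload per week (hours), by broad activity type
assimilative = rng.uniform(1, 6, n_weeks)    # reading/watching materials
communication = rng.uniform(0, 2, n_weeks)   # forums, tutorials
assessment = rng.uniform(0, 3, n_weeks)      # assignments, quizzes

# Synthetic "observed" average engagement (minutes online per student),
# largely driven by the design plus noise
engagement = (30 * assimilative + 50 * communication
              + 40 * assessment + rng.normal(0, 40, n_weeks))

# Ordinary least squares: engagement ~ designed workload per activity type
X = np.column_stack([np.ones(n_weeks), assimilative, communication, assessment])
beta, *_ = np.linalg.lstsq(X, engagement, rcond=None)

# R^2: the share of week-by-week engagement variance explained by design
predicted = X @ beta
ss_res = np.sum((engagement - predicted) ** 2)
ss_tot = np.sum((engagement - engagement.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"R^2 = {r_squared:.2f}")
```

With data like this, the R-squared of the fit plays the role of the "two-thirds of engagement is determined by design" figure; the real analyses in the talk used much richer multilevel models over dozens of modules.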
Thank you so much, that was fascinating. I've got loads of questions, but to be democratic I'd better start with the questions from the audience. I think you have already responded to Yvonne and Tara on their questions, but we have another question, from Eric Klisp: do you have data about the type of learning activities in all or most Open University courses, and if so, how did those courses undergo a design process as a result?

Yeah, fantastic question. On day one, well, not actually day one, but at the beginning of a new course, we work together with teachers to map out what's in their heads in terms of what they think is a good way to design the course, and we provide lots of suggestions and creative ways to think about the learning design process. The learning design team then works with these teachers while they're producing the various learning activities and approaches, so in a way we use some of that data to help inform the learning design process, and teachers can of course adjust what they want over time. All our new courses work through this kind of process, where the learning design team works with teachers to make sense of it, and at the same time, for our current modules that are running in presentation mode, the learning design team provides lots of support in further fine-tuning some of the learning design decisions that teachers make.

Great, we've got a question from Manish: does a social constructivist design lead to student retention? Does it impact performance? And he's added a comment: or is it that those who have progressed or passed are the ones that engage with anything?
Of course you have to take everything with a pinch of salt, because these findings are all within the Open University context, so when you replicate this in your own institution you might find, for example, that learning designs with lots of assessment work for your particular context. The reason why we think social constructivist learning designs work well is that they force students, if you like, to work together, and we know that studying at a distance with the Open University can at times be a bit lonely. By having to meet and work with your peers, it's a bit like going to a fitness club: if you know that your friend is waiting at the fitness club, you might be more inclined to actually go, if it's safe to do so. So I think within an Open University context this seems to be one of the key drivers, and in a way it's very surprising that we find this, because relatively few of these activities are really social constructivist activities.

Great, thank you very much. We have another question from Tara: were you able to look at engagement in particular activities for particular types of students, for example minority groups, or students who are first generation at university, and so on?
Yes, that is a really important question, because a one-size-fits-all solution obviously doesn't work. We've done quite a bit of research, in particular on BAME students, and I'm just trying, because I'm mindful that I can't post myself, to find the links that Emma can share. I didn't incorporate this in the paper, but we've also looked at, for example, how Black, Asian and minority ethnic students are engaging. What we found in the study that has just been posted in the chat, which hopefully Emma can share, is that, correcting for all the other factors we controlled for, Black students, for example, seemed to engage 7 percent more actively relative to white students, but nonetheless their performance was unfortunately still lower. This of course raises really big questions: if we have really active students, why are they not performing equally well once we control for all these other socio-economic and demographic factors? So there are some really interesting but difficult questions there that I think we need to unpack a little further.

I don't think we have any other questions, but can I just ask: did the pandemic have any impact on engagement with OU Analyse?

Yes, what we have noticed, and you can see this here, is literally a peak in engagement: since March 2020 we have much more engagement from our students online, and that's of course logical due to COVID and people working from home. We can see that more and more students are moving online, and a lot of students have decided to go and study at the Open University because they can do everything online, so for us that's a positive effect.

And I'm going to sneak in one last question, sorry: did you have to engage with students before this work was undertaken, to deal with concerns about using their data? It's something I was interested in: do you have an education
process for that? That's a really good question. I will try to post a link as well, but I'm really proud that in 2014 we were the first university in the world to have an ethics policy on learning analytics. We have heavily engaged with students, and we continuously engage with them, to say: okay, we are currently using these variables, some of these variables are not predictive, do you think we should still include them, and so on. There is a continuous process with students, and there is informed consent at the beginning, when students register, that they agree to our using all this data.

Thank you so much, and I just want to thank you for a really fascinating talk. It's really interesting to me, as I'm doing a bit of research on learning analytics myself at the moment, so I will post your link to the ethics policy in the comments. Thank you so much, Professor Rienties, for a fascinating talk, and thank you everyone for attending and for your really thoughtful questions. Thank you so much.