We have 49 participants. I'm giving it another minute and then we'll start. Seems like we're all here. Good afternoon once again. Welcome to workshop number three, Lived Experiences of Implementing Learning Analytics at Scale. The workshop objective is to explore how institutions like the Open University in the UK have implemented learning analytics at scale. Our facilitator is Dr. Bart Rienties, who is a professor of learning analytics and program lead of the learning analytics and learning design research program at the Institute of Educational Technology at the Open University in the UK. He leads a group of academics who provide university-wide learning analytics and learning design solutions and conduct evidence-based research on how students and professionals learn. As an educational psychologist, he conducts multi-disciplinary research on work-based and collaborative learning environments, focusing on the role of social interaction in learning, which is published in leading academic journals and books. His primary research interests are learning analytics, professional development, and the role of motivation in learning. Furthermore, Bart is interested in broader internationalization aspects of higher education; he has successfully led a range of institutional, national, and European projects and has received a range of awards for his educational innovation projects. He has published over 250 academic outputs. He is the fourth most cited author and contributor in learning analytics in the period 2011 to 2018, the fourth most published author on internationalization in the period 1900 to 2018, the third most cited author on higher education internationalization in Asia in the period 2013 to 2018, the fourth most published author on social network analysis in the social sciences in the period 1999 to 2018, and the 14th most published author on educational technology in the period 2015 to 2018.
We'd like to welcome you, Professor Bart Rienties, for this session.

Yeah, thank you so much, Carmelita. I didn't expect that you would read all of that. Very impressive. Thank you so much for that, and I appreciate that I'm the last session. Thank you so much for your really great contributions thus far; I really enjoyed attending some of the sessions this morning. So I hope you can see my screens, and if for some reason I talk too fast or too slow, do let me know; I will try to follow the chat as well. So if things go wrong or if you can't follow my points, let me know. I've also posted the entire slide deck in the chat. There are lots of links in the slide deck. I won't be able to go through all the elements, but feel free to use that. First of all, I'm going to do a little bit of a shameless plug, because we're currently working with a range of Kenyan universities. I just gave a similar workshop two weeks ago where we had around 100 scholars, teachers, and managers from Kenyan universities, and they're currently following a program of open educational resources designed by the Open University. So if you're interested in that, we are of course happy to share some of the materials as well. In addition, we've been working for around four years together with the University of South Africa in the IDEAS partnership project, where we try to look at how we can use learning analytics to get a better understanding of how students at UNISA work, and that's also reflected in the project. It's really difficult, and I guess that's another interesting thing. So if you're interested in how UNISA is doing this, have a look there. And last but not least, and this is again a shameless plug.
We recently finished a really big edited book by 15 PhD students who looked at the notions of open world learning; without technology I would not be able to deliver this workshop at a distance. In this book we basically brought together how 387,000 students in 130 different learning contexts work together, and several of the chapters in that book are specifically focused on learning analytics. So again, if you want some free resources, these are freely, publicly available under a Creative Commons license. I'm now going to summarize the workshop talk, and I will come back to this; hopefully at the end of this talk you will agree that what I've provided as a summary makes good sense. So what have we learned from implementing learning analytics and learning design at the Open University UK? Well, we've learned that change is slow but it can be enhanced, and I'm hoping that what we're providing you with today are some hooks to help you perhaps implement it in your own institution as well; I've already seen some really interesting examples this morning. One thing we've learned over the years is that you can only implement learning analytics if you have clear senior management support. In 2013 we were very grateful that our Vice-Chancellor really pushed learning analytics, and we wouldn't be at this point in our development if there wasn't clear senior management support. The second thing we've noticed is that no matter how much support you get from the top, you need bottom-up support from teachers and researchers who are willing to take risks, and you will see during the workshop today that many of the big changes we've been able to make were thanks to early adopters, teachers, and researchers who were willing to put their necks out to start to implement learning analytics.
The third thing we've learned is that by implementing learning analytics and combining that with evidence-based research, we can actually gradually change the perspectives and narratives around learning analytics. A lot of our colleagues are, or were, very skeptical towards learning analytics, but by continuously doing a bit of research with those teachers to see what works, what doesn't work, for whom it works and for whom it doesn't, we were able to gradually change perspectives. Then, one thing you often forget, and I forget this all the time, is that you have to celebrate your small, medium, or large successes, because oftentimes when I'm giving keynotes somewhere else, people are like, wow, the Open University is so far ahead, and I always forget how far ahead we are on certain elements and, at the same time, how much we can learn from other institutions. So celebrate your successes. And, sorry, this is not the last point, this is number five: these large-scale innovations take substantial time and effort, and I think what, for example, Yishan showed this morning, and also Rogers, is that it is really difficult to implement this even on a small scale, let alone on a large institutional scale like the Open University.
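[Editor's note: later in this workshop, the OU Analyse approach of predicting whether a student will submit the next assignment from weekly virtual learning environment (VLE) activity is demonstrated. As a rough, hedged illustration of that idea, and not the OU Analyse implementation, the sketch below uses a simple nearest-centroid rule on invented weekly click counts; all data, feature choices, and names here are made up.]

```python
# Toy sketch, NOT the OU Analyse implementation: predict whether a student
# will submit the next assignment from weekly VLE engagement counts.
# All data and feature choices here are invented for illustration.

def train_centroids(students):
    """Average the weekly-click vectors of submitters and non-submitters."""
    groups = {True: [], False: []}
    for weekly_clicks, submitted in students:
        groups[submitted].append(weekly_clicks)
    return {
        label: [sum(col) / len(rows) for col in zip(*rows)]
        for label, rows in groups.items()
    }

def predict_submit(centroids, weekly_clicks):
    """Nearest-centroid classifier: pick the closer average profile."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(centroids[label], weekly_clicks))

# Invented historical cohort: clicks in weeks 1-4, plus submit outcome.
history = [
    ([40, 35, 50, 45], True),
    ([30, 42, 38, 51], True),
    ([5, 0, 2, 0], False),
    ([12, 3, 0, 1], False),
]
model = train_centroids(history)

print(predict_submit(model, [38, 40, 44, 47]))  # an engaged student
print(predict_submit(model, [4, 1, 0, 0]))      # a disengaged student
```

In practice, as the talk describes, OU Analyse combines engagement data with demographic information, assessment history, and knowledge of key course activities, and uses several machine learning techniques rather than a single distance rule.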
And last but not least, I appreciate that we talk a lot about learning analytics as a technical construct, but it's all about the people and how you bring people together. I didn't really know who would join us today, so I made some Poll Everywhere slides. If you go to the next slide, I posted a link; you don't have to leave your name behind if you don't want to, and I won't use your data. If you click on the link, PollEv.com forward slash my name 552, and I didn't know which university would join, so I just put "master's university" in as the handle. The question is basically: should your university implement learning analytics in the next three years? It will be really interesting to see whether you think that, in your own institution, you should implement learning analytics. I'm going to give you one or two minutes to think about this and then later on give you the opportunity, of course, to react. And yes, I appreciate this is again a new system, different from Yishan's, but I'm keen to hear. So thus far it seems that most of you who are present think it should be implemented. Surely there must be someone around here who disagrees with the implementation? All right, so basically most of you think that we should start to think about implementing learning analytics in the next three years. The next question is then: could you, in one sentence, explain why you think your institution should implement learning analytics? I'm going to give you again a minute or two; just type whatever you want to type. There's no right and wrong, and I will just read out whatever you're saying. So: it could lead to, thank you for that, it could lead to improved decision-making, which is an interesting thought. It's to increase our understanding of how students learn; it's evidence-based; it's to improve the service that we're offering to our students. You're giving some great thoughts here. It gives us a window into how students engage online. Again, it's about improving decision-making; it's for course design improvement, which I will talk about a little bit more; it's important to be able to analyze and assess students' journeys; to improve strategic decision-making in the institution; to enhance decisions; to improve student outcomes; to support our students. There are really some really interesting things coming in, and all of them seem to be very positive, which is fine. Does anyone want to share? You can just turn on your mic or raise your hand. Does anyone want to share perhaps something that you're worried about, because these are all very positive elements? Anyone brave enough to say, I think this is all great, but I doubt this will actually work? That's fine. I'm sorry if it's not, but... So, I see some really interesting points, and of course what I will do at the end is share this with Elizabeth so you can actually have a record of it. So, Mokundi, sorry, please tell us what you think.

Well, hi Bart, thank you. I think, for me, I'm an instructional designer, and part of the most important work we do as instructional designers is needs analysis. This sort of forms part of a basis for that: to get a feel of who you're dealing with, what level they're at, and so on, so that it better informs how you design online artifacts and learning material for the students. So for me, I think it's a crucial resource. Those are my thoughts.

Thank you so much for sharing them. Yeah, and I've seen, just scrolling through your excellent comments, I think it's really important. I mean, somebody also mentioned that resources are limited and that we need to deploy them optimally, supported by data; I think that's a really important point as well. Ash, do you want to come in?

Yeah, I just wanted to add that you said that these things take time, and I 100 percent agree, but also the challenge in that is that leadership expects it to take time, and, I'm finding,
there's this perception that it'll just happen, but to me, I'm like, no, we need a five-year implementation plan in order to get to where we want to go; it's not going to just happen.

Yeah, I couldn't agree more; it's not that simple. So, referring back to the previous slide, I'm not sure if I can go back to the previous slide, yes, I can: this clear management support is not just "yes, we need learning analytics." Indeed, we had a five-year action plan, I think at that time it was a four-year action plan: this is what we're going to prioritize in year one, this is what we're going to prioritize in year two, and then we're hopefully going to learn over time, because it takes time, as you're saying. But you do need a clear central direction of travel, so that you can also take your colleagues with you on that journey. Anyone else who wants to come in? Okay. And if you have any questions or comments, feel free to use the chat or just talk to me. What I will do now is explain a little bit about our experience, and then hopefully we can translate that back to your local context. The Open University UK is the largest university in Europe, and the reason why I think learning analytics has taken off so well within the Open University, beyond what I've already mentioned, is that we have a very diverse student population. You can see that a large number of our students don't have what we call formal A levels. My computer wants to restart, which of course we don't want; let's delay that for a day. All right, sorry about that. We have a lot of students who are basically already in work, and a lot of our students come from disadvantaged backgrounds, and I guess it's quite similar to what we found at UNISA. So if you have students from lots of different backgrounds, with lots of different student needs, how do you then make sure that you can actually support them? One way to support them is to use data to see which students may or may not need a little bit more support. And I guess that's one of the reasons why, according to Web of Science, we're apparently number one in terms of publishing: because (a) we have a large institution, but (b) we also have lots of problems, if you like, and that's why I think it's really useful to use learning analytics. I will mainly talk about two things, because, for example, when I made a screenshot a couple of weeks ago, we had 85 publications on learning analytics, and I can't talk through all 85 publications. So I will focus on two big things that we have done which I think could be useful. One is the lessons we've learned from predictive learning analytics since 2013, and then I will share something we've been working on since 2005, which is called learning design, and I will show you some of the dashboards there as well. I will start with predictive learning analytics; some of you may have already seen this, but I will briefly demo it in a second. What we're basically doing in this flagship system called OU Analyse is trying to predict whether or not a student is going to submit the next assignment. The reason why we're interested in whether students submit the next assignment is that we know from our own research and from others that if a student is going to submit an assignment, he or she is probably going to continue, while if a student is not submitting an assignment, there might be something wrong. We basically developed these dashboards for teachers, which I will show you in a minute; they give a very quick overview of which students are potentially doing well and which students are perhaps a little bit at risk. So how does that work? We have a range of predictive learning analytics and machine learning techniques, and what you see here is a kind of weekly set of activities that we would classify according to what a student does. For example, a student here might, two weeks
before the course starts, look at OU content and, for example, at forums; do nothing; then look at OU content, attend to some resources, et cetera; and then eventually submit the assignment and pass. We could compare this with a student who doesn't do this: a student who, for example, doesn't look at the forum, does nothing the week before the course starts, doesn't do anything in week one and week two, and then suddenly looks at the forum and some structured content in week three. If you compare this with the students who did successfully pass, you can see from this schema that these students do slightly different things in slightly different ways. Of course, these are just two students, and the trajectories of individual students may differ substantially, but if you do this across dozens and hundreds of students, you get a network which looks something like this, which is very scary. What it basically does is provide a visualization of the best pathways through a particular course that students can take, and of course no person in their right mind can interpret this directly. So what we have is basically a dashboard for teachers, which I'm now going to show you, and hopefully the demo will work. I have anonymized the data, but this is a real course; it's a real course in year two, and it has 664 students currently enrolled, of whom 416 are active. What you see in this visualization is how the students are progressing in terms of their average engagement and how well they're doing on their respective assignments. The orange is basically the current cohort and the blue is the previous cohort, so we're always comparing with previous students. We're currently in week 30, and we're about to get to the final assignment of this course. As a teacher, this is really exciting, because you can then start to see how well the students are doing. You can, for example, already make lots of inferences, like, hey, this cohort seems to be slightly less active than the previous cohort. I mean, I don't teach this course; I've no idea; this is the first time that I've looked at this. At the same time, you can then see the students that are in your course. These are real students, but we have anonymized them with an automatic machine learning program. So if I take the first 10 or 20 students, and if you were the teacher of this particular course, you can just type in the chat which of those students you would like to investigate, and then I will click that for you; whoever shouts first, I will look into that one. Let's go to Paris. Paris, okay, thank you. So if we go to Paris, a teacher can then see exactly what Paris is doing and how Paris compares to the rest of the group. What you can immediately see, and this looks a little bit strange, is that the blue seems to be really flat, but that's because Paris is extremely active. Paris is very engaged, doing extremely well in comparison to the peer cohort, super active in weeks 17 and 18, and always doing well. What the teacher can then see is a so-called risk profile of the student. Here the numbers run from zero to 100, and you can see we think there's a 91 to 100 percent chance that this student will pass, because of course the student is super active. Here you can see the kinds of activities that we know count for or against the student passing, and this is useful information if you're a tutor at the Open University. Then we provide a week-by-week prediction of how the student is doing. So basically, this is a student who will probably be extremely successful, because this particular student is super active and is engaging with the right materials. Thank you for that suggestion; that also allowed me to explain a little bit about how the system works. Let's go back to the students. This takes a lot of time because I can see everything; normally our tutors see between 20 and 30 students, so it takes slightly less time. So which other student would you be interested to look at? In this particular case, let's do Macy. Macy is, here we go, Macy is flagged, and Macy has been flagged for quite some time. What you see here is a really interesting pattern. So, is anyone brave enough to interpret what is happening here, just seeing this particular visualization?

Maybe I can try. Yeah, go, go. So it seems Macy was engaging, was active at the beginning, and then after some time dropped and picked up again; then, close to, say, the first quarter of the session or the semester, Macy just dropped out or something and didn't engage anymore in the elements or whatever platform you're using. That shows it's flat, and from the second and third quarters, maybe the last two thirds of the class, Macy just didn't participate any longer.

Yeah, thank you so much for sharing. Sorry, I couldn't see on the screen who that was; what was your name? It's Badrum. Thank you so much. So you're absolutely right, and what we can see even more, if we scroll a little bit down, is that apparently she did not click on certain key activities that we know are the kind of gateway to success, like for example the summary activity in week six or seven. At the same time, she has basically obtained a lot of credits already, so based on that we initially thought that this particular student would submit, but the student didn't. Later on we see lots of activities that we think are really important, but then our algorithm learns about the student, and eventually we think, okay, this student is not going to submit. So beyond just the engagement data, we also know the kind of key pathways to success that we think are important, and you can quite clearly see that our prediction
started to change quite quickly from week seven to week eight. So this could potentially be, I don't know about this particular student, but it could be a student who had a look at the course and then decided, okay, this course is not for me anymore, and then perhaps dropped out, or there's something else going on. Hopefully whoever was the tutor for Macy would be in contact with Macy. Of course, we've also anonymized the tutors' names in the data set, but in this particular case let's have a look at what the name of this particular tutor was, if the data would like to load; he or she would have to be in contact. So in this case a tutor, Jenny Frey, would then hopefully be in contact with her. Maybe let's just do one more. Which one would you be interested in that is kind of on the borderline? Isaac. Let me see where Isaac is. Here we go, Isaac Ward. Oh, that's an interesting case. There's also a quick indication here that there's a little bit where we're not entirely sure what's going on. So let's have a look at Isaac, and by the way, via the slides that I've shared you can actually create your own account in the system and play with the data as well. Is anyone interested to explain what you see in the data here and what you think is happening with Isaac? I mean, I don't know who Isaac is, but feel free to turn on the mic and say what's going on.

Um, I think Isaac is engaging with the activities on the platform, but when it comes to academic performance, he is not doing well with regard to submissions. So when it comes to engagement he is okay, but the submissions are not that good, so maybe there have been issues with the actual concepts or the content of that course, and it's not a good subject for him, right?

Yeah, yeah. You see that it's a little bit like an on-off switch, it seems: in certain weeks the student is not present, and then in other weeks, in particular during assignments, the student is relatively active, but, as you say, the assessment scores are lower than the average. I can see from the data that, for example, he or she does have a previous postgraduate qualification but is also from, this is UK speak, a poor region, and this student has a particular disability. So there could be multiple reasons why the student is on and off on the virtual learning environment. Also, on the one hand this predictive learning analytics seems to suggest the student is doing well, but at the same time he's not doing as well as we're predicting, in the sense that we're predicting a certain high score and then the student only gets 50, so he's just barely scraping by. The interesting thing for me, as a tutor, would be to really understand what's behind the data, and I'm hoping, while I go back to who the tutor actually is in this data set, that the tutor would know Isaac quite well, because that tutor has been working with Isaac and the other 20 students, and then hopefully the student would be well supported by, in this case, Jorge Davis, whoever that person is. To me, what is really interesting is when there's suddenly a turn in the narrative. For example, I would be really interested in, hey, there is this Christina, who seemed to be doing really well, and then suddenly there is a red flag appearing behind that name. The system is really useful because it immediately identifies when a student is perhaps potentially at risk, and that's just really interesting. So, what have we learned from this? Before we go to what we have learned, are there any questions before I go back to presentation mode? All right, no questions; feel free to interrupt me at any point in time. What we then did is try to see, okay, we have this amazing system, and this system is continuously being fine-tuned
based on feedback from teachers. In 2013 we worked with two teachers; in 2014 we worked with 10 teachers, or 10 modules; in 2015 we had 58 teachers working with it; and you can basically see that, gradually over time, the uptake of OU Analyse has been growing quite rapidly. Currently all 7,000 of our teachers have access to OU Analyse, but at the same time, if you look at the percentage of teachers who regularly use OU Analyse, it has been going down, from around 90 percent in 2015 to around a third in 2019, and the latest data shows a stabilization of that trend. So around a third of teachers actively use the system, but two thirds do not. Of course, we were really interested to unpack that a little bit further and try to understand the reasons why some teachers really use it actively and others don't. What you see, for example, from this graph and this visualization is that it's heavily influenced by the discipline. In business and law, for example, teachers are forced, in between brackets encouraged, to really use OU Analyse, and it's part of their job profile, so most teachers do; in social science, for example, it's not a requirement, it's a voluntary activity. So in this study with Theo here, we also interviewed a range of teachers to try to understand why some were very active and others were perhaps less active in OU Analyse, and we identified five factors. One was, as I mentioned before, whether or not faculties were encouraging teachers to use OU Analyse. The second factor we identified, and this might be useful, is that certain faculties or departments allocated teachers as champions. These were teachers who were among the early adopters, and by using them as champions they tried to influence their other colleagues to use OU Analyse; at the same time, of course, if teachers struggled, they knew, hey, my colleague down the hall knows how to work with OU Analyse. A third factor we found was that certain schools, departments, and faculties were really keen to generate evidence, does it work, does it not work, and spent a lot of time disseminating those findings, and we see that this has an influence on whether or not teachers become enthusiastic about using OU Analyse. A fourth factor was so-called digital literacy: as we've seen from the previous exercise, OU Analyse is straightforward but not straightforward; some people feel very comfortable using OU Analyse, others do not; you really have to know the system to make sense of it, and some teachers just don't feel competent to do this. And last but not least, there were conceptions about teaching online. Currently our so-called associate lecturers, who basically support our students, are not being paid to look at OU Analyse, and, rightfully so, some teachers said, well, I'm not going to look at this because this is not part of my role. Others thought, well, it's not my role to look at data; I'm here to teach, I don't know, 17th-century poetry or 21st-century physics, so why should I look at the data? So all these notions of what good teaching is were influencing whether or not teachers were keen to use these analytics systems. What we're currently looking at is whether we can provide these dashboards to students. We've made a conscious choice not to give these data to students, primarily because we have such a diversity of students. You may remember from one of the slides that one out of eight students has a declared disability, and we have lots of students from impoverished backgrounds, and this might actually be quite detrimental. If you are, for example, like Isaac, who is doing well on and off, or, I forgot the name of the second person we looked at, the lady, Maya I think, it would be quite disturbing for some of you to see that you're red. So what we're currently exploring is whether we can provide some kind of student recommender system to students, which you see for example here. It gives students the kind of pathways that we think would be appropriate for that particular student. In some initial results that we just published with 22 undergraduate students, to whom we gave those dashboards, we found that the majority of students found the study recommender system useful, because it allowed them to remind themselves which learning materials they had missed and which materials they still had to do; and, quite surprisingly, it provided them with a means to directly access the content. Rather than clicking through the VLE continuously, they could just click on block one, part six, wireless communication and mobile computing, and they thought that was very useful. At the same time, the relative usefulness seemed to be influenced by a certain number of factors. Some really believed these dashboards, and others were a little bit more skeptical; the people who were more skeptical were particularly worried about how they were being compared to peers, and about whether or not they were academically self-confident. We found that so-called good students, students with high grades, were much more comfortable using these systems than students with not-so-good grades. So one potential concern could be that by providing these dashboards, students who are already doing well get even more benefits, while students who are struggling might find it difficult. We're still finding out the best way forward. So, going back to the Poll Everywhere question, and this gives me an opportunity to drink something if you wouldn't mind: from what you've seen thus far, do you think that your institution would benefit from implementing such a learning analytics system? You can just say true, I think this is a good thing, or false, from what I've seen thus far, no, that would not work in my own institution. All right, so the
majority thinks it's true and but there are also some people who say well it might not work in my own institution so is anyone brave enough to say why you voted for true or why you voted for false there is no right and wrong here um maybe I can go um yeah sure my institution just during the pandemic that was when we observed the uptake of the learning management system because now it was a condition we didn't have a choice but to start because it's a contact institution so most learning activities are done face to face but the pandemic actually required that we start using the learning management system to its you know full capacity and that means that um we were able to collect more data you know during this three years period and but I am not going to be able to say that this would work 100 percent going forward because there's still that uncertainty about where we're going are we going back to you know face to face but from my own from our own office we are trying to encourage blended learning you know where the students also engage in on the online um um using the learning management system virtually and then also engaging the classroom so it depends on what the leadership has decided or would decide in the future about these resources that we've already put together during these three years that would define if we will be able to actually use this um this type of learning analytics to um to influence you know um decisions or to improve students activities um or to improve students learning learning um um parts yeah thanks yeah thank you so much for that al-Dubaki and I guess many institutions are in a similar uh boat if you like and I work a lot with normal universities in between brackets or not like the op university or not distance learning and they're also struggling with this balance you know are we going back to blended or hybrid or some form and how do we basically make sure that we can keep on following our students and what C2 mentioned in the chat um 
we really need to think about our digital literacy, and about how we can best support our managers, our lecturers and our students, because we, for example, assume all students are comfortable with the data, but we don't necessarily know whether that is true. Is there anyone else who wants to come in? I can maybe add to that. So I think it's very beneficial also from a lecturer's point of view, because in institutions specifically we want to get everything more online and more into blended learning, but a lot of lecturers feel like, no, we don't want to do that; we can't see engagement, we don't know what our students are doing. That's the barrier, and that's what's stopping them from wanting to move online. But I think when you bring in learning analytics you can almost show them that there's even more you can see from having some of your content online. It's not that you can no longer see how your students are engaging or know what they're doing; you can actually get more from it, and help students more specifically where they might need help, where you wouldn't really see that in a physical classroom. So I think it's very helpful for getting lecturers more on board with wanting to teach online and in blended learning. Yeah, thanks; you're absolutely right, Sarah. And quite interestingly, I will dig out the paper a little bit later, but we did an experiment for two years with a Czech university. It was before COVID; the only data they had were student assessment data, they had no virtual learning environment data, and the only thing we did there was run the algorithms and see which students might need a little bit more help, purely based on the assessment data. It was done with an engineering course, and we were able to show that just having the opportunity to collect the data, and then having follow-up conversations with students who were
potentially in these amber or red categories, already had a substantial impact on the way teachers engaged with students. And students were like, oh, suddenly somebody listens to me, somebody is taking care of me. We got a very nice letter from the rector that said that retention improved from 40 percent to almost 90 percent since they used OU Analyse, but he wrote, in a kind of funny way: could you please send help, because our lecture rooms in year two are not big enough to accommodate the new group of students. So it's a little bit of an anecdote, but what is nice about this story is that it doesn't need to be a fully fledged, all-bells-and-whistles approach; just having some predictive analytics in your system, with some very basic data, can already be really insightful. So thank you for sharing that. The next part I would like to talk about is learning design, because I think learning design is oftentimes the missing piece in using really powerful learning analytics, and the reason why I say this will hopefully become clear in a minute. A recent review of how learning design is being used in Europe, and of course Europe is not southern Africa, by Barbara Wasson and Paul Kirschner basically stated that the Open University was one of the few institutions able to conceptually link learning design, what teachers do in terms of their construction of course designs, with what students are actually doing based on what teachers are designing. The reason we think this is so useful is that if we can better map how teachers are designing courses, we can then start to see whether or not our students are following that, and if they're not, how we can make sure these things are better aligned. So, under a Creative Commons license, we have a so-called Open University Learning Design Initiative where we map teaching activities into
seven categories. Currently you're listening to me and watching, and this is what is called assimilative information. It could well be that you're also finding out more about this particular framework; perhaps you're at the same time googling what the OULDI framework is. What we were just doing a minute ago with Sarah or Abdulbaki or others is communication: talking with each other and talking in groups; that's another learning activity. You could do some productive activity: I could give you some learning analytics data and you could play around with that. Or you could experience what it's like to work in a dashboard, as we did with some interactive polling activities. Or I might, at the end of this workshop, do a test and see if you remember what we've been talking about; don't worry, I won't. By mapping these seven kinds of activities, what we are basically doing is getting a kind of brain scan of what teachers are expecting students to do, and by having that brain scan of what teachers think students should be doing, and linking it with what students are actually doing, you can see whether there is a potential mismatch in terms of engagement or not. What you see here, in a quite scary graph, is the Open University, or at least the Open University in 2016. There's a lot of so-called watching and listening activity: around 40 percent of our courses spend a lot of time giving students useful information about learning a particular discipline, with one course, course 53, having nearly 70 to 90 percent assimilative activities. Then there are certain courses that provide a lot of assessment, in particular courses 57 and 157, and you have all these courses in the middle. For example, for me as a practical learner, I would love to be in course 94, because 60 percent of that course is about doing certain activities. So maybe as a next exercise for
this polling question, it would be really nice if you could indicate, you have one choice, one dot, where you think your teaching sits, or where most of the teachers at your institution sit. Where do you think your teachers are spending most of their time in terms of teaching; which of these seven activities do you think represents your institution? It's anonymous, so don't worry about it; you can just post a point wherever you are. So somebody just posted: it's mostly assimilative. Yeah, okay, that's why it's mostly assimilative. All right, somebody said finding information, great, okay, and somebody else says productive. So while we're throwing darts at this virtual board, is anyone keen to talk about what a typical course at your university looks like, so we can get a little bit of flavor of the course design in your course? Again, there's no right and wrong; we're all amongst friends here. Okay, I will take a stab, because I'll talk about my experience. Yeah, thank you for coming in. So my experience has been with the very first one, the assimilative side of things, and finding information, with a little bit of productive, but not too much, because there were practice activities that you needed to do to submit your assignment, and that's it. And I'm talking about a postgraduate qualification that was supposed to be a hands-on qualification, but I found it, in the very first block, to have a lot of material posted and loaded, but not so much in terms of engagement. Yeah, and I find it personally really difficult, because I'm also much more in the middle space, but doing a workshop at a distance is very difficult to make engaging. I personally would love, I mean, if we would do this face to face, I would hope that there's lots of
interaction, like for example in Roger's seminar. But I can see that for most of the courses most of you ticked heavily assimilative, and there's nothing wrong with that; that would fit a lot of courses at the Open University as well. But there were also some really interesting patterns emerging there. So, knowing basically what your course might look like, what we're then doing is trying to map how the way teachers design courses at the Open University influences what our students are doing. We did this analysis in 2016 and replicated it in 2017, and we basically identified four different types of teachers. Of course this is a crude summary, but we found certain teachers who mostly designed lots of individual learning activities, which we call constructivist learning; there were some teachers who had strong assessment activities; there were some teachers, like Elizabeth mentioned, who proposed lots of productive, hands-on activities; and last but not least there were courses that were so-called social constructivist in nature, with lots of working-together activities. What we found was that there was a positive predictive value in how teachers design courses for what students were actually doing. If you design lots of individual learning activities, students tended to work slightly less over time, while if we put students into groups, it seemed that students were spending more time over time in the virtual learning environment. We found that this did not significantly predict whether or not students were happy about the course, or whether they were able to pass the course, but we found a really interesting tension: our students loved being taught in a way that was so-called constructivist, giving them lots of materials and lots of individual tasks to do, and they absolutely hated working together. One of the biggest predictive factors was whether or not
they had to work together with others, and this could be a reflection of the Open University context we're working in. But at the same time, the biggest predictor of whether or not students continued and actually completed the course was whether or not students were, forced by the teacher if you like, to work together. We initially analyzed this data at an aggregate level, and in follow-up work we did this also on a week-by-week basis, and we again found very similar effects, where how teachers design communication activities, and in particular also assessments, substantially influences student retention. Quite interestingly, and this is a visualization of one particular course, we found that how teachers design courses really fundamentally influences what students are doing. So I'm going to give you an example of a computer science course; there's no right and wrong, but what you see here is the workload per week, and the workload per week in our courses is around 10 hours, which you can see here. In week one of the course, students were expected to spend 10 hours on reading and watching about the course, then finding information for one and a half hours, working in groups for one hour, productive activities for a bit, et cetera, et cetera. The teacher had an activity that was basically a first computer programming task, which was marked: students had to read about the assignment, they had to experiment with code, and they were assessed, and in this particular case that amounted to, say, 35 and a half hours in week five. This teacher is a really smart guy, so he gave students the week before free to work on these assignments. So, any predictions? This is the brain scan of the teacher, what the teacher thinks is a good design. What do you think the students actually did? Any guesses? They had a free week. So let's have a look. The
beauty, of course, with all this analytics is that you can actually then see what is happening. Here you see the same course, the same mapped-out activities; you see when there are peaks and troughs, and you can see the virtual learning environment engagement per week indicated by a red line. So what does it mean; what can you see in this lovely visualization? Again, there's no right and wrong; it's just nice data. Yes, it correlates very strongly with the activities, and as indicated, even when the teacher said that in week four you're not supposed to do anything, there was a substantial amount of engagement by students; in follow-up analysis, most of the students worked ahead of time to make sure that once the assignments were in place they had everything ready to go. Any idea what's happening between week 11 and week 14? What could that be? Yes, students prepping for the assignments, but what are they prepping for three weeks? Yeah, a term break: the UK follows the northern calendar, so it's Christmas, and you can also see Easter in here. What is really interesting from our students' perspective is that we have quite a lot of engagement during the festive breaks, because that's when students are basically catching up on work. Of course this is just an example of one course, but if you do this over hundreds of courses, what we found, and this is quite startling, is that two-thirds of how students behave on a week-by-week basis is determined by us, by teachers. I'm going to repeat that, because oftentimes when I talk to my teachers at the Open University they're complaining about the students not working hard enough and not getting it, but our research continuously shows that how we as teachers design courses fundamentally influences whether or not students have a successful learning experience. So this is in a way really exciting, and it helps us to basically, you know, understand.
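As an aside, the week-by-week comparison behind that two-thirds figure can be sketched very simply. The numbers below are invented for illustration (they are not OU data), but they show the idea of correlating the planned workload from a learning design with the actual VLE engagement in the same weeks:

```python
import numpy as np

# Hypothetical numbers for illustration only -- not real OU data.
# Planned workload per week (hours) from the teacher's learning design "brain scan"
planned_hours = np.array([10, 11, 9, 4, 35.5, 10, 12, 8, 30, 10])
# Average hours students actually spent in the VLE in the same weeks
vle_hours = np.array([9, 10, 10, 12, 30, 9, 11, 9, 26, 10])

# Pearson correlation between design and behavior; values near 1 mean the
# engagement peaks and troughs closely follow the designed workload
r = np.corrcoef(planned_hours, vle_hours)[0, 1]
print(f"design-engagement correlation: {r:.2f}")
```

An R-squared of about 0.66, i.e. a correlation around 0.8, would correspond to the "two-thirds of week-by-week behavior determined by teachers" finding mentioned above.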
Again, no matter what all these data show you: if you were to see just the red line of VLE engagement, without the colored data about what teachers are expecting, you would see lots of peaks and troughs in your data, but you wouldn't make out, as you did very successfully, that the peaks and troughs are tremendously related to when assessments are taking place. Or a critical reader might say, okay, there's always a peak around a certain assessment, but why is there no peak in week 20? So perhaps in week 20 the assignment is too easy, or perhaps it is so well designed, or perhaps the assignments in weeks 4 and 5, weeks 14 and 15, or weeks 22 and 23 are too heavy, and that's why you see this massive peak. So, having seen this, do you think that an institution like yours would find it useful to also implement this kind of learning design on top of, or next to, learning analytics? So everyone believes it's true, or no one is able to link to the survey. Does anyone want to speak up about why you think it would be useful to implement this learning design? Hi Bart, how are you doing? I'm very well, how are you? Good, man. My name is Milika, I know Thamsa, from UKZN; I'm an instructional designer. So for me, anything that has to do with data is highly beneficial: data allows you to predict issues, allows you to see certain trends. The only thing for me would be the extra work being given to someone who doesn't love data, you know, just a normal lecturer of another subject. How do we engage them? For example, how little data do we present at first, and then allow a person to maybe dig deeper and further into more data? There is just that little bit of a hurdle, I think, for someone who just doesn't subscribe to the same bible. Yeah, thank you so much for that, Milika, and I
couldn't agree more. Of course I love data, and we probably all love data, because otherwise we wouldn't be here, but how do we convince people who are scared by this and don't see it as part of their day job? So thank you for sharing that; that's a big, big hurdle, I think. Stanford, you raised your hand. Thank you. I'm not so sure if this fits in, but my thinking is that it's both true and false. The reason being: I think if the learning is not well designed, it may not necessarily bring out the desired outcome, and it's true if the instrument is well designed, because then it will bring the desired learning outcome. That's my contribution, thank you. Yeah, thank you so much. For me, learning analytics and learning design is a marriage made in heaven, because no matter how good your learning analytics profile is, if you don't know the drivers behind it, what the teachers are doing, it's difficult to notice. But at the same time, we continuously use the learning analytics data to talk to teachers, to see whether or not they could potentially reconsider their design. For example, this particular teacher readjusted his workload allocation based on the data that we shared with him, and said, yeah, maybe it's not such a good idea to design it in this particular week. So he altered it, and the next year we had a look at whether the alteration of his learning design led to a positive influence on student retention. So you can basically always use your data to further improve your design. I think Abdulbaki had his hand raised. Yeah, so my comment is very close to the previous comments about continuous improvement, and I think that should be the ultimate goal. If you have that in mind but are not able to measure the metrics that would help you improve, then you definitely will not go anywhere, because you're not able to manage the current situation or what
is going into what you're building. So I think it's very essential to know those variables and those constructs that impact the development or the improvement of your course, so it's necessary, at the design level, to include the analytics element even before implementing the course on your learning management system. Yeah, thank you so much for sharing; couldn't agree more. Sarah? Yeah, I just wanted to say, because I just agree with everything you've been saying, especially with learning design and learning analytics. What I've actually been looking at recently is how, so obviously if you have a course that's not very engaging, a lot of reading, I call it consumption. Yeah, like that. You can't really tell much about engagement; you can't really get much analytics from it other than consumption analytics, how much time they're spending on it. So what I've been looking at is how, if you design a course based on different pedagogies, which are labeled, so consumption is the lowest, then it goes up to curation, which is sort of like assignments and researching, then conversation, which is talking and working in groups, and it goes on; based on what the activity is and how it's been designed, if a student interacts with that activity, there are factors you can see, purely based on the fact that they interacted with that activity, showing they did better in the course overall, and they were less likely to drop out. And each of those pedagogies became a different predictor. It was so interesting to see that, purely based on how the course was designed, students were just going through the steps; if they did certain activities, the ones that were more engaging, they did well. But most courses these days are very consumption based, a lot of just watching videos, so on the one side you can't really get as much information from that, but on the other side the students
could be doing even better if you design more activity and more engagement into your course. So on both sides it's so important how lecturers design their courses and how they are put online. Yeah, I couldn't agree more. We as teachers and we as instructional designers have such a fundamental impact on our learners' experience, and I think there's new, uncharted territory in us being able to link what we're designing with how our students are reacting, because we can then start to unpack what works for some and may not work for others. So thank you for sharing. I'm mindful of time, but let's take one more. Sorry, that's fine. Mine is just a comment, really, and I was doubtful about even contributing, because I'm responding to what I'm learning from this presentation. I come a little bit from a technical background, and concepts like learning design are less familiar to me. So I was just running some imagination here: often when you get dumped with a data source, if you are following an expert design you can learn about all these measures that people are using elsewhere, measures that can really help you to get the kind of insights that you need, but often when you go into the data, a lot of those measures are not possible, because some of the information is not there. I just couldn't help but wonder, when you talk about the marriage between learning analytics and learning design, to what extent is it possible for people to say: we're going to design our course in a way that is also intentional about how we want to measure success from it, so that, for all these things that are key to measuring success, you don't find out whether they are possible or not once the system is there and running, but you
actually become very intentional about designing your course in such a way that these measures are derivable. Thanks. Yeah, this is a fantastic link to this slide, because of course the downside of any learning analytics system is that you focus on things which are easily measurable. What you see in this particular latest development, where we work with a range of European universities, is a tool that helps teachers automatically map all these learning designs, and this particular course is about teaching entrepreneurial competences. So if you're very skeptical, you could say: how would you know that spending 62 and a half hours on acquisition, or, as I think Sarah mentioned, consumption, which I think is a great term, and 1,140 minutes on production, actually leads to entrepreneurial competences, if, for example, at the end of this course, and I don't know what the course assessment actually is, there is just a multiple-choice test? Shouldn't we, in this case, develop a learning analytics metric that measures whether or not students are really becoming entrepreneurial and whether they've developed this competence? So I guess what I'm trying to say, in an inarticulate way, and somebody mentioned this at a very interesting conference, is that we should measure what we value rather than value what we measure. It is difficult, and obviously we understand there are nuances, but once you put things in a dashboard, people will start to optimize based on whatever is depicted in those visualizations, so we have to be really careful about what we're interested in. What I also wanted to show, on this particular visual, a point which I think was also raised previously, is how you could use learning design in an online environment. I didn't bring an example with me, but when teachers are working through mapping their course design.
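To make the mapping idea concrete, here is a minimal sketch of one week of a course expressed in the seven OULDI-style activity types discussed earlier. The category names follow that framework, but the hours and the helper functions are hypothetical, invented purely for illustration:

```python
# Hypothetical mapping of one course week to the seven OULDI-style
# activity types; the hours are invented for illustration.
week_design = {
    "assimilative": 5.0,          # reading / watching / listening
    "finding_information": 1.5,
    "communication": 1.0,         # discussing with peers or tutors
    "productive": 1.5,            # building / coding / writing
    "experiential": 0.5,          # practicing in a real or simulated setting
    "interactive": 0.0,           # interactive / adaptive materials
    "assessment": 0.5,
}

def workload(design: dict) -> float:
    """Total designed hours for the week."""
    return sum(design.values())

def share(design: dict, activity: str) -> float:
    """Fraction of the week's designed time spent on one activity type."""
    return design[activity] / workload(design)

print(workload(week_design))                     # prints 10.0
print(round(share(week_design, "assimilative"), 2))  # prints 0.5
```

In a real system each week's mapping would come from the design tool rather than being typed in by hand; the point is only that once designs are expressed as data, they can be compared directly with engagement data.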
They could indicate how much of their time is online, how much is face-to-face, how much is synchronous or asynchronous, et cetera, et cetera, and whether a teacher is present or not. So there's a lot of work in progress, and the tool is freely available if you click on the link there, again in the slide share. Another thing I wanted to show, because I'm mindful of time, is how we know that a learning design is fit for purpose for our students. Most of the work I've shown you is across a large cohort of students, but in recent work by Saman Rizvi, she looked at students from across the globe and how they responded, in MOOCs, massive open online courses, to particular learning activities. She found that students from, let's say, Africa reacted very differently to students from Latin America or Asia in terms of how they engage with courses. We basically found in our research that when we're designing courses, we often design from a UK perspective, and that in a way advantages UK students, while students from different regions, like Latin America or Asia, might prefer different types of learning activities. Again, the great thing about learning analytics is that we can then start to see which types of designs were best for which types of students, and potentially in the near future we could start to offer slightly different versions to slightly different students. So, some further reflections, because I want to leave a couple of minutes for questions. Big questions to ask: who owns the data, is it you, the institution, the student, and what about the ethics of doing all this? What about the professional development of our teachers, as mentioned before? And a critique, which I think was really good, by Paul Kirschner, who reviewed our work: he was basically saying, well, the work at the Open University is great, but perhaps the Open
University is optimizing the record player; we keep fine-tuning and fine-tuning our course designs based on our learning analytics, but maybe, rather than optimizing within the box, we should throw away the box and start completely new, rather than playing old records on our old record player. So I hope that by showing you these two big examples I have shown that you need clear management support, that you need to work in a bottom-up way with your teachers, that hopefully by gathering evidence you can convince some critical friends and colleagues that what you're doing is right, and I hope you realize that these innovations take time: you need planning, and it's all about people. So I'm really looking forward to your questions, and if there are none, it was also great to talk to you. Thanks, and looking forward to hearing any comments or questions. Padru, you have a question? Yes, thank you. I have a question and also a comment. My question goes to the slide where you were showing the menu of your learning analytics tool, was it OU Analyse? Yeah. I was thinking it would be nice to see; I know most of the variables were kind of social influence, the lecturers being asked to use the tool, or maybe management asked them to use it, but I think it would be interesting to see other factors, like, say, gender, age, experience. Maybe you could look at the UTAUT model, the unified theory of acceptance and use of technology, which regards those other factors that would help us understand more, when it comes to the adoption of this kind of tool, what exactly is influencing the behavior and intention to use, or the actual use of, the system. Then one other comment is about culture. I think culture is a very big thing when it comes to change, especially
when it comes to change management. And I feel even teachers themselves, because from the office that I sit in we do staff capacity development, and what I've noticed is that even staff, lecturers themselves, fail to engage when they are in sessions like this, and yet they expect their own students to engage. So I think both teachers and students actually need to have that culture change about how they engage in virtual spaces, and maybe in their courses also. So I think it's both ways. This is a hypothesis; I haven't done any research on it, but it is what I have observed. Thanks. Yeah, thank you so much for your question, and also thank you very much for being so active in this. Of course I couldn't show everything, but one thing our predictive learning analytics models do indeed is look at demographic variables, and together with the students' union we had the first ethics policy of its kind in the world, in 2014, where we agreed together which variables we are allowed to use and which ones we are not allowed to use, and we revisit that every year together with the students' union. If you're interested in that, google "Open University ethics policy learning analytics" and you can freely use it. What was quite interesting is that, from a purely predictive learning analytics point of view, all these demographics are interesting until week one: once the first week of the course starts, the actual engagement of students in the course is much more predictive than what teachers might think. In our OU Analyse system, for example, teachers are able to, sorry, I'm showing you the wrong screen, teachers are able to sort based on all kinds of analytics, like for example socioeconomic income or what kind of accessibility needs students might have. I'm waiting for the system to load, but many of the factors I'm listing here, passing probability, gender, the region
they're from, their occupation, age, et cetera: many of these factors basically disappear once we have good data about students' actual engagement. So my gut feeling is, let's remove all of those, but our teachers find them really useful. And you're absolutely right in terms of culture. What our research continuously shows is that what may work in one faculty, even in the same institution, will not work in another, so how can we make sure that we can actually learn from each other, let alone learn across different cultures? All right, I hope that answered your question. Rolling, you want to come in next? Yes, okay, thank you, Prof. Firstly, let me just say that this was a very, very informative presentation; I certainly learned a lot from this, and I'm really amazed by the system. I'm sorry that I will always be almost biased towards the technical capabilities, but I'm very, very impressed, Prof. I haven't been in this forum for a very long time, but a couple of years ago I did visit Milton Keynes, in a previous role, and the learning model that you shared was presented to us, and I've never been the same since. A couple of years down the line, I have a better understanding of these things than I did before. My question is: when we start using this kind of sophistication to build models that can help us predict, how do you govern to ensure that these issues of bias are handled as adequately as they can be? I'm asking this question because often you find that, with a big deck like the one you have shown, the design is mostly expert-led, and in some cases it can be data-led, but probabilistic issues in the data sometimes lead to an almost entirely expert-led approach, and then my issue is that you have an individual who will make a
lot of decisions, and how do you govern that? What kind of structures do you put in place to ensure that the people who can review what those algorithms are doing can work with something that is not too far away from them, so they are able to contribute and weigh in on what the algorithms should be doing? The last point, Prof, is that, and I think you've communicated well that this is something that will need time, but I'm imagining, if you really wanted to start, what is the real effort? Can you do this with a team, a couple of people together within the institution? How much outside help did you get in putting a system like this together? I don't know how realistic one can be, but if you're really going to take on a project like this, what are the things you have to take into account? And I think you've already answered my very last point; it was with regards to adoption, and you made a comment about somebody saying they teach English and they are not here for the data. I've actually had responses similar to that, and it boggles me, because I'm on the support side, how a lecturer could not see the potential of how this data can help them. I asked this question a little bit earlier: they almost always defer or refer this to college administrators and so on. I don't know if you can maybe, at a high level, give ideas or thoughts, in your experience, on what has been the single biggest contributor to getting lecturers to adopt, the people we need to champion these things. Thanks. Yeah, thank you for that. I guess, whether you're starting big or small, the thing I've learned most is that you need a couple of people who are eager and willing to take risks, and if you have a
couple of teachers perhaps in and your computing department or perhaps in in the department that are comfortable with data um if you link them with people in your technical administration and perhaps some educational researchers build a really small agile team they can quite quickly make some big results we're not expecting you to immediately you know I mean all the systems that we've built are built by the Oak University and they're publicly available but how do you make the first step and that's that's that's difficult and I tend to prefer to work in small interdisciplinary teams um and by bringing those small teams together and pilot with one or two courses you can see what works and doesn't work and then gradually build it up over time um but this works again in the UK context and we spent three or four years together with UNICEF um and I appreciate it's very different over there um so um and I I'm not sure if somebody from UNICEF is here today but it's difficult and there's if it was easy it would have been done already years ago so um take it one step at a time yeah Karmenita you want to take over thank you we still have five minutes for any last comments or questions if not then we can hand over and I'll just say thank you very much to um Prof Rinkis for this wonderful and very interesting and very educational workshop it is was really eye-opening what can be achieved with learning analytics so if that is it thank you very much and I'll hand over to the essay area um Excal if not to Elizabeth thank you thank you Kamelita um unfortunately you're handing over back to me um my job is very easy I want to also release you um well in in advance like in in five minutes I want to get to that um first of all thank you thank you to everyone who registered and participated in this workshop without you guys we wouldn't have had a successful um workshop at some point we had about 84 participants in in the room and that is a biggest biggest milestone in terms of this 
workshop and of the learning analytics institute in SAA, and I hope the number will grow in future as we learn more about learning analytics, learner analytics, student analytics and student success, because all of us want to improve how our students succeed at the end of their qualification as well. I would also love to thank all our facilitators for today; they were phenomenal. A month ago, when I was looking for organisers and facilitators, I panicked a little because I didn't know most of them, but after today I think they will be receiving lots and lots of emails from all the South African colleagues who are here to learn more about this space, which is a very interesting one. I would like to thank the prof who opened with the note that there is a student behind the data; I think from all three workshops we had today we've learned that there are students, human beings, behind the data that we work with. I also want to thank Dr. Yishan Tsai, Roger Scalisa and Prof Bart Rinties for the workshops, for sharing your work with us and for imparting your knowledge to us in South Africa, so don't be surprised when your inboxes fill up because of the wonderful work that you did.

I also want to thank the SAE team. As a co-opted member of SAE in this forum, the institute, on behalf of SAE we would love to thank UWC for hosting us on day one, and we will see you tomorrow. I also want to acknowledge Karen; she should be sending you the feedback-form links that you need to complete for day one and day two. Please complete them; we value your input very much, so that we can improve from here on as well. I would also like to thank my team from UWC and everyone who facilitated the sessions today. Please come back tomorrow ready to continue this journey. Don't forget, we're starting at half past eight tomorrow; that is just registration, and the session proper will start at nine o'clock. I also want to appreciate our UWC choir for allowing us to use their content to entertain you throughout the session, because you would have got bored over the whole day if we didn't have them entertaining us and sharing their talent with us, so I send my appreciation to them as well. I will see you tomorrow. Have a lovely afternoon. Did I make it in five minutes? Yay, I made it. If there are any comments, queries or questions, you are more than welcome to share them. Thank you. If not, bye, enjoy; I'm giving you the rest of the afternoon free. Bye-bye, colleagues.