Good morning, colleagues. Am I audible? Thank you very much. I'd like to welcome you all to this first workshop of the year. Let me introduce myself: my name is Andrew Macaulah. I work for UWC, in the unit for quality assurance and information management within the Department of Institutional Planning. Our first session this morning is on the essentials of learning analytics. As we are going back to basics, it seemed befitting to look at the essentials, and we have a very capable presenter who is going to take us through the essentials of learning analytics: none other than Dr. Yi-Shan Tsai, all the way from Monash University. She is a lecturer in the Faculty of Information Technology and a member of the Centre for Learning Analytics at Monash, which they call CoLAM. She is also a member of the Digital Education research group at Monash University, an associate scholar of the Centre for Research in Digital Education, and of the Centre for Research in Education Inclusion and Diversity at the University of Edinburgh. You can see that she is really a formidable person to address us. The aims of this session are to introduce us to the key concepts of learning analytics, give examples of the tools, discuss current trends and challenges, and also look at certain policy considerations. I hope we are all going to learn from what Dr. Tsai shares with us this morning. We also encourage you, colleagues: if you have comments and questions, please place them in the chat box so that, if we have time at the end of the presentation, Dr. Tsai can respond to them. Without wasting any further time, I would request Dr. Tsai to take the stage. Over to you, Dr. Tsai.
Thank you very much. Thank you, Andrew. Hi, everybody. It is my great pleasure to be here today. Andrew has given quite a detailed introduction about what I do and where I am now. I have been working in the field of learning analytics for a number of years, and my own research focuses on investigating socio-technical issues related to the use of data and technologies in education, specifically learning analytics. I am also interested in how we can facilitate more effective feedback processes based on the use of student data. I am an educational researcher by training, not a computer scientist; this might tell you a little about learning analytics as a field, which is very interdisciplinary. I got my master's and PhD degrees from the University of Cambridge, then moved to the University of Edinburgh for my postdoctoral research, and then moved here, so I have moved around quite a lot, and you can probably tell from that that I have a mix of accents. I am originally from Taiwan; I don't know if there is any South African element in the way I speak, but my husband is from South Africa, brought up in Pretoria. So I am very excited about this opportunity to share with everyone here, whether you are researchers, faculty members, administrators, or students working with student data. Welcome to this workshop, and thank you for the opportunity.

As Andrew mentioned, what I am hoping to achieve in this workshop is to cover all the essentials of learning analytics; this is an introduction to learning analytics. In the next 90 minutes — probably less than 90 now; I think we intend to finish at 8 p.m. my time, which will be 12 p.m. your time, I guess — I will help you get a better understanding of what learning analytics is and what it can do. I will start with some key concepts of learning analytics, then give you some examples showing how learning analytics can be used to support student success and engagement, followed by a discussion of the key challenges we are seeing in the field. I will also talk a little about the policy aspects, and then move on to the final part, which looks forward: where is the field developing at the moment, and what will we need to be aware of? There will be a few polls I will invite you to participate in, and I am also hoping that after the presentation we will have some time left for some group discussion. I hope everybody can see my screen. Right, I can see the chat now. Great, thanks, Bart.

Okay, let me move on to the next slide. What is learning analytics? I want to invite you to participate in this poll. You can go to this website directly by typing it in, or if you have a digital device you can also scan this QR code. Let me see if I can... oh, I can't, sorry — this was a screenshot, so I can't actually paste it in the chat. Oh, thanks, that's great. I think I actually need to launch it, sorry. Okay, it should be available now. If you go to this webpage — you can see the link that Elizabeth just inserted in the chat — or scan the QR code, you should be able to see these two questions and participate. Let me know if you have any issues accessing the poll. Let me test it myself... it is still saying the poll is coming... thanks, we can now see it. Okay, good. I see that the responses have started to slow down, so I will give one final minute and then close this poll. Okay, let's view the results.

Here are the results. What is your experience with learning analytics? We have half of the participants knowing a little bit about it, some have even used it, and a small number of people have no idea what it is. That is fine. I hope that in this session we will be able to increase at least part B, that the people who answered A will be reduced to zero, and that we will encourage more people to select C in the future. As for the word cloud, "data" and "statistics" are really coming out as the big ideas: when it comes to learning analytics, the first things that come to people's minds are data and statistics. Then we are seeing other things as well: student data, mining, classroom behaviour, knowledge, engagement, and "that activity is risky" — okay, that's an interesting one. We will actually touch upon pretty much everything that people have put here, and we will return to this question: is learning analytics mainly about data and statistics, and what else do we need to know about it? Thank you very much for participating in this poll.

Let me move on. The most commonly adopted definition of learning analytics is that it is the measurement, collection, analysis, and reporting of data about learners and their contexts, for the purposes of understanding and optimising learning and the environments in which it occurs. Even before the pandemic started, we were really seeing a shift towards digital learning, with a lot of learning happening in the online space — for example, the use of online learning management systems to facilitate learning, and the growing popularity of MOOCs, massive open online courses. So a lot of data is being generated through student interactions with these online spaces, and a lot of that data has not been used adequately. A lot of data has just been collected and collected and collected, because the systems allow the collection, and institutions may have routines for collecting certain data. This was already a common practice before the field of learning analytics came about, but people started to realise that we can actually do more with the data, and that we should make better use of it to understand what students are doing, what it can tell us about learners and learning patterns, and to help us understand learning in general.

So the idea of learning analytics is to measure what is happening in the learning space, collect the data, analyse it, and then generate reports that can be used to inform better decisions. For example, if the report is delivered to teachers and administrators, it may inform decisions related to instructional design or student support. If it is delivered to students, we hope that this kind of report — feedback about what students are doing — will raise their awareness of their own engagement with learning, their own progress, and even their likelihood of success, so as to perhaps trigger some changes in their learning behaviour, ideally also leading to some development at the cognitive level. That is a very simple view of what learning analytics is.

A learning analytics cycle includes four basic elements: learners, data, metrics, and interventions. We have learners who interact with learning materials, platforms, and others in the learning environment, and all of that contributes to certain data — for example, log data: students visit the learning management system, view videos, click on certain materials, or participate in online forums, interacting with educators or with peers. All these kinds of interactions leave digital traces. We can then collect the data, which can be further processed into metrics that can be used to measure learning and understand certain learning patterns, and which result in interventions that will hopefully have a positive impact on learners. That goes on as an iterative
cycle. Essentially, learning analytics is about informing decision-making — whether that means informing decisions related to learning or teaching, or, at the management level, decisions related to resource allocation in a learning environment, for example. Learning analytics is about how we can move from data to action: given the analytics results, what can we do for learners, and what can learners do to help themselves move towards desired goals? It is not just about telling us what learners are doing; it is about presenting a picture that can help us move towards desirable actions that lead to positive change. There is a short video I would like to recommend, which you can easily find on YouTube by typing "learning analytics in a nutshell". It is a three-minute video introducing some key concepts of learning analytics; in fact, if you don't have time to stay to the end of this workshop, I would recommend this video to you.

Okay, so Moodle. These are screenshots I took from my own course, which is built on Moodle, a learning management system that we use at Monash. I believe many of you here today may have different learning management systems in your own institutions, and nowadays learning management systems usually come with some basic statistics. For example, these screenshots show very basic views: how many views a particular student has had of various materials in this learning space over a one-month period, and how many posts they have generated in overall activity. We can see that there are no posts generated by the student, which is not surprising to me, because we actually had a separate online forum for students, apart from Moodle, so that data is not pulled in. What is important here is that, as an instructor, you have the knowledge of your own learning design, so you need to interpret the analytics in light of what you know; you don't just take what is shown to you at face value. Equally important, for institutions adopting learning analytics, is not to use learning analytics to pass judgment this way, because every learning context is unique. We need to consider the differences between learning environments and learning designs, and we need to be aware that educators have the best knowledge of what is happening in a course.

Here I can also see a very simple bar chart of this particular student's engagement with the online learning management system on different days, and I can see that there is a peak here — again not surprising to me, because this is around the deadline of our first assignment, so I was expecting to see it. Even though it is very basic and very simple, it can give us some ideas, especially when you have a big class of students. In my class I have about 140 students — Monash is a big university and this class size is considered small; we have quite a lot of classes with 400 or 500 students — so it can become very difficult to provide proper support to students. When you start to notice students who seem to be disengaged for a period of time that is perhaps unreasonably long and does not coincide with any holiday, that can be alarming. Or if you have students who require special support, or students who seem to be struggling a little based on their academic performance, or who seem a little disengaged based on their attendance, then this kind of basic statistic can still give you a general idea of what students are doing.

The purpose of showing you this example is to let you know that learning analytics is not something far out of reach; in fact, many of you are probably already experiencing learning analytics now. Of course, there are more examples I would like to show you later on. This is just another simple example of student engagement with the forum I mentioned earlier, which is separate from Moodle. It shows you the number of people participating in the forum, the total number of views, and how many threads, answers, and comments have been generated. It gives you a general idea of the traffic in the forum — the engagement level.

We can say that learning analytics is a feedback process involving two types of agents. In the traditional feedback scenario, we have learners as agent B and instructors as agent A. If peer feedback is used, then we also have peers as agent A, and in some scenarios you have experts as agent A — for example, in experiential learning or placement scenarios, you may have some experts involved as agent A, providing feedback. In learning analytics, we have algorithms participating as agent A as well: we make use of algorithms to help us make sense of the outputs from students, and even to help us provide feedback based on them.

What does a feedback process look like? It starts with distinct goals, tasks, and measurements for a particular course: you may communicate specific learning objectives and specific tasks to your students, along with how you are going to measure their performance — for example, you may have rubrics decided for your course. Learners, upon receiving this information, trigger an internal self-regulated process: they use their existing knowledge, beliefs, and attitudes to learning to interpret what is presented to them, and based on this they may set certain learning goals and then strategies and tactics to achieve those goals. These then lead to the process of their learning — for example, they may start working on an assignment or preparing for an exam — and this whole process continues as a cycle of self-regulation: learners continue to regulate the process, examining and reflecting on their own learning. Externally, this process produces evidence that we can then assess. The evidence may be an assignment — an artifact that the student produces — or it could be certain observable learning behaviour. For example, we have talked about log data that we can collect in online spaces; log data mainly shows us students' behavioural engagement with those platforms — how many times they viewed videos, or how many posts they generated in the forum. There is various evidence we can collect as outputs of students' learning. With multimodal learning analytics, for example, we can also collect information about students' locations in the classroom, if the activity involves moving around, and physiological information, to see whether students are under particular stress when certain activities happen — I will give you some examples later. This kind of evidence can then be measured, compared, and analysed, and even predictions can be made based on it, which generates information that can be delivered either to learners (agent B) or to agent A. If it is used by students, it continues to facilitate the internal regulation process; if it is used by agent A, it can be used to inform the next tasks, or to adjust the goals of the course or the way students are measured. So this is the overall feedback process that learning analytics can facilitate. Instead of seeing learning analytics as a tool or an artifact, we see it as a process — a process that involves
data collection, measurement, and reporting of student data.

Okay, so we can see that there are two types of feedback loops here. If it is teacher-facing learning analytics, then we rely on teachers to make sense of the feedback delivered through learning analytics: teachers interact with the learning analytics algorithms directly, they need the relevant literacies to interpret the data presented to them, and they then use it to provide feedback to learners. In this loop, learners interact directly with the teacher, and whatever learners produce continues to feed back through learning analytics. If it is learner-facing learning analytics, then the cycle is the one over here: learning analytics and learners interact directly, so learners are in charge of interpreting the learning analytics — they are the ones who have to make sense of what they are seeing in, say, learning analytics dashboards. In either loop, learners and teachers play different roles, and the kinds of literacies they need, and the expectations of them, may vary.

Okay, let me show you some examples of learning analytics tools. In terms of tools that facilitate feedback for teachers, the first example I want to show you is called Loop. It gives you, again, a general idea of the traffic of student engagement based on log data: what kinds of materials students are accessing over a period of time. You can also view this on a weekly basis, to see at which points students interact with the learning management system more frequently than on other days — close to lecture time, for example, students may interact with it more. You can also see whether students exhibit catch-up behaviour, indicated in red — students who only access certain materials after the given week — or whether they access them in the same week, or even before. So this can give you some idea about student access to materials, which is something instructors are very interested in.

Userflow is another example, developed by a research fellow at Monash University. It allows teachers to see where students come from. We have two campuses — well, actually more than two — and here it shows students at the Clayton campus and the Malaysia campus, so we can see students' grades, their majors, which campus they come from, and what their second major is. Another example here shows that it is also possible to collect certain data in order to understand how students move from one activity to another. Here — 1.1, for example — within the same module in the same week you may have multiple activities, so how do students move from one activity to another, into the next week's activities, and back? The red colour shows students going back to a previous activity, whereas the grey shows forward movement. This gives educators some idea of the access patterns to learning materials, potentially helping them to see relationships between different learning activities, and how students may make more use of certain learning materials to support their work on other learning materials — you can see those connections. And here, the same tool offers a view of student access using a heat map, showing which activities attract the most traffic — for example, 2.2 attracts a lot of access, and so does 4.1. The tool also has an annotation function that allows students to annotate the materials they are accessing: whether they find something particularly important or confusing, whether they would like to get some help, or even to insert a comment themselves. We can see, for example, that material 2.2 seems to be particularly confusing to students. As an educator, this gives me some idea that maybe my students are struggling a lot with the concepts I am covering in this particular step, so perhaps I should think about providing additional support, or simply approach students directly — speak to them during my lecture to find out what is bothering them about this particular learning material and the concepts covered.

Okay, and here is another example: ZoomSense, which is used for breakout-room sessions in Zoom. As many of you are probably familiar with Zoom now, using Zoom to facilitate breakout sessions can be quite effective, especially when we are not able to meet face to face. The one downside is that it is quite difficult to know what is happening in those breakout sessions. It is not like a face-to-face environment, where we can see what is happening just by scanning the classroom; when breakout sessions are running, you may be able to jump between rooms, but you have limited time — you can stay maybe five minutes in each room — and once you enter one room, you don't know what is happening in the others. ZoomSense addresses this issue. It collects data about how students interact with each other during the breakout sessions and gives educators a dashboard showing whether and how students are interacting in each of the groups. For example, in question 4 we can see that group one shows "all inactive", and indeed there is no connection between any of these students, whereas group four is "all active" and we can see clear connections between all the students — each node represents one student. It also allows us to track student progress in a Google Doc, because Google Docs are quite a popular tool that we have used a lot in our teaching, especially to facilitate group discussion — a space for students to work together. It helps us track whether nothing is happening in a group (shown in grey), whether the group has completed section one (shown in blue), or whether it is still in progress. We can also see if groups are not discussing much, and how much time I have spent visiting each group. So this gives educators some idea of what is happening.

This example, Course Signals, is a very early example of learning analytics. Its goal was to address retention issues at Purdue University. It makes use of a predictive algorithm relying on student performance data and effort — effort being behavioural data based on log data — plus academic history and student characteristics. Based on all of these, the tool predicts a student's likelihood of success or failure, which is then indicated to teachers using a traffic-light system. Students themselves also see a traffic light, but the direct predictive result is delivered to educators. Okay, so that is another example, Course Signals.

And this example, Civitas, is actually from the University of Edinburgh, where I worked before moving to Monash. Civitas Learning is an external service provider, and we were interested in knowing what we could find out from our 65 master's programmes, so they helped us analyse our data and make predictions about students' likelihood of continuing in their programmes. Overall, we can see that the continuation prediction accuracy is 72 per cent — not too bad, though it could be better. They also showed us some powerful predictors: for example, the average number of days enrolled before starting the programme, the modules registered for the next term, the term or season when students enrolled, credits attempted, and others. All of these can be useful
predictors to help us predict students likelihood of of continuing this program so you may also say that these predictors can also potentially be useful um providing some sort of um diagnosis um information in terms of the the best uh points to to intervene or or what we can do what we can look for look into to understand potentially the the action we can take to uh support students and and uh and uh prevent uh the students from from failing or um or struggling with the learning okay so those are just some examples about uh learning analytics tools that can facilitate feedback for teachers and obviously there are many more but I'm just showing you some of them and so let's have a look at some tools that can be used to facilitate feedback for students okay um so this is one example that I briefly mentioned earlier about uh multimodal learning analytics that you can uh collect um multimodal data that happens in uh the in the learning context in a physical context so this this um this is this is exemplary space in a nursing environment teaching environment where students they um they have a few simulation activities so they have to they have to interact with uh fake patients so the based on their movement around the classroom their interactions with each other with the patient and the action they take uh we we we were able to provide this dashboard showing them what they did what they have performed what they have missed and so this kind of information can be um can can be can be useful evidence for not only students to reflect on on their own um learning progress what they did but it can also be useful information for educators to draw on when they discuss with students about what they did and how they can improve further similarly this example last see it is used to facilitate student counseling so this tool was developed in KU Luven in Belgium and they um they so they they this is based on their first the data space is from their first year student questionnaire of 
learning and study skills that they have been collecting for years and when the tool was designed the idea was to visualize the the result and to to be able to provide some clear evidence for for for study counselors to be able to provide better support and advice to students so this is just one particular aspect of of this survey which focuses on time management and it shows students which group they're forcing so like this one says it's the student is uh forcing this average group and the majority of students fall in this average group in terms of their time management and it shows that in previous years if students who um whose whose overall um study efficiency is higher than 80 percent then they are shown in green dots and the middle group is showing yellow dots and the bottom group is showing red dots and based on the the the previous year's data we can see that students who um who are in this green dot group um they um a lot of them are able to achieve very good performers um whereas the yellow group middle group um varies a little bit but we are seeing more of them um um attending weaker results like not so good and similarly for the orange groups so then students even though this does not really predict what will happen to them nevertheless give them some information to reflect on oh this is what happened in the previous year so if I um I mean this group then maybe it's likely that I will only attend average grades so if I want to improve then maybe I need to work on my own uh study my own learning skills so that's the intention of this um learning analytics tool and um feedback this this uh another example called repo um it's um it's it's to um help students create learning resources so uh sort of to cross source learning resources students themselves they can share their study notes they can create quizzes and you can share this with peers and then peers can vote on that and then um they can all sort of make use of it they can answer the quizzes 
participate in the um in various activities that their peers generated and then their system will um will be able to generate this result showing them about their proficiency in various topics and based on that it would also make further recommendations for them to improve their proficiency in various subjects okay and this tool synergy it is to facilitate peer feedback um so in this example they um there they are five areas that students are asked to evaluate and we are seeing that um we have two um two uh assessors here and they have some big disagreements on these two aspects so the tool then can facilitate these two students to have a conversation uh sort of maybe to resolve any disagreement they may have um so yeah so synergy is a tool that can facilitate this process and it of course also collects certain um data that can then um also generate some analytics results and communicate that to teachers so teachers will be able to see um whether students have completed this peer assessment uh evaluation task and whether they are collaborating on this or not so in fact many of these tools they they may have both students and teachers facing um dashboards to to or tools um aspects to facilitate different activities um a car writer is uh writing analytics it can um analyze uh writing piece that students upload to the system um it does not tell them about the grammatical errors it's it's more focusing on the rhetorical moves um based on specific types of writing so for example we can see here there are various symbols um representing what um a car writer can pick up from the writing students uploads for example initial thoughts and feedback about significant um experience so we can see that for example here starting over the semester I've learned to overcome personal barriers put up by myself in a new work environment blah blah blah so this um introduces initial thoughts and and challenge of of a new surprise surprising or unfamiliar ideas um and they also give 
students warning about sentences maybe too long maybe you you will want to consider breaking it down so it gives students some feedback based on that analysis about how what they have done well um what they can see from from the writing and students themselves can then read about this and based on the understanding of what their teacher wants them to do with a task then they can know okay have I achieved what I'm supposed to do um or is there any other things that um a car writer hasn't picked up and um can can I see that myself in my writing or have I missed it so you can see that there is a nice note here from a car writer which says that computers uh don't understand writing like humans so a car writer may highlight rhetorically good sentences that actually make no sense or leave unhighlighted and or leave an unhighlighted sentence that you feel is actually really good and it's fine to disagree with the feedback but um it's also your job to check your facts so this is very important that to to um recognize that algorithms are not perfect they can be helpful but you are the one the users whether students or teachers interacting with their own analytics you should um be the one to make the best judgment so instead of just taking everything in you should be critical about what you are seeing bringing in your own knowledge of what happened in the learning or teaching process okay untask is another example or feedback for students that can be facilitated using learning analytics but this one is actually a teacher-facing tool so based on the data especially uh that we can collect through learning management systems and also potentially other students other systems like student information systems that you can incorporate it into the tool um it pulls different sources of data into that spreadsheet and educators can also manually upload more data into that spreadsheet that table and then based on the uh data um untask uses if this then that a very simple rule to allow 
one piece of feedback to be written once and then delivered to a large group of students. Feedback can be personalized this way: instead of writing 100 or 200 different pieces of feedback, one for each student, you can write a single email like this one. You can set up various rules: if a student didn't watch the first video, or the second, or the third, they get a different kind of feedback. So I put all the rules in and write the feedback I would like to send according to each rule: if a student didn't watch video one, they get this feedback; if they didn't watch video two, they get this one; if they didn't watch video three, they get that one. Then I just press save and send, and based on each student's actual interactions with the videos, they receive different feedback.

Overall, you can see that learning analytics tools can generate analytics that fall into four areas: descriptive, diagnostic, predictive, and prescriptive. Most of them are descriptive; as you may have noticed, they mainly tell us what happened to students, what they did, or what happened in the past. But some of them do apply more advanced analytics techniques. For example, that early example, Purdue's Course Signals with its traffic light, makes use of predictive analytics. The other example I showed you, from the University of Edinburgh using Civitas Learning's services, shows you some powerful predictors that can potentially be used to diagnose what happened, why we are seeing certain phenomena and what their causes might be, which in turn gives us some ideas about how, where, and when to intervene. And prescriptive analytics focuses more on providing personalized recommendations to users. For example, RiPPLE, the tool I mentioned that lets students create and exchange learning materials, can give them suggestions about relevant materials they can study further to enhance their skills, a little like Netflix, if you are familiar with it, which may give you recommendations based on your watch history and even score how well each recommendation matches your profile.

So those are some basics about learning analytics. I want to encourage you to participate in the poll again, this time the last question, which asks what type of learning analytics is most useful to you. I have started it, so you should be able to see the question if you go back to the same site that I shared with you. Okay, let's see the result. We're seeing a lot of interest in predictive analytics, and more responses are still coming in; we have almost equal interest in descriptive analytics and diagnostic analytics, and fewer responses around prescriptive analytics. Maybe we can open the floor up later for some discussion about why people are showing more interest in one type of analytics than in others, and whether there is any particular concern people may have. Thank you for participating in this poll.

I'm going to move on to the next part of my talk. I'm aware that I've actually used most of my time introducing those key concepts, but I hope that was useful in giving you some idea of what learning analytics is and what it can potentially achieve, and that learning analytics is not something far removed from you; it is actually something you may already have been using without knowing. But it is also important for us to think about some of the challenges associated with learning analytics. Even though learning analytics is showing promising results, and a lot of exciting tools are becoming available, we're seeing that large-scale adoption is still pretty low. My colleague Bart Rienties will share with you his experience in successfully
adopting learning analytics at very large scale, so that's really exciting, with lots of precious lessons to learn from, but we are not seeing this in many institutions; large-scale adoption remains pretty low. It also remains an issue in the field that a lot of activity around learning analytics places more focus on the analytics than on the learning. If you recall the word clouds generated at the beginning of this session, when it comes to learning analytics most people think of data and statistics, and that is indeed where most of the attention is drawn. But we need to come back to learning: learning analytics is about learning; it is supposed to support learning and benefit learners.

So let's look at some of the challenges in this area. In this paper, the authors argue that learning analytics is an interdisciplinary field, and that it is not enough for us to focus simply on data science, even though data science techniques are very important. Without data science it would be almost impossible to carry out the key activities of learning analytics: collecting, measuring, analyzing, and reporting data about learners. So it is important, but it is not enough. We also need to pay attention to theories, for example learning theories. Without theories it is difficult for us to assign any meaning to the data and to the patterns we are seeing, and difficult to interpret what is happening and what students are doing. You need theories in order to ask the right questions, to generate the right hypotheses to test, and to be able to interpret the associations you observe between digital traces and learning constructs, for example intrinsic motivation or self-regulated learning. Without theories it can be very difficult for us to know whether certain learning processes are activated or not, whether what we are seeing is actually meaningful or important, and how learning outcomes are associated with different learning conditions. And of course design is another aspect we need to pay attention to: not just tool design, but also learning design, the learning context, and the study design. We need to pay attention to all of these areas; don't forget that learning analytics is about learning.

In the article "Pigeon pecks and mouse clicks", the authors argue that it is more important to know how students interact with information than how much they interact with information. And in the article "Embracing imperfection in learning analytics", the authors challenge the field's obsession with computational accuracy. They argue that there is a need to focus on learning design, to identify the right thing to measure, and to define impact by examining the extent to which analytics-based feedback has led to learning gain. It's not that seeking computational accuracy is unimportant; it is important, but we need to make sure we are asking the right questions, and we need to have the right priorities: first of all, what we are using learning analytics for is not just data; it is enhancing learning, and we need to get that right first.

Following on from that, I want to highlight some of the aspects that are really important to consider if we want to ensure effective learning analytics. Learning context matters: when we design a tool that is meant to be applied in one specific course context, we should expect that it may not necessarily work as well in other learning contexts. Similarly, different educational sectors are very different contexts: K-12 and higher education are very different, and
different countries have very different educational systems. In our own study carried out at Monash, we found this too: when we were trying to understand the university's educational quality evaluation process and what people needed, we found a lot of disagreement about the indicators of educational quality that the university had assigned. People felt that the data the central university provided did not actually answer the questions they wanted answered; instead, they felt those questions were being imposed on them, and that the data given to them did not give a faithful picture of what was actually happening in their own environment. Mukong, yes, I see your comment in the chat; I'm happy to share the slides, and all the papers I mention are referenced at the end of each slide.

We are also noticing a strong connection between this challenge around indicators of educational quality and distrust. Many of our colleagues disagree about how student engagement is defined, and much of the data that has been used to measure student engagement tends to focus on behavior, yet in different learning contexts different learning activities are happening. So our colleagues feel they are being judged unfairly, and that leads to distrust in the way learning analytics is being used in the university.

That leads to the next topic, which is about people. In the whole learning analytics cycle it's not just about the data; it's about the people who will be impacted and the people who are going to use learning analytics. This is also related to the ethics and privacy issues that have attracted a lot of attention around the use of data, and the concerns in this area have really surfaced the need to focus on people. The article "Privacy and analytics: it's a DELICATE issue" offers a nice checklist to ensure that institutions adopting learning analytics have thought thoroughly, throughout the whole process, about the implications of data collection and the potential impact on individuals, to make sure that student data is used in a responsible manner. Another paper that my colleagues and I published, called "More than figures on your laptop", discusses some prominent issues related to trust: what may make people distrust learning analytics. We identified three areas of issues. The first is that these numbers are never objective; they are always subjective, because so many people are involved in the whole process of decision making, from choosing what data to collect and what algorithms to use, down to the downstream decisions about what action to take to intervene. The second area of issues relates to the fear of power diminution. This is very important: we need to be aware that there are many stakeholders in an educational system. We have educators, students, and institutional leaders, and there is an interesting power relationship in play. Teachers do not want learning analytics to be used to judge their performance, and students do not want learning analytics to be used to judge their performance or engagement unfairly either; nobody wants decisions to be made about them unfairly. That is power diminution. Of course, we can go further into the power relationships between service providers, higher education institutions, and the primary stakeholders, students and teachers, and you can even talk about the relationship between algorithms and humans. The third area of issues that can cause trust problems concerns the approaches taken to the design and implementation of learning analytics, and I think we have talked a lot about
that. Beyond that, we also need to be aware that socio-cultural systems matter as well. The overall educational system is complex, there are a lot of tensions in play, and we really need leadership to be able to address those tensions. We also need to be aware that regions with different cultures and systems are very different; for example, in the Latin American context, we are seeing that the lack of reliable information systems and of policies to regulate the use of data is currently the biggest challenge impeding the adoption of learning analytics. All of this is leading to the emergence of human-centered learning analytics.

I think I will skip this slide because I'm aware that time is running out, but I wanted to introduce this framework to you: the SHEILA framework. It was developed on the basis of a very large-scale consultation involving 89 institutions across 26 European countries, and based on those consultations we developed a framework with six dimensions: mapping the political context; identifying key stakeholders; identifying desired behavioral changes, so focusing on the changes you want to see rather than just on what you want to do; developing an engagement strategy based on that; analyzing internal capacity to effect change; and, last but not least, establishing a monitoring and learning framework to promote continuous improvement. This framework is meant to facilitate the development of institutional strategy and policy, to make sure that learning analytics can be used effectively and responsibly. For each dimension we provide a comprehensive list of action points, some challenges people need to prepare themselves for, and some important questions to answer when developing policies for learning analytics.

I was planning to give some time to a group discussion, but considering that I'm running out of time I might skip that part and instead open up the room to everyone to ask questions. I do encourage you to visit this tool: you can drag the statements around, and even alter them, and it really is a toolkit that helps you think about all the important dimensions and aspects when you are considering adopting learning analytics in a more sustainable manner, and potentially scaling it up to the whole institution. For that you need a systematic approach: you need to think about the strategy you will adopt, and potentially also have a local policy to govern it.

This is an example of some learning analytics principles and purposes that we developed using this framework when I was at the University of Edinburgh, and we also have a detailed policy to go with it. You can see that, from the university's perspective, we needed to recognize that learning analytics has certain limitations. I've included this one especially to show you why having a local policy can be very useful, because again there is no one-size-fits-all, whether we're talking about a tool or a policy. At the University of Edinburgh, after our consultation with various schools, it came out very clearly that our teaching staff were really against the possibility of learning analytics being used to track their teaching activities and potentially being used as a performance-judgment tool. So it was very important that we included, in the principles and purposes, reassurance to our teaching staff that what we are really trying to do with learning analytics is enhance learning and support learners. This may vary across institutions because of different cultures and different political contexts.

Briefly looking forward, I want to highlight that in the field we are now seeing deeper reflections on topics related to equity, diversity, and
inclusion. For example, the paper "Subversive learning analytics" is a very nice piece: it challenges the assumptions and the social order that have been baked into existing practice, and some of the bias issues we need to be aware of, and it calls on us to refocus on values, human values and educational values. Another development worth attention is AI-powered learning analytics. With the growing integration of AI into our lives, we need to start thinking about the politics this will bring: how politics will shape the way data, algorithms, and machine intelligence are being, or could be, used in education; what AI-powered learning analytics will look like; and whether it will disrupt pedagogies or actually empower more diverse pedagogies. And finally, literacy, an area I am particularly interested in: how we can equip users of learning analytics with the skills required to maximize the value that can be created. Feedback literacy here may encompass being able to appreciate learning-analytics-based feedback; being able to make sense of it, translating those computational representations into an account of what people, themselves or others, were actually doing; taking action; and managing the effects and negotiating the power relationships in the whole feedback loop. There are also related literacies, digital literacy and data and AI literacy, that we should pay attention to.

Finally, I will leave these questions with you; I think they are very important questions to ask when it comes to supporting student engagement and success with learning analytics. I will end my session here. I'm sorry that I didn't manage to leave time for a group discussion, but I'm happy to open the floor to everybody to ask questions, share your own experience with learning analytics in your institutions, or raise any of these questions or any concerns you may have. Thank you.

Thank you very much, Dr. Tsai. Colleagues, we've got about seven minutes for questions and comments. I saw one comment on the chat; I don't know whether, Angelou, you would want to speak on the point you make there, I think it's important, and then we'll take a few hands.

Thank you very much, and thank you for a very interesting talk; you've opened my eyes to quite a number of issues that have to be considered. What we found when we were trying to implement learning analytics at our institution, and we're a distance institution, but one that also has practical classes and some forms of blended learning, was that we couldn't develop one model that fit all courses. We actually had to involve instructors in deciding which metrics matter, and I think that was the biggest lesson for us coming out of the process.

Thanks for sharing. We've seen that this is quite a common issue, and perhaps a common realization as well. One thing we have learned from our own local work is that it's important to offer some flexibility in tool design, so that instructors can, to some extent, tailor the tool for their own use. OnTask is a very good example: it allows educators to upload additional data that doesn't exist in the existing data streams but is nevertheless important to their own learning design. We know it's impossible to make all learning designs the same, and that wouldn't be right either, so you do need to offer this kind of flexibility, so that teachers feel they retain their autonomy; they are not just being handed a tool that tells them "you need to teach this way", or "you are not teaching well", or "students are not learning well because our data is not showing it". You need to give them the flexibility to make their own decisions and to actually make better use of learning analytics, so that's
very important; it's also about teachers' sense of autonomy. And beyond that, we don't need to restrict ourselves to just one type of learning analytics; we should be aware that there are many to select from, so that should be opened up as well.

Yes, thank you. I've got two hands; I'll take the first and then the second. Thank you, Dr. Macaulah. My first question is around the issue of interventions. I must say I appreciate the power of collecting and analyzing data, but without interventions that is not useful at all. In our situation, for example, you have one section doing the collection, analysis, and presentation of data, and then you have other sections responsible for the interventions. I just wanted to hear your thoughts on how universities should arrange the structural design of the whole system around student support in order to be effective, because sometimes you have the data but somebody else, somewhere else, is not acting on it. What are your thoughts on the structural design of student support systems within universities? Thank you.

Thank you for the question; that's a very important one. I wasn't able to spend much time discussing strategic approaches and policy development around learning analytics, but one thing we have found very useful, when an institution starts to approach learning analytics, is to establish a committee of representatives from a wide range of stakeholders. You should have faculty representatives; you should also have representatives from, for example, student support services, academic services, and even data protection offices. You need this wide range of people, including student representatives, to make sure that you are hearing the voices of different stakeholders. They may have various interests, and they may also face various challenges, so how do we bring people together to reach consensus and come up with a strategy that can really address what we need? We need that systematic approach; I think that's the way to go. And I think what you were describing there is the silo issue we all see in institutions: not just data in silos, but people in silos.

Thank you for that. If you allow, ladies and gentlemen, we can add five minutes and break at five past twelve, and then we'll have a ten-minute break; I just want to accommodate this last hand which has been raised. You can go ahead.

Thank you very much, program director, and thanks a lot to the presenter, Dr. Tsai, for such a very interesting session. I had a lot of questions, but for the sake of time I'll reduce them to one. There are a lot of important lessons to draw from your presentation, especially when it comes to the issue of involvement. A couple of years ago, in this very forum, there was a presentation on analytics and ethics, and when I look at your presentation today, there are a lot of these important things you've shared with us which actually help us to think and understand that it is not just about the numbers. So my question, Dr.
Tsai, is this: I come from the view that learning analytics should be about improving student success, and if I look at the second part of your presentation, where you asked the question about which type of analytics is most useful, you will see there was a huge uptake of predictive analytics. Yet many of the colleagues I've engaged with always talk about the lack of uptake of predictive analytics, especially among faculty or lecturers, and given the nature of learning analytics, with its aim of improving student success, you cannot avoid using predictive analytics. How have you found a balance, creating enough interest from lecturers or faculty for them to champion these kinds of insights? And is it your finding that there is genuine interest from lecturers or faculty, or do you have a situation, as I've observed in our environment, where lecturers tend to end up using those insights more for research than for the purpose of driving student success? Thank you.

Thank you; that's a brilliant question, and also a complicated one, so in three minutes I'll try my best to answer it. I think part of the reason predictive analytics is not adopted as widely as, say, descriptive analytics is that there are a number of issues. The first is accuracy: predictive accuracy is something people will question. But there is also the question of whether prediction is really going to be useful from a learning point of view. From a management point of view it is great, because we can have some foresight about what may happen and then try to take action to prevent or address it. But from a learning perspective, some teachers, and I have heard this myself, with my own ears, question whether this will actually take away students' opportunity to learn from mistakes. If you predict, say, "oh, you're going to fail, so change your behavior now", then on the one hand it seems great: we are actually helping students; we don't want them to fail; we don't want it to be too late for them; we want them to make changes before it's too late. But on the other hand, we cannot deny that students do learn from mistakes and from failures, so there is a conflict here. And if you ask students, you will also hear different views. Some students will appreciate it; they will think, "yes, this helped me", and the more personalized the support is, the better for them; no student that I've heard from objects when the teaching support is more personalized, more geared towards their individual needs. But you will also hear polarized views from students: some worry that this will make them more anxious, especially if they continue to get negative feedback, "you're going to fail, you're going to fail, you're failing every course"; they are going to be very anxious about that.

So I think the best approach, and this was definitely the one taken when I was at the University of Edinburgh, is to make certain resources available and to start with policies, so that people can feel safe about what the university is going to do, and so that whoever uses learning analytics can feel safe, based on the policy framework, that it is not going to be used merely for personal interests. There should be guidance, and resources that institutions make available for educators to use, especially for the more contentious forms of analytics. It varies in each context and each institution, and I would say that having that consultation is very important, to learn about the appetite of your staff,
how much risk they are willing to take, and to know what kinds of analytics may be more accessible to them. And if you are thinking about having multiple kinds of learning analytics tools, some of them at larger scale and some of them simply made available so that teachers can choose a suitable one for their own use, I think that kind of flexibility is always appreciated by teaching staff. Okay, thank you.

Thank you very much. I'm sure we all have quite a lot of questions to ask, but unfortunately time is not on our side. Let's thank Dr. Tsai for the presentation made for us, quite enlightening, and as she shares the slides there are a lot of resources on offer to deepen our knowledge in this area.

Yes, I just wanted to say that I'll share the slides, and I also wanted to mention that the tool I mentioned earlier is free and openly accessible, so I think it could be a very useful resource for everyone here who is thinking about promoting institutional-scale adoption of learning analytics; if you are starting to think about strategy and policy, I would encourage you to access this tool. Thank you.

Okay, thank you very much. Let's have a short break until quarter past; sorry for taking your time. Let's meet at quarter past for another session. Thank you.

UWC is situated right at the tip of Africa, in one of the world's most beautiful cities, Cape Town. Known as the Mother City, Cape Town is the oldest city in South Africa, and we've also got the dopest mountain on the planet, Table Mountain, officially a New Wonder of the World, right here in the vicinity of UWC. We've got one of the most interesting histories of all the universities in South Africa too. True story: UWC was originally established during apartheid as a college for colored people only. The first students, enrolled in 1960, were offered limited training based not on their aptitude or potential, but simply on the color of their skin. As a heavyweight in the struggle against apartheid, UWC was at the forefront of South Africa's historic liberation, and it still is: if there's real change spilling out on the streets of Cape Town, you can bet UWC will be there too, bringing its unique brand of hope and a depth of knowledge that translates into real, positive action. Maybe that's why we attract so many artists, activists, poets, world-changers, and thought leaders. Right from the start, UWC fought the status quo, giving students the highest level of education possible. That's why, only a decade after these doors first opened, the institution was granted full university status and was finally able to grant nationally and internationally recognized degrees and diplomas.