Thank you. Rogers Kaliisa is a doctoral research fellow in learning analytics at the Department of Education, University of Oslo, Norway. His doctoral research focuses on using learning analytics, and in particular learning analytics dashboards, as a tool to support teachers in making data-informed learning design decisions in blended learning environments. He is also interested in mobile learning, virtual reality, and multimodal learning analytics. His research utilizes network approaches, e.g. social and epistemic network analysis, and automated discourse analysis to make sense of students' data generated from online learning environments and how it relates to teachers' intended pedagogical objectives. He currently serves on the Executive Committee of the Society for Learning Analytics Research (SoLAR), elected as a student member, and co-leads a SoLAR special interest group. At the end of the session we are hoping to have a couple of minutes for questions and answers. Please do introduce yourselves at that time, and if you have any questions during the session, please add them to the chat so we can follow up on them during our Q&A. The recording and the presentation will be shared at the end of today's session. So may I please welcome Rogers Kaliisa. We really look forward to wonderful learnings, as we've experienced all through the day today. Thank you.

All right. Thank you so much, Bradley. Before I start talking, I just wanted to confirm which slides you are seeing, since I have two screens — I have to make sure you have the right view. Do you have the full screen, or the presenter view with the thumbnails? We see the full screen, Rogers. Thank you. Okay, that's great. Thank you so much. All right. So thank you so much for the introduction, Bradley.
And just as Bradley said, I'll be discussing leveraging learning analytics dashboards to support teaching and learning. First of all, I thank the organizers for the opportunity to talk about this topic. I look forward to sharing my experience and perspectives, and to having a fruitful discussion with all the participants about learning analytics dashboards. I'll start with a brief introduction of myself, beyond what Bradley said. I'm currently completing a PhD in learning analytics at the University of Oslo, and I have submitted my thesis, which is entitled "Designing Learning Analytics Tools for Teachers, with Teachers". Before that, I completed a Master of Education at the University of Adelaide in Australia, then a master's in development studies in Norway, and my bachelor's in Adult and Community Education at Makerere University. And as Bradley said, I'm currently the student member representative on the Society for Learning Analytics Research (SoLAR) Executive Committee.

As you can see from my background — sorry, sometimes I glance at the chat, so bear with me — my background is in the education area, from the pedagogical perspective. I've been researching learning analytics from the learning sciences perspective, not so much from the technical perspective. But what I will discuss involves decisions regarding technical aspects, and I'll take you through how I've been managing and working with these throughout my doctoral project. My presentation is going to be focused on my work in the doctoral project, and in particular on trying to understand what learning analytics dashboards are. As you saw in the previous presentation, I won't be diving much into examples, because Dr. Yishan did a lot of that work.
My work will be more about: if we are interested in designing learning analytics dashboards, what's the process, and how do we go about it? What are some of the things we need to consider? What kinds of steps do we take as stakeholders at different levels? In this meeting we have experts from different perspectives, participating in different roles in different academic institutions — as teachers, advisors, or researchers — so I think we'll be touching on those different elements, and some of the questions that came from the previous session will possibly be discussed in this session. How do we design dashboards that are aligned with the needs of the stakeholders, who could be teachers or students? My examples will be mainly from the teacher perspective, because that was the focus of my PhD, but we're also going to see that the experiences and lessons I'll talk about are applicable and transferable to other contexts we can think about. Maybe one thing I didn't mention is that I'm originally from Uganda; I moved to Norway for my studies.

So we'll talk about what dashboards are — which we already know — and then we have an activity. We plan to have two activities if time allows, and I think we should do that so we have more interaction as well. Then I'll briefly talk about challenges in learning analytics dashboard implementation, try to demonstrate an example of a learning analytics dashboard, and then we'll have time for discussion and questions with all the participants. The objective is to have at least a general overview of what learning analytics dashboards are and how they are developed from a participatory perspective.
One of the participants highlighted the issue that we need to involve teachers and students, and I'm going to demonstrate here how I tried to work from a participatory approach to develop learning analytics dashboards for teachers in authentic practice. Just as Yishan talked about, dashboards — or learning analytics dashboards — are, in general, visual displays that summarize and visualize information for teachers based on students' learning patterns and interactions. This is a fairly standard definition, and it relates to what Yishan has been talking about. A dashboard is just like the dashboards we have for cars, for different metrics, or even on our phones, giving you a summary of what's actually happening. Take a car dashboard, which shows your speedometer and fuel consumption and so on: it's a summary. You don't know what's happening inside the car, but the dashboard summarizes this information and displays it in a way that is easy and simple for us to understand. So dashboards usually combine automated analysis techniques and interactive visualizations to help us interpret and understand a phenomenon — and in the case of learning analytics dashboards, that phenomenon is learning: displaying this kind of information about different learners. On the right is an example figure from the Open University's OU Analyse dashboard, which shows different metrics about what students are doing and which students have submitted their assignments. This example will, I think, be discussed in more detail in the next presentation, where Professor Bart will be talking about implementing such dashboards at scale from an Open University perspective. So there are a couple of resources.
I won't dig much into this, because there is a lot of literature. If you want to dive into dashboards, I picked out in particular some reviews of learning analytics dashboards. There is one by Jivet and colleagues, "License to Evaluate: Preparing Learning Analytics Dashboards for Educational Practice". There is one on student-facing learning analytics dashboards. And there is one on the pitfalls of learning analytics dashboards — what are some of the issues we need to be aware of? So there is a lot of literature out there, and of course I will share this presentation, so you can follow up the different literature in more detail later.

So, examples of metrics that we can capture in learning analytics dashboards, just like the examples we saw before: sessions versus total users — how many users are using a particular kind of resource; completion rate — this could be very important, especially for institutions worried about their retention rates. You could be tracking how many students are likely to fail and how many are likely to complete the programme — we have been talking about predictive analytics, trying to understand the likelihood that someone will drop out and how to intervene. Then there are user statistics — who is using which resources; device type — what devices are being used; session times; quiz results — which questions are not easy for students; drop-off; user navigation. There is a whole range of things we can capture and look at in these kinds of dashboards.
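To make these metrics concrete, here is a minimal sketch — my own illustration, not from the talk — of how a few of them could be computed from a hypothetical clickstream export. The column names (`user_id`, `session_id`, `event`, `device`) and the sample records are invented for illustration:

```python
from collections import Counter

# Hypothetical clickstream records, as an LMS might export them.
events = [
    {"user_id": "s1", "session_id": "a", "event": "view",     "device": "mobile"},
    {"user_id": "s1", "session_id": "b", "event": "complete", "device": "desktop"},
    {"user_id": "s2", "session_id": "c", "event": "view",     "device": "desktop"},
    {"user_id": "s3", "session_id": "d", "event": "view",     "device": "mobile"},
    {"user_id": "s3", "session_id": "e", "event": "complete", "device": "mobile"},
]

def dashboard_metrics(events):
    """Aggregate raw events into the kinds of summary metrics a dashboard shows."""
    total_users = len({e["user_id"] for e in events})
    total_sessions = len({e["session_id"] for e in events})
    completers = {e["user_id"] for e in events if e["event"] == "complete"}
    return {
        "sessions_vs_users": (total_sessions, total_users),
        "completion_rate": len(completers) / total_users,
        "device_breakdown": Counter(e["device"] for e in events),
    }

print(dashboard_metrics(events))
```

The point of the sketch is that each dashboard metric is just an aggregation over the same raw event log; which aggregations to compute is exactly the stakeholder decision discussed next.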
But the most important thing is that, in the end, it is the intended users — the stakeholders, the target audience for whatever dashboard we are going to develop — who will determine what kinds of metrics and what data we are actually going to collect.

So at this juncture, I want us to reflect on our own practice, because by the end of the workshop — whether you are a teacher, a researcher, an institutional manager, or whatever role you are playing in your institution — you may want to develop a dashboard. So I want us to go through an activity where we reflect on how we plan and design dashboards. We are going to be in different groups with different roles, but if you can identify something common among you — maybe you are advisors, or researchers, or teachers — I want you to discuss what kind of pedagogical issue or challenge you face, because we are talking about learning analytics: we are trying to use learning analytics to help us with particular pedagogical challenges. Then, based on that challenge and taking a learning analytics perspective, discuss what kind of data is available about students that could support us in dealing with such a problem. For example, you could say: in my managerial role, we have a problem with retention — the dropout rate is very high.
So, if we are to use learning analytics as a tool to help us deal with such a problem, what kind of information is available that we can actually utilize? And then, based on that information, what's the best way to present it, and to whom? If students are dropping out, which stakeholders do you want to target? Do you want to give this information back to the teachers? To the students? To the advisors? This highlights the level of intervention — how do we intervene? Because learning analytics collected without intervention is not useful. We have to act, to respond, to take action based on the analytics we collect. So I want us to have a quick discussion about these aspects: how do we identify a problem, how do we decide what kind of data we can use, and what ways do we use to design or present this kind of information? I hope this discussion will help us with the next process we're going to go through.

Yes, I'm going to share the link shortly. Bradley, I'm not sure I'm able to create rooms from my side — actually, I think I'm unable to do that, so maybe you can help me create seven rooms. You can just assign them automatically; we'll split up and be back in 15 minutes. The Google Doc has group one, group two, group three, and so on, and Zoom will create the automated rooms. So create seven rooms, and wherever you are sent, please work with that group, and then we are back in 15 minutes. All right. Thank you. That's 12:45.
So I've opened all the rooms and you should be able to join. Please go ahead and join those rooms, and we should be back here at 12:45. If anyone is having a problem, please do let me know. The rooms should be open for everybody now. I think we still have a few people here. Yeah, possibly — maybe some people might be a little bit busy with something else. That should be fine; the pop-up is available for everybody, so if they come back they can join at some point. Excellent. Thank you. Maybe you can stay in this room in case someone pops up; I'll join one of the rooms too. All right, I'll do that. Thank you.

Sorry guys, I'm one of those who are still here — I just need to attend to something else quickly first. All right, noted. Thank you for that. Bradley, I'm actually just here for SAA social media, so I won't be joining the breakout rooms. Noted. All right, thank you very much. Rogers, I'll be closing the breakout rooms now. Yes, that's fine, I think that's okay. Thank you. So everyone has been given 60 seconds to leave. Yep. Great.

Okay, so welcome back — I think most of us are now back from the breakout rooms. Without wasting much time, what we'll do is have a very quick reflection from each group on what you talked about. Just give a highlight: anything in particular you discussed in relation to learning analytics, or a challenge in your area, what kind of data you'd use, and how you'd deal with it. It could be something general that you talked about in line with dashboards, or about using data to deal with some pedagogical challenges. So I will start with group one. Abdul, do you have a question? Thank you. Okay. Yeah.
Sorry, I wanted to ask you, Mr. Kaliisa: did you want us to capture our responses in the Google Doc? I couldn't edit it, but I just made notes on my side — is that fine? Yes. I had wanted people to present through the Google Doc, but not so many people recorded there. It would be great to keep those notes, because we can share them at the end. For now we can just give a very quick update, but I would really encourage you to put your responses there so that we can share this as a resource at the end of the discussion. So, anyone from group one to give a quick highlight?

I can come in; I think my colleagues were struggling with connectivity. Very briefly — we didn't have a lot of time — we identified, from a managerially driven point of view, student support and student success, especially taking into account that we are operating in an increasingly digital environment where there's reduced contact time between students and lecturers, and ensuring that students are adequately supported and that student success is monitored and improved. That was the key element we highlighted for the first section. Then, on the data that is available about students that could support dealing with student support and student success, our view was that this data would have to come from the virtual learning environment — from systems like learning management systems, proctoring systems for online examinations, other learning support systems, libraries, etc. Those would be the data points that we thought would be quite key in supporting the first issue we highlighted.
And of course, for section B, on the kind of information we would be hoping to get out: understanding things like student participation — is a student engaging in learning activities? What is the rate of their participation? What are their grades, especially looking at it from the earliest possible intervention point, such as the grading of learning activities on the LMS, or even looking at how much time they are spending on the LMS, etc. In terms of the last question, we did not actually cover that particular question, but naturally we would think of a web platform that can capture this information visually on an interactive dashboard or page, an interactive report, or any other data visualization tool, and make that tool accessible and publishable to all who would need the information. Colleagues from group one, I think I've captured our discussion points, but anyone can contribute if I missed anything. Thanks.

That's very, very helpful. What you highlight about the LMS — you mentioned it's usually the main source of data for dashboards — was very interesting. In the next discussion we'll see, if we want to capture participation and this kind of information, what kind of dashboard, and in what kind of format, can present this information back to either the teacher or the student. This was really interesting. Thank you so much. Group two, anyone? I think Kastun — yes. Thanks for confirming, Bradley.
We were just talking about how dashboards are very helpful for overviews, but unless people are actually trained in reading and interpreting those dashboards, or even have an appreciation for the usefulness of data, they're not going to be effective or utilized. In the analogy that Rogers offered of driving a car: unless a person understands what a speedometer is, they're not going to be able to engage with anything that's in front of them. So that was something our group highlighted as something we need to address in our institutions.

Great, and thanks, Kastun, for that. That also drives us towards some of the discussions we're going to be having. It doesn't really help to spend all our time talking about what dashboards are — the thing is, once we have them in place, are they interpretable? Are they going to be used? And how do we move from just designing them towards actually making them useful and usable in practice? I will share some experiences from my own empirical research, and I think Prof. Bart also has insights from implementing learning analytics at scale at the Open University. How do we ensure that we design dashboards that are actually usable and meaningful to the users? That takes us back to issues of theory and connecting theory to analytics: is it just about providing beautiful visualizations, or what meaning does it actually provide to the user? So that's something we're going to highlight. Thank you so much. Anything from group three? Any insights on what you talked about?

Thank you very much, Rogers. The specific problem that we looked at was a high dropout rate for first-year undergraduate students.
In terms of the data which is available, we have quantitative data, which in most cases is historic — in the form of the academic performance of the students — but we also have qualitative data from surveys on student background, looking at issues of mental health, student funding, and psychosocial issues. What we've identified here is the challenge that most of this information tends to be historic, and we want real-time data that will enable us to intervene immediately. So the issue we highlighted was the integration of both qualitative and quantitative data, but also of historic and real-time data, so that we are able to do so. For the real-time data, we identified the learning management system as the appropriate platform, looking at the attendance of students and their level of engagement. Once we are able to do that, we should be able to move into the last phase of intervening, and dashboards are one tool that can assist us, in terms of looking at individual students against their peers and how they are doing. There is also the issue of the dashboard being simplified — I think that is very important, because if it gets too complicated and there's a lot of information, we tend to lose the audience. My colleagues can also add things.

That's very well summarized, and you highlight very important aspects of the data that is available — and this is data that we usually have: quantitative data, historic academic performance, demographic data about a student, and then data from surveys, about funding, for example.
You also highlight something very important about real-time data. This is a key point in analytics: we usually talk about analytics "on the fly", right? You want to make interventions on the fly — you don't want to wait until a student has dropped out to intervene. So it's about how we balance the information we already have in our systems with real-time data, and how we provide real-time feedback. I'll share some insights about how I worked with a real-time dashboard and how we can actually provide real-time statistics. Some insights from group four? Anyone?

Hi, I will speak on behalf of group four. Just to summarize what we put together: looking at the challenges, we highlighted that getting the students to engage with the online environment is the first challenge, because if they do not engage with the online environment — which could be the learning management system — then you wouldn't have data to analyze. One way we talked about improving on that concerns the design of your course in the learning management system, where you put restrictions on specific activities to ensure that students must engage with an activity before, for example, they can get their results or move on to the next phase of their learning process. That's one way of ensuring that students engage with the learning management system; then you would have enough data to analyze and make decisions.
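The "on the fly" intervention idea above can be sketched in a few lines — my own illustration, not from the talk. The rule below combines a historic indicator (prior grade average) with a real-time one (days since the last LMS event) to flag students for follow-up; the thresholds and parameter names are invented, not evidence-based cut-offs:

```python
from datetime import date

def at_risk(prior_grade_avg, last_active, today, grade_floor=50.0, max_idle_days=7):
    """Flag a student when a weak historic record coincides with recent inactivity.

    prior_grade_avg: historic indicator (e.g. average grade in percent).
    last_active:     date of the student's most recent LMS event.
    Thresholds are illustrative only and would need validating in context.
    """
    idle_days = (today - last_active).days
    return prior_grade_avg < grade_floor and idle_days > max_idle_days

today = date(2022, 5, 20)
print(at_risk(45.0, date(2022, 5, 1), today))  # weak grades + 19 idle days -> flagged
print(at_risk(72.0, date(2022, 5, 1), today))  # idle, but strong historic record -> not flagged
```

The design point mirrors group three's observation: neither data source alone is enough — the historic record gives context, while the real-time signal gives the timing for the intervention.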
We talked about technical ability and skills: even if the development is done by the tech people from the IT department, if you do not understand how the tool has been developed and how it works, you might not even want to believe in the data. You need to understand what has been captured in the database and how it has been analyzed in order to believe in the data and use it to make informed decisions. Then we talked about strategic intelligence, which is understanding the variables that affect the students. One of our colleagues indicated that most analytics is focused on first-year students, and the other students are left out, which affects them going forward: if you have used analytics to improve their learning process in first year, are you going to use the same variables and the same analytics for continuing students, or different ones? Then there is acting on the results of the analysis — creating those interventions. How do you use the information and insights gained from the analysis to improve? Some people might have the data but not be able to act on it. Translating the data is another issue: you could have different types of data — maybe discussion data in forums, attendance — so how do you translate these into insights, to understand exactly what is impacting what, or to look at the correlations behind why something has happened? That also speaks to technical ability.
Then there are the tools that we spoke about: learning management tools, and some also mentioned the graduation survey and course evaluations. These are also part of the instruments that help you collect specific data about students' engagement in their learning process. We've been doing polls here as well — ensuring that students complete different polls — and there are different engagement tools online now, like Mentimeter and co, where you engage students while teaching them and they are able to send feedback about their learning process at that time. So that's all I have. Thanks.

Thank you so much. That was a very good reflection. You highlight issues like data interoperability, which is very important in analytics: how do we integrate different kinds of data sources, and how do we make sure we present this data in a form that is actually useful, especially to teachers or students? And the issue of technical ability and skills — this is something I'm going to keep coming back to. I think the best way to address it, as I'll highlight briefly with the human-centered approach to learning analytics, is to start right away by working with the stakeholders — the people who will use the tools — and defining these things together: what makes sense, and what doesn't help, in the different scenarios. That's something very key to consider during the design process, but we'll talk about these aspects more as we proceed with the discussion. So, group five — anyone to highlight just briefly, so that we're not caught out by time? I'm happy to talk for group five.
Some of the issues that we identified: one was looking at what the real, accurate predictors are for student success, for each unique and specific context. That was the main aspect. Then also looking at how effective our student-at-risk programs are: what data do we need in order to interpret and understand that, and how can we use it to make improvements to those programs? And then looking at predictors for early warning: when is early enough? When should we identify at-risk students, and when should we make decisions in order to advise them to take alternative paths or to engage with additional support? That raises ethical issues, as well as logistical and pedagogical issues that we need to consider.

Great. You bring in a new theme about ethics and privacy, and this is something we are always talking about. Yes, we want to ensure academic success and we want to intervene, but how best do we do this while considering privacy and ethical aspects? This also goes to the design — I'll highlight this from my experience: usually we design tools, but maybe we don't think much about the privacy and ethics perspectives. What do we do? Is this about the policies? Is this about the institution itself? How do we design dashboards that consider privacy aspects for both the teachers and the students? So I think that's something very helpful. And then you mention the logistical and pedagogical challenges — these are all very important aspects to consider and be aware of. Group six, any insights?

In our group, we talked about the lack of collaboration between support and academics. For example, the support staff may have determined new approaches to some of the problems that are being faced.
So there might be technology support available, but there's a lack of engagement with it. There might be some new method available that's not taken up, and that extends also to analytics: even if we develop the learning analytics and the dashboards and so on, how much of that will be applied in practice and used to really gain insights that make a difference? On the engagement between support and academics, there's also the perception that academics are overworked, and there are some academics who are not so tech-savvy and basically don't want to engage with the new methods. Another problem we discussed was the role of plagiarism during these times of remote teaching. Even though we might be exiting remote teaching now, some of the lessons learned could be valuable going forward, but the plagiarism issues have been especially hard to deal with in our country, because of the difficulties in data access and the disruptions that make it hard for proctoring software to be applied.

All right, that's very helpful. You talked about collaboration between support and academics, and I think this takes us back to stakeholder involvement. We may talk about many things, but a general point we all agree on — and I think Prof. Bart will highlight this even more from his experience — is: yes, we can develop the dashboards and present all about dashboards, but if we want them implemented in practice, how do we make sure that there is a synergy between the users and the developers, between the leaders and the people on the ground?
Because usually, when we talk about learning analytics, we know students are one of the key subjects of learning analytics — they provide the information — but the teachers also make use of this information to help the students or to intervene with them. How best do we involve these two groups and make sure that it's not just a tool that is brought to them with the message "the tool is here, just adopt it"? Taking that approach might not actually be helpful. How do we make sure that there is a synergy and that we are developing a tool that's going to be owned by all the relevant stakeholders? That's an important aspect, and I see it's something cutting across all the groups, which is very encouraging. Group seven, any reflections from your side? Anyone from group seven? Bradley, how many groups did we have? We had seven groups. Anyone from group seven to share their experiences?

We did not have the chance to collaborate with my colleagues, and I did not have access to the Google Doc to present my thoughts on this issue, but what came to my mind, from the manager perspective, is student access. Students may not have the bus fare to come to campus, or they may not have the technological infrastructure to access the material that is uploaded on the LMS. And even if they do have access to the materials and the tools, evaluating the quality of time spent on the LMS is a challenge to me. For example, we can see that students have downloaded the material, we can see that students spent so much time on the LMS, but did they really engage with the content? That's one of the challenges that I think probably needs some interrogation. And also — I don't know how to put this in dashboard tech language — in physical contact classes, you can actually interrogate the body language of a student. You can see: this is a yes, this is a no.
This is confidence, and this is not. Silence doesn't necessarily mean there's no learning, but in online sessions this might actually be problematic. I think we also have plenty of data. Most institutions may have plenty of data: it can come from baseline assessment outcomes, from demographics, from the number of students applying for scholarships. We can also have data from the finance department. I think we can get a lot of insights from there, but the challenge is how do we utilize that data? Thank you. Thanks so much, Sanford, for that reflection. It still brings out the issue that we have a lot of learning management systems and a lot of data. Of course, we can get some basic analytics from these systems telling us downloads and who is accessing what. But the thing is, how relevant is this? And as you said, the key issue is: what about the content? What about the discourse? How do these students actually engage with the material that is being provided to them? That's very key. That's a key analytic that could be very useful, especially for the teachers, also to assess whether the curriculum or the design is actually well aligned. You have been talking about issues of learning design, and I think Prof. Butt will be talking much more about learning design and analytics, how these two actually connect, and how best we can capture this kind of information. I want to thank all the groups. This was very helpful. One thing I have learned from our discussion is that there is a lot going on and there is a lot of knowledge about learning analytics in the group. And the things we are worrying about, especially about what kind of data: how do we make this useful? How do we make the tools actually accessible? How do we even develop tools that can be taken up by teachers?
These are issues that are shared even within institutions that are mature in learning analytics adoption. That means we are almost all at the same stage of asking: how best do we actually adopt learning analytics? How do we move the analytics into classroom environments? And how do we design tools that are going to be taken up by teachers? In one of my PhD articles, I looked at how to overcome challenges to the adoption of learning analytics at the practitioner level, and this somehow aligns with the issues we have been talking about. Yes, we have a lot of learning analytics, but how do we ensure that it is actually adopted in practice? In the literature, just as we have been discussing, we have challenges like integrating technical and pedagogical expertise. How do we design tools that are pedagogically relevant but also technologically robust? To do that, you find that if you have programmers at the university, you also need to bring in the pedagogical expertise. Because if you only work with the technical people, they're going to develop tools that are technologically robust but pedagogically less relevant. And this takes us to the issue of actually working with the different stakeholders to ensure that we develop tools that are actually relevant in practice. And then there's the connection between learning analytics and theory. This was also highlighted by Yishan as one of the challenges in the field. How do we make sure that the analytics we're developing are actually connected to theory? How do we make sure that the analytics produced by the tools or dashboards are actually highlighting the things we need to see? Just like in our discussions, I saw many people talking about engagement and participation. How do you design a tool that actually captures those elements?
How do we make sure that we develop proper measures? I think one of the groups, Group 5, talked about what the real indicators and markers of academic success are. But who defines this? We can connect to theory. We can look at self-regulated learning, we can look at constructivist learning, we can look at all these different learning theories. What do they say about successful learning? What do they say about successful student engagement or participation? If we start from there, it's possible for us to design a tool that is actually pedagogically relevant. And if it's relevant, then it's easier for the teachers to adopt it in practice. Even for students, maybe it will be easier for them to take up this kind of tool, because it aligns with their practice, it aligns with theory, it aligns with their needs. And it's not just theory; it could also be working with the stakeholders, taking a human-centered approach, as I'm going to highlight, to make sure that what we develop comes from the needs of the users, users like teachers and students. Most existing dashboards don't align with teachers' practice because they are developed by tech people who don't consider the needs of the teachers much, which means they may not be taken up in practice. We talked about ethical and privacy issues. Sometimes it's about this: okay, you design this tool, but how do you present this information in a way that is ethically acceptable? There are so many aspects here, but some of these things can be resolved by engaging in discussions, of course with the institutional managers, but also with the students, about what kind of data can be used by teachers. For example, if the intention is to support students' learning, do you need consent in that case?
And I think the, the, the framework by Ishan, the Shayla network, I think they also highlight about things like privacy. When, if you want to start analytics projects from the ground, how do you actually devolve some things? Someone talked about workload, one of the groups, like sometimes the, we don't want to, to present tools that are going to, to, to add workload to the teachers. They are already overwhelmed by so many tasks, but how do we develop tools that are actually going to be taken up without, without considering or without feeling that okay, we are so much overwhelmed and we are taking up a lot of more tools that are actually adding a lot on my, on our challenges and workload. So based on these, on these, on these challenges, also another thing is, is that most of the dashboards today that we have are located outside the learning management systems. For example, looking at your own institutions, of course, everybody has a perspective, but if a dashboard is out there, but it's located outside the LMAs or the learning management system, if you want to capture this data, data about students, it's not very easy to, to actually get this information for the users. So that means there is need for tools that can actually be possibly plugged into the same system like a learning management system, maybe a model or canvas, whatever you're using. That makes it easier for the teacher, for example, to access this data or even the students, because most of the systems, they provide some information, but it's not that robust enough. And that's, that's why we have to develop plugins or that, that provide information that is, that is customized to the needs of those particular users. And then we have, of course, I talked about the best practice examples that involve stakeholders in the domain of analytics are limited. And then, and that's why there is a, a growing interest in human-centered learning analytics. We develop tools that are centered on the human. 
And when we talk about that, it means that the human must be involved. These are not new things, but this is something in the learning analytics community that is actually developing, to make sure that we build tools based on the needs of the stakeholders. In that case, there is a possibility that the uptake issues we've been talking about, whether dashboards will be used and taken up, may be solved, because the dashboard, the tool itself, comes from their own needs. Then you work with the technical people to actually develop the algorithms and the kinds of features that the users are interested in, features that actually touch on their pedagogical challenges and problems. So I had an activity planned, but given the time we have, I'll push on a little for now so that we don't run out of time; if we get time, we may come back to it. The activity had different dashboards, and I wanted us to look at each dashboard to see the pros and cons. The idea behind this activity was to look at, just as we were saying, different dashboards that present information differently. When you look at them, the visualization, the way the information is provided: is it easy for you to interpret? Do you find it relevant? Even if we don't get time, this is a slide you can revisit later. If you are developing a dashboard, what makes a good dashboard versus a bad one? Are you able to interpret the visualizations presented on the dashboard? That's an exercise I expected us to engage in, but because of the time I'll push on, and if we have time we can revisit this.
So I'm going to take you through an example from my PhD, based on what we have in the literature about the results of designing dashboards without a human-centered approach. In this example, I borrowed from human-computer interaction, design-based research, and human-centered learning analytics to develop a dashboard where I worked with the teachers. The dashboard is meant for teachers, and what I did in my doctoral project was to start with the users. I followed what was more of a design-based research approach. Just as you see in this cycle, I started from problem identification, like the exercise you engaged in: what's the problem? That's the first thing. After identifying the problem, we moved to prototyping, which is very important; we don't deploy systems without testing them. We provided prototypes, then we moved to high-fidelity prototyping, then we had pilot studies, and then classroom use. Those were the stages we went through. At all stages I was working with the teachers, the technical people, and of course administrators, to know: what kind of things do they actually want to see? What kind of information do they value as teachers, and what kinds of pedagogical challenges do they have that can be informed or answered by the kind of analytics that is available? In stage one, problem identification, I'll go a little bit fast here; that paper is already published. I engaged teachers at two Norwegian universities through qualitative interviews to ask them: what kind of things do you want to see? What kind of challenges do you have? And just like some of you mentioned, most of the teachers were talking about issues like capturing participation and engagement during online discussions.
So in this case, I zeroed in on how we can support teachers to actually capture student engagement in online discussions. The analytics needs we identified here included social analytics, discourse analytics, and feedback analytics. From this stage of problem identification, we moved on with this problem and the needs identified by the teachers to start developing possible solutions. One of the things we did was share paper prototypes of some visualizations based on online discussions. I presented data on student interactions as social networks, who is talking to whom and who is not interacting with whom, and then visualizations of what kinds of concepts students actually used in the discussion forums. We presented this information on paper, asking teachers: could this be useful? Could this be helpful for you? Based on this feedback, and I'll go a little faster here, but all this information is available in the papers which I'm going to share, the teachers said they needed simplified visualizations. One of the visualizations you see here was an automated discourse analytics visualization, and teachers felt it was too complicated: "I want to see the concepts, but it should be as simple as possible. And it should be automated, because you provided it on paper. If it's presented in an automated way and embedded within the Canvas LMS, then this is possible; I can use this in practice." So from that stage we moved on, working with the technical people at the lab here at the university, to develop an automated dashboard, and this is the interface. It's called CADA, the Canvas Discussion Analytics dashboard.
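To make the "concepts used in discussions" idea concrete, here is a minimal sketch of counting how often teacher-chosen concept terms appear in a set of discussion posts. This is an illustration, not the actual CADA implementation; the function name and sample posts are invented for this example.

```python
import re
from collections import Counter

def concept_frequencies(posts, concepts):
    """Count how often each teacher-chosen concept term appears
    across a list of discussion-post texts (single-word terms only)."""
    counts = Counter()
    for post in posts:
        tokens = re.findall(r"[a-z']+", post.lower())
        for concept in concepts:
            counts[concept] += tokens.count(concept.lower())
    return counts

# Invented sample posts standing in for forum data from an LMS.
posts = [
    "Motivation matters, but feedback drives learning.",
    "I think feedback and scaffolding support motivation.",
]
```

A real discourse-analytics pipeline would go further (lemmatization, multi-word concepts, automated concept extraction), but even a raw count like this is the kind of simple, interpretable number the teachers asked for.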
This dashboard provides metrics about students' participation; the discourse, that is, what kind of content they are engaged with and what they are talking about; network analytics; and, recently added, sentiment analysis, so it can show the sentiment attached to the posts students make in the online forum. This is the interface again, the tool that I developed together with the teachers. When you get into the tool, and if I get time I will show you how it does things automatically in the actual interface, you go to a course and it gives you the content: how are students interacting with the content, and what kinds of concepts are they using in the discussion? How are they interacting with each other? The network view gives you the social network analytics, and the sentiment view tells you whether the content is positive, negative, or neutral. Based on whether this is something important for you as a teacher, you of course decide which aspects you pick up. As I mentioned, I drew on principles from the learning sciences to develop this, so it was not just about what the teachers said they needed; it was also embedded within theory. For example, learning as participation, and mastery of subject-specific discourse and practices mediated by artifacts: these are aspects from the sociocultural perspective, which says that learning is participation. If we can capture participation, if we can show how students are interacting with each other, then we can make inferences about whether students are actually learning and engaging with the content. And from human-computer interaction, we made sure that the dashboard is developed with stakeholders, that it reduces cognitive load rather than increasing it, and that it helps with decision making.
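The sentiment feature described above can, in its simplest form, be a lexicon lookup. The toy sketch below is hypothetical, assuming tiny hand-made word lists; CADA's actual classifier may well use a full lexicon or a trained model.

```python
import re

# Tiny illustrative word lists; a real classifier would use a proper
# sentiment lexicon or a trained model.
POSITIVE = {"great", "helpful", "agree", "clear", "interesting"}
NEGATIVE = {"confusing", "disagree", "unclear", "wrong", "boring"}

def post_sentiment(text):
    """Label a discussion post positive, negative, or neutral by
    counting hits against the two word lists."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

The point for a dashboard is less the classifier itself than the aggregation: a per-discussion breakdown of positive, negative, and neutral posts is a signal a teacher can read at a glance.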
When we went into actual practice, we wanted to see whether teachers found that this dashboard helped reduce their cognitive load or just increased it. That was examined during the pilot studies. I piloted this tool over two iterations with seven courses at the University of Oslo, with 10 teachers across the iterations. Teachers used this tool in practice, showing what was happening in real time, on the fly: they had discussions running, they got the analytics in real time, and they could actually make some adaptations to the course based on what they saw from the discussion forums. Just like one of the group members highlighted: how do we do this in real time? Yes, we have historic data, but what about knowing what's happening in real time, so that the teacher or the student is able to make changes to the course design without waiting until the end of the semester or the end of the course? So this tool was used in real time, and teachers used it in real classes. They ran a discussion, got the analytics, and shared that with us. Some shared it with the students in class, showing some of the screenshots; others spotted the misuse of concepts, or students interacting only according to the learning design instructions. Some teachers were saying, can you discuss with one or two other students? But when the analytics from the social networks came out, they showed that some students were just posting; their posts were directed at the discussion prompts, not at other students. In that case, a teacher could easily see that what students are doing is not aligned with the course objectives.
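The pattern the teachers spotted, posts aimed at the prompt rather than at peers, falls straight out of the reply structure. Here is a minimal sketch under the assumption that each post carries an id, an author, and a parent id (this dictionary shape is an illustration, not the Canvas data model):

```python
def interaction_summary(posts):
    """Split posts into prompt-directed and peer-directed, and build
    who-replies-to-whom edges for a social network view.

    Assumes each post is a dict with 'id', 'author', 'parent_id';
    parent_id None means the post answers the teacher's prompt.
    """
    author_of = {p["id"]: p["author"] for p in posts}
    prompt_directed, edges = [], []
    for p in posts:
        if p["parent_id"] is None:
            prompt_directed.append(p["id"])
        else:
            # An edge from the replier to the author being replied to.
            edges.append((p["author"], author_of[p["parent_id"]]))
    return prompt_directed, edges

# Invented sample thread: two posts answer the prompt, one replies to a peer.
posts = [
    {"id": 1, "author": "Ann",  "parent_id": None},
    {"id": 2, "author": "Ben",  "parent_id": None},
    {"id": 3, "author": "Cleo", "parent_id": 1},
]
```

If most posts end up in the prompt-directed bucket and the edge list stays nearly empty, that is exactly the misalignment with a "discuss with one or two other students" design that the teachers could see at a glance.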
We are going to get more examples about learning design analytics in our next workshop, I believe, but this was an example of how this tool can be used in practice to provide this real-time kind of analytics. There were a couple of findings. This is a paper that was published recently, my last PhD article, where I present the details of the tool, the process we went through, and what teachers thought about that process. In brief, I can say that if you co-design a tool with the teachers and ground it in theoretical constructs, you increase the chances of making it relevant for use. Then, present simple but informative visualizations: don't aim for complexity, because when it's too complex, teachers don't want to look at it. Then configurability: teachers should be able to customize and choose what they want. In this tool, teachers can choose which discussion they want; they can edit the kinds of words to pick up and use. It's possible to customize a few aspects, and then teachers can play around with the tool to get what they want. Then, focus on basic aspects of relevance to the teachers. As we said, this comes from working with them throughout the development process, because if you don't engage them from the start, it's very hard to know what basic things the teachers actually need, so that the tool can emphasize those. And then, learning science constructs should motivate design decisions. Based on learning theory, what kinds of learning constructs are you actually capturing? Is it participation? Is it engagement? Is it about the use of concepts? And how do you develop the analytics that will help you capture these kinds of things? In that case, just as I highlighted, the connection between the analytics and the design belongs together, because the one can't do without the other.
You can't design analytics without considering the design, without considering the intentions of the course. One implementation lesson from my experience is that institutional support is definitely key to classroom and institutional adoption. You can't do this as a teacher alone; you may be interested, but you need the institution with you. Start small, work with a few teachers at the institution, get some of the people who are really interested and keen to use such tools, and then move towards institutional adoption. And communicate the value explicitly: if you can explain and tell people what the tool is really capable of doing, then it's much easier to have tools that are actually implemented, because people will know the value of the tool. And then peer training: if you have any training to give to the teachers, which is key, as we have talked about skills and everything, it's important that you train teachers as peers, not as individuals, because teachers usually work as a group and they can always help each other in the process. So I have a kind of demonstration; I could do this over the lunch break if people are interested. I have 15 minutes remaining, and I think I'll give you an opportunity to have a discussion instead. If there is something you are keen about, it would be relevant to hear from you so that we can discuss further, but I will try to pull up the actual tool. I wanted to bring the version where you can play around with it, but for ethical reasons I brought the version that could be shared with you. What I can tell you is that this tool is going to be made available to all institutions. That means if you have an LMS and you are interested in adapting it to your own institutional or teacher needs, it will be available, I think, within a month.
The lab is working on completing the open access details, so it will be available. Feel free to follow up; I will share the presentation, and if you have any questions, my contact details are on the slide. So thank you so much. For now, I will stop sharing and welcome some questions from the audience, and then if we have time I will show you the tool as it works in real time. All right. Thank you very much for this informative session. If there are any questions, please go ahead and ask. I have one question that I would love to start off with. At the center of learning analytics is the learner and the improvement of their learning experience and learning outcomes. Now, with the tool that you're going to be showing as an example, how do you ensure certain governance structures are in place so that these things are used solely for the intended purpose, and not to build prejudice or bias against certain learners, or even against certain teachers and lecturers? How do you build that in and ensure that all these tools, all these dashboards, are always, or at least in most cases, used for the sole purpose they are intended for? Yeah, I think that's a very good question, Bradley. I think the bottom line for learning analytics, for everybody engaged in this process, is transparency, right? For you as a researcher, and also for the institution: we have a fiduciary role as an institution to help learners learn. If we are designing tools to help us promote that, that is fine. So the whole thing is about transparency and trust. We should be transparent, and we should be trustworthy: if we are designing tools, they should not be used to undermine or disadvantage students.
Of course, people are using algorithms and data in many different ways, and if they are not used in a good way, they can actually disadvantage certain groups. But I can say, for example, with the tool we have been developing, from the start the intention was pedagogical relevance and pedagogical support: supporting students' learning by providing teachers with timely learning analytics about what students are doing in online environments, and using this information to customize the learning design and provide individual feedback. From a learning analytics perspective, it's about you as a researcher and about you as an institution. Talking about governance, the institution can develop clear policies about using learning analytics: what you can and cannot use analytics for. If this is very clear, it works from an external perspective, but it can also work from an individual perspective: as a researcher, as a teacher, when you have the analytics, it's an ethical issue, and it comes down to you as a person not to use the analytics to disadvantage a student. But if that ethical sense is not there within an individual, then having governance structures that are very clear about the dos and don'ts of using these analytics at an institutional level could be the key. Make sure you have the structures, make sure you have the policies on how best to use the analytics, so that the tools are used specifically for the task and the purpose they are actually meant for. Thank you. Anybody with a question? You may unmute, or alternatively just add it to the chat. Bradley and Roger, seeing that there are not a lot of questions and we have about 10 minutes left.
Roger, if you can do the demo, I think it might bring up some more questions, but you are more than welcome to go ahead and do it. Thank you. So, can you see my screen now? Yes, yes, we can see your screen. Yeah, just as I said, I failed to get the demo course, so please don't take any screenshots of this course, because this is a real course, but the teacher said it's fine to use it for demo purposes. Sorry, am I audible? Okay, yes, now I can hear. I actually had my hand up earlier, just a very quick thing. So I wanted to ask, with regard to the global availability of the tool: is it going to be open source? And you were talking about embedding it into our LMS. I was thinking about improvement and development: how are we going to approach that? Because now it's not sitting in a specific cloud resource where it's only, you know, using the API to pick up the data from your LMS. If it's embedded in your module, I know most times module administrators don't want to embed a different plugin because of the problem of maintenance. So I don't know if you have thought about all these kinds of restrictions during the design phase, so that we can easily integrate it into our own LMS. Thanks. Yes, thank you. One thing: I said yes, it's going to be open source. And then there's the question of adaptation, and institutions really differ there. When we were working with the University of Oslo, they would usually develop LTIs, or what they call plugins. This is not a problem here; we have so many LTIs. So once it's developed, it can be added as an LTI for Canvas.
And if a teacher is interested, it can be added as a module by the technical people. So at least here it has not been a problem. The issue of maintenance, though, is an institutional matter: whether the institution is actually willing to have these kinds of tools embedded within their platforms. When it comes to maintaining or adapting, as a researcher you may not be able to work on this alone. Depending on your role, you still need to work with the technical people, even if it's open source, because maybe you want to make changes, integrate the tool, or make some functional changes. But I can say, if you really need some support from a technical perspective on where the tool stands at the moment, just get in touch with me, because I think the technical team at the university are more than happy to give you whatever support you may need to have the dashboard integrated. I know they can offer some minutes to have the tool embedded within your LMS. Working with your own institutional technical people will be really key in having everything in place. I can't give a very general answer on what happens in an institution, because this varies from institution to institution. At least here it's not a problem to maintain the tools and add them to the modules. You only add the module you are interested in, so it doesn't have to be there permanently; you can always remove or add it according to different course requirements. Thank you. Thank you. So we are about seven minutes over. Sanford, I'm not sure if that's a new hand, but for the sake of time, I think it would be wise to bring it to a close so that we can have our lunch and not delay the rest of the program. Now, thank you very much to everybody who's joined.
Rogers, thank you very much for this informative presentation. To everyone, this presentation will be shared. The contact details are available at the end of the presentation, so we should be able to get hold of Rogers for some of the technical questions that we have. Now, we will play some music. Please do enjoy your lunch, and we should be back here again at 14:30. So that's going to be about 37 minutes, as opposed to the 45 minutes. Apologies for that, and for taking eight minutes of your time. Thank you very much, everybody, for joining.