EDINA's work with learning technologies helps to develop skilled, data-literate students who can change our world for the better. Teachers and students can develop and share coding skills with Noteable, our Jupyter Notebook service. Our Digimap services deliver high-quality mapping data for all stages of education. Future developments include a text and data mining service, working with satellite data and machine learning, and smart campus technology.

...conference, I think you've made a good choice. We have three short papers for you in this session. There'll be 15 minutes' presentation and five minutes' questions for each paper. We are being streamed in here, which I think was already mentioned at the start of the day. And remember that we do have the Vevox app for any discussions and presentations that are happening in this room. So if at any point you want to make a comment or post a question, please do so and we'll try and pick those up when we get to the questions for each of the sessions. So without any further delay, I'm pleased to welcome Yishan See. I think, hopefully, I've just about got that right. And you're going to be talking about working towards a systematic adoption of learning analytics, on behalf of yourself and quite a number of colleagues as well. So, thank you.

Hi everyone, I'm from the School of Informatics at the University of Edinburgh. This talk is about stakeholder expectations and concerns regarding the use of learning analytics in higher education, and it's based on the output of a large-scale European project called SHEILA, which stands for Supporting Higher Education to Integrate Learning Analytics. As the name suggests, the goal of this project is to help higher education institutions to adopt learning analytics effectively, systematically, and responsibly. I'm presenting this work on behalf of the whole team, and I would just like to acknowledge their great input to this work. This project is made up of six partners.

Learning analytics is about collecting and reporting data about learners, who constantly generate data as they learn. So we can make use of this data: we can integrate it with various sources of data, such as student characteristics, and then generate insights which can help us make better data-informed decisions. It can be useful for students to adjust their learning strategies, for teachers to adjust their learning design and their course design, and for managers to better allocate resources. And if you're interested in learning about learning analytics in three minutes, I would like to share with you a short video I made. It's called Learning Analytics in a Nutshell; you'll be able to find it on YouTube.

Okay, so to understand what different groups of stakeholders think about learning analytics, we engaged various groups of stakeholders, including senior managers, teachers, and students, using surveys, focus groups, and interviews. For senior managers, we looked at their motivations to use learning analytics. We found that the top five motivations are: to improve student learning performance, to improve teaching excellence, to improve student satisfaction, to improve student retention, and to explore what learning analytics can do for the institution, for teachers, and for students. So we can see that the top four items here are very familiar key performance indicators for institutions.
And we can also see, from the fifth item, that at the time of the survey, which was towards the end of 2016, learning analytics was still a fairly new idea to many institutions. They were still just exploring the idea, trying to figure out in what ways they could benefit from learning analytics.

For teachers, we observed three areas of interest. At the student level, we found that teachers were particularly interested in using learning analytics to help students develop their self-regulated learning skills, and also to give them better access to their own learning progress, so that students would be able to make better learning decisions and weren't unnecessarily over-anxious or over-optimistic about their progress. At the teacher level, teachers were quite interested in using learning analytics to identify students' weaknesses and provide support to them. They would also like to know how students are engaging with the learning content that they have prepared, so as to identify any need to adjust the course design and the materials they have prepared for students. And at the programme level, especially for teachers who were responsible for managing programmes, they were particularly interested in an overview of how well the programme is doing and in what ways they could improve the overall quality of the educational provision.

For students, we observed four areas of interest: personalised support, peer feedback, better navigation of academic resources, and also opportunities for self-regulated learning. Here are just two quotes as examples to illustrate what they mean by personalised support. The first student was talking about the widening access that universities are all striving for, and yet students themselves don't feel that universities have really done well in terms of providing the same kind of access at the course level, making sure that every student is on the same page and nobody is being left behind. The second quote provides a very good example of what they meant by somebody being left behind, because the teacher didn't know that the student hadn't learned to use a microscope, and yet the class had just started working with microscopes.

This quote regarding feedback came from a student who was in their first year at university, and they were saying that the big change between their previous education and university education is that contact with teachers has become much less, so they would really appreciate it if learning analytics could allow them to get more feedback about where they are and where they should be improving. In terms of resource access, we heard particularly from students who have learning difficulties and learning disabilities that just wading through the information about the various kinds of support available at the university is itself a very challenging task. So if learning analytics could provide them with more targeted recommendations about the support available, they would really appreciate that.

From our survey with students, which was rolled out to six institutions and reached about 3,000 students, we also found that among the items related to their expectations of learning analytics services, the top three items are all related to self-regulated learning. In this survey, we asked students to rate their ideal expectations and their predicted expectations for various items. The ideal expectation is what they would ideally like to see, whereas the predicted expectation is what they expect to see in reality.
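To make the two scales concrete, here is a minimal sketch, using invented scores and column names rather than the actual SHEILA survey data, of how mean ideal versus predicted expectations for one item could be compared across institutions:

```python
import pandas as pd

# One row per student response; "ideal" and "predicted" are the two
# expectation scales described in the talk. All values are invented.
responses = pd.DataFrame({
    "institution": ["Edinburgh", "Edinburgh", "OU Netherlands", "OU Netherlands"],
    "item": ["complete_profile"] * 4,
    "ideal": [7, 6, 4, 3],        # what students would like to see
    "predicted": [5, 5, 3, 2],    # what they expect to see in reality
})

# Mean scores per institution for this item; a lower "ideal" mean at one
# institution is the kind of difference discussed next.
print(responses.groupby("institution")[["ideal", "predicted"]].mean())
```

A gap between the two columns is itself informative: it shows where students' hopes outrun what they believe their institution will actually deliver.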
And we can see that, for both scales, the top three items are about receiving a complete profile of their learning, making their own decisions based on analytics results, and knowing how their progress compares to a set learning goal. But we also noticed that, among all the samples we received, the Open University in the Netherlands did not give a very high score to this item about receiving a complete profile of their learning. This is an interesting one. We do not know why, but what we do know is that the average age of this student population is much higher, because students at the Open University tend to be professionals; they are already working, so they are more mature students. It could be that they tend to have a better awareness of where they are regarding their own progress, and perhaps because of that they don't feel they need these constant updates. What is especially important about this finding is that the implementation of learning analytics can never be one-size-fits-all, and that students in different contexts do have different needs.

What about concerns about learning analytics? Among senior managers, we observed four areas of concern: return on investment, whether the investment is worthwhile or not; resources, whether institutions have enough resources to drive the use of learning analytics, including financial resources, technological infrastructure, and enough people who can work on learning analytics; whether the university has a data culture, whether people are willing to make decisions based on data; and finally whether there are enough skills to drive learning analytics. Here is a quote illustrating one manager's concern and uncertainty about whether learning analytics can really do them any good or bring about any change at all.

As for teachers, their concerns are around three areas: students, teachers, and learning analytics itself. I will just highlight teachers and learning analytics, as I will be talking about student-related concerns later. The top concerns that teachers have about themselves are, number one, workload, and number two, potential judgement of their teaching performance if learning analytics is used as a managerial tool. As for learning-analytics-related concerns, teachers have a reasonable scepticism about whether, and to what extent, learning analytics can capture the differences among individual learners. There is also the fact that learning is quite difficult to observe and difficult to define across different disciplines, and the way we collect and analyse data can all affect our interpretation of learning. So to what extent learning analytics can present us with a faithful picture of learning is a question that teachers asked. Here is another quote, where a teacher was talking about not wanting learning analytics to make students perform in ways that satisfy algorithms.

As for students, there are a few areas of concern, including the shared concern about to what extent learning analytics can give them a precise picture of learning, but the primary concerns among students are related to privacy and ethics. I would like to talk particularly about access and anonymity, because purpose and security are more about expectations of what institutions should do. Three themes came up from our conversations with students when they talked about access and anonymity.
The first one is the fear of surveillance. I think we all heard earlier, in the keynote, about this negative feeling of being watched. There is also a fear of being labelled, which leads to stereotypes and then unfair treatment and marking. And finally, there is a very strong distrust of third parties, especially when talking about the university sharing their data with external parties. This is due to the fact that students do not know what would happen once the data travels out, and there is also a fear of becoming the target of commercial emails, spam emails.

So what we have seen is that there are very different priorities and concerns among these three key stakeholder groups. What we have been able to do is develop a framework, which we call the SHEILA framework, based on the data we have collected from direct engagement with 89 institutions across 26 European countries. We have developed a comprehensive set of key action points related to learning analytics adoption, the primary challenges that institutions and different stakeholders are facing today, and a list of questions that we encourage policy makers to answer when they are developing a strategy or policy for learning analytics. We try to take them step by step through the key dimensions, to take a really holistic approach to learning analytics.

I'm going to give you some examples here from the long list of statements we have created in the framework. For example, in terms of mapping the political context, we ask decision makers to consider what the reasons for adopting learning analytics are. In the dimension of identifying key stakeholders, we ask: will there be mechanisms to address inequality? Moving on to identifying desired changes, we ask: how will the purpose of learning analytics be communicated? In terms of strategy: how will the results of learning analytics be interpreted within the context? Moving on to internal capacity: what training will be provided to scale up data literacy? And in terms of monitoring: what are the limitations of learning analytics, what can learning analytics do and what can they not do? This is just to show you how they are all connected to each other and that this is an iterative cycle.

Just to quickly show you, we have developed a web tool based on this framework, and this is the interface. It's openly accessible. You can drag and drop the statements, put in your own statements, label them by their relevance to different stakeholders, and thereby create your own policy framework. So that's the key output of this project. I'm happy to take questions. Thank you.

A couple of minutes for questions. If you could just introduce yourself and say who you are, and then ask your question, that would be great.

Hi, my name is Richard de Blacquiere-Clarkson from the University of Leeds. We're just sort of getting started with learning analytics systems, so this is really interesting. I had a question about the idea of conflict between different stakeholders, because it seems like perhaps it's not as big a problem as some people might fear, but there are going to be conflicts of interest, in particular things like student privacy with regard to having a system that delivers useful data. Do you have a general approach to resolving that kind of thing?
So I'm thinking about examples where a student says: okay, I understand you need to track my attendance, my grades and so on, but I don't want you to have anything to do with my physical location or gender, for example. Or is it something that would have to be dealt with on a case-by-case basis?

In general, we are trying to encourage a dialogic approach to adopting learning analytics, which is to bring different stakeholders together to talk about how they would like to use learning analytics and really to find a balance. And I hear your question; I don't think I have an absolute answer to that, because it really depends on context. I think at the end of the day it comes down to the balance we can reach within our own context. So, for example, perhaps we could offer opt-in and opt-out options for certain uses or collections of data, particularly when it comes down to interventions. This is also what Jisc has been encouraging people to do: we may not necessarily need consent to collect certain types of data when it is for legitimate purposes and in the public interest; however, when it comes to interventions, it is quite crucial that we do have students' consent. So I think that would be a principle there.

Thank you very much. We're just on time, so what I'll ask everyone to do is give Yishan another round of applause, please. And there were another couple of questions that were kind of similar up on the wall, so if you'd like, feel free to take the chance to answer those on Twitter during the conference. Thank you. Can I invite the next speakers up, please? So now we have Sarah Knight, Mark Langer-Crame and Ruth Drysdale from Jisc, talking about a data-driven approach to student engagement. So over to you.

Thank you, that's great. Thank you, Keith, and thank you very much for the opportunity of presenting in such a spectacular location. This is very much a team effort, and I also need to add Helen Beetham, Tabitha Newman and Clare Killen to the acknowledgements in this presentation. We're really delighted to share with you today the outcomes from our 2019 Digital Experience Insights student survey. I hope some of the messages coming out of our data will build on what we heard this morning around some of the themes, but also bear them in the back of your mind as you go through the sessions over the next three days. Because one of the most important things from our work is that we do need to consider our students. We make so many assumptions about what they like and what their skills are, without necessarily always having an evidence-based approach to what students are actually doing with technology and what their views are on what we might be offering them. This work builds on many years of underpinning research, and I'm delighted today to share with you some of the results. Importantly as well, with the drivers of declining resources, both for colleges and universities, we have to make sure that everything we do counts. We know there's a lot of rhetoric in government policies around the importance of technology, but we know that there is still a mismatch between what we're able to offer, how we're able to support our staff, and those widening expectations.
But two sobering thoughts in relation to the investment that colleges and universities are making in their infrastructure: if you look at the percentage of budget spend relating to ICT and infrastructure, we really do need to be making sure that we have actually got the evidence for return on investment. With your digital strategies, how do we know that we are investing in the right areas, the areas that are actually meeting our students' requirements?

The work we've been doing around digital experience insights over the past four years has started to bring together a sector-wide body of evidence. We've now got longitudinal data from over a hundred thousand students from over a hundred institutions across the UK on the impact that technology is having. The surveys, which we have crafted and tested extensively with users, are now robust enough to provide us with the data that I'm sharing with you today. And importantly, I think we often forget that we should also be asking our staff. We need to be able to validate the views coming through from our students about their uses of technology against what our teaching staff are thinking and what our professional services staff are using; we really need that holistic picture of how technology is being used across our organisations. With the work we've been doing around those three areas, we now have a mechanism for institutions to start gathering that data: looking at digital in the lives of their staff and students, looking at digital within the institution and how that's being used, looking at digital at course level, in terms of the experience students are actually having on their courses, and lastly looking at attitudes, which I think is particularly important for teaching staff.

So what do our reports say for this year's data? Well, we were very delighted to work with 50 universities and colleges across the UK, and we gathered over 29,000 student responses. So again, a very, very large sample on which we were able to analyse the data. I'm really delighted to say, because these are literally hot off the press, that these are available from our stand; if anyone would like a physical copy, do come and visit us on the Jisc stand later today. But the report is now available to download online as well. In the short time that I have, this is a very quick tour of some of the key headline statements that have come through from this year's data. For the depth of the analysis, please do go and have a look at the full report, because there's a lot of statistical work that Mark and Tabitha, and Helen on the qualitative side, have been involved in reviewing.

So if we look at our first theme, which is around how students are using technology in their own lives: it's really important for us to recognise that students are using their mobile phones, their smartphones, as an essential tool in their everyday lives, and there are more opportunities for us to make sure that all our university and college systems are equally available and usable on those devices. There is some discrepancy in terms of device ownership between HE and FE, and that comes back to some of the digital divide issues that we still need to be aware of. So what are the most popular activities students are using their devices for?
Well, for FE it is around making notes and recordings and, as we're all doing today, taking photographs of slides, annotating notes, using phones and technology to actually keep records of what we're hearing. For HE it's very much about accessing lecture notes and recorded lectures, and I'll say a little bit more about that in a moment. We are seeing assistive technology becoming more and more mainstream, and we very much recognise that not all students will necessarily have been assessed for disabilities they may have, and many are using the built-in features of the standard devices they have available to help make learning more inclusive. I think we always need to remember that we should be making learning inclusive for all, particularly of course with the legislation that we're all now working towards.

I always enjoy the word clouds, and thanks to Helen for her work on these. It was great to see this one in particular, where we asked students for an example of a digital tool or app they find really useful for their learning. They say it all, don't they, in terms of the similarities between FE and HE; but if you look at the detail, you can see the variety of different apps that students are using, with a clear dominance of the ones highlighted there in large text.

In terms of key messages for the digital lives of learners, we really do need to make sure there is equitable access to technology for all. We need to be supporting learners with their own devices, and only 53 percent of FE learners say they actually get support for that, and we need to make sure we are not assuming that students are aware of the accessibility features built into the devices they may be using. Staff confidence, and I'll come back to that shortly, is an absolutely important enabler, as students are still most likely to turn to their lecturers and their tutors for support, and we need to ensure that staff are digitally confident in providing it.

If we're looking at digital in the institution, we've got some key stats to highlight. In particular, we're seeing a very positive statistic in relation to organisations' digital provision: we ran some correlations with institutions that received positive and good NSS ratings, and there was a correlation there with the findings on that particular statistic. We're seeing that there are still some discrepancies around Wi-Fi for HE and FE. We're still not at 100% reliable Wi-Fi; will we ever get to that point, given that expectations are always growing and moving? And, as I said earlier, we're still very much seeing the importance of ensuring that we can support students with their own devices as they come onto campus. The comments we heard in the discussion in Sue's keynote this morning are picked up here: we are seeing students having more awareness around data protection, and I think our work around data privacy and GDPR has contributed. We have seen an increasingly positive shift in that statistic, but we're still only at 61% of FE and 54% of HE students agreeing that their organisation protects their data, so that's certainly something that we need to work on.
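As an aside on the word clouds mentioned above: they are straightforward to reproduce from free-text survey answers. A minimal sketch using the third-party wordcloud package, with invented answers rather than the actual Insights responses:

```python
from wordcloud import WordCloud  # pip install wordcloud
import matplotlib.pyplot as plt

# Invented free-text answers to "name a digital tool or app you find
# really useful for your learning", not the real survey data.
answers = """Google Docs, YouTube, OneNote, Moodle, Quizlet,
YouTube, Google Docs, lecture capture, Google Scholar, YouTube"""

# Word frequency drives word size, which is why the dominant tools
# stand out in large text.
cloud = WordCloud(width=800, height=400, background_color="white").generate(answers)
plt.imshow(cloud, interpolation="bilinear")
plt.axis("off")
plt.show()
```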
So it is about getting the basics right. It's ensuring that Wi-Fi is seamless, especially in areas such as student accommodation or sites that are off campus. It's also about ensuring access to timely and accessible lecture recordings; for HE that was absolutely one of the key areas of concern, they wanted good quality and they wanted timeliness, and also to be able to access good, high-quality academic content and for more content to be digitised.

If we look at digital at course level, there are some key stats in relation to how students are accessing information relating to their courses. Still a concern, I think, is that 29% of FE and 24% of HE students never work online collaboratively, and if we're thinking about preparing them for the workplace, that's quite an alarming stat. Also, although we had the positive trend around data privacy, still only a third of all students agreed they were told how their data is actually being used; I think that ties into some of the earlier comments around data that we were hearing in the previous presentation. Importantly, for digital at course level we've got some key stats that very much mirror what we had last year in relation to the importance of preparing students for the workplace. There is still a huge discrepancy between what students are able to do and the expectations that are there when they move into the workplace. So we need to encourage more collaboration and emulate business practices to ensure that students are better prepared. We need to look at how we can support the embedding of digital skills into the curriculum, and that, I think, is still one of the core messages that we really need to work on together, and we need to be better at articulating why digital skills are important for the workplace; there are some lovely anecdotes around email that we could relate to that.

Looking at the last theme, around student attitudes to digital, here we have some positive stats in relation to students feeling more independent and being able to fit learning into their lives when digital is used. Four in ten students would like more technology to be used on their courses, and when we added a new question this year asking students which options would be the most useful to learners, HE students chose more practice questions available online, course-related videos, and references and reading. So some interesting thoughts there in terms of how we can better support our learners with their requirements.

There are some more key messages in relation to where we can make learning more engaging through the use of digital, and to really ensure that our students can manage their time and take notes, encouraging learner autonomy, but importantly involving students in these decisions: only a third of learners said they had the opportunity to be involved in decisions about digital, and one of the core drivers for having this survey is to open up that dialogue and actually give students the opportunity to be involved in discussions around digital with you. We've got some great resources; we've been working with the NUS on developing a roadmap for supporting students to improve their digital experience, and we've also got some briefing papers for senior leaders around the importance of this. So there is some great guidance there for assessing where your institution is in relation to this.
This was a little bit of a promotion for some of our other sessions, which go into more detail. We've got a workshop after lunch, and we have a session tomorrow with Helen and Tabitha, who'll be presenting some of the deeper analysis work that has gone on around the data sets we are collecting. But importantly, now I'd like to hear from you, using our voting system: do these results align with your experiences of your students? I'm hoping that there are some similarities, but more importantly I'm hoping that you've got ideas on how you can gather your students' views if you're not using Insights, and I'd like to know which results surprised you most. So we're going to do question one first and give you the opportunity of selecting yes or no, and then we will close the poll and display the results. Do you want to close that one? I think we've still got people voting. Okay, great, I think that's quite a positive response. And then moving on to our second question: which results surprised you the most? Was there anything from that very quick overview that you were quite surprised about? There's a word cloud generating around that. Digital skills seem to be coming up, and I think one of the things we are seeing from this work is that the importance of preparing students for the workplace is both a driver for our institutions to be investing more in supporting staff with their digital skills, and also, importantly, a driver for senior management to ensure that we continue to invest in this area. Great, I think we'll close that there, as we're running out of time, but hopefully we can capture these and tweet those as they continue to come in. Super. Thank you, and just to say, as I said, we have the report, but we've also done sector summaries for FE and for HE around those key stats as well, so please do visit us on the Jisc stand and come and get your copy.

So I think, if we can flip back, just to the last slide as the closing. Oh, we've got questions, great. Thank you. We do have three minutes or so for questions. Let's flip back to the ones that are coming in on screen as well. Yeah, absolutely, there are a couple of observations on the screen. I think we have our first question in the audience over there. If you could introduce yourself and say where you're joining us from, that would be great.

Hello, it's Alan Williams from the University of Derby Online, so obviously we're in a particular sector. It's more a comment: the data is saying smartphones and mobile devices, not desktop PCs, not tablets; handheld devices are the future, I reckon.

I think that's true, although for FE we're still seeing, when we ask the question about what one thing institutions could do to better support you, that both for FE and HE it was about having more access to laptops in classrooms and being able to actually use those devices, bearing in mind that there are some more specialist activities for which you do still need a desktop in front of you, so there's a balance there. We have a question at the back as well. Would you take that one? I'd just add that we also have some analysis to do on our online data set; we did have about one and a half thousand online students that we're doing some further work on.

Oh, hello, it's Christine Spratt from the Hong Kong Polytechnic University. That was interesting, thanks very much.
I've got a couple of comments rather than a question. The first is that it doesn't matter what the students are sitting in front of if what they're actually looking at doesn't have any meaning for them and isn't embedded in good pedagogy. So I think the dilemma that we have is based in the curriculum. Especially when I was looking at what students said they are doing when they're accessing their digital devices: they want good-quality academic content, they want more interaction, online and probably face-to-face as well, and they want to have a sound experience. So I think the curriculum is missing there, and the rest of your data might unravel some of that, but unless we bring back the curriculum, and what academics mean by the curriculum, it doesn't really matter what digital does.

I think your overarching comment is about the curriculum, and all the research we have today says it is about good, effective, sound curriculum and assessment design. If you have that in place, with technology seamlessly integrated, that will give your students the opportunities to develop their digital skills. That's what we're aiming for, that's what we're aspiring to. There is more detail in the report around digital at course level that starts to unpick that a bit further, particularly in some of the qualitative analysis, but no, I absolutely agree, it is about that. Thank you.

I think we've got about 30 seconds or so. Are there any questions up there that you'd like to give a 30-second response to?

I think that one about trends over time is interesting, and it's something that we are looking at. Obviously it's difficult because we have different cohorts each year, but that's something I think we can give a little more detail on in some of the more in-depth analysis that Mark has been doing along with Tabitha, so I would encourage you to come along to their session tomorrow at 12.15, because I think that does unpack some of that a bit more.

I think we'll have to stop there, but thank you very much indeed for a great session, and thanks to our colleagues once again.

Great, so thank you. So, for the final talk in this first parallel session, let's hope your slides reappear in a second. There we go, we're back to the title slide. That's no good, I've got the wrong one up; I'll let you quickly fix that while I say that we're now going to welcome Martin Lynch, talking about 'How green was my tally: validating the proxies and predictions of a learning analytics service'. So over to you, thank you.
Thanks very much. When I proposed the presentation months and months ago, I bit off more than I could chew, so I've had to chuck about three quarters of my slides out overnight, and I realise I've now got 15 minutes to get through this, so if you want the director's cut, see me afterwards. I'm going to unpack the background really quickly, I'm going to miss out a whole chunk, and we'll get straight to the results. So we're going to cover: the background to what we're doing at the University of South Wales with the Jisc learning analytics implementation; our early experiences with predictive analytics and the scourge of the black spot; a basic overview, which is the bit I'm going to skip, of how predictive analytics works; and then straight into the results of the validation exercise we've just carried out, and I say just: the slides came in last night. And what the results mean for us on this project.

A bit of background and context. The University of South Wales is one of the largest universities in the region of South Wales; we've got four campuses, and our stats are there. The takeaway is that engagement and retention are big deals for us: if you look at the POLAR3 stats, most of our students are from a low-participation background, so we're very interested in engagement and very interested in supporting our students. We have a very low tariff, but we have not-bad retention rates, so we're trying to do some good work in that field. We joined the Jisc effective learning analytics programme back in March 2016. Show of hands, who else is in the Jisc programme? Very few, one or two, okay. It's taken us two years of preparation work to get to what we called the full service, which went into effect in September last year, so we've only been operating for a full year.

Our primary focus for the work was really adding value to our, as was then, fledgling personal academic coaching programme, and that had a couple of key objectives. The objective was to provide personal academic coaches with meaningful dashboards of data: engagement data largely, library use, attendance, grade information. These would inform conversations with students. I haven't got time to unpack the impact assessment we're doing; the takeaway from that is it has made a difference to our students in our PAC programme. A secondary expectation was that the data would be used to drive interventions, that our personal academic coaches would use engagement data and the traffic lights that were implicit in that service to make interventions. This did not happen. I can tell you more about why, but it did not happen for a number of key reasons, nothing to do with the data but mostly to do with human systems. A sort of also-ran programme in here was just to explore the potential of predictive analytics, which for all of us in the programme was quite an early offer.

The schema of what an analytics service looks like is something like this. At the bottom is all the data that we're collecting about our students: the course-level information, module-level information, staff on a module, etc. Going from left to right, we've got all the past stuff about the students and the courses and modules they're enrolled on, and then we're moving into activity: clicks and visits in the VLE, book borrowing (we use the Alma system), attendance monitoring (there's a whole other story about attendance monitoring, which I won't get into, coming next year), Panopto sessions viewed, and, quite excitingly, electronic resources accessed through the library.
This all gets sent up, and the critical thing is that the data has to conform to a data specification, the UDD; that's the master plan, and if it's not in the UDD, it's not getting in. It is all sent up into the learning data hub, and this is where the magic happens, where the learning analytics processor does its stuff. Of the outputs, the top layer, the most obvious for our staff, is the staff dashboard; this is the descriptive data screen that staff are accessing and viewing, all that good stuff. The students themselves are getting an app, and they're using that app to consume a smaller subsection of the data, but they're also using it to check into events for attendance monitoring. We've been cunning little foxes, and we've now got our own way of drawing this data back in through a reporting service and adding stuff that we can't get into the UDD, data that we know is important but doesn't actually form part of the UDD; we're going to be using that for an exciting next stage next year. And then, if this was all working nicely, you'd have alerts and interventions coming out of it. That's the programme. Now, we're in this for another two years, our licence lasts until 2021, so this is one year out of three: watch this space.

Okay, the scourge of the black spot. If you were part of the analytics programme back in the early days, you'd have had to choose a data science partner. We chose Tribal and their mature product, Student Insight. Any Student Insight users here? Good, because now I can talk about what happened for us. Student Insight is a very mature product, but the problem that we found was with the screens it was presenting: they showed a percentage likelihood of risk for each student, and it was basing that on quite a lot of personal characteristic data, parents' educational background, their highest tariff on entry, this type of thing. The term 'black spot' was coined because our personal academic coaches were saying to us: if a student comes to us with some of these characteristics and they have a bad prediction against them, I can't do anything about that. I can be aware of it, but it sets up a bad precedent; it gives them a black spot, as it were. It goes against the ethos of the coaching model that was being promoted, so staff were really reluctant to engage with this. As it turned out, it was also very much a deficit model: it wasn't looking at the cohort of a course, but at the individual. However, last summer the supplier said: okay, if you want to continue from this point, you're going to have to pay a licence fee. We hadn't even validated the results, we weren't even sure about the results, but we were going to be expected to pay six figures. So we walked away; well, in fact, we were chucked, we were essentially dropped from the partnership.

That left us working with Jisc. Jisc by this time had selected a consortium led by the consultants Unicon and Marist College. The difference with this model was that it focused almost entirely on the descriptive data, on behaviour and activities. The personal characteristics were in there, but they're very minor, very background: no black spot. And the model only starts to generate predictions when the activity data appears. So we were very happy with that.
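As an illustration of the 'if it's not in the UDD, it's not getting in' point above, here is a minimal sketch, with invented field names rather than the real UDD schema, of the kind of conformance check a record might pass before being sent up to the data hub:

```python
# Invented, simplified field specification; the real UDD is far larger.
UDD_FIELDS = {"student_id": str, "module_id": str,
              "event_type": str, "timestamp": str}

def conforms_to_udd(record: dict) -> bool:
    """Accept only records whose fields and types match the specification."""
    return set(record) == set(UDD_FIELDS) and all(
        isinstance(record[field], ftype) for field, ftype in UDD_FIELDS.items()
    )

vle_click = {"student_id": "s0001", "module_id": "CS101",
             "event_type": "vle_click", "timestamp": "2018-10-01T09:00:00Z"}
extra = dict(vle_click, shoe_size=9)  # a field outside the spec

print(conforms_to_udd(vle_click))  # True: it's in the UDD, it gets in
print(conforms_to_udd(extra))      # False: not in the UDD, not getting in
```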
Our data was quite mature, so we were able to get our predictive analytics model working quite quickly; we enabled it last September, so it has been running for a full year. I'm going to skip this whole section on how it works, you don't need to know about that; there's a ROC curve in there for any of the data scientists.

The results. So what did we do? We were capturing the predictions that were coming out for 61 courses, about 3,000 students. We chose those 61 courses because of their low, medium and high retention rates; we were very interested to see whether the predictor was any good at predicting academic risk. What it was predicting, or set out to predict, or claimed to predict, was the likelihood of a student not progressing on the course, deemed an academic risk. We'd already developed a method for anonymising the data so we could give it to an undergraduate; in fact we gave it to an MSc student. We created a research proposal, and the MSc student took it and ran with it. The challenge was: take this data, take the actual results from these 61 courses, and compare the predicted result with the actual result, not as a whole but, very interestingly, through the year. At what point was this predictor any good, for what reasons, and for what courses? In the literature there are very few people, if anybody, doing this kind of thing around timeliness: when do the predictions become good? And we wanted to know, at a secondary level, how the model performs over the course of the year.

Right, so what are we trying to do? We want to intervene early and we want to intervene effectively, so we want an early indication of risk. Now, the Jisc model differs slightly from what we might be looking at, in that the Jisc model looks at the bottom 15 percent of student average grades. If anyone is familiar with the types of progression codes you can get for a student, and we've got about 65, it's a moot point as to what categorises success and failure. Is resitting an assessment a risk? A lot of academics would say it probably is; does the student see it as a risk? And so on.
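To make the two ingredients of the validation concrete, the risk definition and the through-the-year comparison, here is a minimal sketch with invented grades and flags, not the actual USW data:

```python
import pandas as pd

# Invented end-of-year average grades for a small cohort.
grades = pd.Series({"s1": 72, "s2": 38, "s3": 55, "s4": 61,
                    "s5": 29, "s6": 67, "s7": 44, "s8": 58})

# The risk definition described in the talk: the bottom 15 percent
# of student average grades.
cutoff = grades.quantile(0.15)
actually_at_risk = grades <= cutoff          # ground-truth labels

# Invented weekly predictor flags. The validation question is not just
# "was a student ever flagged?" but "in which week was the flag raised?",
# since only an early flag leaves time for an intervention.
flags = pd.DataFrame({"week1":  [0, 0, 0, 0, 0, 0, 0, 0],
                      "week10": [0, 1, 0, 0, 0, 0, 0, 0],
                      "week25": [0, 1, 0, 0, 1, 0, 1, 0]},
                     index=grades.index)

# Share of truly at-risk students flagged, week by week.
print(flags[actually_at_risk].mean())
```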
I've got five minutes, so I'm going to crack on and skip precision versus recall, although it's an important concept. Right, the results; this is what you're all here for. On the left you've got the students in our sample who we knew were actually at risk, and on the right you've got the ones who were not at risk; in red, the predictor predicted those at risk. Now, you might think 75 percent, that's not bad; but a 75 percent chance of being predicted just means that they were picked up by a predictive score once in the entire year. As a year-long aggregate, that's not so bad; unfortunately, through the year, it doesn't look so good. It's probably easier to see in the weekly view than the monthly view; this is the whole data set. You can see where you might think there was a gap: unfortunately, the predictor service stopped working for about seven weeks, which is why there's a gap in the middle. Here the graph is reversed: the top layer is the known at risk, the bottom is the not at risk. What we really want is the top layer to be all red and the bottom layer to be all blue; all red would mean we actually predicted the students who were actually at risk. As you can see, it's not particularly effective, particularly when you might want it to be effective, right at the start of the year, because then you can do an intervention. By the time it starts to improve, it's the end of the year, which is not particularly helpful.

However, that's not the whole story. For some courses it's very good. Here's the BSc Computer Games Development course, and this is a very good graph: mostly red at the top, mostly blue at the bottom, and on the first of the bar charts that's about 55 percent. It's even better on this one, the BSc Computer Science, our best-performing example: 75 percent of the students who actually failed were identified in that November data grab, and that's really exciting, because if we had intervened using that data we would have hit all the students who actually went on to fail. On the other side, unfortunately, we've got some very, very poor examples. Theatre and Drama: it considered everybody to be at risk. Aeronautical Engineering: nobody was at risk until the very end, when some people were at risk, but by the time they sat their exams we already knew they were at risk. The foundation degree in community was even worse.

So what's going on here? Well, what we think is going on is that the computing courses are the only ones with blanket coverage of attendance monitoring. We've got a real problem in our institution with data gaps, and attendance monitoring is a big problem for us, something we need to do a lot more work on. For those courses, and the way they teach them, attendance is critical, and they put great store by it. So, surprise surprise, the takeaway is: where the data is good, the predictor works; where you've got no data, it doesn't really work very well. There are some graphs which show this: you can see that the ones in red are the best performers, and this is a particularly interesting one, where clearly, in a good-performing course, it outperforms the average by miles.

So, the ROC curve; is anyone familiar with ROC curves? A receiver operating characteristic curve plots the true positive rate against the false positive rate. What you don't want is this graph: it is showing almost a straight diagonal line, which is barely better than guesswork, and that's in November. So in November, if you just got a monkey to chuck darts, you'd probably do about as well as the predictor was doing. However, in June it looks like that, and that's a good ROC curve.

Okay, preliminary conclusions from this, in one minute, perfect. We do happily conclude that Jisc is using robust methods in its predictive calculations; there's nothing technically wrong with what they're doing, and it's been done well. Overall, the accuracy for our sample courses is approximately 60 percent. Precision, which captures, when it predicted a student was at risk, how often that prediction was accurate, was only 25 percent, and didn't get up to 35 percent until the end of the year. Recall, which is how many of the students who actually were at risk it captured, started around 25 percent and didn't get much beyond 50 percent. So for the majority of courses the predictions are not useful until too late in the year. However, where the data is richer, in certain courses, it performed much, much better. Given the overall performance, we were absolutely right in holding it back, putting it in a black box, not looking at it, and testing it first; we did the right thing.
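For readers who want the skipped precision-versus-recall detour, here is a minimal sketch of the three metrics quoted above, using scikit-learn and invented labels rather than the USW figures:

```python
from sklearn.metrics import precision_score, recall_score, roc_auc_score

# Invented example: y_true marks students who actually failed to progress,
# y_pred is the predictor's binary at-risk flag, y_score its risk score.
y_true  = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred  = [1, 0, 0, 1, 1, 1, 0, 0, 0, 0]
y_score = [0.9, 0.4, 0.3, 0.8, 0.7, 0.6, 0.2, 0.1, 0.3, 0.2]

# Precision: of the students flagged, how many were really at risk.
print(precision_score(y_true, y_pred))  # 0.5 here
# Recall: of the students really at risk, how many were flagged.
print(recall_score(y_true, y_pred))     # 0.5 here
# ROC AUC: 0.5 is the dart-throwing monkey; 1.0 is a perfect ranking.
print(roc_auc_score(y_true, y_score))
```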
Our next step, now that we've got a method for examining individual course performance, is to continue to investigate what data is contributing to the highest-performing courses, and why, and what we can do about it. We're going to work with the suppliers to understand this research and confirm the findings, particularly around our two different definitions of risk: is our definition of risk different from the bottom-15-percent one? It turns out it wouldn't have made much difference, so we think the conclusions are broadly correct. We'll continue to collect data for our sample courses, and for those courses where it is actually performing quite well, we're going to carry out a pilot where we actually use this data in anger with our course teams and say: take action on this data, because we can now be comfortable that in these courses, and for these reasons, it works. Phew. Director's cut later. Thank you.

Thank you, Martin, very much indeed, and very well timed. We do have about three minutes or so for questions, if there are any in the audience. Yes please, second row from the front there.

Thank you very much, Matt Often from the Adam Smith Business School. I just wonder if you could talk a little bit more about the black spot. Is this a kind of machine learning problem where we pick up a lot of stereotypes due to bad data in the first place?

I think it's the model; that's what we were certainly seeing within the Tribal model. The Tribal model was looking at outcomes and at the factors that lead to certain outcomes, and it was putting a lot of emphasis on background characteristics, the kinds of things that students can't change: where they're from, their qualifications on entry, the way that they got here. So it doesn't tell the right story. What we liked about the Unicon and Marist model was that that stuff was there at the start, but it was massively outweighed by behaviour: we don't care where you've come from, it's what you're doing here on campus, the grades you're getting, the attendance, that type of thing. So that's why we were more ethically drawn to that model.

Thank you, Martin. We have a number of questions on Vevox; I don't know if you want to select any one of those in particular to respond to.

The first one, right away: yes, I think there's a huge data literacy question about staff looking at and understanding a dashboard. It's something that we're working through; we've only been doing this for a year with our personal academic coaches, and it's something that we're constantly working to improve with our staff training. So I think data literacy, about graphs and how to read meaning from them, is very important. I haven't got time to read that one, sorry. No, the prediction algorithm is not created by course staff; it's created by Jisc and the supplier. And students weren't seeing any of this data, they weren't aware of it, and no action was being taken on that data at all; it was just sitting in the black box as an experiment.

Lots of questions coming in now. I think we've literally got one minute, so the longer question, I think, was pointing towards courses with lots of coursework checkpoints and opportunities for writing drafts: do they give more possibilities to identify students at risk? Okay, so it's all about the data. The courses that are performing better seem to be where we've got coursework which is being graded early and put into the system. Quite often our courses might not put that grade information into the data system at all; they'll be using attendance data, they'll be using the VLE.
We've got a problem in our institution with lack of engagement with the VLE, as does everyone else, and our VLE usage rates are very low; but in courses where there is good use of it, you can actually use it as a metric for engagement, it does seem to be meaningful, and there is a correlation with success. So I think that's a good question, but a lot of it is about that activity turning into actionable data that we can put into the system.

Great, thank you very much. Colleagues, apologies, because there are still a number of questions coming in here, so it's provoked a lot of thought, which is fantastic; however, it's now time to provoke a lot of lunch. So if we could just thank Martin and our other speakers again. We will break now for lunch, which you'll find being served back downstairs. Thank you, and enjoy the rest of the day.