I'm Peter Stone and this is Geffin Edwards. I'm an e-learning administrator and Geffin is a retention and data officer. We're here today to share with you our learning analytics journey. As already mentioned, we work for the University College of Estate Management, and our core purpose is to be the leading provider of supported online learning for the built environment. UCEM was founded in 1919, so we already have a wealth of experience in providing high-quality learning opportunities for the industry. Geffin and I haven't worked there for that long, but we have a few people who have been there quite a while. We offer a range of programmes, from level 3 apprenticeships up to level 7 MScs and MBAs, to enable students to succeed at all stages of their career. At any one time, we have over 3,500 students from approximately 100 countries worldwide, with an average age of typically 32. Moodle has been used as our VLE since February 2009. All of our modules, which are delivered as Moodle courses, adopt an agreed model that is applied consistently across all programmes.

Over recent years, an institutional initiative has taken a strategic, data-driven approach to student retention and success. We've called this No Student Left Behind. As all of our students are online learners, we have adopted predictive analytics for our Moodle environment, as this allows us to utilise the wealth of data available to us. Learning analytics was implemented on our Moodle site in September 2016. In preparation for this, we worked with a third-party supplier, specifically their data scientist, to produce statistical models that best fit our data. To achieve this, historical VLE logs were used alongside the student grade data for past modules. We went back about three years with this, because that's when the consistent module model was first applied. We continue to work with the third-party supplier to refine the models and to ensure that the analytics provided supports the UCEM tuition model: our tutors look after groups of 45 on our modules and are best placed to make the appropriate interventions to help our students succeed. We have already used some of the data provided at an institutional level to inform decision-making, which we will cover in more detail later.

The learning analytics provides approximately 35 reports. Before I show you some of these, I'd just like to say that we are very much at the early stages of our journey and expect our use of learning analytics to evolve over time. So our first one is the learning analytics dashboard. You can probably guess what product it is. This is available to staff on our site in the top section of each module, or course, that it has been configured for. There are four report areas: risk status, activity, grade book and discussions. I'll be showing you some examples of reports from three of these areas; we will detail the information that is provided and highlight where we have used this information to make strategic decisions or where we plan to investigate in more detail. I won't cover the grade book reports, as we don't currently have graded items on all of our modules. This is an enhancement that is planned for the September 2017 semester, but if you could not share that with my boss, it would be most appreciated at this stage. The dashboard also displays a list of recommended actions.

This slide shows activity by course date for a module with a small number of students, about 30. As you may expect for a small sample set, the activity is fairly sporadic.
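For illustration, an activity-by-course-date view like this can be reproduced from a raw Moodle log export in a few lines of pandas. This is a minimal sketch only, assuming a CSV dump with `courseid` and `timecreated` columns (fields from Moodle's standard log table, `mdl_logstore_standard_log`); it is not the dashboard's actual implementation, and the file name is made up for the example.

```python
# Minimal sketch: count daily events for one course from a Moodle log export.
# Column names follow Moodle's standard log table; any event-level export works.
import pandas as pd

def daily_activity(log_csv: str, course_id: int) -> pd.Series:
    events = pd.read_csv(log_csv)
    events = events[events["courseid"] == course_id]
    # Moodle stores event timestamps as Unix epoch seconds.
    days = pd.to_datetime(events["timecreated"], unit="s").dt.date
    return days.value_counts().sort_index()

# counts = daily_activity("logstore_export.csv", course_id=101)
# counts.plot()  # start-of-course burst, then a peak before the final exam
```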
However, the overall trend is for an initial burst of activity at the start of the course, which drops off before there is a peak towards the end. Any guesses for what that peak might precede? Yep, that's spot on: the final exam. There are no prizes, I'm afraid. And that's consistent across all of our modules. It's more pronounced on the next slide, which shows a module with approximately 750 students. There are three peaks here, really: initial activity, assignment one, and assignment two towards the end. And we see this across all of our modules. Another trend we generally see is a drop-off in activity after that assignment peak. This correlates with students transferring their efforts to an assessment on another module. Currently our students study their modules concurrently in a semester, and we've used this information, alongside other pieces of information, to redefine our semester structure. From September 2019, we will be delivering modules sequentially rather than concurrently within a semester.

Here's a table that covers activity metrics. It shows whether our students logged in during the previous week, their last activity, total number of discussion posts, time spent in course, and the regularity of their visits. These can all be ordered, and the activity report for each student can be accessed. A key piece of information that we've determined from this report is that a relatively high proportion of our students, typically 50%, never post to a discussion forum. This has initiated a review of our module structure with the aim of determining what we need to do to create higher levels of engagement. We've also used the information here to investigate whether a high amount of time spent in course correlates with grades. Initial findings on a module with 750 students suggest that there is a fairly strong correlation between the two, although, as you may expect, there are some anomalies, and we are working through these to determine the reasons.

To complete the activity section, here is the activity by time of day. Highest activity takes place where the blue line is closest to the circumference. About 65% of our students are based in the UK; it's higher on this particular example, probably about 80%. So it's no surprise that activity is highest between roughly 8am and 8pm. Although the data presented here is blurred by the end of a module, it's a useful tool early in the running of a module to determine the best time to hold synchronous activities.

I'd like to now move on to the discussion reports. The participation metrics table displays the total number of discussion posts made, the number made in the last week, the amount of original thought in the posts, and how much critical thought is displayed. As I previously mentioned, the percentage of students that make a forum post on a module can be as low as 50%. With the aim of increasing forum engagement, we are currently running a pilot project where badges are awarded to students whose posts display original contribution and/or critical thought. This chart displays the interaction of students and tutors on a module and how their forum posts stimulate critical thought in others. The thicker the line, the greater the stimulation of critical thought. The tutor is towards the bottom right corner, but you could argue that, as this module is at academic level 7, you might expect more critical thought stimulation from the students on the module.
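As a concrete illustration of the time-in-course versus grade check mentioned above, the sketch below computes the correlation and flags anomalies for manual review. The file and column names (`module_summary.csv`, `minutes_in_course`, `final_grade`) are assumptions for the example, not fields from the actual reports.

```python
# Sketch: correlate time spent in course with final grade, then surface
# anomalous students (high time / low grade, or the reverse) for review.
import pandas as pd
from scipy.stats import pearsonr

students = pd.read_csv("module_summary.csv")  # one row per student
r, p = pearsonr(students["minutes_in_course"], students["final_grade"])
print(f"Pearson r = {r:.2f} (p = {p:.3g}) across {len(students)} students")

# Flag students whose standardised time and grade disagree strongly.
cols = ["minutes_in_course", "final_grade"]
z = (students[cols] - students[cols].mean()) / students[cols].std()
anomalies = students[(z["minutes_in_course"] - z["final_grade"]).abs() > 2]
print(anomalies)
```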
Our challenge is not only to increase participation, but to drive student-to-student interaction so that the forum becomes a thought-provoking environment for our students.

The final report area that I'm going to show is risk status. Realistically, if we only had access to one report, this would be the one that we'd be interested in. It essentially uses all of the other information and pulls it together into an overall risk score for each student. At an institutional level, we're in the process of determining what the trigger points for interventions should be and who owns them; Geffin will cover these challenges in more detail. Although this report does detail the level of risk, I'd just like to show the following chart, which shows how that risk evolves over time: green is low risk, red is high risk. So you can easily see a student trending towards high risk and intervene as appropriate. There isn't time to show any more reports today, as we want to convey the impact that learning analytics has had to date within the institution. I'll now hand over to Geffin, who will cover this.

Right, thank you Peter. So, having shown you some of the reports available to us, I'm now going to focus on how we've used the data, the challenges that we've faced, and the next steps. Unfortunately for you, this means a shift from colourful visuals to a series of bullet points, the first of which are the impact and benefits to date. Even at this very early stage in our learning analytics journey, the data has already informed our learning and teaching strategy. As explained by Peter, our students study modules concurrently, with most students studying two modules per semester. We're now working towards a new learning model where students study one module at a time, and one of the key drivers for this was the learning analytics data showing the impact of assessment dates in one module on student activity levels on another. That is of concern to us, given that our modules are structured as a series of weekly learning activities. In other words, we now have the knowledge to make informed, data-driven decisions, as we understand more about our students and more about the way that they learn. The fact that assessment drives student activity isn't exactly unexpected, but we now have the data to support appropriate changes and interventions.

The data has also been used to inform module development and our assessment strategy. We've looked closely at the timing of assessments and are also considering the option of rewarding students for engagement on the VLE, both of these to try to encourage a more consistent level of engagement throughout the module. The assessment strategy is now a key part of No Student Left Behind. Using the analytics reports, we've been able to demonstrate a positive correlation between student engagement and success. We also know that engagement with the first assessment is a key indicator of student retention on a module. We've therefore introduced initiatives aimed directly at increasing student engagement, and have also introduced module KPIs of 90% engagement with the first assessment and 80% engagement with the final assessment. Arguably the most important initiative here is increasing levels of tutor VLE engagement. We've introduced several tutor KPIs to drive this, one of which is to respond to all student VLE queries within two working days.
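A check like that two-working-day KPI can be expressed in a few lines, as in the rough sketch below. It uses plain Monday-to-Friday working days via NumPy's business-day functions; a real report would also need an institutional holiday calendar, and the function name here is illustrative, not taken from the data warehouse report described next.

```python
# Sketch: did a tutor's reply land within two working days of the query?
# np.busday_count treats Mon-Fri as working days; holidays are omitted here.
import numpy as np

def within_two_working_days(posted: str, replied: str) -> bool:
    """Dates as ISO strings, e.g. '2017-03-10'."""
    return np.busday_count(posted, replied) <= 2

print(within_two_working_days("2017-03-10", "2017-03-14"))  # Fri -> Tue: True
print(within_two_working_days("2017-03-10", "2017-03-15"))  # Fri -> Wed: False
```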
To support this initiative, we developed a report in our data warehouse that allows us to monitor tutor activity on the VLE and to measure whether the tutor is meeting those KPIs. In all of this, learning analytics data is very much driving No Student Left Behind. It's provided a real focus to help drive our retention and success strategy. Key to this is the level of excitement that has been created within the institution. It's really helped to engage staff, as we suddenly have this vast amount of new data that has the potential to really help us understand how our students study and learn, and to help shape our learning and teaching appropriately. The key word, though, is potential. We are still learning and identifying how to make the most of this new data.

We are still very much at the beginning of our journey, and we have several challenges going forward. The first of these is the need for new institutional policies; for example, we are currently putting in place a learning analytics policy. The second is the potential for data overload. There is so much data available to us now, via learning analytics, that it is difficult to know where to start and where exactly to focus. I think there is a danger of getting lost in the data, where the data starts to dictate the focus rather than being used to inform the key questions. We are also facing a further challenge of joining up multiple sources of student data, whether that is directly from our SITS student database, from our new data warehouse, which pulls together VLE data and data from SITS, or now, of course, from learning analytics.

We also need to start thinking about using the data to inform appropriate student interventions. So far, we haven't actually used the data in this way, even though this was, and still is, the driver for our use of learning analytics. There are several questions to explore and answer before we can actually move forward with that. The first relates to who: who should be responsible for monitoring and potentially contacting individual students? Should it be the module leader, the module tutor, or maybe support staff? The second relates to what: what interventions or actions are appropriate? The third relates to how: for example, should a student be contacted by phone, by email, via a message on the VLE, or a combination of these? Alternatively, is a more generic message on one of the module forums more appropriate at certain points or in certain circumstances? The final question is when: what points during a student's time on the course, or during a semester, are critical when it comes to student engagement? When should monitoring perhaps be intensified, and when is the most appropriate time to contact those students identified as potentially at risk? All of these questions need continuous review as we undertake and evaluate the interventions we put in place. It is therefore imperative that we effectively record the interventions so that each one can be evaluated. This is extremely important when it comes to informing future interventions.

That brings us to our final slide, our next steps. We've learnt a lot over the last six months or so. Now we're going to put into practice the interventions that we deem appropriate: we need to identify the who, what, how and when, and then put this into practice. We're eager to explore ways in which we can link student demographic, progression and performance data into the risk metrics.
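The who, what, how and when above map naturally onto a structured intervention record, kept so that each intervention can later be evaluated. The sketch below is purely illustrative; every field name, and the sample values, are assumptions rather than an actual UCEM schema.

```python
# Purely illustrative intervention record covering who, what, how and when,
# stored so each intervention can later be evaluated against its outcome.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Intervention:
    student_id: str
    module_code: str
    trigger: str                   # why: e.g. "risk trending red for 3 weeks"
    owner: str                     # who: module leader, tutor or support staff
    action: str                    # what/how: phone, email, VLE message, ...
    actioned_on: date              # when
    outcome: Optional[str] = None  # filled in later during evaluation

interventions: list[Intervention] = []
interventions.append(Intervention(
    student_id="S12345", module_code="P07301",
    trigger="no engagement with first assessment", owner="module tutor",
    action="email", actioned_on=date(2017, 3, 20)))
```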
For example, we're interested to know more about how our Hong Kong-based students learn, how this compares with those based in the UK, and how to adapt our interventions accordingly. We're also interested in introducing a student dashboard so that students can monitor their own progress. I do think, however, that we need to fully understand and explore the data before moving forward with this. We would also be very interested in seeing tutor activity and engagement via the learning analytics. As I referred to earlier, central to our retention strategy is the idea that tutor VLE engagement is a key driver for student engagement, and adding tutor activity into the learning analytics would let us measure the extent to which this is true. Finally, we're keen to explore opportunities for publications, and this is another reason why it is extremely important that we effectively record and evaluate the interventions we put in place.

Just to sum up: having introduced learning analytics in September 2016, we are still at a very early stage in our journey, with many opportunities but also challenges ahead, so much so that Peter is moving into a new role to allow more time to focus on this project. Thank you for listening to our journey and we will be happy to receive any questions.

Thank you, Peter. Do we have any questions? No? Nobody got any questions? That light blinds me; I can't see the people in the room. Can I have one in the middle there? There's always one.

Thanks, I enjoyed your presentation. One of the challenges you mentioned was evaluating the success of interventions. How do you actually evaluate that success?

Good question, and it goes to the heart of the challenge. I think the first step we need to undertake is to identify what we do first. I'm also somewhat concerned that the learning analytics is now available to all tutors on all modules. What we don't want is tutors just going ahead, interpreting the data in their own way and deciding on their own forms of interventions. There's a danger there that there isn't a coordinated approach, and also that we miss the opportunity to record what they're doing and the potential outcomes of that. I guess we do carefully monitor module retention rates. At the end of a semester, well, throughout the semester, we monitor the number of students that we refer to as still active. That means they're still registered to study; it doesn't mean they're necessarily engaged, but they're still registered to study on the module. We have historical data to show what the retention rates on modules are, so one step is to identify whether the retention rates have changed as a result of the interventions in place. Also, we hope to be able to use the actual learning analytics reports, as Peter showed. I'll go back to the slide. Hopefully, we can identify students that are potentially at risk early on, identify what interventions were put in place, and then track to see what impact those had. If we record it well, we'll be able to look at that across a module and across all modules. That's the plan; we'll see how it goes. Thank you.

Any more questions for Geffin? No? Okay. Thank you very much.