Our next speaker we have brought all the way from Melbourne in Australia: Professor Dragan Gašević, now a professor of learning analytics at Monash University. He is widely considered the world's foremost authority in learning analytics, with which his name is synonymous. He is the founding program co-chair of the International Learning Analytics and Knowledge (LAK) Conference and the founding editor of the Journal of Learning Analytics. So he comes with huge experience, and I'm really looking forward to hearing what he has to say.

Alright, thanks very much for the introduction and the kind invitation. It's my pleasure to be here, and I'm also very pleased to see that so many people in higher education are so interested in learning analytics; my trip was already worth it. Thanks very much for having me today.

I'm going to talk about learning analytics today, and my key message is really contained in the title of the talk: learning analytics in higher education is about embracing complexity, rather than thinking of learning analytics as a silver bullet or a quick fix that will solve our problems. I think that is essential.

Before I start developing my argument and sharing a fair bit of experience with learning analytics adoption in higher education, I'm going to quickly cover some recent trends. In the early days of learning analytics, many higher education institutions were attracted to it because they saw a clear value proposition, both monetary and in terms of student support. In many cases institutions were primarily attracted by what I call deficit models: identifying students at risk and trying to improve student retention. Lots of important work was done at Purdue University in the mid-2000s, roughly 2007 to 2011. They developed and piloted a large system called Course Signals, and they showed considerable benefits in terms of student success and student retention.

However, over time we started developing other types of learning analytics, focusing more on optimizing learning and on predicting and describing different types of learning outcomes and learning processes. We also started talking much more about 21st-century skills, about what makes our learners successful as both workers and citizens in the 21st century, and about the dispositions our learners need to have. With the use of learning analytics we can now also understand much better what learning strategies our learners actually adopt when they study in blended or fully online environments.

And, in what I would consider one of the biggest successes of learning analytics, we are able to provide personalized feedback at scale, where each and every one of our learners receives a personalized message while our teaching staff see little increase in their workload. Students are much happier with that, and moreover they are getting much better grades in many of the contexts we are seeing. And of course we are also seeing much more progress on frameworks for the ethical use of learning analytics with privacy protection.
All these things are great and many institutions are quite excited, but this is the slide I've been using for the last five or six years and it hasn't changed much. Most institutions are still in the early stages. They would tell you: well, we are still in the early days of learning analytics adoption; we did this small pilot and we are still thinking about how to roll out learning analytics on a bigger scale.

So what are those critical adoption challenges? We studied them in different contexts. Initially, in 2014-15, with a colleague from Australia, Shane Dawson, and a big team of other Australian colleagues, I had the pleasure of working on a project funded by the Australian Government's Office for Learning and Teaching, which benchmarked the entire higher education sector in terms of learning analytics adoption. We also just wrapped up a European project in September this year, which ran between 2016 and 2018. It was called SHEILA and was part of Erasmus+. In it we tried both to benchmark the current state of learning analytics adoption and implementation in Europe and to develop a framework that helps higher education institutions develop their policies and strategies for adoption. I will spend a fair amount of time on that framework, but before that let's talk about some of the critical challenges that many institutions experience in the adoption of learning analytics.

The first is really the shortage of leadership for successful adoption of learning analytics. Many senior managers in higher education are quite eager to implement learning analytics, yet in many cases they don't have sufficient background or a clear understanding of the benefits that learning analytics and data science can bring to understanding student success or other institutional priorities. More importantly, we are seeing an even bigger challenge in academic leadership. Many institutions have professional teams that are well equipped to deal with large amounts of data; however, in many cases they are not clear about the academic vision and the academic needs of their teaching staff and students.

That brings me to the next point. Two critical groups, students and teaching staff, are barely ever consulted, at least in the literature, about the implementation of learning analytics. That situation has changed slightly since the systematic literature review we did in 2016, but in many cases institutions would still go ahead and buy a product without consulting these two key stakeholder groups, and their input was not sufficiently sought in the development of institutional strategies and policies.

And of course there is a critical point related to the many policy dilemmas institutions face. What about informed consent? What about GDPR? How are we handling data protection for our students? What about the security of our data? Institutions are not quite clear on these questions; this is fairly uncharted territory for many of them, and they are of course risk-averse, because many things can go wrong. They don't want to ruin the experience of their students; they would rather be on the safe side. So, considering these challenges, what can we do? What can we do better?
I'm going to spend most of my talk today on directions and the things we need to consider when developing learning analytics strategies and policies in higher education. In many cases, what we saw in learning analytics is that institutions would start by looking into fairly technical issues. Those technical issues would be: what kind of data do we have? Do we have data about students' logins to our learning management system, be it Moodle, Blackboard, Canvas, or whatever other platform you may be using? They would also ask: what other socioeconomic, demographic, or academic data do we have about our learners? Can these data be joined? And if these two data sets can be joined, can we feed the results back to the key stakeholders: create a student dashboard, a teaching dashboard, or a senior management dashboard? Those would be the types of conversations in most institutions, lots of things related to data crunching and data management.

But in many cases we see a complete absence of consideration of the different stakeholders, who may have completely different concerns: privacy protection, personalization, adaptivity, data governance, interoperability, and sharing of data with relevant partners. How do we incorporate these different stakeholders? That was precisely the approach we took in the SHEILA project, in which we tried to develop a framework that can help institutions integrate the voices of the relevant stakeholders as they move forward with the implementation of learning analytics strategy and policy.

In that project we started from an existing approach that was developed precisely to help translate scientific evidence into policy. It is called the Rapid Outcome Mapping Approach (ROMA), and RAPID is itself another abbreviation; we could go on indefinitely with abbreviations here. The key point is that we need to start from an understanding of the political context in which a certain policy or strategy needs to be developed. If you are in Ireland, you are clearly governed by different higher education rules than any other European country, or than Australia; you may therefore have a completely different political context. Each institution also has a different mission and serves a slightly different segment of the student population. Even in a country that is not huge in terms of student numbers, some institutions are quite urban and others more rural; some serve more privileged students, while others serve students from lower socioeconomic backgrounds. It is very important to recognize that political context. Another critical thing is to identify early enough who the relevant stakeholders are and to think about them. The third point is to define the purpose of learning analytics, the kind of behavioral change we want to see: is it retention, is it, for example, improved learning design, the overall experience of our students, or, as is your national priority here, student success?
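Coming back to the data-joining conversation above: here is a minimal sketch of what that first technical exercise typically looks like, assuming two hypothetical extracts (lms_logins.csv and student_records.csv) sharing a student_id key. The file names and columns are illustrative, not from any particular product or institution.

```python
# Minimal sketch of the "can these data be joined?" conversation.
# Assumes two hypothetical extracts sharing a student_id key:
#   lms_logins.csv      - one row per LMS login event
#   student_records.csv - demographic/academic record per student
import pandas as pd

logins = pd.read_csv("lms_logins.csv", parse_dates=["timestamp"])
records = pd.read_csv("student_records.csv")

# Aggregate raw login events into engagement features per student.
engagement = (
    logins.groupby("student_id")
    .agg(login_count=("timestamp", "size"),
         last_login=("timestamp", "max"))
    .reset_index()
)

# Join on the shared unique ID; a left join keeps students with no logins.
joined = records.merge(engagement, on="student_id", how="left")
joined["login_count"] = joined["login_count"].fillna(0)

# This joined table is what would typically feed a student, teacher,
# or senior-management dashboard.
print(joined.head())
```

The point of the SHEILA framework is exactly that this join is the easy part; the stakeholder questions around it are not.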
Then, once we have mapped out the political context and the key stakeholders and defined the purposes, what kind of engagement strategy do we need if we want to enact our initial objectives? We also need to bear in mind our institutional capacity to tackle some of these critical issues. And the final thing is that we need a framework with which we can define what success looks like and monitor the success of our initiatives and, more importantly, build organizational learning memory, so that, based on the experience gained from learning analytics and the evidence produced, our institutions can learn from the implementation.

But we didn't just take this framework and use it as-is in our work. Rather, we developed a methodology that engaged a wide range of relevant stakeholders. We engaged more than 3,500 students across six different European countries, including a smaller sample from Ireland. Students were first involved in focus groups to understand their qualitative impressions of and expectations from learning analytics, and also their challenges. We then developed a psychometric instrument that measures students' expectations of learning analytics, both in terms of ethics and privacy and in terms of the particular services that might be expected. The survey is fairly succinct, only 12 items, and it measures both ideal and predicted (realistic) expectations, so we can see what students really want and what they think they would actually get from the institution.

A similar approach was followed for academic staff. We also interviewed more than 70 senior managers across Europe who were in leading roles and responsible for sponsoring learning analytics projects, and we worked with the European University Association to administer a survey that helped us understand the readiness of higher education institutions. We also consulted experts through an exercise called group concept mapping, asking what the high priorities are for learning analytics policy and strategy in any higher education institution. Finally, we engaged many stakeholders through numerous workshops and institutional or national events, and consulted with national agencies such as the Quality Assurance Agency and Jisc in the UK, SURF in the Netherlands, and several relevant organizations in Spain and Estonia.

The outcome of all this was the SHEILA framework, along with four institutional policies that followed this process as we went forward. What I'm going to talk about today is this final product. It's a nice hexagonal image that we developed as the result of three years of work, and its six angles represent the six critical dimensions that a learning analytics policy should consider: the same six steps we discussed for the ROMA framework, except that we are not suggesting these dimensions be implemented in any particular order.
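As a side note on the methodology: for the ideal-versus-predicted design of the 12-item instrument, a gap analysis is the natural way to read the results. Here is a minimal sketch with invented column names and toy data; the real instrument and its scales are described in the SHEILA project materials.

```python
# Sketch of an ideal-vs-predicted expectation gap analysis for a
# 12-item survey like the one described (columns are invented).
import pandas as pd

# Hypothetical responses: one row per student, each item answered twice,
# once on an "ideal" scale and once on a "predicted" (realistic) scale.
responses = pd.DataFrame({
    "item1_ideal": [7, 6, 7], "item1_predicted": [4, 5, 3],
    "item2_ideal": [6, 7, 5], "item2_predicted": [5, 4, 4],
})

items = ["item1", "item2"]  # would run item1..item12 in the real instrument
for item in items:
    # Positive gap = students want more than they expect to receive.
    responses[f"{item}_gap"] = (
        responses[f"{item}_ideal"] - responses[f"{item}_predicted"]
    )

print(responses[[f"{i}_gap" for i in items]].mean())
```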
Clearly, identifying and mapping out the political context is the natural first step, but in many cases that also requires involving the relevant stakeholders as institutions go forward, and similarly thinking about the critical purposes of learning analytics. More importantly, at the core of the SHEILA framework are three elements: the action points we documented, the relevant policy questions that institutions may have, and a number of challenges that other institutions experienced as we engaged with them.

The final result is available for download right away at sheilaproject.eu/sheila-framework. You can consult it; it offers a wide range of experiences, action points, and challenges that other institutions did or did not consider, and it will help other higher education institutions as they go forward with the implementation of learning analytics policy.

I'm now going to dive into these dimensions and try to engage us in a conversation about what we mean by some of these critical points of learning analytics implementation. In terms of mapping the political context, we typically see major drivers for learning analytics. Some of them are internal; for example, many institutions are trying to promote learning and teaching excellence. In other cases they sit between internal and external, because government policies are, for example, pressuring institutions to pay attention to student retention. But there are other external pressures that institutions experience as drivers for learning analytics. In the UK that is the Teaching Excellence Framework, as you can imagine. Or, when I was previously at the University of Edinburgh, the university was trying to understand why it was receiving particular scores from students in the National Student Survey; it was trying to optimize the overall student experience even though it had no problems with student retention.

But some institutions also feel a lot of pressure to jump on board with learning analytics simply out of fear of being left behind. And in many cases they encounter what I would call resellers of learning analytics products, who try to sell them artificial intelligence without much understanding of the issues in higher education or of what the tangible problems are. They are selling uninformed myths: that deep learning or neural networks can do the magic. We don't know what questions you may have, but you will get plenty of answers, and then you go figure out what your questions were. And this is a reality. A couple of years ago, when I was still at the University of Edinburgh, I used to talk about these things as fiction stories, but then we received a call, and our senior vice-principal asked me to talk to a company that was selling us a product for student retention. I responded: well, we are not interested; we don't really have any problems with student retention. I asked them: okay, so what else can you offer with your product? Neural networks. That was it.
And the conversation I had was pretty much this cartoon, so the rest of the story is clear. So be quite aware of these types of approaches. Engage first with the questions and the critical points you want to address with learning analytics before engaging with any particular technological solution, and then look for the technological solutions that are critical for understanding and addressing the priorities of your institution.

In terms of the key stakeholders, these are the critical issues. Senior management is essential if you want learning analytics projects in higher education to succeed. Without the sponsorship of senior managers, it is really hard to bring the relevant professional groups on board: IT, business intelligence, student services, and the other organizational units that matter. However, in some cases even having one senior manager is not quite enough. I was reviewing the learning analytics initiative of one of the leading Australian universities after three or four years, and all I basically saw were plenty of war stories. They had a learning analytics team reporting to the deputy vice-chancellor for learning and teaching, an IT team reporting to the chief operating officer, and a business intelligence team reporting directly to the provost. The business intelligence team owned the data. The learning analytics team was trying to get that data, but the business intelligence team wouldn't let them access it, because the two had completely different cultures around working with data. The learning analytics team had a more data-science approach: give us the data and the data will tell us the magic. The business intelligence team had a very different approach, leading with the methods: what are the questions? And they could never agree on that. The IT team was caught somewhere in between. So it is very important to understand who the relevant senior managers are who can help us bring the relevant professional groups on board, and what kind of governance is needed for learning analytics.

The students' perspective: we cannot emphasize enough how important students find it that their privacy is protected. That is the top concern, consistent across all the different countries. They also want to make sure their data are used ethically. Some students are concerned, for good reason, that if we profile our students and they end up in the wrong profile, that may bias the grades they eventually receive on different types of assessment, so we need clear ethical guidelines here. But on the other hand, it is also important to engage with students to understand what the legitimate interests are, since we are required to establish a legitimate interest when we process data under GDPR. The SHEILA survey instrument also gives us a way to understand some of these legitimate interests. For example, we can see cross-cultural differences across countries: UK students and Dutch students completely agree about the importance of higher education institutions preserving their privacy and using their data ethically.
English students, though, really do want to see their data used for particular services, while Dutch students are much more careful about that; they don't want it. So even if the institution may think it has a legitimate interest in analyzing the data, if students disagree, then informed consent needs to be obtained directly for the particular purposes of learning analytics, rather than using a blanket approach. In other contexts a blanket approach might be completely fine, as long as the relevant stakeholders agreed on it.

A similar perspective is shared by teaching staff. In many cases they are even more concerned about students' privacy and ethics than the students themselves. That is well documented, not only in our research but also in research done by the National Union of Students in the UK. But academic staff are by far the most concerned about their workload. When we sent them the survey, which also had open-ended questions, almost every item received the answer: workload, workload, workload. The impression was that learning analytics would expect all of them to become data scientists, rather than analytics being meant to streamline data collection, analysis, measurement, and reporting, and to help them make better decisions. This brings us to another important point: we need to convey what learning analytics can do for our teaching staff, what the workload implications are, and what success stories they can benefit from if they engage with learning analytics.

The experts were very clear. The first things you need to address are ethics and privacy and the purpose of learning analytics; then you are good to go on the other issues. Those are, I would say, the key enablers for learning analytics. According to the experts, these issues are comparatively easy to address because frameworks already exist, and we have collected a fair bit of experience there. In 2016, Hendrik Drachsler and Wolfgang Greller received the best paper award at the LAK conference for proposing the DELICATE checklist, which helps institutions develop ethical and privacy-protecting use of learning analytics.

The purposes of learning analytics are really specific to every institution. But once you have identified those purposes, and they could be improving the quality of the learning experience, dealing with scale, improving personalized feedback, or efficiency, you really need to think about the implications for your primary stakeholders. For example, if you are running a student retention program in your institution, identification of students at risk is only one small part of what you will have to do. You will then have to develop a program that engages those students and offers them additional counseling or academic opportunities that can help them overcome the issues identified as risk factors.
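To ground that point, here is a minimal sketch of the "small part" that is the at-risk identification itself, assuming hypothetical files and features (past_cohorts.csv, current_cohort.csv, a completed label). It deliberately stops at producing a follow-up list, because the counseling and support program that consumes this list is the real work.

```python
# Sketch: flagging at-risk students is only the start of a retention
# program; the outreach that follows is the part that matters.
import pandas as pd
from sklearn.linear_model import LogisticRegression

history = pd.read_csv("past_cohorts.csv")    # hypothetical file
current = pd.read_csv("current_cohort.csv")  # hypothetical file
features = ["login_count", "assignments_submitted", "forum_posts"]

# Train on past cohorts where completion is known (0/1 label).
model = LogisticRegression().fit(history[features], history["completed"])

# Probability of completion; low values get a human follow-up,
# never an automated "we don't think you'll make it" email.
current["p_complete"] = model.predict_proba(current[features])[:, 1]
needs_outreach = current[current["p_complete"] < 0.5]
print(needs_outreach[["student_id", "p_complete"]])
```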
In other cases, when you suggest analytics, institutions just say: oh yes, we are developing a dashboard. That is almost the default, which is fine in some cases, and I have done my fair share of developing dashboards as well. In many cases they are very helpful for teaching staff and for senior decision-makers inside institutions. But what we have seen so far of student dashboards is that they are mostly harmful if they are not done well. In particular, I am referring to comparing a student against the class average. Many different studies suggest this, but let me mention just one: a proper randomized controlled trial at Stanford University, published earlier this year. They developed a dashboard that drew on decades of historic data about students: grade distributions, self-reported measures of the time spent on particular units while pursuing their degrees, and a few other things. Students were randomized into two groups, one that was strongly encouraged to use the dashboard and another that was simply informed of its existence. The encouraged group saw a negative effect on their GPA from engaging with this dashboard. So we need to be very careful here. Dashboards are not the only way to implement learning analytics effectively, and that is very important to consider.

In terms of the engagement strategy, one of the key things is that a learning analytics strategy and policy has to be well aligned with the institution's other strategies and policies. Most institutions have a learning and teaching strategy; many also have, or are developing, a digital learning strategy; and they have a number of other relevant strategies and policies around IT, data protection, data security, and data management. These need to be considered very carefully, and when you create the governance teams or steering groups responsible for learning analytics, you need to bring those functions on board.

I already mentioned the problem of the three different reporting lines at the Australian university, but let me mention a similar issue we experienced in Canada, where I spent ten years, at one of the Canadian universities. The vice-president academic had an idea to promote educational innovation projects, and one of our projects, related to learning analytics, was heavily supported by the vice-president academic's office. Everything worked well until the point where we wanted to deploy the technology onto the operational LMS of the institution. At that point the IT team told us: that is against our policy. They were effectively telling the chief academic officer: who cares, this is our policy and you can't do much here. It then took us more than six or seven months to get to the point where we could run even a small pilot against, or rather together with, the IT team. This is the real world and needs to be considered very carefully as we implement learning analytics.
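Returning for a moment to the Stanford dashboard trial described above: in an encouragement design like that, the standard analysis is an intent-to-treat comparison by assigned group, regardless of whether students actually opened the dashboard. A minimal sketch with an invented data file:

```python
# Sketch of an intent-to-treat comparison for an encouragement-design
# RCT like the one described (file and column names are invented).
import pandas as pd
from scipy import stats

trial = pd.read_csv("dashboard_trial.csv")  # hypothetical file

# Compare outcomes by *assigned* group (encouraged vs. informed-only).
encouraged = trial.loc[trial["group"] == "encouraged", "gpa"]
informed = trial.loc[trial["group"] == "informed", "gpa"]

t_stat, p_value = stats.ttest_ind(encouraged, informed)
print(f"GPA difference: {encouraged.mean() - informed.mean():.3f} "
      f"(t={t_stat:.2f}, p={p_value:.4f})")
```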
The other thing is who is responsible and how some of these interventions are triggered. Some may be triggered automatically, but then an academic staff member needs to take the lead. For example, we are developing a technology that provides personalized feedback to students. This personalized feedback really requires input from academic staff, but the approach works in such a way that a staff member spends roughly the amount of time they would spend supporting three, four, maybe five students, depending on the complexity of the tasks, to provide personalized feedback at scale for hundreds or in some cases thousands of students. However, you need to account for the effort of developing the feedback rules, and you need to bring on board the relevant expertise to assist staff members who may not have sufficient technical background to understand the data or to create the templates in particular cases. On the other hand, the payback is quite significant, and going forward you don't have to repeat the exercise as long as students follow a similar learning and course design.

This is another critical thing to consider: the policy really needs to go to the core of understanding institutional capacity. In the Australian case I mentioned, they were struggling with data partly because the institution didn't have a data lake; they didn't use APIs to give the relevant professional groups easy access to data. Rather, there was one data guardian protecting the institutional enterprise data warehouse, and that is a critical challenge. In other cases it is important to consider the institution's data storage strategy. Do you really have the data? Collecting huge amounts of data can be expensive, depending on the policy you have; data storage, if you use cloud solutions, is fairly inexpensive, but it is important to understand your capacity. Have you resolved issues related to data integration? Do you have unique IDs that easily connect the relevant data sources you have across your campus? And get the right people involved: at the University of Edinburgh, for example, we had to involve the chief data security officer, who was responsible for signing off on some of the relevant issues related to learning analytics.

Human capacity is really, really important; I can't emphasize it enough. Academic leadership is scarce at this stage, and we need to build much more of that human capacity. I also witnessed a case at one institution that had a really strong professional group responsible for learning analytics, yet that group felt completely disempowered to go and talk to academic staff; their perception was that academic staff could not care less about any of these changes. That was simply because in many cases they didn't have enough academic credibility to talk to the groups who could actually take up the real benefits of the technologies they were producing. And of course we must think about infrastructural capacity.
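Returning to the personalized feedback example above: here is a minimal sketch of the rule-plus-template pattern, with invented rules and data. Production systems are of course far richer, but the economics are visible even here: the staff effort goes into writing a handful of rules once, not into writing hundreds of messages.

```python
# Minimal sketch of rule-based personalized feedback at scale:
# staff author a few condition/template pairs once, and the system
# applies them to every student (rules and data are invented).
import pandas as pd

students = pd.DataFrame({
    "name": ["Ana", "Ben", "Chloe"],
    "logins_last_week": [0, 5, 12],
    "quiz_score": [None, 45, 92],
})

# Each rule: (condition over a student's row, message template).
rules = [
    (lambda s: s.logins_last_week == 0,
     "Hi {name}, we noticed you haven't logged in this week. "
     "Is there anything we can help with?"),
    (lambda s: pd.notna(s.quiz_score) and s.quiz_score < 50,
     "Hi {name}, your quiz score suggests revisiting module 2; "
     "the practice set there may help."),
    (lambda s: pd.notna(s.quiz_score) and s.quiz_score >= 90,
     "Great work on the quiz, {name}! Try the extension tasks next."),
]

# Every matching rule fires for every student.
for student in students.itertuples():
    for condition, template in rules:
        if condition(student):
            print(template.format(name=student.name))
```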
The final thing, in my view probably the biggest in the analysis of institutional capacity, is culture. Do we really trust data? Do we even have a culture of making decisions based on data? In 2012 my good colleagues Shane Dawson and Leah Macfadyen published a case study about a Canadian university that was deciding on its next learning management system: the institution was reviewing its existing LMS and thinking about what to do next. The paper documents how the institution, instead of making a decision informed by data, made its decision in spite of what the data suggested. So this is one of the critical things to consider: what kind of change management do we have, and how can we get all the relevant stakeholders involved in institutional change as we think about the culture of data use in higher education?

The final dimension of the SHEILA framework concerns establishing monitoring frameworks. I'm afraid there are not many such frameworks out there, and that is not a big surprise, I guess, for many of you: the field of learning analytics is still maturing, and we are not seeing many systemic adoptions of learning analytics across the world, so how could we expect to see many frameworks telling us what is important to monitor? But this is something we need to think about, and the good examples suggest we need to think about relevant indicators. Quantitative indicators are in some cases easy: for instance, the number of students retained versus the number before the learning analytics implementation. But there are qualitative indicators as well: student satisfaction with the overall learning experience, qualitative improvements in access to relevant services, and enhancement of the whole journey students have inside the institution. This is very important to consider.

Another critical thing to bear in mind with learning analytics is that in many cases it is really hard to isolate the small effect that learning analytics may be making. Why? Because all of your institutions run so many different projects for, say, student success that have nothing to do with learning analytics: a digital learning strategy, a new LMS, different platforms, lecture capture, all sorts of things, and then analytics on top. So the question is: do you want to think about the whole package, or do you want to isolate one effect? In some cases you don't care so much about a single isolated effect, unless you are really focused on efficiency and on optimizing the use of resources. In that case, there are methods out there that help us estimate the causal effects that can be attributed to the implementation of learning analytics.
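One such method, offered here purely as an illustration and not as the talk's prescription, is a difference-in-differences comparison across cohorts that did and did not get the intervention. A minimal sketch with invented numbers:

```python
# Sketch of a difference-in-differences estimate of an intervention's
# effect on retention (all numbers are invented for illustration).

# Retention rates before/after the learning analytics rollout,
# for courses that adopted it ("treated") and courses that didn't.
treated_before, treated_after = 0.78, 0.84
control_before, control_after = 0.80, 0.81

# DiD subtracts out the institution-wide trend captured by the control
# group, leaving the change attributable to the intervention.
did = (treated_after - treated_before) - (control_after - control_before)
print(f"Estimated effect on retention: {did:+.2%}")  # +5.00%
```

The estimate is only credible under the parallel-trends assumption, which is exactly the kind of caveat a monitoring framework should make explicit.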
A few more things and I'm going to stop talking: a quick epilogue on the practical use of the framework at one higher education institution in Europe. While I was still at the University of Edinburgh, we developed a learning analytics policy, and a whole strategy is now being built on top of it. On the advice of one of the vice-principals, the vice-principal for digital learning, we decided to develop the policy in two steps. The first step was to develop something that could be used as a vehicle to consult everybody on campus and generate discussion, while on the other hand being a really simple two-page document that outlined the key principles and key purposes for the implementation of learning analytics.

Among these principles, the first one we recognized is that lots of data are out there and available, but ultimately they are used to build models of reality, and we know that whenever we model, models reduce reality; thus the data and the models are always incomplete. To make any major decision about our students, we really have to have a human in the loop. You can't simply send an email to some students saying "we don't think you are going to make it"; it's as simple as that. Another critical thing, and quite an obvious one, is that transparency needs to be present. By having these principles and purposes you already become quite transparent, and you write them in very plain English rather than the bureaucratic, technical terminology that is so common in many policies. Another critical thing, of course, is good governance. The University of Edinburgh decided to establish a governance group for learning analytics that vets all the learning analytics projects emerging inside the university, in terms of the types of data they consider, their ethics, and how they align with the purposes strategically identified for the institution; and if a project was not aligned, there was a dialogue about why you are doing something that is not a priority of the university.

Another principle identified is that the institution shall avoid deficit models, meaning we should not only invest in the students who are struggling; we equally need to invest in the students at the top of our ranking scales, who need to be additionally challenged and deserve equal opportunities themselves.

We also recognized that algorithms can perpetuate bias, as the minister mentioned in our quick conversation this morning. A colleague at TU Eindhoven in the Netherlands ran a study to see what would happen if a predictive model for the enrollment of students into an IT program were also used as the decision-maker for enrolling students. First of all, the most predictive variable emerging from that process was the gender of the students. We unfortunately don't have many women in computer science and information technology, so being a woman would count as a risk factor according to the predictive algorithm. What would happen is that if that algorithm made these decisions over the course of several years, you would have fewer and fewer women. That is something we need to acknowledge: algorithms can be fairly dangerous, and that is part of the principles for learning analytics at the University of Edinburgh.

The final principle was a concession made to academic staff and the unions: learning analytics shall not be used for academic staff performance assessment, as simple as that. Of course there was no unanimous consensus around campus. Some were very unhappy because they thought this was not sufficiently strong to protect academic staff; others thought they would not be able to do their job under this principle. But it is a really critical thing to consider.
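On the algorithmic bias principle above: a simple check that can surface the kind of problem seen in the TU Eindhoven study is to compare how often a model flags each demographic group. A minimal sketch with an invented model output:

```python
# Sketch: checking whether a predictive enrollment/risk model treats
# a protected group differently (predictions are invented).
import pandas as pd

# Hypothetical predictions from an enrollment risk model.
preds = pd.DataFrame({
    "gender": ["f", "f", "f", "m", "m", "m", "m", "m"],
    "flagged_at_risk": [1, 1, 0, 0, 1, 0, 0, 0],
})

# Rate at which each group is flagged; a large gap is a red flag that
# the model may encode gender as a "risk factor", and letting such a
# model make decisions would compound the bias year after year.
rates = preds.groupby("gender")["flagged_at_risk"].mean()
print(rates)
print(f"Disparity (f - m): {rates['f'] - rates['m']:+.2f}")
```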
In terms of the purposes of learning analytics, these really are specific to every institution. The University of Edinburgh, for instance, is thinking about the quality of its learning designs and the ways the experience of students can be improved, and also about equity issues: analytics is seen as a way to measure the effects of particular initiatives that try to reduce or completely eliminate bias. The University of Edinburgh is particularly focused on the scalability of its fully online programs, which are a strategic priority, and it was also quite concerned with personalized feedback for students; and of course its National Student Survey results are a consideration there as well, though not the direct driver. It was also interested in students' overall experience, skills, and efficiency. But those are details relevant to one particular institution; as you implement learning analytics in your institution, you need to think about your own priorities, that is to say, your purposes for learning analytics. Some of our war stories about the implementation of learning analytics at the University of Edinburgh are documented by my partner in crime, Anne-Marie Scott, at the University of Edinburgh; her blog is available, and you are welcome to explore it and see what happened.

All right, before I wrap up, a few really quick final remarks. Learning analytics is not a quick fix; we really need to acknowledge and embrace the complexity of what we have in higher education. We also need to recognize the critical role of leadership in the adoption of learning analytics, both senior management and academic leadership; without those two we cannot have effective adoption of learning analytics. But in my view the biggest challenge, and one we all need to work on very hard, is the development of a data-informed decision-making culture. That is not something we are really good at.

So on that note I'm going to stop my talk, with one final plug for the institutions interested in adoption in higher education: we developed a MOOC which shares plenty of these experiences; my talk is really just a small summary. The MOOC will be launched in late November, so you are welcome to check it out. It will be available on edX, and everything will also be available under a Creative Commons CC BY license. Thanks very much for your attention.

Wonderful. It was well worth bringing you over from Australia, absolutely.