[Introduction in Welsh] I sympathise with you that you've come to another talk. Also, the problem is, of course, that with such a great talk it should influence your practice, so you should be thinking about it in all the next talks you go to, which of course makes things doubly difficult for me. So a challenge there. So thank you for inviting me to talk to you all about scaling up learning analytics. There may be those of you in the room who are thinking, hang on, scaling up learning analytics, we haven't got learning analytics. Or if we have got learning analytics, it's not me that's going to scale them up, it's somebody way up there. The aim of this talk is that it should help you if you are the person way up there, and it should also help you if you're the person who hasn't got learning analytics and isn't going to be scaling them up. It helps you to ask the right questions and to engage with it in ways which are helpful, and in ways which I would hope would encourage equality, as Laura was hoping we would do. So very briefly, why am I talking to you about learning analytics? I come from the Open University; we've been collecting data about our students for about 45 years now and using that to support learning and teaching.
And we've been very active in the learning analytics community; we've got a huge learning analytics programme. We've got an enormous amount of data about what our students do in formal settings, undergraduate and postgraduate, and what they do on iTunes U, what they do on YouTube, what they do on OpenLearn and what they do on FutureLearn. I'm also involved in the Learning Analytics Community Exchange, a big European project which is bringing together people who are working on learning analytics and people who are trying to find out about learning analytics. I feel I should have one of those t-shirts on saying, "Find out about LACE now, ask me how". Do ask me about it if you'd like to; we encourage people to get involved. So to start by defining terms: people talk about learning analytics in different ways. This is the Society for Learning Analytics Research definition. I think what's important about it is that it only begins with the data and the analysis of data. What it goes on to do is to say, we're going to use this to improve learning and teaching and where it takes place. If you're just crunching data, if you're just visualising data, you're not really doing learning analytics. You're not doing learning analytics until you're making a difference, until you're making things better, in my opinion. So let's think about how people are doing that. And if you're new to learning analytics, this may be a new list to you. If you're already doing learning analytics, think about what's happening in your institution and what could happen in your institution. Because I think what we see at the top is the things that you would expect. If you give educators some data, if they've got any interest at all in that sort of thing, they'll have a little play around with it. If you give them data, they will monitor their students' progress. Of course we monitor students' progress; we're interested in students' progress. But then they're also doing more complex things.
They're using it to understand what's going on, they're using it to reflect on what's going on and, crucially, they're using it to improve practice and to improve learning and teaching. Now if you look over at what the learners are doing, the learners are concerned with slightly different things. The learners don't really care what happened in your classroom last year or what happened in your university over the last five years. What they care about is what happens to them. How can they succeed? How can they use learning analytics to improve their performance, to improve their enjoyment, to improve their success? So they're doing different things with it. They're monitoring what's happening now; they're taking action based on what's happening now. So I'm going to give you a few examples of scaling up learning analytics. I wouldn't really count this first one as learning analytics, because learning analytics is about making use of the data we leave behind in the process of learning and teaching. So it's when we click on the VLE, it's when we add comments to the forums, the sort of things we're doing as we go along. The SATs scheme, which has been running for many, many years in England and Wales, has data collection points. So it's not quite learning analytics, but what we can see from it is some of the things which might be achieved with learning analytics and some of the things we don't want to achieve with learning analytics. So on the left are some good things. It's aligned with clear aims and, as I'll go through in this talk, a vision is quite an important thing. So the government said we'd like to drive up literacy, we'd like to drive up numeracy, and in the beginning they even wanted to improve our science learning. They've sort of given up on that now. Huge and sustained effort over many, many years.
I'm sure you've all been involved in it in some way, even if you don't work in schools, through your children, through your friends' children, whatever. And there are agreed proxies for learning, which are the SATs tests. We can't see the learning taking place in people's heads. We need some way of judging that it's taken place, and the government uses those SATs tests. There are clear, standardised visualisations: each little rectangle there represents one student in a year. You can see at a glance that the blue ones are on track from the government's perspective, the other ones perhaps not. And it drives our behaviour at every level. Not just the teachers, not just the learners, but the publishers of educational material, the people who produce the training for schools, the people who produce the admin papers, the personal tutors who your children go to just before SATs tests if they're not doing too well, the parents who are stressed out. Everybody is involved. Now, possibly not in a good way with SATs, but you can see that this could be used for good. Thinking about the bad side, we know it brings about stressed, unhappy learners. We also see stressed, unhappy teachers. We also see stressed, unhappy parents. A lot of things can go wrong there. They're analytics which are useful for the government. They're not so useful for the learners, because the learners, once they've done SATs, usually go on to another school or another key stage. They don't learn from those; they just move on. The teachers can't necessarily make use of them. And they focus on the vision. So the vision was about literacy and numeracy. It sets aside humanities, it sets aside music, it sets aside PE, all those other rich things that we do in schools; it sets them aside as less important. And it focuses on individual learning. So things that we know students should be learning about, collaboration, teamwork, it moves our attention away from those.
So there are good things and bad things that are worth thinking about in this huge-scale example. Now, to come closer to learning analytics, the key example that's always cited in learning analytics is Purdue University in the northern USA. They've got a big program called Course Signals. It's based on more than 10 years of data now. They've rolled it out with a lot of courses, and they report that grades have gone up and retention has gone up. It's a very simple system from the point of view of students and educators. Each student, as you go along, gets a traffic light signal. Green means it looks as if you're on track. Amber means it looks as if something may be going wrong. Red means it looks as if you're off track. Now crucially, you don't just get the colour, because if you just give students a red saying things are going wrong, they may just give up. And if you give students a green saying things are going well, they may start coasting. So it gives you information about what you can do with this colour. It says these are the people you could talk to. These are the resources you could look at. These are the actions you could take. And that's what makes it a powerful form of analytics to give to students. Now another example I've got, and I don't know so much about this one, is one I picked up from YouTube recently. On YouTube you can see Georgia State University testifying to the US Senate about their learning analytics program, which again is a huge learning analytics program rolled out across the university. They say it increases grades, they say it increases retention. And crucially, that last point there: elimination of achievement gaps based on race, ethnicity and economic class. Now if that is true, that is incredible. I haven't seen the evidence, but it looks like one to watch. Coming closer to home, what are we doing at the Open University? Well, in a nutshell, that's what we're doing at the Open University.
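To make the Course Signals idea concrete, here is a minimal sketch of how a traffic-light-plus-advice signal might be computed. This is not Purdue's actual model: the input measures, weights, thresholds and advice strings are all hypothetical illustrations of the general approach the talk describes, where the colour always comes with a suggested action.

```python
# Hypothetical sketch of a Course-Signals-style traffic light.
# All measures, weights and thresholds here are illustrative assumptions,
# not Purdue's published model.

def risk_score(logins_per_week, assignments_submitted, assignments_due, current_grade):
    """Combine simple engagement measures into a 0-1 risk score (1 = highest risk)."""
    submission_rate = assignments_submitted / assignments_due if assignments_due else 1.0
    engagement = min(logins_per_week / 5.0, 1.0)  # cap at 5 logins/week
    grade = current_grade / 100.0
    # Weighted average of how far off track each measure is.
    return 0.3 * (1 - engagement) + 0.3 * (1 - submission_rate) + 0.4 * (1 - grade)

def signal(score):
    """Map a risk score to a traffic light plus a suggested action,
    so students get advice, not just a colour."""
    if score < 0.25:
        return "green", "Keep going - you appear to be on track."
    if score < 0.5:
        return "amber", "Review this week's materials and contact your tutor."
    return "red", "Book a meeting with your adviser and look at the study-skills resources."

# A struggling student gets a red light and a concrete next step.
colour, advice = signal(risk_score(logins_per_week=1,
                                   assignments_submitted=1,
                                   assignments_due=3,
                                   current_grade=45))
```

The design point the sketch illustrates is the one made above: the mapping from colour to action is as much a part of the system as the prediction itself.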
We have a strategic analytics program which is led by our Pro-Vice-Chancellor, Belinda Tynan. It's rolled out across the entire university. It's all encapsulated in one slide, which is quite difficult to take in. What I draw your attention to in this context is that there are three colours, and roughly those colours align with educators, researchers and technologists. And they all have to be working together in order to roll out learning analytics at scale. It's no good just working with one group; you need to work across all those groups. So you've probably seen a lot of slides like this coming from management. They're always very good at giving you a slide which encapsulates everything you're going to do over the next few years, and then you go away and try to put it into practice and it doesn't quite work. So how has it worked out at the OU? Well, one of the things we've got is a whole series of tools which produce data and visualisations of data. They're available to all our staff. You can do all sorts of clever things with them. But crucially, it's not just the tools. You may, if you're thinking of introducing learning analytics, be offered tools, and they say, look, you can look at all these wonderful things. But what we've found is we also need staff who go between the educators and the statisticians, who interpret the stats for the educators and then go back to the statisticians and say, but actually the educators need this data, not that data. We need people in the middle. We've got a series of data wranglers who are in the middle. Also, it's no good knowing what's going wrong if you haven't got the resources to intervene. So we've also got student support teams who are working with the students to really make a difference based on this. We've also crunched the data. Here's another example of crunching the data. We've aligned it with our learning design.
We've looked at what are the things in our learning design that really make an effect on students' satisfaction and on whether the students stick with us. And those three top hexagons you see there are workload, navigation and integration. You've got to get workload right. You've got to get navigation and finding your way around right. And you've got to integrate it all well. And if you build those into learning design from the start, then you can make a difference right from the start. So I've talked about a lot of different things that are going on there. You know, there's the educators, the technologists, the data wranglers, all sorts of things. Really, with any innovation in educational technology, you've got to take on the whole complex system which is an educational institution. Now, I'm guessing you can find yourself in just one sector of that. You might define yourself as an educator. You might define yourself as a technical person. You may even be in two of those. In order to roll out an innovation and make it stick, you've got to engage with all those different areas. And that's as important for learning analytics as it is for any other innovation. And again, it looks good as a slide. How do we make this happen in practice? Well, people have thought about this as a framework, the ROMA framework, which stands for Rapid Outcome Mapping Approach. You might as well just call it the ROMA framework; I don't think it makes much sense as a name. It's got six steps, or really seven steps, because in the centre is the crucial one. What is your vision? What are your objectives? What are you trying to achieve? And this is something that applies not just to scaling up learning analytics, but to introducing them at all. Don't just introduce a tool and then think about what you can do with it. Think about what you want to achieve, and then can you find the right tool to do the job? And then you can begin working around this circle.
It hasn't got an end; it keeps going, so you're doing all these things. So let's think about that in a bit more detail, and let's start by thinking about the centre one: your objectives, your vision. What sort of objectives or vision could you have? If you've got learning analytics already, think about what vision you've already got. Have you got a vision? What are you trying to achieve with them? Let's think about some options. Of course, Laura's talk earlier will have given us some other ideas about what we might be trying to achieve. From an institutional perspective, I suppose what I'd call academic analytics, there are lots of visions that we could have. We might be trying to make our students happier and more successful, and if you look at those student perspectives there, they're things that we ask our students about very regularly. We always ask our students if they're satisfied with these things. From an educator's point of view, we're interested in our students coming in with the right skills to be able to succeed. Are they able to succeed, not just in the goals we set for them but in the goals they set for themselves? Is it all fitting together? From a university's perspective, from those vice-chancellors and pro-vice-chancellors, your headteachers, the head of your college, they're interested in: does it make my institution look good, does it keep the books balanced and does it all fit together? So they're all things that you might have as a vision. You might want to achieve all these things. Or you might be slightly more learning focused. So you might look at things which are going on on the ground, which we can alter in practice. So you might be offered a tool which tells you how many people are posting comments and how many comments they've posted. But what you're really interested in is: when I've asked them to collaborate, when I've asked them to engage in conversational learning, did they talk then?
If the students are returning to materials, they might be doing that because they don't understand. But they might be doing it because you've asked them to reflect. So if you've asked them to reflect, are they reflecting? If you've asked them to browse and share resources with their group, are they sharing URLs then? So it's about relating your intentions to what your students are actually doing, and seeing if their behaviour aligns with what you were hoping for. Do you need to change your practice? Do you need to change their practice? What does success look like? So once you have your vision, and it might be any of those visions, it might be a completely different vision, how would you then work through the framework? So I'm going to talk you quickly through how the Open University did it, and then I might take you at even more blinding speed through how the University of Technology, Sydney did it, if we've got time. So at the Open University, as I said, we've got a huge program rolled out by Belinda Tynan. You can see the vision at the top is a very PVC sort of vision. It's driven by university objectives. And if you have the vision, you then need to think about what the vision is going to look like in practice. What change will I see if that vision has been enabled? What different types of change will I see? I'm going to see changes in how people talk and communicate. I'm going to see changes in how we do things and what happens as a result. Now in order to do that, you've got to think about what's going on in the institution in the first place. Over on the right there, you can see another visualisation of the slide I showed you earlier; it's done in slightly different colours. Over on the left, you can see some of the things which are going on in the background which you need to engage with. You need to engage with the dashboard. You need to engage with the management of the university. You need to engage with the educators.
You need to engage with student support. You need to take action. And then when you've done that, well, who are the really key people? Who do you need to be talking to? What teams do you need to be setting in action? You can see we're a big institution; we've got a lot of teams working there, different people working on different things. What I draw attention to is the ethics framework. Ethical issues, privacy issues, data protection issues: it's really important to think about those from the start. Not just think, hey, we've got all this data, but think: if the students knew what we were doing with this data, would they be happy? If the teachers knew what we were doing with this data, would they be happy? Now, the Open University's got a detailed ethical framework. You can Google that and you can look at that; the Horizon report this year links to it. But it's really worth thinking about that from the beginning and not trying to brush it aside. And we've got key stakeholders: administrators, students and educators. We go back to the vision. We think about what behaviour changes we would expect to see if that vision were applied, and then how we can check that. Take the one about students achieving their study goals. Well, we don't know if students achieve their study goals unless we know what students' study goals are. So we may have to go and ask them what their study goals are; we might not have done that in the first place. So align those things with each other. We're a huge institution; we're doing this in lots of different ways. There are three broad sets of data we're using. We're using data looking back at what happened in the past to say, how could we do that better? We're looking at data about what's happening now, to see how we can change what is happening at the moment. And predictive data about what might happen in the future: what might go wrong in the future, and how can we put it right by action at this stage?
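The three broad uses of data just described can be sketched in miniature. This is a hypothetical illustration, not the OU's actual pipeline: the activity log, function names and risk threshold are all assumptions, with a crude trend rule standing in for a trained predictive model.

```python
# Sketch of the three broad uses of learner data described above:
# looking back, looking at now, and looking forward.
# The data, names and threshold are illustrative assumptions.

activity = {
    "student_a": [12, 10, 8, 7],  # weekly VLE clicks recorded so far
    "student_b": [5, 3, 1, 0],
}

def descriptive(log):
    """Looking back: average activity over the weeks recorded."""
    return {s: sum(weeks) / len(weeks) for s, weeks in log.items()}

def current(log):
    """What's happening now: activity in the most recent week."""
    return {s: weeks[-1] for s, weeks in log.items()}

def predictive(log, at_risk_below=2):
    """Looking forward: flag students whose latest activity is low and
    falling - a crude stand-in for a trained predictive model."""
    flags = {}
    for s, weeks in log.items():
        trend = weeks[-1] - weeks[0]
        flags[s] = weeks[-1] < at_risk_below and trend < 0
    return flags
```

The point of separating the three is the one made in the talk: the same raw data supports improving past course designs, adjusting what is happening right now, and intervening before something goes wrong.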
So, really, this is the deployment section. Your institution might just be focused on one of those; they might choose not to go too broad. We've got a plan. Can we effect change? Can we do it? Have we got the capability to do it? It's easy to be overambitious. We needed to do recruitment. We needed people who knew more about learning analytics. We needed more people who knew about stats. We needed to do more internal training. Crucially, we needed to take on people who could interact with the students and really make the interventions that would count, because some of that could be done by the tutors, but we didn't want to overload the tutors. We wanted more people talking to the students and helping to make the changes. And then, of course, we had to monitor what was happening, check that we actually were moving towards the vision, and then make changes based on that. I've got three minutes, which means I can whizz through another example, a completely different example. The University of Technology, Sydney is a bricks-and-mortar university on the other side of the world. They've also implemented learning analytics. They've also done a more top-down one, led by the Deputy Vice-Chancellor. And you can see their vision is slightly different to our vision. Their vision is much more focused on the learners and the teachers. They're moving towards being a data-intensive university because, as a university of technology, they want people who are enabled to go out into society and work with data and make sense of data. It's a top-down project, but it's also really brought in people right across the institution from the beginning. It's recognised that everybody needs to be involved. It had pilot projects. It began to set up centres around this, and it kicked off with a day school. You can see it wasn't just the typical people you might think of. For example, they brought in library staff right from the beginning and said, how can you be involved in this?
What would you suggest? So it's thinking right across the environment of the university: who needs to be involved? Who are the key stakeholders? What behaviour changes do we need? Which ones are we expecting? Which ones are we going to report on? I'm zooming through these. It's all up on SlideShare, so you don't have to note it all down at the moment if you don't want to. Engagement strategy: all the things that they were doing, a lot of investment. They set up new centres. They brought in new people. They developed a new course, and crucially they didn't just develop the course for students; they also developed it for staff. And then they monitor it. They carry on monitoring it over time. So to wrap it up: if you want to follow up on any more of this, this is my most recent presentation on SlideShare, so you can check it out there. If you'd like to join us on the Learning Analytics Community Exchange, you can go to our website and join us as an associate partner. You can join our LinkedIn group. And if you'd like to get more involved, we've got a free event at the Open University on the 9th of October. You can sign up for that and come along. That's about putting research into practice, so we're bringing together researchers and practitioners. I look forward to seeing you all there. Thanks. Thanks, Rebecca. Yes, we have a question here. If you could just wait for a microphone, please, and introduce yourselves. Bob Banks from Tribal Group. I think your SATs example illustrated how this all depends very much on what you measure. Depending on what you measure, you can have really negative results, as a lot of people think resulted from SATs in UK schools, as well as positive results. So this question of what you measure: it's easiest to measure really simplistic things, simple numbers, but that often drives a negative conception of what education's about.
So I'm wondering: is there research, are there things you can point to, around measurement and measuring more valuable things, real things, in terms of real learning outcomes? Okay, I think, yeah, I think SATs is a very good example of some of the things that can go very wrong with this procedure if you start looking in a simplistic way. I think it's very important that you align what you are measuring with what your vision is, with what you are trying to achieve. And that often means you don't just take the off-the-shelf tool which can do a lot of generic things; you think about how it can be customised for you. One of the things that we've got through the LACE project is something called an evidence hub, where we're trying to bring together evidence from around the world, either for or against some proposition. So one of our propositions is that learning analytics can support learning; another is that learning analytics can support teaching. We've got evidence from around the world, either for that or against that. So you can go and look at what the most recent evidence is. You can contribute your own evidence. It's an open access thing, and the more things we can put in there, the better. Learning analytics is a relatively new field; it really emerged in about 2011. So the detailed evidence base is small. So it's useful to look at the big projects and see what's coming out of those, I think. Thank you. We have just time for one more quick question. Liz Bennett from the University of Huddersfield. I was going to go to exactly the same point about the negative impact. And I wanted to know whether you had any cautionary advice, because of where we are in the sector, where learning analytics is just at the point where it might be implemented in other institutions. What would your strongest cautionary advice be?
OK, my strongest cautionary advice comes from a journal paper that I reviewed, where they had done something along the lines of Purdue's Course Signals, with the red and the amber and the green. But they hadn't thought through to the actual "what can I do about this?" So they had given the students red, amber or green, and what the results they reported showed was that more students had dropped out of their course. So I thought that was an absolutely terrible example of learning analytics, and the ethics of that, I thought, were horrendous. But it's an easy thing perhaps to do if you haven't thought these things through. I think the other thing I would be wary of is black-box analytics, where data goes in and it's crunched in some way such that the people who get the crunched data don't have any idea what happened there, so they can't critique it and they can't say, hang on, I know things which contradict that. So you've got to have somebody who can take a look inside that box and spot what might be going wrong. Thanks, but did that paper get published? I rejected it. Sorry, we are rushing into the next session. I'm sure this conversation is going to go on; I encourage you to catch Rebecca during the lunch break and during the break at 11.35. But in the meantime, can you join me in thanking Rebecca, and have a...