and let you get going. Great, thank you. Yeah, hello and welcome. So thank you for joining us in this session. What we're going to be doing is talking through our experience of building effective learner interventions supported by data. We're gonna start with a brief introduction to the work that we've been doing at BPP University Law School and explore some of the considerations we've had to make during our work on learning analytics. This will be a bit of an interactive session, so there'll be opportunities for you to answer some questions and also collaborate on a shared document during the session. And we'll be sharing everything afterwards, so all of your input will help us write up our own experience with a bit of a wider context as well. So to introduce ourselves, I'm Tom Pironi. I'm the Head of Learning Innovation at BPP. I've been working on the Law School Redesign Project for the last two years or so. And I'm Lucinda Bromfield. I'm a senior lecturer in the Law School. I practiced as a solicitor before moving into professional legal education. I've been with BPP about six years now, and I've mainly been involved in the learner intervention aspect, to bring more of a lecturer perspective to it, because I'm particularly interested in how we support students to develop professional skills and attitudes. So things like identifying their skills gaps, reflecting, adapting to adverse circumstances, all that sort of thing. Great. So to jump straight in, we'll give you the quick context. At BPP, we've been pretty busy. For the last two years, we've been redesigning all of the core law programs: the GDL, the LPC and the BPTC. And that's to meet new regulatory regimes set by the SRA, the Solicitors Regulation Authority, and the BSB, the Bar Standards Board.
And through the opportunity that comes with those regulatory changes, we've designed those programs to be highly relevant for the next generation of legal practice. This has been a massive undertaking. It's a complete redesign of every program offered in the school. And so we took the opportunity to rethink how we approach learning and teaching and to offer a level of consistency across all of the programs. And this is what we came up with. This is the law school learning framework. It's a cyclical approach that encapsulates a single workshop of learning as well as the program as a whole. And it moves through four primary stages: prepare, apply, collaborate and consolidate. Students engage with prepare before their classroom session. This is where we build the knowledge. It's set through the use of our adaptive learning platform, Century Tech, as well as through academic readings and other content modalities. Students then take that learning and apply it through the use of our virtual practice environment. This is a new platform that we've built specifically for law students. It's a simulated legal firm or barristers' chambers, depending on the program you're studying. And it's essentially an opportunity for students to be given real-world tasks based on extended case studies. We use this to draw on students' prior knowledge and experiences and contextualize that academic knowledge into actions. It's the opportunity, in a real-world environment, to provide that connection between the taught theories and the practical application. We've then got collaborate. That's largely what we would consider to be the classroom, whether it's online or in person, but it's designed to be active.
And then we move through our consolidation activities. These essentially act as a recap of the week, asking students to engage with assessment-level forms of questions, and they also provide a link to the upcoming content. You will notice that assessment sits slightly off on a tangent. That's not to say that assessment is disconnected from the learning across these programs, but instead, because the law programs are regulated by external bodies, those assessments have been centralized. This has been the case for barristers' training for many years, but it's entirely new for qualifying solicitors. So although our students are working toward a BPP level seven qualification, and we are aiming to develop competent and confident lawyers, we're also preparing them for that centralized assessment. On this slide I'm just giving you a quick snapshot of what students see whilst they're on our programs. On the left is our VLE. The top right is the adaptive learning platform, Century Tech, and at the bottom is a quick glance at the landing page of our virtual practice environment. I won't speak too much on these platforms. You're more than welcome to reach out if you want to know more; I can talk about this stuff for days. The purpose of this slide was more to indicate that there's actually quite a bit more tech than has previously existed in the law school. And as you can imagine, this leads to quite a bit more data generation as well. So BPP has close links with legal employers. A lot of our students are sponsored by their future employers. And in planning the redesign, there were lots of conversations with the legal profession about what skills, knowledge and attitudes they thought their future solicitors, trainees and pupil barristers were going to need to succeed in their chosen careers. And as you can see from the slide, there were lots of things that came up.
Some examples: improved digital skills, financial and business skills, commercial awareness, willingness to embrace innovation and technology in the profession, networking, team working, personal wellbeing skills, resilience. And interestingly, legal knowledge very rarely came up, and when it did it was just, oh, and obviously they need to know the law, because that's assumed as a given. So given that there's this huge range of skills and attitudes that are going to be helpful for our students to succeed in their future careers, the question for us was: how do we best support students to develop those skills and attitudes, as well as passing the required assessments? And how could learning analytics contribute to that? Okay, so our students are engaged in our programs. We've got all these new platforms. We've got all this new information. So what is it we have, and what are we doing with it? We've got our student information systems. We've got Century Tech, which internally we've called BPP Adapt. We've got our virtual practice environment. We've got the VLE. And we've got all the additional systems: things like the library, our support teams, and basically everything that we don't have a snappy name for. So then, what are we doing with it? Well, we've worked closely with BPP's data team. This data team have already had quite a successful data project within the organization, but it was focused mainly on professional qualifications in the accountancy space. It's known as the exam success indicator. What it does is look at students' progress through the materials and determine whether or not they're ready to go for the centralized assessments for accountancy. Now, it's been very well received. It's proven to increase pass rates for students doing those professional courses.
But from a law school perspective, and from a university perspective, we felt that we needed something a bit more, because it focuses on those professional qualifications. It focuses on learners who are actually with the organization for a matter of weeks, just while they do the courses that prepare them for the papers, not a full degree program. After all, our students will be with us through the GDL conversion if they don't come from a law background, and then move through the solicitors' course or the barristers' course, and our degree apprentices could be with us for up to six years. So we've had to come up with an approach that provides the data to drive the dialogue between tutors and students, but doesn't dictate it. You'll see here that I've separated out the subject and personal tutors. Now, we recognize these are the same people, but with those individual hats on they have very different roles. The data doesn't change, but the purpose of the conversation does. You'll also notice that we don't actually have reports for students to view their own progress. This is a conscious choice, but it's realistically based on the constraints of a less-than-ideal world. We're working very hard to bring through student-facing dashboards, but we also recognize we're not actually there yet. I will caveat this by saying students do have a data dashboard within Century Tech; we just don't have something that combines it all together in a single place. So by no means are we hiding the fact that we're doing this data project. We're working with students to make it very transparent. And students have access to all of that, just unfortunately not in a single place. We're trying to get there, but we're not there yet. When we started thinking about this, and about introducing the idea of learning analytics, we had a lot of informal conversations with both education and legal practitioners.
And we got some pretty mixed opinions about how they felt about the idea. We had everything from some people being all for it to people feeling it was too much like a Big Brother idea. And there were a lot of people who felt that, because we're dealing with students who are adults who want to pursue professional careers, it's actually up to them to make the most of their studies and to ask for help if they need it. So we're really interested to see what you think. If you could take 30 seconds and let us know whether you would actually like to have more learner data about your students and whether you think it would help. That would be great, thank you. We'll give it another second or two, and we'll close it off there. Just gonna take a quick screenshot of that for later. Okay, so I don't think there are any major surprises there. Most people were toward the more positive end of the scale. And at the start of the project, a lot of us on the project team felt exactly the same. We knew, or rather we believed, it would be effective. But we also realized we actually had to define what more effective means. In terms of pure data, this is where we arrived. At the top, I will say, is a dashboard that comes straight out of Century Tech, so not one that we've built ourselves. What it does is provide a class overview for a specific topic area. It's a spread of engagement by the scores achieved in the multiple choice questions that pair up with the content. So, very simply, in that matrix you're looking at time spent against score, but it gives a nice snapshot once they've done those prepare activities. At the bottom, you've got what we've started to build out in Power BI. This was a sort of high-fidelity prototype that we put together. And here we're able to drill down into an individual, and we're able to see their actions against each of the stages in the learning cycle. We can see which parts have more engagement, less engagement.
Again, we can drill down and see what that really means. So what aspects of prepare did students engage more or less with? Same with apply, collaborate and consolidate. We can see that engagement over time. But it did raise the fact that we've got to consider our tutors, and it raised two major considerations for us. One: do our tutors have the confidence and competency to be able to use these data dashboards effectively, by our own definition? And two: do they have the time, and exactly how much time can we expect already time-poor faculty members to have in order to engage with this properly? So we've had to develop a strategy that engages all faculty to work with these dashboards both efficiently and effectively. So, to bring it back to the room, we've got another question here. What does more effectively mean to you? I'm just gonna share a blank whiteboard space, and you should be able to see that now. Up in the top left corner, you've got a few different options; there are text tools. What I'm gonna ask is for anybody in the session to choose the text tool and add some notes: what does more effectively mean to you? Ah, I haven't given you text tools. Okay, probably should have thought about that. Let me just enable those. So hopefully people have access to those now and they'll start scribbling away for us. Okay, so we've got some interesting things coming up here. Increasing student confidence; actionable intelligence, I'd be interested in hearing a bit more about what that describes, and we've got a couple of those. Identifying areas for improvement, for the course and for individual students. Achieving learning outcome success just generally, and improved self-regulated learning. What I'm particularly liking is the comment about it being context dependent, because obviously it's gonna be different. Different students are gonna need different things.
Tom, do we have time to ask whoever it was who wrote about actionable intelligence, in the sense of information coming to staff? Thank you, Chris. Yes, so it's staff getting the data they need to then be able to support the students. Yeah, I would agree with that comment as well that success will look different to different students. It's what the student wants to get out of it, and ideas around deep learning as well. This is all really helpful. I think, given time constraints, I probably need to move on to talk about what we hope it'll do. And actually it's all of these things, and I'll give some more concrete examples in a minute. Did anybody want to add anything else before I move on? No? Confidence, not false confidence, yes, very true. We don't want people with false confidence, and that can happen. What we really want is for students to have a realistic picture of where they are, and to be able to understand for themselves and identify what they might need to do to get to where they want to be, which fits in with the self-regulated learning. Better student support, yeah, of course. So I'm going to stop sharing the whiteboard there and take us back to the slides, which should come up. There we go. Hit the share now button, and we are here. What I was going to do, just quickly, is talk about what we hope tutors having access to this data is going to do. And I'm going to keep it split between the two roles, subject and personal tutor. There will be some overlap, but just for clarity I'll keep it separated out. So for subject tutors, we are looking at improving the student academic experience and outcomes, and subject tutors are going to be able to see all sorts of useful things. Are students taking longer than expected on the preparation? Are they doing the preparation? Are there certain topics or exercises that seem to be particularly problematic for them? And are there any individual students who seem to really need additional academic support?
For example, they're spending a long time but they don't seem to be getting any results from it. So tutors should get advance warning of what to expect when they walk into a class with that group. And what that means is they can adapt their teaching in advance. When I'm teaching on a more traditional program, I normally start by asking my students how they found the preparation and how things are going for them. But here I would already have had a snapshot of that, and I could go: oh, look, they're struggling with this, so maybe I need to put aside more time in that session to cover those issues. It also means that subject tutors can intervene earlier if it looks like a student isn't engaging and they're worried about them. And by intervene, all I mean is open a dialogue with that student to talk to them about what's going on and to see whether they need additional support. At subject team level, we'll be able to identify whether there are any exercises or materials that just don't seem to be working particularly well for students, and therefore refine the design of the modules as well. In terms of personal tutors, we're really looking at improving student support. We're hoping we'll have earlier, more targeted interventions, and those could be at student, group or cohort level, depending on what the patterns in the data show may be necessary. And that will hopefully lead to earlier referrals to other BPP and external services. So signposting where necessary, for example, to learning support or to wherever is appropriate for the student's particular individual circumstances. Sorry to interrupt, this is the five-minute call. Ah, in which case I shall wrap this up quite quickly and say that it's just about getting the support to the students earlier.
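[Editor's note] The early-warning rule described here, students spending a long time on preparation without the scores to show for it, can be sketched in a few lines. This is purely illustrative: the record shape, field names and thresholds below are hypothetical and are not BPP's actual Power BI logic.

```python
# Hypothetical sketch of the "high effort, low score" intervention flag.
# Field names and thresholds are illustrative, not BPP's real data model.

from dataclasses import dataclass

@dataclass
class PrepareRecord:
    student_id: str
    minutes_spent: float   # time on prepare activities this week
    avg_score: float       # mean multiple-choice score, 0-100

def flag_for_conversation(records, min_minutes=90, max_score=50):
    """Return students showing high effort but low scores.

    The flag is only a prompt for a tutor to open a dialogue; it does
    not diagnose what is going on with the student.
    """
    return [r.student_id
            for r in records
            if r.minutes_spent >= min_minutes and r.avg_score < max_score]

records = [
    PrepareRecord("s001", minutes_spent=120, avg_score=42),  # long time, low score -> flag
    PrepareRecord("s002", minutes_spent=30, avg_score=85),   # efficient -> no flag
    PrepareRecord("s003", minutes_spent=150, avg_score=78),  # thorough -> no flag
]
print(flag_for_conversation(records))  # ['s001']
```

As the speakers stress, a flag like this is only an indication that there might be something to talk about, never a verdict on the student.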
So what we hope to do in the last five minutes is: Tom is going to share a Google document, and what we're gonna ask you to do is navigate to the document, pick whichever one of these questions most interests you, and just tell us what you think. And then we will be sharing all those thoughts later. Yep, so when you get to the Google doc, there's a table of contents at the top. They are direct links that will take you to the page for each question. It's just an empty page for each question. Just write a few notes down, whatever means the most to you. We're gonna share this openly at the end, along with the whiteboard and the poll from the start. And we're also gonna produce a write-up, using all of your notes, based on all of your thoughts and perspectives, and contextualize it again within the project that we've been working on, to help us refine what effective analytics means to us and also, hopefully, produce some useful guidance for other people who undertake a similar project. There's a few people starting to write notes in there now. That's brilliant. And I can see that one of the comments is about how you would find the time to look at the data and fulfill that role. And that is a very key concern. Yeah, under hidden dangers, we've got making assumptions about what the data means, which is something that we've worked really hard to mitigate against. We've given suggestions of what these things could mean, but more on the basis of whether it requires an intervention and a conversation, rather than as a direct verdict from the data. Yeah, so when we've been talking to tutors about this, and providing training to tutors about this, we've really emphasized the point that all the data does is give you an indication that there might be something to have a conversation about. It absolutely does not tell you what is going on with the student.
So knowing your student is still absolutely key, and you need to have that conversation rather than making those assumptions. Yeah, there's a really interesting point there as well, that note's just being finished now: as a student, I'd be unhappy with this kind of data being shared with prospective employers. So this is, I suppose, an interesting one to contextualize for us. Within BPP, within the law school, we have a lot of students who are sponsored by our clients, so these are students who are already technically employed. So it's not so much a prospective employer as it is your actual employer, who's trying to support you to get through the program. But we've got an interesting balance then around what we share. We certainly don't share every aspect of the data, and we don't leave employers to then make assumptions about what that data means. Yeah, so when we were debating this, there was a very, very strong feeling within our organization that it is not helpful for anybody for things like formative assessments, or how a student's doing at any given point, to be shared with an employer, because it's not necessarily remotely indicative of how they're gonna get on at the end. And it just adds an additional pressure that's really unhelpful. It does not make for a good learning environment. I can see, as we're just approaching time now, that we've had a question come in on the chat, which is probably, I should imagine, the last question we've got time to answer: is there a connection between this project and JISC's learning analytics tool set? The connection is that we are heavily influenced by the guidance that JISC put out. We've basically used that as our starting point. So every lesson that we took on from the start is work that we've understood from other people. We've taken that away and tried to apply our own context to it.
So it's by no means hugely different. It's just a different approach where we've taken that influence. We've looked at a lot of the student guidance, making sure that we're very transparent about what we're doing and how, and in what ways, we work with students on this. So again, to bring it back around, we don't necessarily have those dashboards, but we're not hiding the data from students. We're letting personal tutors engage with students using the data as well; it's more just the tech issues. But we've taken a lot of influence from Greenwich, who have published all of the information that they give to students directly on their website. And it's really just helped us to make sure that we're on the right lines when we're doing this work. But yeah, we can leave it there. So I've just flicked over the slide to the contact details. If you do want to get in touch, you're more than welcome to drop us an email. I will note my email is slightly different: next week is my last week at BPP, so you are more than welcome to reach out to my new email, I'm going back to London Business School, or you can get me on Twitter. But we will be sharing our write-up of this session. We'll be sharing the document that you've all helped us collaborate on. I'll be putting that out on Twitter with a time frame for people to add to it before we start writing that up and share it back with you all. And thanks very much for coming along. Thank you so much for that presentation. It was absolutely fantastic. And let's all find the clap function in our chat and say a big thank you to Lucinda and Tom for that session. I very much enjoyed that. So thank you so much.