Welcome to the ALT South Tech Thursday event. I'm delighted to introduce Stephen Ford, who's one of my colleagues from BPP University Law School. Stephen qualified as a solicitor with Freshfields. He then went on to work as a real estate partner with Eversheds for nine years and then moved over into legal education. He has done pretty much everything at BPP: he's been a module leader, he's been a programme leader, he's been head of programmes. He was in charge of the Legal Practice Course when we had the first lockdown, and the LPC is regulated by the Solicitors Regulation Authority, who have always been not at all keen on online learning or online assessments, but of course they had to move with what was happening at that point. So he had the task of looking at how this was going to work in the law school and implementing it, which has led to his current role as head of assessments, looking at how we can do that better now we have more time and more consideration. So Stephen, thank you very much, and over to you.

Thanks Lucinda. Yes, it's an exaggeration, I'm afraid, to say that I was very involved in the early stages of moving to digital assessment, but I've got more and more involved as time has gone on, and certainly, as you say, looking at how we can do things in a better way, better both for the students and for the faculty who are creating the assessments. So I'm coming at this very much from a practical perspective: here are my thoughts, here are some issues that have arisen for me and for BPP. I'm not really talking about the pedagogical implications of online assessment, or only indirectly, and I'm not really going to talk in any detail about the different products and platforms that are out there. I'm happy to talk about my experience of the platforms that we've come across and, if you like, some of the pros and cons of those, but the focus is a bit more general than that. So if you do have more specific questions, do shout out, interrupt me or, as Lucinda has just said, put questions in the chat.

So, trying to find a hook to hang this little chat on, I thought Beauty and the Beast summed up some people's attitudes to digital assessments. There are those that love it, and there are those, like the SRA as Lucinda said, that really hate it. So, trying to get my slides to move. There we are. By way of introduction and background, I probably don't need to remind anybody of what happened. It was all going on in March 2020, around my birthday, which was slightly frustrating, but there we are. And we had good old "stay alert", whatever that meant. The immediate impact for us was fairly swift. We took a decision on the 16th, actually, when it was pretty clear what was going to happen, to stop running face-to-face assessments. And as you can see, the first one that was impacted by that was actually the following day, and we had the job of contacting all the students who were due to sit it, telling them it was postponed, and then we set up another sit online later in the year. In terms of background for BPP and what we're doing, as Lucinda said, the LPC programme particularly is very heavily regulated by the SRA, and in particular, most of our assessments have to be proctored; the SRA insists. Elsewhere in the BPP law school we have programmes that aren't regulated to the same degree, and most of them have exams which are not proctored, which makes life a lot easier. The SRA initially granted a blanket consent for all LPC providers to run digital assessments.
And we initially simply used Moodle, our VLE, with a sort of add-on called Proctorio, which gave proctoring functionality for our written assessments. But relatively quickly we moved to a product called Inspera, and that's broadly what we've been using for written assessments ever since. It is worth pointing out, though, that we use a variety of platforms for different types of assessment. Digital assessment obviously is quite a wide term, and lots of people will be used to using, say, Turnitin for coursework assessments and that kind of thing in any event. So I would imagine most people are already doing some element of digital assessment. I think what is different, and what the pandemic has more or less forced us all to consider, is an examination, a timed examination that traditionally would have been pen and paper in an exam hall, and the extent to which that type of assessment can be moved online. That's pretty much what I'm going to be focusing on. Again, happy to talk about other platforms and other types of assessment if people would like me to.

So that feeds into the first section here: what kind of assessments are we talking about? As I've just said, written assessments are really what I'm going to focus on. We do have a couple of pure MCTs, multiple choice tests, so assessments that are all multiple choice questions, and actually those run very well and very easily online. And one of the massive advantages of digital assessment is that with things like MCTs, the computer can do all the marking for you. That really is a massive benefit. The other issues that we had to think about in particular were, first of all, the proctoring issue, as I've spoken about. That affects the choice of where you go, because you need to make sure that the platform you're looking at has a proctoring component, or that you can buy one in separately in the way that we did initially with Proctorio. But it's probably a bit of a minority interest, so I'll try not to waffle on about proctoring too much. The other thing that was quite interesting was that on the LPC in particular, we had a closed book, or at least a restricted book, policy. There were certain materials students were allowed to take into an exam and certain ways in which they could annotate them, and they would be checked by an invigilator towards the beginning of the assessment. As soon as you go online, you can't really do any of that, because you've no way of controlling, even with proctoring, what the students are accessing. And so we effectively had to flip to open book exams more or less overnight as well. That does involve a different approach, a different pedagogy, if that's the right word here. When the students can access anything, at least anything in writing, that's very different to when they can access a statute book or that kind of thing. So if that's an issue, then there is something to be thought about in terms of how your assessment is going to be different: how do we assess the students differently, knowing they've got a different set of materials with them? So, in terms of the marking, I couldn't resist this little cartoon when I found it, when I was looking for something. As I've said, MCQs are fantastic: the computer does it all and spits out the results for you. It's absolutely great; everybody loves it. But most exams aren't MCTs and sadly require some kind of human intervention in order to do the marking.
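Just to make concrete what the computer is doing for you with an MCT, here is a minimal sketch in Python; the question IDs, the answer key and the responses are invented for illustration and are not taken from Inspera, Questionmark or any other product.

# Hypothetical answer key: question ID -> correct option.
answer_key = {"Q1": "B", "Q2": "D", "Q3": "A"}

def score_mct(responses):
    # Count how many of a student's answers match the key;
    # unanswered questions simply score zero.
    return sum(1 for q, correct in answer_key.items()
               if responses.get(q) == correct)

print(score_mct({"Q1": "B", "Q2": "C", "Q3": "A"}))  # prints 2 (out of 3)

That is essentially the whole of MCT marking: once the key is in the platform, every script is scored instantly and identically.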
So here I think the way in which you approach marking has an impact on how you might set up a digital assessment, and even which platform you choose. Again, the LPC is a bit out on a limb. We have a very detailed marking guide for our assessments where, I mean, it's not quite "did the student use this word, give them half a mark or a full mark", but it's heading in that direction. It's very, very different from the more usual kind of academic marking, which is more impressionistic, marked against a limited number of criteria. The type of marking you've got, as I say, may make a difference to how you run things. The initial challenge we had with the LPC, because of this very detailed approach to marking, was how do we show a moderator or an external examiner what we've done? Because the software that we're using at the moment, Inspera, really only allows you to provide a mark per question overall. So we've ended up with a not ideal dual system where we have to show the detailed breakdown of marks on a separate document; we've had to use a spreadsheet. So as tutors are marking, they've got the digital script open and they've also got this mark sheet effectively open at the same time.

Linked to that is the whole question of feedback. A lot of these products allow feedback to be delivered through the platform, which is certainly much more helpful than the way we've traditionally done it, which is by having separate documents and emailing them to students, which is phenomenally labour intensive. There's a slight issue that I need to address imminently in relation to the feedback then going out automatically to all students. In particular, we've got an issue internally about students who, for instance, are on financial hold, where they haven't paid their fees or something like that, because if we move lock, stock and barrel to the automatic type of system, then they'll get their feedback regardless of what's going on. What we can't do, in any of the platforms I know about, is pick and choose who does or doesn't get feedback. So again, that's worth having a think about.

Moderation and verification, external moderation: it's all part of the same process. As I say, that was something that we really struggled with early on, knowing how to capture the data for the moderators, and what kind of checking of the marks to do, particularly as, as I say, we're storing our marks in two places, which is never ideal. It's in the exam platform, but it's also in a separate mark sheet somewhere, and from our perspective, making sure the two are the same is a separate task that needs to be gone through. And then you've got the whole question of how you then get scripts to the external. As I'll say at the end, one of the other big advantages of digital assessments is you don't have paper scripts to handle. You don't have to schlep them home, you can't get them lost or leave them on the train or that kind of thing, which I think from a faculty perspective is really helpful. You can do your marking from wherever you have a computer. For the externals, it's a bit different. What we were doing is downloading samples from the platform and then making those available to the externals. We're now experimenting with giving the externals access to the exam platform, so they actually go in and have a look at the marks in that platform.
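One practical note on that dual storage point: the "make sure the two are the same" task can at least be automated rather than done by eye. A minimal sketch in Python, assuming two hypothetical CSV files, platform_marks.csv (exported from the exam platform) and marksheet.csv (saved from the tutors' spreadsheet), each with student_id and total columns; the file names and layout are assumptions, not a real Inspera export format.

import csv

def load_totals(path):
    # Read a student_id -> total mapping from a two-column CSV.
    with open(path, newline="") as f:
        return {row["student_id"]: float(row["total"])
                for row in csv.DictReader(f)}

platform = load_totals("platform_marks.csv")
marksheet = load_totals("marksheet.csv")

# Flag every student whose two recorded totals disagree, or who
# appears in one source but not the other.
for student in sorted(set(platform) | set(marksheet)):
    p, m = platform.get(student), marksheet.get(student)
    if p != m:
        print(f"{student}: platform={p}, mark sheet={m}  <- needs checking")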
In an ideal world, all of these issues would have been considered before we went down this path. Because of the pandemic and the speed at which everything was happening, we didn't have that time, and so we've been doing a degree of catch-up, revisiting decisions that were taken and trying to use different methods. But I suppose also we're learning more about the software the more we use it, and we're finding different ways of using the functionality of the software to produce the result we want. The experience I have of these software packages, on the whole, is that they expect everyone to do it one way, and if you don't, then you've got to try and work within their functionality and have all kinds of workarounds to get what you want out of them.

Another key issue, and this I think will be of interest to Heather and Olivia, is the question of how you get the assessments into the platform, and there are a variety of administrative aspects to this. One, obviously, is what it's going to look like in your exam platform. We want to make exams look as consistent as possible, so a student knows roughly what the look and feel of a BPP exam is and therefore what to expect. In our case, where we have members of faculty creating the digital assessments, it really is like herding the proverbial cats. Where you have some dedicated digital technology people or experts, that might be easier, because you've got a smaller group to deal with. But some kind of template or instructions obviously is very useful and important. Equally the naming conventions, so that people can find an exam, and tagging, depending on how you put your exams together. So again, the way the LPC works... Yes, absolutely, Heather: you had to provide a lot of support to faculty. That doesn't surprise me one tiny bit. Not templates as such, Olivia, but what I've done is create, for instance, the standard instructions to students that are presented at the beginning of each assessment. For proctored exams you have standard wording about how they're supposed to do a room scan. There's standard introductory wording for MCQs, different introductory wording for long-form questions, written answer questions, whatever you want to call them. Those kinds of things, and the layout of the questions, I've tried to make as consistent as possible as well, although that's something I'm reviewing as I go along, and one of the things I'm thinking about at the moment is how to simplify that process, so there are fewer rules and less work for the people creating the assessments.

On other programmes, we have more of a bank of questions, and then we pull questions in from the bank to create an assessment. If you're doing that, then the way in which you tag your assessments and your individual questions is really important, so that you can find the right questions at the right level, and you know which ones you've used before and which you haven't, and all that kind of stuff. We're just embarking on all this with a product that's new to me, anyway, which is Questionmark, something I think some people have been using quite a lot, even within BPP; some of the other schools have been using Questionmark. They work very much on a topic basis, that's their key component as it were, and then within topics you have questions. So setting up the right taxonomy, the right folder structure for your question bank, is really important in terms of knowing you can find the right stuff when you need it.
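To illustrate the sort of taxonomy and tagging that keeps a bank searchable, here is a minimal sketch in Python. The fields, the ID convention and the filter are invented for illustration; they are not Questionmark's actual data model.

from dataclasses import dataclass, field

@dataclass
class Question:
    qid: str    # e.g. "PROP-LEASE-014": a naming convention that encodes the topic
    topic: str  # e.g. "Property > Leases", mirroring the folder structure
    level: str  # e.g. "LPC"
    used_in: list = field(default_factory=list)  # past sittings this question appeared in

def fresh_questions(bank, topic, level, recent_sittings):
    # Questions on the right topic at the right level that
    # haven't appeared in any of the recent sittings.
    return [q for q in bank
            if q.topic == topic
            and q.level == level
            and not set(q.used_in) & set(recent_sittings)]

With something like this behind the bank, "find me an unused Leases question at LPC level" becomes a query rather than an archaeology exercise.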
Another issue that we've had to address is scrutiny of assessments. Our process, broadly, is that tutor A drafts the assessment and tutor B then scrutinises it. Partly because we're generally recycling old assessments to try and save time, the original assessment that we're using as the starting point tends to be in Word format, so lots of tutors are actually amending that Word document first and getting it scrutinised in that context. Then it gets inputted, converted into the digital platform. That said, you then need a second round of, it's not quite scrutiny, but certainly checking that it's been set up correctly in the digital platform. So, ideally, faculty would do both of those things at the same time, so that the assessment would be created on the digital platform. Another possibility is to look at the faculty drafting the assessment and then educational designers or some similar group putting those assessments into the digital platform. That itself will raise a load of questions about the format in which your instructional designers need to have the material, to make sure that they produce what faculty are expecting at the end of the day. So I would imagine there that you might want to create some templates for faculty to use when they create the assessment, so that it's clear which bits go where in the digital environment.

Paper sits and learning support requirements are another thing to think about, starting to look a bit broader. The SRA has required us to give students on the LPC the option of sitting an old-fashioned pen-and-paper assessment. At the moment the numbers for that are very low, which from my perspective is good. Ideally, we wouldn't have given that option generally. We would obviously have made arrangements to have a paper assessment available for anyone who had a learning support need, but I think ideally we wouldn't have offered it more generally. So how you're going to deal with learning support students, and whether you're going to give students generally the option of sitting on paper, is again something that needs to be thought about. Producing a paper assessment is a bit of a, what's the right word, a task in its own right, in the sense that what I've really tried to ensure is that there is as little chance as possible of there being a discrepancy between the digital version and the paper version. As I said earlier, we tend at the moment still to create a paper version first, but experience has shown me that if you try to keep a paper version and a digital version updated in parallel, something will slip through the net. So the solution I've adopted for the moment is that you take the digital version, that is the master, if you like, of the assessment, and you download that and then tweak it a little bit in order to create a paper version that people could sit. There are lots of ways of approaching that. And while numbers are very small, I have to say it is painful if you're having to produce a paper version for one or two students, as we are for some assessments. It's a lot of work for a very, very small number of students.
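As a sketch of that master-copy approach: if the platform will give you a structured download of the digital assessment, the paper version can be generated from it mechanically, so the two can never drift apart. The JSON export format below is entirely invented; a real platform's download would need its own parser.

import json

def paper_version(export_path, out_path):
    # Read the (hypothetical) JSON download of the digital master...
    with open(export_path) as f:
        exam = json.load(f)
    # ...and write a print-ready text version of the same questions.
    with open(out_path, "w") as out:
        out.write(exam["title"] + "\n\n")
        for i, q in enumerate(exam["questions"], start=1):
            out.write(f"Question {i} ({q['marks']} marks)\n{q['text']}\n\n")

paper_version("exam_master.json", "exam_paper.txt")

The point is simply that the paper copy is derived from the digital master on demand, never maintained by hand in parallel.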
Software: I'm not going to say much beyond, I think this slide is from Indiana Jones, whichever the last one was, but it's a bit of a minefield. There is a lot of stuff out there, and as far as I can tell, they all do some things very well and lots of other things less well. So there's going to be an element of compromise almost whichever software package you end up buying. And there is also, as with all software, a little element of, shall we say, over-promising, a bit of what we lawyers call seller's puff, involved in what the software can do or what they're developing and are about to release. We've been a little bit stung with promises that new features would be coming any day, and as the months go past, realism sets in a bit, shall we say. But I think the key thing is to have done some of your thinking, at least, or preferably quite a lot of your thinking, in advance. There's always, I find, with software, not just digital assessment software, a degree of, I suppose compromise is the right word again: you have to move a little bit away from what you would ideally like to do for it to work with the software, and then you need to use the software sometimes in ways that maybe the developers didn't expect in order for it to achieve what you wanted. So there is certainly some accommodating that needs to be done, even after you've made a final decision, because, you know, yes, we did quite a lot of testing, and we continue to do testing with each iteration, each new version of the software.

And certainly at the moment, as I said, we're looking very closely at Questionmark for one particular programme, and we're hoping to do more or less a dry run. We're going to take an old assessment that we have run in Inspera, try and export all those questions, put them into Questionmark, and then a group of us are going to sit the exam. I'm hoping nobody's actually going to be marking it, but partly it's to see what the student experience is like. Also, one of the things that's really important to us on that programme is being able to get good statistics out of the software at the end: which questions were answered well, which questions were answered badly, both at the individual student level, so that we can give them formative feedback, and at the exam level, so that we can see which are good questions and which are perhaps less good questions that might need replacing or rewriting. From what I can tell so far, the reporting function in Questionmark is one of its massive strengths. It looks, from what I've seen so far, a bit more complicated from a creation perspective, but yeah, the stats and reports you can get out at the end are amazing. The other reason we're going for Questionmark for this particular programme is that you can have quite a lot of input into what the exam looks like to the student, whereas within Inspera, that is the Inspera interface, that is what you get, end of. This programme is basically preparation for an external assessment that's run by the SRA, and what we've been able to do, with a bit of help from our IT people, is to get our summative assessment in Questionmark to look very much like what the external assessment is going to look like for the students. So that's a bit of a bonus really.
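To make that statistics point concrete: the two classic per-question figures are facility (what proportion of students got the item right) and discrimination (did students who did well overall also do better on this item?). A minimal sketch in Python with invented data; this shows the underlying idea, not how Questionmark's reporting is actually implemented.

# Each student's per-item scores (1 = correct, 0 = wrong); invented data.
results = {
    "s1": [1, 1, 0, 1],
    "s2": [1, 0, 0, 1],
    "s3": [1, 0, 1, 0],
    "s4": [0, 0, 1, 0],
}
n_items = 4

totals = {s: sum(items) for s, items in results.items()}
ranked = sorted(results, key=totals.get, reverse=True)
half = len(ranked) // 2
top, bottom = ranked[:half], ranked[-half:]

for i in range(n_items):
    facility = sum(results[s][i] for s in results) / len(results)
    discrimination = (sum(results[s][i] for s in top) / len(top)
                      - sum(results[s][i] for s in bottom) / len(bottom))
    print(f"item {i + 1}: facility {facility:.2f}, discrimination {discrimination:+.2f}")

A low facility combined with low or negative discrimination is the classic sign of a question that needs rewriting.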
Clearly, regulation and validation is always something that you need to think about. Do you have any external constraints, like us with the SRA, or indeed with the BSB on our barristers' training course? Equally, it goes without saying, you need to go through whatever internal validation processes you have. I would imagine a move to digital assessment is a fairly major step that probably requires quite a lot of process internally. That's not to be underestimated, depending on exactly what your processes are. Sometimes there is an element of needing to educate people as to the value of digital assessments, or at least that they are just as valid as pen and paper. Sometimes people are fully on board and can see the benefits, and that isn't an issue. So that's definitely an area that needs considering, and in terms of implementation it clearly has a massive effect on timescale.

So that's the specific stuff that I wanted to talk around; interested to see what your views are after all of that. But to try and pull some of those strands together: the things that we found really positive about digital assessments, and I've mentioned some of these already. You don't have paper scripts to manage, which is a big plus. You've got no handwriting to read, which, as a marker, is a very big plus. Depending on your systems, one of the potential advantages is that marks are saved digitally and therefore, in principle, can be transferred to other software packages easily. It potentially cuts down the opportunity for error, if marks can be transferred directly from one software package to another without needing to go through a spreadsheet or a human. The other advantage, which actually seems to be surprisingly significant for the students, and which I hadn't expected, is the fact that students don't have the stress of travelling to a venue on the morning of the exam. Because of the size of some of our assessments, they've nearly always historically been at external venues, quite often at the ExCeL centre for our London students, and it's a horrible place to get to. It does cause a lot of stress, whereas with a digital assessment, they can sit it at home. That in itself raises issues if they don't have a suitable place at home, but at least they aren't necessarily having to travel to out-of-the-way places.

On the negative side, you will have technical problems, particularly for the students. Sometimes the software won't work, because that is the nature of dealing with software, and therefore some students will have a poor experience. We're finding now that about 3% of students on our proctored assessments have an issue; the proctoring adds a lot of complication. With our non-proctored assessments we have a lot fewer, but we still do have occasional problems, and therefore you need a process to help resolve that for the students. There is a lot to think about and a lot of decisions to make, not just about the software package, but about how you're going to use it, what you're going to use it for, how you're going to set it up, all those kinds of things that I've been talking about. So it's not something to undertake lightly. And equally, as I've said, there is inevitably an element of compromise. You're going to be very lucky if you find a package that does everything you want, so it's about looking at the options, deciding which works best, and then coming up with some way of dealing with the things that don't work so well. Overall, my feeling would be that it's been good, and I think most students have had a good experience with digital assessments. The thing it might be a bit harder to come to a conclusion on, and there's a question here, what do you reckon, are most tutors happy with digital assessments? I think probably they like some aspects and not others. Yeah, so generally the reports that I've had are that they love being able to mark anywhere, they love not having that "did I leave a script somewhere", all of that stuff.
You know, you're always going to find some people who have been doing something a certain way for a very long time. It does require quite a lot of learning of new processes, and I think it is fair to say that the process of setting up assessments digitally is a challenge for lots of people. I don't know if those of you here have much experience of lawyers, but lawyers and technology aren't always a good fit, so we have a few challenging cases. We do have a particular subset as well: we have lawyers who've then chosen to become academics, so it is a very particular group. It is, and yeah, IT isn't always at the top of their skill set. I was going to say something else there as well, to pick up something I said earlier, but I think it's gone now. Does anyone have any questions? I know you've been asking questions throughout and I've answered those as they cropped up. Heather, do you want to come in on the mic?

Yeah, I was going to say, you touched really briefly on the fact that you have some complex issues with proctoring. I didn't know if you could just talk about those, because we did online exams during the pandemic for some of our areas, but others, I think, were put off because we didn't have any proctoring. So I'd be interested to hear about that. I don't know if that's relevant for Olivia as well, or anyone else on the call, but if there are other questions... I know you said you weren't going to talk about that so much.

I can talk about it generally, and I can talk specifically about Inspera and the issues we've had there. Proctoring raises a number of issues. The first question I would ask is, challenge your faculty as to why an exam needs to be proctored. If you don't have an external requirement to proctor, why are you doing it to yourselves? If you do have to proctor, then the proctoring adds quite a layer of complexity from an IT perspective. What it means is that students have to have a microphone and a webcam on throughout the assessment. The software also locks the computer the students are using, which means they can't access any soft copy materials. So broadly our rule is you can have anything that's in print or in writing, but you can't have anything soft copy, you can't have any other devices, and so on and so forth. What we find is that there is a lot of incompatibility between the proctoring software and what people have on their computers. In particular Macs, they're the bane of my life; students with Macs have compatibility problems. In particular things like, yes, that's true, in particular things like you've got to disable Siri, you've got to disable dictation, you've got to make sure there are no programs running in the background, you've got to have the right level of lighting in your room so that the proctoring software will recognise you, those kinds of things. It has got better over time; with each new version of the software we've had fewer issues, but we still do get quite a lot of technical issues. The other thing you've then got to think about is who is doing the proctoring. Broadly you've got two types of proctoring: live, or after the event. Questionmark offers both; with Inspera, which we're using, we record it and then someone reviews it afterwards. Most proctoring software has some kind of built-in AI that is intended to spot potential issues and flag them, so that in theory the proctor, the invigilator, only needs to look at certain bits of the assessment. In practice, my experience is that it's not very good, so you end up watching a lot of exam, and that requires a lot of person hours; it's very resource heavy.
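As a rough illustration of why that review stage is so resource heavy, here is a small Python sketch of the arithmetic; every number in it is invented for illustration, not taken from any real proctoring product.

def review_hours(students, flags_per_student, minutes_per_flag, spot_check_minutes):
    # Invigilator hours to review each student's AI-flagged segments
    # plus a short spot check of the rest of the recording.
    per_student = flags_per_student * minutes_per_flag + spot_check_minutes
    return students * per_student / 60

# 300 students, 5 flags each at 2 minutes per flag, plus a 10-minute spot check:
print(f"{review_hours(300, 5, 2, 10):.0f} person-hours")  # 100 person-hours

And that assumes the flags are trustworthy; if they are not, the spot check grows towards watching the whole recording.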
Live proctoring you can usually do with multiple screens, so one invigilator might be watching six students, nine students, twelve students, I don't know, it depends on the limit of the particular software you have, at the same time. Yeah, that's what we've done, that's how we've managed it. Yeah, which means you don't have the job afterwards, but you still have to have a lot of people sitting around trying to concentrate, watching what's going on on lots of little screens within screens. Thank you.

The other point I was going to make, actually, was about the unspoken bit of digital assessments, which is of course that it means you expect all your students to have a computer. We've dealt with that by having a digital-first policy across the university, where that is a requirement: if you want to study at BPP, you have to have a computer that meets certain minimum requirements. And even in the physical classroom, we deliver documents to students electronically now. There is clearly a social mobility issue there that needs thinking about: do you then need bursaries and grants and fellowships and whatever, so that students who don't have a computer, or don't have one of sufficient specification, can buy one? There's all that as well. What we try and do, linked with the issue that students don't necessarily have a secluded space at home conducive to sitting assessments, a lot of them are in flat shares and house shares and those kinds of things, is offer, up to a point, in limited numbers, the opportunity to come in to a BPP centre and sit an assessment, either using our facilities but their own computer, or even in some cases using a BPP computer. So again, the extent to which those things are made available needs to be thought about. And there you can proctor live, so actually there is a degree of benefit to doing that. So yeah, that's my take on proctoring. Thank you, that's really helpful.

We've got a question from Edith as well. Sorry, my computer is a bit slow. Stephen, I wanted to ask you, what's the timeline, roughly, for the key stages of the process of moving to digital assessments? Yes, so, you know, from the time you've trained the staff, to the time they are able to actually develop online question papers, to the time you train the learners and you set up all of the systems, all of that. How long does it take?

Yeah, I mean, I think procuring the software and working out how you are going to use the software are the really time-consuming bits. I think once you've got there, training faculty, it depends how many faculty you've got, I suppose, but it's not a massive job necessarily. Training the learners, certainly with the packages we've used, there hasn't been a great deal to that, because most of our assessments are either MCQs, click here for this answer, or essay-style questions: you get a big box and you write your answer in it. So, as I say, I think it's really the procurement and the setup, and that's a little bit like how long is a piece of string, I'm afraid. I would definitely urge people not to do it in a hurry. I don't know what other institutions are like; BPP has a bit of a habit of saying, oh, we've made a decision to implement this next month, and it's kind of, how am I going to do that, then? I would think six months as a minimum, really, to think it all through properly, to look at what's out there, work out how you might use it, and then set up processes. Thank you. I need to find another question.
What did you wish you had known before you started? I think for me the biggest issue has been how we record the marks. I wish we had thought, particularly before we moved to Inspera, about how we were going to capture the breakdown of marks within each question. Yeah, I think that's one of the key bits. Thanks, Heather. Good to meet you. Can everyone else see these messages when they pop up, by the way? Yep. Yeah, good. We can assume they can.

I was going to ask a question. So, if it isn't a learning support need, do we know of any other reasons why students would choose to sit paper assessments rather than online? I think some students are worried about the technological side: is their broadband, their Wi-Fi at home, reliable enough to sit an assessment? Some of them, as I said, don't have anywhere sensible to do an exam at home. I mean, I had one report, for my sins; I'm also responsible for academic misconduct reports within the law school, and one video that was sent to me to review was this chap sitting in what was clearly his bedroom, and what must have been his roommate walking backwards and forwards semi-clothed during the exam. You know, I am sure there was nothing that student could do about it, because I imagine that was the only space he had in which to sit the exam, and if you're sharing a room, for instance, you can't exactly say to your roommate, I'm sorry, you can't come in for the next three hours. Some people can't type, or can't type well enough; they're concerned they won't be able to get everything down in time. And I suppose there is a range of issues that runs into learning support. So people who don't have good eyesight, you know, it may not be a disability as such, but they're uncomfortable having to stare closely at a computer for several hours, that kind of thing. It's all kinds of reasons. Thank you.

That does actually raise one other question which I didn't mention and which has been a bit of a bugbear for us, which is that a lot of our assessments were three hours. And particularly with the proctored assessments and the good old SRA, that is a problem, because the SRA is very concerned about possible collusion: students leaving the room during an assessment, talking to their mates, coming back and then writing a perfect answer. So broadly we've had to implement a no-breaks policy during assessments, which is certainly not something that we were keen to do, but we couldn't come up with any other way of addressing the SRA's concerns. That then has raised the question of how long our assessments can realistically be if we're not allowing the students to go to the bathroom, and we've arbitrarily chosen two hours as the cut-off point. Any exam that was longer than two hours, we've reconfigured so that, in most cases, it's two papers that are half the length. So a three-hour exam is now two one-and-a-half-hour exams. That was a random issue that just occurred to me. Any other questions from anybody?

Just one more from me, Stephen. What was the role of the leadership team in this? Did you have a lot of support? Did you need interventions, or, especially around the situation of COVID and the lockdown, did staff just understand the urgency? This was really coming from the leadership team, to be honest, because of COVID and the need to provide some alternative method of assessment very quickly. So it's been very much top-down; this has not been something that faculty have asked for and implemented.
And as you say, the pandemic has very much forced it upon us and dictated the timescale, neither of which really was ideal. Thank you. All that remains is for me to say thank you ever so much, Stephen. Really interesting, and we really appreciate it. I hope it was useful for people. Very nice to meet everybody. Certainly. And we'll just stop there.