Just while people are moving around, I'd like a quick show of hands. How many of you feel you're doing a significant number of your exams digitally? Okay, so a few. How many of you have got dedicated systems? Okay, a few again; that's interesting. So, I'm Graeme Retrobux, Head of Education and Enhancement at Newcastle University. Just to give you a little bit of context: I started at Newcastle about 10 years ago as a development officer, looking after e-portfolio and learning analytics, but at the same time the whole team would contribute to digital exams when they came around: all the checking, all the process, and so on. I then became manager of the team that had digital exams as part of its portfolio, and last year I became head of service; Terry, who was meant to present with me today, was the manager of the team. We're a Russell Group university, about 28,000 students, 8,000 of those postgraduate, so about 20,000 undergraduates, and about 5,000 staff. Last year we ran about 240 digital exams. I'm going to talk about our vision and strategic drivers around digital exams, then the history of exams at Newcastle, and then some of the challenges and potential solutions that we have. So this is our vision. It recognises, I guess, that exams are not the best way to assess learning outcomes; I think we're probably all aware of that. While we're not looking to increase the number of exams we run at Newcastle, we are looking to make sure that when we do run exams, they're done where possible in a digital format. Our strategic drivers start with the student experience. It's interesting when you think about 30 years ago: I remember, 30 years ago, sitting my undergraduate exams.
Many of you will have had similar experiences: all handwritten, all on paper, and my writing's not great. Any spelling errors, lines through them; if I wanted to rearrange something, it was an arrow down the margin. It was just a bit of a mess that you were handing in. So making sure that the student experience of sitting digital exams is good is really vital for us. It's the key driver and something we're going to keep focusing on. It also improves the staff experience of assessment: in a digital format they're not carrying around lots of exam scripts, and they can do really complex things around marking and so on. Back in 2018 we developed our education strategy, our five-year plan at that point, but we also had the TEL Roadmap. The TEL Roadmap was a £3.2 million investment in technology-enhanced learning, seven different strands of activity that were then embedded into the education strategy. I'm going to read out the strand that's linked to assessment and that drove this a little: "Develop and implement comprehensive approaches to online assessment, marking and feedback, making online summative assessment possible across a wide range of assessment types and enabling the wider adoption of electronic submission and marking of all appropriate student assignments." Obviously that second part, about wider adoption of electronic submission and marking of all appropriate student assignments, the pandemic really forced to happen, and thankfully we've not gone back to paper hand-ins. But this did give us a focus on digital exams. There's also the aim to increase the diversity and authenticity of our assessments. Authentic assessment, I'm sure, is something you're hearing a lot about at the minute, and we're not trying to suggest that a summative exam at the end of your module is a very authentic assessment.
But when you're using particular bits of software, and you're able to bring in industry-standard software that the students have been using all year and will be expected to use in employment, the fact that you can use that as part of your assessment makes it a lot more authentic than a handwritten essay would be. Then there's the management of assessment workflows. Similar to many of you, I'm sure, we've got lots and lots of different systems at the university, quite a few of them related to assessment. Being able to move data between those systems, especially when it's in electronic format, means professional services colleagues aren't inputting all those handwritten marks when they can just be taken straight across from the system, which is a really important factor. This is a question we've got just now: is AI going to increase the demand for in-person examinations? That worries me slightly. We don't want it to, and certainly the principles we developed aren't thinking down that route, but anecdotally I have heard of schools in the institution talking about getting students into a room because that way they can mitigate against AI. So certainly a concern. Okay, I'm going to go back now to the history of exams at Newcastle. Back in 2007 was when we ran our first digital exam. We were using Blackboard tests; we were using Firefox with Greasemonkey scripts to create a lockdown browser; and we were using central computer clusters, because we had to reboot a whole cluster to turn it into an exam room. Every project has an acronym; ours was called OLAF, for Online Assessment and Feedback. And there were really a couple of things that happened. We were concerned about the reliability of the essay question in the Blackboard test. Again, this is back in 2007, and it wasn't saving your work on a regular basis.
So if a student decided to do Ctrl+A and delete, we wouldn't be able to get the work back. I know that's unlikely, but it still meant we didn't run any essay questions, so most of the question types were things like multiple choice and multiple answer, and it ended up being mainly stage one students being assessed that way. The other issue we had was cluster capacity. By around 2013 we got to the point that, during the exam period, the clusters were full; we could not fit any more students in. So we really had an issue there. And on top of all this, in 2019, just as the pandemic hit, we decided to change our VLE from Blackboard to Canvas, so we had an extra complication. Okay, so I'm going to talk about a project we had called Diversifying Our Exam Provision, which really looked at the two issues I've just highlighted: the cluster capacity limiting the rooms, and the VLE limiting the range of question types we could run, in particular those essay-type questions. Part of the Roadmap funded a post to run a pilot in this area. Actually, I'll go back a slide there. For the cluster capacity, we looked at bring your own device (BYOD) and whether that was going to be possible to do, and to do at scale. For the VLE limiting the range of question types, we looked to get a dedicated system that would be more appropriate, let's say, for those exams. Outcomes of the pilot: this is about typed exams compared to paper exams, and this is the student feedback we got at the end of the exams, where we asked them to complete a quick survey. 79% of students said typing was either no worse than paper, or better. And looking at it from a BYOD point of view, again, 84% of students said it was either no worse, or better.
So really positive feedback from the students; this was a route we wanted to go down. We got some written feedback from the students as well. The top-left and the bottom-right comments certainly ring true with my experience of sitting exams: being able to keep it legible, being able to edit your essay, get the headings in, move things around. Really positive features. I thought the bottom-left one was really interesting, thinking about that cyclical process you have when you're writing, actually going back and being able to amend, and again that hints back to my experience as an undergraduate. And the fact that we can use accessibility software, and that some of the systems have accessibility features built in, really supports all our learners. Following the success of the pilot, we felt we would go to tender for a digital exam system. This is where Inspera were successful in our tender, and we have some of the Inspera team at the back here. The tendering happened a little later because of the pandemic; we weren't running any examinations during the pandemic. What we've found is that because we can now run these essay-type questions, we've had a rapid increase in the number of digital exams and in the demand for them. We've found that Inspera allows a wide range of question types, it allows typed essays, it really meets our requirements around things like BYOD, and you can allowlist software. We had our first exam this year that included multimedia, so we had students with headphones watching video as part of their examination. So a much wider range of exams that we're able to run using this. There have been some challenges, though, I would say.
Not just with the software, to be fair: there are some issues with the software, some issues with our approach to the implementation, and some technical issues that occur at Newcastle but not elsewhere. I'm going to go through those in a bit more detail. The issues within the Inspera software are really about functionality, especially where academic areas have been using other software and then moved to our centrally supported system. To give an example, in the medical sciences faculty we've got schools that have been using Speedwell, and one of the things Speedwell does really well is the statistical analysis after the exam. Inspera doesn't do that as well as Speedwell, and academics have voiced their frustration with that. So we're working with Inspera to develop that, and also looking at things like Excel templates or Power BI to support it. Issues with our approach to implementation: an example that I think falls into this category is a call we got one day from one of the schools, I'll not mention which, two days before they were going to run a mock exam with students. They had scheduled a bring-your-own-device exam in a tiered lecture theatre, the first time they'd ever run a digital exam in the school. Normally what we would want to do is give our information to students, advise them what they need to install on their machines, and give them a mock test they can go through on their own machine to check everything works. We weren't able to do any of that, and we ended up with lots of students having issues; and in a tiered lecture theatre you can't easily help them without asking everybody else to stand up so they can come down to the very front. So that's what I mean by our approach to implementation: in certain aspects it could certainly be improved. And the last one is technical issues at Newcastle but not elsewhere.
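As an aside, the post-exam item statistics mentioned above are usually the classic facility and discrimination indices, and rebuilding them outside a dedicated system is only a few lines. This is a minimal illustrative sketch, assuming simple 0/1 item scores; it is not taken from Speedwell, Inspera, or any real export format.

```python
# Hedged sketch of classical item analysis: facility index and a
# point-biserial-style discrimination index. All data is illustrative.
from statistics import mean, pstdev

def facility(item_scores):
    """Facility index: mean score on the item (for 0/1 items, the
    proportion of candidates who answered correctly)."""
    return mean(item_scores)

def discrimination(item_scores, total_scores):
    """Discrimination: Pearson correlation between the item score and
    the rest-of-test score (total minus this item), so a strong item
    correlates with performance on the rest of the paper."""
    rest = [t - i for t, i in zip(total_scores, item_scores)]
    mi, mr = mean(item_scores), mean(rest)
    si, sr = pstdev(item_scores), pstdev(rest)
    if si == 0 or sr == 0:
        return 0.0  # constant column: correlation is undefined
    cov = mean((i - mi) * (r - mr) for i, r in zip(item_scores, rest))
    return cov / (si * sr)

# Illustrative 0/1 responses for one item across six candidates,
# plus each candidate's total exam score.
item = [1, 1, 0, 1, 0, 1]
totals = [9, 8, 3, 7, 4, 10]
print(round(facility(item), 2))              # → 0.67
print(round(discrimination(item, totals), 2))  # → 0.89
```

In practice this would run over a marks export per question, which is the kind of thing an Excel template or Power BI report can also do.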
So, we looked at remote proctoring. While we weren't wanting to do remote proctoring unless there was a particular professional body requirement, we felt we needed something in place in case we ended up going back into another lockdown. Now, with the remote proctoring solution Inspera have got, we need to use a browser called, I'm going to say, Inspera Exam Portal; I'm looking at the back, and I see a thumbs up from Harvey. So yes, Inspera Exam Portal: we haven't got that working with our single sign-on. It just hasn't worked, but it does work at other institutions, so we don't know why, and they're working on that just now. Okay, so: challenges and solutions, and this is where we're at now. This is the current demand for digital exams at Newcastle. I've just had a message this morning from Terry to say we've had the numbers in for next year, and it's the same percentage increase again. What we've got, though, is no extra resource to deal with all this. This is really the big thing just now. We've got two and a half FTE within the service that looks after exams, and during peak times other people in the team help out, but it puts so much pressure on those two and a half FTE. One of the things we're doing just now is a strategic review of exams as a whole, and our change team within the university is doing that, so it's outside our department. They did a mapping exercise. You don't need to read the text on all of this, but it's a Miro board: September through to August, and these are all the different activities that need to be done for digital exams. The yellow, or amber, is what the academics need to do; the green there is what our exams office does; and, three guesses, the pink is what we do. So you've got two and a half members of staff doing all that, and it's just causing a lot of pressure and a lot of concern.
The other concerns I have about the pressure come in when I look at the challenges here. We've got members of staff within the institution who are absolute experts in assessment. They are brilliant, but they're spending all their time on process; that's all they're doing. They don't have time to go out to the schools and talk about assessment, best practice, and so on. It's really frustrating, and they're frustrated as employees, because they're not able to go and share the expertise they have. Another concern of mine, and it doesn't seem to concern as many other people, is that we've got lots of areas of the university responsible for digital exams. Our exams office looks after paper exams, but they also look after the scheduling of digital exams and the invigilator recruitment for digital exams. We've got Estates involved with our BYOD exams; we've got NUIT involved with our BYOD exams. So a whole load of areas of the university are involved in these exams, and that concerns me slightly. Because of the small resource we've got, and with the increase in exams, I'm really concerned about pressure leading to errors. If a student is sitting an exam and hits an error that we've created through our processes or our resource constraints, that has a huge impact on their student experience, and that is really concerning. Thankfully, we haven't had any major errors yet; touch wood. And also colleague experience: because we're under so much pressure within the team with the demand we've got, we're not able to spend as much time with colleagues as we would like. If they're not feeling they're getting a great service from us, there's a reputational impact for my department, but also for digital exams as a whole, and they may go back to paper based on that. So that's a worry as well. BYOD is a consistent challenge.
We use about four or five rooms at the minute, but we've got to make sure there's enough power. So we've been working with Estates and NUIT, looking at things like power banks that can be plugged into laptops, and we've got laptops we can swap with a student's own if we need to. Wi-Fi is a big issue across the campus: some areas of campus are great, some not so great. We've got a big network infrastructure rebuild happening just now, so again we've got to work really closely with other departments to make sure that works well. Looking at our solutions: we're currently doing a strategic review of exams. I was expecting something in August, but I'm assuming it'll be September now, with leave. What that leads to, I don't know. It's not going to lead to any new resource: with the financial pressures across the university at the minute, we're not able to get more people into the team. So I'm interested to see what it actually comes out with. We have a new education strategy due in January. While that strategy won't mention digital exams specifically, it will talk about assessment as a whole, and so that might lead us to do something around the exam process. And lastly, a more operational one, and something I think has been a real success: our digital exam support assistants, and employing students at peak times more generally. The digital exam support assistants really came in for bring-your-own-device exams. We were finding that the invigilators we employ as standard for our normal exams tended to be retired school teachers, who may not have the digital literacy (I don't want to generalise, because some did) to support any errors students were having with their own devices.
The other problem was that if they were trying to solve an error with a device, they weren't able to keep an eye on the room, and their job really is to make sure the exam runs properly and is rigorous. So we employ PhD students as digital exam support assistants. They have to have a level of technical ability, and they go into the rooms, support the invigilators, and deal with any technical issues that come up in BYOD exams. The other thing we do is have PhD students come in and help the team with the process elements around exams, so they'll do things like checking settings and release dates, all the sorts of things we do as part of our process. It's been really successful; the students are fantastic and dedicated, and they do a great job. I've got real concerns about it, though. My concern is that we're getting people in to help out on a short-term basis; it'll be a new person next time, and while it's been successful, it's a bit of a sticking plaster. I don't feel it's a long-term solution to the problems we've got. Okay, so that's really a summary of where we are with digital exams. Probably not as many solutions as I would like to give you, but if we maybe spoke about this in a year's time, I hope there would be. Happy to take any questions. Just a question about whether the majority of the digital exams are in prose and essay-type subjects, and whether you've looked at how this works for something like maths. Yeah, well, for maths we developed a tool at Newcastle, and it's open source now, so I'm sure some of you use it: a tool called Numbas. Numbas is used for all our mathematics exams; we don't really do anything through Inspera for those. We do have areas like computing using Inspera exams, but just not for those particular mathematical elements.
When you do need some element of handwriting, Inspera has a bit of functionality called Inspera Scan: students get given a piece of paper, it's all scanned in at the end, digitised, and linked to their particular submission. Anybody else? Yes, we have a certain number of university laptops, so if a device stops working we can swap a laptop across, and once the student logs in, the exam will pick up exactly where they were. If they lose the Wi-Fi connection, the software will still run, and when they reconnect it will upload again. So there's a lot of functionality built into the software, as well as what we're doing in the rooms. Just a follow-up question: what sort of proportion of students find they need an additional laptop in the exam room? A small number? Yeah, it does tend to be; I don't have figures, but it is a small number. The problem we've got with that is that the more it grows, and the more venues we're running BYOD in, the more laptops we have to get out to those venues. So we've got to work really closely with NUIT. Other issues we've had, for example: our suite of laptops is only used during exam periods, so when they're off the network for a certain length of time they drop off, you can't reconnect them, and they can't get updates. So we've really got to work with NUIT to make sure they're regularly switched on and updated, and then brought out to the venues. The more BYOD we do, the more of that sort of logistical challenge we're going to have. Richard? Could you say something about assistive technology and how you deal with student needs in that respect? Yeah, there's certainly some functionality within Inspera that allows us to do some accessibility things.
Allowlisting of software lets us use some of the assistive software as well. There have been circumstances where we've just had to turn the lockdown off in the system, and have an invigilator with that student, almost like an open-book exam type of thing. So who coordinates that and anticipates student needs? Some students declare and have support plans, and that's dealt with, but particularly for postgraduate international students, of whom we receive a lot, they often don't arrive having declared, and it's only a short time before the exam that you can get around their challenges with the software they're using. So how is that all coordinated? We are led by the exams office; the registration of students against a particular exam, and their requirements, sits there. Really, it's an assumption on my part that they're coordinating with the student wellbeing service and have those things in place. It's not something that has been raised with me as a major issue, but yes, it could go undetected. A question about, I guess, more the beginning of the process you were talking about: obviously you ended up with Inspera, but did you look into any other platforms? Could you tell us about that? Absolutely. When we did the pilot, the pilot wasn't using Inspera; we used WISEflow for the pilot. And the pilot really wasn't about the system; it was about proving that we could actually do what we needed to do. One thing with WISEflow that students really didn't like, and I don't know if it's still the same with the software or not, but back then you could have a multiple-choice section and an essay question, but you had to do the multiple choice first, finish it, submit it, and then do your essay question. You couldn't go back into your multiple-choice questions afterwards, and that's something the students felt.
If you're doing a part A and a part B, you might be working on part B and think, hold on, my answer to part A is wrong, I'll go back and change that; but they couldn't do that with that software. The tendering, though, was a very robust process that we went through, with student input as well as staff, as well as ourselves, and Inspera came out the clear winner. You talked about the academic group that gave you two days' notice before their exam. Do you want to talk a little bit about that? Because I can imagine the students' anxiety going up if they weren't prepared. How did you go about helping them understand better? I can try. I think that was just one example of the sort of implementation issues we've got. In that particular scenario, as well as the students being frustrated, we were really frustrated, because we weren't able to give the support we would normally want to give. Following that, we've really worked with deans, worked with faculties, and so on, to make sure people are aware. If it had been an end-of-year summative exam, it would have come through the exams office, so we would have known about it well in advance; the fact was it was a mock exam they had organised themselves within the school, and that's really the scenario we had. So it's really just a communication thing, trying to make sure schools are aware and on board with these things. Just one other one: with the new strategy coming in January, and all these different groups relying on each other in this exam space, is that something where you'll be asked to work more closely together? We do work really closely together already; I'm just really worried about it being a whole load of different groups. I think the education strategy will be broader.
It will talk about what we want to do as a university around assessment, and I imagine things like authentic assessment will be in there, for example. I don't think it will go into the detail of what we're doing around exams as such. One or two more minutes if anyone wants to stay and ask their questions, but thank you very much for having me. Thank you; we'll see you later.