Hi folks, welcome to this EDEN webinar. As we're waiting for people to come in, I see about 40 people in so far. Please introduce yourself in the chat and tell us where you're from and what the weather is like. In Dublin today, I can report, it's terrible. So I only want to hear about nice weather, if possible, please. We're delighted to welcome you to this webinar about changing assessment due to COVID-19: experiences and impact. And we have some fantastic speakers lined up, who I'll introduce in just a moment. I'm just going to allow a few more people to join. Salzburg. Croatia, I'd much rather be there, I can tell you. So I'm just going to share my screen. Just a few things about the NAP. If you're new to this group or this webinar, it's the EDEN Network of Academics and Professionals. We do things like run networking events, collaboration, and also webinars like this. And occasionally we also do tweet chats called EDEN Chats. So this is the steering committee, just to briefly introduce us, and also just to tell you a little bit about our speakers. The first speaker is going to be Dr. Monica Ward, followed by Professor Stylianos Hatzipanagos from the University of London, presenting with Dr. Linda Amrane-Cooper, also from the University of London, and followed by Dr. Inés Gil-Jaurena from UNED in Madrid. So I'm going to hand over to our first speaker, Dr. Monica Ward. Monica, if you want to start getting your slides ready, I'll read your bio. Monica is an Associate Professor and Assistant Head for Teaching Excellence in the School of Computing at Dublin City University, and a colleague of mine. She has extensive experience in teaching and assessment in a range of subjects, from technical to transversal skills. She is a pioneer in the use of technology in education. She advocates for co-creation and culturally responsive approaches with academics and students. Over to you, Monica. Grand, everybody, thanks for coming along today.
It's terrible having to listen to your own bio, but there you go. So Orna has said who I am, so don't believe the hype as we move through this presentation. First of all, just to give you a very brief overview of what I'm going to be talking about: I'm going to look at academic integrity and come up with a checklist for alternative assessment, and then we're going to look at different types of alternative assessment that we might have, and some samples and tips. I'll try to keep an eye on the chat as we go along as well. I like to keep things as interactive as possible, though time is limited. So the first thing about academic integrity: it's what we all want in our assignments and assessment. When we had to do the COVID pivot last year, it really sharpened the mind about what we want in terms of academic integrity, but I don't think we should just park it there and say that was only to do with COVID. I think we should bring these lessons with us as we go along. Obviously we want our exams to be valid. We want to assess the students to make sure they have the knowledge, but we have to be fair to honest students. We want to make sure that those who do things honestly and correctly and don't cheat are looked after, okay? Then, in terms of open book, online, non-invigilated exams, there are a lot of concerns. Academics are all worried about academic integrity. Will the students copy from one another? Will they share their answers during the exam? Will they get the information from somewhere online? We're all worried about that, and we're all worried about the temptation for students. When they're in an exam hall and invigilated, so invigilated means people walking up and down and checking that they're not copying from each other, it's a lot easier to check that they're doing what they're meant to be doing.
Okay, so when we had to do the pivot, we were moving away from a lot of the exams that we generally have, where the easiest thing is a definition question. So: define rainy weather, or define what an assessment is, or define Bloom's taxonomy. It's really easy for the students; they just memorize their definitions, and we have no idea whether they actually understand the words that they're regurgitating, right? So part of the move is to go from the lower orders of Bloom's taxonomy, from knowledge and remembering and comprehension and understanding, to applying and analyzing. Okay, here's the framework for doing something, now apply it and show us your understanding. It can be harder to set exams because you have to think of more challenging questions, and if the students have only been doing regurgitation or memorizing stuff to date, it can be harder for them, okay? But I would suggest that it's actually more authentic. In the real world, so my world is programming, and sometimes our programming exams are: okay, you have access to nothing except pen and paper, and you have to write a computer program. There's no way in the real world that you'd be asked to do that. If you're asked to write a program, you will always have access online to previous programs and sample bits of code, and you weave them together to make things work. So I think moving to an open book exam is much more real world. Now, I do accept there are some subjects where it might be slightly trickier to do, but I do think it is a step in the right direction. Some other benefits of the digital space: the students can use digital tools, which is what they normally use in their general semester. They're typing up documents in some sort of word processor, or they're using spreadsheets, and then all of a sudden in an exam situation we don't want them to be using word processors. That seems mad to me, right?
In the real world, especially in my domain, but I'd argue in many domains, you are using technology as you go along. And I don't know about your students, but my students hardly ever write anything by hand. Sometimes even in a lecture, they're typing on the computer; they're not actually writing anything. And the other major advantage, in terms of marking, is you can actually read what they write. Sometimes with handwriting it's a guesstimate: are they saying this? So we came up with a checklist for alternative online assessment. We wanted to make sure that we were still checking the original learning outcomes, because we didn't want to go off assessing something that wasn't required, and we didn't want to miss any of the learning outcomes. We said that the new assessments had to have the same level of challenge as the original. We couldn't make them way harder just because they were open book, and we couldn't make them way easier either, because we couldn't stand over the assessment that way. This was a university-wide list, not just for the School of Computing students, but we wanted to minimize the new computer and technical skills required in order to complete the work. So for students who hadn't used a particular type of software, we didn't want them to suddenly have to use it as part of their exam. We also wanted to help mitigate the risks if the students couldn't submit to our online portal; our virtual learning environment is called Loop, which is just an instance of Moodle. So what would we do if the students had internet access issues? And then, at a broad level, we needed to consider it for different modules. There are different requirements if you have a technical module versus a non-technical module.
So if I'm testing the students' programming skills versus their knowledge of the Battle of X, Y, Z, those are very different things. For modules that have working-out bits, mathematical modules or programming modules, you want the students to show their workings, so that even if they end up with the wrong answer, you can actually see that they had the right process. And then some modules lend themselves to essay-type answers. Again, not so much in the School of Computing, but obviously a lot of things in the humanities domain would have those. Then we also needed to take into account the different stages of education. We obviously have different expectations for first years versus master's students, so we couldn't say this is the one strategy that fits all; we had to be able to cater for different learners. Okay, so the general approach we adopted was to ask probing questions. The main thing was we wanted to see if they could apply the knowledge rather than just remember it. So, for example: what are the five steps in design thinking? They would name whatever the five steps are. But now we assume that they have the five steps in a document in front of them, and we want to see: can they actually apply those five steps to a particular scenario? I would argue that's much more real world than getting them to remember the five steps, because if they're in a job, the manager isn't going to say, so, without looking, tell me what the five steps are. Rather: okay, here's our scenario, go and apply the five steps and come back to me. So we're assuming that the students will have access to their notes, and they will know in advance what's in the notes. If they've studied and they've been involved all along, that's fair, I think. And just to emphasize, we're interested in how they apply their knowledge, not just that they have it.
Okay, so we don't want this memorization, regurgitation thing. There are kind of two types of questions, and I'm just going to briefly touch on one type, which in my case would be software engineering type questions. The idea is that you set up a scenario and you ask the student problems: what approach would you recommend and why? So, for example, if Dublin Zoo wants to design a new IT system, what approach would you recommend and why? I'm assuming they can look through their notes and see that there are three different approaches: they could go with a prototype approach and develop from that, or go with an agile approach, or go with a different type of approach, whatever applies in software. And then: how would they implement it? Everybody's answer to the second part would depend on the first part, so their answers would be very different. It's not something that you can just take out of a book; you have to be able to understand the different frameworks and apply them. For my colleagues in computing, the habit would generally be to look at last year's questions and change the scenario a little bit, but more or less the wording would stay the same. So I came up with some general question stems to feed them ideas. Something like: what is the most important or effective X? They might know that there are three or four frameworks for doing something. Or if you take the Battle of X, Y, Z: what was the most important strategy they adopted? Or which method was best for fighting in the desert? Or how would you design a whatever? What changes would you make, given a particular scenario? What other information would you need? You're given this information; why would you need more to take things further? And a nice get-out-of-jail one is: could you explain your reasoning to us?
So: I went for framework A, why did you do that? I did it because it fitted the scenario, or whatever. Another thing that a lot of my colleagues found useful was smart quizzes. On Moodle, and probably other VLEs as well, you can have a bucket of questions. Say you have a hundred students; you don't want all hundred students to be getting the exact same questions. So you would have a bucket. For example, I could have a bucket about population with three questions in it: what is the population of Albania, China and Kenya? This is obviously a very bad example for an open book exam, but the idea is that you have three different questions and each student gets only one of them, at random. Again, for mathematical questions, you can have questions of similar difficulty, but each student, when they see their exam paper, gets a slightly different question. Another important thing is to randomize the order of questions. Say I have 20 questions on my paper; my question one is different from Orna's question one. My question one might be Orna's question 17, but because Orna's under time pressure, and I am as well, and my question one is China and hers is Albania, we don't really have time to copy from each other, okay? Another important thing: feedback for students is great, but in a summative assessment, obviously we don't want Orna to finish early and then say, hey Monica, these are the answers to the questions. So make sure that you postpone feedback until everyone has finished. Other colleagues wanted the option of a multiple choice question where you then have to flesh out the answer. So a question might be: what is the standard deviation for avocado production in Chile?
If the production values for the last X years are such-and-such: you show the students a piece of data, they say, oh, it's whatever, and then you get the student to do some sort of analysis piece. So it's combining the multiple choice with a text piece, which explores their learning. You have to assume students have access to calculators and anything that's online, okay? So the things that you can do: you can get them to fill out workings in a step-by-step approach, and I'll show you an example in a minute. You can force sequencing, so they can't answer question two until they've answered question one, and they can only submit once. So if Orna and I have both answered question one, we can't go back and change our answers. It depends on how strict you want to be. And you can also enforce a time limit to make things stricter. So this is an example; don't worry if it looks confusing. What I have here is an equation that you have to solve. I have all seven steps here, but the students have to decide which order you actually do them in, okay? It gets around the problem of students doing something in a different order: I want them to clearly see this is step one, this is step two, this is step three. Just showing some examples here. This is another example: you know things in computing are either one or zero, so in this scenario the student has to fill into each box whether it's a one or a zero, and each of these has a small amount of marks behind it. This is another one here, where I want them to go through each of these different steps. Don't worry what I'm doing there, but you can imagine some kind of mathematical calculation, and for each of these things they have to fill in a piece of information. So it limits what they do, which makes it easier to mark.
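The bucket-and-shuffle setup described above (one random question per bucket, questions served in shuffled order) can be sketched in a few lines of Python. This is a hypothetical illustration of the idea, not Moodle's actual implementation; the question text and seed scheme are invented:

```python
import random

# Hypothetical question banks: each "bucket" holds interchangeable
# questions of similar difficulty; each student gets one per bucket.
banks = [
    ["Population of Albania?", "Population of China?", "Population of Kenya?"],
    ["Solve 2x + 3 = 11", "Solve 3x - 4 = 8", "Solve 5x + 1 = 16"],
]

def build_paper(student_id: str, seed: int = 2021) -> list:
    """Pick one question at random from each bank, then shuffle the
    question order, so two students sitting side by side are unlikely
    to see the same question at the same position."""
    rng = random.Random(f"{seed}-{student_id}")  # reproducible per student
    paper = [rng.choice(bank) for bank in banks]
    rng.shuffle(paper)
    return paper

print(build_paper("monica"))
print(build_paper("orna"))  # usually differs in selection and order
```

Seeding the generator per student makes the paper reproducible, which matters if you ever need to reconstruct exactly which variant a student saw.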
So, to look back again at academic integrity: obviously it's a lot harder for non-invigilated exams. If somebody's not watching over the student, it's harder to ensure that we are being academically rigorous. There is plagiarism detection software on most platforms; Moodle has plugins that we use to check for that. And I think this is harder for introductory level courses. If you ask students something like two plus two equals four, there are not so many different ways that they can actually do that calculation, and because they're introductory, there's not a lot you can ask them. I think it's particularly challenging for programming and mathematics related courses. So the approach should be to design your questions smartly, making sure that you're checking for understanding rather than just regurgitation of information. And you want to check the students' analysis of concepts. So you could have quite a big question, and your multiple choice answers can be nearly the same, but with subtle differences between them. Just some tips to remember. Ask questions that ask students to apply their knowledge, not just check their knowledge. You could relate it to project work: say Orna did a project on the Botanic Gardens and I did mine on the zoo. Orna will only answer in terms of the Botanic Gardens, and I can't copy from her because her answers are different from mine. You can use different variants of the same question. There's a tool on Moodle where you can import questions, and you can actually randomly generate the elements of the questions with a little bit of coding and smart stuff. So instead of one plus two, it's seven plus nine, and so on. Some of my colleagues used data sets. So Orna would get the data set for flowers.
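The "little bit of coding" to generate question variants could look like the following sketch, which writes arithmetic variants in Moodle's GIFT import format. The category name and number ranges here are invented for illustration; only the GIFT syntax itself (`::name::`, `{#answer}` for a numerical answer) follows the real import format:

```python
import random

def gift_variants(n: int, seed: int = 42) -> str:
    """Generate n addition questions with random operands in GIFT
    format, which Moodle can import into a question bank. The quiz
    then serves each student one variant at random."""
    rng = random.Random(seed)
    lines = ["$CATEGORY: exam2021/arithmetic"]  # hypothetical category
    for i in range(n):
        a, b = rng.randint(2, 20), rng.randint(2, 20)
        # ::name:: question text {#numerical answer}
        lines.append(f"::add-{i}::What is {a} + {b}? {{#{a + b}}}")
    return "\n".join(lines)

print(gift_variants(3))
```

The same pattern extends to per-student data sets: generate one file per variant, import the lot, and let the quiz pick at random.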
I'd get the one for trees, somebody else would get the one for fruits, so there's no point in me copying. I actually did detect some things, unfortunately: a student gave the answers for flowers when she actually had the data for fruits, so it didn't make sense at all. And always reserve the right to discuss with students after the exam, to check if anything suspicious happened. I'm conscious of time, Orna, so this is my last slide, I think. I would say overall there were positive outcomes. We encouraged academics to think outside the box in terms of how they assess students. Some of them got students to do videos, we had online meetings, we can use interactive orals: just different ways of assessing the students instead of the normal two-hour paper-based exam. It also encouraged academics to reconsider their assessment practices, and some of us won't go back to what we did before, because what we have now is so much better. But some are dying to go back to what they had before. In terms of the digital transformation in education, sometimes we had to drag people kicking and screaming to use our VLE, and now they're all on board and doing assessment stuff that they never dreamed they would use in a million years. So I think that's it. Yeah. And so I will stop sharing. Thank you very much, Monica. Very interesting; I think the use of quizzes is very, very interesting. We have two questions for you before we move on to the next speaker. The first is from Cesar: I'd like to know your approach on the impersonation problem, especially with second chance or extraordinary tests, where some students have already passed and could be helping, or even doing the exam of a colleague who failed the first attempt. So that's the first one. Okay, I'll take that first. In our university, all the students have to sit the test on whatever day it is.
And if they don't make that, they have a chance to resit the exam in August. So we don't have a scenario where Orna does the test today and I have a sniffle and say, oh, I can't possibly do it today, I'll do it tomorrow. We don't have a facility to do that. We did put extra things in place if the student had extenuating circumstances, particularly in March to May last year, when students had connectivity problems and whatever. But in general, we don't have that scenario, so that's interesting that you have that there, Cesar. And one more: what are the reasons to force sequential quiz navigation? In presential exams, students are able to switch between questions as they like. Okay, so I'm not too familiar with the term presential exams. Me neither. Okay, so in some subjects, some colleagues did this either for individual questions, so you could only move on to the next question once you'd finished question one, or for sections. So I have all my MCQ questions first, and once you finish the MCQs you move on to the more text-based answers, but you can't go back and change your MCQs. Okay, so this is for integrity reasons? Yeah, yeah, it's for integrity reasons. Okay, I think that's very good. Thank you, Monica. We'll take more questions at the end, but thank you very much. And we're going to move on now to our next speaker, Professor Stylianos Hatzipanagos. Apologies for butchering your name. Stylianos works at the University of London Centre for Distance Education, where he's a fellow and executive co-lead for research and dissemination. His expertise is in technology-enhanced learning, research-informed innovation of academic practice, and doctoral and postgraduate education management.
His research and scholarship include learning design, evaluation of online learning environments, formative and technology-enhanced assessment, computer-supported collaborative work, digital literacy, and social media and social networks in an educational context. Wow, that's a lot of research areas. And he's co-presenting with his colleague, Dr. Linda Amrane-Cooper, who I'll also introduce. Linda is head of the University of London's Centre for Distance Education. She leads a team that supports the development of expertise in the field of distance education, providing a focus for the development of high quality teaching and research in open and distance learning. Linda is also in the senior leadership team as director of strategic projects, and leads the PG Learning and Teaching program at UOL. So welcome to both of you, and I'll hand over to you. Thank you, thank you, Orna. Linda is sharing her screen. Okay, right. Hello everybody. Linda and I will share this slide presentation. From the title, you will have understood that we are going to talk about the evaluation of the move to online assessment at the University of London. Just to start with the overall context, and I think this will give you a good idea about what we will be talking about: in late March 2020, the university communicated to all the students that the examinations in 2020 would have to move online, because conventional examinations, which would normally take place at exam centers around the world, would not be possible due to the pandemic. So 38,000 students from the University of London, distance learners, were registered to take over 100,000 exams in the summer, across 23 time zones. That's the scale of the evaluation we are going to talk about. Linda, can you please move to the next slide? So, very briefly, the structure of the talk: we are going to cover the purpose of the study and some theoretical background, then Linda is going to talk about the methodology and the student voice.
I will touch on the examiners' feedback and the program directors' views. Then Linda will talk about some measures and resources to support students to succeed in the next round of assessment in 2021, and she will present the conclusions as well. So the purpose of the evaluation was to provide a very detailed evaluation of what happened last summer with this move to online assessment, to identify the experience of the key stakeholder groups, and also to identify implications for the future of digital assessment at the University of London. And finally, very importantly, it had to deliver lessons learned to support the preparation for the summer 2021 assessment. Right. So what I'm going to do here is summarize some of the key outcomes of the theoretical investigation we engaged with when we started this project. I'm glad that Monica covered some of this, in relation to academic integrity, for instance. The research we looked at was what you might call pre-pandemic, but interestingly things moved very fast, because there was a frenetic pace of academics moving their programs online, and also a lot of debates, discussions and declarations about what this meant for pedagogy. So the recent debates on assessment in the higher education sector revolved around changes and uncertainty about the future. They investigated mainly whether 2020 marked the beginning of the end for in-person, fixed-time, paper-based assessment. Quite a lot of debate was around this area: how permanent would these changes be, this transition to online learning? Or, as Monica mentioned, would some academics, when the unprecedented circumstances of the pandemic were over, move back to what they did before, for instance paper-based exams for their students?
Recent research has also explored the relationship between students' performance and their preferences when they use online and offline assessments, and how to improve digital assessment practices, student motivation and engagement. Overall, previous research, not research that has taken place during this pandemic, identified that students were comfortable with online assessment. They like online assessments. They like the timeliness of the feedback they might receive with online assessment. They find it quite convenient as well, sometimes because of costs, sometimes because of issues of accessibility. The other very important aspect is that the shift to online assessment, and employing online invigilation, what we sometimes call proctoring systems, has generated, as Monica mentioned, quite a few debates on academic integrity. In this respect, there seem to be two dominant threads in such debates, which are far from complementary. What I mean by that is there is one thread that involves promoting creative design of authentic assessment; the emphasis here is on authenticity, and on clear guidelines to students about expectations around referencing and plagiarism, in order to avoid plagiarism. The other thread is more techie, if I can use this expression: it provides technological and practical safeguards to protect academic integrity, such as moderation of marking, text matching software, and the use of other mechanisms, for instance vivas, to verify student academic work. So there is an overall issue about equity of access. Also, some researchers talk about an overall drop in achievement when students move to online assessment, but there's no definitive outcome or agreement in published research about that.
And also there is, as I just mentioned, this overall debate about assessment offences and how you can manage academic integrity, how you can create awareness of plagiarism issues with the students in order to avoid these academic offences, which are very serious issues. The other thing I need to mention is that for these online exams, there were three exam formats, and that was very important because they differed in the submission window. There was a short submission window of up to five hours, an intermediate submission window of five to 48 hours, and finally a much wider submission window, resulting in a long submission period of up to seven days. Over to you, Linda. Thanks, Stylianos. So just to go back: the University of London has 50,000 students who are studying online and at a distance with us, and about 35 to 38,000 were due to take exams last May through June. So obviously we had to move those all online. They would normally have taken pen and paper exams in exam centers local to where they live, and they're located in about 190 countries, so it was obviously going to be a really big change. The evaluation that we undertook then really considered the four pillars that are on your screen here. So, student behaviors: we were looking at how students behaved. Did they do their exam, which was one of the important questions about whether we were providing that opportunity? How did they engage with the virtual learning environment where they were going to get their exams from? We had some programs that were going to use a separate platform with a proctoring process. That was not successful, and I'll just detail that while we're here, in the sense that students in locations with less established bandwidth were not able to interact with the system effectively.
And it wasn't as well developed as we thought it was going to be, so we pulled out of proctoring entirely. There were also questions about how students submitted their answers, and I know that there's already been some discussion of that. So there was that student behavior section that we were looking at. Then there was student sentiment. We sent a survey out to that 35 to 38,000 student body two weeks after their set of exams finished, so before their marks were confirmed. We also undertook interviews, and we had student research fellows working with us on the project as part of the interviewing team. We considered student outcomes and that question of grade changes: were there higher average marks or lower average marks? We were very keen to know whether we had disadvantaged our students badly. Could they not do the exam, and did they do less well than they should have done in this process? We looked back over four years and considered the distribution of awards as well as individual module marks, or units of study marks. And then we did a deep dive into the operational issues, which surfaced some of the discussion that's already gone on about assessment offences, integrity, and the choices that were made by the program teams. So we were looking at about 100,000 exams across about 120 different academic pathways and programs: a huge variety there, computing, English, philosophy, law, undergraduate and postgraduate. So a really big variety of programs. We were also able to look at factors including location, with students, as I said, in 190 countries, plus program, gender, age, special exam arrangements and exam type. So just a brief bit, oops, sorry, about the responses. We got about 8,500 responses from the body that we sent the survey out to. So this is the student body now responding: a pretty even distribution of genders.
You'll see ages are fairly well distributed in relation to our demographic pattern, which is that we have slightly older students than on-campus provision. Study level: undergraduate and postgraduate. Mode of study: teaching center or independent. So we have two models of student experience: students can either study entirely online, working through the VLE, or they can attend a local recognized teaching center that's part of our provision. Students responding were evenly split on those. And then we were also able to get feedback from students who were taking those shorter exams, where the submission was between one hour and five hours. That varied by paper: it was the same for everybody taking the same paper, but different in different modules. Then the medium exams, which were up to 48 hours. And then we had just one program with a small cohort that did the seven day return. I can't go into all of the details now, and as you can imagine, there was a lot there, but what I've done here is try to summarize the impact on our students. The first thing is that 93% of exam events took place. So 93% of the exams that we thought were going to happen, that students were booked in for, took place. That's actually higher than our normal level, which is usually around 89%. So more students did more of their exams. In terms of locations, students in all locations engaged with exams. That's not to say that every single student was able to engage with their exams; of course, there were some individual circumstances, but overall there were no countries or locations where nobody could get to their exam. In the feedback, 79% of them agreed that they were able to demonstrate their learning through the online assessment, which we felt was a pretty good outcome.
Thinking back to what it was like in March, when we were setting up these exams and we were all working from our kitchen tables, as our students were too. Some students were not fully equipped to engage with the format and the requirements of their exams, so there were implications for training that we'll pick up a bit later on. Some of the things there were: if they had a 24 or 48 hour exam, what did that actually mean in terms of how much time they spent on it? Some students found that quite stressful; if they thought they had 48 hours, they felt they needed to be working on the exam for 48 hours. It also meant that things like word counts were quite important, because if somebody's working on a response for 48 hours, they can write an awful lot of words, so we needed to be careful about word counts. In communications, there were opportunities for us to have enhanced communication. I mean, we weren't even managing to look like swans, because we had 100,000 exams to deliver very quickly in a massive change like this, so there was some improvement that we could make. We did move the exams back a bit in time, so they didn't start exactly when they were due to start. That did have some negative impact on students with jobs or family commitments who had to make some rearrangements. Assessment offences: we had a higher number of assessment offences referred for consideration, and that bogged the system down a bit. It meant that we were later: we'd moved the exams, we then had more exam offences, which meant the marks couldn't be released quite as quickly as normal, so it did have a knock-on effect into this academic year. So that's the sort of impact on students. Overall, they could do their exams and they felt they were able to continue with them.
Much of the open text comment was very positive about the experience: how relieved they were to be able to do the exams, how much they appreciated having the opportunity to do their exams given the circumstances, and that some of the stress was gone, the stress of having to go into an exam centre under COVID conditions, but also the usual stress of going into exam centres. There was also some feedback to suggest that it was difficult to find a space, broadband was an issue, those kinds of things. Bearing in mind that all of our students were online students, they all had to have computers and some sort of connection; that's an entry requirement for us. So it did reduce some of those digital inequalities for our student cohort. I'm going to pass on to Stylianos, who's going to pick up on the results of the examiner survey. I think, Stylianos, you're just reporting on one or two findings there, aren't you? Stylianos, are you able to come back in? You might be on mute. Yeah, I was muted, exactly. Okay, as usual, very common. Right, so I will pick out only a couple of very important issues from the examiner survey that have a direct impact on the student aspect of the evaluation. The slide you're looking at was question four of the survey, which asked examiners how this move to online assessment has affected student performance. Very interestingly, and very positively, the examiners said that students had been able to achieve higher academic standards in their submitted work than in previous years, which I think was a very encouraging outcome for this kind of evaluation. So the examiners were more positive about the transition than negative. There were some other important issues there as well. For instance, in the context of the paper-based exams that Linda was talking about, students were submitting handwritten exams on paper, whereas now all the exams were online.
So in typed exams, the legibility was much appreciated by the examiners. The other group of stakeholders I'm going to look at very briefly is the program directors. Their views are very important because the program directors belong to, or lead, the program teams that set the exams; Monica talked quite a lot about this, and the format of the exams was very interesting for me to hear about. In terms of the program directors' views, there was a range of views on online assessment for the future and on the planned transformation of assessment, which has been accelerated by this pandemic. A very important outcome of the evaluation, I think, was that for 2020 exams remained the predominant assessment format, rather than moving to alternative formats such as coursework, which did happen in quite a few programs, but overall exams remained predominant. The reasons given in the program directors' views were the perceived academic rigour of the assessment process, the recognition of the exams by professional bodies and regional regulators, and the impact a radical change might have on confidence in the quality of degrees. In terms of pedagogy, it is important to mention that there was some rethinking of the end-of-module assessment pedagogy. Among quite a few of the program directors we talked with, there was this adoption of alternative forms, for instance coursework, to complement or replace the exam. Overall, there were quite a lot of good ideas in the interviews about this shift from viewing the exams mainly as a measurement of learning to seeing the potential of these exams, by transforming the exam content, for assessment for learning. So the exam content for 2021 would move to open book exams.
And in the interviews I talked about in the research, there was an emphasis on redesigning assessment rather than implementing huge techy infrastructure that would monitor students. Over to you, Linda. You're on mute, Linda. Sorry. Obviously, the University of London haven't got the hang of this at all. Right. So definitely some real changes, I think, from that emergency 2020 move online to now preparing for this summer's online assessment. So again, about 35,000 students and about 100,000 exams; they started yesterday, with about 7,000 students assessed in the last couple of days. So we're moving forward with that. We spent a lot of time thinking about the findings from the evaluation and what they meant for planning for this summer, and these are the sorts of things we needed to work at. So communication, and having some plans around that: FAQs, lots of webinars, really clear signposting and paperwork. Well-being, student well-being, that was really, really significant, and that's not just around the question of assessment but around the question of living in a time of pandemic, I guess. We're all dealing with these issues of student well-being and supporting them, but also motivation and peer-to-peer interactions and the ways in which peers can work with each other, so we've developed quite a lot there. Going back to that student behaviour I was talking about, students not necessarily knowing what you do with a 48-hour exam, for example: starting to support students and our colleagues to explore how you behave. We're used to going into an exam centre and knowing you've got to put your bag at the side, and you can't have your phone, and you can't have written anything up your arm, all of that. Most people have had that experience growing up, going into unseen written exams as a child, but now we need to develop all the behaviours that go around an online exam, as well as obviously responding effectively to it.
Enhancing technology, making sure that our VLEs weren't going to crash and that the systems were well signposted, those were important things. Changes to our general regulations, and really troubleshooting around the areas of academic integrity and assessment offences, were very important. And very importantly as well, understanding those students who would normally have special exam arrangements and how assistive technology can work there. We did quite a detailed study, which I haven't got time to talk about really, but it looked at our relatively large group of students who would normally have special exam arrangements and at understanding what needed to be different for them. So I'm just going to stop here really to say, this is the sort of resource that we produce. Students on the student portal have quizzes that they can take (these are not assessed quizzes) and resources and materials that help them to prepare for exams. You can't really see it here, but there are study tips and looking after yourself during the assessment period. There's a big piece on plagiarism and the rules for online timed assessments, what we expect around citations, given that we're talking about open book exams, which I think picks up on much of what's already been discussed today. So in conclusion then: we did it. We and our students did their assessments, they were able to demonstrate their learning, and overall there was a bit of an increase in average academic outcomes for our students, so they weren't disadvantaged unnecessarily globally there. Exam delivery parameters continue to vary to support different types of engagement, different types of student, different locations, and different subject disciplines, with variation within subjects too. The crisis has definitely accelerated the change and fostered a rethinking of assessment pedagogy, to the point where we're all exhausted, I suspect.
Definitely spending time talking about designing for training, and understanding that academic integrity issues are very important in this world of digital assessment. And then, to some extent, going back to redesigning our modules and our qualifications, going right back to the start: what are the learning outcomes that we're asking students to demonstrate through their constructively aligned assessments? So it's an exciting time. It's also, like everywhere, quite a challenging time at such a large scale. The references are in the slides, and I presume everybody will get the slides, so we'll stop there. Thank you very much, Linda and Stylianos. The scale is staggering; that really stands out to me. Yeah, we felt a bit knackered as well. I come from a similar background, but I don't have that many students. But actually we had similar problems, so very interesting. Thank you for sharing your experience. There are a few questions that we might actually leave till the end, if that's okay, and I might introduce the next speaker, which is Inés. So, Dr. Inés Gil-Jaurena, and apologies, Inés, if I'm brutalising your name; my Spanish pronunciation is also rubbish. Inés is an Eden fellow and a member of the Eden NAP committee. She's an associate professor at the Faculty of Education at UNED in Spain. She coordinates the Colab teaching innovation group at UNED, and her research in open distance education includes assessment and curriculum design. Over to you, Inés. Thank you, Orna. Thank you all for joining us this afternoon. I'm glad to see how, despite being in different places, we are coming to similar solutions and concerns. I will present some information from a study we have undertaken at UNED, which has some similarities with the one from the University of London Centre for Distance Education, and some of the guidelines my university has set are also common to the ones that Monica talked about at the beginning.
Well, this study I am going to present is not an institutional study; it was undertaken by some faculty, professors from my research group, and one of them, Daniel Dominguez, is also attending the webinar. Our concern at the beginning was to see what had happened in our own course. We teach a course together in the second semester, and we wanted to see what had happened in it when we had to change the assessment method very rapidly last year due to the lockdown and the whole pandemic situation. But then we decided to extend the study to the whole university at bachelor degree level. So, what happened last year? Well, first, my university, UNED, is a distance education university and the largest university in Spain, with more than 125,000 students registered last year, most of them part-time students. The demographics are similar to those the previous presenters have shown for their institutions: a very diverse population, different ages, different backgrounds, et cetera. The university offers 28 bachelor degrees, and this is the focus of our study. The focus is also on the final examination, which is the biggest part of the final score for our students. There is also some continuous assessment and some assignments in the courses, but the final exam is very relevant to the decision of whether they pass the course or not. Commonly, both before and after the pandemic, the characteristic of the final exams is that the faculty teams decide the type of exam for each course. So there are very diverse types of exams, with different durations of less than two hours, commonly between one and two hours. And all the examinations in a course take place at the same time, so they are synchronous: before, face to face in all the regional centres, and now, since June 2020, on the online examination platform. In this slide you can see the differences between the final examinations up to February 2020 and in June 2020.
And the focus of our study is on that semester, the first time the university used the online assessment platform for the final examinations. After that, we used it again in September last year and in February, and we will use it in June this year and also in September. We didn't really know how long the pandemic would last, but we are in a similar situation now, so we are still using it, and there are currently no face-to-face exams as there were before. The characteristics of the face-to-face exams were that they took place in different locations in Spain and also in other places, mainly in Europe and Latin America. Normally the examinations were without materials, and there were different control measures: students had to identify themselves before going into the classroom, and they were invigilated exams, with faculty supervising the classroom and the examination process. Also, the students were distributed in the classroom in a specific order, so that no two students from the same course were seated close to each other. And what happened in June 2020? Well, in a very short period of time, since March, the university had to set up a new system, because it was not possible to hold the face-to-face exams; some of the centres have very large numbers of students doing the exam at the same time, so it was not possible. So the university set up our own platform, called AVEX, which is our virtual examination classroom. In some courses in that call, and also now, materials were allowed, so they were more open book examinations, but not in all cases; most of the courses still had no materials allowed during the exam. And there were different control measures in the platform. One was identification.
The students had to enter with their own password, and also, as the other presenters said about their universities, the different exams had different questions: random questions selected from a bucket, or pool, of questions. There is also a camera shot of the students during the exam for identification, and an anti-plagiarism check, software to compare the students' responses to the exams. In this context, our interest was to see if there was any impact on students' performance when the examination changed to online, and also what the students' perceptions of the process were. For the academic performance indicators, we used information from the UNED data management office and processed it. This is a summary for all the degrees, and you can see that all the rates were quite stable, but they all went up last year, in June 2020. Let's look at them a bit more carefully. The first rate is, in my opinion, the most important, also because we are a distance education university, where the assessment rate is not as high as in face-to-face universities. This rate, which is the relation between the students who were enrolled and those who took the exam, the students who were really assessed, increased in all 28 degrees. This is very relevant, because we can estimate that if students complete the course and are assessed, this will increase their engagement and they will register the coming year, which can ease a common concern in distance education: the dropout and retention rates. And in the specific degree that I coordinate, the degree in social education, you can see that the assessment rate also increased last year. The second rate is the success rate, which is the relation between the students who go to the final exam and those who pass the exam. It also increased in all cases.
The increase is not as big as in the assessment rate, but it is also positive: the success rate increased in that call, the first time online assessment was used. And this is the case in the social education degree; you see it is a higher rate, but the increase is not as high as in the assessment rate. The next rate is the achievement rate, which is the relation between the students originally enrolled and the students who pass the course. Here also, and consequently, because this is related to the previous rates, it increased in all the degrees. You can see this represents the increase between the average rate in the previous years and the average in June 2020, and this is the achievement rate increasing in the degree in social education. And finally, the average mark, which is the final score the students had in the courses. This also increased: in some degrees the increase was very low, and in others it was more than one point, on a score from zero to ten, where students need a minimum of five points to pass the course. So the increase in some of the degrees is 0.14, which is very low, but in others it is more than one, 1.05 at the highest. In social education the increase was not so high. So what does this say about the students' performance? To me, the most significant finding is that more students took the final exam than in previous years, and this was significantly higher, with the relation to engagement that I mentioned before. Consequently the achievement rate, those who passed among those enrolled, increased, but the average mark, the final score, increased only a little. And this is one of the points that is leading to debate nowadays, and it has to do with the academic integrity that we will talk about later.
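To make the three rates concrete, here is a minimal sketch of how they relate to each other. The cohort numbers are invented for illustration, not UNED's actual data:

```python
def assessment_rate(enrolled: int, took_exam: int) -> float:
    """Share of enrolled students who actually sat the final exam."""
    return took_exam / enrolled

def success_rate(took_exam: int, passed: int) -> float:
    """Share of examined students who passed."""
    return passed / took_exam

def achievement_rate(enrolled: int, passed: int) -> float:
    """Share of all enrolled students who passed.
    By construction: achievement = assessment * success."""
    return passed / enrolled

# Hypothetical cohort (not real UNED figures)
enrolled, took_exam, passed = 900, 540, 432
print(f"assessment:  {assessment_rate(enrolled, took_exam):.2f}")
print(f"success:     {success_rate(took_exam, passed):.2f}")
print(f"achievement: {achievement_rate(enrolled, passed):.2f}")
```

Because the achievement rate is the product of the other two, a rising assessment rate with a roughly flat success rate is enough on its own to push achievement up, which matches the pattern described in the UNED results.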
To complement this study of academic performance, which is based on the data from all the courses, we prepared a survey in June 2020. Originally it was going to be a survey only for the students in our course or our degree, but we extended it to the whole university, and we got replies from 714 students from 20 of the 28 bachelor degrees, most of them from the social sciences area, I have to say. I have chosen some of the questions here, not all the questions in the survey. One question we asked was whether the fact that the exams were online had influenced their decision to take the exam or not. To our surprise, most of them said it had no influence: they took the exam in any case and were going to be assessed regardless of the type of exam. For some of them, maybe more than 10%, it discouraged them from taking the exam because it was online, and for some it encouraged them; we will see later some of the reasons they gave for this. Another question was whether they thought the online exams were easier than the face-to-face ones, and here the majority did not agree: they didn't think the online exams were easier. We can contrast this with the performance, with the average score, which we have seen was higher, but still they didn't think it was easier to take the examination online. And finally we asked for their preference. A majority, but not a very big one, a bit more than half, preferred the online exams, and still 40% of the students who replied to the survey preferred face-to-face exams.
In the open-ended question of the survey, they expressed some of the problems they had experienced. The main problem mentioned was the duration of the exam: they thought it was not enough, for different reasons. In some faculties they decided to reduce the duration of the exam a lot, so it was not enough, or students had issues with writing on the computer if there were open-ended questions, so time was a big concern. Also anxiety, but mainly beforehand, because it was the first time they were experiencing this online system, and they had a lot of worries in advance: would the connection fail, would the software fail, or any other technical aspect. So there was more anxiety before doing the exam, because afterwards the main opinions were that it went well and the platform was easy to use. Regarding preferences, for those who preferred the online exam, a very big reason was mobility: they didn't have to travel to the regional centre, and if they had mobility problems it was easier to take exams from wherever they were; also family conciliation. These were some reasons they expressed for preferring the online exams. As for those preferring the face-to-face exams, the thing is that they favoured them because they had had problems with the online exam in that first call in June 2020, a bad experience: they thought the time was not enough, they had problems with the internet, or they didn't have a proper place to take the exam quietly, so they preferred the exam classroom in the regional centre. Also, a minority of the students were calling for a more diverse assessment model, not only the final exam, like having more assignments during the semester, et cetera. So, having this information, what are currently the faculty concerns? This is very common to what the previous presenters have said: the main one, in the debate every day, is about academic integrity or
plagiarism. We don't really have information about the number of cases of plagiarism that have been detected effectively, not just suspicion but confirmed plagiarism, but the cases that have happened have made a lot of noise, especially people sharing exams, having a WhatsApp group sharing the examination, doing the exams together in a group and not individually. So this is the main concern. The other one is the workload: designing exams that are more appropriate for avoiding plagiarism and for more authentic assessment, and also the workload in marking, because as the assessment rate increased, all the faculty had more exams to mark at the end of the semester. So the workload was really high last year during the pandemic. The university has sent all the faculty some guidelines on how to design the exams, and these are common to what the previous presenters have said: they are recommending open book exams, increasing the bucket of questions to allow randomization, avoiding very basic questions that just require remembering, and adjusting the duration of the exam so the students can focus on their exam and don't have time to share with others or engage in misbehaviour. So now the question is what will happen, and some of us faculty are seeing this as an opportunity to redesign the overall assessment process and to rethink the role of the final examination, which we still need to have; it is a requirement from the ministry, and we would need to do a lot of things to remove the final exam. We also have to bear in mind the number of students we have: for instance, in my course I have 900 students; who is able to provide very formative, continuous assessment with 900 students? But we still have the opportunity to rethink the overall assessment process and to think about using diverse assessment methods. And the big question is whether we will go back to the face-to-face exams as soon as the pandemic allows it, or whether we will be combining the online exams and the
face-to-face ones. What will happen? We don't have the answer to that yet. Well, thank you very much for your attention. Thanks, Inés, very interesting stuff. I see you had the COVID bump, which is what we were calling it last June, when all the marks seemed a little higher. And thinking about all three presentations, the similarities in experience and approach really are quite striking. So we are a bit over time, but we will just keep going and take a few questions, if that is all right. There are a few there in the chat. I see Monica is jumping in there, answering one about students with disabilities. Stylianos and Linda, do you want to briefly answer on that? Absolutely. So we have a variety of different arrangements that are made for students depending on their particular needs. Extended time: we just applied that as well, so they would have more time if they needed it. But students, for example, who might previously have had somebody who did the writing for them, a scribe, that was much more challenging, not in terms of our system but because of COVID, because they probably couldn't have a scribe with them. So one of the things we are looking into is the use of speech-to-text software to support students there, and alternatively text-to-speech software. The thing is that not all students know how to use what are quite common tools on most of our computers now, so actually for all students we have produced a little teaching aid to help them understand how they can use those. The other thing was where students might have had somebody who read the exam papers to them; that we were able to facilitate, because we could use Zoom or phone calls and so on to help that happen. And then a number of students who would normally have had physical difficulty getting into an exam room or travelling to the exam centre and so on, their feedback was that this was really liberating and they never wanted to go back to exam centres and pen and paper. And it was a really
wide variety of different needs, and some of them had better experiences with the online assessment and some had worse experiences. But of course, underpinning all of this was the fact that students were themselves getting COVID, or their family members were, so we were dealing with that all the time, and of course the staff were as well, particularly our staff at our teaching centres; at different points, different people were getting sick. So it's been an ongoing rollercoaster of discovery, I think, is one way of describing it. Excellent, and I see Monica was typing away: very similar to how students were handling it in DCU. And actually that was a great point there, Monica, about students using their own laptops, which are set up for their assistive technologies, because I think before, we used to make them use borrowed ones or something; pre-COVID they would have to go into a lab where they'd have the software installed, but it was so clunky. They were all sitting at a desktop, and they'd have to copy it onto, I can't even remember if it was CD-ROM or something ridiculous, especially for computing students, that's Neanderthal, and then I think we might have moved to USB, and more recently it might have been emailed or put up somewhere in the system. I'm conscious of the students with disabilities; we have a colleague who's completely blind, so he'd be at the forefront of our minds on that, and students who are hard of hearing too. I think for some students with disabilities, being in their own environment, using their own setup, however they learned, was actually quite liberating. We would have quite a few students on the autistic spectrum, especially in computing, so some of them would have anxiety and things like that, so they'd have to be in a room on their own, and I think the fact that they were at home or wherever was great. I know students who were really worried: they didn't even have a laptop to themselves, they had another
sister doing an exam, their parents needed it for work, some of them had to mind their older sister looking after Granny, so it wasn't all sweetness and light. That was the same for our students. We asked them those questions in the evaluation and we got a lot of open text responses on that: the challenge of having a room of your own in order to do the exam, not having noise, all of that was a challenge, and broadband was a significant challenge. Some of our students in Bangladesh, for example, had real problems with having sufficient broadband, with many of them having to use their phone to tether, and then the costs of that, et cetera. So there were many challenges. We had students in India as well, and they were saying, we've lost internet in our region for the next four hours, what are we going to do? And the timing of the exam was also challenging. Ours was the largest scale: we would have students from kind of Vancouver over to India, and just finding a time was hard, so some students were saying, well, I had to get up at two o'clock in the morning to do the exam, and you know, you can't please everybody. That was one of the reasons why we used the 24-hour window for some of the exams: once you clicked on the exam, you then had three hours or two hours or whatever, but you could click on it anytime in that 24-hour window; it was up to you, at a time that worked for you. That was okay, but it then does present the opportunity for students to screencast what's on their exam paper and send it down the road to somebody else. All of those issues relate back to these integrity questions, but the work that you did, Monica, around designing exams with application and really using higher-order skills is exactly what's so important, I think, as we move forward. And you also said something very important about the exams: just because it's a 24-hour exam doesn't mean they have to work 24 hours on it; it's not a report or a case study or
whatever. So I gave my students an eight-hour exam, and they knew what was going to be on it: I was going to ask them about design thinking and about a business model canvas or whatever, but they didn't know the context it would be applied to. And if you copied the same thing as Orna, and you'd both written about Dublin Zoo, talking about rhinoceroses instead of picking birds or African animals or whatever, then you'd see the similarities emerging. Yeah, but what comes across, I think, from all three talks, correct me if I'm wrong, is the emphasis that we as educators are putting on redesigning assessment. So okay, all the technical measures around plagiarism and proctoring and online invigilation software and all that kind of thing might be important, but without this very important aspect of redesigning assessment, redesigning exams or whatever other form of assessment, including formative assessment, you're going to use, they're not useful enough. They create a superficial environment which sometimes could be ethically very, very dubious as far as the students are concerned. So I'm glad that there seems to be a convergence from all three presentations, quite a strong emphasis on redesigning assessment. Yeah, and the other thing that I didn't talk about here, but we went through a rigorous process, we didn't do it so much this year, but when we were redesigning it, everybody had to have a buddy who had to review the paper, and then I reviewed them for my school, and then we reviewed them at a faculty level, checking the questions and the timing. So there was a kind of control process put in place, so that Orna wouldn't just say, I kind of like the sound of this question and I'm going to go with it. And now, for the subsequent exams, once they have a template of how to do things, it can kind of run itself a
little bit but I think that piece of somebody overseeing and not just Monica having a mad idea and doing this question like so there was the buddy there was me and then there was the kind of a dean appointed person at a faculty level going through all the kind of procedures how much time you're doing are you doing MCQs and like we spotted you know there's colleagues were saying bad stuff like if well I wrote the question they have 20 questions I'll give them I'll give them 20 minutes to do it like I mean you know it took me five to do it and I'm thinking like you designed the questions you are the experts in the field it would take you at least a minute to read each question and you know if the MCQs are the destructors are sufficiently robust you know so it's kind of managing colleagues expectations and the other important thing I think as well was it was really important to give students an exam an example of what the exam would look like so that if they were used to this kind of normal exam paper from last year and using that as a guideline when they moved to this year and it's completely different even just the physical look and feel of it is different they kind of think oh you know so it was really important to maybe not for every single question but like that example I gave of the the avocados in Chile or China or whatever it was so in the sense of normally you'd be asked to calculate it but here we are giving you the answer and you can kind of work out which way it is and then you can see for each student yours is avocados mine is oranges and the country is Spain rather than Chile or whatever it is and so that kind of so you're asking the same thing really but it looks superficially different. 
So that quality assurance you brought in, Monica, was really good. Ines is patiently waiting with her hand up there. Sorry, Ines.

Going back to one of the first questions, about students with disabilities: at UNED we have a lot of experience with that. Many students with disabilities in Spain study at UNED, so we have a special unit supporting their needs and demands. I don't have the data on their performance last year, but from the information in my own course I can say that the assessment completion rate was 100%, so having the online exam was good for all the students with disabilities. They had a better opportunity to sit the exam, for example for mobility reasons; I think it was Linda who mentioned people who can't easily get to the regional centre for an exam. For blind students, the platform has some accessibility issues that had to be solved. It doesn't mean they can't take the exam, but if they don't feel confident enough, they can go to the regional centre to take face-to-face exams. Any students with disabilities, or students who anticipate internet connection problems or any other difficulty with the online platform, can apply to go to the regional centre. It's a minority, but it still happens, so the regional centres are open for that.

And about the project to redesign the assessment method: I'm happy to see it's common across all our cases. In my university there is a debate between some faculty who are pushing to go back as soon as possible to traditional face-to-face exams, and others, including the centre for distance education, who are trying to rethink all the assessment and provide guidelines, including the one you just mentioned: giving example questions to the students so they don't feel anxious before the exam. The university is also organizing a series of webinars where faculty share what they have done in their own courses, in specific areas and with specific types of assessment, and it's really useful to share those experiences.

I think we could be here all night talking about this; it's a great discussion, but I'll bring it to a close because we've gone 24 minutes over. I'd like to thank the speakers: Monica, Linda, Stylianos and Inés. It was really interesting. It's striking that, despite quite different contexts, in some cases in different countries, we had very similar conversations, especially around academic integrity, timing, and students with disabilities. It's astonishing in some ways how similar some of the experiences and contributions are. Thank you very much to those who attended, and great participation via the chat on YouTube. We look forward to seeing you at our next webinar. Thanks and bye. Thanks very much. Bye everybody, thank you all. Bye.