Hello everyone, welcome to the Moodle Moot 2022. Super excited to be here. So my name is Andrew Bogue, I'm the CEO of Catalyst Canada and Catalyst Australia. This talk is about a very exciting project we were involved with that had a bit of COVID, a bit of Moodle, a bit of proctoring and high-stakes assessment involved. And I'll be presenting with Sandra Gabriel.

Hi everybody, my name is Sandra Gabriel. I'm the Vice-Provost for Innovation in Teaching and Learning at Concordia University, which is a comprehensive university in Montreal, Canada.

So, the term COLE is what the system was called. COLE was a dedicated Moodle instance. Concordia University has been a Moodle university for quite some time; they have a Moodle for learning delivery. But the COLE instance of Moodle, along with some other technologies, was a dedicated instance of Moodle for exam delivery, assessment delivery. So: Concordia OnLine Exams. It was a collection of technologies: a dedicated Moodle 3.9 instance with some extensions from Monash University, which we'll talk about in a wee bit. It was on a high-availability AWS cloud platform provided by us. It was also using Proctorio proctoring. They're actually here, so that's one of the proctoring tool sets, as well as other technologies such as AWS Connect for telephony and communication, some BI tools, and various other things.

So just a bit of context: in 2019, you know, in the before times, Catalyst had decided to start up some branches in Canada. We had some business opportunities there. We liked Canada. Before that, we were Australia, New Zealand and the EU. So after a period of trips and journeys and planning and strategy and business development, we decided to go ahead and open a Canada office. And that kicked off just before March 2020, at which time the world all changed. I was there the last time before COVID in January 2020, very excited about the opportunity.
We met with Sandra in Montreal and then everything sort of stopped. We had planned to send people to Canada at that point, but we still had opportunities that were looking interesting and we managed to pursue those.

One thing I want to mention, which I won't talk too much about, but it is very important to understand, is the time zone difference. I believe the time zone difference between Australia and North America, well, the American continent in general, is the worst time zone difference out there, right? Because of the date line. So you really don't want to use the words tomorrow, today or yesterday when you're talking to someone across that date line. There's only a four-day overlapping week. Short-notice meetings are very, very, very challenging. There is essentially no overlapping work time. I mean, Europe to Australia is tricky, but the American continent to Australia is worst of all. And there's a certain scenario that I had in the mornings sometimes, waking up after a full Canadian working day had happened while I was asleep, when I would open my phone and start scrolling through the emails with one eye open. And if I had pages and pages and pages of emails about some sort of incident or something that had happened, that was a very unfortunate way to start the day, because it was all these things that I wasn't able to get involved in helping or solving because I was asleep. So it was an ongoing challenge, given that we were delivering the project remotely on both sides.

Just a quick word as well about Monash University and how they fit into this engagement. In 2018, Monash University started work on an e-assessment delivery platform for in-person delivery, by the way, not for remote delivery. They were doing this inside a large hall, being invigilated, using some improvements to Moodle that include better word processing function, better question types, and some UX improvements.
And that work was the basis on which COLE was built in many ways. A lot of those enhancements were used, and that started with Monash in 2018, and that was something that Catalyst was involved in. So I'll hand over to Sandra.

So the background to this story, really, is that Andrew and I met in Dublin in 2017. Really? 2019. And we met at the World Conference on Online Learning, I think it's called. Anyway, we met for other reasons. We used Catalyst; that was the nature of the conversation. When COVID hit, I called up Andrew in a bit of a panic. So COVID hit in March. One little piece of context I should share is that we're on a trimester system at Concordia, so our exams were coming up. We shut down in March; exams were coming up in April. And so we were trying to figure out what we would do, short of cancelling all of our examinations. We run 1,000-plus exams every single cycle. So we run a lot of exams. We have a lot of courses that run; we're maybe not the most efficient university. So the point was that we called up Andrew and Catalyst in a bit of a panic saying, what do we do?

By May, we were able to get this solution up and running. So yes, we missed our April deadline, and what we did in April was just run our regular Moodle site, using the quiz system, and there were other platforms that some profs had already engaged that were in use. By May, though, we were established. We had a contract in place. Well, maybe not completely signed. Andrew, if you've ever dealt with him, is maybe not the quickest on getting those contracts out to you. But regardless, we were already working together, getting the system established and in place. And at that time we were, in fact, licensing Monash's tools. So the reason why Andrew wanted to make sure that we mentioned Monash was that we were in fact licensing their platform initially, with the goal that we were going to get our own up and running.
And that happened by July of the following year.

So needless to say, these were some of our core challenges. This is a very, very, very short list; I could go on and on. I could spend the rest of the talk fleshing out what these challenges were. You can well imagine that bringing in a brand new system that our faculty hadn't used previously was a huge learning curve. And so what we did was, in fact, create an exam creation team. We didn't have that before. Obviously, when we were running exams in person, there was a whole system in place to support that, with dropping off paper exams, et cetera. And now that we were moving into the online system, we needed a way for exams to be created properly. It's really the most tragic and stressful situation when you discover, in the middle of the exam, that an exam has not been set up properly. And so we were trying our very best to ensure that we were avoiding that. So we created a brand new exam creation team to help ensure that we had the right quality assurance process in place. That was initially run by Andrew's team, and then eventually Concordia took over when we launched 2.0.

The last thing I just wanted to say, which Andrew and I had a lot of conversations about, is this last point around the appropriate guidelines for implementation. I'll come back and speak to that in a minute, but I cannot overstate it. And Andrew was the one who impressed this on me. I, perhaps shockingly, because I'm an administrator, underestimated how important the guidelines were, and they continue to be now as we are moving into some new territory where we're trying to ensure that all online experiences stay online. So if a student signs up for an online class, their entire experience should stay online. Our profs have been very quick to jump back: an online course, but an in-person exam, which doesn't make a ton of sense if you're a student trying to sustain an online experience.
So this is just one example of some of the supports that we needed to create. We had to build a ton of infographics, quick, easy ways to help our faculty and our students know and understand how to get in and use the system. This was one that we decided on. I just want to mention that step two came in pretty quickly. I was the subject of a 35,000-plus-signature change.org petition. I was named in the petition, because I was just the face of it, asking Concordia to stop using all online proctoring during the exam season. So that was really fun. We understood very well the discomfort that students had with the camera coming in. We selected Proctorio, an automated proctoring system, precisely because it didn't have a person behind it. We weren't able to get our own invigilators in place in time because we are a unionized university; we couldn't make that transition quickly enough. We didn't want strangers in our students' bedrooms, and we wanted to be sure that we could say that they were our employees, but we couldn't make that transition fast enough. So we went with an automated proctoring system. It had its ups and downs.

And I'll now say that we ran a survey with our students just this spring, and we asked them to rank their preferred ways to take an exam. Against the context of 35,000-plus signatures back in March asking us to stop using Proctorio, the second-ranked option that the students selected was a proctored online exam. The first was unproctored online, and the second was a proctored online exam. So it really spoke to how quickly our community moved over and became accustomed to, and much more at ease with, using the technology.

Okay. So the last thing I want to say before I hand this over... I think I'm handing it over to you next. No, I'm not. I'm going to keep talking. So these were some of the questions that really came up pretty quickly. You know, when is proctoring absolutely necessary?
So again, to take you back into the context, an in-person exam is always automatically invigilated at Concordia. The students come in, they show their ID cards, they're assigned a seat. We have procedures that have been in place for a very, very long time, and there are invigilators who wander around the room looking to see if students are following the rules. There was an assumption by our faculty that when we made the transition to an online exam, that would follow. It just went with their way of thinking, and we had to really push our community, as no doubt many of you did, to think about other ways of actually assessing our students to ensure that they've met the learning outcomes of the course. We wanted to make sure that we were facilitating the ease with which they were making this transition, but it wasn't an easy ride because, of course, the faculty had one way of assessing in their minds. So the one thing that also came up was whether or not exams were, in fact, the best way to do that.

So I'm going to just slide on. We've been given a five-minute warning. We had five minutes? Okay. We're going to speed this up a little bit. So I just wanted to show you what we built in COLE 2.0. We were very much inspired by Monash and the system that they had built, which was really geared towards the student experience. So we had easy exam navigation, for instance. We had ways for the students, number three on the bottom, to click to indicate that they wanted to come back and take a look at that question again. We had opportunities for the students to highlight questions. In some cases there were case studies that the students were writing their exams on, and so they needed to be able to highlight information or somehow note important information. That's also a very common tool for students with disabilities, who learn to highlight key pieces of information.
A timer was up at the top so that the students knew how much time they had remaining in their exam, which was really critical, because in some cases their whole screen was locked down, so they couldn't actually check elsewhere and weren't supposed to have anything else available around them. And when that went down, or if that ever glitched, holy cow, did that ever create huge problems for the students. Yeah, that's all I'm going to say on that.

So we also had a hybrid question type where people were able to upload photos of submissions. I'll just jump through that; it's a topic in itself. We used a ChatOps approach to supporting the students while they're in exams, because this was high-stakes, in the sense that people needed to be able to reach out and get assistance during their exam in a real way. Not open a ticket, not we'll-get-back-to-you-when-we-can. So there was a team of operators and academic staff and technology people who were absolutely at the keyboard, ready to engage and communicate with the students while this was happening. And it worked really well, essentially using Teams or Slack or Matrix to do that. We use Rocket.Chat now, okay. I might jump through this and just jump to the next one.

So I just wanted to talk a wee bit about proctoring from the technology point of view and from the perspective of a service provider. Someone said to me once, you know, what does an airline sell? And people say, you know, service and reliability and prestige, but the answer I liked was: an airline sells arrival time, right? So in that same way, what is the technology proctoring solution selling your institution? Now, the perception that will often want to be seeded is that it will stop people cheating in exams, right? It will catch out the people who are trying to violate my academic integrity. But in reality, it's not quite like that, right?
An organization can't outsource the business of identifying, processing and excluding students from your university to an external provider. It just doesn't work that way. There's process, there's right of appeal, there's validation of the evidence. For example, in a real, in-person exam, if someone's caught with a piece of paper or a phone and they're caught cheating, they'll probably be picked up and walked out of the exam, right? And many of the proctoring tools allow you to do that. They allow you to boot people out as soon as they are detected to be cheating, and you think, well, that's a good idea. But in reality, it's a terrible idea, because in the instance that you exclude anyone falsely, they have huge grounds for appeal. They're very, very upset. You're going to have to give them another exam. There are a lot of false positives with proctoring solutions, for legitimate reasons, especially during COVID: someone walks into the room and talks to them. And also, a person sitting in an exam in a remote situation, even if they are cheating, doesn't actually affect the rest of the people sitting in the exam. So that's not an obvious thing when you start out with proctoring solutions, but they're very, very different. What I think proctoring solutions actually sell is part of the ingredient that universities need to be able to make sure their academic integrity is intact, and to demonstrate the steps they took in the instance that there are accusations of cheating. It's not perfect, right? It's to some degree a compliance exercise, and it is only a part of the solution.

We also did a tier-one support engagement. I haven't got a lot of time, but we learned a lot about providing tier-one support. It sort of got thrown into the engagement at the beginning, Catalyst providing tier-one support. We generally have nothing to do with this. We set up a team, we hired people in Canada, and we just let them loose on the students.
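The point above about not auto-ejecting students can be made concrete: rather than acting on automated flags in real time, an institution can queue them for post-exam human review, preserving the evidence trail a student would need for an appeal. Here is a minimal sketch of that idea; the event names, fields, and confidence threshold are illustrative assumptions, not Proctorio's actual API or taxonomy.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ProctoringFlag:
    """One automated flag raised during a remote exam session."""
    student_id: str
    event: str          # e.g. "second_person_detected" (hypothetical name)
    timestamp: float    # seconds into the exam
    confidence: float   # 0.0-1.0, as reported by the proctoring tool


@dataclass
class ReviewQueue:
    """Flags are held for human review after the exam ends.

    Nothing here acts on the student automatically, so a false
    positive can never eject someone mid-exam; it only becomes
    evidence that a reviewer (and later a tribunal) can weigh.
    """
    pending: List[ProctoringFlag] = field(default_factory=list)

    def ingest(self, flag: ProctoringFlag, threshold: float = 0.5) -> bool:
        # Filter out low-confidence noise so reviewers aren't swamped,
        # but never interrupt the student either way.
        if flag.confidence >= threshold:
            self.pending.append(flag)
            return True
        return False


if __name__ == "__main__":
    queue = ReviewQueue()
    queue.ingest(ProctoringFlag("s1", "second_person_detected", 120.0, 0.9))
    queue.ingest(ProctoringFlag("s1", "gaze_away", 300.0, 0.2))
    print(f"{len(queue.pending)} flag(s) queued for human review")
```

The design choice this sketch illustrates is the one argued above: even a high-confidence flag only joins a review queue, so the right of appeal and validation of evidence stay with the institution's own process rather than the tool.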
And it was not a traditional call centre experience. But what we learned was that someone with a bit of empathy, context, an understanding of what people are calling about, and some good communication on the team actually worked quite well, right? I'll give you an example of one of the situations we faced, because there were some very unhappy students, sometimes for valid reasons. In Montreal, where Concordia is located, they have power issues during lightning storms. So we got calls into our hotline. I was an operator myself a couple of times because we didn't have enough operators. People would lose their electricity because of the storms, and they would call the hotline and say, help, help, help, I've lost my internet, I can't do my exam. And we had to deal with that. And if you were empathetic, if you got people, if you got back to people and if you told them the truth, you actually got a long way towards keeping people happy.

So I think we're on to... Well, I don't think we've got time. Sorry, yeah, sorry. The other thing we learned was that moving from a phone approach to a chat approach worked a lot better. It wasn't a chatbot, you were actually talking to a person behind the scenes, but chat worked a lot better in the sense that one operator could deal with multiple students at the same time, and you had a much better audit trail, right? And on our actual 1-800 number, the queue message we left was: please use the chat in the application, right? Did you know there was a chat? Why don't you go there? Please don't call us. Because, politely, one phone operator can only talk to one student, and sometimes those students want to talk for quite a long time, right? So rolling out the AWS Connect chat was a really powerful tool, and there was a lot of cutting and pasting of messaging, which worked well. So I think that's probably us. Thank you, yes. Just wanted to let you go as far as you could there.

Anyone have any questions for this presentation? We have a few minutes.
Just raise your hand. We'll send you the mic.

Thank you. Abongile from the University of South Africa. The lady mentioned something interesting: that when rolling out proctoring, one of the issues they had to grapple with was the fact that they had to deal with organized labor, or unions. So if you can just provide a high-level account of the exact issues that you grappled with when you were engaging organized labor, and how those were mitigated. Thanks.

Sorry, I wasn't able to catch the gist of the question. It was about the labor unions. Yeah, perfect. So we're, like many universities in Canada, a very highly unionized campus. At Concordia, we might have a record; I'm looking at my colleagues from other Montreal universities here. I think we have 30 different unions on our campus, and one of them is for invigilators. What that meant was that an invigilator's role is clearly defined, and it has a set list of duties that basically carries over from one semester to the other, because the work doesn't really change. When we were planning to do this online, and in fact we've since moved towards a system with a live, in-person exam written online on the computer, we call them mixed mode because they're actually online, but live in the room, we were able to, for instance, shift those union duties, but that took a lot of negotiation and time with the union. To have put the invigilators online would have meant that we would have to understand, first of all, the role. What was the scope of the duties that they would have? We'd have to list them, then get the agreement of the union, and of course then, with the labor relations team, understand if there was going to be an impact on how much they were paid. So because the scope of the duties was completely different, we couldn't just hire the pool of invigilators that we had used previously. We've now done that, but it takes time.
So in the context of the pandemic there was really no time to do very much, and so we ended up using the automated proctoring solution. And as Andrew said, there are a lot of issues with it. Our legal team has found it very interesting. In my own view, the most profound challenge is the availability of the video to the students themselves, so that they can review the video in preparing their defense. We have a tribunal system where any student who's caught cheating has the opportunity to present a case in their defense. And with Proctorio there's no way for us legally (we could certainly do it illegally) to actually take the video and share it with the student, which, you know, is really the foundation of why we have the tribunal system: that they're able to review the evidence against them and can adequately prepare their case. So I hope that answers your question about the union, thanks.

I have a question. I'm from the University of Frankfurt in Austria. We switched completely to proctored exams, and one huge topic was the change of grades: how the grades compared to a written exam or an oral exam before. We also had computer exams in place, and, yeah, the exam grades changed, and we sometimes had to argue about where these changes were coming from. So I would be really curious if you have had the chance to evaluate that.

Yeah, that's a very complex question. I have been stressing with our colleagues, with my fellow faculty members, that comparing grades straight across from one year to another is not an appropriate way to assess whether or not we've got grade inflation because of online proctored exams. So, yes, we've got grade inflation, 100%, but if you've moved the entire modality of the course online, then many, many, many factors have changed all at once, and so to really be able to review that would require a really sustained look. We made lots of adjustments when we went online.
We made lots of adjustments to assessing in different ways, and so to say that it's because of online proctored exams, I just don't think we have the right evidence to support that. It may well be the case, but in the interest of, oh, I don't want to call it science, because, you know, it's teaching after all, it's not scientific in that sense, I do think that we haven't exactly established that no other parameters have also changed, right? The modality changed, the way we teach changed, the flexibility changed. It was a great opportunity for us to roll out universal design for learning without calling it universal design for learning, because all we had to say was: be flexible, show your students that they can take their assessments in a variety of different ways, et cetera, right? So I don't think we've got quite the right evidence, but we are certainly looking to ensure that we're maintaining rigor.

But remember, and I mentioned universal design for learning because one of its core principles is this, when you're actually conducting an assessment, you're looking to see that the students have met their learning outcomes, right? That they've reached the competencies that you were expecting them to reach. You can do that in a variety of different ways. It doesn't have to be done in one single modality; it's not just exams. And so what we've been saying is, how else can we do this, right? Are there other ways in which we can do that? Even when you're wanting to run an exam, how else can you organize the exam? Some of that can also help to take some of the reliance off the proctoring for academic integrity. No one wants to reduce the rigor, no one wants to not uphold academic integrity in the university, but the question is how we do that. And that requires a sustained conversation.
Unfortunately, in the pandemic, we didn't really have much opportunity for that. It's a great question.