Good afternoon everybody and welcome back. Apologies for being a couple of minutes late starting, but it wouldn't be a learning technology conference without a technology challenge. I'm Matt Lingard and I'm here to chair the session today, just to make sure you're in the right place. This is Leading Through Listening, and the presenters today are Malcolm Murray and James Udale, as you can see on screen. They're going to talk for about 20 minutes and then there'll be five minutes of Q&A, so please use the comments to line up your questions and I'll be facilitating those in about 20 minutes' time. So without further ado, I'm going to hand over to Malcolm and James.

Thanks, everyone, for choosing the session today. I hope you can hear us; we're running blind because we can't see the comments while we're presenting. My name is Malcolm Murray. I'm the head of digital learning at Durham University, and I've got my colleague here, James Udale, who's been leading a lot of the pilot and evaluation work. The session is called Leading Through Listening, but in the light of the pandemic we thought a more suitable title might be Changing VLE in the Middle of a Global Pandemic. What we want to cover today is where we were and why we felt there was a need for change, and then look at it through a leadership lens: what was the challenge we faced, and how did we use different approaches to make it happen? Then we'll get into the nuts and bolts of adopting a new VLE, talking about the approach we took in the pilot and then in the institution-wide implementation, and end with some reflections.

To start with, where we were is fairly simple. Like many institutions we had a very established on-premises VLE; in our case it was Blackboard. What you can see on the screen just now is what students would see after logging in.
So yes, it linked to their courses, but there was also an offer of other information: links to the student record system, to lecture recording, to PebblePad, to timetables. Quite a confusing, albeit powerful, interface. And our VLE had had quite a good run. We started using Blackboard in 1999 and it had been upgraded several times, but it got to the point where we thought it was time to do something substantially different. We used the Jisc digital experience tracker to get a handle on what students' experience was like during the pandemic. These are the top three issues that they picked up on, and you can see the most frequent response was to make the VLE more user-friendly. So they didn't think Duo really was cutting the mustard. And this correlated with findings from the previous year from staff, where again the VLE seemed to have changed from being an enabler to being a bit of a hindrance. What concerned us, looking at these three questions from the Jisc tracker, was that the response from Durham staff was significantly lower than across the sector as a whole. So that was a warning bell sounding.

Stepping back and looking at how we'd adopted, used and supported the VLE, four issues came out. There was a lot of scope creep over those 20 years. It started out being used as a VLE, and then we added staff training, file storage, support for exam papers, HR, communications; you name it, it was probably on Blackboard at some point. It had become very heavily customised and extended (I'm guilty of a lot of that), but it meant that upgrades were quite difficult: a very complicated, test-heavy process. If we asked staff or students to describe the VLE in one word, 'clunky' was the answer. It just didn't fit with their experience of the other tools they were using. It wasn't the sort of modern teaching and learning experience they expected at Durham.
When we looked at the way staff were using it, at least pre-pandemic, it was still very much in that document-dump, or repository, mentality. There was a lot of content, but it was really there to support face-to-face teaching, and these are things we wanted to change.

What has the pandemic done? Well, it really has increased the importance of online and it's focused attention on digital skills. We found out which staff and which students have those in spades and which need a lot more support. It's identified some gaps in our provision and it's also highlighted which systems work and which don't. So we've got the lens of the pandemic, and we've got our own institutional drivers, where our PVC for Education has gone on record and said that the university should offer a first-rate digital learning environment on a par with the best in the UK. Well, we felt we weren't delivering that and we wanted to do something about it. So the plan for 2021 was to put Duo into archive mode, push the data into the cloud later on, and put the servers in a museum where they belong, because we wanted to move to a cloud-based solution that would give us the agility we felt our users were looking for. To do that, we wouldn't just jump straight to it; we'd run an initial pilot and, if that was successful, move to live, which is where we are now.

So let's now switch to the leadership challenges, and there are two lenses we want to use. The first draws on the work of Heifetz, which talks about change as being a painful process. It's easy to get excited about the new, the shiny, the latest version as a learning technologist, but we have to understand that when we do that, we're actually changing things for academic staff whose competency gives them a sense of power and a sense of agency. If we change the tools, then all that goes away and they have to relearn it. So for them, change can be a very painful process.
And throwing the pandemic on top: changing VLE, external websites, updates to student record systems, learning how to use Teams, learning how to use Zoom. Staff were facing change after change after change, and we are all tired. This is where the model of Amy Cuddy and colleagues is really helpful. For a change to be successful, the people leading it have to have competence, so people believe that the decision is right, that we've made the right choice. But there's also the human element, the warmth: we have to convince staff that we're listening to them and to students, that we're providing a solution that's going to meet their needs, and that we will change things according to the input we get from them. So, like all four-quadrant models, we're trying to get into that mythical upper-right quadrant of admiration.

Looking at the pilot: the pilot was competitive, and five programmes won. We had four beginning in October last year and a master's programme that joined us in January, and these were all delivered using the new Learn Ultra platform. The programme board made some early decisions that helped shape the way it was used. To try and stop all this scope creep, we went for a very clear purpose: the VLE was only to be used for credit-bearing teaching and learning. It wasn't to be used for staff development, it wasn't to be used for storing exam papers, and all those things. This was also our chance to start again. We weren't doing a lift and shift; we weren't going to take all the content from our old VLE and dump it into Learn Ultra. We were going to start with a new design, new templates and empty courses. That's risky, and I know it risks undermining staff confidence and making that change painful. So we also wanted to make sure that we were offering a future that was built on the past.
So we were harmonising the best experiences of the last 20 years, trying to make those available across the university, and building a system that meets the needs that staff and students have today.

In terms of how we evaluated the pilot, we looked at two things: the student experience and the staff experience. For students, we used the Jisc digital experience survey to get a sense of how the experience of students taught via Learn Ultra compared with that of students whose courses were delivered via Duo. There were two questions in particular related to the VLE. One was whether the VLE was easy to use, and we found that students taught via Learn Ultra agreed much more and disagreed less. Equally, when it came to ease of navigation, we found that students had a much better experience on Learn Ultra. That could have been down to two things: the student-focused activity stream that comes with the product, and perhaps the new course templates that staff had designed for the pilot modules, which were intended to give a degree of consistency across the programme. So we saw these were having real benefits on the student experience.

In terms of staff, we chose to do a deeper dive, and we targeted a survey at staff who were on the pilot in the spring term, so those who started in January had a chance to get their hands on the system before we asked them questions about it. We set out to do three things: look at what staff wanted in a VLE in the first place, see how what they wanted mapped against Duo if they'd used Duo in the past, and then see how that applied to Learn Ultra, so whether there was a difference between the systems. When it came to what staff wanted in a VLE, I think we found that more or less staff wanted a bit of everything.
Roughly to summarise, the key things that staff prioritised were ease of use, assessment, grading and feedback functionality, and communication tools. There was a sense that perhaps some people were still not using the VLE so much for social collaboration, but in the open-text comments there was a real sense that practice had changed through the pandemic, and therefore what they wanted in the VLE was now different from the past. In terms of how staff felt about Duo, we saw those words come up again: clunky, cumbersome and unintuitive, owing really to how the platform was put together, how it was designed, and perhaps the age of the platform. There was a sense that where staff did find Duo easy to use, that was caveated by familiarity with the product: they'd used it for so many years, and that familiarity was born out of simply having done things many, many times before.

In terms of how staff found Ultra in comparison, I think the key thing was a second-hand experience from the students: staff were definitely reporting that their students much preferred the system. But in terms of their own teaching, they were critical about three things. These were a perceived loss of features in comparison to Duo; a perceived lack of customisation options, so how to make their courses look and how to design their courses; and perhaps limited functionality in regard to assessment tools. I think in the pilot at least that was very keenly felt in the test functionality, because there were certain features, since released, that weren't available at the time and that we would probably consider to be core features. So maybe there's an element of product maturity that was informing that opinion.
In terms of how that opinion differed between Duo and Ultra, we found that, rather marginally, staff did prefer Ultra when it came to student collaboration and encouraging staff to try different activities, but we did see a very, very mixed field. We thought there was much more there; we needed to dig down and find out what was going on, because it was a very polarised response and we didn't really expect some of those responses from staff. So we held a few focus groups with the staff who did the survey, and we found out why they felt the way they felt. The key thing that came up time and time again was time: staff did not have the time to migrate their courses and also get to grips with the functionality and features of Learn Ultra. Perhaps there were features that would have helped them or improved their perception of the product, but they hadn't really had a chance to get hands-on, due to the pandemic and all the things that had been kicked into the long grass in the past year, such as research and so on. There was also a lack of awareness as to why the VLE was designed the way it was. For instance, on the lack of customisation options, staff weren't aware that things were restricted to help produce more accessible content and better device compatibility. Equally, there was perhaps a lack of awareness as to why we were changing VLE in the first place. So it was clear that at the pilot stage there was maybe a lack of communication of the high-level strategy, and that it hadn't quite percolated down to where it should have. But in the staff comments we did get a sense that things were moving in the right direction: perhaps many things weren't quite there during the pilot, but this would be a better experience for students going forward.
And there was a sense that the system was designed in a way that encouraged more active learning design, putting social collaboration at the front of what they were doing, rather than just putting materials online for students to consume and download, in line with their blended learning and face-to-face teaching.

In terms of rolling out the VLE and taking forward the things we learned through the impact evaluation: this plan shows the timescale we used. We had that big pilot that James has just talked about, which ran all the way to June, and initially it was only staff on the pilot who got access to the system. Then we wanted to broaden it out. The initial feedback from the pilot was positive, so it was time to get other staff on it, and we worked with directors of education and learning and teaching managers to give them access. Once each department had a little bit of expertise, they then, as a group, built the departmental templates. Once the templates were built and approved, then and only then were the courses that staff were using for teaching added to Ultra, and all staff in the department enrolled. The point here, as you can see, is that even with everything moving as early as possible, we were giving staff access in mid-April to build the courses they were going to be using a week or two later to actually start delivering learning and teaching. And throughout, it was underpinned by a strong support model.

For the engagement strategy, we needed to engage at different levels, and some of the comments James alluded to suggest that some of these have been more successful than others. At the high level, it was talking about what we were doing and why, what the feedback from staff about the old system was, and what we were trying to change.
Giving an idea of dates and facts, and getting buy-in at the high level, kept senior management on board. (Five minutes. Five minutes, James. Thank you.) At the mid-level we were looking at department briefings, where we got a bit more into the detail. But it was at the programme level, when you're looking at how you can use this new system to build your modules or design your programmes, that we actually got to the nitty-gritty and put it to the test. We did do a lot of talking. I don't expect you to read the text on the screen, but it is the dates and names of all the meetings we held, and the bar chart shows how they were spread across the year. We held over 96 meetings; we wanted to talk to people about this new system and what it meant.

And we had quite a detailed support model. We had a SharePoint site, combined with access to drop-in sessions for one-to-one support for staff. We had design workshops to help staff redesign modules for online teaching in the light of the pandemic and the new system, and we had how-to workshops: if you wanted to create a reading list or use Turnitin, we had those how-to sessions there. And in the product itself, we invested a lot in at-elbow support, so staff and students could get answers without even contacting the central IT teams. We also understood from the feedback that staff time is obviously very precious, and sometimes synchronous offerings aren't quite what staff need when they're very busy. So we designed a fully online, self-paced module which roughly equated to the entire training offering, at both the pedagogic and the technical end. We decided to make this a benchmark of good practice, but also not to overproduce it. So we didn't use Final Cut Pro for our videos or Photoshop for our pictures.
We just tried to use the tools within the university's own ecology of systems, the palette of things staff could access themselves. The aim was to make it achievable by staff, not overwhelming, but also to convey the information and provide a good educational experience. In terms of engagement with the self-paced course, it's been a fairly slow burn across the summer, but we've definitely seen staff accessing it, including at out-of-office times, which perhaps is not desirable, but it does show that it has allowed staff to engage with the material at different times. We've seen 77 people complete at least one full section, which is a fairly weighty thing to do. We've had 163 unique engagements with the tasks and activities; these are just the things that leave a trace in the system, and there are other reflective, softer activities, which don't leave a trace, that staff have also been doing. We've also had 18 brave souls who've done the entire lot, which is certainly commendable given the amount of content there is to work through. The feedback has been fairly strong: the people who actually engaged with it have really enjoyed it. What we did find, though, is that despite our attempt not to overproduce it, some staff still felt that perhaps we were asking a bit too much of them, or that it was overwhelming. So it goes back to that digital skills point: even when we're trying to make things achievable, we should keep some awareness that not all staff will be able to get to grips with the things we're doing and holding up as achievable practice.

So, reflecting on how this has all gone and where we are now: it was a big project delivered in a pandemic, so it was going to get messy.
There were technical workstreams and academic workstreams and, hands up, there were problems at times, but really those boiled down to questions like: where was the stopping point? Who was in control? Who was making the decisions? We also faced changes in project management, which didn't help, and changes in senior management; they actually appointed a new Vice-Chancellor partway through this project, to add to the fun and games. So it was again about making sure the project still met the aims of those new individuals and kept them on board. I think as well, change fatigue in staff was very high. Obviously we've had a pandemic, and we've had new systems like Teams rolled out within the space of a year, so one more change meant lots of rolling eyes and people's eyes glazing over; that is definitely an issue. And as I mentioned before, increased workload is a challenge, as are digital skills.

One thing to keep in mind in a project this size is that there are many unexploded bombs in a system that's 20 years old. Many unseen things and activities have gone on for years, and old packages that are being retired may become conflated with the project. So when you have staff saying, 'I've got these Flash files that don't work anymore,' that can then become conflated with the project when really they're two separate issues. It's worth keeping these things in mind in projects of this size and scope. And going back to this idea of warmth: we have this really well-defined scope for the project, but sometimes you have to give. There are activities that staff have got that are really well aligned, with really sensible learning outcomes; if they don't quite align with our definition of what the VLE is, then we have to take that back to the project board and reconsider it, not just say an outright no. And that's what's kept us, hopefully, in that warm area. We're not going to claim admiration (we haven't had people throwing flowers and chocolates at us), but we have had active engagement. This is happening.
People are creating their courses, and we've got staff actively teaching and students actively learning with it now. I think we'll just skip that for time. But consistency really is a key thing. With all these changes going on, staff want to know what's happening; they want to be told once and then have it happen. The more we can insulate them from the changes, and pick up and correct early anything that could destabilise things, the better.

So, to conclude, the rays of light. So far we're seeing that the student experience has definitely been much improved by the change, and I think one of the key pedagogic points is that the benefits of consistent, well-designed templates are beginning to shine through. Departments having the chance to go back and really re-address how they teach across the piece is definitely set to pay dividends, and I hope that will be the case for many years to come. We've also seen the benefits of staff thinking more about accessibility and inclusion. Because we've given them a blank course and they're adding items one at a time, they've had to think: are all these slides up to date? Rather than just having a quick look at a folder that was there from last year, with Ally flagging up the items that aren't accessible. Also interesting is that we've had a lot more comments from staff about the student experience; they've thought a lot more about what it's like to use the system as a student, which we think is really beneficial. And because they've got a new system and they've had to rewrite and rethink things, they're much more open to trying new tools and not just repeating what they did the previous year. So we think it's been worthwhile. Thank you.

Thank you. I'm going to interrupt you there, James; I'm sorry to cut you very slightly short. We were late starting.
We've got a minute, but not much more, in terms of people moving on to other sessions. But thank you so much for sharing all your experiences; I don't think anybody envies you having to do that through the pandemic. I've not spotted a question yet, but there are more comments which have questions hidden in them, I guess. There was an interesting comment earlier about how we often describe these transformation projects as having a start and an end point, when actually that's not really true, and there's constant change coming. So I assume you're not seeing this as job done, and you're building in other adaptations.

Yeah, certainly it's not that, come the first of October, we all have a big party, go away and never talk to each other again. But there is a change in pace, and in a project it is important to have an end point, to say: right, this is when it goes from a white-glove early model with lots of support to mainstreaming, making sure that everyone knows how to use it, and that if staff have questions they go through the standard routes, not some special project. So there will be some work at the end to make sure we can move from the pilot stage to the standard service-delivery model. But yes, we're learning, and particularly in a SaaS model, where you've got continuous change and new features coming through, we're always learning.

True, true. Thank you, Malcolm. We've got one quick question in from Kirsty, which is on the screen for you now. This will be the only one, I'm afraid.

I think there definitely are. As I mentioned in the pilot feedback, students are definitely saying that well-designed templates for modules and consistency are an improvement, and also just the cleaner interface and the activity stream, which is very focused on student activity.
I think that's definitely been well received. So hopefully that's repeated when we roll out en masse, and maybe we'll talk to you in a year's time and let you know how that's gone.

Great, thank you. Thank you so much to both of you. We really do appreciate all the work that goes into these presentations, right through from the submission process. So many thanks for today, and thank you everybody for joining in as well. Yeah, thank you.