Good afternoon. Welcome to this afternoon's session, where Marika Guy and Tharindu from UCEM will be talking about formalising the evaluation of learning technologies. Over to you.

Right, hi everybody, good afternoon. Hopefully the weather's a bit better where you are; it's been tipping down here, so hopefully you can hear me over the rain. Welcome to today's session, Making Your Mind Up: formalising the evaluation of learning technologies. For those of you of a similar age to me, I was going to share a picture of Bucks Fizz, the UK entry at the 1981 Eurovision Song Contest, but copyright and all that. For people a bit younger than me, if you look them up on YouTube there's a very exciting costume change halfway through. Those were much simpler days, before COVID and the like.

Today's session is a shared, reflective look at the evaluation process for learning technologies, and perhaps this is something we can work on together as a sector. I'm Marika, the Learning Technologies Production Manager at University College of Estate Management, and Tharindu, a learning technology researcher, will be helping me by keeping an eye on the questions and the chat. We're both based at UCEM, which is the leading provider of supported online education for the built environment. UCEM has been around for 100 years and has more than 4,000 students from more than 100 different countries, which is great.

The plan for the session is to start off by contextualising ourselves. It's been a very odd couple of years, and we'd be putting ourselves at a disadvantage if we didn't look at what's changed and why. We'll then introduce the procurement and decision-making process for choosing learning technologies, and see what scope there is for formalising it, or for fleshing out and sharing best practice. We'll finish the session with an invitation for your ideas.
We have a Mural board with a set of questions on it, and you'll be able to add your own post-it notes. It's a little experimental, and I'm hoping people will really go for it and share their thoughts. We'll tweet the board after the session as well, so others outside the session can contribute and we can continue the discussion.

So, what just happened? I know I don't really need to go over this, but COVID happened, and last year led to the massive piece of work that was pivoting face-to-face courses to online. The initial approach was emergency remote teaching, which has been followed by a year of "online teaching". You probably can't see me, but I'm using inverted commas here, because this online teaching wasn't necessarily planned in from the get-go. At UCEM we're fully online, and we design courses and modules that are supposed to be delivered online, so the lead time for this is huge. Most institutions are still playing catch-up on courses that weren't originally designed to be taught this way.

The whole COVID situation has put digital and online in the spotlight, and there's been a lot of discussion of the opportunities and affordances of online, closely coupled with lots of discussion of the downsides of online teaching. It's almost as if online teaching and learning technology appeared on some sort of reality TV show, became famous, and now the media onslaught has started. People working in the space have made the best of the tools already purchased, and on the whole it's gone okay, I think, but most institutions weren't appropriately set up for the situation. Students were initially grateful, then quickly moved on to being slightly disappointed, not necessarily with online itself, but with the experience. To quote a few stats, there's the ONS poll where more than one-third of university students were unhappy with the academic experience this term.
Australia's Tertiary Education Quality and Standards Agency estimated that up to 50% of students were unhappy with online learning, and of course NSS satisfaction hit a new all-time low, with a score of 75% overall satisfaction. Those satisfaction scores don't necessarily relate directly to the tools being used, but it's been a really bumpy ride, and we've begun to take a closer look at the technologies we're using to support learning and teaching; we're taking stock and re-evaluating. Do they work in a fully online environment? Are they fit for purpose? Are they the right technologies to align with our own institutional direction? Do they support our pedagogic approach? Take, for example, the whole digital assessment discussion. Investment in the digital campus and infrastructure has begun to be seen as really important, possibly even more important than the physical campus, so I've added in this quote from the OfS Gravity Assist report on the need for greater investment in the digital campus. Money is starting to be spent, which is good news, but there are obviously a lot of decisions, and once the decision to adopt a technology has been made and the ball is rolling, it leads to a really complex procurement and evaluation process. Usually learning technologists are heavily involved in assessing and recommending products for purchase, even though the final decisions might be made by senior leaders, and these decisions are just not easy to make.

So, a little introduction to making technology decisions. I want to start off by saying that decisions aren't straightforward, and they aren't isolated. We have pressures from above: senior leadership breathing down our necks, budget issues, overly risk-averse cultures and the need to stick with existing tools. Then there are the pressures from below: resource, time, and the users that are involved. And then there are pressures from the side.
Here I've put down things like security implications, the wider world, accessibility regulations, government regulations, the OfS, and existing institutional frameworks and approaches; for example, UCEM's learning, teaching and assessment strategy advocates a very student-centred, industry-informed, outcomes-driven approach, so you need to think about that when making tech decisions. These decisions involve a lot of skills. To make them you need to compare and contrast different technologies, consider the different tools, evaluate and test them, analyse them, write reports on them, negotiate when different people have different opinions, involve various stakeholders, align choices with strategy and drivers, compromise, possibly even placate, and then decide and recommend. There's a lot of problem solving, you need to be really open-minded and practical, and once the decisions have been made, they demand a lot of conviction from our point of view.

I wanted to start by looking at some different frameworks that could be used in this kind of evaluation process. The first one I've listed here is the EDUCAUSE model, which looks at things like: appropriateness (will the tool solve the identified problem?); features (what features and functionality does it have, and what purposes do these serve?); uniqueness (does this tool replace other tools, and are its functions unique? This is a big one, because we're quite often trying to work out whether we can get rid of tools just as much as bring new ones in); interoperability and integration (does it integrate with existing systems?); accessibility; and how it compares with other tools. I'll talk a little more about the EDUCAUSE model in a minute, because that's the one we've looked at specifically at UCEM.
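Comparing candidate tools against criteria like these often comes down to a weighted decision matrix, the approach described later in this session: score each platform per criterion, multiply by the criterion's weight, and rank the totals. Here is a minimal sketch in Python; the criteria, weights, scores and platform names are all invented for illustration and are not UCEM's actual figures.

```python
# Illustrative weighted decision matrix for comparing candidate platforms.
# All criteria, weights and scores below are invented for this example.

criteria = {            # criterion -> weight; weights sum to 1.0
    "cost": 0.20,
    "usability": 0.30,
    "integration": 0.30,
    "accessibility": 0.20,
}

# Hypothetical scores, 1 (poor) to 5 (excellent), per criterion.
scores = {
    "Platform A": {"cost": 4, "usability": 3, "integration": 5, "accessibility": 4},
    "Platform B": {"cost": 3, "usability": 5, "integration": 4, "accessibility": 3},
    "Platform C": {"cost": 5, "usability": 2, "integration": 3, "accessibility": 4},
}

def weighted_total(platform_scores: dict) -> float:
    """Sum of score * weight over all criteria."""
    return sum(platform_scores[c] * w for c, w in criteria.items())

# Rank platforms by their weighted totals, highest first.
ranking = sorted(scores, key=lambda p: weighted_total(scores[p]), reverse=True)
for platform in ranking:
    print(f"{platform}: {weighted_total(scores[platform]):.2f}")
```

Keeping the weights separate from the scores makes it easy to check how sensitive the final ranking is to a shift in priorities, for example raising the weight on accessibility.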
The technology acceptance model is quite an interesting one. It suggests that when users are presented with a new technology, a number of different factors influence their decision, as you can see from the little graphic at the bottom. One is perceived usefulness, defined as the degree to which a person believes that a particular system would enhance their job performance; another is perceived ease of use, that is, whether they feel they would be able to use the system without much effort. If the technology isn't easy to use, what are the barriers, and how are they overcome? Those are the things to think about. Jakob Nielsen is someone I refer to a lot, and those of you who have been involved in websites will probably know him from his usability testing. He talks about learnability, efficiency, memorability, errors and satisfaction, and also the importance of user testing; his idea was always to bring in something like five users, and that just this small number of users would have a big impact on testing your site. Another one, which Tharindu suggested, is Tony Bates' SECTIONS model. This provides a framework for determining the appropriateness of teaching technologies, and it's used very much in the school and classroom sector.
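Nielsen's "about five users" advice isn't arbitrary: it comes from the Nielsen and Landauer model, in which the proportion of usability problems found by n testers is roughly 1 - (1 - L)^n, where L is the average per-user discovery rate (about 0.31 in their published data; a real project's rate will differ). A quick sketch:

```python
# Nielsen & Landauer's estimate of the share of usability problems
# found by n test users: found(n) = 1 - (1 - L)**n.
# L = 0.31 is the average per-user discovery rate from their studies;
# it is an empirical average, not a constant of any particular system.
L = 0.31

def problems_found(n: int, rate: float = L) -> float:
    """Expected proportion of usability problems uncovered by n users."""
    return 1 - (1 - rate) ** n

for n in (1, 3, 5, 15):
    print(f"{n:2d} users -> {problems_found(n):.0%} of problems")
```

With L = 0.31, five users already surface roughly 84% of the problems, which is why Nielsen argues for several small rounds of testing rather than one large one.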
So the model looks at the students: who are your students, and what are their demographics and digital skills? Then ease of use again: technology that's reliable for teaching and easy to set up. Then the costs involved, which is really key, and one of the things they talk about here that's quite interesting is whether there is extra funding available for implementing a particular solution. I noticed in the previous panel that there was someone from one of the funding agencies, though I can't remember the name of it now, but going out and getting funding for implementing solutions is really key as well. They also look at things like organisational issues, networking, and security and privacy.

My second set of existing frameworks takes a slightly different approach: these are change management models that can support the process. My favourite here is probably ADKAR: awareness, desire, knowledge, ability and reinforcement. It's very much about building that awareness and desire within your stakeholder base. The model was developed about two decades ago by Jeff Hiatt, who studied the patterns of change within 700 different organisations. We've got some other ones here too: McKinsey's 7S model, then Lewin, Bates and Kotter, and there are other models worth looking at as well. They really help us understand some of the reasons for failure. They cover things like not getting the sponsor right, and the importance of senior support; understanding the issue before you start on the solution, so making sure you understand the problem fully and avoid jumping to an early answer; and communication, which is very important in change management: how do you communicate what's happening to those involved, and make sure they're involved, but not too involved? It's
something really about feeding information to these people. Governance is also covered in the change management approaches. The final thing I've got here is CMALT, which I know some of you are working towards or already hold. There's a part within core area one that looks at understanding the constraints and benefits of different technologies, so some of the things we're talking about today can really help there.

So what did we do at UCEM? We were predominantly looking for a new assessment platform, and platforms and technologies associated with assessment. We started off by carrying out a thorough investigation of existing processes. You can see a spreadsheet there that I used, which captured every single step of our assessment process; we went and interviewed lots of different people to find out those steps, and we also documented the problems with each particular step, so we got a really good idea of the type of issues we wanted to look at. We then created a functional requirements document. You can see the document was originally written for a CMA, because that's what we thought we were looking for, but as part of writing the functional requirements and working through the processes, we decided we would need to look for something slightly different, and moved on to looking at a platform. We then identified the different systems to evaluate; we looked at four systems for our assessment platform: Inspera, BetterExaminations, Questionmark and UNIwise. Then we worked through a systematic testing and evaluation plan, inspired by the EDUCAUSE rubric I spoke about earlier. Here's what the rubric looks like and some of the things it covers. It looks at functionality, where again ease of use is really important, but also things like technical
support, and hypermediality, which is about interactive, clickable elements such as graphics and hyperlinks and how they all flow; accessibility, a really important area; and the equipment required. Then there's the technical side, very much looking at integration with our existing systems: the student records system we currently use, SITS, and how the platform would integrate with that and with our VLE, through things like an LTI; plus desktop and laptop operating systems and browsers, so testing things on different types of browser, and pulling mobile design into this as well. Data is one of the key areas you really need to consider as part of the decision-making process: how data is stored and goes into the system, who the data belongs to, and where the data is physically held. Location is really key, because you might reject systems that hold data outside of the EU; we tend to store things in Ireland now, and the data needs to be in the right place. Then there's archiving, saving and exporting data: if we move on from this system, can we get the data out? Finally, there's how the system works in use: things like social presence, teaching presence, how it works for the academics, learning analytics and the type of data that can be pulled out, and cognitive presence as well.

We then carried out this systematic testing, which did include a lot of other areas. We started off by looking at lots of different scenarios and tasks, mainly based around the assessment workflow: authoring and writing of the assessments, delivery of the assessments to the students, and then the marking and moderation workflow. We worked through all those different scenarios, and we also allocated different roles to each other and made sure that we went through
and tested all of those, looked at the data again, and tested in different environments. Finally, we looked at the documentation from the systems we were evaluating, and then we came up with a decision matrix. We evaluated all the different software solutions we'd been looking at, with different weightings for different criteria based on things like cost, usability and integration with our current systems; we weighted them and made a decision. IT were also involved at this point, and they brought in a software and cloud solution approval process: they went in and almost interviewed the company delivering the solution, to check things like whether they adhere to certain safety protocols, and to make sure that their system would definitely integrate with ours, as IT have much more technical knowledge there. Off the back of this we wrote a recommendation giving the reasons for our choices, and why we had chosen one system over another.

One thing I just wanted to note here is that UCEM is currently building a student consultation panel. It will be similar to the approach the OU takes: they have a pool of students who come in and do usability testing on different activities and approaches. I think bringing your students in when you're making these decisions and going through the evaluation process is really important and really useful. And of course, once all this has happened, a decision needs to be made and agreed by management, and then the work really begins, because then you start the full implementation process.

Okay, so that's the story from us, but what we really wanted was for you to input your own experiences here. I'm not entirely sure how much time we've got left at the moment; we were going to allow a bit longer for this, but we think what we'll do
is we'll actually tweet this out so people can add their own ideas at a different point. What we'd like you to do now is consider a few questions, so let me just go through them with you. First, do you use existing frameworks, and if so, could you share them if we haven't mentioned them already? Another one I've heard of out there is the ACTIONS model, but you may have your own, perhaps an institutional one that's been designed internally. Second, what works when making decisions? For example, it might be things like getting stakeholders involved from the beginning. Also, what doesn't work, or what traps are out there? For example, we struggled with the fact that quite a lot of the systems we were testing wouldn't let us test a full version; they gave us a cut-down version, so that might be one of the issues you've had too. Budget is something we struggled with as well, so what kind of challenges do you see? Then, how can you ensure a thorough evaluation process that is still time- and resource-effective? What are your thoughts here? Maybe it's involving more people, or maybe you've got something else to suggest. And finally, who should be involved in the evaluation process? I know, all stakeholders, but are there people who quite often get left out of this process? For example, we were very keen to involve the exams unit, or assessment unit, along with the learning technologists, the designers, the exam markers and people like that. Mural is pretty easy to use, and we've laid out a pile of post-it notes for you to have a go with. We'll keep the Mural open for another day or two and then summarise the results in some sort of blog post on the site. Hopefully the details have been shared on the page and people can start adding to the Mural. I've actually got a copy of the Mural
here that people have started on, which is really exciting. Sorry, just bear with me a second, and you can actually see the Mural as it now is. I think I'm back on the screen now. I've gone a little bit fast, but I've stuck to my 20 minutes, and I can see people going in, and possibly struggling with the Mural layout, but it's a good tool, I believe; I've seen it run in a couple of other sessions. So I think I'm open to questions now.

Hi Marika, I'm your session chair. Thank you so much for a really interesting presentation, and awesome to hear a shout-out to the CMALT holders in the audience. We do have a question already coming in, although I can see many people are busy on the Mural as well. It's from Moira Sarsfield, so we'll start with that. Moira wants to know: how do you arrive at the shortlist of solutions to test in more detail, given there can be deal-breakers from different perspectives?

Okay, that is a really great question. As a starting-off piece of work, I prepped and went and spoke to a lot more providers than the four we eventually brought it down to. I had a meeting with each of them where I talked through our requirements and then heard from them. One of the big deal-breakers for us was that we need an LTI integration for this particular platform, so if providers weren't willing to do that, we had to say, well, I don't think your solution is right for us. There were other things too: some solutions were clearly aimed at much bigger institutions than ours and didn't seem to scale very well, and sometimes the licensing costs come into that. So I whittled it down to four different platforms that I thought were going to work for us, and spoke to them. I've heard of bigger institutions that have gone through a big procurement process and looked at almost up to 20 solutions, so a lot of it is about how big your institution
is, what you are looking for, and what it is you want to get from the particular platform.

Thank you. That sounds like a really big difference in scale, depending on the institution and the context. We also had a comment from Rod earlier, who mentioned the importance of a good project sponsor. Could you speak a little more about that? How has it impacted the work you're talking about today?

Yeah, I think that's key, and you really do need support from up above. But when you say project sponsor, this could come from different areas; for example, if you're bringing in a particular pedagogic tool, you need an academic sponsor, and I think it's really important that you have someone who's very much on board with it. In my previous institution we actually wrote a procurement form where we wanted people to indicate who the academic sponsor was and how they were going to be involved in the process. I also think it's important that that gets updated to some extent, because what you might end up with is tools that the institution is still paying for but that nobody's really using, because they don't have a sponsor behind them. So yes, it's great to get senior support, but also support from someone who's actually going to use the tool.

That's really helpful, actually. I was wondering, what is the next step in this project? Where are you hoping to take it next?
Yeah, it's very timely: we've just announced that we are going with Inspera, which is very exciting. We'll be piloting Inspera over the next couple of months, and we've just written a blog post about that. So now we're going into full project-management mode, having made the decision, but at least we know that, no matter what happens ahead, we've had a really sturdy process behind the decision, so we're really committed to it, and we're looking forward to the pilot.

Wow, that sounds like there's a lot of work ahead, and I can see people are all engaging. How long will the Mural be open for? We've got about a minute to wrap up, if you want to make any final comments.

The Mural is going to be there for a couple of days. Just to close up: if you find the slides, there are references on there as well, so people can go and refer to any of those models. There are potentially lots of other models out there that we haven't heard about; I did do a big search, but I think it would be really great to pull some of those in and maybe move this forward as a sector. I'm on so many mailing lists where people start a project and the first thing they do is mail the list saying, "I'm starting this project and I don't know what to do", so it would be really great to have a bit more of a framework we can use as a sector, and I'm hoping that's what will happen with this.

Fantastic. To both of you, thank you so much for what's been a fascinating presentation; I've certainly learned a lot, and there are great references there to follow up. I can see many people are working rather than applauding in the chat, but from everybody here, thank you so much for joining us, and thanks everybody for joining the session. It's been fun.
Thank you, bye.