I'm now delighted to introduce Aurelie, a lady who needs no introduction. Aurelie has recently moved to UCL, but we had a long partnership with Aurelie at DCU for many a year, and I'm delighted she's able to share her insights with us today. So Aurelie, what I will do is, are you happy for a 15 and five, a 15-minute presentation and five minutes of Q&A, or do you want to do 10 and 10? Let's go for 15 and five. And if I go way over 15, is that a problem? Oh, right, right, no bother whatsoever. Okay, right, the clock is starting and the floor is yours.

Thank you very much, Mark, and good afternoon everyone. This is my first time presenting at MoodleMunch, and I'm really privileged to show you what I've learned so far at UCL. So this is my vision, my view of what we do at UCL. I only started at UCL a couple of months ago, in January, as a learning technologist, and I'm going to show you our approach to supporting academics with digital resources and content within Moodle. And so this is to, oh, I can't click through. Yes, technology, hey. This maps onto the digital resources area of the DigCompEdu framework. Specifically, within digital resources we've selected: selecting digital resources, that is, identifying, assessing and selecting them, and supporting academics to choose those resources and consider them within the broader learning objectives and context; modifying and building on existing openly licensed resources; and especially 2.3, which is what we focus on in our example today: organising digital content, within managing, protecting and sharing digital resources. But really, the approach I'm going to show is valid for all resource creation, supporting academics in doing all of this management of resources.

So who are UCL? I joined UCL a couple of months ago, and I have worked with colleagues at UCL for quite a few years, on and off, so I knew they were a big university. But it's not just big, it's massive.
So: 11 academic faculties made up of many, many more departments, over 43,000 students and over 14,000 employees. It's a big institution, and as a result there are a lot of different practices with regard to digital education, a wide variety of technologies in use, a wide variety of practices in terms of the competency framework we've talked about, and different levels of application of digital education.

Who are we? Our team is Digital Education, the digital education support, and we support digital education across UCL, but each of the departments and faculties also has additional digital education support to varying degrees. My team is the digital education core. We have six learning technologists who are mainly doing support work, but we also do documentation, training, anything to do with the core systems, which is basically Moodle and the additional services that enable digital education. Departments might then have additional technologies that they license locally for their department and support themselves.

As part of Digital Education, we are a really wide team. We have Digital Education Advisory: some of you may be familiar with the UCL Baseline, which in my previous university, Coventry University, we adapted to make our framework for approaching module development and design, with checklists as well, and the MOT. That's on the wiki, and a lot of people are familiar with that; it's done by the education advisory team. Then we've got the futures team, looking at new technologies, at the moment a lot of things around immersive spaces, etc. Digital Skills Development and Digital Accessibility are sub-teams of Digital Education as well. We've got Online Learning, which is the more CPD, external-facing side, with another Moodle called Extend, which some of you may be familiar with.
And then we've got the faculty and departmental learning technologists, with a learning technology head per faculty, who support their departments specifically with their needs and represent them within our team as well. I sit within the core team.

So what do we do? We solve problems, with regard to the aspects I mentioned earlier: selecting digital resources, and managing, protecting and sharing digital content and resources. How do we do that? We have the learning technologists looking at different tools and analysing different technologies. We've got the Baseline that I've just mentioned and templates that go with it, the Connected Learning Template, which is aligned with the Baseline and helps academics with a starting point. We've got the wiki, and we've got Moodle and a variety of plugins and third-party services. I actually had to put a list together in front of me: Moodle, H5P, Blackboard Ally, Mahara (called MyPortfolio), Lecture Capture (Echo360), which is plugged into this as well, connections to the timetable and student record systems, Opinio, Collaborate, Zoom, Teams, all the video content that you can imagine, Turnitin, WISEflow and Mentimeter. And there are many more, but these are the key core tools that people have available if they need them. So people need a lot of support, documentation and guidance as to what's appropriate for the context, which is in that area 2 of the digital education framework.

How we approach this in core is that we work with the learning applications team, if you want (they might not be happy with me calling them that), on the more techie side of things. Together, as what we call a squad, we support Moodle as a product. We work in an agile approach: an approach that enables a collaborative, time-boxed and chunked approach to developing software, normally, but in our case to improving Moodle as a product and the attached services. And just as a caveat, I am new.
So everything I'm discussing today is from my observations over the last couple of months and discussions with colleagues, and things are changing a lot as well, since we're working in an agile way. How do we do it? We do story mapping. We're not just doing requirement gathering in the sense of, for those familiar with MoSCoW, requirement prioritisation; we don't just collect requirements like that. We talk with the users and we try to get the story, learning about the issue we're trying to solve. We then do some testing, documentation and feedback, and we try to apply continuous improvement as well.

So, to give you some kind of illustration, I'm going to take you through a little story of a request, from initiation to implementation, and I'll try to do that very quickly. We have an academic who has a problem with the amount of content and resources they have in their student pages, and they just can't remove the content; the content has to be there. With the pandemic happening and additional asynchronous learning material, they have had to add more content, and they need to find a solution to present those resources, and to develop resources and activities appropriately in Moodle, to make them more navigable, more accessible, more approachable for the students in their learning path. So they talked about it at their next faculty teaching committee. And here is a picture of how committees are now, obviously: it's not a round table, it's a Teams meeting. And there was our digital education service manager, Jason, in this case (it could have been a learning technologist, anybody from our team), listening and trying to capture that requirement: they need a solution to help them deliver those resources. So the next step is to understand the story: discussing this with the person and trying to understand who they are, who the users are that we're trying to help.
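To make the contrast concrete, here is a minimal sketch, in Python, of the difference between a bare MoSCoW-style requirement and a mapped user story. The field names are illustrative assumptions for this talk's example, not UCL's actual process artefacts or tracker schema.

```python
# Bare requirement gathering: a feature plus a MoSCoW priority label
# ("Must have" / "Should have" / "Could have" / "Won't have").
requirement = {
    "feature": "better content layout",
    "priority": "Should have",
}

# Story mapping: the same request, but framed around who has the
# problem and why, so the eventual solution can be tested against
# the story rather than against a feature list. (Hypothetical fields.)
story = {
    "who": "academic with a content-heavy Moodle course",
    "problem": "extra asynchronous material makes the page hard to navigate",
    "goal": "resources presented so students can follow their learning path",
    "solved_looks_like": "content broken into visual, navigable chunks",
}

print(story["goal"])
```

The point of the second shape is that later testing can ask "does this plugin deliver `solved_looks_like`?" instead of only "does the feature work?".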
What would the solution, plugin or whatever we're talking about, help them do better? Our aim is to help academics and students do better. What does the solved problem look like? What would they imagine their solution to feel like, in a way? Then we research the different technological solutions. It's not always a plugin: it could be an LTI connection, it could be a completely different service, it could be just a tweak to an existing course format or configuration in Moodle. And we have a checklist: for anything that's a plugin or an LTI, we go through a structured approach to ensuring the technology solution is reliable and complies with data protection and accessibility. This checklist brings us more in line with that area 2 of the digital competency framework, with all these checks happening to make sure things are shared safely, that we favour open source where we can, that things are up to date and accessible, and that data protection is maintained.

And then this is me: we test the technology against the steps the story has helped us establish. So we actually check against what we are trying to solve, not just functional testing, but testing with our purpose. Once we've done the testing, we do the documentation; sometimes it happens in parallel. So we document the usage. As you can see in the example here, what we found for this person is the Flexible format course format. It's a little bit like the Grid format, for those who are familiar with it, but with additional content in there. And with the Flexible format, we gave them something called Structured Label, an additional resource type which enables a more visual approach, with tabs, well, buttons if you want, linking to the content at the bottom. So it structures the content, makes it more visual, breaks it down, and that's what they needed in this case. So: usage, limitations, pitfalls, non-issues. In this case, there were a couple of limitations that didn't stop us implementing the solution.
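The vetting checklist described above can be sketched as a simple structured record. This is a minimal illustration of the idea, assuming a pass/fail gate on the hard checks; the field names and the `PluginReview` class are hypothetical, not UCL's actual tooling.

```python
from dataclasses import dataclass, field


@dataclass
class PluginReview:
    """One candidate plugin or LTI connection under review (illustrative)."""
    name: str
    open_source: bool                # open source is favoured, not required
    actively_maintained: bool        # code up to date and supported
    data_protection_ok: bool         # GDPR / data-protection check
    accessibility_ok: bool           # accessibility review
    notes: list[str] = field(default_factory=list)

    def approved(self) -> bool:
        # A plugin that fails any hard check is not implemented,
        # however useful it might be.
        return all([
            self.actively_maintained,
            self.data_protection_ok,
            self.accessibility_ok,
        ])


review = PluginReview(
    name="Structured Label",
    open_source=True,
    actively_maintained=True,
    data_protection_ok=True,
    accessibility_ok=True,
    notes=["editing more than three tabs can stop them displaying"],
)
print(review.approved())  # True: passes all the hard checks
```

Note that `open_source` deliberately sits outside `approved()`: it is a preference, while maintenance, data protection and accessibility are the strict gates the talk describes.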
But we actually had to document them, because there was a technical issue with adding more than three buttons or tabs: when you edit them, it would stop them displaying, so you have to make sure you reopen them. All of that is under the caution part of the wiki, and we will obviously add training for those implementing this course format.

We get some feedback. So we have what we call sprint demos. I'm not sure how much I mentioned this at the start, but all these cyclical development activities happen over a two-week period. We've got a sprint period of two weeks where we review the plugins, etc., and then we will look at implementing them, or make changes and start the sprint again. The sprint demo at the end of the two weeks shows what we've done to the broader team and to our colleagues. We also review the story with the user themselves and the solutions, and potentially make some changes or a configuration change, or actually decide not to implement it if it's not suitable.

And then finally we deploy it. This is happening next week, well, this week, sorry: we're going to have three new plugins out following these requirements, Structured Label and the Flexible format for this person, and kuraCloud for another, completely different requirement (what used to be called LabTutor), for people wanting to do science labs online. And then potentially we repeat. It can be a bit overwhelming, it's fast-paced, you go through a lot of things, but it's extremely rewarding to be able to give academics what they need for teaching when they need it, and this is what we're trying to achieve here. There are downsides to it: you can't start something straight away, for example, you have to prioritise it for the next sprint. But it does mean that we can help and deploy things quite quickly if they are suitable. It also means that we are quite strict with our checklist: if the code is not right, or it's not supported properly, we will not implement it.
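The two-week cycle just described can be summarised as an ordered loop of stages. This is only a rough sketch of the workflow as presented in the talk; the stage names are my paraphrases, not official process terminology.

```python
# Illustrative stages of one two-week sprint, in the order described.
SPRINT_STAGES = [
    "capture story",           # e.g. at a faculty teaching committee
    "research solutions",      # plugin, LTI connection, config change...
    "run checklist",           # reliability, data protection, accessibility
    "test against the story",  # purpose-driven, not just functional testing
    "document usage",          # usage, limitations, pitfalls, non-issues
    "sprint demo and feedback",  # review with the team and the user
    "deploy or drop",          # ship it, change it, or decide not to implement
]


def run_sprint(stages):
    """Yield each stage in order; in practice the whole cycle then repeats."""
    for stage in stages:
        yield stage


print(list(run_sprint(SPRINT_STAGES))[-1])  # "deploy or drop"
```

The trade-off the speaker mentions falls out of this shape: new requests wait for the next cycle, but anything that reaches "deploy or drop" has already passed the checks.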
Any questions? What I'm going to do is stop the share, I think, because I can't see the chat, and I can see there are lots of things in the chat.