So, I'm going to do a 15-minute presentation, with five minutes of questions. OK? Away you go, Christine. OK, thank you. My name's Christine Gardner from the Open University, and I've been working on a data analytics project with Alan Jones, David Chapman and Helen Jeffries. Alan and David are both part of the TM355 module team, as am I. That's a communications technologies module with the Open University. Helen Jeffries is also part of our computing and communications school. She's not directly attached to the module, but she does work with a lot of the Open University activity in our school. So, just to tell you a little bit about the module itself: it's a distance learning module. Most of the materials are print-based, but as it's a Level 3 module, which equates to the third year at a conventional university, there are some quite difficult concepts within it. So, we've developed some online resources to supplement the printed text. And the focus of this research project is on those online resources that supplement the written text of the module. The module itself is divided into three blocks.
The first one is about signal processing. The second block is about coding and error correction of those transmitted codes. The third block of the module builds on those kinds of topics. So, if we just have a quick look at the outline of this research, there are two main aspects. We're using data analytics in the first part of the project. Specifically, what we want to know is how students on this module engage with the technology-enhanced learning and teaching tools that we've devised to help them with those tricky topics as they get to that point in the module. We want to know whether they're using the tools at predicted points. We want to know if they go back and use those tools again. And we want to know if student engagement is changing across the different presentations of the module, because we don't know whether the previous presentation was typical, for example. But to supplement that, we also wanted to interview some of the students, just to get more of an insight into how they're using the tools. Do the technology-enhanced learning tools actually help them to understand the topics better? Or are they deterred from using these online tools because they're too complicated or too time-consuming, et cetera? We really want to know a bit more about why they are or aren't using them. So, what we actually use is this analytics tool, Analytics for Action, which was developed at the Open University. What we've been doing is looking at the analytics data and investigating specific issues, and where we are in the project at the moment is identifying actions and prioritising those actions, before thinking about selecting the methodologies and evaluating outcomes. So, this is a work-in-progress report at the moment. What we're trying to do is see how specific students are using the online tools.
But the Analytics for Action tool identifies how students are using particular tools at a sort of top level. It doesn't give us the actual detail on which students are using which online resources. Just to give you an example, this first example at the top is showing the discrete cosine transform online tool and how students are using it. If you look at the top there, with the blue bar, that's actually an assessment point on the module. With this particular online tool, you can see that the students have realised that this particular topic is relevant to their assessment at the end of a block, and they've gone to the relevant tool and used it. There's a very obvious peak there. If you look at the screenshot at the bottom, with Hamming codes, there's actually a different pattern of use. Again, the blue bars indicate assessment points: the first one is the end of the first block, the second blue bar is the end of the second block, and the third blue bar is the end of the third block. With this particular online tool, you can see that the students have used the Hamming codes online tool throughout the block. There are some peaks and troughs, but it's being used throughout the relevant block. Then there's a peak at the end, where the students have gone back and used it for revision. So, what we did was have a look at how the students actually performed in the exam at the end of the module, to see if it made any difference whether the students used the online resources at revision time, and also to question them: if they didn't use them, why aren't they being used more extensively? A lot of time and effort has gone into devising and developing the tools, so we'd like to know a bit more about why they are or are not being used. So, the particular prompt for this research was an exam question that was set in the 2016-2017 presentation. It was based on error control codes, which is a block two topic.
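The usage patterns in those screenshots come down to counting tool accesses over time and comparing the peaks against assessment dates. A minimal sketch of that kind of aggregation, with an invented access log and invented dates (this is not the Analytics for Action implementation, just an illustration of the idea):

```python
from collections import Counter
from datetime import date

# Hypothetical access log: one (student_id, tool, date) record per visit.
# Student IDs, tool names and dates are invented for illustration.
access_log = [
    ("s1", "hamming_codes", date(2017, 2, 6)),
    ("s2", "hamming_codes", date(2017, 2, 8)),
    ("s1", "dct_tool",      date(2017, 3, 1)),
    ("s3", "hamming_codes", date(2017, 5, 29)),
    ("s2", "hamming_codes", date(2017, 5, 30)),
]

def weekly_counts(log, tool, start):
    """Count accesses to `tool` per week, with weeks indexed from `start`."""
    weeks = Counter()
    for _, t, d in log:
        if t == tool:
            weeks[(d - start).days // 7] += 1
    return dict(weeks)

counts = weekly_counts(access_log, "hamming_codes", date(2017, 2, 6))
# Weeks with high counts near an assessment date suggest revision use.
```

Plotting those weekly counts with the assessment weeks marked gives exactly the peaks-and-troughs view described above.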
The exam scores for this particular question were relatively low, and we knew that there was a specific online tool which would have been very helpful for the students, had they used it in their revision. So, using Analytics for Action, we could have a look to see how that tool was used and whether it actually made a difference to those students. The particular question was not a popular choice with the students. We could look at the top-level data and see that this particular tool was used in the relevant block of the module. Some of the students went back and used it at revision time, but only about 20% of the student cohort actually used the relevant error control codes online resource. So, what we wanted to do was check the analytics data. We could see that there was a subset of students that we wanted to investigate further: those who actually answered that particular exam question. We worked with the TEL team to interrogate the analytics data more deeply. In particular, we wanted to know if those students who answered the question actually used the associated online tool. To supplement that, we also wanted to ask the students whether they used the tool, and to check whether the analytics data was actually giving us the type of responses we were expecting. So, when we looked at the data, we could see which students used that particular online tool. What we found when we matched that to their exam scores was that the overall average score for that particular question was 45%. For those who didn't use the technology-enhanced learning and teaching tool at all, the average score was 30%. If they used that particular tool at least once, their exam score averaged 53%. If they used the tool specifically at revision time, the average score was 52%. If they used the online tool multiple times, they averaged 58%.
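The comparison just described, average exam score broken down by how students used the tool, can be sketched in a few lines. The student records below are invented for illustration; the real analysis used the module's analytics data:

```python
# Each record: how often the student used the tool, whether they used it
# at revision time, and their mark on the exam question (all invented).
students = [
    {"id": "s1", "uses": 0, "used_at_revision": False, "score": 30},
    {"id": "s2", "uses": 1, "used_at_revision": False, "score": 50},
    {"id": "s3", "uses": 3, "used_at_revision": True,  "score": 60},
    {"id": "s4", "uses": 2, "used_at_revision": True,  "score": 55},
]

def mean_score(group):
    """Average exam score across a group of student records."""
    return sum(s["score"] for s in group) / len(group)

overall     = mean_score(students)
non_users   = mean_score([s for s in students if s["uses"] == 0])
any_use     = mean_score([s for s in students if s["uses"] >= 1])
at_revision = mean_score([s for s in students if s["used_at_revision"]])
multi_use   = mean_score([s for s in students if s["uses"] > 1])
```

Each filter corresponds to one of the figures quoted above: never used, used at least once, used at revision time, used multiple times.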
So, although this is quite a small sample of students, it was encouraging to see what the data was showing: those students who used the online tools were performing slightly better. As it was a very small sample, we wanted to see if this particular group was any different from the cohort as a whole. So, we used a different predictive analytics tool, one that is used by the student support team, to see whether our sample of students who answered that question were any different from the students who didn't answer it. Looking at the overall predicted pass rates, our sample was very slightly weaker than the cohort that didn't answer that question: a 0.85 probability of passing versus 0.87. So, there's no particular reason why those students should have performed badly on that question. What we wanted to do to follow this up was to find out a little bit more about why students were or were not using the online resources. So, we interviewed a small sample of students, and we got some quotes from them: seeing the coding in practice and having interaction actually helped with their understanding of the topic. The online resources are also very good for self-testing, and that was noted by several students in the sample. One of the comments from the students was, "why wouldn't you use these resources?" That's exactly what we were trying to find out. Obviously, there were also some negative comments — one was that a particular video clip didn't really add anything — but overall the comments were relatively positive. So, what we want to do now is to progress this, and what we think will help is to give some indication of the time needed for the activities, and then add some descriptions of what kind of activities they are, because some of them are interactive and of different types.
So, give some kind of indication of what they're like, promote them in the student forums, possibly have some sort of talking-heads pieces on how students are using the interactive tools, and also mention them in the introductory and revision sessions that we're producing for the module. So, we're in this phase now of thinking about how that will best work. We're going to produce some descriptions and timings for the activities, review the use of the technology-enhanced learning and teaching tools for the next presentation, add advice to a revision podcast, interview the next cohort of students, and then see if that has actually made any difference. And then what we'll also do is consider additional guidance when we come to the mid-life review of the module. That's it. So, any questions? Thank you very much. OK, so any questions for Christine? I haven't seen any come through on Meetoo; obviously, you're very welcome to do that. A question here from the floor. Can we just wait for the microphone, please? I'm Richard Tree, also from the OU. I was just interested: so was it just a video? It didn't actually get the students to practise anything or do anything? Oh, no, they were interactive. The students were using various interactive tools to input different data, for example, and actually playing with them. So, it wasn't watching videos; it was interactive. And Helen's here, who was also working on the project. So, it wasn't just video, there was interaction. The question at the top: you showed there that the sample size isn't massive, but it does show some compelling evidence. Had you thought about carrying out some statistical analysis, just to see whether or not there was a statistically significant difference between the samples? Yes, so we haven't done that yet, because the sample size was so small.
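One option for the statistical check being asked about here, given such small samples, is a permutation test, which makes no normality assumptions. A minimal sketch with invented scores (`perm_test` is a hypothetical helper, not part of any OU tooling):

```python
import random

# Invented exam scores for two small groups of students who answered
# the question: those who used the online tool, and those who didn't.
users     = [53, 58, 52, 60, 55]
non_users = [30, 35, 28, 40]

def perm_test(a, b, n_iter=10000, seed=42):
    """Two-sided permutation test on the difference of group means.

    Repeatedly reshuffles the pooled scores into groups of the original
    sizes and counts how often the shuffled difference is at least as
    extreme as the observed one.
    """
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    extreme = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            extreme += 1
    return extreme / n_iter

p = perm_test(users, non_users)  # small p suggests a real difference
```

With samples this small, an exact test over all possible group assignments would also be feasible; the resampling version above is just the simplest to write down.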
That was why it seemed to make more sense to compare that cohort to the other students who hadn't answered the question. But it's certainly something that we can look into because, as I say, this is a work in progress. So, I think what might be useful is to break down that cohort of 48 students into those who did reasonably well compared to those who didn't, review their online tool use further to see whether there's any difference there, and then possibly work on that data to see if there is some significance there. OK, maybe one more question. OK, maybe I'll ask it then. So, you were showing aggregated data there; was there any chance you could look at individual students' usage patterns, to see any interesting categorisation of use? Yes, that was where we managed to get the student details from the aggregated data, to go back and see how those particular students answered the exam questions. But that is something that, as I say, we can work further on, to break it down and see who did well and who didn't do so well. OK, thank you. Thank you very much indeed. How do I move this over to the... Oh, we've got it. Good. Thank you, guys. OK, so our next speakers are from Monash College. We have Kulari and we have Sharon. They're, again, going to speak for 15 minutes, and then somehow, through the wonders of technology, we're going to ask them questions. So, please use Meetoo to post any questions; obviously, there'll be opportunities at the end as well. So, hopefully, Kulari and Sharon, if you'd like to begin — everyone can see you and hear you, hopefully. Hi, I hope you can hear me. We can hear you. Good, good. So, I'm Kulari, director of e-learning, and with me here today is Sharon, who is our senior learning content designer, from Monash College in Melbourne, Australia.
So, what I'm going to do is give a brief introduction to our project, and Sharon will walk you through the data that we have collected and the findings based on that. So, let's move on, Sharon, to give a brief overview. Our journey so far: the college is fully owned by Monash University, and we started using Echo360 alongside our existing tools and future plans. In this process, we have a very comprehensive environment where we link the strategy with our teaching and learning pedagogies, supported through a learning design process. It's based on Gilly Salmon's Carpe Diem process: we use that learning design approach with the five-stage model that Gilly researched and put a lot of research into. While she was here, and back in Australia, that's something that we embraced, and it has worked very, very well for us. So, it's part of our strategy. We have the governance, which supports how we roll out these digital tools, including Echo360. Students and staff access everything through single sign-on: they log into their virtual learning environment and have all the tools in there. So, like I said, we pick a unit or a subject and design it using the learning design process, Carpe Diem. It's a collaborative approach to delivery, and as part of that we bring in the technology, the right technology. This was one of the first ways that we designed and integrated Echo360: how do we use it? What would the benefit be? Why would it be part of our strategy? To give you a sense of who our learners are: most of them are international students, English is not their first language, and they come from very different cultures.
Their learning styles are also very different; they are used to a different style of learning. So, to increase integration and performance, all those things, we use Echo360 as a vehicle to get to that end point. To support all of this, we've got a new learning space, and I'll show you in the next slide what the environment looks like. This is a classroom where we have a lectern from which we can lecture and control everything, but it's very flexible for learning. This is where we started using the Echo360 environment, and based on that we have done a few iterations with multiple subjects: business subjects, engineering subjects, and academic English subjects. And what we have found is quite a nice set of data, which I'll be showing you. Thank you, Kulari. So, for those of you who aren't too familiar with it, just to give you an overview of the Echo360 platform: it's a cloud-based learning and teaching platform which integrates content management, student engagement, lecture capture and engagement analytics. As Kulari mentioned, like most of our tools, it integrates into our LMS, which provides our students with single sign-on, security and accessibility. Our lecturers and students engage with the platform via their devices, and they can interact through interactive polling tasks with a variety of question types, such as multiple choice and true/false. There's a note-taking function where students can take notes that are synced to the lecture recordings. There's also a bookmarking function which allows students to bookmark certain points in the lecture where an important piece of information might have been given. They're able to flag confusion with the lecture at the push of a button; it's very visible to our lecturers, and they're able to address the confusion at the time or later on.
There's also a question and answer feature within the platform that allows students to ask the lecturer questions, and for our students this is a really useful tool, because it removes the fear of asking a question in front of a large group in a second language. So, that's a brief overview of the platform's features. We run student experience surveys from time to time on the ALP, the active learning platform, as it's called. These are done via an anonymous online form, and they are voluntary for the students to take part in. Our most recent survey was done in July this year, and we had 228 respondents from three different units and different subject groups. We asked our students how easy they found using Echo360 and, as you can see from the results here, it's quite overwhelming that they do find it very easy to use. The majority of students say it's fairly easy, and only a small number say they find it difficult. We asked students about their perception of how Echo360 is being used and what they perceive to be the benefit, and there was an overwhelming response: the majority of students believe that the note-taking function is the most useful feature of the platform. Actually, in previous surveys, students have reported that they take more notes using the platform than without it. Our students also use the platform outside class, watching the recordings to catch up on missed lectures or to review. A small number of students use it to respond to the lecturer and interact with their classmates. Now, the next slide is really interesting. We wanted to find out whether our students felt that the platform supported their learning and their language use, and it was pleasing to see an overwhelming response that they believe it does.
A later comment from a student was that it helped them practise their listening, by listening back to the lecture recordings. So the platform also helps them practise their use of English: reading, listening and writing. And this is actually an area that I think is worth much more investigation. What students liked least about using the platform included staring at a computer screen for a long time, not being able to annotate or make notes on the lecture slides, that it's reliant on internet access, and that they're unable to watch it as a live stream. So nothing, really, about the tool being difficult to use or not useful for their learning. What students liked best, they said, was obviously being able to review the recordings whenever they want and as many times as they want, and being able to be on the same page as the lecturer. I think they have a very good experience of the learning process, they enjoyed the interactive quizzes in class, and they find it a simple medium. The main message is that our students really enjoy using the platform now: they want to keep it for their learning, and they actually want it available for their other units. So that's some really positive feedback for us. Kulari, do you want to wrap it up? Yeah, so one of the bigger components of our strategy is to enhance the student learning experience and engagement while they're learning, as well as getting better results. And, as part of getting better results and getting jobs — because some of our students already have a master's degree and are trying to get a job — students go through the employability skills development programme.
So, these learning skills are really important for us, and that's one of the key areas that we need to focus on; being able to use Echo360 for that development is one of the important things. As well as that, in some of the units that initially started using this system — for example, in one particular subject — the increase in pass rate, as well as the increase in the average student mark, was really high. The first time, the pass rate increased by 20%, and the average mark increased a lot. So, we have seen, over nearly two years now, improvements showing that this tool is engaging our students, across subjects as well as across cohorts of students. And we have more and more staff and students requesting it, because we didn't make it available for all the subjects in one go — we only have a small number of venues that can record, but we have been able to use the web application. So, it's great to see the enthusiasm coming from staff and students, rather than it being a top-down drive, which is really good for us. So, that's the end of our sharing of these findings. I should also say that we get support from our colleagues: we are, back here in Australia, very well supported by Echo360. We have user groups and conferences, as well as regular updates and so on. So, we are very thankful to them for being so friendly and supporting us. I think that's about it from us, and we would like to get some questions from you. Sharon and Kulari, any questions? We've got some on the board on Meetoo. Any questions from the floor before we go to those? I'll start with the one from the board that came up first. So, between the two of you: are there any challenges with bandwidth and availability from the perspective of the local user, presumably in the actual classrooms themselves? That's for Kulari and Sharon. Yes, are there any challenges with bandwidth and availability from the perspective of the local user, the student?
We have worked with the IT department to put in additional Wi-Fi, so that problem is minimised. But if you look at this new space, this particular one was almost purpose-built around the same time that we adopted Echo360, so these are new learning environments or learning spaces. And if you look at the one slide prior to this, that space is designed and built using a set of shipping containers, but from the inside you cannot tell that. And they are very well suited to this type of learning, with the IT set up in creative ways. Yes, and another one. OK, thank you very much indeed. Any questions around how you arrived at the decision that Moodle was not sufficient for your needs? Moodle was not sufficient for your needs? OK, I don't think Moodle can cover everything, as a content system as well as all of the other interactive tools; but that doesn't mean it gives us nothing. Like I said, what we have now is 10 core learning tools integrated into Moodle. The core includes an online marking tool, Echo360 as our active learning platform, and Kaltura as our video management system; and we have other tools as well, including an adaptive learning system, Smart Sparrow. So there are a lot of tools that we have integrated, 10 core ones, and more besides. The justification is that Moodle is more than just one thing — it captures data, it does certain things — but it does not solve all the critical requirements that our staff and students need addressed. So, we did a review in 2015 that clearly showed where we were at and where we needed to be, and that started us off on this journey. OK, thank you very much indeed. Kulari and Sharon, we'll turn you around so that maybe you can see the audience. Can we just show our appreciation and give a round of applause? Thank you. Thank you both.
OK, our next speakers are Non and Amanda from the University of Hertfordshire. Can you change the screens, please? Is this yours? Yeah, that's the right one. Good. Thank you. And welcome to our session today. I'm Amanda Jefferies, and this is my colleague Non Scantlebury. We are from the University of Hertfordshire: I'm based in the School of Computer Science, and Non is based in Library and Computing Services, which is quite an interesting combination, isn't it? So, we're looking at academic and library services combining on a research project, and that's what we want to share with you today. We're delighted to be here sharing our recent research into our students' digital experiences. Can I just at this point say: this session links directly — perhaps it's a bit of a taster, a university case study of student digital engagement — with the Jisc presentation of the whole data set at 4.30 this afternoon in Room 2.220. So, if you like, this is what it's like on the ground at one particular university, and then you can see the whole 39,000 students' results later this afternoon. So, over to you, Non. Okay. Well, we're going to do a little poll in a minute to see how many of you in the room participated in the tracker project this year, and if you're not sure, we'd like to know that too. But just to revisit why we got involved in this: essentially, we started an enterprise-wide project at the university, looking at the digital capability needs of both staff and students. It's quite a complex organisation — there are several schools involved — very much typical of some of the things that we heard about this morning in the keynote, in terms of organisational complexity.
And so, it made sense for us to work with Jisc, because we were obviously aware of the work that they were doing around developing a framework for digital capability development. And if you don't know much about that, then visit the Jisc stand, and they'll tell you all about it. So, they basically started off with this framework. Some frameworks work, for others they don't; we felt that we could work with this one and adapt it to our own needs. And we're very keen to continue to work with them, particularly around the digital capability development of staff using their discovery tool. But for the first time, we actually got involved in the digital tracker with them this year. As many of us know, there are a lot of surveys going on for students, so we were sensitive to that fact. But we did get absolute support from our PVC for students for this, and having that endorsement at a senior level was enormously helpful. So, we were aware that there were lots of digital opportunities for students to get involved in, but it's not always coherently delivered in that way, which again was something that we were hearing very much about this morning. We were also very much aware of the assumption by many that, just because they're young people who are constantly on phones and various devices, students have a level of digital expertise. Actually, once we started asking them questions and interviewing them, as well as doing some focus group work, there were obvious gaps, particularly in terms of how they apply the use of digital to their learning experiences, and their confidence in using some of these technologies. So, where we are at the moment is thinking about: how do we develop this further? How do we focus the development? What sorts of development do we need around that framework where there are particular gaps, while building on strengths as well?
So, we got involved in the tracker project, and now we're actually going to ask you to take part in a Meetoo poll, just to find out how many of you here today are aware of and know whether your organisations took part in the tracker project this year. Ten seconds or so. I think it's a yes, a no, and a not sure. It's coming through. Poll is closing, poll is closed. Okay, so, interesting split here. We were particularly interested to note the not-sures because, again, typically in an organisation like ours, there are so many staff, so many students, and so much diversity of activity going on that it's really difficult to get the messages out there around why this is important, why you should engage, and what to do as a collective to address it. So, thank you very much for participating in that. Okay, so, for those of you that don't know much about the tracker: it was a series of questions, really, canvassing responses from students around their expectations and their experiences of using digital technology, particularly set within their learning contexts. So, there were questions that basically came from those four themes, and they invited responses to those. It was quite tricky to get engagement at the university, due to all the different things going on, and some universities had more traction around that than others. But we were very encouraged that a lot of canvassing did go on on the ground. We were out and about in the university, being very proactive. We were at the refreshers' fairs, we had stands, and we did a lot of activity to try and get the target audience to engage with us. And essentially, this is what we did to get that engagement. There was a target student population of 400 to take part in the tracker. We got 216 responses, which wasn't too bad. Our target, for this very first time with a small sample in the organisation, was 200, so we did actually exceed that.
We had support from the president of the students' union as well, so that was absolutely essential. Amanda and I are basically part of a spearhead strand of activity focusing on the students, and so we've got a lot of student involvement in it. Plus, we've also got some collaborative activity going for the first time with our careers service too. So we're trying to take a very collegiate, collaborative approach around this. The email to the target group basically went out from the president of the students' union itself, with an embedded link to the tracker. And as I previously mentioned, we were out and about doing all sorts of things as well, running opportunities for people to come along and actually sign in and take the tracker. We had a full set of Chromebooks available, so we were able to go out and about in different fora with those Chromebooks, so people could sit down and just do it there and then. Okay, we also offered an incentive, a prize draw, which was supported by the chief information officer at the university. And at this point, I'm going to hand over... Well, actually, we've got another Meetoo poll that we want you to do. So what we'd like to ask you all here is if you'd participate in building a moodle... a wordle! I've got Moodle on the brain after seeing the last presentation. So if you would please take part in building our wordle around some of the digital activities that you actually use in your modules. There are just some examples up there, but we'd like to hear anything at all that you'd like to contribute. You can see lots of typing going on from here. So, the reason behind this — perhaps we can close the poll now, thank you — is that one of the questions in the digital student tracker asked our students what kind of digital activities they were doing. Ah, and here we are. Thank you. Actually, it's really hard seeing it from up here.
Well, it's really interesting to see quizzes coming up so prominently. Yes, but the interesting thing is what our students said. So shall we go to this? So, which digital tools do students find useful for learning? Basically, we just captured the data here. The students didn't know that we were going to use it as a wordle, and it's not particularly well edited; we've just taken their words. YouTube, interesting. Canvas is our new VLE from this year. We also use StudyNet. So our first-year students, who were actually the majority of the students participating, would have said Canvas; and StudyNet, which appears just above YouTube, is actually the VLE that the second-year, final-year, and Master's students would be familiar with. Okay, so that was one of the interesting questions the students were asked, and it gave us lots of new information. And we're just going to work through some of the questions that we asked; we've picked out some which we found particularly interesting. So we're looking at student reaction to when digital technologies are used on their course. Do they understand things better? Yes, over 72% agreed, and very few disagreed with that. "I enjoy learning more." It's just so encouraging, for those of us who have been plugging away either as academics or as professional staff: I enjoy learning more. And then this issue of being more independent in my learning. A lot has been written about this in the research. Do we just make students more dependent? But actually the students here, nearly 78% of them, are saying: I'm more independent in my learning when digital technologies are used on my course. "I feel more connected with my lecturers." I think we would have liked to have seen that pushed up a bit, but still, that connection. Again, the challenge in a previous question was whether the students stopped coming to lectures. Well, actually, using digital technologies implies they feel more connected.
"Digital skills are important in my career." So there's a lot of emphasis on students using their course towards employability, and we are encouraged by the perception among students that digital skills are going to be important. But does their course prepare them for the digital workforce? We would have liked to have seen this a little bit higher, obviously; it's just under 50%. That's mitigated by the fact that a large proportion of the participants, over 30%, were from our first year, and typically we don't emphasise the employability side of courses in the first year, unless it's something vocational like nursing. So that message we have taken back to help us in our future planning and learning design. So, drawing some conclusions: we are encouraged that this relatively small sample shows that students are keen to engage digitally with their learning. These are things that we're taking back to our staff. We promoted it at the Learning and Teaching Conference back in June. The perception that online they learn better and more independently is encouraging. And the benefit of digital skills for future employment: I think that message is loud and clear. And this connection with the lecturers: we feel this is something we can build on, and I hope others will be encouraged as well. How we develop this as part of the strategy is what we're asking ourselves now, and we've got ideas. That's very much something that Nonn is taking forward in the next year. And at the same time, we're encouraging staff to update their skills to support student learning. You may remember, from our own wordle of the student feedback, Lynda. We have subscribed to Lynda.com as a university, and it is one of the areas that Nonn has been putting forward. So sorry I missed that last one. Okay. Developing our digital capability strategy for us too. And so I'm just going to... I forgot this one was there. Sorry. And that's it.
As your takeaway, we would just ask you to acknowledge all those who've helped us. Thank you very much. There are a few questions on the board; you can actually read them from the tablets as well. I made this mistake too. Ah, right. We'll start with the first one, possibly, if you want: which simple-to-introduce must-haves would you advise academics to begin experimenting with? Any ideas? It depends where you're starting from. I think it's looking to see what you've got in your VLE, what support, what tools, and actually engaging with your staff to see where they feel they are lacking in digital tools and digital capabilities. But also, when you get feedback from students, the students would like more digital engagement. Having been involved in researching this area for over 10 years, I'd say there's very much a push from the students to have more choice and more available in a variety of online environments. One thing as well that's been absolutely key, which the Learning and Teaching Innovation Centre have been spearheading, is the guided learner journey. So, again, this is the whole idea of actually building the learning experience first rather than focusing too much on the tools. But again, the tools are an essential component of the course delivery. And I think it's probably fair to say it's a bit of a mixed economy. We've got some brilliant case studies where it's been delivered really well, and we've got some other cases, very similar to what Tracy was saying this morning, where people have just been using Canvas more as a kind of document repository and transitioning their content that way. So it's a journey that we're all on. The other thing is that Office 365 is seen as a key component. It's something that we now offer free to staff and students, to download onto their own devices. So we're just about to launch a programme of blended learning events.
So we're going to have a mix of face-to-face delivery through the library link-up programme, which we're just launching this semester, and there are also going to be some playlists developed within Lynda that we're going to share with the whole university. Yeah, and I think our philosophy generally has been both face-to-face and online, a blend; it's not one or the other, but working out, as academics, what works for your students. The question there: how did staff respond to the results? Well, as you can understand, we are early adopters, along with some of the enthusiasts, but we are building that enthusiasm within the different schools. Each school has got its Canvas supporters and a network of colleagues who are supporting the academics. And, as we said, there's been a huge development of the support for students, so that people don't feel left behind. Just a final comment. Yes, somebody picks up on something interesting: students see Google as a digital tool, but don't mention the library catalogue, e-books, e-journals, et cetera. There was a separate question on e-books, which were quite well used, though I can't remember the figures. Well, the key thing as well, I think, and that's going to be one of the essential things we're going to deliver through the link-up programme, is not just simply how people find and engage with e-books, but the actual note-taking, again, which follows on from the previous presentation. And these are, you know, key digital challenges, because we don't have standard platforms. So, you know, some publishers will lock you into their platform and won't let you take your notes anywhere else, and then you've got other tools, like Bluefire, Eda, for example, those kinds of things, which will allow you to keep notes from all sorts of different places in one place.
So, knowing the right tool for the right job, and working through all the constraints that we still find ourselves in: we're still transitioning to this new world, but one thing's for sure, doing it together, doing it collegially and collaboratively, is essential if you're going to build that engagement, get people involved, and involve students all along the way. Great. Fantastic message to finish. Thank you very much indeed. I'd like to thank all our speakers. So we have an hour's break now. There is lunch down in the exhibition area, and sessions start at 1.30. I'm John Wilson. I'm the CEO at Agenta. We're a technology company that focuses on education and learning. We build, manage and operate platforms for education and for video collaboration. Externally, we prefer to work with what we feel are ethical industries: obviously education, teaching, learning, healthcare. We feel that we can really contribute to these industries by creating exciting, easy-to-use, secure platforms that people can utilise. What we feel is one of the most important things for Scotland, to boost economic growth, is investing in rural areas. By investing in broadband in these local areas, we can attract more talent, we can attract more companies, and we can drastically improve the delivery of education and learning within these schools, within disparate regions of Scotland.