Absolutely. I'll just share my screen. Okay, now hopefully once it comes up you'll be able to see it. There we go. Are you able to see that? Oh yes, yes, yes. Excellent, excellent. Now just so you know, I do have this in full screen mode, and unfortunately, while you can give a thumbs up or put your hands up to indicate that yes, you agree, I can't actually see that. So apologies. I also want to point out that I'm actually doing this through Mentimeter. I've imported my presentation into Mentimeter, so throughout the presentation there might be an opportunity for a little bit of feedback from you, a little bit of interaction. And if at any time you have a question at all, if you do log into Mentimeter, and I'll give you the code shortly, you can actually click on the little icon in the bottom right hand corner and post some questions, and then I'll be able to see them. That just helps during this presentation mode. Right, before I begin I'd like to make an Acknowledgement of Country. I'd like to acknowledge the traditional custodians of the land on which I am presenting today and pay my respects to Elders past, present and emerging. I come to you today from Kaurna land. I'd also respectfully acknowledge the Kaurna, Boandik and Barngarla First Nations peoples and their Elders past, present and emerging, who are the First Nations traditional owners of the lands where all of our campuses are located in South Australia. So shortly I will introduce myself, but I thought I'd first give you an overview of what we're going to talk about today. We're going to talk about what systematic reviews are, what rapid reviews are, and, as I was asked the other day, why a systematic review or a rapid review is better than a literature review and how they differ. So we're going to cover that a little bit today.
I'm going to present you with some of my own research on teaching and learning through the pandemic, and that's going to include looking not just at the schooling level but also at higher education. I'm going to talk to you a little bit about some of the lessons I've learned going through those processes, so that hopefully I can provide some advice to those of you thinking about doing it yourselves going forward. I'll also provide you with some further resources and time for some questions and answers. So I'll just keep my clock there next to me. My name is Dr Melissa Bond. I am currently at the University of South Australia, in Adelaide, but before that I was a high school teacher here in SA for 10 years, did my masters during that time, and then I moved to Germany and was at the University of Oldenburg in the north of Germany, near Hamburg or Bremen if you've been up that way. That's where I did my PhD, with a real focus on student engagement and educational technology, really synthesizing the research that's happened and conducting my own primary research as well. I then moved to UCL, University College London, where I was a support officer for EPPI-Reviewer, which is an evidence synthesis software. So I brought my practical experience, having used EPPI-Reviewer as a researcher in Germany, to UCL to be able to help other people. I'm not currently doing that role, but I hope to do it again in the future in a very reduced capacity. So I've done a number of systematic and mapping reviews. Some of the ones that have been published really do focus on student engagement and digital learning in both higher education and K-12, but also things like artificial intelligence in education. At the moment, although I'm actually doing more than four, I'm focusing on four areas, including language bias in educational technology research synthesis.
So, questions like: where researchers have already conducted reviews, did they just search in English language only databases, or did they also search in languages other than English, and what does that mean if they didn't, and where should we go from here? So that's one of the ones we're looking at. I'm also looking specifically at the role of student engagement in learning analytics, doctoral education and motherhood, and international research collaboration as well. So just quickly, I'm going to open up the voting there. If you have a device and you are so inclined to interact with me, you are more than welcome; I'd like you to go to menti.com and pop in the code that you see at the top of your screen, and I will give you about a minute to let me know what your expectations are, what your hopes are for today's presentation. Maybe you didn't come with any preconceived ideas, and that's absolutely fine, but if there's something specific that you had on your mind that you'd like to contribute, then please, by all means, go to menti.com, pop that code in and tell me. So I am going to give you a minute... actually, I might make that 30 seconds. Oh, I've given you an extra 30, that's not right. Let's start again. There you go. All right, so 30 seconds to tell me if there's something specific you'd like me to focus on. Otherwise, I shall just continue on my merry way. Excellent. Inspiration, and also the trigger to start doing some research yourself. Excellent, fantastic. Ah, that's a very good question: how rapid are they? Excellent. Getting washed up, very nice. All right, fantastic. Excellent. Okay, apologies if the voting closed before you were able to add your thoughts. By all means, do put it in the chat so that I can have a look a little bit later. There are some absolutely fantastic ideas there, and thankfully I am covering most of them in the talk, so you should be pretty happy. So, systematic review methodology.
Now, the importance of systematic reviews, and again this question of why a systematic review is better than a literature review. As we know, when you're looking at a phenomenon or a topic, it's really important to consider the body of evidence rather than just looking at one study in isolation. And when we look at something like the COVID-19 pandemic, we really want to try and gain a really deep understanding of what happened and learn from those lessons. So very quickly, has anyone here already done a systematic review before? I can see from your responses just before that some people are only at the beginning of their journey. So I'll give you 30 seconds to let me know whether or not you've done one before. And if you think you might have done one but you're not really sure whether it was systematic enough, just click unsure, that's fine. Or you can click yes. Interesting. Okay. Most people haven't. Well, in that case: a systematic review, I liken it to producing a recipe for a literature review. But this recipe needs to be so explicit and transparent that someone could take your method of how you have undertaken this review of literature and be able to replicate it step by step. And if they use the same databases and the same search string that you used, they should be able to, if not replicate it exactly or perfectly, then very, very closely replicate what you have done. Literature reviews tend to be more cherry picking. What happens there is, you know, you're conducting a literature review and you find a couple of studies that you think are really good and important for your topic, and they might well be, but they're only providing a very small snapshot of what literature is out there on your particular topic. And so that's putting an inherent bias into what it is that you're writing.
And even though you're synthesizing across those studies, even in a literature review, that synthesis also carries that bias with it. So the idea behind the systematic review is that because it is so transparent and explicit in the way that you've conducted it, and because you're looking much more broadly in terms of where you're finding your studies, you are lessening that bias that can happen in literature reviews. You're also then able to identify, or more easily identify, gaps in the literature, and perhaps some contradictions, consistencies or inconsistencies, and really narrow down on gaining more depth of understanding about a phenomenon. In terms of where systematic reviews sit in terms of scope and time, this is a really nice graphic that I like here. We've got literature reviews sitting at the bottom. Now, as you increase things like the length of time it takes, the cost, whether you're undertaking a critical appraisal, a quality appraisal, of the studies in your review, how broadly you're searching, and the systematicity, I think I've just made up a word, how systematic you're being in your approach, you're looking up this ladder of reviews. So going from a literature review to a scoping review, to rapid reviews, right up to the full systematic review and then to that meta review level, the review of reviews. In terms of how you conduct a systematic review, again, I really like having some visuals to try and understand how it works. So we start off at the top here by identifying an issue, determining your research questions and figuring out, conceptually, how you're going to frame what you're doing. You would then come up with a search strategy, and so this would be a really defined search string with particular terms in it that you would then apply to any of the databases or any places that you're going to be searching for your literature.
You would also come up with very specific inclusion criteria. So these studies, if they match all of these criteria, would definitely be included in the review. But if studies match the exclusion criteria, they're out. We're not even considering them. They're outside of the scope of the review. Once you've done your search for studies and come up with a final list, you then need to conduct what's called screening. At that first level, you would be screening all of the studies that you found just on their title and abstract. So say, for example, you were using a database or a platform like the Web of Science, and you've exported your search results. You've imported them into either reference management software or, in my preferred case, an evidence synthesis software like EPPI-Reviewer, or perhaps Rayyan or one of those other tools. You would have exported the metadata, and then you'll only be looking at the title and abstract and checking against the inclusion criteria. Actually, when I do my reviews, I'm really looking for a reason to kick studies out. I'm looking to see if there's a reason that I can exclude them, because at the end of the day, you don't want to be having to synthesize hundreds of studies; you want a nice, focused and timely review. So once you've found the items you want to include, you would then locate as many of the full texts as you could, using whatever resources you have available to you. Now, I draw the line personally at paying for articles. I will do my absolute best to find an open access version of an article if I don't have institutional access. I must admit, I beg, borrow and steal from my colleagues at different institutions to see if they have access, and if not, I would contact the author by email, and after that, I draw the line. You would then screen all of those studies on full text.
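The exclusion-first screening logic described above can be sketched in a few lines of Python. This is only an illustration of the decision rule, not how tools like EPPI-Reviewer or Rayyan work internally; the records, exclusion codes and keyword rules are all invented for the example.

```python
# Exclusion-first title/abstract screening: try every exclusion rule,
# and only "include" a record if no rule fires.
# The rules and records below are hypothetical illustrations.

EXCLUDE_RULES = {
    "not_k12": lambda r: "higher education" in r["abstract"].lower(),
    "not_empirical": lambda r: "editorial" in r["abstract"].lower(),
}

def screen_title_abstract(record):
    """Return the first exclusion code that fires, or 'include'."""
    for code, rule in EXCLUDE_RULES.items():
        if rule(record):
            return code
    return "include"

records = [
    {"title": "Remote teaching in secondary schools",
     "abstract": "An empirical study of K-12 classrooms."},
    {"title": "University lockdown experiences",
     "abstract": "A survey in higher education."},
]
decisions = [screen_title_abstract(r) for r in records]
```

In practice a human makes each decision, of course; the point of the sketch is the shape of the workflow, where you look for a reason to exclude and only include what survives.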
That is, reading through the entire thing, paying close attention to the method section and the results section to see if it matches your criteria, and then you get your final number of items to extract the data from. You would then do a quality assessment. If this was a full systematic review, you would do a quality assessment, and you might use a tool that's already been developed, for example, and you would assess the quality, say, by looking at the rigour of the methods of that particular study. If the included study was itself a systematic review, for example, you'd be looking at how rigorously they applied their search criteria. You would then synthesize the results of those studies, and then you would disseminate it in some way, shape or form. That's a very quick overview of how that works. Now just quickly, for the one or two people that have done a systematic review before, and for those of you that haven't as well, what do you think are some of the benefits to undertaking systematic reviews? I've got my own opinions. Do you have any ideas on why you think this is a good idea? It might be something that I've already said, or something that you've come across in your own work as to why these are really beneficial. Assurance, that's an interesting word. Oh, there we go. Yep, absolutely. Removing bias, clear process, specific, repeatable, credible. Brilliant, yeah, absolutely. Rigour, thoroughness, depth. All right then, same question, or a similar question. What do you think are some of the challenges that you might come across if you were going to undertake a systematic review? Maybe there's something stopping you personally from undertaking one, and that's why you haven't done one yet. Is there anything in particular standing in your way? Doing it alone, yes. Ooh, now that's interesting, the quality of research.
Now I'm guessing that that person might mean the quality of the research that is being published, and therefore, when you synthesize it, how well can you really... yeah, that's interesting. Hopefully that person crops up at the end and we can have a good chat about that. Ooh, when to stop, yes, absolutely. All right, but time seems to be the overwhelming factor here. Well, in my opinion, yes, the benefits are many and varied. I've already said that you can gain a really deep understanding of a topic. You can identify research gaps, which, if you were at the beginning of a project, for example, or at the beginning of a PhD, or just at the beginning of wanting to implement some kind of intervention, is a fantastic way of gaining an understanding of what's happened before, what's worked, what hasn't, and what you might be able to do or contribute in your future work or practice. You can also gain a really great exposure to a lot of different research and writing styles. To be honest, this is actually the way that I learned to write my own articles: because I was undertaking systematic reviews, I came across so many different ways of writing and presenting data, as well as different journal styles and what they expect. And of course, you also learn some really great search and retrieval skills. However, challenges for me have been understanding the method in the first place. I had never come across systematic reviews before I started my PhD, and that was what I was employed to do. So it was a bit of a learning curve, and I did undertake some training at UCL before we started. So that was a help.
But I think if there had been more out there, especially in education, in educational research... there really isn't a whole lot of guidance, because this is really something that health and medicine and more of the natural sciences have done, and they have a great and grand legacy and history of doing this kind of work. Education is still, I would argue, a little bit in its infancy. Understanding what kind of software to use can also be a little bit tricky, and whether or not you have to pay for that software can be a hurdle, as can just getting really familiar with it so you're able to undertake reviews. Scope and retrieval: now, you already talked about scope when you came up with some of your own challenges. If you are trying to synthesize all of the research that's been undertaken within your topic, oh my goodness, you are setting yourself up for a massive challenge. So it's understandable that maybe you need to refine the scope. Maybe you need to think about, well, perhaps I'll just look at the last five years, or perhaps there's a review that was published three years ago and you'll just look at the years since. And that's absolutely fine. The other thing, of course, is resourcing: time and people. And someone said doing it alone; yes, doing it alone can be quite scary, especially if you've been asked, perhaps as a PhD student or even as a master's student, to undertake a systematic review by yourself and you've never done one before. So it's really about trying to find people who can help you on your journey and start building up those skills and knowledge. But sometimes the best way to learn something is to try. I will just say that. Okay. Now, the other challenge that you can come across is if you're wanting to publish a systematic review, and here we come back to time and resourcing.
Now, there was a study done by Borah and colleagues in 2017, and they looked at 195 different published systematic reviews, examining not only how many articles were included within those reviews at the final stage, but also how long it took for the reviews to get published. Now, if you're not planning on publishing, this shouldn't be an issue for you. You might just be interested in publishing for your own department, your own colleagues, and that's absolutely great. That's brilliant. So in that case, you know, don't listen to this. But if you are looking to publish, they found that it took on average 67 weeks to conduct and publish a review, and this is a full systematic review, not a rapid review. They found that reviews that reported funding took longer, there's a surprise, 42 weeks versus 26 weeks, and also involved more people. Well, they had more funding; they could afford to have more people on their team. So, yeah, very, very interesting. We also published an open access book on conducting systematic reviews in educational research; the reference is in the reference list. And from the reviews that were included in that particular book, we also had a look at how long it took each of these teams to undertake their reviews. And you can see ours was an absolute outlier. This is the one that we did here on student engagement and ed tech in higher education, and it took 24 months, but we were fully funded. We had three research associates working on it. And we started off, and this is where I say you can learn the most from doing, with initially 77 and a half thousand records. Now, don't worry, we did change the scope, and we narrowed it and refined it, and it wasn't that bad in the end. But it did start out fairly crazy. You can also see this one here took between one and four months; there were a couple of different reviews in that particular article.
This one only took 11 months, this one took nine. So it really does vary between the different reviews that get undertaken. Now, moving to the pandemic, we know that there was such an abrupt switch, a shift really, to emergency remote education. And teachers, educators, just did not have the time to try and find their own research, their own evidence. And so we saw that it was really our responsibility as evidence synthesis practitioners to try and help people by conducting evidence synthesis of the primary research being undertaken during the pandemic, in order to help others understand what was effective, what perhaps wasn't quite as effective, and then to help inform policy going forward. Now, here's where the rapid review comes in. Have any of you done a rapid review before? I'll give you 30 seconds to tell me. Because we needed something fast and we needed something quick. Even in those early first six months of the pandemic, when we were all in lockdown, and we were under severe lockdown in the UK, we needed information and we needed it quickly. And doing systematic reviews, as you've seen, takes quite a long time. All right, so no one here has done one. All right. So, a rapid review. It is a form of knowledge synthesis, I'm finding it hard to say that word, but it is a form of knowledge synthesis that accelerates the process of conducting a traditional systematic review through streamlining or omitting specific methods, to produce evidence for stakeholders in a resource efficient manner. So some of the ways that you can do this include the following. You could limit your search to published literature only. For example, you could say, I'm just going to search in the Web of Science, and whatever's in there has been peer reviewed, and that's what we're going to be searching in.
Now, I will just say that there are a number of published systematic reviews that only do that, so it doesn't necessarily have to be a rapid review to do that. You might choose, as I said, to only search in one database. You could use a form of automation, and I'll talk a little bit more about that shortly. You might limit your inclusion criteria to just a couple of years, say three years; in my case, it was around eight months for some of my reviews during the pandemic. You might say, I want to look at English language studies only. You might have just one person conducting the review, which is absolutely fine, although that does bring up some issues with the rigour of the review, especially when it comes to bias, in terms of undertaking that alone. You might choose to conduct multiple steps simultaneously, and this is something that I did do in the review I'm about to talk about, where I was actually screening on full text and extracting data at the same time, because if you've already identified that an article could be included, you're looking at it, you're eyeballing it. Why not do the data extraction at the same time, so that you don't then have to read the full thing twice? And you might choose not to conduct a full quality appraisal, where you're looking at risk of bias and things like that. So there are other ways that you can make it a rapid review, but those are some of the most popular, and I would highly recommend looking at the work of Tricco and colleagues, as well as Garritty and colleagues; their guidance on how to conduct rapid reviews is absolutely brilliant. Okay, so it was the pandemic, and evidence synthesis needed to happen. In this case, I chose to conduct a rapid review of the K to 12 literature that had happened across the first eight months of the pandemic, and I chose to do it alone, as a side thing outside of my own professional work.
So I focused my research not just on who was conducting it and where it was happening; I also looked at the technology that was being used. I looked at student engagement and how different factors of student engagement were being affected and discussed, and what kind of recommendations these primary studies were making that I could then feed back to others, as well as making my own recommendations. The way that I made it a rapid review: I limited the number of databases, I conducted it by myself, and I did limit it to English only, although I did acknowledge that as a bias in the limitations. I was also really lucky that I was able to draw on previous reviews that I'd conducted. So I looked at the search strings that I'd used in other reviews, and I was able to apply those to this one to cut down a little bit on time. This is the search string that I used here. It was a combination of terms to do with emergency remote teaching or the pandemic, and terms to do with K to 12 schooling, and I also had some NOT terms, because even though I was looking purely at the schooling level, the search kept coming up with public health, telehealth, telemedicine and things like that. So I tried to limit the results of my search by including those NOT phrases as well. And whilst I said I limited the number of databases, I did actually search through quite a few; some people just focus on one or two databases. In the review I'd done before, we used eight, and so this is actually a reduction of sorts. Okay, the next step was to come up with exclusion criteria. I was really focused on schooling, on the teaching and learning setting: where it was really about the students and their learning, and how teachers were coping with the pandemic in terms of their actual teaching. So within school structures, English language, empirical studies undertaken within the pandemic.
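The search string structure just described, OR groups of synonyms, combined with AND, plus a NOT group to knock out telehealth results, can be sketched in Python. The terms below are illustrative placeholders, not the actual string used in the review, and exact Boolean syntax varies between databases.

```python
# Building a Boolean search string from three term groups:
# pandemic terms AND schooling terms NOT off-topic terms.
# All terms here are hypothetical examples.

def or_group(terms):
    """Join a list of phrases into a quoted, parenthesised OR group."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

pandemic = ["COVID-19", "emergency remote teaching", "pandemic"]
schooling = ["K-12", "secondary school", "primary school"]
not_terms = ["telehealth", "telemedicine", "public health"]

search_string = (
    or_group(pandemic)
    + " AND " + or_group(schooling)
    + " NOT " + or_group(not_terms)
)
```

Keeping the groups as data like this makes it easy to reuse a search string across reviews, as described above, by swapping out one group while keeping the others.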
Now, I screened 777 records on title and abstract, and this was just me, and I used EPPI-Reviewer to be able to do that. I ended up screening 156 on full text, which sounds like a lot, but when you really know exactly what you're looking for, it's not such an overwhelming thing. Now, within EPPI-Reviewer, this is a screenshot here from the software that I use, and this is what it looks like when you're actually screening an item on title and abstract. Here you would set up all of your exclude codes. So this is where I had: if it's not K to 12, I don't want it; if it's not empirical, I don't want it. That's what I mean when I screen something and I'm trying to exclude things. I don't bother listing my include codes; I just say include on title and abstract. Otherwise, how else can I remove it from the pool? Within EPPI-Reviewer, you're able to do things like click the auto advance button, which means that whenever you make a coding decision in one of these checkboxes, it automatically goes on to the next item. That only saves you seconds at a time, but it all adds up. You can also set it so that it highlights key terms you're looking for, which just makes things more obvious. So here, obviously, I set it up so that online teaching, online learning, phrases like that, would be shown in green, and I could see, oh yeah, this is definitely going to be something I'm looking for. Again, it just saves time. If you use a touch device like an iPad or some kind of tablet, screening is also really, really simple and easy, especially if you've got auto advance set up. I did also choose to make this what's called a living rapid review. Now, in hindsight, and hindsight's a wonderful thing, doing this by myself, it was a little bit ambitious to then hope that I could keep up with the sheer amount and volume of research that was being published during the pandemic.
And so I'm a little bit behind with the livingness of my review, unfortunately. But EPPI-Reviewer, for example, has functions that allow you to bring your review up to date. You can search within something called OpenAlex, which trawls through Google and can find things that match, or very closely match, the items you already have in your review, so that you can find other studies. You can do things like citation checking: it will look at the reference lists of the studies you've already included, and then it can find and import those studies, and it can look at other things that are associated with your topic. You can also create alerts that let you know if newly published studies match either the items in your review or some kind of search you've set up in your review. And that can make things so much easier, so that you're not the one necessarily having to physically go into all the databases and keep searching for things. That's where machine learning can really help cut down on the time factor. So, data extraction: there were 89 studies that I included for data extraction. Now, if you haven't seen one of these before, this is a PRISMA diagram here. I must say, even if you're only conducting a rapid review, and I say only because it's still work, you should always make sure that you include one of these diagrams in your reporting, however and to whomever you're reporting. It just shows people the steps that you've gone through, the numbers of records that you've screened and excluded, and why those things were excluded. I have the reference in the reference list for the latest guidelines on how to do that; that's Page and colleagues. I then extracted my data based on a framework that I'd already come up with in a previous review, which made it easier. This is a view of what it looks like when you're actually doing the data extraction in EPPI-Reviewer.
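The record counts behind a PRISMA-style flow diagram are just a running tally of how many records survive each stage and how many are excluded. A minimal sketch, using the counts from this review (777 screened on title and abstract, 156 on full text, 89 included), with invented stage labels:

```python
# Compute the "excluded at each stage" numbers for a PRISMA-style
# flow diagram from an ordered list of (stage, records_remaining).

def prisma_flow(stages):
    """Return (stage, records_in, records_excluded) for each stage."""
    flow = []
    for (stage, n), (_, n_next) in zip(stages, stages[1:]):
        flow.append((stage, n, n - n_next))
    # The final stage excludes nothing further.
    flow.append((stages[-1][0], stages[-1][1], 0))
    return flow

stages = [
    ("title/abstract screening", 777),
    ("full-text screening", 156),
    ("included for data extraction", 89),
]
flow = prisma_flow(stages)
```

A real PRISMA 2020 diagram also breaks down the reasons for full-text exclusions and where records were identified; this sketch only captures the arithmetic of the funnel.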
So you're able to upload PDFs, like in a reference management software, I suppose. You're able to import PDFs into the record, and you're able to highlight phrases, sentences or paragraphs, and then assign that particular section of text to a specific, predefined code that you've come up with. And it just makes it really easy when the time comes to synthesize all of the information you've gathered from these different studies. You can run a report on just one single code, or you could run a report on the whole lot with lots of different codes in it, and it really allows you to pinpoint where the information is coming from in a quicker way. In terms of synthesis, I created a narrative synthesis which included tabulation and interactive evidence gap maps, as well as computer-assisted content analysis. So I'm going to share with you some selected findings; I'll still keep an eye on the time. I was really impressed that 88% of the studies were available open access, and that's something that's really important to me. I found that most of the research participants came from Europe, Asia and North America, which does follow how the pandemic progressed, especially during the initial stages. In higher education, actually, we found that Europe and Asia were very, very similar in terms of where the research participants came from. The majority of studies focused on experiences at the secondary level, and on teachers and school leaders rather than the students themselves. The majority of studies there focused on the general challenges in teaching and learning, followed by teacher digital competence, digital infrastructure, student learning habits and the school home connection. Now, I want to ask you... I've got a couple of tiny little quiz questions just to, you know, lighten it up and be a bit more interactive.
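The extraction idea above, highlighted passages filed under predefined codes, then a per-code report, can be sketched as a small data structure. This is not how EPPI-Reviewer is implemented; the study IDs, codes and passages are invented for illustration.

```python
# Data extraction as code assignment: each highlighted passage is
# stored under a predefined code, and a "report" gathers every
# passage filed under that code. All entries here are hypothetical.

from collections import defaultdict

extractions = defaultdict(list)  # code -> list of (study_id, passage)

def assign(study_id, passage, code):
    """File a highlighted passage from a study under a code."""
    extractions[code].append((study_id, passage))

assign("Smith2020", "Teachers reported low student attendance.",
       "behavioural engagement")
assign("Lee2021", "Students said live lessons felt more personal.",
       "affective engagement")
assign("Lee2021", "Cameras were often switched off.",
       "behavioural engagement")

def report(code):
    """Return every extracted passage for one code, with its source."""
    return [f"{sid}: {text}" for sid, text in extractions[code]]
```

Running `report("behavioural engagement")` pulls together every passage filed under that code across studies, which is exactly what makes the later synthesis step quicker: the evidence for each theme is already collated and traceable to its source.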
So if you're not on Menti and you want to be involved, you can get on Menti now. Otherwise, I'm going to hit Enter and we'll get going. All right, here we go. If you answer quicker, you get more points. It's just a bit of fun. What do you think was the most used tool type? So, the most used type of technology tools. Do you think they were assessment tools, social media tools, multimodal production tools like videos, synchronous collaboration tools like Zoom, text-based tools like discussion forums, or knowledge organization and sharing tools like learning management systems? Wow. Okay, that's a really big spread. It was actually synchronous collaboration tools that were the most reported in the research. All right, let's see where we're at. Oh, well done, Dr. Centipede. Okay, excellent job. You were the quickest. We actually found synchronous collaboration tools were reported in almost half of the studies, followed closely by knowledge organization and sharing tools like learning management systems, and then text-based tools. But there were over 80 different individual tools mentioned in this research, and you can see on there the screenshot of those different types of tools. The most frequently mentioned individual tool was Zoom, followed by Google Classroom, an LMS, videos made by teachers, and video conferencing software. And I'm not sure if I've got it here, but it's really important: it came up in a number of the reviews I've done that having videos made by teachers or educators, as well as videos from other sources like YouTube or other OER, was found to be more engaging. The balance of the two meant students were getting that individual, personal teacher presence as well as the information from other sources. I also created interactive evidence gap maps for each individual research question. They're available freely, open access, to people. They are filterable and searchable.
People can download references from there. It also links directly to studies, and can assist you if you're doing your own synthesis. When you click on one of those bubbles, it'll bring up a record panel showing you the metadata, and then you'll be able to click through to the study. It's another way of being able to not only synthesise your work and see it really graphically, really visually, but also to share it with other stakeholders and interested parties. Now, I realise time is marching on so quickly, so I'm going to hasten through some of these slides, I'm afraid. But unsurprisingly, and I say that because of other educational technology research I've done, we found a real lack of research in Africa, Oceania, the Middle East and South America, and that was echoed across both K-12 and higher education. We found that more research is needed on the experiences and preferences of students, especially in regards to vulnerable populations, and very little research looking specifically at students with special educational needs and disabilities. There was also scope for more research looking at multimodal production tools, social networking tools and assessment tools, which is really quite surprising given the need to find online assessment tools that could help teachers manage that side of things and provide that idea of just-in-time learning. Same with Google Classroom: it was mentioned a lot as being particularly engaging, but I think there was scope to do a bit more research there. The other thing I want to mention is that through EPPI-Reviewer you're able to create a freely available online web database, which you can either have completely accessible to people or lock down with a password.
It just provides people with different ways of searching through your findings. People are able to create frequency charts, cross-tabulate the information you found, and see a really great overview of publications by year, and so on. So it's another great way of sharing the work that you're doing with others. Now, due to time, I think I might skip forward through this, but the slides are all there. It's such a shame, because if you know me personally, you know I like to talk, so unfortunately I tend to elaborate on a lot of different things. This is a review we did focusing solely on secondary students, and again we found very similar results. We found that STEM subjects are always the most researched; no matter what review I've done, that seems to be the case. It largely focused on student and teacher data, and at least in this case, by this point, more students were being involved. That still begs the question of how these voices are being heard in this research, and in what ways. It also really showed, through all of the reviews we did in the pandemic, how much researchers were undertaking emergency remote research of emergency remote education.
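The frequency charts and cross-tabulations that a web database like that offers can be sketched in a few lines. This is purely illustrative: the study records and field names below are made up, standing in for the metadata such a database would expose.

```python
from collections import Counter

# Hypothetical metadata for four included studies.
studies = [
    {"region": "Europe", "level": "secondary", "year": 2020},
    {"region": "Asia",   "level": "higher ed", "year": 2020},
    {"region": "Europe", "level": "higher ed", "year": 2021},
    {"region": "Africa", "level": "secondary", "year": 2021},
]

def frequency(records, key):
    """Frequency chart: how many studies fall under each value of one field."""
    return Counter(r[key] for r in records)

def cross_tabulate(records, row_key, col_key):
    """Cross-tabulation: a count for every (row, column) pair, e.g. region x level."""
    return Counter((r[row_key], r[col_key]) for r in records)

by_year = frequency(studies, "year")                  # publications by year
table = cross_tabulate(studies, "region", "level")    # region x education level
```

A missing pair simply counts as zero, which is exactly how a gap shows up in an evidence gap map: an empty cell where no studies exist.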
Okay, the key findings were really interesting. We found heightened self-regulation, understanding of topics and enjoyment. There were cases of students who'd had issues with truancy, or just not wanting to be engaged at school, who were actually starting to be engaged because they didn't have to physically be in the classroom, and we found that with parents as well. But social disengagement was by far the highest disengagement indicated, and there was absence in lab lessons, confusion about task requirements, as well as safeguarding concerns, and the issue of people who'd never learnt online having to learn how to learn online, which was also quite overwhelming, as well as people's life loads. Now, same question again, but with a different cohort. Ooh, I've got a question: given that Zoom was the most published, it would be interesting to know how this related to the actual practice and usage of tech during the pandemic. Zoom was the most published item, but was it the most used tool type in secondary schools, out of all of these: assessment tools, social media tools, multimodal production tools, synchronous collaboration, text-based, knowledge organisation and sharing? That was a really good question; I can't give that a thumbs up, but that's actually something we might need to pick up at the end, a really good point that you brought up. It was again synchronous collaboration tools, and like you said, I say "most used" in these published studies: if a tool was mentioned and featured in the study, that means they actually reported that they were using it, but I won't have time to talk today about how they used it. So let's just see. Nice work, Rivet; Rivet was the fastest time, but we still have... oh, and Rivet's in the lead, very nice. Okay, this was just an example of how you might present the results of your research, which I
think is also an important consideration, because you're going to undertake something that will take quite a substantial amount of time. Although I should just mention that the rapid review I talked about first only actually took me a month to do, and that wasn't a month full time; that was a month outside of my normal day-to-day nine-to-five. It then took me another month to actually write it up and get it published, but in terms of doing the rapid review itself, it was reasonably quick, and thankfully I was able to get it out there quite quickly as well. This is something I think is really important: the information is available open access. People can go and have a look at the interactive evidence gap maps that have been created, and at the database of studies. If you were a busy practitioner and you were interested in looking at, for example, how Zoom was used in schools or in higher education, because we've also got the higher education review there as well, you would be able to come to this web database and at least locate the studies that were included in the review, which I think is really valuable. Okay, I am keeping my eye on that time. This was a review that we did in higher education in particular. Unfortunately we haven't yet had time to look at it in terms of engagement, but we did look at the technology side of things, and we also looked at the themes of the topics being researched during that first six months of the pandemic. We called it a mapping review in the end; it was a systematic review, and we did undertake a form of quality appraisal. But we actually did this one with a team of five, and so we were able to manage the 10,000 items that we had to screen on title and abstract between the five of us. In terms of how long it would have
taken, I think it probably took us at least three or four months to actually do, aside from writing it up, so it was quite labour intensive. But we did choose a very broad topic, and there was a lot more research published in higher education than in K-12: we synthesised around 89 and 81 studies in those two other reviews that we did, but in this one there were 282 in higher education. So again we found that STEM was the predominant group of disciplines being researched, and education was quite a bit less. As I said, Asia and Europe were basically very similar in terms of where the participants were coming from. But here in higher education we saw the flip of what we saw in schooling: students were the participants in most cases, as opposed to teachers, which is what I found in the K-12 sector. We did find that student perceptions of online learning was the top area of focus, and we really saw that students' experiences and the impact of the shift to online learning were the predominant focus. Now I'm going to ask you, for one last time, what you think in higher education was the most used tool type. Was it something different, or was it the same?
I actually thought assessment tools might score more highly. They didn't; it was synchronous collaboration tools again. It was Zoom, it was video conferencing, and there was really very little that we found looking at assessment tools, or even social networking tools. I reckon Dr Centipede's got it... let's see. Congratulations, well done, you've taken it out, I'll give you a round of applause, I'm very impressed. So for all three it was the synchronous collaboration tools, and here you can see the interactive evidence gap map that shows you what that looks like visually. Here you can see that for the English language studies there was a lot focused on students, comparatively a lot less on teacher experiences, and even less when it comes to support staff and professional staff. Again, we've got this available to people if they want to have a look at the studies that were included and at some of the results. And I just wanted to mention, before I stop talking and let you ask some questions: before I undertook the latest pandemic reviews that I've just shared with you, I did a little analysis of whether systematic reviews are harder to get published. I found, just amongst the ones that I've done, that on average it did actually take 19 days longer to receive an initial response to a systematic review article I'd submitted, 40 days longer for final acceptance, and the overall process took 66 days longer on average. So don't be discouraged if you're wanting to publish your systematic review, because it may seem like it will take a bit longer, and sometimes the only reason for that is that there are fewer reviewers who understand the methodology that goes into doing these, so it can take a little longer to find reviewers with the expertise to review your work. I do have that in a blog post; again, it's in the references if you want to have
a look. Now, my lessons learned. Seek expert guidance if you can, and if you are working in a team, try to have at least one person on it who has done these before or who can help you. If you're doing a rapid review, I'd suggest keeping the team reasonably small, simply because the more people you have, the more you need to do inter-rater reliability: you need to double code things, double screen things, and make sure you have the same understanding across the different codes you're using. And speaking of time, be aware that rapid reviews don't take as long; you're looking at one to three months depending on your scope. If you're doing a full systematic review, we saw that it can be from four months up to how long is a piece of string, so really make sure you're aware of that. You need to have a good understanding of your research questions, and of the coding scheme, between all of the review team members, and it's a good idea to use some kind of evidence synthesis software like EPPI-Reviewer or Rayyan and try to keep all of your information in the one place. You should also consider language bias, and where possible involve researchers who use other languages and can then search for research in those languages. And you should also consider grey literature: not just your standard library repositories but also, as someone suggested to me the other day, have you perhaps got a Teams channel where you might have shared some literature that you could search through? Could you look on Twitter, or on ResearchGate, places like that? You can also utilise machine learning where appropriate, and, as I already mentioned, make sure you include a PRISMA diagram no matter what kind of review you do. And if you are planning to publish, it's sometimes a good idea to have an outlet in mind as early as possible, so that once you start crafting your report or your article, you're doing it in the
style that they want from the word go, so that you don't then have to go and reformat everything. I do have some other resources; I will share the slides with you so that you can look at them at your leisure, and there are some references there. I am more than happy to answer any other questions, not just now but also in the days and weeks following this. If you want to find out more about anything I've talked about, or if you've got other questions, please don't hesitate to reach out; as you may have noticed, I love a chat, I'm really passionate about evidence synthesis, and I'd be more than happy to help. And just finally, because why not end off with a Menti: if you could maybe share how you found today's webinar, I know it was a bit of a smorgasbord of information, but I'm going to hide the results so that you can write those privately, and let me know how you found it. That would be absolutely brilliant, and I'll open it up to questions and stop talking. I think I've sort of answered that question: there's been some really interesting research published on students' use of sharing their webcam, and also on video conferencing in general, that I could point that person to if you're interested. Okay, I'm going to stop sharing my screen. Gosh, there are so many questions, wow. All right, let's see. Yes, I will provide the PowerPoint. Yes, Emma was right about the synchronous tools. Thank you so much everyone for coming along; absolutely fine if you need to pop off for another meeting, and if you've got any questions, don't hesitate to get in touch. Good question, Gerald: with a rapid review, how do you decide which of the streamlining approaches to take? Goodness, I guess that depends on your resources and on how much time you've got to undertake it. I would definitely recommend limiting the databases for one, limiting the places where you're searching. It is absolutely fine to just search, for example, in Web of Science, because it's
been found to be a really valuable and reliable place for systematic reviews, and so too EBSCOhost and Scopus. There was a study by Gusenbauer and Haddaway, I'm pretty sure, and they found Scopus and EBSCO in particular to be particularly useful, so I would definitely consider that as a first step. I would also consider the number of people involved in terms of streamlining, because, as I said, the more people you have, the more inconsistencies there can be, and the more decisions you have to run past a lot of people. I know we love collaborating, but the process of collaborating can actually hinder how quickly you can undertake this type of work. And if you're going to be narrowing to just one or two of those very English-language-dominated databases or platforms, then you might also look at narrowing down to just English-language studies; again, that is an acceptable method, even in systematic reviews. And make sure you use some evidence synthesis software as well; that can be really helpful, though you do then need to get past the little issues of learning that software to begin with. Okay, hopefully that answers your question, Gerald. Excellent. Okay: you'd like to ask about research questions within a systematic review, since that's the biggest challenge for you in conducting a systematic review, and based on my experience, how do you make a great research question? Oh gosh, that's quite a big question. Now, before I answer: Jim, you had your hand up, do we need to stop? No, no, no reason to stop; I was stepping in because there were questions, but I can see all the questions here, keep going, keep going. Okay. Well, obviously, like with any research, the questions need to be answerable and not too broad; they need to be quite specific. Now, one method that I would
highly recommend is using what's called the PICOTS framework, or any other version of that acronym. It's something that's used especially in the health and sciences area, and it stands for participants, intervention, comparator, outcomes, time, and setting or source, depending on which version of it you look at. This just helps you narrow down your questions: to think about who the people are that you're wanting to focus on. Are you just looking at higher education students, or at higher education science students? Really try to focus your question. Then intervention: are you looking at some kind of specific intervention in particular, for example the flipped learning approach? Comparator: are you looking for studies that compare anything? Now, you don't have to have a comparator, but say, for example, you were asking whether the flipped learning approach is more effective than face-to-face teaching for higher education science students; that would be the first bit. Then, if you're looking at time: in studies published, say, in the last five years. And in terms of setting: in Africa, in Asia, in Australia, if you're wanting to really focus. Now, you don't have to use all of the acronym; you could just use some of it. For outcomes, you could be focusing on something like student engagement, for example. But that is one way of tackling those research questions, to get really specific in what you're wanting to answer. There's lots more we could talk about there, and I'd welcome you to have a chat with me another time about that. There was a question: peer review status is, however, being knocked off databases, and records are having less detail. Yes, absolutely.
It is actually quite tricky sometimes, in terms of conducting the review, with missing metadata, and unfortunately there's not much that can be done other than manually entering some of that information, if you can find the PDF, if you're wanting to include the study in your review. And peer review: just because something appears in Web of Science, or Scopus for that matter, doesn't necessarily mean it's peer reviewed, so that's something you need to think about. But it is actually really controversial as to whether peer review equals rigour: does it equal enough rigour for you to have that as the quality assessment or quality appraisal standard? I think that's a big question. Regarding reducing bias, do you use people outside of your team? Yes. With the rapid review that I conducted, this was one of the things I actually didn't do: I didn't do a proper quality appraisal using a tool, and for that one I did not double screen or double code anything. But for my other reviews we've made sure we've included a librarian or information specialist to help us come up with the search string to begin with. We've also included quantitative, qualitative and mixed methods people in our review team, because we have such different ideas; we come from really different worldviews of what research is and what quality looks like, and my goodness, we've had some arguments between perhaps myself as the qualitative reviewer and my colleague, who is a quant person, about whether or not certain qualitative studies were of a high enough standard. So I think it's really important, if you're going to have a big team, that you do have a mix of people. What do I think of the standard of learning tech research? Okay: there was a lot of research that was
published during the pandemic. As I said, it was very much emergency remote research; that's what I'm calling it. There was research that didn't use any kind of theoretical framework and wasn't informed by theory. A lot of it was practitioner research, trying to understand what was happening within their own classroom, using online surveys made for that particular purpose; it wasn't necessarily using validated surveys or anything like that. So yes, that did need to be taken into consideration, and where that was an issue during our higher education review, we actually decided that unless a study had a proper method section, and unless it specified x, y and z, it would not be included in the review. Overall, there has been a real lack of drawing on theoretical frameworks in educational technology research across the board, and that's not necessarily a new thing; it's something that has been talked about for quite a long time. But that doesn't necessarily mean that, in and of itself, it affects the standard or the quality of the research; it just means that that is one particular area that could certainly be improved. There is also a lot of research that doesn't report where the research was undertaken or who exactly the participants are; they don't fully report on the type of technology that's been used, or if they do, they don't actually explain how the technology was used within the course or within that particular study, for us to even understand the results. So that's a bit of an issue with some of the research that we see, but again, I could talk about that a lot. Any other questions, or did you want to add to that, Jim?
Oh, well, no. I've just been trying to get a paper published recently, and the reviewers are saying, where's your theoretical structure, Jim? And you do see it is a common criticism; I've seen other people talk about this. I know it's outside the scope of this particular thing, but in a funny kind of way, do you think drawing on more literature from the learning world helps with that? Because not many other people are talking about the theoretical structure of this, and then when you do find it, it's perhaps not used in the most useful way. So I think there's another session in that. But I've got to say now that the clouds have lifted; I don't know what everybody else thinks, but suddenly I can see how to structure reviews and undertake them, and you've revealed all the secrets, not that they were secrets. Well, it can be a bit mystifying, if I'm honest, it really can be. And there's so much to unpack, so I realise that this was a very short, sharp, maybe not so sharp but shiny, version of ideas and what you can do and how you can do it. Yes, I mean, I'm on the organising committee for ELESIG, and I just think, wouldn't it be great to get a group of people together to do a review like this, just so that we can learn how to do it, and perhaps even have a mentor like you, Mel? I'd be more than happy to help. Good, because there's a lot of literature out there, and I find it's pretty disorganised, so trying to get a good review done, and to find a good review as well, is hard. I'm sure we could help our world by conducting some; I'll try and pull something together. I think people are popping off; I don't think we got a massive influx from the mix-up over the time. Oh, thank goodness; yes, I was worried. I didn't see anybody else join all of
a sudden at 12 o'clock our time; I don't know whether you noticed anything. But it was just brilliant, because what tends to happen in these sessions is people will talk about the findings and concentrate on the findings, but it was just wonderful to have somebody describe the methodology so clearly, and help us understand how all the technology behind it works. So, brilliant. What tends to happen with these sessions is... oh, let me turn off the recording, by the way, let me turn it off.