Hello everyone, welcome to our workshop on Synthesizing Research with the Scoping Review. My name is Sarah Young; I'm a librarian at Hunt Library at CMU and part of our evidence synthesis service team. My colleague Melanie Ganey is one of our biology liaison librarians, director of our open science program, and also on our evidence synthesis service team. We're happy to be here today to present this workshop to you. Our team of evidence synthesis librarians consists of Melanie and myself, as well as Ryan Splenda and How Young Lan, who are both also liaisons to different departments in the colleges here at CMU. We're happy to help with any questions you might have about evidence synthesis projects, including scoping reviews, which we'll talk about today. Today's session covers, firstly, evidence synthesis broadly and where scoping reviews fit into that suite of methods. We'll talk about question development and developing protocols, conducting searches for scoping reviews, some thoughts about screening your search results, and finally some guidance around reporting and documentation for scoping reviews. Thinking about evidence synthesis broadly, we think of this as an array of methods that can be used to combine information across multiple studies that have investigated the same thing. This helps us come to a better understanding of what we know about something, for example, whether or not something works to achieve a particular outcome in a given population. That is often a question we might ask and answer with methods called meta-analysis or systematic review, which are probably the most common and most well-established forms of evidence synthesis.
But scoping reviews are becoming more and more popular. They can help us answer slightly broader questions than a typical systematic review and give us a sense of what has been done on a topic, what the research looks like around a particular question or research area, and the types and sources of evidence that might exist to inform practice, policymaking, or further research. This would include identifying the different types of evidence available in a given field. You might be interested in how concepts and definitions are operationalized in the literature. We can use scoping reviews to examine how research has been conducted on a particular topic or field, or to identify key characteristics or factors related to a concept. Often scoping reviews are done as a precursor to a systematic review, and we often use them to identify knowledge gaps, where the research still needs more work, but also areas of saturation, where there might be enough research to conduct a systematic review and answer more distinct questions about what works or doesn't work. The general steps in a scoping review start with developing a research question, and a lot of thought is given to the scope of that question. Scoping review questions do tend to be broader than systematic review questions, which are generally very narrow. You can ask a slightly broader question with a scoping review, but you still need to think carefully about your concepts, how you're defining them, and the boundaries of your question: where something would be included or excluded from your review.
Once we have a well-defined research question, we move on to developing a search strategy; this is the strategy you use to find all of the literature relevant to your topic. You then go through a study selection process based on predetermined criteria. Once you've identified all of your included studies, you extract the data from those studies that you're going to use to analyze, summarize, or report results. And sometimes there are steps, at the end, at the beginning, or throughout the process, where you consult with stakeholders who might be impacted by the findings of your review. This schematic puts that into a more visual view: defining the question, searching for and screening studies, extracting data, and then discussing and drawing conclusions. The step here, assess study quality, I have grayed out, because it is a step that would be required if you were doing a systematic review. We're talking about scoping reviews here, not systematic reviews, but I wanted to include it as a way of distinguishing those two very related methodologies. There's a lot of good information out there about scoping review methods and how to conduct scoping reviews, and these are two of the probably more well-known resources. The Joanna Briggs Institute, which I think might just go by JBI now, puts out a really good and thorough scoping review guidance document. Cochrane, a largely medical and health focused organization, also has some good training materials on scoping reviews. These are two good sources if you're thinking about doing this kind of project and really want to learn the best practices and methods. Some other useful references here: this 2005 paper was a good seminal paper on scoping review methods, and it's been updated more recently in 2010.
And then there's this really nice paper that distinguishes the systematic review from the scoping review; it can be a little bit tricky sometimes to figure out which of these methods your research question fits into, and this paper helps you make sense of the differences between the two approaches. So with that overview, let's dive into the beginning portion of conducting a scoping review, which is formulating your research question. We sometimes point to various question frameworks that can be helpful in structuring your research question, which can then inform the following steps of the process. You don't have to use a question framework, and sometimes it's hard to find one that really fits your research area, but it's useful to know that these exist and that they might be useful depending on what you're working on. PICO is probably the most common question framework, very popular in systematic reviews in health and medicine, where you're looking at the effects of a particular intervention, say a drug-related intervention, or maybe a social policy intervention, conducted in a given population. So you're going to define specifically what population you're interested in and what outcomes of interest you want to understand the effects on. Sometimes we think about a comparison as well: in a given study that might be included in your review, what comparison is relevant? Maybe you're interested in how a given intervention compares against the status quo or common practice, or maybe no intervention at all. That's another thing you need to think about and define in advance. There are many other question frameworks out there, and some of these might be more relevant for a particular discipline or topic. CIMO is one that looks at context rather than population.
Again, an intervention, maybe the mechanism for how that intervention works in practice, and again outcomes of interest. There is another framework, SPIDER, that looks at a sample, which could be a population, or could be something that has nothing to do with humans, for example. If you're working in a field that's not really human focused, this might be a good framework to consider. Here we have the phenomenon of interest, the design, the evaluation, or both, and the research type, which might get at the study design, for example. So there are lots of different ways we can think about question frameworks. And just to give you a sense of some examples of scoping review questions, these are two of many, many scoping reviews that have been published. I'm not necessarily saying these are the best scoping reviews available in terms of methods, but they are good examples of the kinds of questions you might ask. One here is asking why people choose teaching as a profession. This paper looked at studies from 2007 to 2016 and at the various factors that come into play when people choose teaching as a career path. Another review looked at machine learning and mental health. It was not looking at which machine learning applications work best to address certain mental health issues, but rather just getting a scope of what applications have been applied in the mental health field. So again, that slightly distinguishes a systematic review question from a scoping review question. As we develop a plan for our scoping review, it's really nice to lay out a protocol, which is a kind of roadmap for your entire project. There are other reasons why we tend to develop a protocol for these projects, the main one being that we are being transparent about the decisions we're making about the scope of our project and what types of studies will be included.
This way, we're not making decisions on the fly and possibly introducing bias into our decisions about which studies get included in our review. So it can be really nice to lay out this protocol, everything from your question formulation to the search strategy you're going to use, even the analysis you plan to do with your studies. This can be made publicly available on a platform such as the Open Science Framework, where you can publish and register your protocol; we have some information about the Open Science Framework at this link, and we're happy to walk you through the process of registering a protocol there. Okay, once you have formulated a well-defined, well-scoped research question for your review, you go on to designing and conducting your searches. We tend to think about searches for reviews as very sensitive, as opposed to precise. In other words, in a scoping review we are hoping to capture as comprehensively as possible all of the research that has been conducted on your topic, as opposed to just a handful of studies that might be relevant. Here is a nice analogy. In a typical literature review, we're not trying to be comprehensive; maybe we are just thinking about the studies we already know exist, or looking at some key authors and what they've studied. Using a fishing analogy, we can think of that as casting a fishing rod into a stream, looking for a few trout, not really trying to capture all of the trout in the stream, but just, with a specific bait, capturing a few to give us a sense of what that literature looks like. On the other hand, a scoping review or systematic review is aiming to be comprehensive, so we might view it as an ocean trawler.
We're putting out a huge net and capturing all of the fish in that particular area, which is going to include all of those trout, but maybe also a lot of other types of fish that we don't want and will have to manually screen out. But we're left with a much bigger, more comprehensive picture of what actually exists on the topic. We take a systematic approach in these reviews when designing these comprehensive, broad, sensitive searches. Oftentimes it's useful to start by looking for related reviews. Number one, you don't want to repeat work that's already been done, so make sure someone hasn't already done a scoping or systematic review on your topic. But also, published systematic reviews and scoping reviews, if they're done well, will have reported their search strategies, so this can be a great starting point for a really good comprehensive search strategy on a topic. That is a good first step in developing a search strategy for a review. Once we've done that, we tend to start with well-structured academic databases; we'll talk a little bit about how to select databases for a review. Then we move on to less structured databases or even free resources like Google, Google Scholar, and maybe certain organizational websites that might contain reports relevant to our review. Then we move into the gray literature, or the step of hand searching, which is really about looking at, say, the tables of contents of relevant journals that might not yet be fully indexed or covered in the databases we're searching. As a final step, we do citation tracing. This is when we take the included studies in our review, look at the references in those studies, and perhaps also look at who has cited those studies, using a tool like Google Scholar, for example.
This is a nice final step to ensure that we're capturing all of the research, even work we might have missed in the comprehensive database searches. Again, we want to search a range of databases. We might choose, first of all, key databases in our subject area. As an example, if you're doing research related to education, we would probably want to search ERIC, which is a key education database. We often also search the big, broad multidisciplinary databases, something like Scopus or Web of Science. These are two really big academic databases that cover all disciplines, and usually we include at least one of them in a scoping review. Then there may be other databases with literature that has some bearing on your question. If you're asking a multidisciplinary question, think about all of the disciplines that come into play, and thus all of the key databases that might be relevant to that search. You also want to think about geographic focus: there may be some regions of the world with specific databases that help us get to research published in that region, if it's relevant to your topic, which may not be well covered in some of the other databases we have through the libraries. Google Scholar is great, but it cannot replace the academic databases you have access to through the library, in part because it lacks some functionality, which makes it really difficult to use for these sorts of searches. It's a great supplementary source, but not useful as a key database in a scoping review. And we certainly encourage you to work with us; we can point you to the right resources and help you choose which databases are appropriate for your review. This slide just covers some of what I mentioned.
Scopus and Web of Science are great multidisciplinary databases to include, and here are some examples of more disciplinary databases, depending on what topic or field you're working in. Again, think about covering the gamut across the different disciplines relevant to your review. And with that, I'm going to stop my screen share and hand it over to Melanie. Okay, hi everyone. Next we're going to talk about the screening phase of your scoping review. The eligibility criteria, which as Sarah mentioned will be predefined, are applied in two phases. In the first phase, you apply the criteria to just the titles and abstracts of the studies you found in your searches. Because titles and abstracts don't have that much information in them, you're really asking, does the study appear to potentially meet the criteria for inclusion? If so, you will include it in the next phase of screening. The next phase is based on the full text of the studies, so at that point you can ask, does the study definitively meet the criteria for inclusion? A couple of things are important to keep in mind as you screen. You want to have two independent screeners look at all search results in both phases of screening, and this should be done in a blinded process so that each screener cannot see the decisions of the other; this is to avoid bias. Conflicts, any time the two reviewers disagree on whether a study should be included, can be resolved by a third reviewer, or, if you don't have a third reviewer, through a conversation between the two reviewers. Typically, you will spend quite a bit of time defining criteria before you start screening, and then you can pilot those criteria against a small set of papers to make sure they're working the way you want before you screen all of the papers.
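To make the dual-screening idea concrete, here is a minimal sketch of how blinded decisions from two screeners can be compared to surface conflicts. The study IDs, decisions, and function name are all made up for illustration; screening platforms do this for you automatically.

```python
# Each screener records include/exclude decisions independently (blinded),
# here represented as a dict keyed by a study ID.

def find_conflicts(screener_a, screener_b):
    """Return study IDs where the two blinded screeners disagree."""
    return sorted(sid for sid in screener_a
                  if sid in screener_b and screener_a[sid] != screener_b[sid])

screener_a = {"s1": "include", "s2": "exclude", "s3": "include"}
screener_b = {"s1": "include", "s2": "include", "s3": "include"}

# Conflicts go to a third reviewer, or are discussed by the two screeners.
conflicts = find_conflicts(screener_a, screener_b)
print(conflicts)  # → ['s2']
```

The important point the sketch captures is that decisions are compared only after both screeners have finished, so neither is influenced by the other.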
Doing this pilot phase is really important for refining your criteria and will save a lot of time in the end. Since screening is a time intensive and manual process, people are increasingly using machine learning approaches to make it more efficient. This is an example of a scoping review that applied machine learning methods, which can be particularly useful when your searches surface very large numbers of studies. This is just an example of what the inclusion criteria might look like for a study. There are often criteria related to the publication date, and whether the study has to be in English or not, which often depends on whether you have people on your team with foreign language expertise. And then there are many different pieces of information about the study included here in the criteria. Similarly, you'll have a list of exclusion criteria, which are often highly related to the inclusion criteria. You'll use these lists of criteria to screen the studies. There are many software platforms available now to facilitate screening, although you can do it in a more manual way with your citation manager and Excel. These platforms really do make the process more efficient, and some of them have machine learning capabilities built in as well. Rayyan is a nice free resource for screening. And this is another paper that talks about how automation and machine learning are now being used for various parts of the scoping review process. Once you do the screening, the next step is citation tracing for additional studies. You'll take your set of included studies after full text screening and look through their reference lists, as well as any studies that have cited those studies, the forward citations, and look through those for relevant studies.
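One round of citation tracing can be sketched as a simple walk over a citation graph. The toy graph and the `trace` helper below are purely illustrative; in practice this data would come from a tool such as Google Scholar or a dedicated citation-tracing service.

```python
# Toy citation graph: which papers each included study cites (backward),
# and which papers cite it (forward). All IDs are made up.
references = {          # study -> its reference list
    "A": ["X", "Y"],
    "B": ["Y", "Z"],
}
cited_by = {            # study -> papers that cite it
    "A": ["P"],
    "B": ["Q", "X"],
}

def trace(included):
    """Collect candidate studies from reference lists and forward citations."""
    candidates = set()
    for study in included:
        candidates.update(references.get(study, []))   # backward tracing
        candidates.update(cited_by.get(study, []))     # forward tracing
    return candidates - set(included)   # only new papers need screening

print(sorted(trace(["A", "B"])))  # → ['P', 'Q', 'X', 'Y', 'Z']
```

Any candidates found this way go back through the same title/abstract and full-text screening as the database results.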
This helps make our projects comprehensive and makes sure that we're including all of the relevant literature. There are tools now that can facilitate citation tracing as well. The step after that is extraction and coding, where you pull out the pieces of information from these papers that you're interested in reporting, and then finally the synthesis and write-up for publication. Scoping reviews have a very standardized reporting process. PRISMA has a checklist of all of the pieces of information that you are supposed to report when you do a scoping review, and there is also what's called the PRISMA flow diagram. This is where you document how many studies were included and excluded at the various stages of the project. It is a very standard form of reporting for scoping reviews and really lends a lot of transparency and reproducibility to the project. This is an example of how you would document the searches. Again, there's a big emphasis on reporting for transparency and reproducibility, so it's important to document the search strings as well. These searches are advanced; they're often multi-line searches. This is an example of one that was conducted in the PsycINFO database on the EBSCO platform. You typically document the database, the platform it's on, the date you searched it, and the number of results for each line of the search. If you look at this search, there are basically three concepts in the question: one related to humor, one related to management, and one related to productivity. These have been separated out into lines of the search, where synonyms are strung together with the OR operator, and then you string together the three concepts with the AND operator to find studies that include all three concepts. Typically you will write a search in one database and then translate it into several other databases to make sure your search is comprehensive.
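The OR-within-concepts, AND-between-concepts structure just described can be sketched programmatically. The synonym lists below are invented stand-ins, not the actual terms from the PsycINFO example on the slide.

```python
# Three concepts, each with a list of synonyms (illustrative terms only).
# "supervisor*" shows truncation, which matches supervisor, supervisors, etc.
concepts = {
    "humor": ["humor", "humour", "joking"],
    "management": ["management", "supervisor*", "leadership"],
    "productivity": ["productivity", "performance", "efficiency"],
}

# OR together the synonyms within each concept line...
lines = ["(" + " OR ".join(terms) + ")" for terms in concepts.values()]

# ...then AND the concept lines together.
query = " AND ".join(lines)
print(query)
# → (humor OR humour OR joking) AND (management OR supervisor* OR leadership)
#   AND (productivity OR performance OR efficiency)
```

Keeping each concept on its own line, as databases like EBSCO allow, makes it easy to see how many results each concept contributes and to swap synonyms in and out during search development.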
It's important to keep in mind that the syntax does vary between databases, so be careful when doing that translation step: look at the documentation for the databases, and if possible have somebody check your work. That's where we come in. We're happy to check your searches for you, give you suggestions on constructing searches, suggest databases for your field, and share tips on how to access full text articles. We're really available to give you guidance on many different parts of these projects. You can book a consultation with any of us on our team and find the services and team info on our webpage, linked here. So thank you, and again, please get in touch if any questions come up in your evidence synthesis projects. We're happy to help.