Okay, hello and welcome to ESMARConf 2023 and to our panel discussion: how do we scale evidence synthesis education and capacity building? This session is being live streamed to YouTube, and automatic subtitles should be available shortly after the event; we'll work hard to get these manually verified as soon as possible. If you have any questions for our panel, you can ask them via the ESHackathon Twitter account by commenting on the tweet about this session. And if you registered for the conference, you can also comment and chat with other participants on our dedicated Slack channel. We'll endeavour to answer all questions as soon as possible. We'd also like to take a moment to draw your attention to our code of conduct, available on the ESMARConf website at esmarconf.org. Our panel consists of Wolfgang Viechtbauer from Maastricht University, Sarah Young from Carnegie Mellon, and Marc Lajeunesse from the University of South Florida. My name is Chris Pritchard, I'm from Nottingham Trent University, and I'm going to be moderating the session today. I suppose we can get started with a nice warm-up question to the panel: what barriers have you faced when trying to teach about evidence synthesis methods? Anybody want to take this one? Otherwise, I guess I could share some of my thoughts. Well, one of the difficult things with evidence synthesis, in my opinion, is that it spans so many different methods and techniques, right? So if we teach regression modeling, okay, that's a big topic in itself, but that's still somewhat contained. When it comes to meta-analysis, or evidence synthesis more broadly, there are so many approaches and techniques that we use, and you basically cover an entire stats curriculum once you start delving into these methods. So I think this is one of the big challenges: people do need a somewhat decent understanding of basic statistics, right?
So I mean, you cannot really easily delve into evidence synthesis methods unless you're already somewhat comfortable with regression modeling. And it also touches on multilevel modeling, because typically these days you have much more complicated data sets. You get into dependency issues, and suddenly you are knee-deep in multivariate methods and multilevel models. You open up such a wide diversity of techniques, and I think that's one of the major challenges and barriers, in my opinion at least. Sarah? Yeah, so a lot of my experience comes from teaching evidence synthesis methods to information specialists and librarians, the folks that are supporting researchers in doing this kind of work. We've developed a course that we teach a couple of times a year, and I think our main problem, and this is something we'll be talking about more in this panel, is scaling. There's a huge need and a huge interest in these skills, even just within the library and information specialist community, and we only have so many instructors and so much bandwidth ourselves to teach these courses. So we end up having many, many folks register for our training, but we can only take on about 50 participants per course, and we know there's a huge need out there. So I think one of the biggest challenges we've faced is just: how do we reach more people with a high-quality and rigorous training program? We'll probably address that a bit more in later questions. Marc, did you have anything? Yes, yes. I agree with everything that's been said already. One of the impressions I get after the courses that I teach is that people are very thankful to have a better understanding of the process of how to do a synthesis. And then they actually get to starting a project.
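To give a concrete flavour of the statistical machinery the panel is alluding to, here is a minimal sketch of a DerSimonian-Laird random-effects pooling, one standard approach taught in meta-analysis courses. This is an editor's illustration with made-up effect sizes and variances, not code shown at the panel:

```python
import math

def dersimonian_laird(yi, vi):
    """Pool effect sizes yi with sampling variances vi under a
    random-effects model, using the DerSimonian-Laird tau^2 estimator."""
    k = len(yi)
    wi = [1.0 / v for v in vi]                            # fixed-effect weights
    sw = sum(wi)
    ybar = sum(w * y for w, y in zip(wi, yi)) / sw        # fixed-effect mean
    q = sum(w * (y - ybar) ** 2 for w, y in zip(wi, yi))  # heterogeneity Q
    c = sw - sum(w ** 2 for w in wi) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                    # between-study variance
    wre = [1.0 / (v + tau2) for v in vi]                  # random-effects weights
    est = sum(w * y for w, y in zip(wre, yi)) / sum(wre)  # pooled estimate
    se = math.sqrt(1.0 / sum(wre))                        # its standard error
    return est, se, tau2

# three hypothetical standardized mean differences with their variances
est, se, tau2 = dersimonian_laird([0.0, 0.4, 0.8], [0.04, 0.04, 0.04])
```

Even this "simple" estimator already presupposes effect sizes, sampling variances, and heterogeneity concepts, which is exactly the curriculum problem being described.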
You know, the reading and extracting is the biggest challenge, and that's not something we can really train for, because every study is very idiosyncratic in what it reports. So you're left having to solve a problem with every single study to extract what you need. Like Wolfgang said, you really have to be a historian of statistics, because you're reading old papers that use antiquated methods alongside recent papers that use fancy mixed models. How do you pull effect sizes from a mixed model, or from someone who uses a bootstrapped ANOVA, or something like that? Those are what I always feel are epically giant gaps in training: these minutiae of information that you can only really accrue by doing it and getting experience reading all this stuff. And actually, if I may follow this up: one of the biggest challenges, I totally agree, is the extraction of the effect sizes. Often there isn't a perfect, easy solution, so you sometimes have to figure out: is what I'm doing good enough? Is it defensible? Just to give an example: we can take a t-statistic from an independent-samples t-test and convert it into a standardized mean difference. But what if it's not a Student's t-test but a Welch's t-test? Strictly speaking, that conversion formula doesn't apply then; it's no longer correct. So right at that point you have to make a decision. First of all, you need to even be aware of these kinds of intricacies; that's one issue. But even if you are aware of this, you then have to make a decision: what do you do here? Do you just say, well, I cannot include that study? Then you're throwing a whole study away if you cannot reach the authors. Contacting them is always worth an attempt, but that doesn't always work out.
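To make the conversion Wolfgang describes concrete: for a Student's (pooled-variance) independent-samples t-test with group sizes n1 and n2, the standardized mean difference is d = t * sqrt(1/n1 + 1/n2). A minimal sketch with made-up summary statistics (an editor's illustration, not code from the panel); as noted in the discussion, the formula does not strictly hold for a Welch's t-statistic:

```python
import math

def smd_from_t(t, n1, n2):
    """Convert a Student's (pooled-variance) independent-samples t-statistic
    into a standardized mean difference: d = t * sqrt(1/n1 + 1/n2).
    Not strictly valid for Welch's t, which uses unpooled variances."""
    return t * math.sqrt(1.0 / n1 + 1.0 / n2)

# sanity check against a direct computation from summary statistics:
# equal group SDs, so the pooled SD equals 2.0 and d = (m1 - m2) / sd = 0.5
m1, m2, sd, n1, n2 = 1.0, 0.0, 2.0, 20, 20
t = (m1 - m2) / (sd * math.sqrt(1.0 / n1 + 1.0 / n2))  # Student's t
d = smd_from_t(t, n1, n2)
```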
So then you have to make these decisions, and they're incredibly hard; there aren't easy solutions here. It's a combination of experience and having some understanding of the robustness of certain methods, and this is not something you just pick up from a book; this stuff often isn't written down. So that's really a major, major challenge. The cases that we teach are often the straightforward, easy cases where you avoid all these intricacies, because you have to start somewhere. But then people get into a real meta-analysis or evidence synthesis project and suddenly they are faced with real life, which is ten times more difficult than all these nice examples that we dig up for our teaching. Excellent. Does anyone else want to come in on this question, or I can move nicely on to the next one. I was just going to make a comment on Wolfgang's comment, in the sense that it starts getting us to this question of maybe higher-level capacity building, beyond training: thinking about the organizational support and the technical experts that people might have access to when they get to doing their actual project and need something more than a textbook, or more than the training that they had, to turn to. And again, maybe we'll get more to that, but I think thinking about the training and building that sort of access to expertise is also an important piece of this.
And I think we've heard a lot about the challenges in even just training 20 to 30 people at a time, and some of the challenges that you face in doing that. But in order to build capacity we have to look at training entire generations of researchers, so what are your thoughts on how we train lots of people in these methods, particularly when they're not very widely understood or appreciated? I think a lot of people do appreciate the methods, right? And that's probably why they're interested in jumping into it in the first place. But there are layers of training that need to happen before we can make use of their skills. It's really difficult to have someone start a research synthesis project if they haven't done any stats, or they haven't done their own research project yet. I interact with a lot of graduate students whose first official project is a research synthesis project. And there are so many layers that they have to understand, like experimental design: what is considered a high-quality experimental design, or what is considered an observational study? How do you distinguish these things? What kind of evidence can you pull out of these types of studies? So rather than spending time teaching synthesis tools, like pooling and aggregating and all that stuff, you're really spending time catching people up. Which is fine; I'm totally happy doing that. But if you're teaching a cohort with mixed experience, really what you're doing is spending a lot of time teaching the people that are behind. Now we're back to the challenges again. I mean, okay, to phrase this maybe a bit more positively: you have to make sure that people have decent exposure to the fundamentals. Experimental design, as Marc said, that's a real good one. So you have to make sure that people have the basics covered.
And yeah, you cannot treat evidence synthesis as a starting point; it is the culmination of many different techniques that you learn along the way. Once you start to master those, then, if that's what you're interested in, the next step is to consider something like a meta-analysis or research synthesis. But I don't think, as Marc also said, that it should be your first-year PhD project. I've seen that many times, and it's usually not a good idea, in my opinion. I mean, it's often the advisor who says, well, you know, you're reading the papers anyway, so why don't you just do a meta-analysis? And that's often going to end in a lot of tears. Sarah, did you have anything? Yeah, not much to add, just to agree. In my experience supporting researchers at a university doing evidence synthesis, we definitely do get graduate students who want to take something like this on for a dissertation, or maybe just in preparation for a dissertation, and we often advise against trying to incorporate a systematic review into a dissertation, because it in and of itself is a potentially multi-year, multiple-team-member kind of project. So yeah, just to agree with what Marc and Wolfgang were saying. And again, to avoid this sounding too discouraging: on the other hand, if you are willing to take this on, I think you have to take it on with the right expectations. Be prepared that it's not going to be a quick weekend project; that's unrealistic, and probably not half a year either, it might take longer. But if you are willing to take this on, you're building up skills that are extremely valuable.
And at a certain point you just have to dive in, right? I mean, you could say, well, no, this is too difficult, I'm going to push this off, I'm not going to dare to tackle this. But if you are willing to dive in and to work on a meta-analysis, you are going to gain a deeper understanding of the literature, more so than if you were just reading the papers, because you're really reading deep between the lines; so you're definitely going to get that. Plus, you are picking up extremely valuable skills along the way: understanding different designs, how you can evaluate this kind of evidence, how to search the literature, even before we get to the stats. There are so many skills that you will pick up along the way. So you could say, great, couldn't be better, right? It's just going to be tough going, because there's so much to learn along the way. So I want to avoid people being completely discouraged here, because you could also phrase it as: this is the most effective way to gain a very deep understanding of the whole research process. I agree. I've seen many examples of people taking on a research synthesis project and then going back to doing primary studies, and reminding themselves: you know, I'm not going to be the one who doesn't report the sample sizes. I'm going to report all the information people need to make an assessment of the evidence that I'm presenting. So that itself, I think, is a very valuable learning outcome of taking on one of these epic challenges of synthesizing research: you end up knowing how to report things, which may be useful for another synthesis person who comes by and makes use of your study, but it also adds clarity to primary research. Sarah?
Just to agree with those statements, and that gets me thinking too about maybe even starting with building some basic foundations and raising awareness of these methods. Teaching, say, undergraduate students how to read and evaluate a systematic review is, I think, a step toward capacity building and training of that next generation. And Marc, I think you have some great examples of this, teaching undergraduates evidence synthesis methods and all of these skill sets that they can learn and apply to other research projects. I think there have been some successful examples of that. So yeah, lots of different stepping stones in that process. Excellent. I think one of the things that was mentioned was about supervisors suggesting, why don't you just go ahead and do a meta-analysis, and thinking it'll be a relatively easy first-year project. But I guess there's something I've picked up there about whether mentoring from evidence synthesis specialists, for people who are really interested in taking on one of these projects, could be something that's used to build that capacity. I mean, I do that with my graduate students. I don't mind if that's what they want to do as their first project, I'm okay with it, but they have direct access to me. I understand all the challenges that they're going to be confronted with, so I can easily provide solutions, or training modules, or maybe some neat resources that are freely available out there to help them move things along. But other mentors may not have such a detailed understanding of the discipline. And so the training should also be at that level, right? We were talking about training at the undergrad level, but, you know, profs also need to catch up on things. I was just thinking that.
I mean, people are going to be doing evidence synthesis in some form anyway, right? The moment you sit down and read five papers on a particular topic, treatment, whatever, you're automatically already starting to do evidence synthesis in your head. We're doing this constantly. So I think just raising awareness of how this is happening at any point when you're doing research, that you're always doing evidence synthesis, whether formally or informally, is important; that's a separate issue. And I think once people start to realize this, then it's like everybody should scream, please train me more on the more formal methods, because it's the fundamental thing that we actually end up doing in science. You do, of course, your own study, but you always write an introduction where you try to embed your own research in the context of what was done before. Are you doing this informally? Yeah, typically, but you could also think: okay, maybe I should do a little bit more of a systematic search. It doesn't have to turn into a meta-analysis in the end that you publish as a meta-analysis. But at least you are more aware that you shouldn't just be picking these five studies because they support where you're going with your own research; you should really be a bit more neutral, search the literature, and see what really comes up when you do it more comprehensively and not so selectively. So, in a way, once you understand how important it is, you could turn it around and say the entire curriculum should be based on just training people to do evidence synthesis. But maybe I'm a bit too enthusiastic about it, so I don't know. I see that when students are pulling together a thesis proposal. Right, you're trying to figure out how you fit into this research domain.
You indirectly synthesize a lot of information to see how you could squeeze in. Sure, there's heavy emphasis on novelty, but your supervisor might point you in a direction where they've accrued a lot of experience on what is a good experimental design and what's not an adequate experimental design. All these things emerge through reading a ton of stuff and being able to evaluate quality. And then, because you're so familiar with it, your brain automatically synthesizes and provides the aggregate effect. I see it all the time, but again, you're right: the philosophy isn't that we're applying research synthesis in a systematic way; it's more like, I need to pull together an introduction describing this domain. Approaching it systematically is probably the easiest way to actually get to that; at least you have some rules to follow. I suppose, just picking up on that: there's a chance that some people may well know about some of these good systematic approaches and methods. So how do we encourage people to actually use them in their writing and their research? I think people are definitely willing to use them, because you realize the impact that systematic reviews and meta-analyses can have on the field. So I think that is one of the appeals. That maybe sounds like not the right motivation, but okay, in the end you might end up getting an article in a high-impact journal, and that is attractive, right? That is definitely something good for your own academic career, and that's how it is. So that in itself definitely motivates, or encourages, people to learn some of these techniques and then also to attempt an evidence synthesis.
So I think the understanding of the importance of these methods is out there, I would say, because people read meta-analyses all the time, right? It's usually the first thing you try to find when you're getting into a new topic; you're not going to start right away reading the 50 individual studies. You see if there's a meta-analysis first. And being a good consumer of that meta-analysis is maybe even the first step, and then you might get encouraged to attempt something like this yourself. Does anyone else want to come in on that, or shall I go to the next question? Okay, I suppose I'll go to the next question. In terms of recent developments towards building capacity in delivering evidence synthesis, what would you say is the biggest recent development that's impacted you building capacity within your organization, or elsewhere? You mean besides ESMARConf, right? And I don't mean that as a joke; I think this is definitely one of these developments that is generally helping people get more into these methods and build skills. And I think this goes along with just generally more openness in the sciences. We have open science and everything that goes along with that: open materials, open tools. And there's been this explosion of software and R packages, and I think that's the biggest development over the last 10 to 15 years, I would say. Even before that, people were still doing their meta-analyses, often using spreadsheets that they would put together, and, well, that's probably not the best approach. We are seeing just an incredible explosion of tools that we're hearing about as part of this conference, right? I think that is the biggest development, in my opinion: tools, and openness in terms of sharing materials and making things accessible, like doing this conference.
I think, like, the stuff that Wolfgang does with the live streaming, all that stuff, you couldn't get that 10 years ago. There's no way. There was no resource on the internet where you could sign up for something, be a participant, and have direct access to ask questions in a visual way. Maybe you had a dusty handbook that a supervisor gave you, and that was your entire exposure to synthesis. But now, if you want, you can sit in with Wolfgang and he'll teach you some cool stuff. I think that's been an innovation too, because the reach is incredible: anyone in the world can sit in on this. If I teach a lecture at the university, it's closed: five graduate students, four of whom perhaps aren't really interested in research synthesis; they have to take the course because it's part of their degree requirement. But if you seek this stuff out, you can find people to sit in on, on Twitch, and watch and learn things from that. If you can push things to the visual medium, as opposed to text, the possibilities are endless in terms of training. I think we've definitely seen that again with the training that we do for information specialists. That was originally planned to be an in-person, twice-a-year gathering of 20 folks that we could support with our grant funding, and we were able to switch to virtual training, which turned out to be really successful. So we've just done all of those trainings virtually, and we were able to train twice as many people per training as we expected. And I think folks are now just more used to the Zoom environment and to learning in an online environment. And that really does help, especially with the sort of more live training opportunities; that really opens the door for anyone to join from basically anywhere, so that's been really exciting.
And just building online course materials and making those available for asynchronous learning: there are really great examples of that now, and a lot of good evidence that that kind of learning can be really successful. So I think that's been one of the biggest advances of the recent few years. I'm just reflecting on the effort it would have taken to get you three together in a room prior to these developments; that would have been something. So yeah, certainly the reach of these online courses and conferences is phenomenal. I know that last year we had participants from across the world, and it was really, really cool to see. So it's nice that, in your words, it's also making that impact. And I suppose that leads me nicely on to: if you've got somebody who perhaps is in an institution where there isn't much capacity to deliver evidence synthesis, but they want to build that capacity, what would you say to them? So I think a selling point here is open science. I often view meta-analysis actually as part of open science; for some reason, to me these things are completely intertwined. I'm not sure I could formulate exactly why I see it this way, but to me, if you are synthesizing evidence, that is what we are doing, and then you want this process also to be part of the whole open science process. The ideal is, of course, a meta-analysis of registered reports. The moment you get into these methods, you are immediately faced with these concepts coming from open science. That's how I think about this. And I think there is this increased appreciation for the importance of open science, and I see this happening at my own university. So, I don't know exactly how I would package that.
But I think this would maybe be a good way to sell how important evidence synthesis is, because if it's embedded in these open science practices, I think there's immediately more interest in it; there's just generally more appreciation for the importance of open science. So that would be my strategy, I would say. Yeah, open science is one of these indirect high-quality learning outcomes when you take on a research synthesis project. One of the exercises I like to do with my students is: on campus, try to find PDFs, but then also do it off campus, when you don't have institutional access to the library, and see how many PDFs you can recover. In the end, when we come back and discuss these issues, they're like, holy smokes, what do people do when they want to aggregate a whole bunch of studies and they don't have access to them? Well, I don't have an answer to that, but you get the point that it can be incredibly challenging. So, like Wolfgang said, it's all intertwined at every level. And yeah, that should be one of these door openers to get people in: just understanding the importance of having free access, or open access, to information. That's such an interesting observation, Wolfgang. I'm always looking for that intersection between open science and evidence synthesis, as a librarian who works in a library where we have both an evidence synthesis service and a pretty robust open science program. I think libraries actually play a really central role in this sort of ecosystem, and if I think about an institution that's looking to build capacity, maybe they could look toward their library as a place to start: building up expertise in information retrieval and searching methods, and building open access and open science platforms, services, and systems.
I think, you know, the library can be really pivotal and supporting this kind of work in both of those kind of directions. Definitely. This is probably a good time to remind people that they can post any questions you've got on Slack on Twitter or actually comment on the on the YouTube video as well. So if any of you have any questions for our panelists, more than happy to kind of put the questions to the panelists, I guess. So yeah, excellent. And thank you for those comments. I suppose you've talked about this a little bit about how you're able to scale the cohort sizes of the online delivery, but I suppose what techniques have you used to share your knowledge about evidence synthesis with larger groups of people, maybe 50 plus 100 people maybe. Well, I mean it was already mentioned that that's the one thing that I do through these switch streams right so and also just also making more having moved my courses online. Before I would use I used to teach a meta analysis course in Maastricht, and then you only have people coming who are either nearby or we'll have the money to travel and so it just that's so good. And in retrospective so incredibly restricting terms of access and moving moving these courses online. I can, I can lower the cost to do something that is much more affordable for for much broader audience and you immediately see that happening. So I think, yeah, so that's that's what I do. I love teaching through this sort of online format. Of course it's nice when you're sitting together in a room and there's this atmosphere of we are in this together and we are learning. That's great. It's just, but but then if you do if that's what you do it just means you exclude like 95% of people right so it's that's the trade off and I think. Yeah, it's that's too big of a trade off in my opinion. And you really want to maximize your time. Right you're spending a lot of time and educating people. Why not make it open to as many as you can. 
Which I think facilitates the whole thing. I try to do it with my undergraduate students. I have 250 right now, and some of the first things I talk about are research synthesis outcomes, because once in a while some piece of science comes their way and they need a way to interpret and understand: what's the magnitude of this finding? And when I describe to them that there's a whole process, a whole system out there, devised to aggregate and reach a consensus: oh, okay, so there's more to it than just the one study that made a proclamation; it often belongs to a constellation of studies with various outcomes. And there are people out there thinking about that structure, not just the one study. And just to build on that: I think we've already mentioned the move to virtual trainings and the building of online content. I could mention here that the Campbell Collaboration just released a course on systematic review and meta-analysis, which really applies widely to reviews in the social sciences, and it's built on an Open Learning Initiative platform, which was developed here at Carnegie Mellon based on learning science principles. So it's a really robust course; I definitely encourage folks to check it out, and we're looking for your feedback on that course now. So hopefully we'll see more and more of these sorts of things and opportunities over time. But I will say that was very much a team effort over a very, very long period of time, so building this content can sometimes be a pretty significant time and resource investment. But, again, the fact that we're seeing more of this sort of thing available is great. I think one of the potential barriers is also that you are offering content for free, right, and then you have people higher up the chain who sort of question this, like: can we not somehow
monetize this? Right, so: what do you mean, you're providing these materials completely for free to people out there? So I think there is a change of thinking that goes along, again, with the open science movement, but that will still take time: the idea that material should be open and accessible, and not locked behind paywalls. Okay, I have to confess, of course, that some of the courses I teach are not free; I do have a course fee for some of my courses. So you could question that, if you want to be really picky, I don't know. But then again, I do also provide free and open teaching, and I think that compensates a little bit for it. And I actually think about this quite a bit: why not just make it all open and freely accessible? It's something that I think about at times, yes. Yeah, and I think for this Campbell course, those conversations are happening. It's currently free, and I think the reality is that it's not actually free to host this content; someone is paying for something. So we're looking at models where we might offer a certificate-level version of the course for some fee, and then an open and free version of the course that's not attached to a certificate. That's one potential way to address it. I think that brings me relatively nicely on to the question about using MOOCs, the massive open online courses; that seems to be what the Campbell course is all about, but that has its own challenges, doesn't it?
You know, it isn't free to host these things, and Sarah's talked a bit about the challenges around that. But I guess, and we're going around a bit, did anyone else have any thoughts about the use of MOOCs to deliver training on undertaking evidence synthesis? So, I think a MOOC in itself to some extent requires that you provide support somewhere, right? That there's some forum, or even maybe some opportunity for live interaction. A MOOC that is just purely thrown online, without any opportunity for people to ask questions or even just to interact with each other, well, okay, I mean, you are providing it for free, right, but still. I think in an ideal setting there should be some kind of forum, some interaction possibility. And then that requires people; it takes time and commitment, and so, well, where is that coming from, right? Who is supporting that? So then you do get into these questions about whether we should charge for this, even if it's a minimal fee, or maybe only for people who want a certificate at least. So, I mean, it's great of course when people provide things completely for free, but it's not free entirely, even for the person, right? The person who spends their time writing these materials, teaching openly, freely; somewhere, there's a price to be paid. Right, I mean, sometimes my wife asks me what I am doing on these Thursday evenings when I'm sitting there for like seven hours blabbing away. And I have to say she's extremely supportive of it, but she just wonders sometimes why I am doing this, and I don't have a good answer for that. I'd say some of the MOOCs that I've seen, excuse me, I'm losing my voice, take a very passive learning approach. People are just watching, like a fisheye lens on a classroom. And so they're not really applying a lot of best practices in terms of education.
You know, there being active prompts, trying to catch people's attention. Not taking it as far as doing flipped-classroom stuff, or approaches to try to get people to reach learning outcomes in a different way. But what else are you supposed to do if you have a thousand or so participants? You kind of have to adopt these broad dissemination practices. So, yeah, my sense is a lot of these big things, again, are not fully optimized in terms of efficient learning. And maybe that's a potential future avenue: adopting a lot of these best-practice education approaches to enhance what people walk away from the course with, rather than just passive information. I think we can also maybe think about other models; I often think about the Carpentries, for example, where you're basically building a free curriculum that anyone can take and teach in a live, synchronous learning environment. So it's kind of a combination, in the sense that the curriculum is out there and you have this whole world full of potential instructors to sort of take it forward and teach it. That's obviously not quite a MOOC, but sort of a different way of thinking about scaling. It gives that opportunity, doesn't it, to put forward more of those sort of active learning techniques and things like that than you can get from your typical MOOC. So perhaps that may well be a more effective way of delivering training and education in these methods than just having a more passive online platform without that facilitation, or without as much of it. Yeah, I think... oh, all I wanted to say is I think one of the successes with the Carpentries stuff is they're very consistent in the messaging. Everyone who teaches is trained the same way, and everyone is describing the same material.
I think that consistency is helpful if you're trying to train a broad pool of people. So we did have, it's not really a question, but I guess a comment, on YouTube about evidence synthesis being part of the research methodology curriculum in universities. I'm thinking that sort of applies at undergraduate level. And I suppose, I know we've mentioned it a little bit, but just putting that out there as a comment: what do you think about evidence synthesis methods being a key part of research methods curricula at undergrad level? I think this would be great. Because, like we said at the beginning, it spans so much of what people are learning anyway. And we show these evidence pyramids, right, and okay, you can debate this, but it's somewhere at the top there, for better or for worse. And even if you don't think about it in terms of quality of evidence, you can think about it as part of the whole process of what we do in the sciences. So I think it almost should be a requirement that people get some exposure to this, coming back to even these basic skills: how do you search the literature effectively? What databases are out there? This was definitely not stuff that I was taught, no way, actually; it was just not part of the curricula that I followed. We learned how to calculate a t-test by hand, and you can question whether the time wouldn't have been better spent on some of these other things. Excellent. Any other comments on that? Sorry, Sarah, I think you wanted to say something. Excellent. Okay, I think we're coming up on time; we've only got some nine minutes left, so I'll try and wrap up with maybe one or two more questions. One of the questions I was thinking about was, you know, if you do try to build evidence synthesis capacity within your institution.
Not everyone may be using best practices, or may be as up to date as others, so how do you combat poor evidence synthesis practices when trying to build capacity? Do you just have to shame people into it? Yeah, that's tough. I mean, you have to pick people up where they are, right? You have to sort of develop an understanding of what you can throw at people, and, I mean, obviously don't directly confront people; that's just not a good strategy in my opinion. You just have to be patient, that's one thing I would say. Be patient; don't immediately expect people to change their ways completely, and we see the same thing again coming back to open science, right? Don't think people are immediately going to adopt every little piece of the whole open science idea; one step at a time, right? I think that's how you should approach it, yeah. I was just going to mention that some of the responsibility for producing quality evidence syntheses does lie in the peer review process, and that sort of loops back also to open science, or maybe open peer review. Those models might be a way of addressing some of the problems in the peer review of evidence syntheses. I mean, we see a lot of really bad systematic reviews published, and, you know, somewhere along that pipeline there's a problem. Open peer review might allow more people with real methods expertise, in systematic searching or on the statistical side, to get in there and comment on work that's being put out there. Yeah, having teams be open to expert advice early in the phases of a project, rather than late. It's not unusual for me to be presented with a project, and then I see that a lot of best practices were kind of skipped over, and they're like, fix this.
I could have helped you out at the beginning of the project, and we could have avoided all this mumbo jumbo. So maybe we should be training people to be more amenable to collaborations during the devising of the protocol, or the devising of the project, so that they can avoid issues downstream when it comes to pulling together a report that is nice. I feel like over and over again I just get presented something that is incomplete, but a lot of the issues could have been addressed if I was involved in the project earlier. And I'm not going to try to muscle my way into a project, right? But if your colleagues are aware that there's an expert, or someone who's done these types of projects beforehand, then there's easier communication. And, you know, I don't need to be an author on the paper; I'm totally glad to give advice at the beginning, or at the end, or at any point. It's just, yeah, there is a timing to this if you want to do a high quality project, and I'm not a recovery expert; I can't resuscitate things. Usually when I provide ways to improve things, it's after a process of review where the reviewers have asked for something, and I'm glad to help, but yeah, there's a way to ease the tension and pain for everybody if expertise is pulled in at the front end rather than at the back end. I think that's a really powerful point, isn't it? It's much easier to design something properly, in line with best practices, and do it, than to realize that you missed something really important out and try to recover that, because it's just not going to happen, certainly not as effectively as it would have done if it had been identified early. So someone's actually just put a comment on YouTube, and I think it's quite a good one. They said: I think quality could be promoted by making it part of the themes in a conference, and then putting out a call for papers, so that participants in a conference can see what a high quality evidence synthesis report looks like.
That might be some feedback for Esmarcon 2024. So yeah, I think having quality appraisal of the reviews as part of a theme within a conference, so that some really high quality syntheses can be showcased, could be a really positive move going forward: not just looking at the subject of the review but at how it's done, and using that as a core theme. Yeah, but I could also see that turning sour, where you're indirectly disparaging researchers because they weren't able to follow best practices. Right, but maybe that was the shaming part. And there are definitely challenges to following best practices, not least resourcing issues and time scales and availability of people. And I think, whilst we can look at really high quality evidence syntheses that have followed best practices, that doesn't mean that the ones that haven't been able to are necessarily completely worthless, for example. So we're coming up to eight o'clock, and I think we've reached a natural end to the questions. I've got no more questions on YouTube or Slack, but I'm sure if people are watching on catch-up they may well post some questions later, and we'll do our best to answer them as soon as possible. All that's left is for me to say a massive thank you to Sarah, Mark and Wolfgang for giving up their time today; it's been really interesting, and I've been quite lucky to get to moderate this session. So just a massive thank you for giving up your time this evening, and also a thank you to everyone who's watching.