Okay, so we're live streaming now. Good afternoon, or good day, to everyone. Welcome to SMICONF 2023. This is the panel discussion on stakeholder engagement and evidence synthesis. If you have any questions for our presenters you can ask them via the ESAQ fund Twitter account by commenting on the tweet about this session. If you registered for the conference you can also comment and chat with other participants on our dedicated Slack channel, and we'll try to answer those questions as soon as possible, and if not live then just after the event. We'd also like you to take some time to have a look at our code of conduct, which is available on the SMICONF website at www.smicomp.org. So let's get this panel started. I'll pass it over to our presenter, Alex.

Brilliant, thank you very much Matt. It's great to be here, and I'm really looking forward to what I think is just going to be an hour of chatting about stakeholder engagement and reviews, which is a great opportunity. The aim of the next hour is for us to discuss and explore how stakeholders have been involved in systematic reviews. As panel members we're not actually familiar with each other's reviews, so I think we're all hoping that we'll learn things from each other as we go along. We're going to try to keep this as informal as we can in terms of discussion, but we do have a set of questions that we'll work through, so we'll really work through the process of how we all thought about involving stakeholders in our reviews. I want to start with my first question, which is: who are you and why are you here? I'll go to Armin with that.

Thanks Alex. Well, I have to admit I'm not an evidence synthesis guy. I work for a science, technology and society unit at Graz University of Technology here in Austria. I have a long history, and quite some track record, in stakeholder engagement in controversial technology topics, but not in evidence synthesis. Still, a couple of years ago, in the course of a European Union funded project, we started to explore the use of evidence synthesis in the field of GMO (genetically modified organism) impact studies, and in that project, which was quite a huge task, we were responsible for the stakeholder engagement and developed a specific approach.

Brilliant, really interesting area, thank you. Steve?

Hey, good to see everybody. My name is Steven Cooke. I'm a professor at Carleton University, which is based in Ottawa, Canada, which also happens to be on the traditional, unceded lands of the Algonquin peoples. I lead the Canadian Centre for Evidence-Based Conservation and happen to be the Secretary for the Collaboration for Environmental Evidence. I'm an applied fish ecologist by training, although working in the environmental evidence space our team tackles projects that extend beyond that. So I'm a subject matter expert as it relates to fishy things, and certainly involved in evidence synthesis, but I think that's one of the things I really like about this: as a member of an evidence synthesis team one gets to work on projects where there's tons of opportunity for learning, and a lot of that learning comes from those interactions with stakeholders and rights holders. I look forward to talking to you about that today.

Great, thank you. Emma?
Hi everybody, I'm Emma France. I'm based at the University of Stirling in Scotland, UK. I'm part of a research unit there called the NMAHP Research Unit, and I'm an associate professor with an interest in children and families' health, particularly around long-term health conditions. I recently led a project which was what we call a qualitative evidence synthesis: pulling together multiple qualitative studies to try to get a handle on what the studies were saying and to come up with novel findings based on those studies. It was around children's chronic pain, and we had extensive stakeholder involvement in that, including children and young people, parents of children with chronic pain, and a wider group involving healthcare professionals, the third sector (like charities) and others with an interest in the topic, like academics. So that's why I'm here today.

Brilliant, thank you. So it's coming back around to me, and we've got a very varied forum, which is great. My background is largely in Cochrane reviews. I've been coordinating editor of Cochrane Stroke for the last few years, very heavily involved in doing stroke reviews, and I have involved stakeholders within my stroke reviews. The next question on our list, then: we're very briefly going to touch on definitions. I think we have to acknowledge there's international variation in how different terms are used when it comes to stakeholder engagement. Today is not the forum for that; I think we could spend the rest of the hour talking about what term we should use. But I thought it would be useful if we all just started by saying what the term stakeholder engagement means to us, just so that we know what we're all talking about. So if I come back to you, Armin?

Yeah, certainly. The term stakeholder, what it includes and what it doesn't include, has been an issue in our activities too. We have a tendency to keep it as broad as possible, but on the other hand it could then include everyone who has a kind of interest; it could be a teacher, for example. So in practice it's typically narrowed down to organised groups, and in our activities I think we typically involve key stakeholder groups. In this case, with GMO impacts, it was typically the biotech industry, the seed industry, civil society organisations, competent authorities, these types of actors.

Thank you. Steve, anything to add there?

Yeah, and Armin just used the word actors, which is a word that in Canada we're tending to lean towards a lot more, because the word stakeholders excludes our rights holders, or diminishes their value. Admittedly the case that I'm talking about today does not involve rights holders, so I'm not going to go down that path, but I'll try to keep my Canadian hat on and insert that perspective whenever relevant. A lot of the work that we do is about informing environmental decisions, and so the key actor group that we're usually interacting with is environmental decision makers, at a variety of levels: from frontline practitioners on the ground, who in my case are wearing rubber boots and are in the field getting dirty, through to those based at the headquarters of various organisations, making high-level policy decisions that guide the frontline practitioners. Part of that environmental decision-making ecosystem does include other groups, like industry representatives or the NGO sector where appropriate, but I think it's all about context, right? The stakeholder group for one systematic review, one topic, one issue, is going to vary quite widely.

Absolutely, yes. Emma?

I think it can be quite hard actually to know what you're talking about sometimes when you're saying you've involved people, or what involvement is versus engagement. There seem to be a lot of definitions out there, and in the UK I think we might have quite specific terminology that might not be used, by the sounds of it, in Canada or where Armin's based either. We took the view that involvement was more about collaboration and having people involved in decision making, whereas engagement might be more telling people what you were doing rather than really getting them involved in the day-to-day process of doing an evidence synthesis or systematic review. And we separated our patient and public involvement, our children, young people and parents or carers, who had a specific group for them, from our wider group of what we called stakeholders: our clinicians, healthcare professionals, charities and so forth.

Yeah, I think my experience is probably quite similar to Emma's. I've come from that field of patient and public involvement, sometimes called PPI, which is very much about having people with lived experience of a health condition contributing in some way to the systematic review. But like Armin and Steve have also said, we do then acknowledge that broader group of stakeholders, which in my case would be health professionals (so clinicians working in the field), and then policy makers, organisations supporting patients and carers, and so on. So although we might use different terms, I think we're probably all using similar models. Picking up on Emma's point about the terms involvement and engagement, what I have very much come to realise is that in the UK, as Emma said, we use the word involvement, which in Canada you would call engagement, and they really mean the same thing; but when we use engagement in the UK it has quite a different meaning. So it just adds a layer of complexity; in today's conversation we might switch between involvement and engagement, but I think it's fair to say we're all talking about the same thing. So if we move on then, in terms of thinking of the reviews that we've done and what we actually did in terms of stakeholder involvement, I think
we've all given some thought and selected one of the evidence syntheses that we've done in the past in which we've involved stakeholders, and we're now going to take it in turns to talk through the different aspects of how we went about that involvement. Many of the questions that we're going to work through come from the ACTIVE framework, a framework I've been involved in developing; it's a very simple framework based around how to describe stakeholder involvement, and I think Matt's going to share the link to it within the chat. So we'll just go back round again, and each say what the evidence synthesis is that we're going to focus on today, and why we decided to have stakeholder involvement in that evidence synthesis. Armin, if I come to you first again?

Yeah, sure. I think ours was a very particular case, because evidence synthesis was for the first time being systematically explored in the area of socioeconomic, environmental and health impacts, three totally different domains of research and evidence. It was a large endeavour, essentially, where we started off developing 68 review questions. And the particular issue about this area is that it is heavily contested; it's extremely controversial, heavily polarized, and there is typically a lack of trust between the various actors. So in our case it was not a single evidence synthesis, and I think my role was not to develop stakeholder engagement for a single systematic review, but rather to develop something that would possibly work for this kind of polarized and controversial issue, which is characterized by that lack of trust. I think that was our role.

Interesting, thank you. Steve?

So the example I'm going to use is based on a paper by Jessica Taylor, published in 2019 in the journal Environmental Evidence; I was one of the co-authors. It was all about the effectiveness of different habitat creation or enhancement methods for substrate-spawning fish (thanks, Sam). So, put simply: if we're going to build a rock pile, do fish use it or not for spawning? I think it's important to know that we were contracted by Fisheries and Oceans Canada, the federal agency in Canada responsible for fish habitat management and protection, so, not surprisingly, a big element of that was being relevant to them. But at the same time there's always this tension about independence, right? We want to be relevant and engage relevant parties, but we also don't want that level of engagement to be such that it actually biases what we do, and that was something we had really spent a lot of time thinking about. As we go a little bit further today, you're going to learn that we actually took advantage of a parallel process embedded within Fisheries and Oceans Canada to help create a pathway for engagement, and also to validate what we did through their routine processes. It's basically the process by which Fisheries and Oceans Canada brings new science, new evidence, into their day-to-day management processes. So instead of there being this third-party systematic review that just sort of floated out there, we took advantage of that process. I'll tell you a bit more about that shortly.

Thank you. Emma?

Thanks, that was really interesting to hear Armin's and Steve's points around the influence that involvement might have on your review, and how you have to be careful to get that balance of different views, and whether it might introduce some bias that's in some way undesirable. I think that isn't something that has been talked about a lot, but it is definitely important, because everybody comes with their own background and experience, and they might have another agenda, if you wanted to call it that.
The review that I was going to focus on today is a project which we're still disseminating but have formally finished, around children's chronic pain, which I mentioned earlier. We called it the CHAMPION project for short, because it has a really long title. It's a meta-ethnography, which is a type of qualitative evidence synthesis, if people are interested in that. We wanted to involve people right from the start, so even when we were putting in for funding we had children and young people with chronic pain, parents and carers, and others involved in commenting on our plans for the project and the questions and objectives we had, in order to make sure it was relevant to the people that were going to be affected, you know, children's health services, health and social care services for chronic pain. And it went right through the whole process, so pretty much every stage of the evidence synthesis had involvement from children, young people and their parents. They're still involved now, while we're disseminating, which can be hard, to keep people going; but we started the actual project in 2020 and we've managed to keep our group with us right up until now, two and a half years later.

Yeah, thank you. The one I've chosen to talk about is maybe a little bit different in that it's a Cochrane review, and obviously the aim of a Cochrane review is that it gets updated every few years. I've chosen to talk about a review on physical rehabilitation after stroke. This review was first published in 1999-2000, with no formal stakeholder involvement, and then when I did an update of it, I think the second or third update, around 2012, I felt, as did others, that this review was maybe not hitting the mark in terms of influencing clinical practice and policy. It was just slightly off; it was sitting on the shelf unread, perhaps, which was maybe to do with the language being used within it and the way the physiotherapy was being described. So in that update we really wanted to engage and involve stroke survivors, carers and physiotherapists, to try to address that issue and make sure the Cochrane review was turned around and was really properly useful and meaningful to the end user.

I'll just move on, as I'm talking about mine, and talk about the next question, which was about who was involved and how did you recruit them. As I've just said, in my example it was stroke survivors, carers and physiotherapists; that was a very obvious selection for this review, which was about the clinical delivery of physiotherapy, quite a complex question. Within the UK, guidance says that a stroke survivor gets 45 minutes a day of physiotherapy, but our question was really about the content of that physiotherapy: what is the physiotherapist doing in that 45 minutes, which could be a whole lot of different stuff? In terms of recruiting people, we decided we wanted a group of about 12 to 16, so that we could manage having face-to-face meetings (this was very much pre-COVID; we were going to meet in a room), and we wanted a group that was 50% stroke survivors and carers and 50% people with clinical experience. To recruit them, we wrote a single-page role description of what the role in that stakeholder group would be, with lots of detail in it; we actually specifically stated in that one-page description the dates of all the meetings we were going to have across the whole 18 months of the project, so we were very upfront about what we were asking people to do. We circulated that role description using existing networks, so it went out via stroke charities, stroke patient support organisations and physiotherapy networks. We asked people who were interested to get back to us and give us a little bit of information about themselves, and then we did purposeful sampling to try to pull together a representative group, the 14 that we ended up with. In terms of getting that representation, we were looking for people with different lengths of time post-stroke and different types of stroke-related disability, or, for the physiotherapists, different settings that they were in, to make sure we had representation of acute hospital care and community care, and different years of experience: we deliberately made sure we had some newly qualified physios along with physios in various senior positions. So we used that and pulled our group together. I suspect that is very different from how Armin went about his, so I'll hand over to Armin to hear what you did.

Well, thank you Alex, it's a nice contrast, because in comparison to the field you have been describing, evidence synthesis had not been used in the field of GMO impacts. Stakeholders didn't even know what evidence synthesis is all about, how it is conducted elsewhere, and what it means to do a systematic review. So we decided to put particular emphasis on introducing them to the concept and the methods by using examples, and we also aimed at broad involvement. We used a kind of snowballing approach, starting off with 500 email contacts and then trying to spread it among really broad stakeholder groups based across Europe, mainly the European Union, which at that time included the UK. It was quite successful, and they were kind of inspired by the novelty of the approach. I remember that in our first face-to-face meeting (it was a multistage process) we gathered 59 people in the room, so there were really many; it was successful, but also difficult to handle such a large group. It helped, though, to introduce the concept. And a tricky thing was, in this kind of polarized environment, to get people to understand what role evidence synthesis could play in a field where every bit of evidence is heavily contested, but perhaps we can talk about that later.

Yeah, as you say, a really nice contrast between yours and mine. Yours sounds like it was very much an open recruitment, and anyone who was interested could come along, whereas in my example we had pre-planned to have quite a small group and tried to make it representative. So, a nice contrast of the two methods. Steve, what did you do?

Yeah, I think it's probably worth noting that for the example I'm providing, it was really on the back end where there was a massive amount of engagement, and that's again because we had this process. This all came about because there were revisions to the Fisheries Act, and this is the act that governs everything done on the landscape: if somebody is going to put in a road, if somebody's going to build a house, water rolls off the landscape and could affect fish, so it's a very powerful piece of legislation. There's a lot of interest among that broader community, from developers, oil and gas, the transportation sector, the environmental consultants, and then of course the regulators that had come up with this legislation. We were doing science to support the rollout of that new legislation, and I alluded before to this internal government process in Fisheries and Oceans Canada. It's called CSAS, the Canadian Science Advisory Secretariat, and it's a process by which they take knowledge and evidence from all sorts of different sources and bring it into that management sphere. Most of the decisions made by managers and practitioners on a day-to-day basis would be derived from what are called science advisory reports, or SARs, that come out of these CSAS processes. So it was fine and dandy that we had a completed systematic review, but on its own it would have been ignored, okay? What we needed to do was essentially go through that internal validation process to establish institutional legitimacy for what we had done, and that involved 70-plus stakeholders. We met in Ottawa, again pre-COVID. Canada is a big country; we needed representatives from all our provinces and territories, because the geography, cultural contexts and fish communities are so varied across the country. We had representatives from a number of those aforementioned industry sectors, and then within DFO we had their science team, practitioners, and some external academics. Essentially it was like an additional layer of peer review, but from a very practical perspective, and at the end of the day we ended up with a bulleted list of take-home messages, and that's what went to practitioners across the country; the message came from them. The systematic review wasn't slapped down on their desks; instead, it was this additional document. So in some ways, when I'm talking about stakeholder engagement here, I'm almost drifting into the knowledge mobilisation and translation space, but for this one, and I think it's unique, which is why I chose it, that's where the process happened. I haven't done that for others, but again it goes back to who the contractor was and the fact that we really needed to embed what we did within DFO processes; without that we wouldn't have had legitimacy, it would have been just an academic exercise.

So it sounds like your stakeholder group was almost predetermined; they were sitting there and ready to go. Another nice, different example: Armin went out and formed his group by recruiting them, I recruited a small group, but you actually had those people sitting waiting to be involved, in many ways. What about your example, Emma?
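To make the mechanics of the purposive sampling Alex described concrete (a 50/50 split between lived-experience members and clinicians, with attributes such as care setting spread across the group), here is a minimal sketch in Python. Everything in it is hypothetical: the applicant names, the attributes, and the greedy spreading rule are illustrative assumptions, not the panel's actual procedure, which was done by hand.

```python
from collections import Counter

# Toy applicant pool; names and attributes are invented for illustration.
applicants = [
    {"name": "S1", "role": "survivor", "setting": "acute"},
    {"name": "S2", "role": "survivor", "setting": "acute"},
    {"name": "C1", "role": "carer",    "setting": "community"},
    {"name": "P1", "role": "physio",   "setting": "acute"},
    {"name": "P2", "role": "physio",   "setting": "community"},
    {"name": "P3", "role": "physio",   "setting": "community"},
]

def pick_spread(pool, n, attr):
    """Greedily pick n people, each time taking someone whose value of
    `attr` is least represented so far, so the attribute gets spread."""
    pool, chosen, counts = list(pool), [], Counter()
    for _ in range(n):
        best = min(pool, key=lambda a: counts[a[attr]])
        pool.remove(best)
        chosen.append(best)
        counts[best[attr]] += 1
    return chosen

def select_panel(applicants, size, attr="setting"):
    """Enforce a 50/50 split between lived-experience members
    (survivors/carers) and clinicians, spreading `attr` in each half."""
    lived = [a for a in applicants if a["role"] in ("survivor", "carer")]
    clinical = [a for a in applicants if a["role"] == "physio"]
    half = size // 2
    return pick_spread(lived, half, attr) + pick_spread(clinical, half, attr)

panel = select_panel(applicants, 4)
```

In practice one would spread several attributes at once (time post-stroke, disability type, years of experience), but the same least-represented-first idea applies.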
Mine's probably more similar to yours, Alex, because I'm working in the health field. We had to take different approaches to recruitment: we had a core group of children and young people with chronic pain and their parents or guardians, but we also went wider than that when we needed to, and did ad hoc recruitment and involvement. We had our initial patient and public involvement at the very start, which I referred to earlier, and then when the project started we looked to recruit children and young people of different sexes or genders and a range of ages. We thought age eight would probably be the youngest that could really contribute in the way we were looking for, because we wanted to involve them in decisions in the project and in analysis; so for our children and young people we were looking for ages eight up to 18, and we ended up going slightly wider than that, because we had some young people who were maybe 19 or 20 but had very recently been using children's services. We wanted to get a range of different health conditions that cause chronic pain, so that we had a good spread of experiences, and people in different parts of the United Kingdom; we focused pretty much on the UK for this because our funder is interested in the relevance for the UK's National Health Service. We wanted to get mums and dads, or guardians who were not necessarily biological parents, but it's challenging, you know; I don't know how others have found it, but trying to get the spread and diversity you want is sometimes difficult. We didn't recruit dads; we had one person who identified as a boy, and the rest were girls with chronic pain; and we didn't get a great range of people from different ethnic backgrounds, most were white, or Caucasian as you might call it. But we did get diversity in other respects: different kinds of chronic pain conditions, living in different circumstances and different parts of the UK. We had our core group of 12 children and young people and eight mums that came with us, and then when we needed a wider view we did things like social media engagement and online surveys, to get views from a more diverse or broader group.

That's interesting, so you had your core group, but then you were going out to other organisations and so on to add to the involvement. It's not quite as broad as Steve's and Armin's approaches, which is really interesting to hear about. I wonder, if we just stick with you, Emma, and go to the next question, as you've just described your group; I think you're beginning to touch on it. The next question was about the mode of involvement or engagement: what did you actually do? You've said you've got this group of children and mums, so how did they help make decisions for your review?

Yeah, well, we knew in advance that we wanted to get them involved in decisions around, for example, which kinds of studies we included, and to help us refine what we call our search strategy: where should we be looking for studies and evidence, are there any experts we need to contact to see if they know of studies in this area? So we wanted their input at the early stages of the design of what we were doing, but we also wanted to get them involved in analysis, which hadn't been done before for a meta-ethnography as far as I know, and which is quite challenging because it's quite technical. So we came up with some creative ways of getting them involved in analysis; for example, we used cartoons to convey findings. We were going to have face-to-face meetings, and we had funds from our funder, the National Institute for Health Research, to hold them; but COVID happened and we couldn't have any, so we had to work online, which is quite different from what we had planned, and come up with creative ways of involving people in something quite dynamic using video conferencing. So we would send people materials in advance, and we developed little cartoons to express either findings directly from studies whose meaning we weren't really sure about, or early findings from our analysis, to find out whether they resonated with the group, whether they could disconfirm any of the findings, and to clarify what things meant, including some terms used in the literature that weren't very well explained. For example, the studies talked a lot about having control over chronic pain, but they didn't really explain what they meant by that, so we had discussions and input from our group about what control meant to them, which was actually quite varied and seemed to differ depending on whether you were a parent or a young person; and the idea of being able to control your chronic pain was seen quite negatively, actually. So it really informed how we were thinking about the analysis. And then they were involved through the further stages, around how we should be disseminating and what's going to appeal to children and young people. We're currently working on an animated cartoon to convey the findings in a format that will appeal to children and young people. So we've got our children and young people deciding the style of animation that appeals to them; they helped us work out which findings we should try to convey in a short animation (if it's three minutes you can't convey everything: what's most important, most helpful, most interesting to people of your age?); and they commented on the little script, the little story, we put together to accompany the animation, to ensure that it's understandable, appealing and engaging, and that it's not going to make children and young people feel bad for having chronic pain.
there's a question that's come in from youtube but i think actually you've done quite a good job of of addressing that the question is um what if some of the stakeholders don't have the technical skills needed to fully engage in the process i think you know what you're saying is that you took it down to their level you've got to take it down to the level of an eight year old and make it and so it sounds like you had a lot of discussion about terms and actually just chat about taking it back to basics absolutely and we had to get rid of all our academic jargon and find language that was meaningful to children young people parents and the feedback they gave us were that we were successful in doing that and that they could understand what we were trying to achieve in what's actually quite a dry you know academic piece of work in some respects and brought it to life so yeah so so if i move on and talk about what we did i think just because of some parallels between your experiences and my number and then we'll move to Armin and and steve and i think addressing that question as well when we formed our group of stroke survivors cares and physiotherapists we specifically said that they did not need to have any technical skills that we were interested in their lived experience of rehabilitation following stroke or delivering rehabilitation to stroke survivors we didn't expect people to know what an evidence synthesis was or what rct was and that we would tell them that so we pre-planned right from the beginning so before we'd even recruited the group that we would have three meetings and each of those meetings had a very free planned aim that we wanted to achieve so the first two meetings happened right in the beginning of the review update when we were sort of tweaking with the protocol so the first one was about working out what the scope of the review was and whether it was right or whether it needed to be changed plus how we would describe and categorize the 
physiotherapy interventions so that meeting we started by very very simple you know what is a trial you know people get randomized to one group or another we used an example of if you wanted to find out whether eating chocolate made you feel better just trying to make it meaningful to the people and then we went on talked about bringing trials together into a systematic review that worked you know these were not people that hadn't at the end of this an in-depth understanding of a systematic review but they knew enough to understand why having a way of describing physiotherapy was relevant to the bit of research that we were doing so and they were in a format where they could ask questions so I think if you'd spoken to any of the stroke survivors of Paris afterwards they probably wouldn't be able to give you a very good definition of a systematic review but they absolutely knew why it was important that we had a way of describing what physiotherapy was and what happened in that 45 minutes of physio you know was it a physiotherapist using a bit of machinery or was it the physiotherapist talking to them coaching them or was it them doing fitness training or was it them somehow having some hands on treatment so the patients involved understood that part of it as I said we had pre-planned aims for each of our meetings and a goal a decision that we were handing over to our stakeholder group so in order to make that decision we used relatively formal consensus decision making techniques so based on the normal group technique well we got the group to discuss around the issue so what should the scope be agree a statement and then do a series of voting so we really had objective data at the end to demonstrate that we had reached consensus on the way we're going to move forward so I'll hand over to Armin now I think to get a very different view on what you actually did in practice with your group of 59 people yeah probably yes so I think in our case well initially we set out 
to do stakeholder involvement across the key steps of developing protocols and conducting systematic reviews or maps, with a number of these activities running in parallel. But given the novelty of the issue to us as a team, the many tasks in parallel, and also the fact that this was a very controversial issue, we did not actually get as far as finalizing the systematic reviews in the course of the project. Rather, our stakeholder engagement focused on developing the review protocols: explaining the method, familiarizing stakeholders with what evidence synthesis is all about and how it can possibly be used, and then developing questions. It was a huge task to make stakeholders understand what type of questions you can and cannot ask in a systematic review, and how broad or how narrow, so it took time. And also, as with GMO impact assessments in general, every single question is contested, including the question of which particular impacts we would look at and which would be neglected, for example. So in that context, our main focus was to develop a long candidate list of review questions, then discuss those review questions, refine them, prioritize them, narrow them down, and then, based on that, write and publish the review protocols. So it was a multi-step process, essentially three main steps: developing the questions, the candidate list; prioritizing them; and composing the review protocols. And we used face-to-face meetings, we used email consultations, written questions and written answers, and we also used online surveys, in particular for prioritizing the review questions. As I mentioned, we started off with 69 review questions and narrowed them down to a total of 14, and still at that time did some reframing and rephrasing of the questions. One particularly
extensive task turned out to be the email consultations, because in the entire process of just developing the questions and the protocols, we received, and responded to in writing, a total of 520 questions. We have several volumes of reports which include the questions and the written responses of the team members, so it was quite a task. Yeah, I've seen lots of parallels with priority setting processes; in the UK we have the James Lind Alliance priority setting process, which is about generating questions for research more broadly, but I really like that idea of doing it knowing that you would do the evidence synthesis at the end, so yeah, a really nice example. So then, Steve, your example is different because you're at the other end of the review process, I think, aren't you? Absolutely, yeah, and I told you that I was going to base this off one review; we actually ended up having to do two, which was unplanned. So we had our sort of gold-standard systematic review, and we really focused on the so-called high-quality evidence, and before we met with our stakeholder group they said, whoa, you know, where's the rest of it? You're telling us this is the extent of the evidence space, and we said, well, no, there's a whole lot more, but it's biased in all sorts of ways, and they said, well, we want to see it. And so we ended up doing a parallel synthesis focused on the low-quality evidence, and then we brought all of that to the room with the associated caveats that come with it. That was a really fascinating process, because it allowed the practitioners, you know, folks that live in this fish habitat world on a day-to-day basis, to learn about the difference between something that is high quality and something that is low quality, and where bias comes from, and from a really practical perspective it allowed them to also see what they could do to help build that evidence base moving forward. So if the folks in the room are
regulators, they have to authorize habitat alterations, and usually there's some monitoring along with that. The monitoring was pretty pathetic, frankly, and so simply by having better standards for monitoring you can actually build that evidence base, and then you have more support for those decisions in the future. So we learned a lot from that process about what stakeholders think of different forms of knowledge and different studies; there was learning for us, there was learning for them, and at the end of the day it allowed us to have that literally side-by-side comparison: here's the conclusion from the so-called gold standard with high-quality evidence, here's the rest, and here's what you get. And you do get different answers, so, you know, what are the biases and limitations with that? So yeah, it was quite illuminating, and it allowed us to make sure that when we were generating that science advisory report, that internal document for DFO, we could, as appropriate, pull some things from that lower-quality evidence base, but doing so with eyes wide open, in a very transparent way. So it was really about the interpretation and implementation side where those things sort of came together; yeah, it's definitely on the back end of the process. On Armin's example, I'm impressed about being able to go from more questions down to fewer; every process I've been involved with, when you bring stakeholders into the room early on, it explodes and there's mission creep in all directions, so well done. So I think, given time, we should skip forward to the question that we had, which was about how much control the stakeholders had, and there's been another question coming in from YouTube that really is the same question. It says: how do we avoid forcing stakeholders who lack the technical skills to just agree to whatever we tell or show them? How do we make them more critical about
whatever we tell them? In other words, how do we avoid authority bias? So I wonder, Steve, it might make sense just to stick with you to start with on that, as we've just been hearing about your project. So at that final stage, did you feel that your stakeholders had real, meaningful control over the decisions that came out? Yeah, absolutely, although I would also suggest that they all had technical skills; they were all active in that fish habitat space, so everybody in that room had some level of professional environmental training. So I think that's a bit different, but they all had a stake: as the regulator, you want to make sure the science supporting the regulation makes sense, and if you're the industry player and you work for a hydropower company and at the end of the day you have to spend millions more, you have a pretty big stake in it. So there was a high level of engagement, because this was part of a policy process rolling out a new piece of legislation. Again, that made it very clear who the most obvious stakeholders were, and if they hadn't been in the room we would have ended up with something that lacked legitimacy, not only within DFO but also externally. So that was the importance there, but again, I think mine is a different case, so I'm keen to hear from others that might deal with, for example, some of the children that Emma deals with, who have certainly had experience but probably no formal training in medicine, for example. Yeah, well, let's go to Emma then. So Emma, what level of control did the children and parents involved have over the decisions that were made in that review you carried out? I think it varied depending on which part of the review we were working on. We didn't have a co-applicant or co-investigator on the study who was a representative of patients or the public, and if we had, then the level of control over the
initial steps would probably have been greater, so that's something I would probably do differently next time. And there were certain things we had to do because that's the accepted process you use for doing an evidence synthesis, so we tried to make it clear which decisions they could influence and make, and which ones were not up for discussion and couldn't really be changed, and explained why. So I don't know, I think it's a really interesting question about avoiding authority bias; you don't always know if you've had authority bias, so it's hard to avoid, isn't it? But having ground rules during meetings, and being quite open and honest about whether we as a team had a preferred option, was important and makes it more transparent. And then, when we're reporting what we've done, also to say, well, you know, we went to our group and we said this is what we were thinking would be a good thing to do, and did they agree or disagree? So if we had a preference we tried to be very transparent about that, and where we didn't have a preference at all, again, we were quite open and transparent about it. And just in the way we interacted with people, we tried to make sure everyone felt comfortable, that they were listened to, that we took on board their comments, and that if we couldn't act on a decision we told them why; but we always fed back what we had done as a result of their participation in meetings, to try and make sure that we have a trail, basically, a trail of how we decided things and what level of input our young people and parents had had. Yes, I think my experience was quite similar to yours, and we maybe formalized some of that sort of control. As I've already said, we pre-decided which decisions we were handing over to our stakeholder group, so there were some clear decisions about the scope of the review. I'm not going to go
into the technical details, but whether we expanded into some of the international approaches to physical rehabilitation or whether we focused it down more specifically. There were clear decisions about what subgroup analyses were relevant, you know, what questions people had about the sort of subgroups that were relevant to this. So we pre-decided that, and we absolutely handed those decisions over to the stakeholders through that process of nominal group technique that I've mentioned: the group discussed around a number of things and came up with a statement; the researchers and the reviewers were involved in the discussion but didn't get a vote, so the decision was carried, or not, on the views of the stroke survivors, carers and physios. So the stakeholders very definitely were in control of those decisions, but, like Emma, there were other parts of the review process where our stakeholder group had no involvement: we were carrying out Cochrane methods, and they weren't involved in our development of our screening process or our assessment of risk of bias; that was very much done without stakeholder involvement. So we had those separate decisions. I have to say, as a researcher it was absolutely terrifying; we had decided we were handing decisions over to them, and I can remember that moment of panic, when my heart rate increased, as I realised that they were going to make a decision that was not what I would have made as a researcher. So I can absolutely say our group had control over some elements, but not all elements. So then over to you, Armin, with yours; I'm sure you had a very vocal group. Oh yes, oh yeah. So here, I mean, I was smiling when I read the comment about how do you avoid authority bias, because we had quite the opposite problem in our case. I mean, similar to what Steve was describing, the stakes were quite high for everyone; at that time there was a rumor that evidence synthesis might become part of the routine pre-market
assessment of genetically modified organisms in one way or another, so there was a lot of interest from all stakeholders to understand what that could possibly mean in practice. That partly explains why there were quite a number of participants involved, but it also explains a strong interest to interact with, and perhaps even influence, what was being discussed and what was being decided. So in that context, for us it was quite clear that the core team had to stay in control of the entire process; it was far too new for our stakeholders, and also for some of our team members, to have voting on all decisions and steps. The only voting process that we had was when narrowing down the review questions from 69 to 14; that was the only real voting. On all other questions we had discussions and tried to balance them. But the tricky thing for us was that this divided world of two camps, pro or contra GM crops or GM organisms, also stretched out into the academic world and into the review teams, so we could not avoid it even there. So we had to deal with it, and the way we decided to deal with it was to require all our team members to carefully consider every single comment made by a stakeholder and to respond to it in writing, clearly arguing whether we would take it on board and, if not, why not. So those kinds of mechanisms helped us to make sure that we kept a kind of balance, but that really was quite a lot of work afterwards. So, we've just got five minutes left, so I wonder if we just do another round, and each of us reflect on the impact of having the stakeholder involvement: you know, what did it change, and was it worth it? So Armin, why don't we stick with you; I think you started to touch on some of
that in the last reply. Well, I think there was a mutual learning process on two different levels. One is that we all learned a lot about evidence synthesis and what it could potentially mean for this field. I cannot say that it really led to a kind of explosion of evidence synthesis activities in that field, but every now and then I see systematic reviews and evidence maps emerging in that field and being used; far from being routine, but still, it's starting, and that was one comment. The other level was the learning about stakeholder interactions and about trust building. I probably didn't mention it, but it was a three-year process: just developing the review questions, prioritizing them, developing the protocols and starting the reviews, no further than this, alone took three years. We had several interactions, and it took that many interactions to get to know each other and to make sure that there was a kind of mutual understanding and trust; that even if stakeholders had very different views, there would be this element of respect, and they would be prepared to listen to each other and to interact with each other in another way than just repeating the general views of the organizations they represent. That took a really long time, but after three years we had to conclude it was worthwhile, because over time it really worked out. Yeah, so I think in a nutshell it's a huge amount of work, but worth it. So Steve, I think we sort of know that you'd probably say that yours was worth it, but any final reflections there on the impact? Yeah, I don't think I'm going to say much more about the impact, but I do want to touch on co-production very briefly. Certainly in the environmental space we're sort of in the era of co-production, making sure people know what that is and
how to do it well, but that's usually about the generation of new knowledge. When it comes to evidence synthesis, I think it's really about co-assessment: collaboratively evaluating what the existing evidence is, what it means, and how to interpret it. So as we think about these terms and different levels of engagement, I think we need to make sure that we're appropriately trained to engage in those activities. With evidence synthesis, I don't consider it generating new knowledge; it is that synthesis, and so I think co-assessment really is a good description for much of what we talked about today. Thank you. Yeah, I agree. Thank you, Emma. And your initial question was impact, Alex; I know we haven't got much time left. Well, yeah: was it all worth it, what did it change? In two words: absolutely, and I would do it all again, and I would do it even better next time. Yeah, and I think my conclusion would be the same: absolutely. It changed the review; the review was very, very different because of the involvement, and it did make it more clinically relevant. But yeah, like everyone's saying, it was a huge amount of work, and there's a huge amount to learn about how we do it well, to have this optimal impact from it. So I look forward; I'm on my next update of my Cochrane review, so doing it again and trying to put some of that learning into practice. So yeah, it's been a fantastic conversation, I've enjoyed it, I feel like I've learned a lot, so thank you. Yeah, it's been great, I've learned loads; it's lovely to meet those of you I've not met before. I think that's it for me to say a final few words: thank you all for your inputs, I've learned loads too, so it's been really, really good. And yeah, that's it for this session; we hope you enjoyed it as much as we did, and thank you again to