Hello, everyone. I am Terri Pigott, and I am delighted to be giving the keynote at this year's conference. As you can see from the slide, I am a professor at Georgia State University in Atlanta, Georgia, with a joint appointment in the School of Public Health and the College of Education and Human Development. The title of my talk today is "Synthesizing Communities: Improving Evidence Synthesis through Collaboration." In the short time we have available, I want to talk a little bit about my career, my 30-plus years in evidence synthesis and meta-analysis, and share a few of the key lessons I've learned, particularly through collaborating with others who also work on evidence synthesis.

So let's start at the beginning. I first got into evidence synthesis as a graduate student at the University of Chicago. This is a picture of one of the ivy-covered buildings of the University of Chicago, where I earned my PhD in a program called Measurement, Evaluation, and Statistical Analysis. When I first arrived, I was of course assigned an advisor, and that advisor was a guy named Larry Hedges. Some of you may know who Larry Hedges is; for those who don't: if you do any work in evidence synthesis and use the standardized mean difference as an effect size, there is an effect size called Hedges' g. That's Larry Hedges. Hedges' g is a small-sample correction for the standardized mean difference.

The first thing I did when I arrived at the University of Chicago was help Larry and his advisor, Ingram Olkin, edit and work through all the examples for their book Statistical Methods for Meta-Analysis, which was published in 1985. So I was there at the beginning, at least for the first book that I know of published about meta-analysis.
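Since Hedges' g comes up here, a quick illustrative sketch of the small-sample correction in Python. This is my own toy function, not code from the talk; it assumes you already have Cohen's d from a two-group comparison:

```python
def hedges_g(d, n1, n2):
    """Hedges' g: small-sample correction to Cohen's d.

    Uses the common approximation to the exact correction factor,
    J = 1 - 3 / (4 * df - 1), where df = n1 + n2 - 2.
    """
    df = n1 + n2 - 2
    j = 1 - 3 / (4 * df - 1)
    return j * d

# Two groups of 10: d = 0.50 shrinks slightly toward zero.
print(round(hedges_g(0.50, 10, 10), 3))  # 0.479
```

With large samples the correction factor approaches 1, so g and d become indistinguishable; the correction only matters for small studies.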
From there, my first publication was a systematic review, a meta-analysis in that case, of group-based mastery learning programs in education. I was working with a researcher named Thomas Guskey, who did a lot of work on professional development for teachers and schools. This first article, from 1988 as you can see here, uses meta-analysis to look at 46 studies on group-based mastery learning. If you're wondering why the screenshot of this paper looks so strange, it's because this was, again, pre-PDF; what you see is a scanned version of the paper, which is how you can access it to this day.

At any rate, since 1988 I've been involved with a number of organizations devoted to evidence synthesis, most notably the Campbell Collaboration, where I've been involved since the early 2000s, probably since about 2005. If you don't know the Campbell Collaboration, it is an international, interdisciplinary organization devoted to the conduct and use of systematic reviews across a wide range of social science areas, such as education, criminal justice, international development, social welfare, and related fields. I've also been involved, and you can see a bit of the logo there, with the Cochrane Collaboration: through my work in Campbell I've met and interacted with many researchers in Cochrane and participated in Campbell meetings and in joint meetings between Campbell and Cochrane. I'm also a member of the Society for Research Synthesis Methods, again an international and interdisciplinary society for people interested in developing new methods for evidence synthesis and meta-analysis. And I currently serve as co-editor-in-chief of the journal Research Synthesis Methods.
All that is to say that one theme threads through all of my work: collaboration, working with others who are trying to improve evidence synthesis as a whole. When I think about my own role in this effort to improve the way we synthesize evidence across a variety of fields, I think of myself as an in-between: a translator of new methods from my more statistical colleagues to the practice of evidence synthesis, and a partner to others who are trying to apply these methods. The beginning of my career was spent, in a sense, as a translator of Larry Hedges' work, applying it as soon as I learned how, both in my teaching and in my own research. Along the way I have come to understand the critical importance of collaboration for moving this field forward.

So let me share some of the key things I've learned in my work in meta-analysis and evidence synthesis. First, I have found that I must interact with colleagues outside of my discipline who are applying systematic review and evidence synthesis techniques in their own contexts. Now, I'm not saying this has been an easy task throughout my career. I often interact with other researchers who are using evidence synthesis, or developing new methods for meta-analysis or systematic review, and we find that we're using different terms for the same issue; sometimes we find out that we are really talking about different issues. What I have come to learn is that many of the challenges we face in applying evidence synthesis, whether to ecology, to environmental issues, to education, or to health, are similar, so many of us are working on solutions to what is actually the same problem. Let me give you some examples. Take, for instance, solutions to organize, streamline, and increase the accuracy of screening.
Those solutions have mostly been developed in health, and many have been developed here at this conference. But social scientists have been able to adapt those solutions rather than create our own; we've been able to borrow across disciplines. At the same time, I often see in my teaching of systematic review and evidence synthesis methods that many colleagues in my own discipline are not familiar with the solutions that have been developed and tested outside the social sciences. So it is important that we understand the solutions being developed for similar problems across our different disciplines.

A second example comes from some work of my own. I distinctly remember, many years ago, sitting in a room with my colleague Jeff Valentine, listening to some Cochrane colleagues talk about a challenge they face called outcome reporting bias. We were listening to researchers discuss the exact paper I have listed here, on outcome reporting bias in industry-sponsored trials of gabapentin. They compared the outcomes that were registered in each trial's protocol to the outcomes that were actually reported in the published study. I leave it to you to read the paper and find out what they found, but suffice it to say that a lot of outcomes that were allegedly gathered according to the protocols were missing from the published papers. As Jeff and I sat at this meeting listening to a problem that was not in our own area, we began to think: I wonder if this occurs in education.
And I wondered how we could study an issue like that, given that we work in a context without a strong history of publishing protocols. At any rate, sitting in a room listening to how people address challenges in their own contexts led us to some really interesting and fruitful research about our own context in education.

A second thing I have learned through my work in evidence synthesis is how important it is to work on a team. Evidence synthesis, as I think everyone listening to this talk knows, is a team sport; no single researcher could ever possibly have all the expertise needed to complete a complex synthesis on their own. Working on a team as the methodologist, the person who is supposed to do the meta-analysis part, has highlighted a number of challenges that need to be addressed if we are to improve evidence synthesis. A couple of examples from some recent work of mine: I recently completed two different systematic reviews with two different teams. The first, with my colleague Julia Littell, is a systematic review published with the Campbell Collaboration looking at the effectiveness of a strategy called multisystemic therapy for youth aged 10 to 17. The second was with an early-career researcher, Priscilla Lui, at Southern Methodist University, estimating the correlation between Big Five personality domains and alcohol use. In both of these cases, we were trying to apply the most up-to-date models for meta-analysis: models that reflect the complex structure of effect size data, in which multiple effect sizes are nested within studies. We were trying to apply multilevel models that take into account the dependence among effect sizes.
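To make that concrete, the kind of multilevel model I mean here, written in my own notation rather than anything from the slides, is the standard three-level meta-analysis model:

```latex
% Effect size i within study j:
T_{ij} = \mu + u_j + v_{ij} + e_{ij},
\qquad u_j \sim N(0, \tau^2),        % between-study heterogeneity
\qquad v_{ij} \sim N(0, \omega^2),   % within-study heterogeneity
\qquad e_{ij} \sim N(0, \sigma^2_{ij}) % known sampling variance
```

The dependence arises because all effect sizes from the same study share the study-level term \(u_j\), so treating them as independent understates uncertainty in the overall mean \(\mu\).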
In both of these contexts, the problem we ran into was that there was one huge trial: each of these systematic reviews included a trial that dwarfed all the other primary studies we had. So a lingering question from both reviews was what the appropriate meta-analysis is when we have this kind of complex data structure. If you think about multilevel models, when you have one huge cluster and then a bunch of small clusters, what is the appropriate model? That has led me to think about the conditions under which these models operate well, and what the optimal conditions for these models are. It also led to another collaboration, with an early-career researcher, Mikkel Vembye, and my colleague James Pustejovsky, thinking about the conditions under which the power of these models is optimal.

Going back to the systematic review on alcohol and Big Five personality domains, a lingering question there, and one that I think many of us who do systematic reviews in the social sciences struggle with, is how to appropriately assess the quality of observational studies. We know very well what issues can bias our conclusions for studies that use experimental or quasi-experimental methods. It is much harder for me to advise people doing systematic reviews of correlations, reviews that really include observational studies. These are just two examples of the questions that have arisen for me while working on these larger systematic review teams, and of how important that interaction has been for me in thinking about new ways of solving common challenges.
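To see why one huge trial is a problem, consider the inverse-variance weights in even a simple common-effect meta-analysis. With made-up numbers (one mega-trial with a very small sampling variance and nine small studies), the mega-trial swamps everything else:

```python
# Illustrative sampling variances: one mega-trial, nine small studies.
variances = [0.001] + [0.04] * 9

# Inverse-variance weights, as in a common-effect (fixed-effect) model.
weights = [1 / v for v in variances]
mega_share = weights[0] / sum(weights)

print(round(mega_share, 2))  # 0.82: one study carries ~82% of the weight
```

The numbers are hypothetical, but the pattern is the one we faced: with one dominant cluster, the pooled result is essentially the result of that single trial, and the behavior of the multilevel variance estimates becomes much harder to trust.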
Finally, over the course of my career I have come to be, or at least aspire to be, an open science practitioner, and to support open science practices in the projects I work on and in my own work. I cannot emphasize enough that, looking back across my career, one key revolutionary idea has been that we now have solutions for meta-analysis in R that are openly and freely available. Back when I started in 1983, not only did we not have PDFs, we also didn't have solutions that could be easily implemented in software. If someone created a new systematic review model or a new method for meta-analysis, we had to wait until someone could implement it in a way that could be applied. With open source in R, we can immediately implement new methods, and, even more importantly, those new methods can build on things we already have available, like the metafor package, where people can now build on what Wolfgang has so generously given us to use.

The same goes for open protocols and open evidence synthesis materials: having materials like coding forms and data analysis scripts available. We have always had at least some of these, the protocols and coding forms and so forth, available in Campbell, but the move toward open data and open data analysis scripts has also been revolutionary and important for the field, because these resources are not just about accountability for our own work; they are resources for others to use to improve their practice, and for me to find code to do the thing I'm not quite sure I understand yet.

So my challenge to you, as I finish up my talk here, is to ask you to please attend a session outside of your discipline.
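As a small illustration of why open implementations matter, here is a toy version of the classic DerSimonian and Laird random-effects estimator, written in plain Python. This is a sketch for exposition only; in practice you would use a maintained open-source package such as metafor rather than rolling your own:

```python
def dersimonian_laird(y, v):
    """DerSimonian-Laird random-effects meta-analysis.

    y: list of effect sizes; v: their known sampling variances.
    Returns (pooled estimate, tau-squared, standard error).
    """
    w = [1 / vi for vi in v]
    sw = sum(w)
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw   # fixed-effect mean
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Q statistic
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(y) - 1)) / c)  # between-study variance
    w_star = [1 / (vi + tau2) for vi in v]   # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se = (1 / sum(w_star)) ** 0.5
    return pooled, tau2, se

# Homogeneous toy data: tau2 is zero and the pooled mean is just 0.2.
pooled, tau2, se = dersimonian_laird([0.2, 0.2, 0.2], [0.04, 0.04, 0.04])
print(round(pooled, 3), round(tau2, 3), round(se, 3))  # 0.2 0.0 0.115
```

A method like this could be written down and shared the day a paper appeared; that immediacy, and the ability to build on shared code instead of reimplementing it, is the revolution I mean.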
Think about how that work from someone outside your discipline could relate to the challenges you face in evidence synthesis, how those interdisciplinary collaborations can help us move the field a little bit forward, and how we can find ways to improve the way we synthesize evidence and the tools we have to help others do the same. Collaborations have been key in my work. They have inspired me, as I've shown you, across many different projects; they have also been fun, and they continue to make me think about all the different ways we can improve what we do.

In that light, I want to thank a few collaborators of my own, new and old. Here is a picture of my current collaborators on my team at Georgia State: on the left is Jay Morris, and on the right is Kamal Middlebrook. The picture on the left shows my longtime collaborators: Josh Polanin on the left, Beth Tipton in the middle, and on the far right, Ryan Williams. To all those I haven't mentioned: I hope you know how enriching our collaborations have been and how much they have helped me think through my own practice and push the field forward. So thank you for listening. You can contact me at tpigot@gsu.edu, and you can find me on Twitter under Terri Pigott. Have a wonderful conference.