Great, here we go. Good. Welcome, everyone. My name is James Wilson, from the Research on Research Institute in the UK and the University of Sheffield, from where I'm speaking to you right now. And it's my pleasure and privilege to be moderating this first part of a two-part discussion on the important foundational question, for a meeting like this, of what is metascience. I think the original prompt for including these two sessions in the agenda for this year's meeting was a very interesting debate that ran in the margins of the 2019 metascience meeting, which many of you may have been part of; I was there at Stanford, along with others. And this debate was essentially prompted by the fact that the metascience meeting was happening simultaneously with the STS community's gathering, in New Orleans or, I forget now, elsewhere in the US at that time, for the 4S meeting. And I think there was also a day or two of overlap with the ISSI meeting, the International Society for Scientometrics and Informetrics, of which Cassidy, one of our panelists today, is the president. So there was this forced discussion about what is this thing happening here that says it's new and different in some ways from these other, more established communities that are meeting at the same time, and it prompted an important set of reflections, so we are keen to dig into all of that. And in these discussions we want to really ask: when we use terms like metascience, meta-research, science of science, or research on research, as in my own institute, what are we actually talking about? What's new, what's old, what's borrowed?
How do these categories and terms relate both to what's gone before and to what we're all doing now? And, particularly, being constructive about it: how can we build better shared understandings of the projects and priorities to which we're all directed, and build a stronger shared agenda across these different communities, while, I should say, not forcing everyone into the same narrow category? That's not our objective at all. So we have a fantastic panel; I'm going to shut up in a moment and hand over to them. I'm really pleased we've been joined for part one by a really good range of speakers who, in their breadth and depth of expertise, reflect some of the energy and excitement that we see in these debates and fields right now. We're going to hear first, in a moment, from Cassidy Sugimoto, based now at the Georgia Institute of Technology. Cassidy will be followed by Sarah de Rijcke, the director of CWTS, the Centre for Science and Technology Studies at Leiden University in the Netherlands; then Santo Fortunato, who is director of the Indiana University Network Science Institute. We're then going to hear from Aileen Fyfe, who is a professor of modern history at the University of St Andrews. And finally, last but by no means least, we've got Charlie Ebersole, who's joining us from the University of Virginia; Charlie is a postdoctoral research associate working on better science from a psychology perspective. You can see, even in the lineup, some of the challenge and excitement of the task that we have. We thought we would avoid big upfront talks and try to get a conversation going, initially among the panel, and also, I hope, very quickly including as many of you as possible, so please do make liberal use of the chat and question functions, so we can pick up both on those in the box and also maybe hear directly from some people as we get into the discussion.
So I am going to ask each of the panelists an opening question to get the discussion going, and then we'll take it from there. Cassidy, I'm picking on you first. You wrote a very powerful, punchy review a few months ago in Nature of the book The Science of Science, which came out recently from Dashun Wang and Albert-László Barabási. And you criticized their book, to some extent, for describing the science of science as emerging without engaging sufficiently with its historical or interdisciplinary foundations. And I thought, without going into a forensic analysis of that book in particular, that we could take that critique, which was, as I say, expressed a bit more broadly about the field, certainly around the 2019 meeting, as the starting point to explore, as I say, what are we actually talking about here? What's new? How does it relate to what's gone before? And how do we navigate between areas of novelty and progress while acknowledging the foundations, and the ongoing work in various disciplines and sub-disciplines, on which these build? Is there a middle path between hype, Columbusing and erasure on the one side, and, on the other, seeing nothing new at all under the sun? So, rather a lot to discuss there, Cassidy; over to you.

Thanks, James. Yes, I could certainly do a week's lecture on that, but I'll try to keep it to the three to four minutes that you gave me. It's a fabulous question, right: what's new here, and how does it relate to what has come before? So I'll start with my own conception of the metasciences; I think that's important. The metasciences, at their heart, have as their primary object of study science itself, and I mean that writ large, in the Latin sense of the sciences: all of these knowledge-bearing kinds of activities.
The fields are delineated by their epistemological and disciplinary classifications, whether it's philosophy, history, sociology, economics, psychology, or science of science, which uses scientific techniques to understand the mechanisms of science. But the simplicity of these descriptions, I think, masks the complexity of nuanced political and ideological wars that have been waged in defense of certain approaches. These descriptions somewhat evoke what Andrew Abbott might call the fractal dichotomies of the metasciences: the positivist versus the constructivist, the quantitative versus the qualitative, the theoretical versus the empirical, which we spend a lot of time debating. It's been argued, for example, that there's been relatively little interaction between scientometricians and STS scholars since the late 1980s, between what people commonly refer to as the quantitative and qualitative areas of the metasciences, though I abhor that dichotomy. However, like any good narrative, a protagonist is in need of an antagonist. These areas didn't develop in parallel or in complete isolation, but in close interplay with one another. The citation debate, for example, could only occur when these fields were in dialogue: to juxtapose constructivist and positivist narratives, the fields had to read, engage and debate with each other's work. So one might argue that the metasciences are, in the global context of science, more similar than they are dissimilar; but it's in the controversies that they actually build knowledge, and this is nothing less than the cumulative model of scientific development. Another danger, as you referenced, is to either intentionally or unintentionally ignore the work of these other fields, what you call erasure or Columbusing, but what has been referred to lately as epistemic trespassing.
And so, to your point on the review, there have been several proclamations, not just in that book but in several other venues, of the emergence of a new field of science of science, or the metasciences. And these calls try to bring attention to what they consider a nascent field, which negates decades of research in the area. And this isn't just a political stance, but one in which that erasure causes us to advance less quickly. Right, it takes us back years; it brings false assumptions into the science studies in ways that are very damaging. So we know that Derek de Solla Price used the phrase more than 60 years ago: science of science has been around for a long time. And there have been different justifications for the emergence argument. Perhaps it's the introduction of new data sets; however, large-scale data sets have been around for some time. People said, oh, well, it's the entrance of new individuals, like those from physics; but Derek de Solla Price himself was a physicist, so that's not a very compelling argument for novelty. So where is metascience new? Well, Morris coined the term in the 1930s, calling it an inquiry into the methodology and philosophical implications of scientific investigations: also not new. Now some people will say the metasciences are really focused on universal laws, and that's the new contribution. But you can take a number of examples, and I'll pull from Bourdieu: in 1968 he called for the unity of the metasciences, implying that we might be able to find agreement, across scholars from all these different epistemological frameworks, on the founding principles of what science is. And in this way Bourdieu is not any different from Price, each seeking universalism, universal laws or theories to understand science, albeit from wildly different premises. So I'll close with this: is there nothing new under the sun? Perhaps not. Maybe that's not a problem, right, but I'm hopeful that the biggest contribution of the contemporary era will be one of coordination.
And I want to be clear, and I think you implied this in your opening statement, James, that coordination doesn't mean that we are in complete accord with one another. I think discord is an essential, fundamental element in science, and I want us to be in open debate with one another, and that requires dialogue. And I think that's one of the things that's missing from some of the newer entrants to the field; the one-directional conversation around the review referenced above is a stark example of that. So one might question whether we can coordinate across these distinctive epistemological frameworks; one might argue that constructivist work is in direct contradiction to positivist work, and one cannot coordinate across those. But I think it's precisely that opposition which strengthens fields: having to defend an approach against the various alternatives, to explore other ways of knowing. In this disagreement, we get closer to true understanding. And I find it not impossible to imagine a future in which scholars can move seamlessly across all the metasciences, the sociological, the historical, the economic, the empirical; have broad understanding and deep expertise; build teams that are representative and therefore more robust; and interrogate current theories, develop new ones, triangulate across methods, and diversify our data sources. And this must be done in open and constructive dialogue with one another. Today's panel, I think, is a perfect example of that, so I'll leave that as my opening statement and turn it over to my colleagues for their comments.

Brilliant. Thank you, Cass. That's a fantastic start. It raises about a dozen questions that I'd like to ask; I'm going to resist that temptation, but just to pick up on one thing.
I mean, I said we weren't going to focus on that review, but, as you say, you've reminded us that constructive debate and disagreement is a good thing, and indeed is one of the foundational things on which we build in any field. When you wrote that review you must obviously have anticipated that it would raise a few eyebrows and spark lively debate. Did it have the kind of result that you thought or expected it would?

No, and I think the most disappointing thing is that what I heard back was overwhelming support: people writing "thank you for articulating what I wanted to articulate, that is my perspective too," hundreds of people tweeting their support of the article, but not a single response back from the authors, their reaction to it, their defense of their argument. And what I wanted to do was not to produce a punitive diatribe but to open a conversation with that field, and that conversation never happened, not because I wasn't present for it, but because they weren't.

Well, we're going to pick up on some of that and try to have bits of that conversation; I should say Dashun Wang, one of the authors of that book, is on the part-two panel, so perhaps the conversation can happen indirectly through that. Great, thank you, Cassidy, you've given us a very rich starter to the meal ahead, and we're going to move now to Sarah de Rijcke from Leiden. Sarah is a collaborator of mine, so it's good to see her and debate these things with her; we've certainly discussed these things at length over a number of years. But Sarah, with colleagues at CWTS and others at RoRI, you've been working to map, navigate and make sense of the metascientific landscape. And I just wonder if you could give us a few headlines, as it were, from that kind of analysis. What can we say in 2021 about who's involved in metascientific work, where they are working, and what they are working on? How is that picture changing? And, I suppose, digging a bit deeper into that landscape map:
Do you have a sense of what metascience, if it is indeed one thing, is trying to achieve, and who it is for? Who does this project serve? Sarah?

Would you allow me to briefly respond to Cassidy first, and then to your question? Yes? Yeah. I really admire that you wrote that review, and I think I also agree with some of the points you just made; I just want to qualify one thing, coming from what I interpret as some of the discomfort in parts of the STS community. That discomfort starts from a different starting point, I think, because that part of the community is also really skeptical of the universal claims that you now adopt, and uncomfortable, I guess, with, more negatively put, more technocratic and scientistic inclinations. So when, indeed, James and I were talking about these issues, we were really thinking about what is shared and common, because we do want to build these bridges. But from the point of view of that STS community that I'm referring to, I guess it's not the trespassing so much as the risk of erasing, and maybe repackaging, interests in research on research that sit at the interface of science, technology and society; and the society part maybe sometimes disappears from view when we focus a lot on the more methodological questions. And it makes a lot of sense, seeing where some of these questions in parts of the meta-research community come from, I will talk about that later: reproducibility issues, of course, and the concerns that come from those disciplines. And I guess what is shared in those backgrounds, between the STS community and the meta-research communities, is these scholarly identities: people were trained in disciplines and then recraft those identities. I think that's what we all do, and we also build on certain commitments that we care about and engage in, but then maybe in slightly divergent ways as well.
So I wonder if at some point it would be more productive to move away from these more foundational discussions. I think it's necessary to have them right now, and sometimes to push back, but also to look at whether we can find shared topics to work on, and then see if we can find ways to do that from the different perspectives and views that we can bring; thinking of open science, for instance, or integrity issues. So I hope that is a way forward. That said, I do think you needed to push back, and you did, and I totally support that; I just wanted to qualify it a little bit.

So yes, James, indeed, we've been trying to get a very particular handle on this research space, more scientifically. A huge shout-out to Ludo from CWTS, to Helen Buckley Woods at Sheffield, and also Simon Porter at Digital Science, who actually did most of the heavy lifting. So there are a lot of labels floating around, a lot of definitions. What we tried to do, without colonizing anything, though it's very difficult, is to use this label of research on research not as just another field, but as an umbrella term, and to find commonalities within, for instance, the pockets of STS that do research on research; because I don't think all of STS would qualify, in the stricter sense, as research on research. Research on technology, or on whatever happens in society, can also be part of STS, for instance. So what we did was an initial study in 2019, for the launch of RoRI, and a more sophisticated analysis later. But already from that initial study it was really interesting to see how we could clearly distinguish the existing and emerging academic communities involved. You have four prominent ones in the social sciences, including Scientometrics and STS, Innovation Studies, and Higher Education Studies.
And the science of science sits more under the heading of the natural sciences, while the metascience work and communities are more in the behavioral and health sciences, and also parts of the social sciences. So that really triggered our curiosity; we wanted to gain a deeper understanding of how the fields and the topics evolved. So what we did was look at eight fields: philosophy of science, history of science, STS, Innovation Studies, Scientometrics, Higher Education Studies, Metascience and also Science of Science, in the period from 1950 to 2019. We looked at a couple of things; I'll just go through them very briefly and then we can open up for others. There's a larger report coming. What we first looked at was the number of publications in which each field is mentioned in the title or abstract, from 1950 to 2019. And what you then see is basically that, around the 2000s, things really start to accelerate, and you could say there's a publication boom in the research-on-research fields: first in the existing fields, such as philosophy of science, followed by history of science, Scientometrics and STS, and a bit later, basically around the time the issues of fraud and reproducibility started to emerge, from 2013 onward, you also see this metascience community come up. Then we looked at what everybody is working on. We looked at the time trend in the number of papers per research-on-research topic, and the topics we explored included things like innovation, policy, funding, careers, impact, culture and integrity. We found again that this all started to boom and accelerate around the 2000s, with the biggest jump in the number of publications actually on R&D; but also worth mentioning are open science and scholarly publishing, research quality and research careers, which took off from the 2010s.
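The kind of title-and-abstract trend counting Sarah describes can be sketched in a few lines. This is a toy illustration only: the records and the `yearly_mentions` helper are invented for the example, and the actual study used large bibliometric databases and more careful field delineation.

```python
from collections import Counter

# Hypothetical bibliographic records: (year, title, abstract) tuples
# standing in for a corpus of papers.
records = [
    (1998, "Mapping scientometrics", "A study of citation indicators."),
    (2004, "Trends in scientometrics", "Growth of the field."),
    (2004, "Lab cultures", "An ethnographic account."),
    (2015, "Metascience and reproducibility", "Auditing published findings."),
]

def yearly_mentions(records, label):
    """Count papers per year whose title or abstract mentions `label`."""
    counts = Counter()
    for year, title, abstract in records:
        text = (title + " " + abstract).lower()
        if label.lower() in text:
            counts[year] += 1
    return dict(sorted(counts.items()))

print(yearly_mentions(records, "scientometrics"))  # {1998: 1, 2004: 1}
print(yearly_mentions(records, "metascience"))     # {2015: 1}
```

Plotting such counts per field and per year is what produces the "publication boom" curves the panel refers to.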
And when we looked at the countries contributing most to each topic, we noticed that most of the research-on-research work comes from the US and Europe. For example, a lot of the output on innovation policy comes from Europe, and on careers from the US. That also goes for work on integrity, which surprised me a little bit, actually, given all the framework programmes in Horizon 2020 on integrity; but it may be a bit soon to tell, because this data only goes up to 2019. And, finally, we looked at the representative journals of the six research-on-research fields for which we could do that analysis, philosophy of science, higher education studies, innovation studies, and so on, in the period 2010 to 2019. The countries that contributed the largest number of publications are indeed in Europe, the US, the UK, China, Canada and Australia. And I found it very interesting to see confirmed, I guess, that philosophy of science is big in the US, innovation studies is more of a European affair, and STS and history of science sit across Europe, the UK and the US; and that things get a bit more distributed across countries when we look at higher education studies. Much more to say here, but I'll stop here.

Thanks, Sarah. That's great, that's really valuable, and we should say of the study that Sarah refers to, we'd hoped to have it out for today, but it's almost there and will be coming out very soon. We're moving on next to Santo. So we've heard, in a sense, from someone at the heart of Scientometrics; well, Sarah is sort of STS but with a foot in the Scientometrics camp. Santo, obviously, through network science you're at the very heart of one of the areas that's been most dynamic in recent years: the application of network science to these kinds of questions.
And I think many people would see the sort of work that you and colleagues in that field are doing as representative of some of the most exciting stuff going on right now. Could you give us a sense, from that perspective, of both what's now possible and what's becoming possible: where you see your piece of this moving feast heading over the next five to ten years? And, I guess, how you would describe and relate the kind of work that network scientists and some of the big-data analysts are doing in metascience to these other dimensions of metascience that we've been touching on through Cassidy's and Sarah's contributions. Again, sorry, these are all multi-part questions, I apologize for that, but over to you, Santo.

I'm going to take 30 minutes instead of the three minutes you gave me. So, first of all, thanks again for inviting me here; I appreciate this initiative very much, and I wish there would be more of them in the future. In fact, what I was about to say also gives an opportunity to comment on Cassidy's initial remarks: I'm not an author of that book, but I'm a visible member of the community. I think it's certainly never been the attitude to say, oh, here we are, now we show you how to do well what you didn't do well for 50 years. I know a lot of people in this area, especially the two who wrote that book, and I know that's not the attitude. The idea is that we bring a bunch of computational tools and techniques that could not be so productive, or perhaps didn't even exist, at the time when only small data sets were available.
But they make a lot of sense, and they can give unique insights, when the dimension grows, when you approach the so-called infinite limit, which for people like me is in fact easier to work with than when the scale is small. I would make the same kind of point about network science, broadly meant: when I say network science, I prefer to think about everybody who studies networks, including the people who started studying networks in the first half of the last century, in mathematics with graph theory, or in social network analysis, more limited of course in that case. So what's different, what's changed? Okay, now of course we have large data sets, and that's not actually to our credit; it's just the way it is, and it's easy to get this information, easy to process it, and we have the resources to analyze it. What has changed is the statistical dimension. This is very well exemplified by a nice figure that Mark Newman, one of the founders of modern network science, shows at the beginning of some of his talks, in which he presents a huge network on a single slide: a lot of dots, which are completely meaningless, a lot of lines, and so on, and you cannot make any sense of it. Whereas with the small sociograms that people were using before, and not by choice: because they were doing surveys, they could only sample a handful of people, 20 or 30, since at the time it was difficult to get this level of information about social context, you could make sense of a visualization like that, because you could tell this node is important, this edge is important, this group of nodes matters, and so on.
This is really the statistical dimension: the fact that you can characterize classes of nodes and classes of edges, and pushing to the limit of infinite size also allows us to make sense of a bunch of phenomena that would not be visible in small systems, for instance phase transitions, abrupt changes in the behavior of a network. We can make a lot of sense of these things; we know where they come from; we have the background of statistical physics telling us exactly what the options are and what the behavior is. And that's also powerful because it allows us to make predictions. Of course, there are perhaps not so many physicists in the audience, so I don't want to elaborate on that, but this is one of the things we bring into the picture: analyzing large-scale phenomena with the lens of statistical physics. In the meantime, of course, there have been a bunch of other communities; computer science has of course been very heavily involved. It has provided a unique set of very powerful algorithms that didn't exist 20 years ago. It's not that ours were better; those algorithms simply did not exist 20 years ago, specifically clustering algorithms that are now scalable and allow you to break down a system with millions of nodes into pieces that make sense. In the context of metascience, these could be groups of scientists working on the same topic, or groups of papers related to the same topic. And at the same time, of course, you can run dynamical processes on top of that: diffusion processes, epidemic processes, which are so fashionable today, for all these reasons. Through these approaches you can make a lot of sense of things; you can start from a simple model, but you can use the power of these large networks to determine, and sometimes even predict, what's going to happen tomorrow, one week or one month from now.
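The clustering Santo mentions, breaking a large graph into groups of related scientists or papers, can be illustrated with a tiny sketch. This is a minimal example using the networkx library and greedy modularity maximization (Clauset-Newman-Moore), not the specific algorithms used in any study discussed here; the barbell graph stands in for a real collaboration or citation network.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy stand-in for a collaboration/citation graph: two 5-node cliques
# joined by a single bridge edge (a "barbell" graph).
G = nx.barbell_graph(5, 0)

# Greedy modularity maximization partitions the graph into communities.
communities = sorted(sorted(c) for c in greedy_modularity_communities(G))
print(communities)  # [[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]]
```

On real bibliometric graphs the same idea scales to millions of nodes with algorithms such as Louvain or Leiden, and each community can then be labeled with its dominant topic.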
So, specifically in the context of metascience, networks have been there for a long time: the 1965 Science paper "Networks of Scientific Papers" by de Solla Price, who was a physicist like me, an experimental physicist in his case. I don't feel I'm doing anything different from what he was doing: I measure this thing, I try to make sense of it with a simple model. Eleven years later, he followed up with a very nice paper in JASIS, in which he basically proposed the rich-get-richer phenomenon, applied to networks for the first time, which was then generalized to broad classes of systems. And actually, just in preparation for this meeting, I took a look at those papers again yesterday, and I asked myself whether I could have done those things, knowing only what he knew and having the insight that he had; unfortunately, I don't think so, and it's too late anyway. So I don't feel I'm doing anything different in this respect, but of course our community is broader; there are also people not from physics, who of course bring their own computational approaches. So, once more, the take-home message in terms of the new perspective: the larger the graph, the easier it is to discover a simple mechanism that explains the formation of this graph, what's behind it, and that's what I'm mostly interested in, because we're all broadly interested in the same kinds of questions. But we take them on with different approaches, and sometimes we have different intentions; I'm more interested in the mechanisms, for instance, perhaps more than the rest of us, because my background was physics.
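The rich-get-richer mechanism Santo attributes to Price can be demonstrated with a few lines of simulation. This is a toy sketch of a cumulative-advantage process, not Price's actual model or parameters: each new "paper" cites earlier ones with probability proportional to citations already received (plus one, so uncited papers can still be picked), and a heavy-tailed citation distribution emerges.

```python
import random

random.seed(42)  # reproducible toy run

def cumulative_advantage(n_papers, refs_per_paper=2):
    """Simulate a rich-get-richer citation process: each new paper cites
    earlier papers with probability proportional to (citations + 1)."""
    citations = [0]  # one founding paper
    for _ in range(n_papers - 1):
        weights = [c + 1 for c in citations]
        targets = random.choices(range(len(citations)), weights=weights,
                                 k=refs_per_paper)
        for t in targets:
            citations[t] += 1
        citations.append(0)  # the new paper starts uncited
    return citations

counts = cumulative_advantage(5000)
# A heavy tail emerges: a few early papers collect a large share of
# citations, while the typical paper gets only a couple.
print(max(counts), sorted(counts)[len(counts) // 2])
```

Despite a mean of roughly two citations per paper, the maximum is orders of magnitude larger, which is the signature of cumulative advantage (preferential attachment).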
Physics brings me to do that. That does not mean, by any means, at least as far as I'm concerned, that we don't acknowledge the complexity: that people are not particles, not atoms, not electrons; that in many contexts these laws are not attainable; that you cannot forget the characteristics of single domains and disciplines. I'm totally aware of this. At the same time, since we do find some regularities, I will keep chasing regularities, as far as I'm concerned, and try to explain them with simple models, because that's what we do in other domains as well. Looking forward to what's going to happen in the next five to ten years, as part of your question: within the context of network science I can see two interesting developments, which are already going on, and I expect good news from them in the next five to ten years. One is that when you think about the networks of science, people, papers, ideas, science is not a single network; it is a complex organization of multiple interdependent networks. Multilayer networks are the hot topic now within network science; there's even been a book recently published which summarizes the principles, the main techniques and the problems. And I expect this to be applied more and more, also in the context of metascience; I will certainly do that. The other thing is the very promising interface between artificial intelligence and networks, specifically using graph embeddings: ways to embed networks, basically to turn networks into clouds of points. These are now easier to make sense of, because once you have a continuous distribution of points you can run very powerful analyses, and specifically you get the possibility of using both structural information and non-structural information, like metadata or semantic information.
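The idea of combining structural and semantic information in one embedding can be sketched very simply. This is a crude illustration under stated assumptions: a spectral embedding of a tiny invented adjacency matrix stands in for modern graph embeddings (node2vec, graph neural networks), and an invented bag-of-words matrix stands in for text embeddings of abstracts; the point is only that the two coordinate sets can be concatenated into one vector per paper.

```python
import numpy as np

# Toy citation graph among 4 papers (symmetric adjacency matrix).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Structural coordinates: the two leading eigenvectors of the adjacency
# matrix (a simple spectral embedding).
vals, vecs = np.linalg.eigh(A)
structural = vecs[:, -2:]                      # shape (4, 2)

# Semantic coordinates: invented bag-of-words rows (papers x terms),
# normalized to unit length.
texts = np.array([[2, 0], [1, 1], [0, 2], [0, 3]], dtype=float)
semantic = texts / np.linalg.norm(texts, axis=1, keepdims=True)

# One joint vector per paper, mixing structure and content.
combined = np.hstack([structural, semantic])
print(combined.shape)  # (4, 4)
```

Once papers live in such a joint vector space, clustering, similarity search and prediction can draw on both who cites whom and what the papers say.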
So you have the network of papers citing each other, but you also have the abstracts of the papers, the titles, the keywords, sometimes even the full text, and you can use both things at the same time. This is certainly new and innovative, and I expect very good stuff to come from this. But in terms of how I position this with respect to metascience: I really hope that eventually, as Cassidy also said, we realize that we are taking on mostly the same questions, though we may have different goals and approaches. You know, I'm a physicist, so I'm used to a system in which theory dictates the experiments; in this field I feel like I'm the experimentalist, but I need the theories, and the theories come from sociology and from many other fields that I don't know so well. So please explain your theories, so that I can understand better what I can measure and, hopefully, how I can contribute to this area.

Great, thanks, Santo, that's a nice constructive bridge-building note to end on; that was great. Again, lots of food for thought. Can I remind everyone, I can see we're up to a sizable number of participants, do keep popping questions into the Q&A box for when we get to the end of the panel, because we'll be turning to that first as our source of Q&A. Moving on then, our next speaker is Aileen Fyfe from the University of St Andrews. We've already heard a couple of references, Aileen, to the history of science and the philosophy of science. And those fields, of course, have a legitimate claim, in a sense, to be the original metascientists, although perhaps it's not a term that would be used in self-description by that many people working in HPS. And you, of course, in your work, what you've done on the history of peer review and other things, have tried to make some of these links and lineages more explicit.
So I was keen just to hear your thoughts on the role of, and the contribution that, history of science in particular is making, should be making, and can make to metascience in this broader sense. And your reflections on what some of the other speakers have said would of course also be interesting, to try and join all this up. So, Aileen, a view from history.

Thanks, James. So I've been reflecting, ever since I got this invitation, and since I had a little conversation on Twitter this morning, and since hearing the earlier speakers, that thinking about disciplinary identities, and about changing disciplinary identities, is something we in the history of science have wrestled with now for the best part of 100 years. I do belong to a discipline that I think we now think is a discipline, though maybe it wasn't at various points in time. But I should say I was trained in history and philosophy of science. I have since worked in a history department, I still work in a history department, but I still see myself as a historian of science; in my brain, I am a historian of science. What that means is another question, but I was trained in that HPS tradition, and HPS itself has gone through various disciplinary changes, as fewer and fewer people do both the H and the P parts of it, and instead do just one part or the other. And the fact is that in the UK we see a lot of historians of science who are, like me, working in history departments, not in HPS departments; or we see historians of science working, perhaps, in sociology or science and technology studies departments, or in HSTM units, so many acronyms floating around: history of science, technology and medicine units. It's not that common to see history and philosophy of science done together anymore, which is why there's a campaign to bring them back together and have integrated HPS.
I won't rehearse all these debates, but I don't think they change my sense that there is a coherent field that we call history of science, though we recognize that's a shorthand, because our vision of what that 'science' thing is is now much broader than anything that people 100 years ago would have meant by science. We have people in the field of history of science studying technology, medicine, maths. We have some who are studying things that aren't natural sciences at all, but other forms of scholarly endeavour. We also have people studying practices from other parts of the world, which aren't at all like the Anglo-American vision of science but are certainly forms of knowledge. We have people interested in the knowledge of different cultural groups, different people in the world now and in the past, much of which wouldn't have been labelled science if we understand that as modern science, but is surely a way of understanding the natural world and a way of gaining knowledge. History of science covers all of that. So it's clear to me in my head that history of science can contribute to this thing called research on research, or metascience, or whatever we're calling it, though that's certainly not clear to everybody. And I've been quite interested by recent discussions as to whether we should be renaming the history of science as the history of knowledge. Certainly for some of my European colleagues, who work in a different language from English and debate how to translate words like 'Wissenschaft' into English, the history of knowledge seems quite an attractive term. But it does mean something different from what history of science traditionally meant, and it can go in at least two different directions. Some of my colleagues see it as a way of thinking about not just academic knowledge but also craft, trade and industrial knowledge, which maybe doesn't come into research on research.
Others would see it as a way of moving beyond the natural sciences to include the history of humanistic and social-science scholarship, not just natural-scientific scholarship. All of these are debates we've been having for decades, if not longer, in the field we might broadly call the history of science. It strikes me that in my context in the United Kingdom, where many though not all historians of science are working primarily in a history setting, we're probably sociologically influenced, in that we're interested in people and practices and communities, whereas, say, our predecessors 100 years ago were more interested in ideas and concepts and hypotheses, in the content of science. There is a lot more interest now in practices, communities, people, and I think that maps quite nicely onto a metascience or research-on-research focus. But I was then reflecting, James, because you said I was unusual in making these linkages, and thinking: is that true, and if it's true, why? Here's my attempt at an answer. When I trained in HPS, one of the things we were warned off very early on was this thing called Whiggism, or Whiggish views of history and of science, and the dangers of present-centred history. In other words, we were warned to be really, really careful about assuming that the way science is done now is the best, the right, the inevitable way. So when we look at our historical case studies, when we do our fieldwork in historical times, we mustn't look at those and think, oh, they're really poor versions of what we know is going to come later; or think, oh well, it's an early phase of development of something that's going to get much better later on. The important thing is to look at them in their own context, to understand them in their own light, in their own time and place.
I think those were very good insights to have drilled into us: that we should consider things in their own historical context, time and place. However, it does strike me that it has made us quite wary of telling the bigger narrative and of connecting things to the present. I'm still uncomfortable, worried that my research will be labelled present-centred history, because the reason I'm working on things like the history of peer review or the history of scholarly publishing is that these are topics of current concern, and I think it would be interesting to know more about their history. Is that present-centred? I would say not: the questions are determined by present interests, but when I study them historically I am still looking at them through a historical lens and not a present-centred one. But the fact that I feel that slight discomfort, I think, tells you something about why there must be lots of other historians of science out there who are also not quite comfortable about moving out of the history part and starting to relate it to the present, which is what we might be thinking about, I guess, when we're doing research on research in current times. What changed for me was doing a project, as many of you will know, on the history of Royal Society publishing, which had to start at the beginning in 1665 but also came right up to the present. So many history of science studies look at a particular, relatively small point in time, perhaps too much in isolation from the things before and after it; really, for anyone who works on periods before the 20th century, we don't tend to connect up to the present. But I was forced to do so in the work I've been doing on peer review and scholarly publishing.
And that made me realize how valuable it actually is to do that: not just to say things were different in the past, but to say things were different in the past and look at how they changed until we ended up where we are. One of the things I've really come to realize is that the modern academic and scientific practices we use didn't come from nowhere; they weren't created for us now. They have a historical development, an evolution, whatever word you want to use, and we can still see the marks of that historical development in the practices we're using now. Whether it's the natural sciences using single-blind peer review while the humanities use double-blind, there are historical reasons behind that, and there are many other examples. Our different academic communities, our different disciplines, have evolved and developed differently; even the same discipline in different countries has evolved different practices and different habits. And it's history that helps us understand that. It helps explain why I don't personally think we should be looking for universalist laws about how science or academia or research works: I think there is far more variation, and part of it is historically determined. For me, I've really come to value that use of history to explain how we got to where we are, I hope not in a bad present-centred way, but as a way to understand why we do things the way we do. And I think that also helps us if we're trying to make changes. Some of us might be campaigning for change: of all the things we do the way we do them, which are necessary or good or valuable or effective, and which do we do just because it's the way they've been done for the last 150 years, when the way they were created back then was for contexts and situations that don't apply anymore?
And there's really no good reason to keep doing them that way if we can think of a different way that would be better suited to our bigger, more diverse community nowadays. So I think history helps us understand that and helps us reflect on why we do things the way we do, and whether perhaps we should still be doing them, given we live in a different context from the one in which many of these practices started. I'll start there.

Great, thank you. That's great, and very interesting. On the Royal Society work on publishing and bringing this stuff to the surface: having worked at the Royal Society myself for a few years, it is a very unique institution and environment for making tangible those threads that connect the origins of modern science, or at least part of those origins in the 17th century, through to the present day. Remarkably so, unlike anywhere else I've worked. But thanks, that was great: really interesting and valuable, I think, to bring that more visible strand of historical reflection into the discussion. Interesting as well, your thoughts on the history of knowledge debate; we may come back to that in our wider discussion. But we're going to turn now to our final contribution. Charlie Ebersole is talking to us from Virginia, and Charlie, we wanted you to come in at this point and give us a view from within the metascientific ranks of psychology.
Not only of that discipline's journey over recent years, but of this wider phenomenon in science. We've been talking so far, as it were, from different disciplinary starting points looking in on the science system, but a lot of the impetus, a lot of the real energy in recent years, has actually come from disciplines trying to solve problems in their own backyards, whether problems over reproducibility or over equality and diversity, all sorts of things that different disciplinary communities have become much more engaged and energized in tackling, and for which they have then turned to metascientific methods and techniques to provide the evidence to support the change they would like to see. I thought you could, perhaps unfairly, embody psychology's journey through that over the last decade or so, but also maybe reflect a bit more broadly on that phenomenon in adjacent disciplines, because I think it's a very important part of the mix.

Yeah, absolutely, thank you. I don't want to get too much into my own personal history, because I imagine that is of limited value, but in many ways how I stumbled into this area of research is similar to the broader situation that my own academic home, the field of social psychology, was having. Over the last decade we've looked a little bit harder at ourselves and thought, you know, maybe there are some things we can be doing better; maybe we're not producing knowledge and understanding as efficiently and reliably as we could. That's where it started for me: well, social psychology seems cool, I have chosen to pursue grad school in it, and I would like us to do it in a way that we continue to learn cool stuff, do that better, or get there faster.
And we didn't call it metascience; it was just, oh well, to try to understand studies, let's study studies. That was purely the impetus for it. Since then it's been really exciting to tap into all of these broader literatures and see all the things that people have been thinking about and talking about for much longer than I have been alive, let alone doing academic research. But yeah, I think the genesis within at least social psychology has been a very problem-focused one. There were issues we saw within our own field that some of us wanted to get a better handle on, largely with the goal of being able to make recommendations on how we might do better. Earlier in my metascience research, I would have hoped that some of the things we gleaned from metascience in social psychology would be more broadly applicable to science writ large. That's one thing I continue to be more humble about as I meet and talk to other scientists. But I think it's still been useful, even if it's as narrow as: okay, there's metascience, if that's a useful label, within psychology that can help psychologists understand how they do psych research, and maybe help us do it a bit better. That seems useful. To the extent that those lessons transfer to other fields of inquiry, that's awesome; if they don't, they're still useful in a more limited sense. But I think that kind of problem focus as the genesis of research on research in psychology has been a big determinant of how we've gone about it.
And as nice as it might be for helping us come up with practical knowledge, there are definitely some downsides, in that maybe we're not looking enough at the bigger picture, and that's something I definitely had way more confidence about earlier in my metascience career than I should have, and do now.

Thanks, Charlie. That's great, really helpful. Good. So we're going to move into discussion; we've got 40 minutes or so, both amongst the panel, where we started to hear bits of that as people reacted to different contributions, and through the Q&A box, which I'm pleased to say is now filling up, or at least questions are trickling in, which is great. We'll pick a couple of those to start, and as I say, fellow panellists, do use this as a chance to link the conversation across the group, because I think there are lots of threads here we could try to weave together in interesting ways. I'm going to pick the one that's just come in from Oliver Sal as a starting question: how does the panel believe metascientific research should relate to the boundaries of disciplines? We've been nibbling away at this from different perspectives through the panel. He's suggesting there's a risk of needlessly reifying those boundaries through field-specific metaresearch; on the other hand, as Aileen mentioned, the disciplines have their own histories and might merit their own focused research. So in terms of the metascientific project, should we have metasciences of biology, of physics, of psychology, and so on, or try to deal with this in a more holistic, synthetic way? Who would like to offer a thought on that? I'm going to go back to Cassidy.

Thanks.
Sure, because I think this comes back to the unity question, and I want to push back against Sarah's comment, because I think I've been mischaracterized as implying that I believe in a single universal law defining all science, which I absolutely do not, so I apologize if I gave that impression. I was challenging the assumption that the notion of unity, or the notion of universal laws, is new, which it certainly is not. I'll get to this discussion in a second, but I want to build on that and push on the idea that STS scholars reject universal theories. I think there are several foundational perspectives that STS scholars adopt, and that is part of those disciplinary boundaries, right? When we think about what defines a discipline, it is a shared understanding of those principles; that is part of what makes something an epistemological construction. So I would push back: we might call them different things. 'Universal laws' evokes a very quantitative thing, whereas 'theories' evoke something a little more conceptual and less empirical, but they are the same sorts of items; we're thinking about foundational principles. So, to the question of what I mean by unity and what's possible: it's taking some of the areas where we start to align. Stratification, I think, is a really interesting one. If you talk to any sociologist, they believe in stratification; they've observed stratification using their theories. Complex systems studies and network science have observed stratification. We've observed stratification in history and sought to understand it. So it is trying to look at some of these issues through all of these lenses, to understand why they've happened, when they've happened, how they've happened, drawing on all of those different frameworks.
That's what I think is the importance of unity. To do that, you have to have strong disciplinary training; you have to bring your disciplinary knowledge and tools to the table, and then you have to find the places of misalignment. That's what's interesting: when a sociologist presents one view of a phenomenon and a network scientist presents another, what's missing in the gaps? What's there? What can we know? Those are the interesting questions to me. So when I talk about unity, it's really that coordination, that collaboration, that exchange of ideas and knowledge, so that we find the areas of overlap but also the areas of discord. And that means that a reification of disciplinary boundaries is perhaps putting it too strongly: I think we should stay trained in our disciplines and understand them, but also increase the permeability between those disciplines.

Anyone else want to come in on that?

I'll answer this question from the point of view of someone working in a history department, a humanities department, not a science department, because there's a lot of concern, I think, in humanities departments about attempts to use this kind of research to change policy, whether that's introducing open science, or concerns about replicability or research integrity, or any of that. That agenda is driven by the needs of the biomedical sciences, and we can discuss whether all the natural sciences share some of those concerns, but when we move over to fields like history, philosophy, English or French literature, we as practitioners of those fields find it a lot more difficult to see that some of the things we're being encouraged to think about have anything to do with the research practices we follow. And so I'm really interested in the question of how many values and norms academics as a whole do share.
I agree with Cassidy about the importance of disciplinary training. We were trained to do things in certain ways, to believe that these are the right ways to do things and the right approaches to take, and that they are different from others'. And therefore I find it quite difficult to deal with what are really, realistically, policy changes: policy changes that seem to be designed for all academics but are really just designed for some. One of the things I'd like to think this kind of metascience or research on research can do is point out that there is a lot more variation between the disciplines, which we need to understand properly, so that we don't just adopt a one-size-fits-all policy that then doesn't actually fit all.

Yes, I think that's good. Let's take a second question, because these build on and connect to each other. An attendee has asked a very good question for us: how would the panel answer the 'so what' question? If science of science or metascience is important, how can we use this knowledge better outside the primarily academic discussions we're having here around metascience? How can we improve how we carry out research more broadly and engage with others? We're having a conversation here about disciplinary categories within the academy, and I'm very conscious of that, but a very important feature of much of the metascience community is its attempt, as we've touched on at points, to actually improve the way research systems and cultures operate. How do we better connect all of this to practice and change the systems we're analyzing? Charlie.
I'll take a first stab at this, because I see the parallels a little bit up close. In the time since, James, I think you were sent my contact info, I have started as a researcher at the American Institutes for Research, a nonpartisan research organization in the US, where I work on education policy. And outside of the academy we have different groups, different categories, different objectives, and the like. So my first and more obvious answer is: send people with this training, background and experience outside the academy. There are useful things I have learned in trying to do metascientific research, and useful perspectives you gain by identifying with that field and engaging with other people who identify with it, and some of them are useful out in the wider world, in much more applied research with folks out in the field. That would probably be my biggest piece of advice. There are lots of us, and I'm glad there are lots of us, who identify with this; I think what we're doing is useful, and not just among ourselves. We have contact with broader interdisciplinary teams, not only in terms of what their PhDs were in, but in terms of what problems they're actually trying to solve and how these lessons make it easier to solve them.

Yeah, that's a great point. Sarah, you wanted to come in?

Yeah, thanks James, and an excellent question. It also taps back into what Aileen was just saying about the relationship with policy, because I think that's one space in which there's a lot for us to do, and we are already doing it: engaging in, and implicated in, that kind of process as research-on-researchers, or whatever we want to call ourselves.
So: policymakers, funders, publishers, researchers themselves, but maybe also more societal actors outside the immediate ecosystem of higher education and research. But to stay within that ecosystem: I think we can mean a lot and make a meaningful difference, for instance when you consider how funders and policymakers, and also publishers and the ways in which peer review works, set boundaries for what can be researched in the first place, what can be known in the first place, who gets to know and communicate, and who sets policies. By studying and analyzing those practices, and by engaging with policymakers, funders, publishers and other researchers, I think we can make a meaningful difference. Reproducibility is an example I know from close by, because we've had discussions about how it is relevant in different fields, what it means in different fields, and whether we should even use the word. That can result in more subtle policies, or slightly different funding schemes that also allow for different interdisciplinary engagement, and that has effects in the world as well.

Great, thank you. Thanks for that. Santo.

Just very briefly, and of course I agree with what I've heard so far. We study science as a system, various aspects of it, and by understanding it better we eventually also learn how to make it better. As a researcher, and I wasn't doing these things before, because for many years I worked in particle physics and then statistical physics before doing more interdisciplinary things, I would like to know how I can maximize my impact: by interacting better with my peers, by being subject to fair procedures of evaluation when discussions about promotions and hiring come up, and so on.
So, overall, this is a kind of spin-off we hope to generate, at least. It's not my main focus; like I said, my aim is to study science the way we study the universe, in many ways. But in this case there is an obvious implication which affects any scientist, any actor in this system, whether you're a scientist, or somebody who evaluates science, or somebody who funds science.

I wanted to move on. Another good question links a bit to this, and Heather Douglas, a philosopher, has posed it for us. Given that humans are indeed not particles, and will respond to results from research on research, what do you, as members of the metascientific community, think about the ethical issues of framing which questions you ask and pursue in this research? And perhaps one could add: which questions don't get asked, because in some ways that's always an interesting follow-on. How do we think about the ethics of what we do? It goes back a bit to the earlier point about who metascience is for. I think Sarah touched on this in her response about the public and society, but does anyone else want to come in on that?

Let me say a couple of words on this, because this question really resonates with me and is something I struggle with in my own research as well. What Sarah said about for whom and by whom knowledge is constructed is something that guides a lot of my research. I work through these questions trying to understand who gets to participate in science, whose work is valued, received and rewarded, and how our reward structures incentivize certain populations and disadvantage others.
But in doing that, because of the crudeness of the tools I use, I recognize that I reify many problematic classifications: the gender binary, for example, or the racial classifications used in the US. Aileen talked about disciplinary constructs as well: as metascientists we are constantly creating notions of what constitutes a community of scholars and what constitutes an area of knowledge. And Aileen brought up different ways of knowing, moving beyond that tradition to indigenous knowledge and other forms of knowing. In every study we do, we exclude some of that: when we use a bibliometric database, we are enacting very explicit exclusionary practices, saying this is what truth is, this is what constitutes knowledge, and this is who made it, when we know those choices are really problematic. So this is something I wrestle with: taking the best tools available to me right now to try to move forward on policies which I think are really important for improving science, and thereby society, while also understanding that those tools are crude and problematic and that we need to interrogate and improve them. Not destroy them completely, because I don't think we should imagine a mythical past in which, before we were measuring all this, everything was better. It wasn't better in the past. So we need to acknowledge that what we're doing definitely has ethical implications and ethical consequences, and if we're not fully aware of those, and engaged in those conversations as we move forward in our research, we have the potential for some serious misconduct.

Thanks, Cassidy. Anyone else want to come in on that point? Charlie.

Yeah, thanks. It's a fantastic question, and one that I'm really uncomfortable with.
A lot of the metascience that I got to be a part of in psychology took the form of these big crowdsourced replication projects, bringing in teams of researchers from all over, and all of their time and resources, to contribute to a larger project trying to understand how studies in psychology work. I feel really fortunate that I got to be a part of those efforts and to lead some of them, but I also think that who gets to do those, and how, shapes the questions, and that's very difficult. A large part of the reason I was able to build teams for these projects is that my advisor in grad school has a lot of Twitter followers and would tweet out my stuff. Not everyone has access to Brian's Twitter, and in many ways that's a horrible way to decide a research paradigm. I should not have an outsize influence on what kinds of studies get done in that space with that level of resources, because if people are limited by what I can come up with, they're in trouble; that's insanely limiting. I think a model like the Psychological Science Accelerator, which I was fortunate to work on a lot, is a good way to tackle some of these questions and the ethical dilemmas of high-resource metascience: it's a large network with democratically elected leadership, which selects projects to pursue and decides how to devote resources through democratic consensus. I think that's a better way, because I'd like to think that bringing more minds in these fields to the problems produces better outcomes. To the extent that metascience is a particularly resource-intensive field of inquiry, I would say: as much as we can, let's make collective decisions rather than leaving it up to what some schmuck at UVA feels like doing.
Yeah, that's a pretty bad way to organize a field. Great. Moving on, a question from Megan Hicks about, I guess, policing our own field, or at least maintaining the standards and quality that we would often demand of others. She asks: when carrying out metascience research, how can we avoid inadvertently falling into some of the same traps that lead to the problems we've identified in that metascientific research? For example, misuse of statistical methods and concepts happens within metascience research, even when studying problems with those same statistical methods. Another way to frame this might be the John Ioannidis question with respect to COVID: how someone who was in some ways a giant of metascience research ends up embroiled in a very heated and debatable set of discussions around COVID data. Maybe that's unfair on John, but it clearly is a question for the field, particularly when someone that prominent gets dragged into something like that. Any thoughts on that? Santo.

So basically the question is whether I should use the same techniques that I'm at the same time studying, at the same time criticizing?

I guess: how do we avoid falling into some of the same pitfalls that underpin the starting point for the more critical analysis?

I don't know. Of course, there are big discussions about p-values, for instance; recently, I think, there was even a question on that. And I speak in this case on behalf of those who use some of the tools that I use. We do discuss things like significance, and of course when we need them we use p-values too, but I don't really see.
I don't think this is really a big problem for the type of analysis that we do. I understand that if you're engaged in statistical analysis, statistical inference especially, there could be issues like that, and occasionally I ask this of myself, and I try to avoid this conflict, so to speak: why not investigate some of the practices that are not really settled, or well established, or that are criticized or definitely improvable? But I don't see this kind of conflict in my own studies.

Charlie, have you grappled with this at all?

Oh yeah, absolutely. The obvious answer is that we form the field of meta-metascience so that we can all keep writing papers on something we say is new. But no, it's really tough. The main thing I would emphasize is the extent to which we get a little tunnel-visioned on particular parts of the process. At least in the metascience I've been involved with in psychology, we absolutely have the opportunity to engage in a lot of the practices that we think are potentially suboptimal, based on our research. We pick some of those things to study, and as we learn that maybe we can do things in a more rigorous or efficient way, we're probably more likely to do them ourselves. There's certainly some social pressure there: it would be bad to publish something like 'I don't understand why p-hacking is bad'; someone's probably going to see that and point it out. So there's some internal pressure. I think the harder thing is not getting too wrapped around the axle on a given topic.
So within psychology, to stay with that example, there's been a lot of emphasis on reforming and better understanding the consequences of how we use frequentist statistics and run our models. That's good; I think we can get better there. But I think we've learned there's other stuff we're probably not doing as well. The one that comes to mind is measurement: in a lot of psych subdisciplines we should spend a lot more time developing better and more valid measures. And to the extent that I'm spending all my time thinking about how to be better at pre-registering studies, that takes away time from thinking about those other elements that could weaken the work. Again, that's why I'd point to larger sustained networks of labs, like the Psychological Science Accelerator, as a potential way forward: you have lots of content experts who each take pieces of projects to look over. It turns out there are a lot of really smart people out there who work on research, and one of the big struggles is getting them all to talk to each other. But if you can create networks where they do, and really leverage people's individual expertise, I think that fights against it. But yeah, we've got the same problems that we seek to study, and trying to be vigilant about that is an ongoing struggle. We've got a few more questions coming in, but while we let people type a few more, let me ask a broader question of the panel, picking up on some of the points made at the start by Cassidy and also touched on by Sarah.
The institutionalization of this field, if one can call it a field, is a very interesting question, and perhaps, as ever in academia, the battle for resources and status, and the policing of different disciplinary and epistemological boundaries, lies at the root of a lot of the tussling that goes on. I just wondered whether any or all of you had thoughts on where metascience goes from here in terms of its development as a project, as a set of practices that are of course very widely distributed across the university research system, such that it doesn't seem obvious, to me at least, that the solution is the creation of a department of metascience. That would almost seem to be the opposite of what's required. So I'm interested in people's thoughts on that, and also, inevitably, on funding and money. Cassidy in particular, you've not long ago come off your tour of duty at the National Science Foundation, coordinating what was the SciSIP program and the other science-of-science programs. So particularly, Cassidy, your thoughts from that experience, but others as well; we've all had experiences, good and bad, of the ease or difficulty of getting the resources to do the kind of work we're talking about here. So, two big questions: institutions and money. Any thoughts? Cassidy? Yeah, absolutely. I think institutionalization is in some ways an organic process that communities of scholars follow: they institutionalize around conferences, then journals, then eventually degree-granting programs, moving through those levels of institutionalization. And I think you pointed out a huge reason why: resource allocation. Institutionalization is an easier avenue for resource allocation than distributed, umbrella, horizontal kinds of activities.
There is something about being a vertical unit that makes it easier to address those things. I was fortunate, and I think the US metascience community is fortunate, that there is a funding agency devoted to funding this area of research, and that gives it a home. Otherwise, most metascientists would have to try to fit into homes that really weren't aligned with them, and be reviewed by panelists who don't understand them. So bringing them together in that way was really important. One of the things we did there was to give each of our panels the same kinds of representation that we see on this panel today, so that people could look at a proposal from a sociologist on the grounds of sociology rather than economics, and look at an economics proposal from an economic perspective when doing that evaluation. Because you had a diverse body, they were able to represent the metasciences really well, and I think that's important. That brings a really important point into this conversation about institutionalization. It's not just that we've failed to institutionalize departments of metascience; we've moved away from the few examples that we have. Indiana is lucky to have an HPS department, history and philosophy of science, that sort of brings us together, but most people doing the metasciences are all over the place: Santo's in an informatics program, I'm in a school of public policy, Aileen's in a history department. That's part of the nature of our field, and I think the real difficulty is that we'll have to look to institutionalization, which leads to resource allocation, in very different ways. Coming together as a community will be necessary for that, but it may not happen through the same mechanisms as other processes of institutionalization. I think that's it on the departmental question.
Just to reflect briefly from a UK perspective: historically, the big centres of research in this area, Sussex with SPRU, Manchester, were all swallowed up, as it were, by business schools, because that was where the money was. And that then drags the centre of gravity towards innovation studies, which is of course an important part of the mix but not the totality of it. So that's another thing. And if I may, before we go to Sarah: that also changes the evaluation practices for those individual scholars. When you move into a business school, you're now subject to business school accreditation requirements, which means business school publishing requirements, which means the way you frame your questions and the kinds of collaborations you pursue are going to be constrained by those disciplines. So institutionalization actually matters for the kind of knowledge that you can produce. That's a concern we have to think about: those evaluation structures, whether the macro-level REF-type structures or the individual promotion-and-tenure structures, are going to influence the kind of scholarship that comes out of the metasciences, because we lack departmental homes. Yeah, absolutely, good point. Sarah? Yeah, I fully agree with everything Cassidy just said. In the Netherlands we are in the process, together with funders, of trying to carve out this space a bit more, and also of helping them better understand what this can be. They see that there's something going on in this space, but they're genuinely not sure what it is, and how dispersed it is, for all the reasons you just mentioned, is indeed an issue. So a lack of institutionalization, in a sense.
I was also wondering about institutionalization done with a critical eye, taking on board some of what we know from evaluation studies, and what we know about open science, how to do open science, and open knowledge infrastructures. Would we then start to launch new journals, and for what purpose would we do that? Those types of structures do have some purchase, for all the reasons you just mentioned, but I also see some clashes with other value systems that we engage with as meta-research, or research-on-research, communities, around openness, for instance. I'm really curious to see, and maybe to discuss further, how we think about and how we will productively engage with institutionalization processes together. Thanks, Sarah, that's great. Aileen, over to you. I've found my mute button now. You see, I don't think I would want to join a department of metascience if there were one. Initially I was thinking the problem here is like that of other interdisciplinary fields we've seen in the past: women's studies, a lot of the political sciences, and going further back, book history is one I know, which have eventually taken on the institutional forms that Cassidy was talking about a moment ago. But my areas of interest, if we're doing Venn diagrams, overlap with metascience without being entirely described by it. And on all those points Sarah just made about publishing practices and norms of evaluation: I think I'd be happier staying in a history, or history of science, structure where the way I work is understood and valued. What I'm looking for, I think, is ways to have the conversations and the communication, ways of working together on a shared problem with people from different backgrounds.
And wanting to do that, I'm not sure, is the same as wanting university institutional structures that would undermine or cut across my disciplinary identity. So that's where I am, listening to you. Am I allowed to ask a question of my fellow panelists while I'm here, James? I was thinking of something Sarah said a while back, when she was talking about the literature review of where metascience has been done and in what disciplines. I think you said that this interest in research on research, or science of science, or whatever we want to call it, is mostly a North American and European phenomenon. What I've been musing on is: is it culturally imperialistic of us in Europe and North America to set about studying how the science and research done mostly in Europe and North America is done and should be done, and to expect the rest of the world to follow suit? Or are we just failing to recognize and respect different ways of creating knowledge? It sounds a little like a version of "let's all look at the scope of the Web of Science database and assume that's everything there is in science", which we know is not true for a whole host of reasons. So how do we avoid that in this new field, if I'm remembering what Sarah said correctly? Yeah, sorry, I want to respond to that. I guess I couldn't really do justice to the subtleties of the analysis, but I think you have a point: of course we now reify some of the issues. It's the point Cassidy made, basically, about what you start out with: the types of data you start out with, and the types of classifications you use, which you come up with partly yourself but also by drawing on classification exercises, and which lead to these end results. I think it's a really important point.
And a lot of discussion also goes into considering what openness, for instance, means in different geographical contexts, and what affordances and capabilities different regions have. Those kinds of subtleties you can't see in an analysis like this, but they are being researched, I think. There are also huge power differences and inequalities, funding inequalities and infrastructural inequalities, and that is something some of us are also addressing, but that I perhaps couldn't touch on in the analysis. Santo, yes, you wanted to come in. Does the institutionalization challenge look different from where you sit, Santo? Yeah, actually, I want to comment on both things. When I study, for instance, citation, I don't really cut anybody off: you have all the journals that you can study, if you can build the data and analyze them; people doing science in any part of the world are in there, and the same goes for collaboration. I tend to be as comprehensive as possible. And of course most of the contributions come from Europe and North America, because that's the way it is. But when I try to look at general properties, for instance of a network, or to make predictions, for instance of the future evolution of the number of citations of a paper or an author, I really try to be comprehensive. And I would say that's what happens in many studies I'm aware of. When you study things like mobility, again, there is of course a dominance of a certain part of the world, but everybody is part of it. I don't really see it as a big problem in our area. But I see the principle, right? Especially when you focus on specific practices, it could be an issue; it's just not much of an issue in our case.
The question of institutionalization is a general one. I've been engaged in interdisciplinary endeavours for a long time, and we all have the same problems: the complex systems community, the network science community. And the thing I find rather ironic, both when I was in Europe and now that I'm in the United States, is that funders usually say they encourage interdisciplinary research, but then you're evaluated, most of the time, by panels of strongly disciplinary people, and there's no way to make everybody happy, because there is always someone who will say, "but this is not strictly computer science", or "this is not entirely physics", or "this is not sociology; where is the sociological expertise?" So I think this is really the structural problem facing any type of interdisciplinary endeavour: ours specifically, but I would say in general. Of course, departments would be ideal, but I share some of the concern here too. I don't consider myself one hundred percent a metascientist; I come from a physics background and I do many other things at the same time. But centres, I think, are good common ground. Centres give recognition to people who can be members of different departments; at least we're represented by visible entities. The very first steps in the development of network science, for instance, were the establishment of centres within some universities, and then some of those even created PhD programs. Now there are at least two departments of network science that I know of, but it took twenty years. It's a very slow and painful process.
Of course, I don't want to leave my department; or rather, I could be in a different department, but I don't want to be so narrowly focused. It's a very slow process, because it engages with institutions, and there are structural barriers that are difficult to overcome. What I would do is make large initiatives more visible: large conferences where as many people as possible, from different perspectives, come together. And I would look very favourably on the creation of centres in various places. There are a couple that I know of, some under different names. I think that could help. Great, thanks, Santo, those are good points. Right, we've got three minutes left. We've answered most of the questions; a few are directed at specific panelists, and we can perhaps deal with those by typing the answers. But just a final round, if I may. I'll ask each of the panelists, in thirty seconds or less: if there were one thing you could do, or see happen, for the field of metascience, however defined, to strengthen and improve it and push it in the kinds of directions we've been talking about over the last ninety minutes, what would that one thing be? Charlie, I'm going to pick on you first. An infinite pile of money. No, I think that final question is a really good one, and I would hang on it and say: broader global diversity and institutional diversity within projects and programs, with power and resources behind them. Resources are very unequal in a lot of research; they go in biased patterns to particular people. I think we'd be better off if a lot of those people gave those resources and that power to others who have less chance of getting them naturally in the system we've created. So: larger collaborations that are much more diffuse in how they distribute resources.
Thank you, keeping them nice and punchy. Aileen, you're next. I think, and I know it's probably unfashionable to say when we're cutting back on our travel and doing everything online right now, what I would like is more opportunities to meet some of you on this panel, and some of you in this meeting, and actually chat and get to know what we have in common and what our differences are; to have those conversations, or those arguments, whatever they turn out to be. Because most of the events I go to are with other historians, and in this kind of venue we're having a bit of a conversation, but we could have so much more. So I guess I want someone to bring us together so that we can really have even more conversation and argument and controversy and whatever else. Thanks. I think we all second that. Thanks, Aileen, very much. Santo. Oh, no, sorry, I was just lowering my hand. Oh, okay. Sarah or Cassidy, any final single thought? Just that I agree with Aileen; in a sense I tried to say that earlier. We need more of these things, ideally live, and to get a bunch of funders there too; that's also what's very nice about this occasion. And I would add: not only opening the doors to our own interdisciplinary space, but maybe opening them even further, to do transdisciplinary work with, and collaborate with, the people we're actually also trying to influence and have an impact on. Great, and Cassidy, a final word. My thirty-second one: collectively invest in doctoral students. I have been to so many amazing doctoral forums in the last year or two, the Vermont Complex Systems workshop, the CWTS, the Syracuse iSchool summer school, and all of these doctoral students already embody these values. They want to use every tool, method and theory available to them. They're curious, they're innovative, they're engaged, and we should do our best not to train that out of them.
So I hope we can continue to collectively invest in them and train them for the kind of future we want to see in the field. Excellent, yes, another very good suggestion. Great, well, thank you all very much. I think that's been a really rich, productive and positive discussion. Even looking at the metascience meetings, the two that there have been, you can see the progression in the framing of the meeting, and discussions like this are also steps forward in facilitating and enabling these kinds of important foundational discussions. Certainly from my perspective at RoRI, I would like to thank and pay tribute to Brian Nosek and colleagues at the Center for Open Science, and our colleagues at AIMOS as well, for the opportunity to bring together the people that we have, even if only virtually, for discussions like this. The one other thing you can all do, of course, is join part two of this discussion, which will take place on the 24th of September, or for some of you on the morning of the 25th. I think it's about midnight in the UK, so you'd have to be very committed to hammering out what metascience is to stay up till midnight for it, but it will all be online. Metascience part two will be moderated by the wonderful Fiona Fidler, with another fantastic panel, and they have promised to listen to and draw on our discussion. So we look forward to seeing them pick up some of these threads, and no doubt add some of their own, as we move into part two. But we come to an end now, so let me thank again my fantastic panel. Cassidy, thank you. Thank you, Charlie; Aileen, thank you very much. See you soon. Thanks, everybody. Thank you all. Bye.