All right, so thanks for joining us, everyone, and also thanks to the organizers for including us in this amazing program. My name is Kristen Brothel-Harwitz, and I'm a social and behavioral scientist administrator in the Office of Behavioral and Social Sciences Research, or OBSSR, at the National Institutes of Health in the United States. I am a social psychologist and cognitive neuroscientist by training. Today I'm glad to be here virtually, co-moderating a panel on practical and responsive science in the age of COVID-19 along with my NIH colleague, Dr. Luke Stoeckel, a program director at the National Institute on Aging. I think we can all agree that the COVID-19 pandemic has made abundantly clear that science needs to be ready to respond quickly and practically to our most pressing issues, which can have a significant impact on both near- and long-term morbidity and mortality. With COVID, this included both a need to tap into relevant expertise and a need to rapidly support new research on time-sensitive solutions. Concurrently, the pandemic has also been a quickly evolving natural experiment, affording a rare opportunity to study myriad biomedical, epidemiological, public health, and social, behavioral, and economic phenomena. Bringing together members of the research stakeholder communities, today we'll present an evaluation of past and present efforts to direct and support quote-unquote fast science, including research strategies, resource and infrastructure development, and funding models. We'll discuss issues of practicality and responsiveness in science more generally, especially in the social, behavioral, and biological sciences, and we'll also respond to questions and comments from the metascience community. In addition to myself and Luke, speakers will include Professor Elliot Berkman of the Psychology Department at the University of Oregon, Dr. 
Bronwyn MacInnis of the Infectious Disease and Microbiome Program at the Broad Institute of MIT and Harvard, and Dr. Adam Russell of the Applied Research Laboratory for Intelligence and Security at the University of Maryland. Each speaker will take a few minutes to share their perspective on this topic. We'll take a question or two from the audience, and we'll have additional time for questions toward the end of our session today. Please enter all of your questions into the Q&A section of Zoom, and I believe the whole audience also has the option to upvote questions, so please go ahead and do that so we know what's most important to all of you and can prioritize accordingly. So we'll now get started with Professor Berkman. Hi, Kristen. Thanks for that introduction. Hi, everybody. I'm Elliot Berkman, Professor of Psychology at the University of Oregon, and I co-direct the Center for Translational Neuroscience there. My background is in social psychology, social neuroscience, and health behavior change. For a long time I've been really interested in this metascience question of how individuals and fields choose the questions that they ask. I find that psychology might be particularly guilty of being heavily theory-focused. I mean, I'd say we are infamously theory-focused. There's nothing wrong with that per se, but the theories that we deal with tend to be incredibly abstract. And by abstract, what I mean from our perspective here is decontextualized. Psychological theory is often posed in terms that make it sound like it's expected to apply to all people at all times. And of course, we kind of recognize that that's a conceit, that that can't possibly be true, but at the same time, we don't often actually dive into the details and really understand under what conditions, for whom, and when a theoretical framework might apply, which I think is ultimately what this kind of relevance and practicality discussion reduces down to. 
So when thinking about how COVID might change the social sciences and the behavioral sciences in particular, I thought a lot about a conversation that my father told me about. My father's an economist, and he had a conversation with a really famous Nobel Prize-winning economist after the 2007-2008 financial crisis. This prominent economist was really quite rattled and shaken by the crisis, not just at the magnitude and the harm, but really kind of on behalf of the field of economics. And this person told my father he felt that econ had failed. Econ had just failed in its mission because it didn't foresee the crisis, and then once it started happening, it had no solutions, which was a pretty stunning admission for somebody of that stature. And I feel like COVID really applies in the same way to the social and behavioral sciences. I feel that psychology failed to predict the main barriers that we're seeing in terms of compliance with COVID protective behaviors in the early days before vaccines: the question of how do you get people to wear masks? How do you get people to wash hands and social distance? It's interesting now to do this forensic analysis and look back and say, what were people saying? And there are several papers out there that were just this broad survey of, like, here are all the possible psychological phenomena that could be at play. And that's true. I mean, those are all relevant. But as a field, we really had no answer to the question of, what is going to be the main barrier? Which I think turns out to be something like political polarization in combination with misinformation. So some people said those things were relevant, but we really didn't flag those as, like, these are the things. And then of course, the second problem is, even once we identified them, we have no real solutions. 
I mean, especially in stark contrast to the biological sciences, where there's this amazing achievement of 11 months from we've sequenced the genome of this thing to here's a vaccine, the behavioral sciences have had so many years to say, okay, how do you get people to comply with expert recommendations? And frankly, I mean, there are some things around the edges, but we have nothing, I mean, we have nothing that's really at that scale. So it's been very frustrating and disappointing. And I have a lot of thoughts on what we could do differently. Some people will say, well, the behavioral sciences are harder, they're more difficult, they're a little younger, and that's all true. But I also think that's being too kind to ourselves. I think we really need to introspect and say, what happened here? Why did we fail to understand what were going to be the most important psychosocial predictors or barriers to behavior change, essentially on a societal scale? And why were we unable to produce solutions? Or why do we continue to be unable to produce real actionable solutions? So I'll just leave it there, and I think we'll have some rich discussion around what happened and then, you know, how we can proceed. All right, great. Thanks so much. I will now shift over to Dr. MacInnis. Hi there. Thank you so much to the organizers for the opportunity to be with you here today. As Kristen said, my name is Bronwyn MacInnis. I'm based at the Broad Institute of MIT and Harvard in Cambridge, Massachusetts, where I am the director of pathogen genomic surveillance and co-lead the Broad's Global Health Initiative. I think it's fair to say that the use of genomic data, for which the Broad is probably best known, has really had a coming of age during the COVID crisis. 
But really, the idea that genomic data could help revolutionize the way we think about infectious disease, evolution, pandemic preparedness, and pandemic response has been percolating for at least a decade, since the advent of next-generation sequencing technology. As we move through this pandemic and hopefully start to turn the corner on the acute phase, I think it's a great opportunity to take a step back and reflect on the response of the genomic science community: the use of genomic data for tracking the evolution of COVID, identifying variants of concern, understanding their impact, and adjusting our response in light of them. We have tremendous learning opportunities from that. And it's really not about the science; it's only partially about the science. I think that's why this conversation is so important. Over the last six months or so, the global response of standing up a de facto genomic surveillance system has been an incredible success. It's fragmented, it's not perfect, but I think it's more or less happening around the world at some scale now. But I think we are a long way from a sustainable operation that will serve us for threats in the future, and how we make sure that the gains that have been made during the short time of COVID are entrenched, supported, and sustained into the future is a key question that goes far beyond the science that underlies it. It really brings in elements of social science, as we've just heard, economics, global health, global policy, and real continued advancement of the science itself. So I look forward to this discussion. I hope I can bring some perspective from my angle of bridging fairly high-tech science with ground-game public health, using it in the COVID response, and thinking about how we can make sure that we leverage these gains in the future. All right. Thank you very much. I will turn it over to Dr. Russell now. All right. Thank you. 
So again, I want to encourage the audience to drop any questions that you might have in the Q&A. We will keep an eye on that. Also, once questions are in there, the audience is welcome to upvote any questions they find particularly interesting. So I am going to speak a bit about NIH and the COVID response, but I just want to take a break here, since we have three experts whose views I really want to hear on some important questions. So I think Luke and I have some questions, but first I wanted to see if the speakers had any additional questions for each other. This is a chance for me to ask, like, my heroes questions. Yeah, if you'd like to kick it off. What do we do next? It's a good question. I'm just reviewing the question posed in the Q&A, so I'm checking it out. Yeah, I can summarize. I was looking at them there and they got dismissed somehow, so I reopened them. So the gist of it is, there are a couple of them, and I think they're all linked, but essentially, my takeaway from this, there's a lot in this comment, but I think COVID has exposed both the strengths and weaknesses of our science system. And in the interest of this particular conference, metascience and taking a look at how our science operates and functions: what have we done well? What haven't we done so well? How has the community viewed science as a result of how we've responded to the COVID-19 pandemic? And how do we need to move forward in the world of science in a positive way, with very simple recommendations? And again, you know, I think Kristen and I have a bias towards the behavioral and social sciences, for the reasons that Elliot highlighted at the outset as, you know, glaring weaknesses that I think were frustrating for many of us. 
But if you could comment on that. One thing this person points to is preprints in particular, in the open science movement: preprints were a very effective communication strategy for exchanging information throughout the pandemic and getting research circulated in a rapid way that, prior to this age, wouldn't have happened. So things like that, just practical recommendations, thoughts on the strengths and weaknesses from your various perspectives that the COVID-19 pandemic highlighted, that kind of stuff. Yeah, I can chime in. I guess I have the weaknesses and strengths, and I'll start with the weaknesses. I mean, I think the big challenge, or to Adam's question, you know, what do we do? We need culture change in, at least I'll speak for psychology, and I think this applies to all the behavioral sciences. We need culture change to be more problem-focused. Again, as Adam mentioned, this question of what problem are you trying to solve? That is not typically the way psychologists think about their research. They think about their research in terms of theory, right? Like, what is the prediction I'm trying to test? What is the theoretical model that's under examination? I don't want to go too far, but you could almost say we kind of need to abandon that, or at least recast the way we think about theory, to think about problems, right? What problem are we facing? You know, sure, there are a lot of theories about persuasion, and really none of them actually solve the problem of how you get people to believe experts, right? Like, that is the problem. The problem is people are not listening to expert recommendations that are based on real data, you know, that kind of problem. 
And, you know, there are theories that sort of speak to those things around the margins, but we don't do that. And, you know, it comes back, I know this comes up a lot in medicine, to the incentive structure. And for certain parts of the incentive structure, I'll implicate ourselves: academics, our peer reviewers, our peer-reviewed journals, our tenure and promotion committees. You can publish paper after paper testing these abstract theories and finding support for them, and get tenure, and just be totally fine. And so until that changes, this isn't going to change. On the strengths side, I think there's some reason to hope. One is the funders: as I was pointing out, all the ARPAs, NIH certainly, even NSF to some extent, kind of get it. And certainly private foundations also get it, right? They don't care about your theory as much, you know, NIH, no offense or anything. I think that's a strength. You have to have some sort of theory in the proposal, but it really is about what is the problem you're solving and how is the science going to solve that problem. So I think the funding is actually there. It's really on us. And then the other bright spot that I really see, to me it's the thing that actually kept me in this field, kept me from just leaving outright, is this next generation of scientists: the trainees, the current graduate students, even the current undergrads, I'd say maybe especially the current undergrads. They totally get it, because they lived through this and they're like, this is real. Science on the one hand saved us on the vaccine side, and science completely failed us on the human side, you know, how do you persuade people to listen to advice and to sort of get along, and how do you prevent group polarization? We completely failed. And I think that sense is really palpable for this young generation of scientists. 
And I see them actually, they're going to be the ones that will change the field, because they're going to be intrinsically motivated to answer problems and not test theory. Bronwyn, do you want to jump in? Bronwyn, you had. Well, I mean, I would just echo what Elliot said and add, I guess I'm thinking of it from perhaps a closer or more focused lens, about the bridge between academic science and our public health response. I think what we've learned during the pandemic is that there's a gulf between those two. It's a bit of a wild card where information and data and decision making live for public health response versus the real kind of operational implementation of that. There's a lot of really creative thinking, and I think valuable contributions are being advanced in the academic sphere, but it's staying an academic exercise. And it's incentivized by publication. It's just not quite closing the gap, where the product of the academic ecosystem is really informing in real time the way that public health decisions are being taken. I think steps have been made to close that gap. But one recommendation that I would have coming out of this is to formalize ways to break down the barrier between what's happening in academia and what's happening in our public health ecosystem. Yeah, no, that's a great point. Especially, you know, when I said DARPA: DARPA exists in part to do really innovative R&D, but also ultimately to transition that to use. And whenever, you know, the social scientists would stand up, they'd say, you know, where's your transition? And that's really hard to prove when it's knowledge, or it's a different way of doing things, right? Methodologically, to me, transition is when more people at DARPA are pre-registering their studies. Like, that's a win, right? But it's really hard to demonstrate that sort of thing. 
And so I think thinking of transition as steps in that supply chain, rather than just the end, is really important to communicate, because that also puts the responsibility and the onus on the people in each step to realize the thing you're doing right now may be used on you eventually, right, or may come to impact whether or not we get out of this. I'm still torn, though. This is great, the sort of flagellation is necessary. You know, the mea culpa sort of stuff is great. We need to do this as a retrospective. There's still this element of, you know, 1820, 1920, 2020: pick the year you want to go through a pandemic in. And you've got to go with 2020, right? At least, all indications at the moment. But I do think there's some value, and maybe I'll do this afterwards, in taking what the supply chain folks are learning from their experience of 2020 and actually mapping it to ours. So if you think about supply chains, folks are now thinking hard about how do we do the modeling in advance to uncover hidden risks, right? Things that are not obvious. And I'd say there have been people in science who've done that fairly well to date, but not enough of them, right? By doing things like modeling and simulation to understand what happens when our foundational research is premised on questionable research practices. And now suddenly we need to answer questions like, you know, what is the ultimate impact of public health interventions, those sorts of things. Where are the hidden risks? And I think, to both of your points, very few models I saw, and this is particularly relevant to the United States, of course, other areas of the world are struggling just to get vaccines, but very few models I saw ever predicted that demand would be the bottleneck for vaccine distribution. 
That should be a hidden risk, right, that we didn't anticipate, but it also indicates that perhaps the hidden risk in this supply chain was science communication, right? As you point out. We're not thinking about that in terms of the very crunchy, gritty details, but I think modeling would have revealed that if we had made a concerted effort to explore it. And there are a number of other interesting lessons. One of the other things that is sort of a tie in the win-loss column is process adoption, or technological advances in terms of process adoption. Yes, preprints are awesome, but just as with social media, they have shown that the ability to increase noise as well as signal is huge there as well. So I think there are still significant technological and process innovations that we need to adopt earlier on within that supply chain, as it were. I'd say the last win that I saw, at least what we did well, is that this sort of flagellation is really important, because I think it creates and continues to perpetuate the scientific community's general ethos, which is: we really do care. We really will answer the call when it comes. It's just a question of, as Elliot points out, have we been sharpening our axes well enough to be able to solve that problem? I do actually hear a lot of concerns from people who are proponents of curiosity-driven research, which is important, I'm not disputing that. But even at that level, you should be thinking in mind of what problems this could solve. And so from this supply chain lesson of diversify your stock base: if you're hitting something truly foundational and fundamental, you should be able to point to 15 problems that might be solved in this regard. But if at the end of the supply chain there's no problem, or at least none that you can identify, it really does call into question, like, okay, is there a better use of that kind of time? 
And I'm actually reminded of, and I'll stop here, I promise: part of the reason why I've gone on to ARLIS, where we're working on things like cognitive security and disinformation, is a story of a famous physicist visiting a national lab, asking the graduate students, what are you working on? And they all went around talking about their projects. And then he says, well, what do you think are the most important problems in physics? And they said, oh, well, that's quantum mechanics. So why aren't you working on that? Right? If that's the most important problem, get on that. And I'm excited to continue to help with what I think is probably one of those important problems. But I think the same reminder should be shouted high and broad from funding agencies: let's get after the most important problems, even if they're the hardest, right, Elliot? I just want to acknowledge, though, both from the point that Elliot made earlier and some of the points you're riffing on there, Adam, that while the science that delivered or enabled the development of the vaccines that are now in people around the world did come quickly, and of course that is in huge part due to the funding and just the human resource and intellectual power that was put behind it over the last 18 months, it is standing on the shoulders of decades of basic hypothesis-driven research in RNA and mRNA biology, not even thinking so much about vaccines or products or kind of health solutions. And so I think we can't celebrate the success of that end of the supply chain without looking back at the beginning of it and how we got there. Yeah. I want to riff on that a little bit. Oh, go ahead, Elliot. Go ahead. Well, I'll just quickly say, let's talk about that. Because part of the solution that I have sort of presented is to try to explode the distinction between basic and applied science. I mean, I think that's an artificial distinction. 
And I think it's actually just completely unhelpful and counterproductive, because, especially in the social sciences, the so-called basic has been elevated above the applied. And that's true in some other fields too. But because by basic you mean sort of theory testing, and by applied you mean sort of contextualized, there's a way to just do both. I mean, there's no real need to pick one or the other. And I would conjecture, Bronwyn, that a lot of the sort of foundational work in mRNA and that kind of stuff was along the lines of what Adam's describing: like, well, this is a so-called basic thing, it doesn't have an immediate application, but it clearly solves problems that people foresee down the road. Questions like, well, how would one fabricate this stuff? How would you create these proteins? We don't do that in the behavioral sciences, really. I mean, I honestly see a lot of theoretical work where there's just no concept of, well, there is this problem in the world; this research doesn't actually solve that problem, but it clearly makes a step towards it. At least you need to consider that. And yeah, I'll just stop there. Luke, I want to hear what you have to say. I was just going to kind of riff off something that Bronwyn said and get very specific. I'm really interested in how, for lack of a better description, the biological and the behavioral and social sciences, as kind of silos, have handled this pandemic. And so one thing you said, for example, that we tout as a tremendous success: early on, the message was definitely, this is a communication issue. The message was out there: in 11 months, we have a vaccine. Wow. And then everybody else is like, whoa, based on the science, we're talking 20-plus years of research that has gone into this. 
And from the communication side, how the message is perceived by the public: we know that for issues like vaccine hesitancy, for example, the quickness, the perceived pace, was actually a barrier for a lot of people. You're speaking to my heart, Luke. Yeah. So I mean, that's the kind of thing. How did we, in our field, do such a bad job with that? Anyway, this is a kind of topic where, it's very concrete, it's very specific, and I'm sure we all have very, very specific thoughts about this. I'd love to shout that from the rooftops. If anyone has a platform that reaches a lot of people, I think it's a message that we have missed, that we have failed at. When I speak to folks who are vaccine hesitant, in one-to-one communications or panel discussions, it's always the point I hit first: we did not pull this out of thin air in 11 months. And I just walk through some of the developments and how close we were. I think we knew that the next major kind of vaccine campaign that the world needed, whether it was globally or more locally, would be an mRNA-based approach if it made sense for the virus. And we were kind of ready for this. And here we are. I think the fact that we haven't shared that more broadly is a failure of the way that we think about communicating this. Well, I was just going to build on, you know, Bronwyn's point. I think, kind of like what you're saying, Elliot, about the gap between what we'll call basic and applied science, there's this disconnect between what we know on the science side and how we're communicating our science, or how public health is communicating our science. 
You know, we've talked about the incentive structure, for example. What are the ways that we set up our science and our science systems right now that lead to those kinds of gaps, and that we can address, from your perspective, seeing this play out? I mean, at any level that you think is actionable, for whatever you think is most important, I'm interested to hear your perspective on what you would do differently, you know, to not make that mistake. Well, I mean, so let's start with the softest of the soft sciences, right, which I would actually argue is the hardest science: you know, anthropology. I'll tell you what an ethnographer would do, right. One of the problems is we've got to meet people where they are. And that requires you to understand where they are. And that's really hard to do, oftentimes at scale, when you're busy, like, you know, trying to put out a pandemic, for example. And I gave a talk to the NSF in which I said, the future is qualitative, right. We need to figure out ways that we can actually capture the sort of lived experiences that, you know, Bronwyn's talking about, where people are not just objectively rejecting vaccines because they're wildly irrational creatures; it actually makes a lot of sense within their social context that they would believe that sort of thing, right. People aren't irrational, they're social. But just as, per Bronwyn's point, the mRNA vaccines required 10-plus years' worth of foundational work to build on, the same thing applies to understanding humans in their context and meeting them where they are. That has to begin earlier on, because it cannot be done quickly enough; our survey tools are not sufficient and granular enough to give us that kind of lived experience. 
In fact, I would argue it's almost worse: they can give us a veneer of objectivity, because we can capture this at, you know, network levels, social-scale levels, and yet fail again to capture the lived experience that ultimately decides whether or not people are going to believe you as opposed to their grandparents, or alternatively, how you get that conversation going between grandparents and you and the scientific community, et cetera. So lots to noodle on there, I think. I would just point out, Bronwyn, before I hand the microphone back, that when I first got to DARPA, I pointed to a 1964 article by John Platt called Strong Inference, in which he's beating up on the biological community for exactly the same things that Elliot is beating up on the social and behavioral sciences for today: you know, this sort of bespoke approach of I have my theory, I have my particular model, I'm never going to actually disprove it because I don't actually make meaningful predictions, et cetera. He beat up on biology, I think, in part, not solely, to get it to where biology is now. So the situations are not incomparable. But back to Luke's question: I think it is an open question, and one that comes far too often, far too late in the supply chain, to think about how to meet people where they are and, you know, who those people are and how they differ. Yeah. No, I agree. And I would just restate what you said, Adam: we need to be problem- or, you know, solution-oriented rather than theory-oriented, right? So it's this very practical question of, you know, how do we get people to take the vaccine? And I think the whole of psychology, from the very beginning, is just completely off track, right? Because of our focus of, like, well, my research is around this theoretical model. 
It's like, well, that might or might not be relevant to this problem. And the problem doesn't care about your theory, right? Maybe there's some other theory that somebody else is working on, or that nobody's yet invented, that actually is relevant to this problem; or, in reality, it's a blend of many different things, right? Like you're saying, Adam, we're social. So you have to include, you know, okay, what's the sort of social milieu? What's the social influence? And how does that factor in? And what is the science communication coming at people? And what are this person's particular values? Right? And so we do have the sort of bits and pieces out there, but we're really just not approaching the science in a cohesive way, saying, okay, I'm going to pick up whatever theoretical ideas we know about human psychology that are relevant to this problem. That's just not how we go about science. It should be, in my opinion, how we go about it. And so we're going to need to be more interdisciplinary. We're going to need to let go of our pet ideas and really focus on, look, my theory might be wrong, or probably it just doesn't even apply. So I need to learn some other thing, or collaborate with somebody that knows how to do that, which is again, you know, something where I think the biological sciences are way ahead of us. In terms of, you know, any medical technology, it's biologists and chemists and engineers. We need to embrace that model in psychology to really actually solve real problems rather than just, you know, pushing ideas around. 
As a biologist, or as a scientist, I guess, I'm flattered that you think our community is so far ahead. I'm sure if we looked under the hood, we'd be more comparable than you may think. But I just wanted to add that I think we need to redefine the meaning of interdisciplinary, even in the example Elliot just gave. There, interdisciplinary was working across really hard-science lines: biology, chemistry, physics, data science. Sitting on this side of the hard-soft divide, I feel like we need to be more closely integrated with social science, behavioral science, economics. In my work, whether genomic surveillance and the use of genomic data in public health response will matter in practical terms in the future will not be decided by what we do on the hard-science side, no matter how integrative we are there. It's really about all of the forces beyond that, and I think it's time to open that up. COVID has really shown us that we need closer integration between these domains of research and application. Yeah, let me challenge Elliot. Oh, sorry, go ahead. No, go ahead, Adam, and then I want to talk about something along those lines. Yeah. So I'll throw the challenge back to Elliot and say: one of the biggest challenges in this area, particularly if you're looking to the government to fund this kind of, what I'd almost call anti-disciplinary, research, is scoping a research topic and problem in such a way that it's actually tractable enough that you know you're making progress. It's obvious why that's hard, and it's obvious why it's important. Most of your time at DARPA in developing programs is literally trying to figure out what the actual problem is you're trying to solve, how you'll know you're getting there, and what difference it's going to make.
That is perhaps more of an onus on the funding side, and maybe funders don't spend as much time as they need to thinking about that, because a well-crafted problem can motivate exactly the kind of research you're looking for. If the expectation is that it will organically emerge among scientists, well, that has happened on occasion, but I would not put high probability on it, because people will go where the funding is. So I think the onus is back on the government, in part, to think hard about the right problems that will motivate that. And so the challenge is, Elliot, we've got to get you to DARPA or somewhere where you can craft those problems in such a way that we make that meaningful advance. Back to you, Luke. Yes, thanks for that, Adam. I just wanted to talk a little bit more with you, Bronwyn, about your comment on interdisciplinarity, if that's the right word; anyway, you know what I mean. This is something we've thought a lot about at NIH, in our various silos. Specifically, I was interested to hear that your colleague, Pardis Sabeti, gave a talk, and I was kind of chiming in: you guys did an excellent job. "We need more social science" — I hear that message all the time, and yeah, we're on board. So what is the Broad doing? What are you doing, for example, to integrate more with the behavioral and social sciences on a practical level? I trained in Boston; I was at MGH, and I would go over to MIT to work on technology, because I wanted to modulate people's brains and asked, who's doing that around here? Oh, it's folks at MIT. And they made that very easy by setting up programs so that we could collaborate. Then you could see, oh, this is successful, and you can set up funding opportunities, for example, to further develop the needs the research communities identified.
So what have you been doing at the Broad? I'd be interested to hear more about your interactions, your formal relationships, any programs you've set up with the behavioral and social science community to work on some of the issues you've identified as important. Yeah, that's a great question, and I'd love to come back to you with a better answer in the near future. We have been pretty heads-down doing our day job for a lot of the pandemic. The Broad, as you may know, set up a massive testing platform, and on the back of that we are using the positive samples for genomic surveillance across New England and the nation. So a lot of our focus has been on just making that happen in short order. But thank you, because I think what it's exposing is the reality that we are just making it happen right now, and it's not sustainable in any way going forward without taking a moment to integrate these other elements. We're getting challenged on data release: global, open public data release is one thing, but then there's returning data back to stakeholders, back to individuals, the lines of public engagement and patient information privacy, patient incentives to participate — some of the behavioral aspects of this particular angle on the work. I think we are just beginning to accumulate the list of things we need to address to do this sustainably and do it better going forward. And then there's the ethics: bioethics is something the Broad is looking at across the piece, taking a more active, exploratory, almost academic approach to the way we think about the ethical elements of our work and their impact on communities. So that's another piece we're trying to advance. Yeah, and I appreciate the work you do; I know quite a bit about it, and it's very valuable, obviously.
And I know the pace of all of this is incredible. But I guess, to your point about getting back to us, and hopefully this is just the start of a conversation: what could have been on the ground for you, so that you could do the work you're doing at the ridiculous pace it demands, addressing all the overwhelming things you're addressing, but in a way that's more plugged in with the communities that would be addressing some of the social and behavioral issues we've all seen emerging? Because frankly, as far as I know, we were not set up in any practical way to do that. We're talking a lot about our sciences being more practical, and about our science systems, the ways we're communicating. I hear a lot about communication; is that where we should be focusing our energy? I don't bring that up needing an answer. But really, that's the kind of thing: if we were to go through this again, what about our incentive structures, the ways we're performing science and training people to address things? Open science has certainly tried to democratize science, to have more people participate, to drive this from multiple different areas, to meet multiple needs, to grow the community. Again, I'm not expecting an answer, but that's the kind of thing we'd be interested to hear: what would have been useful, really, frankly, to have on the ground?
Just quickly — and this is getting into the weeds of the particular context of genomic data and genomic surveillance for pathogens, for infectious diseases like SARS-CoV-2 — in addition to some obvious things, like technical capacity and obviously funding and political will, I think we need to take a look at the regulatory landscape for this type of work: the IRB process requirements and structure, especially when the data we're generating is of public health value. It took some time for us to convince folks that it was, and even now that it is, we're still slowed down and complicated by the regulatory environment, when really the risk is tiny for some of the things IRBs are trying to protect against here. I would actually probably say that some aspects of our work should not be IRB regulated; some should, and we should define those so that we're not so encumbered by those processes going forward, and have those elements in place so we can move quickly and not be going through that process in a state of emergency. Thank you so much for all of your input so far. I think we've hit a little bit of a lull in the questions, so I wanted to cover a few things NIH in particular has been doing during COVID, and also prior to COVID and since then, if that works for everybody at this point in our segment.
So both Luke and I are relatively new to NIH, and our prior experience as researchers and NIH grantees, perhaps like some of the audience here, was that the grant funding process was not particularly fast, on either the institutional or the NIH side. But being at NIH now and having firsthand experience of NIH's COVID response, I've seen that NIH, while it can be slow at times, can also be fast and responsive to unanticipated health challenges, especially by taking advantage of the research advances and infrastructure it has developed and is continuing to work on. In addition to the miraculous speed we've seen in vaccine and test development, which of course was built on decades of research by both NIH researchers and the larger community, there was also a rapid effort to get research funds to investigators as quickly as possible. So I just want to quickly highlight a few of the programs and initiatives NIH put in place, and also think about what NIH will try to do moving forward. As a first example: very soon after the pandemic began, in April 2020, NIH created the Accelerating COVID-19 Therapeutic Interventions and Vaccines, or ACTIV, public-private partnership to develop a coordinated research strategy for prioritizing and speeding development of the most promising treatments and vaccines. Efforts across the ACTIV program accelerate the identification of candidate treatments; the clinical testing of treatments and vaccines, including increasing clinical trial capacity and effectiveness; and the evaluation of vaccine candidates to enable rapid authorization and approval. Importantly, ACTIV has also prioritized identifying emerging COVID-19 variants and coordinating data sharing.
As another example, through the Rapid Acceleration of Diagnostics, or RADx, programs, NIH has aimed to speed the creation of coronavirus tests, ranging from rapid home tests to clinical laboratory tests. Some of the tests many of you have probably used at this point have come from this program, and through it scientists and inventors actually competed in a Shark Tank-like competition — which might not be the first thing you think of when you think of NIH — with a rapid selection process and pairing with industry partners to increase the odds of success. Other aspects of RADx are focused on something we've been highlighting as important: community-engaged projects to address disparities in underserved populations, as well as exploring and scaling up non-traditional detection and testing technologies. On that note, NIH has tried to recognize that health disparities represent a major challenge of this pandemic — something a lot of people anticipated, but that probably wasn't sufficiently addressed from the outset. One approach to address this has been NIH's Community Engagement Alliance, or CEAL, which focuses on providing trustworthy information to the people hardest hit by the COVID pandemic through active community engagement and outreach. I also want to highlight a little of what extramural funding has looked like during COVID and what it might look like moving forward. Beyond the more traditional extramural funding, early in the pandemic funding largely came from administrative supplements and Other Transaction Awards, or OTAs, both of which can be processed more quickly than the usual full grants. The administrative supplements in particular were add-ons to existing NIH grants, so they did limit somewhat who could get that funding.
But soon on the heels of this, NIH also released new COVID-related funding announcements, some with accelerated processing closer to one to three months instead of the longer process we usually see. I'll jump ahead a little here. While NIH mobilized to become faster, many of the ways in which it did so were not particularly sustainable long-term. I think that's been a theme across how people have responded to the pandemic: we moved quickly to do our best, but we have to think about what we continue moving forward. NIH has, however, done fast before, even within the larger context of its usual grant system. Dating back to the early 2000s and even the 90s, several institutes at NIH have had something called time-sensitive funding mechanisms. These were funding opportunities with rolling deadlines — you could apply in any given month of the year — and turnaround was closer to three to five months, even with the usual administrative and peer review process. To date, these time-sensitive opportunities have been put forward by just a few institutes and centers at NIH, and only on specific topics, but just last week a concept for NIH-wide time-sensitive funding opportunities passed council review, meaning it's well on its way to being a possibility in the future. Led by staff in my office at OBSSR, this broad funding opportunity would allow for responsive funding around what we're calling time-sensitive events, defined as a change in program, policy, or infrastructure that unexpectedly arises in a particular population, or the prospective evaluation of a new policy or program that would impact health-related outcomes. What we're talking about here are discrete events that might be a missed opportunity on the usual grant timeline.
As with earlier iterations of these more responsive opportunities, this will offer shorter times between application submission and research initiation, with the hope that it will better support critical responsive research with important practical implications for health. So that's just a snapshot of what NIH has done in the past, what it has tried to do in response to COVID, and what it's thinking about moving forward. I think Luke has been dropping links in the chat if you'd like more information, but I welcome any questions anybody might have, and any thoughts our speakers might have on any of those ongoing or upcoming programs. I mean, that's awesome. Yeah, we need more, right? More better, more faster — mainly more better, I would say, but certainly more faster. Okay, well, our biologist says that's a science she can believe in. No, I'm kidding; both are true. I just think, until we actually perfect forecasting, we're going to be faced with a future that's increasingly uncertain, and my understanding, Bron — correct me if I'm wrong — is that biology's response to uncertainty is diversity: this explosion, not quite Cambrian, of different experiments with funding. I think there are a lot more experiments and experimentation to be done, but I salute NIH for engaging with this, noting that some of these things will work and some won't. DARPA is famous for doing that too.
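One funding experiment that comes up in this conversation is the modified funding lottery: proposals that clear a quality threshold enter a random draw for the available awards. Here is a minimal Python sketch of that idea; the function name, the 0-10 scoring scale, the threshold, and the example data are all illustrative assumptions, not any agency's actual process.

```python
import random

def modified_funding_lottery(scores, threshold, n_awards, seed=None):
    """Toy sketch of a modified funding lottery (hypothetical parameters):
    proposals scoring at or above `threshold` become eligible, and awards
    are drawn uniformly at random from the eligible pool."""
    rng = random.Random(seed)
    # Step 1: screen on merit -- only proposals above the bar are eligible.
    eligible = [name for name, score in scores.items() if score >= threshold]
    # Step 2: if there are more eligible proposals than awards,
    # draw names out of a hat; above the bar, reputation confers no edge.
    if len(eligible) <= n_awards:
        return sorted(eligible)
    return sorted(rng.sample(eligible, n_awards))

# Made-up review scores on a 0-10 scale.
scores = {"A": 9.1, "B": 4.2, "C": 7.8, "D": 8.5, "E": 6.9}
winners = modified_funding_lottery(scores, threshold=6.0, n_awards=2, seed=1)
print(winners)  # two names drawn from the eligible pool {A, C, D, E}
```

The random draw is what counters the Matthew effect described in the discussion: once a proposal clears the quality bar, name recognition and past funding carry no additional weight, so up-and-comers have the same odds as established labs.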
We try to create an ecology in which lots of different efforts can be pursued, and we'll kill some, which is sort of how nature works, right? If it's not working, you're done. That's why DARPA is sometimes scary: if you want to live comfortably with a fifteen-year grant, we're not the agency — though, to be clear, there's a big role for that too. But these experimental approaches are really important. The key, of course, is that NIH needs to let us know how it's working; they need to spread the good news, which also includes "this didn't work." And I think that applies to other funding agencies as well. When I was at DARPA, we funded a seedling — a small project — with Paul Smaldino, who co-wrote that article about the natural selection of bad science, doing modeling and simulation of, okay, so what's the solution to this? And there's strong evidence that modified funding lotteries — having people get across a certain quality threshold and then just drawing names out of a hat — help to prevent the Matthew effect: you stop funding only the same people who already have the resources. You don't stop them, but now the up-and-comers have a chance, because it's not based on name recognition, or necessarily even past performance, again noting the quality threshold. To my knowledge, that actually hasn't been done in the government; it's something I think should be tried as an experiment. It shouldn't be the model, but it should be a model in this landscape of experimentation. Yeah, awesome. It hasn't been done in our government; I know other countries do something like that. Fair point, fair point. Yeah. You know, just riffing on a couple of things about those announcements — two points, really. One is the sense of urgency,
which I love, because we've seen it from NIH a few times, but it's also trickled down into the research community. And maybe this is my stereotype of the biological or medical sciences, but there is some sense of urgency there. I think there's evidence for that in the peer review process: peer reviews are a lot faster in biology generally, because sometimes what you're dealing with is important. One of the frustrations in psychology — a classic example — is that our flagship social psychology journal, JPSP, the Journal of Personality and Social Psychology, has these year-long, two-year-long horror stories; four rounds of three-month revisions is quite common. Part of what irks me about that, and I'm not alone, is that there's no urgency, no sense of "this is important, we need to get this out there." It's theory, it's abstract, this sort of platonic ideal. But this event has really changed that, and I think that's good. Psychologists are now having a sense of, oh wait, what I'm doing actually matters; I need to get this out. That's true with funding, but my point is it should be true of peer review too, and our peer-reviewed journals need to adapt. Again, it comes back to this question of culture change. Of course you need to maintain the rigor and the quality, but you also need to understand that this is important and we need to get this information out there, and we just don't have a model for that. A technical question — sorry to jump in — but is there something equivalent to the bioRxiv preprint server in the social sciences? There is, yeah. Oh yeah, we have great preprint servers like PsyArXiv, and pre-registrations and all that —
yes, but I think those still don't quite have the stature of the peer-reviewed imprint. We're working on that, but that would be a good development. And then the second point is what Adam mentioned. The biggest cliché in psychology — it's probably true in other fields too — is that you end a paper by saying more research is needed. We wrote a paper about this, saying we actually need a mechanism to understand when more research is not needed; we need to understand when you just say stop. Which, again, I think is another problem with the theoretical approach: very rarely in psychology are theories actually declared wrong so we can let them go and stop doing research on them. And that's a big detriment to the field, because most theories are wrong. It's this weird duality: we recognize that, and yet theories just live forever; there are all these zombie theories. Part of the solution, I think, is effect sizes, and I should give DARPA, and Adam in particular, really good credit for this. In psych, this significance-threshold idea impedes progress, because any small effect can be significant with enough sample and precision. Sometimes we need to get to a point of maturity in the field to say, you know what, this effect might be there, but it's so tiny it's meaningless, so let's just stop and look at a different route. And that's going to come from looking at effect size; it's going to come from a more problem-focused rather than theory-focused place. If the problem is, say, how do we combat misinformation, okay, there are probably a hundred different theoretical models that might bear on it, and you'll get a d — an effect size — of 0.02 on this one, and it's significant. We need funders, and DARPA — I've seen it, I've been a victim of this —
DARPA needs to say: if your effect size is not 0.2 or bigger, just forget it; there are other ways. The problem doesn't care about your theory; pick the thing that actually solves the problem. That's something psychologists can do, and I've started to see it, so there's reason for hope, but it's not the predominant way of doing business. Just very quickly: I mentioned forecasting, and perhaps people thought I was kidding, but I'm actually not. I think there's a lot of value still to be had in forecasting, in particular — really, to the earlier point — in bringing the social and behavioral sciences to forecasting. Make predictions; tell me what your theory will predict under these conditions. It will be wrong, but that's the start, because that's the feedback we need to improve the application of a theory, for example. So it's not that I'm anti-theory; I'm anti theory that cannot be falsified or will not make predictions, and believe me, even having had a large carrot at DARPA, there are amazing ways for people to try to get out of making predictions. So, piggybacking on what Elliot is saying: effect size is really important, and there are other ways of improving this, but ultimately biology really did start making significant progress when people were, no kidding, willing to make different predictions based on different models and actually test them to see which ones worked. The other challenge we have, though — and this is the nice thing about forecasting, in a way — is that we don't keep track; we don't have an archive of who made predictions and how they were wrong, or right. And I know, Kristen, you want to touch on things like infrastructure; I think that's one of the challenges of the current publication process: we
don't know the predictions you made, except via pre-registration, which I'm a big believer in; we don't know the predictions that didn't work. So the feedback we're getting is really incomplete and problematic, and I'm back to the beginning of that supply chain again, that feedstock, the mess in there that now we have to use. And Elliot, tragically, as you know, a number of people had to come out and talk about evidence readiness levels to try to convey this idea: stop — you're just picking articles out of the literature and assuming that because they're peer reviewed they apply to this really complex problem. The fact that we had to blow the whistle on ourselves is symptomatic of that problem anyway. All right, so we have just a few minutes before we have to turn over to the next session. Any additional questions you have for each other, or any additional comments you'd like to share with the audience today? I want to echo Adam's good point: I like your idea of tracking predictions. I know there's an institute at Berkeley that has a social science prediction platform — I don't know if you've seen this — that Ted Miguel runs. I think that idea of tracking our predictions is a really useful and interesting one, in particular for the behavioral sciences, and I see in various ways that we're not doing that so much in the behavioral and social sciences. I just wanted to throw that out as an example. But I wanted to hear more. Elliot, you mentioned, for example, that you've noticed this sense of urgency, and certainly everybody's well aware now, with the pandemic, that there's a need for us to leverage our science to address real practical challenges in a timely manner. You mentioned peer review and
that horrible process of journal review. Are there ways or examples you have seen — and Bronwyn, I'd love to hear from your various venues — of changes being made right now that we could learn from? Because we also think about peer review on our side at NIH; we're not in the Center for Scientific Review, but we communicate with them. Are there things you've seen working well that we could take advantage of, to move the science forward more quickly and leverage how we're doing peer review on our end? Yeah, I'll just chime in real quick — sorry to interrupt, Bronwyn. I'm biased: I spent five years as an NIH reviewer and have been an NIH grantee and all that, so I like the NIH system, and I like that it formally separates approach and significance. I'd say that's the biggest thing we don't do in our peer review of manuscripts. Psychologists do this thing to ourselves — and we're not the only field — where it's not uncommon to get more pages of review than the manuscript itself, and that's all critique of the approach. In NIH review, by contrast, significance can outweigh that, and I often see it as a strength: reviewers saying, you know what, there are some limitations to the approach, there are issues, here they are, but the significance is so high that this is a high-impact project. In manuscript review, it basically doesn't occur to us to do that. Very rarely it'll happen — I've seen it — but for the most part reviewers are just critiquing the method, or saying you haven't contextualized this within the literature, blah blah blah. Very rarely will a reviewer say, hey, there are a few problems with this, but it's really important,
we need to publish it right away. You almost never see that, whereas I do feel like you see it more in the biomedical and engineering world; it's just rare in psychology, and it needs to be less rare. I was going to add, from a practical and personal perspective, that my sense is the sense of urgency is there with the journals and their review process too, and it's certainly influencing my own approach to peer review of papers. I feel I'm on a clock with the journal in a way I never was before: they have expectations of turnaround time, they're following up, they're chasing you down. They're asking busy people to do their job, to contribute to the peer review process for the benefit of science, but with expectations now that are much greater in terms of process and timeline than I've ever felt before, and I think that's really moving the needle. And I'm sure the journals are feeling it too, because of science Twitter and preprints: from the point of view of the value of the product they're putting out, they need to be faster to keep up. So I think that pressure, expectation, and incentive is running through the system, and I hope we keep it, because I think we all know, whichever side of the soft-hard science axis you're on, that it's too slow, period. I hope we can keep that sense of urgency so the gains we make in the academic sphere can be translated. Yeah, and on our side, just to be fully transparent: Kristen mentioned our use of things like the administrative supplements and OTAs, Other Transaction Authorities. As you may be aware, with the administrative supplements, all of a sudden NIH program staff who have not traditionally been reviewers in scientific review — the first
level of review, or the only level of review — were called on, because of the nature and pace of the science, to put on hats as peer reviewers. For some of us who are newer, it wasn't such an abrupt shift, but for some folks who had been there a while, who hadn't done peer review in a while or hadn't interacted with CSR, the Center for Scientific Review, it was a different ballgame and a different skill set, obviously. So it's something worth thinking about too. In part we were asked because, like everybody, we were just trying to get everything done, and it was a numbers game more than anything. I guess what I'm interested in — not now, but whenever you have feedback — is how you're being vetted as peer reviewers for things like this, how you're referring colleagues to get them into the pipeline to feed the system, and any thoughts you have about the model of pulling in program staff to do a job we were really doing as a side hustle, so to speak, in this case. We'd love that feedback from you and the broader scientific community, because we want to do a good job moving things as quickly as possible on our end so that you can get the research done as quickly as possible on your end. We hear that message. Quickly, I was going to ask Bronwyn: they're asking you to do more, faster; what, to your mind, has been the best technical leap that has enabled you to do that? Don't say the internet. Actually, you don't even have to answer, because we're running out of time. But I would welcome — I see little pockets of effort here and there to reimagine peer review, open peer review, ongoing peer
review — but I haven't really seen aggressive experimentation with peer review along the lines of some of the funding models. As you know, before I left we started a program called SCORE at DARPA, which in part was meant to help with that process by bringing automated tools and ML approaches to assist reviewers; it was never meant to replace humans, but to scale this up. If people know of really interesting experiments with peer review, I would love to hear more about them, because it seems to me there has been a deafening silence at scale in that regard. But I don't want to end on a negative note, so I'll say something like, rah rah: we've discovered that science is a human endeavor, which means it's imperfect, but also means we can fix it, or at least improve it. So, back to the supply chain. Thanks very much for the opportunity to chat here; it's a true honor. Yeah, thank you all, this has been great. That's a pretty good summary; I was going to hit a few of the points I think we focused on as most important: breaking down the distinction between basic and applied sciences; meeting people where they are; realizing the human element on all sides of science; coming in with a more solution-oriented, problem-focused orientation; and thinking much more about how we can be more interdisciplinary across the board. So thank you all so much for your participation, and thanks again to the organizers. Yeah, thank you all. Luke, I take the challenge; I'll come back to you with an answer next time. Yeah, please — I want to hear it. All right, thanks, everyone.