All right. Good afternoon, or good morning, wherever you're logging in from today. We appreciate you joining us, and thank you for taking time out of your busy day. My name is Marcy Reedy. It is my great pleasure to welcome you to this afternoon's webinar on supporting open science in the promotion and tenure process: lessons from the University of Maryland. I'm a community manager here at the Center for Open Science, looking to promote reproducible and transparent research practices among the education community and to expand the knowledge and know-how of the open science disciplines. Before we start, I did just want to share one quick link that will hopefully be of use to you as you move forward, and that is to a knowledge hub that we have developed on the OER Commons webpage. We call it the Open Scholarship Knowledge Base, and it is the place to go to understand the who, what, and how of open science, whether you're a student looking to expand your open and transparent research practices or a mentor looking for material you can use to support a very able and energized graduate student. So I wanted to share that with you; you'll also have resources supporting our discussion today. And then, without further ado, we wanted to move forward, because we have a very exciting discussion planned for the webinar today, and I do use the word discussion very deliberately. We were talking earlier, and we just wanted to take this moment to encourage everyone to feel free to set aside some formality and enter questions in the chat as the webinar proceeds. We really encourage engagement and discussion and hope to make this as tailored to your interests as possible. So please don't hesitate to speak up, and also share your experiences and your stories if you feel comfortable doing so. It is next my great pleasure to introduce and welcome Dr.
Michael Dougherty, the department chair of the Department of Psychology at the University of Maryland. As we were preparing for this webinar, we were doing some crowdsourcing on Twitter and social media about who is doing interesting things around the promotion and tenure process, and Dr. Dougherty's name came up repeatedly. So we're so pleased that he could be here. And one line from your bio, if you don't mind, one line that really stood out to me that I absolutely loved, is when he says that the view of a mentor that you had, and one that you've carried on, is that the process of doing science is more important than the results one obtains. And so I thought that was just a really interesting and important framing of what we're trying to think about as we move forward. So, thank you for joining us. Yeah, my pleasure, my pleasure. And thank you for inviting me. This is, I think, a really great opportunity to share some of the things we've done here at the University of Maryland, and maybe impart some lessons learned for the rest of the community who are interested in not just advocating for open science but actually doing the hard work to make it happen at your institution, and I hope I can provide some help. So at the end of this webinar, Marcy is going to put up a link to a Google form that allows you to reach out to me, and I'm happy to work with anybody, any groups, any individuals, any departments who just want to explore what they need to do in their department; we'll put that form up here at the end. So, Marcy mentioned my bio online, and there's another part of it that I really love about my prior mentor, and I remember this from early on in my graduate career.
We were talking about research and some research projects we were getting ready to launch, and he looked over his glasses and just said, Mike, we just need to be fair to science. And this sort of mantra, we have to be fair to science, stuck with me my entire career, and I frequently return to that principle: science is a process, and we have to respect that process, we have to be fair to the process, because ultimately our goal is to understand fundamental laws of nature. And we can only do that if we're fair to nature and fair to science. So that has stuck with me, and it's a principle I think is also embedded in, and underpins, the open science movement. And I just want to give a quick shout-out to the Center for Open Science, to Brian Nosek and all the leadership he has put into this over the years, to really bring this focus, I think, to the center point of academia and to help those of us like myself move this process forward. Without that leadership, we would not be in the place we are right now, where we can start to talk about policies, the policies that allow us to codify what it means to be fair to science. Okay, and not only that: fair to our community partners, our funders, our taxpayers that make all this work possible. So I hope to move everybody from that point of, I'm an advocate for open science and I want this to happen, to, now I'm going to do something about it. And so we're hopefully going to get you to that point of being an activist by the end here. Okay, that's my opening. That's awesome, thank you so much, appreciate that. One of the mantras we have at the Center for Open Science is: make open science practices possible, make them normative so that they appear everywhere among your peers, and then implement them into policy. So that's exactly, you know, as we move along, that's very true.
And one of the things, getting to the promotion and tenure process, that is so critical about that is that if you don't align the incentive structures of the promotion and tenure process, you're never going to see it widely adopted. So, on the structure for today: we have generated, through a brainstorming process, a series of questions that we had about the journey the University of Maryland psychology department went on in developing new guidelines promoting open science in their promotion and tenure. And so we're going to take the liberty of answering those questions as we move forward, but should you have questions that we don't address, please raise them. So without further ado, we'll move right forward. The first question we were hoping to get some input from you on was: what gaps did you notice in the P&T process before you made your changes, and what motivated you to create and adopt these new guidelines? Well, when I started this process, Marcy, we were long overdue for a revision of our criteria. Our criteria were last revised in 2006, and they probably reflected the historical perspective even before that, right? So I'm guessing that the revision in 2006 was very similar to what the criteria looked like prior to 2006. So they were long overdue, and we were at a point in time where we had, by university guidelines, to revise them. So the biggest gap was that they were embarrassingly incomplete and outdated. And by that I mean they really relied on a lot of historical views of what it means to evaluate research, and when we talk about evaluating research, the question oftentimes is: are we evaluating the reputation of someone, or are we evaluating the substance of one's work?
And what I think most of us will probably realize is that a lot of the metrics that we rely on really go back to that notion of reputation, right? And then we have to ask what drives a reputation, and oftentimes it's things that are completely unrelated to the science one is doing, in fact, completely out of the control of the scientist. Now of course you can get a good reputation for doing good science and a bad reputation for doing bad science, but my guess is that there's a pretty weak correlation between the science and the reputation. And in fact, you can get a bad reputation quite easily, as we now know. So, that said, revising our promotion and tenure criteria was really a priority of mine when I came on as chair in, what was it, 2017. It was one of the things I spoke with my department about when I was, quote unquote, interviewing for the chair position. And much of what I was focused on at that time were issues surrounding responsible conduct of research and reproducibility, because, you know, who are the gatekeepers of this? Notably, there's some gatekeeping that happens at the journals at the review stage, but we know those are really imperfect gatekeepers. And we also know that reputation factors can drive whether someone gets published or not, depending on who you are, what your name is, where you got your degree from. And so the more I got into this, I started gravitating towards a more moral perspective on why we need to do open science: as a public institution here at the University of Maryland, or really any institution of higher education, our mandate is to serve the public, and specifically, we do our work for the betterment of society. And it's really just difficult to see how our work benefits anybody other than ourselves if it's not made maximally accessible.
So if our work is locked behind a paywall, or our data is under lock and key, that limits its impact. By definition, if something cannot be viewed by others, it's going to have limited impact. If it's not accessible to the community, it's going to have limited impact. And so the goal here is really: what can we do to make that impact broader, to make our work much more accessible to the public, so that we can have a bigger impact on science? And we can think about this not just as impact on science, but as a more general principle, which I call impact as access. So, access is a necessary requirement for impact. You can have impact in many different ways. It could be through scholarship, and it could also be through who you serve, right? What communities is your work geared towards and serving? How do we serve our community as scientists and administrators? And what do we do in the life course of our work to make our community of scientists more inclusive? So those are all issues about accessibility: accessibility of research, and accessibility of people to the research enterprise. So this idea of impact as access, I think, underpins almost everything that we've done in our promotion and tenure policy, and I think allows it to have a much broader impact as it's realized in terms of faculty work products and research. Yeah, I love the discussion of framing it not only practically, with the impact and accessibility, but also from a morality frame of: this is the right thing to do for science. Yes, definitely. So then the next question that follows. We've kind of framed the setting and a little bit of the problems that were experienced. Why did you feel that these guidelines were important? Why were they the solution that drew you to this quest for greater impact and accessibility? Well, there are many reasons, really.
First, I really thought it was important that our faculty be evaluated on the substance of their work, and not just on indicators that boil down to reputations or reputational criteria. I'm not going to go into all the research on this, but if you look at the work on bibliometric indices, there are some serious questions about their validity, and I think some research would imply that they're either completely unrelated to the quality of the science or, in some cases, negatively related to the quality of the science, depending on what aspects you're looking at. And my guess is that those indices play an outsized role in the evaluation of scientists. Now, if you're an industrial-organizational psychologist or in human resources, you know that it's actually not so cool to use indices or criteria for evaluating employees that could create what's known as adverse impact, you know, impacting one subgroup more so than another subgroup. But there's some evidence that bibliometric indices have the potential to actually have adverse impact on employees. And, you know, I've never heard the legal arguments raised about these, but when I started looking at this literature, I thought, oh, God, if the lawyers saw that criteria in our promotion and tenure documents have been shown to manifest biases, you know, males versus females, or racial disparities in how those criteria manifest, they would probably want us to take a very serious look at their inclusion in our documents or in our criteria. So, returning to this idea of reputational criteria, really the goal was to move people away from those things the candidates don't have control over to those things the candidates do have control over. So, you know, you can't pick your reviewers for journal articles, right? And oftentimes it's the luck of the draw.
Did you get people who are going to be sympathetic to your particular theory or viewpoint, or open to new ways of thinking? Those are things that can determine whether or not a paper is accepted in your preferred journal. The other part of this is that, you know, if you're doing work on problem X, where do you send that? Well, you can try for the maximal-impact-factor journal, recognizing that it'll be a little bit out of place and probably won't be accepted, and so you kind of clog up the arteries of that journal. Or you can send it to a journal that might be more field-specific but would target the right audience. And so we wanted to give people the power to think about where the most appropriate place to publish their work is, without worrying in the back of their mind: is this going to count the same as X? So if you're doing work that is important to particular communities, then it makes sense to be publishing in outlets where those communities are reading that work, right, or who are monitoring those outlets. A few other things, and I don't want to just drone on here, but we really wanted to make sure that faculty felt empowered to do science the way they felt they needed to do science. And this is both the questions they ask and the populations they're addressing, but also, in particular for early-career researchers, being able to carry out the science in a way they felt maximized the integrity of the science, right? So for people who really felt that transparency, openness, and accessibility were cornerstones of why they're doing their work, we wanted to make sure that they felt they could continue to hold true to their own personal values and still do their work and have that work counted. Now, of course, we all know open science, or working with underserved or hard-to-reach populations, can come at a cost.
It takes time and effort to do those things, and the faculty need to know that they're supported, and they need to know that that work is valued and viewed as important. And so when we set out to do these things, we were really trying to, in essence, add a layer of protection and reassurance for our younger faculty, many of whom are coming in with this idea that, you know, I'm doing science because I want to have an impact on my community, or I'm doing science because I really feel like I can have an impact on the scientific endeavor. And they have these worldviews about how that ought to be done. And then they get into the academy and we say, whoa, whoa, whoa. No, that's great, save that for when you have tenure. Right now, just pump out as many articles as you can, and don't worry about those things, because it'll be detrimental to your career. You know, as a young faculty member, or even as an established faculty member, that's demoralizing: to be told, don't do it the way you want to do it, because it won't be valued; do it this other way, knowing full well that that's not the best way to do it, and feeling like you've been cornered. And it's no wonder people want to get out of academia when they feel like they don't have the power or the agency to do work the way they want to do it. So those were the key components of why I feel these guidelines are important and why we really pursued this in my department. Fantastic. It sounds like really opening up and giving some freedom to the faculty to pursue their best choice. Yeah, we hope so. Pausing right here, I just wanted to see if there were any comments, thoughts, or questions from the audience before we continue to move forward. Feel free to chime in if you have any experiences or thoughts that you'd like to share. So the next question. Oh, Sandra has a raised hand. Feel free, Sandra. Did you want to unmute and speak up? She may need permission. I don't see you. Oh, is it Sandra?
I see that she needed permission to be allowed to talk. Yes, can you hear me now? Yes, we do. We do. Thank you. I didn't actually mean to raise my hand. I was trying to type a chat message, but it said chat was disabled, and then I saw the Q&A, so that's where I can type. So now I know. I do have a question, because you mentioned how people may not want to publish in the highest-impact but not so suitable journal, but maybe in a better-suited journal that may have a lower impact factor. How do you end up evaluating that? Well, you know, my attitude is that you evaluate the substance of the research. I mean, if what we're going to do is say, well, I'm going to evaluate you based on your impact factor, then, as someone who's supposed to be evaluating the research, I'm sort of pushing that off on some assumed process that went on at the journal level, which could have had all sorts of things going on that determined whether or not that paper was accepted for publication. So my view is that you focus on the substance of the research, not on the outlet in which it is published. I see. Okay. Yeah, it makes sense. Thank you. So another question that has come in raises a good point: what were the responses from faculty who had made impact through these publications? So I'm assuming this is about the previous status quo of researchers who were finding success within establishment journals, correct me if I'm wrong, but what kind of reaction was there among those faculty? Yeah, I can maybe touch on this a little bit later, when we talk about some other questions coming up here about the pros and cons and the obstacles, but the bottom line is, by the time we finished this process, everybody was on board. We had unanimous agreement on the criteria; the people who I thought might be a little more resistant to change had bought in. And it's a process.
So it's not like people looked at this and said, okay, I agree with these criteria, let's move on. We actually took a good amount of time, I want to say four to five years from the inception to the point at which we had these done. And, you know, the question you raise here is really about the definition of impact. So, are we measuring impact by citations? Well, measuring impact by citations really puts citations on the left-hand side of the regression equation, as if that's the thing we're trying to predict. But I would argue that citations, if we're going to use them at all, should be on the predictor side of the equation. They're an indicator of something, not the criterion to be maximized. And I think that's a key thing to keep in mind: we don't do science to maximize our citations. If we believe that citations are an indicator of impact, that's fine; of course, then we need to look at the research and ask, well, is it a measure of impact? There are some questions there about whether or not it is, but certainly some articles are highly cited and some aren't, and there's a whole literature on that. So I spent a fair amount of time talking about what we mean by impact, why we should be thinking about citations in a different way, and drawing on the data, the literature, to show that, hey, look, if we're going to think about citations like this, we'd better be prepared for the fact that there are gender biases, there are racial disparities. And these are not good properties of a predictor variable if we're using it to evaluate people. There are other forms, other measurements of impact. Yeah, the broadening. Another question, real quick; we can go to this one before moving forward. I think this is a great question about portability: for faculty who weren't staying at the University of Maryland, how would their UMD measurements be applicable at another institution?
And were there concerns expressed about that? You know, I think the issue of portability may have come up once. Ultimately, if people are going to leave the university and go someplace else, they have to make those decisions themselves. I am a big proponent, though, of preparing your career in a way that makes you mobile. I think everybody, from staff on up to faculty, should really be prepared to make themselves mobile. But the other thing that is really relevant to think about is that there will come a day, and I think we're probably looking at this in terms of a few years, when these types of policy changes will be required by virtue of federal policy. And so I think the issue of portability here is probably going to go away. This is just a guess, but this is a legitimate concern that individual people will just have to grapple with. And ultimately, you have to make your case: I'm doing my work in a particular way that holds true to my values, the values of my institution, the values of my science, my discipline. You know, when we sign up to be academics, we don't do it because we want to maximize our citations. At least, I don't think so. We do it because we want to do good science; we want to impart some good on society. Right, yeah, exactly. And I see we have one more question there about leadership. Yeah, we may want to hold that one. David Moore, this is a great question. We have some focus coming up on forces supporting and resisting the implementation of the guidelines that this may play into, so maybe, if you don't mind, we'll pause on that question and revisit it here in a moment. That sounds good. So we've talked a little bit about the problem, broadly, with the incentive structures, and what motivated you to pursue these guidelines and why you felt they were important.
But I did want to spend a little bit of time on the specifics of the guidelines, how they supported open science and the behaviors they were designed to encourage, just to put more meat on the bones of what the guidelines were. So, if you wouldn't mind speaking a little bit about that. Yeah, and I do want to emphasize the word support here, because I think that's the right term. I'll come back to this later when I give you some tips: we avoided mandates. I did not want to go down the road of saying there is one way and only one way that you must do your science. Rather, what I wanted to do was make it possible, and to reward the people who were going to do it by recognizing the workload that comes with doing open science. So right out of the gate, I was opposed to a mandate on open science, and no one actually proposed one, probably no surprise there. So we didn't want people to feel trapped or in a corner; rather, we just wanted to support those people. So the question here, I think, is really: what's the swath of behaviors we wanted to support? And this isn't just about whether you published in an open access journal. There are some problems with the open access models and APC charges, some things that publishers, I think, are probably going to end up manipulating for economic gain. So it's not just about making your scholarship accessible, but really making the underlying work products accessible. Most of you are probably attuned to what's going on with the National Academies' aligning incentives initiative and Helios; if not, you can go to, I think it's heliosopen.org, and look at that whole initiative and what's going on there. While I was working on some of these guidelines, the National Academies was organizing an aligning incentives workshop, and I had the pleasure of working with a few people early on in this.
And early on in that process, they were developing some template models for evaluative products, things that are part of the open science work-product model that lies behind all the research but typically isn't made public. So things like the underlying data, the metadata, analysis code, statistical procedures, and, with the work of the Center for Open Science, pre-registration plans and registered reports. All these things we wanted to be able to document and capture as part of the work product people were doing, right, recognizing that a paper is just the words on a sheet, but there is all the work that goes into preparing it, and we wanted to capture that swath of work and recognize those work products to the extent possible through our evaluative process. And so, both in our annual review and in our promotion and tenure documents, we now actually ask people to provide pretty extensive annotations of a set of publications, so that we can look at, you know, data sets. We can look at accessibility of data. We can look at accessibility of analysis code, and at practices they might have engaged in to ensure reproducibility in one form or another, really giving people an opportunity to say more about the research that goes into a CV line than what you get out of a CV line. Right, so it's a pretty nice little extension of a CV that allows people to advocate for themselves and describe the scholarship in a little more detail along a set of dimensions that capture open science, but other things too. Actually, I lost track of the question I was trying to answer. No, you were perfectly in line, talking about the behaviors the guidelines were trying to encourage. Yeah, that might be most of them, you know, making the scholarship accessible. You know, here's a place where I can talk about leadership, campus leadership.
I'm fortunate to work at a university where my leaders recognize and value the importance of open science. At the time we started working on this, we had a new VPR who had just come over from NIST, and it turns out that she was a proponent of open science. And my dean at the time was a psychologist by training and fully aware of all the stuff that was going on in psychology, the reproducibility crisis, the need to really think about open science. And he gave me the latitude to work on this project, and the encouragement. My associate dean put me in touch with the provost's office. It turns out the provost's office was like, you know what, this is kind of an important issue; we've been thinking about this from the perspective of responsible conduct of research. And so it turned out that campus leadership was mostly on board. Actually, I shouldn't say mostly: they recognized the importance, they endorsed it, and they allowed us to move forward. Maryland may not be the same as other institutions, because here at Maryland, units determine what the guidelines are, and the upper-level administration looks to the units to tell them what we should be evaluating in our field. So we have a little bit of latitude due to that, too. So, you know, these issues were already bubbling up on campus at that time. Very good. All right, moving right along, let's look here at some of the next questions we have. Just a broad question; you've already factored a lot of this in with the leadership. But what factors supported the adoption of the guidelines, and which factors opposed them? You know, this is an interesting question, because when I first started getting into this process, my first approach was to take our existing guidelines and start to tinker with them, trying to infuse some open science process. And mostly I got looks like, you know, what are we doing?
They were like, okay, yeah, we could probably do this. But at the end of the day, what I realized early on in that process was that we weren't really going to make progress if we took our current standard and tried to change it; we really needed a fresh start. And so, through a variety of external factors, including this thing called a pandemic, I was able to put that on hold and restart some time later. And I restarted it with a completely different approach, recognizing that we weren't going to be able to simply tinker with the existing criteria, that we really needed a complete rethinking of what our criteria were. Otherwise, they were just going to be really bad and incomplete, but have the words open science in them. And that just wasn't enough for me. We really needed clear guidelines that provided clear guidance to our faculty and clear guidance to our external reviewers. So, by the time we got to the point where we were voting on it, people had bought in. And I kind of realized this when we were in one of our faculty meetings and we were literally going line by line: what do you think of this sentence? What do you think of that sentence? And the people who were jumping into the conversation were the people who I thought, if anybody is going to really object loudly, it's going to be some of these individuals. But it wasn't like that. By the time we got to that point, I think people had seen and recognized the importance of what we were doing, had bought in, and saw the value. And so kudos to my faculty, really, quite frankly, for being so collaborative on it. Now, one of the factors that opposed, and this is sort of reading the tea leaves, was, in my read of some of my faculty: well, what are other people doing? So constant questions were: are we allowed to?
What does faculty affairs say about this? What are they saying in the Provost's office? Are we allowed to do these things? And what about other people in the field? Are we going to be the only ones? And I'm like, you know what? Someone's got to be first, and I would rather not be last. Okay? For me, it was a matter of: these changes are coming, and we can either be out front of that wave, or we can be trying to save ourselves at the end. And I felt it was really important for us to be deliberative, where we could take some time to do this right, think through the process, and make things happen now, as opposed to waiting until everybody else did it. Or no one does it, even worse, right? And we find ourselves not having made any progress on this. But one of those undertones was: what are other people doing? Is this possible? Can we do it? And ultimately, I came back to something, Marcy, you said early on: we do it because it's the right thing to do. Right? Sometimes you say, you know what? This is the right thing to do, and we should do it for that reason and that reason alone. And we returned to that general theme time and time again throughout our discussions. What is the right thing to do? The right thing to do is not to keep our stuff locked up behind the paywall or to keep our work siloed and private. The right thing to do is to share our work as broadly as possible. And I'll just give you a little anecdote that I learned from my librarians. When you have a publishing deal set up with a journal or with a journal company like Elsevier or whatever, there's usually wording in there that the research, the articles or the products, can be shared publicly.
But the definition of shared publicly is that anybody is allowed to come into the library at the University of Maryland to access that material, which is just the most ridiculous form of public access that you can even imagine, because it's not public access. It's public access for those who feel comfortable coming onto the college campus, which means you also have to have transportation to come to the university. You have to have the willingness to come onto a campus full of people who are unlike you. And when you start thinking about it in terms of real people's access to that material, you start to realize just how messed up the system is. Performative access versus real access. Yeah, exactly. And if you really endorse these views of equity, of inclusion, then there's really no other way to think about it than really breaking down those walls that have prevented us from sharing our material. That resonated with people. And, you know, I have good faculty in my department who, you know, know what the right thing to do is, and they were able to do it. Yeah, that does resonate. It hits home, and it's a strong, powerful message. And, you know, if doing the right thing is your North Star, I think that's a strong motivating factor. So that's wonderful. And I was really inspired to hear, even down to the detailed line-by-line feedback from the faculty as you were going; they were getting into the specifics to make sure that they were comfortable and fully understood. So, you know, I'm a biased party, but I'm sold and I'm bought in; you've convinced me, that's for sure. But next, to kind of transition, I'm kind of curious: you made all of these arguments, and now they've slowly been adopted and implemented. What has been the reaction since they went into effect? What have you seen? Well, first of all, let me just say that they've only been in effect for a few months.
And honestly, there's a grace period here. I mean, people are allowed to be grandfathered in under the old policies. So we have those legacy policies that are still there, and if someone doesn't feel comfortable going up under the new policies, then that's fine. But eventually people will be going through these new policies as they're specified. So the grandfathering sort of protects people who feel like the new policies aren't consistent with the way they've been doing science for the last five, seven years or whatever. So the reaction. A few things. One is that one of my senior colleagues commented to me sort of offhand: oh, so you mean what we just adopted is progressive? Yeah, yeah, like, this is the way we should be doing it, and this is what people really aspire to do. But yeah, we just adopted what I would call a pretty aggressive, progressive policy, and his response was: cool. Another one: I've had one or, you know, a small number of other faculty who have reached out subsequent to our adoption of the policies just to thank me for really pushing it, saying that it was, in their view, a really important step for our department but also for the field. And so, you know, I think there's some acknowledgement that this is the way we should be doing it, and probably the way we should have been doing it for a long time, and they thanked me for making it happen. But again, it's the community that makes it happen, not one person. So we're still, you know, feeling our way through these things. So I don't have a lot of reaction, but, you know, generally positive. Nobody objected to the policies. It was a fully anonymous vote, and no one voted to oppose the policies. So that says something, I guess. Says a lot. Yeah. Returning to some of the audience questions, this kind of follows: what about the practices now that you're building support, and hopefully you'll see that mushroom?
What about other departments in the institution? Have you seen this pop up anywhere else? Not as much as I was hoping. I know that there are other advocates on campus who are pushing for change. And, you know, there's a lot of variation across fields, unfortunately, in sort of where people stand, or at least their knowledge of what open science is and what it means. And even open access, I think, is commonly misunderstood as: you've got to publish in an open access journal that you pay an open access fee for. Well, no, there are other forms of open access. And, you know, part of this is really about educating people and making sure people are aware. So unfortunately, right now, there hasn't been as much uptake on this at our campus level, not as much as I hoped. We also had some untimely turnover in our administration, so we now kind of have to go back to the well and start, you know, talking with new administrators about what we're doing and whatnot. I will say that our campus just happens to be going through a campus-wide process in the next couple of years for re-envisioning tenure and promotion criteria. I don't know what that's going to look like for sure. I suspect there will be elements of open science. I don't know if they'll be fully endorsed like they are in our department, but I think that will be part of it. But it is a priority on our campus generally. How it will manifest, I'm not sure. Another great question, too, about the point of promoting diversity, equity, and inclusion with these guidelines, kind of looking at the evidence base for that. Have you seen anything or read anything about adopting these processes that would improve DEI in the academy? Do you see a conceivable pathway that you would envision? You've already spoken to that a little bit. Yeah, so this is really an important question.
I'll come to my tips here at the end, but part of how we framed, and really broadened, the conception of what our revision was about was not thinking about it as open science per se, but really thinking about it from the perspective of inclusion, diversity, and equity. And so many of the things that we ended up putting into our policy weren't just about making our science more open and transparent. They were about broadening our definition of inclusion. Open science is one piece of that. But our criteria were very intentionally designed to try to eliminate barriers, to eliminate biases that may pop up, and to start respecting, to a much greater degree, that there are a lot of young researchers coming into the field who want to do good, and they want to do good in particular communities or with particular communities. And so we made a point of really building our criteria around making that possible, making it rewarding, or at least rewarded, and making it recognized. And so it's all part of the policy, so that, in an ideal world, those are policies that comprehensively address that set of issues, open science being one piece. But the other part of it is really thinking about who it is that we study, who's invited into the research endeavor, what the challenges are that individual people have in the conduct of their work as they wish to do it, and then really writing the criteria in a way that enables them to do that and recognizes those challenges and barriers. Yeah, so moving forward to the next question before we run low on time here: what about tips you might have for people or departments who are looking to move in this direction? So that's the key question. Yeah, okay, so I have seven tips. All right, so bear with me.
Tip number one (I'll elaborate on these just a little bit) is identify your leverage points; I'll come back to that. Tip number two is engage your higher administrators where possible. Number three is think broadly about change in your P&T criteria; again, it's not just about open science. Tip number four, which really should be number one, is identify your core values, because everything should be framed around that set of core values, which may differ from one unit to another and from one discipline to another. But if you can frame up those core values, it's really, really important. Tip number five is dig up the data, find the data. Do the work to find the consensus documents that you can use to make your arguments to your faculty. Tip number six is be prepared to compromise. And then tip number seven is avoid mandates. So let me just return to number one, identify your leverage points. You really need to figure out what's possible in your department or at your university; different universities work in different ways. And so what we were able to do here may not actually be very easy to do elsewhere. We controlled our own destiny, and I was able to work within my faculty to make it happen. And so my leverage points were my faculty, and trying to figure out how I could herd them in the right direction to make this work. So you really need to know who at the university you need to leverage to make it happen, and also have advocates in your corner. So when I was able to go to my dean or my associate dean or the vice provost and say, hey, we're doing this, is there a problem? They could say, no, this is great, we look to the units to do this, move ahead. And I could go back to my unit and say, we have approval to do this. I've checked it three times, and we can do these things. Number two, related to that, is to engage your higher administrators. For those of you who aren't familiar with HELIOS, look it up.
A lot of institutions have signed on to the HELIOS project, so there's a lot going on at the higher-administrator level. One thing that I was really struck by in some of these groups, actually literally floored by, is that some provosts are looking to the faculty to do stuff, and simultaneously the faculty are looking for guidance from the higher administrators. And so we're sort of at this impasse of who's going to tell whom what to do first. Who goes first? Yeah, right. By the way, there's also another project going on through the Council of Graduate Schools and the American Educational Research Association, or AERA. So CGS and AERA also have a project going on, and many of the deans and provosts and campus leadership are involved in these efforts. And so there are things going on out there that many of you may or may not have any knowledge of. My third point was think broadly about change. I brought this up earlier. You know, what are we trying to do with open science? We're really trying to build a more inclusive science and a more accessible science, and the principles of accessibility and inclusion can manifest in a whole bunch of different ways. And so I encourage you to think broadly about what sorts of changes you can make. The more broadly you think, the more people you can bring into the fold, right? People want to see accessibility and inclusion addressed in a way that is meaningful to them. Open science may not be the part that's meaningful to them. That's okay. But there may be other things that you could do that would be very meaningful, and so you can bring more people into the fold, I think, by really broadening the definition. I won't say much more about core values other than: look to your university's mission statement and see what it says. If you have a departmental mission statement, even better. Use data. There's a lot of work out there that's available on shoddy indicators of research evaluation.
Have your faculty read the DORA statement at SFDORA.org. I did this early on; I bribed them with ice cream. Get them to read the statement so that they see that there is a community of scientists out there saying: don't use these summary journal indicators so much, right? Back off those and think about reading the substance. Compromise is important. You know, you're not going to be able to fix everything, but if you can move the needle, that's probably good. And then, as I said earlier, avoid mandates. So engage people, energize them, make it so it's possible and it's rewarded, and cross your fingers that it actually happens. Have trust in your people and inspire them. Yeah, I think that's really it. Yeah. All right, we have a couple more minutes left, but this is definitely a fun question I wanted to get into as well: what's the one thing you wish you knew before you launched this endeavor? If you had to go back in time and tell yourself. I'll share two things. These contradict each other, so bear with me. First, I wish I had a better sense of people's understanding of open science and why it is necessary. So I, you know, kind of overestimated how much people were paying attention to what was going on in the field. I happened to be in my little bubble, reading a lot of work coming out of, you know, Brian Nosek's lab and other people's labs, and thinking, oh my God, we've got to do something about this. But I learned that actually that was me, but not a lot of other people. And so there was a period of time when I felt, OK, I've got to go back and I have to sort of start sharing out some things. And so I did the slow bleed of information sharing to get people sort of acclimated to the state of affairs and why we were doing it. And then the other thing, sort of contradicting this perspective, is just how OK people were with change, and how prepared for it, with a number of people wanting it but never having spoken of it.
And so I wish I knew that there were a lot more people who were ready to engage in these activities and welcome them. So, much to my delight, I had a number of colleagues who were very explicit in their support early on and recognized the need to change. Fantastic, fantastic. We had another question, and I did not realize this; this is interesting. A great question from an attendee that says: in a Canadian court case, student evaluations of teaching were prohibited from P&T decisions at one university because of inherent bias (sex, age, appearance, etc.). Has there been any history of public complaints that the existing P&T policies, using things like the citation metrics that we discussed, were also biased and unjust? Not that I know of. I didn't find anything on this. But I will say that, in my reading of this literature, to me it's an easy target, because there is enough research now that shows that there are inherent biases. So I haven't heard of anything, but I wouldn't be surprised if we see something sooner or later. Great. One final question, and I know we are down to our final three minutes. But any quick summaries or input you would have on things departments can do to support open science? Yeah. Okay. So what can you do to support open science? You know, the P&T policies are really, really important. But I do think that there has to be a very consistent message and a consistent set of actions that a department takes to support these activities. It can't all be about mandates or promotion and tenure policies. Among the things that I've done in my department: we include mention of open science in our job ads so that people know that it's coming, that this is something that we value and it is important.
I created a funding initiative and put a couple hundred thousand dollars behind it in our own department to support what I call broadening participation. So it's an initiative that we started here to really broaden participation in science, and that includes open science. It's not solely focused on open science; it also includes things like open educational resources. So we provide funding for faculty to create free and accessible educational resources, to create open science pipelines, and to engage in other activities that might broaden representation in science in various ways. So that was really important. I also include in startup packages a funding line for open access publishing. So that's a clear signal of what we value, with the money to back it up, to say: we want you to make your work accessible, and we find it important. So that's included in all of my startup packages in my own department. Sounds fantastic. We are at 2:59. I did want to take a moment before we leave to provide a sign-up for those looking to follow up from this discussion, if you're interested in collaborating and in how you may be in greater contact. Would you like to say something about that, Mike? Yeah, so about this, I just want everyone to know: I'm happy to work with anybody, any departments, and any individual who wants to institute change. So if you sign up on that form, give me your email, and I'll try to organize some working groups for those of you who are interested in doing this and maybe sort of help guide you through this process with my lessons learned. I'm happy to work with whoever signs up, whatever your unit is or whatever your particular situation is. So go ahead, sign in, log in, whatever, enter your email and tell me a little bit about what you want. Thank you so much. The link is in the chat. Michael, thank you. You have been a really thoughtful advocate for this.
We appreciate all of the detailed effort that you've put into this and your leadership. So we just want to recognize you and say thank you, and we appreciate everything that you're doing to do this the right way. So thank you so much, and thank you for your time today. You're welcome, and thank you. Yeah, and thank you everyone for joining us. We're at three o'clock on the dot, so we will say good afternoon or goodbye, and we'll be in touch soon. Thank you.