Okay, it's about 12:32. I'm sure more people will be tuning in, but let's go ahead and get started. First of all, I'd like to welcome all of you to this month's research ethics consortium in the Harvard Medical School Center for Bioethics. I'm Bob Truog, and I'm the director of our center. Today we're going to be exploring the new research rigor, reproducibility, and responsibility effort at Harvard Medical School, what we call the R3 initiative. Together we're going to be exploring questions such as: who is responsible for ensuring responsible research practices in the academic setting? Is it the individual researcher, the laboratory leader, the department chair, the university, journal editors, or somebody else? Second, we're going to look at what factors lead to breaches in responsible research. And third, we'll examine where the line is between questionable research practices and outright research misconduct, and what the best approaches are for preventing such practices. The session today will be led by Professor Alexa McCray. Dr. McCray is professor of medicine at Harvard Medical School and at BIDMC and an elected member of the National Academy of Medicine. Her research focuses on the significant problems that persist in the dissemination and exchange of scientific information in biomedicine and health. She is the co-founder of the HMS Department of Biomedical Informatics and a PI on the NIH-led Undiagnosed Diseases Network, a program that seeks to provide answers for patients and families affected by undiagnosed conditions. She recently led a National Academies study entitled Open Science by Design: Realizing a Vision for 21st Century Research. She will be joined on the panel by Dr. Mary Walsh, who is a member of the HMS Office for Academic and Research Integrity and a special advisor to the R3 project. And to complete the panel of speakers, we're also welcoming Dr. Jim Gould.
He is director of the Office for Postdoctoral Fellows at the medical and dental schools, where he creates professional development programs for all of the schools' postdoctoral trainees. For those of you in the audience today, you have two ways to participate in the discussion. You have the chat box, where you can communicate with our panel and each other; I'll be keeping an eye on that and suggesting comments or questions to our panelists. Also, if you have any technical issues today, Julie will be monitoring the chat as well and can help with technical problems. We also have the raise-hand option at the bottom. After the formal presentations, we'll open the floor for discussion, and you may raise your hand using the Zoom feature at the bottom of your screen. When called upon, you may unmute yourself and ask a question or make a comment. So without any further delay, let me turn it over to our three panelists.

Thank you very much, Bob. So today, as Bob said, we're going to be talking about what we call R3 at Harvard Medical School, and Bob has already nicely introduced us to the group. The outline for the talk, Jim, you can show the next slide, is first to let you know what exactly we mean by rigor, reproducibility, and responsibility. We'll be defining some of our terms, and I'll give a little more background on the effort. Then I'll turn the talk over to Mary, who will describe some of the work that she and others have done looking at the R3 themes we have found in our community through a variety of methods: surveys, meetings, and so forth. Then Jim will follow up by talking about our education and training work in R3 at HMS, and we'll round out the talk with our mission statement and, as Bob has already listed, some of the ethical questions that we can discuss as a group.
Once the formal presentation is over, we'll open that up for discussion. So what do we mean by rigorous and transparent research practices? These are methods and practices that support the confirmation and validation of research findings, which is in turn supported by doing reproducible and replicable research. What we mean here is research that is conducted and, importantly, shared such that others can obtain consistent results using the same methods and conditions of analysis; I'll go into slightly more detail about this in a few minutes. The third principle, responsible science, is research that is conducted and communicated transparently, fairly, honestly, respectfully, and in accordance with established research practices or norms. And we work to guard against research misconduct and questionable research practices. What do we, and not just us but the whole community, mean by research misconduct? This is the official definition of the Health and Human Services Office of Research Integrity. Most of you know this, but I thought it would be worth reviewing. There are three characteristics of research misconduct. The first is fabrication: making up data or results and recording or reporting them. The second is falsification: manipulating research materials, equipment, or processes, or changing or omitting data or results, such that the research is not accurately represented in the research record. The third is plagiarism: the appropriation of another person's ideas, processes, results, or words without giving appropriate credit. It's fine to build on other people's work, but you absolutely need to cite them and give credit. And I think it's important for us to note that research misconduct doesn't include honest error or differences of opinion. People can make mistakes, and there certainly can be differences of opinion, but those do not count as misconduct.
That is not part of what is officially defined as research misconduct. Questionable research practices are likely much more prevalent than outright research misconduct, and we thought this would be a nice quote to put up, from an editorial written back in 1994 by Doug Altman, a statistician. He asks: what then should we think about researchers who use the wrong techniques, either willfully or in ignorance, use the right techniques wrongly, misinterpret their results, report their results selectively, cite the literature selectively, or draw unjustified conclusions? And he says we should be appalled. He points out that numerous studies of the medical literature, in both general and specialist journals, have shown that all of the above phenomena are common, and he notes that this is surely a scandal. And lest you think that this is only of historical interest, it is of course an early editorial, but questionable research practices continue to be quite prevalent. There was a study done recently in the Netherlands, published in 2022, that surveyed thousands of researchers within the Netherlands, and the result was actually somewhat shocking: a very high percentage of researchers admitted, albeit anonymously, to engaging in one or more questionable research practices in the last three years. So this is certainly an issue that is still in front of us and one that we would like to be able to address. We also wanted to let you know that there's quite a lot of work going on in the national and global community. For example, there have been a number of surveys run by Nature; there was one in 2017 and one in 2018, and they continue to run surveys in various areas, but these two are closely related to our R3 issues. The first, reported in 2017 under the heading of a love-hurt relationship, was a global survey of about 5,700 PhD students around the world.
About a quarter of them reported mental health issues related to their work and the laboratories they were part of. They found that the work could be quite stressful, and many were questioning whether it is worth it, in other words, whether it will pay off to continue in research and in academia. Another result of this survey was that many people commented that good mentorship is key to a healthier lab environment. The 2018 survey was of approximately 3,000 scientists around the world, and many reported that laboratories tend to be stressful, toxic, or tense, and that there is pressure to produce a particular result. Two efforts within the UK are worth mentioning. The UK Royal Society initiated a two-year program in which they ran a number of workshops and meetings, and they examined research culture, which they defined as the behaviors, values, expectations, attitudes, and norms of the research community. They spent a lot of time discussing how to effect positive change and pointed out that you need both top-down and bottom-up approaches: top-down being what research institutions, funders, and so forth can do; bottom-up being what researchers themselves can do. It's a study worth looking at. Then the Wellcome Trust did a survey of thousands of researchers and investigators across the UK. They found, appropriately, and as we all believe, that most researchers are passionate about what they do; they love what they do. However, there is a high level of stress, and unhealthy competition and job insecurity are some of the factors they pointed out that make it difficult for researchers to do their work. The director of the Wellcome Trust, on receiving this report, said that the results paint a shocking portrait of the research environment, and added that poor research culture ultimately leads to poor research. So we do want to avoid poor research culture.
So this is just to let you know that there's quite a lot of work going on; these are just a very small sample of the studies and surveys, and there's a robust literature in this area. The next thing we wanted to point out is that the National Academies are taking this quite seriously and have run a number of relevant studies, starting about five years ago with the first highly influential study, Fostering Integrity in Research, from 2017. It has a quite helpful best-practices checklist for institutions on how best to foster research integrity in the research setting. It starts out by saying that it's very important for the institution to demonstrate that fostering research integrity is a central priority at all levels, including for faculty and for institutional leaders. The Open Science by Design report that Bob already mentioned looked at how we can make it easier for people to share their data, because, as many of you know, there are now mandates, though you might think of them as incentives, to share data and to make research more openly available. Our study pointed out that if you start using and implementing open science practices right at the beginning of your research, and continue to do that throughout the research lifecycle, then by the time you get to the end of the study, your data and methods should be very easily shareable. I like to say it should almost be like pushing a button that says, okay, now share my data, because of the way you've collected and reported on your data all along. That was the primary point of that study.
That was followed very quickly by the Reproducibility and Replicability in Science study, also done by the Academies. One contribution of this study, among others, was a set of clear definitions of what's meant by reproducibility versus replicability. They defined reproducibility as computational: obtaining consistent computational results when you're using the same data, the same computational steps, and the same methods, code, and conditions of analysis. Of course, under this definition you can only do reproducible studies if you have access to the data, the methods, the code, and so on. That's distinguished from replicability, where you're obtaining consistent results, but most likely using new data, your own data, with the same methodologies. Very recently, in 2021, the National Academies, under the leadership of Marcia McNutt, whom some of you may know as the president of the National Academies and previously the editor-in-chief of Science, established with colleagues the Strategic Council for Research Excellence, Integrity, and Trust. This is an ongoing effort; they've already had five meetings since October of 2021. Their first meeting was about trust in government and in science, through scientific integrity and, importantly, through evidence-based policymaking. The second meeting was about improving the experiences of early career scientists; we've already heard a lot about the laboratory environment and what early career scientists, and actually all scientists, are experiencing. The third and fourth meetings addressed misinformation and measuring rigor: how do you measure rigor, and how can you make this whole process more evidence-based? Those meetings also brought in funders: what's the role of funders here, and what can they do?
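To make the computational definition of reproducibility concrete, here is a minimal, purely illustrative sketch in Python. The analysis, data, and field names are hypothetical, not from the talk: it shows that a fixed seed plus the same data and code yields identical results on every run, and that hashing the inputs lets others confirm they are rerunning the same analysis.

```python
# Minimal sketch of computational reproducibility: the same data, code,
# and conditions of analysis must yield the same result every time.
# Everything here is illustrative, not an actual HMS or NASEM tool.
import hashlib
import json
import random

def analyze(data, seed=42):
    """Toy 'analysis': a seeded bootstrap mean of the input data."""
    rng = random.Random(seed)  # fixed seed makes resampling deterministic
    resamples = [rng.choice(data) for _ in range(1000)]
    return round(sum(resamples) / len(resamples), 6)

data = [2.1, 2.4, 1.9, 2.2, 2.0, 2.3]

# Two independent runs with identical data, code, and seed must agree.
r1 = analyze(data)
r2 = analyze(data)
assert r1 == r2

# Recording a hash of the inputs and settings lets others verify they
# are rerunning the *same* analysis, a precondition for reproducibility.
record = {
    "data_sha256": hashlib.sha256(json.dumps(data).encode()).hexdigest(),
    "seed": 42,
    "result": r1,
}
print(json.dumps(record, indent=2))
```

Replication, by contrast, would mean collecting new data and checking whether the same `analyze` methodology gives consistent conclusions.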
And finally, the fifth meeting, which was just held last month, was a discussion of the Office of Science and Technology Policy memorandum, an update of the earlier memorandum of 2013 requiring public access to publicly funded research. In particular, it strengthened the original 2013 memo and now requires that by the end of 2025, all publicly funded research be made openly available, not only the publications but also the supporting data. So that gives you something of the national and international background context for all of this work. The HMS R3 effort was initiated in 2019, when Dean George Daley, the Dean of Harvard Medical School, recognizing the importance of the R3 principles and practices, formally initiated the effort. The effort since then has involved many faculty and administrative partners from across Harvard Medical School and its affiliated institutions. Those of you who are not at Harvard Medical School may not know that basic science research and some preclinical and clinical science research goes on at Harvard Medical School proper, but quite a lot of extensive research also goes on in our affiliated institutions, and we are all members of Harvard Medical School, so we were sure to include our broad constituency. We've had multiple retreats and committee meetings led by my close colleague David Van Vactor, professor of cell biology and an expert in the responsible conduct of science and in training for the responsible conduct of science, both at the PhD level and beyond, as well as in looking at how the training and education that happens for PhD students, postdocs, and other trainees translates into the laboratory setting.
I think Jim will have a little more to say about that in a few minutes; he'll certainly be talking about some of our educational and training activities. I chaired a committee on scientific culture and had multiple meetings, with very good interactions, with faculty members from across our HMS-wide community. And I want to emphasize that the focus of the R3 effort is on the continued research excellence of all of our scientists, of our entire community; on fostering positive scientific environments; and on evidence-based methods to prevent breaches in scientific integrity, so the emphasis is on prevention. With that background, I'm going to turn it over to Mary, who will tell us a little more specifically about the work that we've done here at the medical school to date.

Thanks, Alexa. The discussion I'll be having with you today refers to data sets that we worked to develop from the initiation of the R3 effort, what we started to describe as the HMS R3 landscape. As Alexa mentioned, a number of individuals invested in this discussion were identified across our community in 2019, and the teams were built on academic and administrative partnerships. During the data discussion I'll refer to the groups in which we worked to identify areas to explore: emerging themes and priorities for our R3 effort moving forward, drawn from this initial identified group of invested faculty, staff, students, and postdocs. The work that we're doing is incredibly relevant to all of these communities. We wanted to think about how to organize our efforts in an evidence-based fashion, and about where we could derive this information, given the expertise that we were starting to align for the discussion.
One of the groups I'll talk about, which Alexa also referred to, is our faculty advisory committee, identified by Dean Daley in 2019 to participate in the development of the R3 effort at HMS. It draws from across our basic and social science departments and our HSDM, Harvard School of Dental Medicine, appointing departments. The other group I'll refer to is our R3 working group. This group represented subject matter expertise in the operational areas of the R3 endeavor from our administrative and academic teams. In working with these two groups and the individuals in these areas, our effort worked to proactively and retrospectively review potential R3-aligned data, to help guide the prioritization of our efforts and endeavors moving forward. The areas that came to us through our discussions, where we identified data collection opportunities from within these groups, included survey data from our HMS students and trainees. This was collected from our graduate students who participate in our responsible conduct of science (RCS) training; it was a competency survey done in the fall of 2019. From our postdoctoral community, we had data from the responsible conduct of research (RCR) and data management sessions that were also held in the fall of 2019. For R3 committee faculty data, we did a preliminary landscape analysis survey of perspectives from the individuals participating in the R3 effort, completed in the spring of 2020. And we also had case evaluations from our Harvard Medical School Office for Academic and Research Integrity (ARI), in terms of our reviews of potential research misconduct; that work was completed in the summer of 2020. Jim, I think we can move to the next slide. So the first data set. And Jim, could you just go back to that first slide?
That would be great. The first data set is the R3 committee survey, completed in the spring of 2020 with our faculty advisory panel and the individuals participating in the committee discussions. Approximately 33 individuals were asked to participate, representing voices from across the research community. We asked them to provide their perspectives on, and perceptions of, the emerging R3 discussions in the community: why we're getting involved in this and why it's important to us. Some of the questions asked of this group in the survey included: why are we engaging in this work, what's important to us, and what resonates with us in terms of the R3 effort and the discussions we want to have. For our discussion today, I thought it would be helpful to look at two examples of the outcomes from those survey questions. On the next slide, the first is some of the perceived strengths in our community that resonate with us. As you can see from the chart on the left, there's certainly a feeling that there is strength in the expertise of the HMS community invested in this conversation: we have the subject matter expertise to weigh in on this, and that's important to us. I think we also feel that the institution is committed and recognizes this as a challenge that needs to be addressed.
And there is work to be done with our entire community in moving this effort forward. We also have a lot of fantastic programs in place already, from which we can derive resources, training, and development, so we don't have to reinvent the field; we have a lot of great things happening in our community that we can synchronize across and pull from in terms of program development, resource development, and training development. The next slide is the flip side of that: where our challenges are coming from. Some of this will sound familiar given what you just heard from Alexa about the conversation happening nationally. We're facing the same kinds of challenges in our community in terms of fragmented and siloed organizational structures and expectations in our research practice; a lot of competing priorities and challenges in the environment in terms of what our teams have to manage in their day-to-day research work; and certainly gaps in our training and/or resources, where perhaps the institution or our teams can step in and buttress those. Some of the comments on the right-hand side reflect that as well: learning how to destigmatize error in our community; how important it is to be proactive in correcting our work when challenges arise; providing protected spaces to do this kind of work; understanding the intense pressures on researchers in this environment while trying to balance all of this; and thinking about the disconnection there may be in our day-to-day laboratory environments where these challenges arise, and how to navigate these conversations and establish resources where they can take place. How do we start to bridge these gaps through the R3 effort?
The next data set that we thought was incredibly important to pull from is representative of our student population. This is work that was collected, and is actually still ongoing, by Dr. Jason, the director of student development in the Harvard Medical School Program in Graduate Education. He's also a very active member of the R3 working group and its development, and is involved in leading and developing the responsible conduct of science content for our graduate students. Jason felt that he had some really rich survey work that he had started to do with the graduate student population. In particular, there was a graduate student survey he performed in the fall of 2019 among incoming doctoral students: the G1 students were surveyed starting one month before the start of their fall semester in 2019, and the survey closed out around the end of the first week of class. Approximately 144 students who completed the entire survey were included in the analysis, representing approximately 60 to 65% of the incoming graduate student class. While the survey wasn't designed to collect information on R3, a lot of our retrospective analyses involved looking back at the data we had in hand from our subject matter expertise teams and mining it with a lens for R3-aligned themes. What Jason was able to do was evaluate a number of competencies across the graduate student class and work to bin them. He looked at 67 competencies of incoming researchers, aligned those in four bins, and then started to subgroup those into sub-bins, including one for critical thinking and research skills. Ultimately, within that critical thinking and research skills group, he felt we could drill into a little more detail on individual R3-related skills.
Next are some of the self-identified competencies for the incoming graduate students in this program. Overall, across all skill areas, what was found was that approximately one in three incoming graduate students were not even moderately confident in many of these skills, and approximately 70% reported that they are moderately confident or less in all of these areas. Jim, on the next slide there's a visual of how Jason and his team were able to drill into these data. At the top of the screen you'll see the color coding for the percentages of self-reported competence in these areas, where the lighter colors indicate somewhat confident and the darker areas, up to black on the far side, indicate extremely confident, out of 100% of responses in each area. Ultimately these were binned into three R3-area bins: critical thinking, quantitative and computational literacy, and research skills. The research skills area was the place where Jason worked to drill down and think individually about where some of these items fell in terms of R3-aligned skill sets. Three of those areas rose to the surface in the reporting data: experimental design; responsible conduct of science, including research ethics; and data management and curation, including using electronic laboratory notebooks. Jim, if you advance one more, I think this is it. In drilling down into the reported confidence levels in some of these areas, you can see that there is a wide range, from somewhat confident to very or extremely confident, in certain areas. And the next slide is a summary of this visualized data.
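The binning just described, collapsing self-rated confidence into reported categories and computing the share who are moderately confident or less, can be sketched in a few lines of Python. The labels and responses below are purely illustrative, not the actual survey instrument or data.

```python
# Illustrative sketch of binning Likert-style confidence responses
# (1-5 scale) and computing the "moderately confident or less" share.
# Labels and sample responses are hypothetical, not the HMS survey data.
from collections import Counter

LABELS = {1: "not at all", 2: "slightly", 3: "moderately",
          4: "very", 5: "extremely"}

def summarize(responses):
    """Return per-label counts and the fraction rating themselves <= 3."""
    counts = Counter(LABELS[r] for r in responses)
    moderate_or_less = sum(1 for r in responses if r <= 3) / len(responses)
    return counts, moderate_or_less

# Hypothetical responses for one competency item.
responses = [3, 2, 4, 3, 5, 2, 3, 4, 1, 3]
counts, frac = summarize(responses)
print(counts, frac)
```

The same per-label counts are what drive a stacked-bar visualization of the kind shown on the slide.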
The summary is that the G1 students, at this early stage at which they're reporting, are less confident in their skills relating to these R3 themes, particularly critical thinking and research skills, than in other areas, which include some of the earlier bins of academic and career management. There is not as much self-confidence with quantitative and computational skills, certainly a lower overall confidence in data management and data storage, which is a recurring theme that we'll talk about, and a small number of students are extremely confident with experimental design. Interestingly, a higher number of students at the time rated themselves extremely confident when it came to the responsible conduct of research category. However, this may reflect what students don't yet know about the practice of responsible conduct of science, because as Jason has followed this information over time, he has seen a pattern where confidence in this area declines as students progress through their graduate training, looking at classes beyond the G1 year.

Wow, yes, a hand is raised. I was just experimenting with the hand-raise function so that I could recognize when people are raising their hands during the discussion. Sorry. That's great, an active training opportunity.

The next set of data reflects work that was collected from our postdoctoral population. I'm going to summarize it, but I have a subject matter expert here with me: this is part of Jim Gould's responsible conduct of research courses, which he routinely runs with and for postdoctoral fellows, cycling throughout the academic year on a regular basis. This particular session, completed in the fall of 2019, was focused on research data management.
This survey work was collected pre- and post-session during the postdoctoral research data management course. It reflects approximately 60 to 65% of the postdoctoral fellows in this particular training event. The event was focused on data management and project planning, data ownership, and materials and data sharing incentives. This includes how local labs track, record, and store experiments and data; re-analysis of data sets; and principles of data access and sharing such as the FAIR principles, first described by Wilkinson et al., which call for data sets and other research objects that are findable, accessible, interoperable, and reusable. These again align with the R3 themes we've seen developing in our community, supporting transparency and the utility of research data for building on experimental findings. A lot of the emerging themes from these conversations included that, while these principles were seen as aspirational and people were invested in thinking about them, the actual execution of these kinds of principles in day-to-day operations is much lower. For example, while about 70% found them certainly worthwhile and something to be invested in, only 15% of our postdoctoral participants reported that these principles are in fact adopted and utilized in their respective fields and laboratories. Importantly, in some of these discussions of research data management, thinking about the teams that do this work, the postdocs also reported on how this plays out with their lab leads.
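As a rough illustration of what the FAIR principles can look like in practice, here is a minimal machine-readable metadata record. The field names and values are hypothetical, loosely modeled on common dataset-description schemas rather than any specific standard, and the DOI and URL are placeholders.

```python
# Hypothetical sketch of FAIR-oriented dataset metadata: fields that make
# a data set findable (identifier, keywords), accessible (URL, license),
# interoperable (open format), and reusable (provenance, license).
# All names and values are illustrative placeholders.
import json

dataset_record = {
    "identifier": "doi:10.9999/example.12345",   # placeholder DOI: findable
    "title": "Example lab measurement series",
    "keywords": ["cell biology", "example"],
    "access_url": "https://repository.example.org/ds/12345",  # accessible
    "format": "text/csv",                        # open format: interoperable
    "license": "CC-BY-4.0",                      # clear terms: reusable
    "provenance": "Collected 2019; analysis scripts archived with the data",
}

def is_fair_ready(record):
    """Check that a minimal field for each FAIR aspect is present."""
    required = ["identifier", "access_url", "format", "license"]
    return all(record.get(k) for k in required)

print(is_fair_ready(dataset_record))  # True
```

The point of the sketch is that FAIR adoption is mostly a habit of recording this information at collection time, which is exactly the gap the postdoc responses below describe.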
With regard to PIs, 72% of postdocs reported that neither a PI nor a lab manager had oriented them to research data management in their laboratories, and almost half reported that research data management is not discussed actively in the lab at all. Downstream, in day-to-day applications and experimental work, at least one in three postdocs reported they had been unable to find project data, and one in four had been unable to reproduce results from their own lab. These numbers were the same regardless of whether the work was their own or a colleague's that they were responsible for in the lab. The next slide is a visualization of that, just to give a sense of these emerging R3 themes in research data management: whether PIs are having these discussions formally or informally, or whether these discussions are happening at all in group settings, which does not seem to be the case in our community. So again, this is another area where we're starting to see aligned, emerging themes across our data sets. The next set of information that we'll discuss, in aligning some of our R3 landscape analysis work, was done by the Office for Academic and Research Integrity. I did this work while I was Chief Scientific Investigator; I have now shifted to special advisor for R3. I did this work in partnership with our other senior scientists in forensics in the Office for Academic and Research Integrity. What we were able to do was actively collect information from our case work, and we identified a set of complete cases, from open to close, that occurred within the five-year period from 2010 to 2015.
These data represented approximately 56 clinical and 16 basic and social science departments, 72 departments in total across our community, and the rates per caseload across all departments were similar. There were certainly common themes that emerged from this data set, themes we'll hear about again, including challenges in data management, data analysis practice, and mentorship or supervision. These themes, aligning with our R3 effort, were similar regardless of whether research misconduct was found. So these conversations were actively happening, and we were a source of information back to the appointing departments about some of the challenges that we see in the R3 environment. The next slide breaks down some of what we saw in these cases from 2010 to 2015. Hands down, across the board, 100% of these cases involved poor practice in scientific data handling and management, and in many of the cases, which anchored a lot of the discussions, the original source data for the reported experimentation were partially or completely absent; that was true in at least half of the cases that we reviewed and discussed. 56% of these cases involved poor practice in data visualization, that is, figure development and the representation of the underlying research record in visual figures in published papers, and 38% of cases involved poor practice in data analysis: removing data points, inappropriately handling data, and/or the statistical analysis of data sets. In many cases there were combined issues, where both the visual representation and management of data and alterations to the underlying research record and original data were significant challenges.
The final slide that I have here ties together this information, and certainly the themes emerging from our research misconduct discourse. They were tied together very nicely by Dr. Dennis Brown, who was invited to speak to Dr. McCrae's scientific culture committee; he spoke to that group in 2021 on research rigor, reproducibility, and responsibility: experiences and perceptions from HMS and beyond. His experience comes from his role as a faculty member at HMS and an expert in cell biology at MGH, and he is also recognized for his many contributions, both at HMS and internationally, to efforts in research rigor, reproducibility, and responsibility. In addition to serving on study section, he served the APS, the American Physiological Society, where he was editor-in-chief of the American Journal of Physiology-Cell Physiology and served as their 90th president. Most recently and importantly, he is also the director of the MGH office for research career development, and he was appointed to the HMS faculty committee on conduct, which serves as the deciding and recommending body on the cases of potential research misconduct that I just discussed, with regard to potential falsification or fabrication of data in our environment and the recommendations that come out of those conversations at the end of the process. Some of the takeaways that he brought for us and our teams to consider involved what we can do locally as principal investigators in our laboratory environments: thinking about our raw data, how we're having conversations with our teams about data management, how we're approaching practice every day, and how our PIs can be advocates for these practice pipelines, and for improving them, to address the R3 challenges that we're seeing in our environment.
And then, certainly, how our institution can hopefully step in: in addition to supporting our core facilities and efforts in data management, also rewarding teams and individuals who are invested in these R3 conversations, because they are such a critical component of research excellence and day-to-day operations. So all of the data that I just reviewed, which our team collected, helped build a framework for our community to consider and build upon around the importance of R3 concepts, and I'm going to turn the discussion over to Jim now for what those frameworks began to look like based on these conversations.

Yeah, and as you say, we now had to actually build a framework for all of the data, knowledge, and principles that we had uncovered, and put them in place; we had to build a framework around which to develop a program, an initiative, and an effort. Through the community, as well as internal work, we were able to identify five major areas of R3: research design; data management; analysis, interpretation, and visualization; scholarly dissemination; and scientific culture, which ties all of these together, with an underlying element of education and training as well as building tools and resources across R3. Breaking this framework out into its individual parts: under research design there are research frameworks and hypothesis generation; moving into data management there are documentation, data storage, and statistical analysis and validation pipelines; moving into scholarly dissemination there are publications, but also FAIR principles, data sharing, and open access; and then we move into scientific culture.
It is a cycle, but each one of these components informs everything else in the framework, again with underlying education and training and tools and resources. Within scientific culture, which I believe is probably the most important part of this because it ties so much of it together, there's the importance of the actual culture in which science is done: leadership, mentorship, responsibility, ethics, policy standards, and academic development. So we had to create this framework in order to place all of the principles and landscape analysis in context, and to be able to share it back and build programs around it. Moving into building programs and educational development: there's a philosophy that I tend to use in my postdoctoral training. There are four major areas in which I concentrate my trainings: making trainees' research better; their manuscript writing; making their access to funding more secure, their opportunities expanded, or their grant writing that much better; and developing their professional selves as well as their careers. And there are three aspects to these training programs. It's relatively easy across all of these sectors to build awareness; we might need only a 45-minute to hour-long workshop or seminar just to build awareness across all of these elements. It's harder, and it gets more selective because increased time and expertise are needed, to build skills within these areas. And finally, the ultimate goal of any type of training is to actually gain experience. It is very difficult to gain experience across these elements, especially in research, manuscript writing, grant writing or funding, as well as career development.
But this is the holy grail, that pinpoint of experience, and most of the experience developed in all of our training actually happens in the laboratory: not in our trainings, not over Zoom, not in the classroom, but in the laboratory. So, using this training and programming philosophy, we need to bridge the gap from theory to practice for R3, but also the gap from classroom to laboratory, to the actual implementation and use of R3 principles across all of these sectors. Again, in building the postdoc training curriculum, which also works for our PhD students, we have an asynchronous training timeline, with an expectation that trainees gain all of these skills and competencies starting at orientation, and an expectation of continuous research progress. There are three phases of their training: early, where they're just learning new things; mid, where they're becoming a bit more expert and more independent; and late, where they're on an independent path but also looking for the next step and stage of their career, whether that's PhD program to postdoc, PhD program to industry or elsewhere, or postdoc to faculty or postdoc to industry. So we build this community, and this curriculum, around research progress, which happens, I would say, 99.9% in the laboratory, but we as a community build professional development and career planning opportunities around it. And we can connect all three of these phases of training through individual development plans with individual trainees, graduate students and postdocs. For the PhD students, I believe there is a requirement for individual development planning between each mentor and each trainee. It is not mandatory at the postdoc level, but, again, running a postdoc office,
I highly, highly recommend doing it, because it helps bridge the three phases of training, early, mid, and late, across career planning, professional development, and research progress. As you'll see in this graphic, RCR, the Responsible Conduct of Research, and R3 are part of that professional development, but are also interwoven into research progress. In developing themselves, our trainees, our PhD students and our postdocs, need to be gaining transferable skills that they can take from the laboratory into the next professional stage of their career, while also developing themselves as they produce rigorous and responsible science. This is a continuum of transferable skills: from hard skills, methods and technologies, quantitative and computational skills; into operational skills, experimental design and data interpretation; and into soft skills, management, leadership, teamwork, the personal and interpersonal human skills that we tend to just call soft skills. I would argue that the skills on the right-hand side are as important as, or sometimes more important than, some of the skills on the left-hand side, not because honing your research skills and becoming an expert is unimportant to continuing your career, but because you have to do this within a community, with other people, in a laboratory. So being able to work with people is very important. And I bring this up because R3 principles are not just about getting research done; they're about getting research done in a community. The National Postdoc Association, the NPA, has published their core competencies, and one thing I wanted to bring up is how they map onto the transferable skills continuum above.
The NPA's core competencies cross most of those skills, with an emphasis on RCR, discipline-specific knowledge, leadership and management skills, communication skills, and professional research skills development; and crossing the entire continuum is professionalism, that is, responsible conduct within the laboratory and with the personnel you work with. Then, breaking out the circle of the R3 framework and linearizing it, from research design and data management all the way over to research culture, I was able to align the general curriculum of the responsible conduct training that the PhD students receive, as well as the RCR training that our postdocs receive. I'm also the program director of the RCR program for postdocs, as Mary alluded to, so I am very familiar with the nine sessions that we offer throughout the year for the postdocs. Scientific citizenship sits distinctly in research culture, but I would say that eight of the nine sessions we offer in the postdoc RCR are entrenched or anchored in research culture and then move across the R3 framework. The only one that doesn't necessarily touch research culture, though it is very close to it, is our research data management session, for which you saw some data from a few years ago; I think one of today's attendees is actually one of the authors of some of that data. As for RCOS, the Responsible Conduct of Science course that the PhD students take, they actually have two rounds of RCOS, in G2 and in G6. It is exactly the same course, but they are four years more senior in the process, and even though the content is the same, the context begins to change. As you can see, across these six competencies, there are six sessions.
Again, research culture is an anchor, and the curriculum moves across the R3 framework, with the possible exception of research design. We wanted to align our RCR with the R3 framework, and we probably need to do a bit more work to make sure that we are reaching across the entire framework. One thing I predict will happen in the future, though I don't know exactly when, is that the RCR programs, both the PhD program and the postdoc program, will become part of the R3 effort in general, and we will be better situated to align the curriculum across the PhD and postdoc programs and with the R3 framework. Our colleagues, as Alexa introduced, Davey Van Vactor, but also a former curriculum fellow, Yelena, as well as our current curriculum fellow, Jade, have developed a framework for R3 skills and competencies. As you can see on the screen, at the center there are scientific competencies, resources and training, assessment and feedback, and progression and evidence. There's an entire process that we need to work through as we begin to develop, implement, and actually train for these R3 competencies, across communication, teaching, mentoring, and management and leadership, very similar to the National Postdoc Association competencies shown above, as well as the transferable skills that I tried to highlight in our training. There's also career development; resilience is very important, and low resilience equals low resistance to research misconduct. So we need to bolster not just the science but the scientists themselves: scientific knowledge, critical thinking, experimental design, experimental skills, data analysis, and responsible conduct of science. To further this, we actually need to build an implementation plan.
Yelena and Jade, as well as Davey, have come up with a very nice rubric for the framework and its implementation, where we can move across three phases, conceptual knowledge, operational knowledge, and application knowledge, reflecting the depth and experience of the knowledge, from novice trainees all the way to potential expert trainees. Using this framework, we can begin to build a way to create ideas, pilot, optimize, and implement in our graduate programs as well as our postdoctoral curriculum. The pilot would take place in our BBS program, the Biological and Biomedical Sciences PhD program, one of nine, I believe, PhD programs that we have at HMS. Within BBS, mostly because Davey is the director of BBS, it's a little easier to pilot, and it's actually one of the larger programs across the nine PhD programs. We would begin to pilot across specific courses, developing dedicated R3 courses but also extracting the R3 principles that are taught across all of the BBS courses as part of the graduate program. We would then optimize and align the curriculum and training framework, perhaps even develop a software platform to track training across our PhD and postdoc trainees, and then actually implement across the entire HMS training framework: PhD, postdoc, and beyond. So with that, thinking about this holistic approach to training: I like to zoom in on the actual development of courses, trainings, and specific content, but we also need to zoom out and make sure that the product we are putting in front of our trainees is as good as possible.
We need an idea of quality improvement: identifying strengths, needs, and gaps; delivering relevant programs; developing reasonable policies and guidelines; including and serving the entire community; and iterating, making sure that the product and content we put in front of our trainees is of good quality and what we actually want to be sharing. When we do this, we can begin to measure outcomes: inclusive hiring and recruitment of our students and postdocs, because this will have a positive feed-forward effect; supported and satisfied trainees, because they are in a scientific culture and community that supports rigorous science; productive, innovative, rigorous research from our trainees; decreased time to completion; and, hopefully, confidence in their career progression. And with that, Alexa, I'm going to bring you back into the conversation to share our mission statement.

Thank you, Jim, and thank you, Mary, both of you, for the extensive overview of what we've been up to. I just wanted to highlight our mission statement; I think you've already seen it in the work as we've been advancing this presentation today, but I thought it would be worth reading together. Harvard Medical School supports a culture that advances research rigor, reproducibility, and responsibility. HMS is committed to identifying, exploring, and fostering R3 principles and practices relevant to our research community through cross-discipline conversations and collaborations, and we certainly have been doing quite a lot of that already and will continue to do so. And to ensure the continued advancement of research excellence at HMS, the HMS R3 effort identifies opportunities to enhance R3,
with a focus on evidence-based research, institutional support, training and educational programs, and resources and tools for our students, trainees, faculty, and staff. We've talked about some of that today; there's more work going on that we can discuss as we go forward. At Bob's request we prepared a few ethical questions, and we can discuss those, but we can certainly discuss other things too in the discussion period. Who's responsible for ensuring R3 practices in the academic setting, and what does that involve for the individual researcher, the laboratory leader, the department chair, the university, editors and publishers, funders, and so on? What factors lead to breaches in R3 practices, and in particular to questionable research practices? What distinguishes QRPs, questionable research practices, from outright research misconduct? And what's the best approach for preventing such practices, which is what we're really all about: incentives, modeling best practices, education and training, sanctions, mandates, and so on. So I think we can take the slides down, Jim, and move into the discussion period. Bob, yes, I can see the chat, so maybe I'll start by answering the first one. Harvey, referring to the Doug Altman definition of questionable research practices, says that they could involve a large gray zone, and you're absolutely right: the literature is always cited selectively, and as you point out, it's impossible to cite everything, and authors will understandably cite the literature that supports their findings. Misinterpreting results and drawing unjustified conclusions are both potentially very subjective, easily overlapping with the prior slide about honest differences of opinion.
He goes on to say that he has reviewed many manuscripts in which he questioned whether the authors' conclusions were justified, but would not have considered the research questionable. So you're absolutely right. Maybe I can be a little more specific about what others have meant by questionable research practices, in particular the Dutch study that identified 11 questionable research practices, one of which relates specifically to this question of selectively citing the literature. That practice is selectively citing references to enhance findings or convictions; in other words, when there has been a lot of disagreement in the literature, or there are alternative ways of looking at it, leaving that literature out and only citing the literature that is supportive. Some of the other questionable research practices: insufficient attention to equipment, skills, or expertise; insufficiently supervised or mentored junior coworkers; inadequate research designs or unsuitable measurements and measurement instruments; unfairly reviewed manuscripts, grant applications, or colleagues' work; conclusions not sufficiently substantiated; improper referencing of sources; inadequate notes on the research process; failure to report important study details in publications; and, beginning to slide over into research misconduct, insufficient disclosure of study flaws and limitations in publications. Another study, a 2015 report by Fiedler and Schwartz, lists some further questionable research practices: for example, failing to report all of a study's dependent measures; deciding whether to collect more data after looking to see whether the results were significant; failing to report all of a study's conditions; and stopping data collection earlier than planned because one had found the result one was looking for.
Others include rounding off p-values; selectively reporting studies that worked, which we already talked about; deciding whether to exclude data after looking at the impact of doing so on the results, which is, again, a problem; reporting an unexpected finding as having been predicted from the start; and claiming that results are unaffected by demographic variables when one is actually unsure, or knows otherwise. That rounds out the list a little more. Again, we thought this was a very interesting quote, from quite a while ago, from a statistician who spent a lot of time thinking about this. Then Liza Dawson asks whether any of the survey data are published, and Mary, perhaps some of Jason's work has been published, I'm not quite sure.

Yes, Dr. Jesus has been active in the field and has certainly been publishing on his competency data and is following up on that as well. Ultimately, part of the conversation for the R3 effort is thinking about opportunities for us to continue to publish on these data: how do we do this effectively, and how do we utilize the data in our community and share it more broadly? So I think we all see definite opportunities for continuing this trend and this evidence-based approach, but also for walking the walk by publishing and sharing the data more broadly.

Okay, thank you, Mary. Jake Earl asks: like most RCR programs, it seems that the HMS R3 efforts focus primarily on PhD students and postdocs. What interventions are you pursuing or considering for improving the knowledge, skills, and, most importantly, behaviors of faculty and senior research personnel? This comes up quite often.
I'll let Jim weigh in in a moment, but certainly some of the work we've been doing, well, we've already talked about making sure that there are ways to transfer what has been learned by the trainees into the laboratory setting. How are we going to effect that? There are a variety of methods. For example, we can use the individual development plans, the IDPs, to perhaps involve sign-off by faculty, by principal investigators, and so on. But we're also doing something else, and this is the work of Davey Van Vactor, again here at Harvard Medical School, and colleagues: the CIMER work, which is providing formal training at the level of faculty and senior research personnel, as you say. Jim, did you want to add anything to that?

Absolutely, and I want to bring up two points. The first is that this is why we have involved faculty from the very beginning of this conversation, understanding that the community doesn't move unless the faculty moves with it. So we've involved faculty as part of this conversation, and many of the ideas that we have shared this afternoon have come from faculty themselves; Alexa and Davey have drafted a very nice strategic plan to go to Dean Daley from the discussions that we've had and the voices of our faculty, so they've been involved in the R3 process. The other point I wanted to make is that there is an emphasis on our trainees, partly because our current trainees will be our future faculty at some point, and if they understand these principles from the very beginning of their training, all the way to when they're training others, we can build that pipeline into the institution.
But, as Alexa talked about, there is a program for our faculty to be trained as better mentors, at the very least, through the CIMER program, the Center for the Improvement of Mentored Experiences in Research, which is based out of the University of Wisconsin. There's also something we need to do a little better: socializing, sharing, and updating the policies and guidelines that are already in place, not only for research but also for professional integrity, for our faculty and everyone else in the research community. That was highlighted by the new director for professional integrity from the academic and research integrity office at our most recent RCR session on Wednesday. She asked the audience of postdocs, not faculty but postdocs: are you aware of this policy? Are you aware of this policy? Are you aware of these resources? And nobody knew about them. So that is another issue that we need to bring to the floor as part of this ongoing R3 effort. Faculty are absolutely a focus of this, but right now the implementation and training frameworks that I shared earlier are already in place, so we are emphasizing trainees at the moment, and we still need to build a framework for our faculty.

As Julie noted, for those of you who would like to make a comment, if you'll just raise your hand, we can see it and unmute you. So please feel free to do that. Alexa, I didn't mean to cut you off there.

No, it's okay, because Bob, this next question is actually for you. It's about the Master of Bioethics. Dr. Wilberforce Musogha Kabiru says a Master of Bioethics is in great need of pursuing a PhD in research integrity; any contacts and opportunities? So I think I'll turn that over to you, and maybe Becca, who is on the call.

Yeah, well, I would say please reach out to us; we'll respond to you directly by email with whatever information we can provide.
Our website for the Master of Bioethics program is freely available as well, so we look forward to hearing from you. I might just ask a question. You alluded to this issue, but something that we hear a lot about is the intense pressure on postdocs in laboratories, the financial struggles, the pay scales, and things like that. How much do you think that kind of pressure contributes to these problems, with people, for almost livelihood reasons, needing to be successful, and that intense pressure leading to taking shortcuts? Is that an important factor, and if it is, what are some things that you're doing to alleviate it?

Yeah, so absolutely. Some of the major challenges, which we haven't talked about too much but which certainly came up in the global surveys I alluded to, reflect the fact that the challenges in biomedical research today are vast. We're living in an overall hyper-competitive environment. There is, as I think you noted, an acute need to secure research funding: you have to secure research funding if you're going to be successful. There are considerable and growing administrative requirements that are often driven by federal mandates, and you could look at those federal mandates in two ways. You could say they really are incentives for doing the right thing, for sharing data; there's a new data management plan requirement at the NIH, for example, going into effect at the end of January. And there were the mandates from the OSTP, the Office of Science and Technology Policy, memos to make federally funded research publicly accessible. So you can see those either as incentives or as mandates; I prefer to see them as incentives. However, all of that puts a lot of pressure on laboratories and on the scientists and researchers in them.
So what is it that we can do? Well, I think we've laid out some of the areas. For example, mentorship and laboratory oversight: working more closely with your mentees. As Dr. Brown pointed out when he spoke to us, the mentorship issue is key; you need to be able to provide appropriate mentorship and oversight in the lab, reviewing the raw data, that kind of thing. Maybe your lab has gotten too big; these are things to consider. Again, we're not mandating anything. We've spent the last year and a half to two years collecting a lot of information about our community and doing a lot of listening to what our community is saying. There are, as Mary said, very few rewards for practicing open science or for pursuing R3 principles in a variety of ways. And, as Dr. Brown pointed out, pressuring everybody in your lab to have a publication in Nature, Cell, or Science, or perhaps the New England Journal of Medicine in our world, those being the big journals, and implying that not having a publication in one of those journals will have a deleterious effect on your future as a scientist: those are all negative approaches to building scientific integrity in your population. So again, our focus is on being as positive as we can and helping to move the needle. How? Well, by providing lots of resources and tools. We didn't spend too much time talking about our research data management group as part of this R3 activity, but they have created a number of excellent tools for people to use in managing their data better. There are librarians on staff, various other folks on staff, and our research computing staff to help.
We haven't said much about our cores; we have excellent cores throughout the medical school that provide extensive resources to help people do their work. So, to answer your question in a word: we try to provide resources, tools, training, education, and a positive outlook on doing science. As one of our committee members said, all he wants is for his trainees and the people in his lab to think of Harvard Medical School as an exciting place to work. That's what we're all about. And so, that was a long-winded answer to your question. I wanted to jump in as well, Alexa, and talk about not just the science side of support but also the human side of support. Bob, you asked about the stress and stressors that postdocs face, and I think COVID, and also the recent spate of inflation, has laid bare the disparities and barriers that some of our trainees face. As a community, the scientific community, we need to acknowledge that, almost in terms of Maslow's hierarchy of needs. We need to understand that our trainees, everybody who is part of our community, have physiological needs that aren't being met. They have safety needs that aren't being met, in the laboratory, at home, on their commute. They want to feel belonging to the community, which isn't being met. The base three layers of Maslow's hierarchy aren't being met for our trainees. How are we expecting them to be resilient and actually produce rigorous science without addressing those needs as well? I just wanted to bring that up. Yes, and I notice, Dr. Colbert, that you have your hand up, so Julie, if you can make it possible for Dr. Colbert to speak. Yes, ask to unmute. I see your hand up; can you unmute yourself, and then you'll be able to speak? There you go.
Great, go ahead. Thank you very much for giving me this opportunity to speak. I've been attending these conferences for some time now, including the surgical bioethics conferences that you had earlier on. First of all, I'm Dr. Colbert. I work in Uganda; I'm a surgeon and a second-year bioethics resident, a finalist. I'm very interested in doing clinical ethics research, and since I also lecture at Makerere University, I'm very interested in pursuing a PhD in research integrity. I had found supervisors to supervise me, but I was looking for an opportunity to pursue it. I know the Harvard bioethics center also supports some programs once in a while, but currently there's no opportunity; that's why I posted that question during the presentation. I realized that the focus is actually about building better researchers from the students up, who can then become better faculty in the future. It is sometimes very difficult to start with the faculty; the young ones who are meant to mentor are not themselves well mentored. So I think the approach would be very good, and I would borrow the same idea: building a young scientist into a more responsible researcher is easier than mentoring your own mentors. It is a little tough for independent researchers, or even for institutions; somehow, somewhere, that may not be very easy. So I think that approach would be very much welcome even in our low-income setting, where students are more dependent on the person who is mentoring them. If they get someone who is straightforward, transparent, and honest, then it is easier to pick up along those lines, and then you train very good young mentees who pursue and mature into responsible researchers, fostering the responsible conduct of research and promoting research integrity. Thank you. So I'll be waiting for that opportunity; if it comes any time, I'll be grateful to hear. Thank you.
Thank you for contributing and thank you for joining us. One of the things we really hope to do with these seminars is to reach international audiences and people like you, so thank you for being here, and I hope we can continue to meet that need. I'll just follow up to say, to Dr Colbert's point about how important experiential learning is, and to Jim's discussion of how critical that is in the pathway: part of this effort is really thinking about the day-to-day operations and how we can model in practice what we expect on the other side. So it's a great comment. Thank you, Dr Colbert. Well, let me say that our time is about at a close, and I don't see any other hands up at the moment. You all covered an incredible amount of material in a really clear way; I want to thank you and compliment you on that. Any final closing comments before we end the seminar? No, thank you. Thank you for this opportunity. This is the first opportunity that we've had to talk more publicly about the work that we've been doing behind the scenes for some time now. There will be more to come: more public discussions and more publications, as Mary has said. So thank you for the opportunity; it's been great. Thank you. Absolutely. Okay. And thanks to everyone who participated. Enjoy the rest of your day wherever you are, and have a good weekend. Bye. Bye, everyone. Thank you.