And I'd like to welcome up Elaine Chen, who is the Cummings Professor of the Practice of Entrepreneurship at Tufts University and Director of the Tufts Entrepreneurship Center and the Tufts Gordon Institute. So welcome, Elaine.

Thank you, Lisa. This is odd, I usually don't stand at a podium. Anyway, I have the great pleasure of introducing the next segment, which is how at COS we make it possible and make it easy through infrastructure, specifically the Open Science Framework. We have a great panel of speakers, starting with Eric Olson, a product manager at COS, who is going to talk to us about how idealists are innovators, the first people to adopt practices that drive change. Then Professor Fiona Fidler and Dr. Elliot Gould from the University of Melbourne will share how they actually use OSF in real life, and then Nici Pfeiffer, Chief Product Officer at COS, will talk to us about how COS uses a user-centered design philosophy to make it easy so that anybody can adopt these practices. And without further ado, I'm going to hand it over to Eric. Take us away.

Thank you, Elaine. After getting a great overview of the theory of change from Lisa, Brian, and Tim, we're actually going to step back. I'm going to ask you to step all the way back to a world where the barriers to culture change are formidable. We can accept that, but suppose we haven't made any progress on that culture change: there are no policies, there aren't any norms, the infrastructure isn't there yet, training hasn't been made available, and there isn't a shared understanding of what to do and how to do it. If we're in that world, where do we even start building toward that change?
And as Elaine set us up for our topic here, mobilization of purpose, people, resources, and energy is a critical component of every social movement. Brian referred to the Civil Rights Act earlier, and we're going to see parallels here in our journey through the theory of change. These movements will almost always be catalyzed by idealists, and if you want to know what the idealists look like, just look around the room at everyone in attendance today. Idealists are change agents. They don't accept the system as it is; as we've described earlier, their primary motivation is the values that brought them into the field, into research. Idealists are willing to act on their values regardless of what others are doing or what is rewarded. Idealists are central to the theory of change we've described. They are the catalyst that gets it started: by being innovators, they are the first to adopt the new behaviors. Idealists demonstrate that the actions are relevant, achievable, and have the intended impact. Idealists become champions of the alternative vision for research practice and initiate the formation of communities of practice in their field. Their collective actions create waves of momentum to inspire others, and this sets the stage for accelerating adoption and moving into the mainstream, which Wajin is going to tell us a little bit about shortly. The actions of idealists in one field can then inspire idealists in neighboring fields to initiate similar actions that start parallel culture change movements in neighboring disciplines. And if idealists inspiring idealists continues across disciplinary boundaries, then the aligned waves of change can become mutually reinforcing, broadening attention on those new practices and providing a push for further adoption. Crowdsourced replication, robustness, and reproducibility studies provide an example of catalyzing idealists to initiate change. The initial crowdsourced replication projects were the Reproducibility Project: Psychology and the first Many Labs study. The purpose of these projects was to gather systematic evidence about the replicability of a sample of findings. Besides the evidence they produced about research credibility, there were a few other outcomes and advantages of this work. These projects provided a way for the idealists to do something rather than just be frustrated by the system. These projects formed a community, those communities of practice we mentioned earlier; participating idealists learned that they were not alone in this innovation. These projects aligned idealists for collective action, and their influence was dramatically amplified by all working together. We have used and refined this crowdsourcing model many times over the years for an expanding range of topics, including the Reproducibility Project: Cancer Biology that Tim alluded to and the SCORE program. More importantly, other innovators generalized this crowdsourcing model for research in human development, comparative behavior, neuroimaging, electronic health records research, and many others. The Many Labs concept inspired Many Babies, Many Brains, ManyFishes, and lots of other "Manys", some of those noted in green; I know it's hard to see the different colors up here. Some even created institutions or tools to make this crowdsourcing work sustainable and scalable within their communities. This illustrates both the formation of innovation communities within research silos and the formation of a transdisciplinary community of innovators. The shared methodology created a mutually reinforcing meta-movement, connecting and strengthening the individual reform efforts. In addition to creating reform communities, the crowdsourced projects gave participating researchers experience with open scholarship practices. This was possible because the projects were co-mingled with infrastructure to implement open scholarship: RPP and Many Labs were the first Open Science Framework users. The projects
were gathering evidence that suggested the need for open scholarship, with OSF providing proof of concept of life cycle open scholarship practices. We conceived and built the OSF to enable increasing rigor, transparency, and sharing across the entire research life cycle. As the name implies, the Open Science Framework provides scaffolding to guide research producers as they conduct their own research, as well as enabling research consumers to recognize the entire context of a study rather than just the small window into the work that the manuscript can offer. The structure provides specialized workflows, allowing users to create and share what they need to, when they need to, while also linking the materials from all phases of their research together. All that work is preserved and perpetually available for sharing and discovery. To illustrate how OSF supports life cycle open scholarship, Fiona Fidler and Elliot Gould will present how they use OSF as part of their research process.

Sorry, hi, I'm Fiona. There are two of us presenting in this section, and we're going to tell you a couple of stories. The talk comes in two parts: the first part is a story about using OSF across disciplines, and the second part is about using OSF in one specific project, albeit a very large project. Both of these stories have elements of using OSF across the research life cycle; I guess that's the connecting theme. Part one of our story starts on the 22nd of May 2014, when I was in San Francisco at the APS conference. That was the date I made my first OSF account, in a workshop Brian was running called "Open Science Framework: Tools for Your Workflow". I had looked around OSF before, but it was that workshop that was my conversion experience, and it was very quickly followed by my first big doubts about what I was doing. The obstacle
to adoption that I had faced was an interdisciplinary one. Despite being mostly a psychologist back in those days, I had recently started working in environmental decision centres. When I collaborated with psychologists, OSF was fine, OSF was great, and most of my psychology collaborators were advocates of reform anyway, so they were very enthusiastic and engaged with these sorts of efforts. But when I worked with ecologists and environmental scientists, things were different. Not because they weren't technically or quantitatively sophisticated, they certainly were, but the problem, as I reflect on it now, was our different understandings of what the research life cycle itself was, what its practices entailed, and what it meant to be supported across it. Some interdisciplinary challenges are really fun; I really love discussions about how language and concepts change across disciplines, and I could spend a lot of time just talking about that. In contrast, using different platforms and programs across disciplines is the opposite for me: it's tedious, it's frustrating. So part of my championing OSF in ecology was about removing that stress in my own life, minimizing the number of platforms I needed to engage with. In 2016 I somehow managed to convince Elliot to help me organise an OSF workshop for the large quantitative and applied ecology group we were both part of. It was a pretty sizable two-day event, and we had a COS delegate come and join us all the way in Melbourne, Australia to run a hands-on introduction to OSF, with other presentations covering statistical inference, p-hacking, pre-registration, and so on. But the drawcard for the ecologists and environmental scientists was the promise of improving their data management. One thing I already knew before this workshop was that the field work my ecology colleagues did was a very distant thing from the kind of experimental work my psychology colleagues did, and the modelling the ecologists did was very different from the typical hypothesis testing the psychologists did. What I hadn't appreciated, though, was how different our underlying models of the research life cycle were. Unsurprisingly, then, the emerging sticking point in our workshop was pre-registration. This was around the time the COS Prereg Challenge was kicking off, too. There were detailed templates for pre-registration available during that competition, and these templates struck our group of ecologists as very hypothesis testing, or hypothetico-deductive, in nature; they couldn't see their ecological modelling work in them. That didn't mean they weren't concerned about the same sorts of underlying problems that we were, problems like statistical power and researcher degrees of freedom; they certainly were. Fortunately, Elliot is the sort of person who's intrigued by these kinds of challenges, and so started a PhD project translating pre-registration templates to help ecological modelers manage their degrees of freedom. This wasn't simply creating a new pre-reg form; this was expanding the concept of pre-registration to fit a different research life cycle. We ecologists do things like generate probabilistic maps of where a species is likely to be found across the landscape, so that we can predict how endangered species might respond to threats like bushfires or climate change, and how they might then respond to different conservation interventions that can help them. These predictions are used to help conservation managers figure out what to do with their limited environmental dollars. What we do is quite different to experimental hypothesis testing work, and so the common refrain goes: "but I can't pre-register my research, I don't do hypothesis testing", or "but I'm a modeler". So ecologists' first objection to
pre-registration was around the content of pre-registration templates: they felt the sorts of decisions included in those templates didn't reflect the sorts of decisions they commonly faced when undertaking their model development. Their second objection was around the process of pre-registration. Modeling is typically non-linear, an iterative cycle of development, and modelers struggle to reconcile this with the idea of pre-registration as a one-off, unalterable action taken prior to collecting or analyzing the data. Modelers are faced with the genuine need to look at the data before making certain types of decisions, for example checking the distribution or form of some variables in their models, or checking for assumption violations. This breaks pre-registration's critical requirement of data-independent decision-making, meaning that modelers can't generate a single, unalterable pre-registration of their model development and analysis plan. But questionable research practices and researcher degrees of freedom are not exclusively the domain of hypothesis testing research; they're also a problem in model-based research. So we advocated that pre-registration can and should be undertaken in model-based research, but that it needs to reflect the norms and practices of the research context in which it is applied. We developed a template specific to ecological modeling that captures the entire model development process, from construction to testing and analysis of those models, and we've proposed a new methodology for pre-registration based on a previously proposed idea of adaptive pre-registration. We can implement this expanded view of pre-registration, or adaptive pre-registration, with our ecological-modeling-specific templates using a combination of GitHub and the OSF, leveraging the preprint, annotation, version control, and project management features of both platforms. There are two key features of the adaptive pre-registration process. First, the registration of flexibility: the modeler can supply a decision tree consisting of predefined rules about when particular modeling strategies should be implemented, depending on the outcomes of previous points in the modeling process. Second, interim pre-registrations: the modeler follows an iterative process of pre-registration, proceeding from ideation to pre-registration to execution of that pre-registered analysis plan and back to ideation again. They create interim pre-registrations at different points within the model development process, depending on observed outcomes of the pre-specified decision trees. Currently we're piloting both the templates and the adaptive pre-registration process with a model-based study in Victoria, Australia that seeks to understand how riverine vegetation responds to different environmental flow regimes; the ultimate goal of that project is to help managers determine what optimal flows are.
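The registration-of-flexibility idea, a pre-specified decision tree mapping observed data checks to modeling strategies, can be sketched in a few lines of code. This is a minimal illustration only: the rule names, checks, and strategies below are hypothetical examples, not taken from the actual ecological templates.

```python
# Hypothetical sketch of a pre-registered decision tree for adaptive
# pre-registration. Each rule names a data check and the modeling
# strategy to take for each possible outcome; the rules are registered
# before the data are examined, and the data only select among them.

def choose_strategy(rules, observations):
    """Walk the pre-registered rules in order and return the
    strategies triggered by the observed data checks."""
    decisions = []
    for rule in rules:
        outcome = rule["check"](observations)
        decisions.append((rule["name"], rule["strategies"][outcome]))
    return decisions

# Two illustrative rules, fixed in advance of any interim data check.
rules = [
    {
        "name": "response_distribution",
        "check": lambda obs: "skewed" if obs["skewness"] > 1 else "symmetric",
        "strategies": {
            "skewed": "log-transform response, refit model",
            "symmetric": "fit Gaussian model as planned",
        },
    },
    {
        "name": "overdispersion",
        "check": lambda obs: "yes" if obs["dispersion"] > 1.5 else "no",
        "strategies": {
            "yes": "switch to negative binomial family",
            "no": "keep Poisson family",
        },
    },
]

# Outcomes observed at an interim data check.
observations = {"skewness": 1.8, "dispersion": 1.2}
for name, strategy in choose_strategy(rules, observations):
    print(f"{name}: {strategy}")
```

Each interim pre-registration would then record which branch was taken and why, keeping the decision-making auditable even though it depends on the data.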
All right, so that's part one; now we're going to tell you another story. Fast-forwarding to 2019, our research group was about to embark on the repliCATS project, where CATS stands for Collaborative Assessments for Trustworthy Science. So we're back in social science territory now, and I've dragged Elliot and some other ecologists, specifically the environmental decision modelers, over to this project with me. From 2019 to 2022, for those four years, there were about 20 of us at the University of Melbourne working on it, across four different faculties or administrative units in the university, so it was a big project. But repliCATS in fact sat within a much bigger program, the DARPA SCORE program, which Tim's already introduced, that had five other teams and various sub-teams within those teams. Fortunately, Elliot was still in the business of making impossible things happen, and so set up our data management plan for that project within the OSF. Our core task on repliCATS, as part of the SCORE program, was to evaluate published research evidence. We recruited thousands of participants, a community of participants, to our project, and over those four years evaluated four thousand research papers. Every paper was evaluated by five people in a little review group, and this included their predictions about the likely replicability or robustness of those papers. Those individual judgments from all of those reviewers needed to be aggregated somehow into a single confidence score for each paper. As well as that, we also had qualitative data: their reasoning and justifications for those judgments. All of this data was aggregated and synthesized into confidence scores for each of those papers that acted as an indicator of how credible the evidence was. And now Elliot will, in the next three minutes, explain how OSF helped us with that.

So pulling the data flows and systems together into a single coherent data pipeline, one that generated our internal data products, metrics, and analyses as well as calculating and delivering our confidence scores, was a fairly daunting undertaking. We integrated multiple different platforms, systems, and programming languages into a single, continuously operating pipeline, at the heart of which was the Open Science Framework. The most important component of our data ecosystem was the elicitation platform. On this platform we repackaged the research claims, metadata, and papers on the OSF provided to us by our friends at COS, and we presented them to our participants to use in their assessments. Within the remit of the repliCATS project, we needed to figure out how to aggregate all of these estimates from our participants, so we pre-registered 28 different aggregation methods on the OSF and then packaged them into a series of functions on GitHub; this is now a published open-source package that anyone is free to use. We built another package, called copycat, so we didn't have to copy and paste all of the R scripts that tied everything together, from downloading COS claims metadata and importing it into our platform, to downloading our data, aggregating it, and delivering our confidence scores back to the OSF. Tying all of this together was yet another cat-themed repository, catnap. One of the key difficulties in managing this whole data pipeline was the continuously evolving and dynamic nature of the data. We were receiving regular batches of claims data that we had to upload and give to our participants, and of course we had ongoing workshops and online assessments, so we were continuously receiving new data. So we integrated continuous analysis that automated this entire pipeline and, of course, sent all of our results back to the OSF.
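To make the aggregation step concrete, here is a sketch of the simplest imaginable aggregation method: an unweighted arithmetic mean of one review group's replicability estimates. The 28 pre-registered methods live in the team's published R package; this Python function and its names are illustrative assumptions, not that package's API.

```python
import statistics

# Illustrative only: aggregate one paper's five reviewer estimates
# into a single confidence score. The real pipeline pre-registered
# 28 aggregation methods; this is just an unweighted mean.

def confidence_score(estimates):
    """Aggregate reviewer replicability estimates (probabilities
    in [0, 1]) into a single confidence score for one paper."""
    if not estimates:
        raise ValueError("need at least one estimate")
    if not all(0.0 <= e <= 1.0 for e in estimates):
        raise ValueError("estimates must be probabilities in [0, 1]")
    return statistics.mean(estimates)

# One review group of five people judging one paper.
paper_estimates = [0.55, 0.60, 0.40, 0.70, 0.65]
print(f"confidence score: {confidence_score(paper_estimates):.2f}")
```

Running this per paper, across four thousand papers and continuously arriving batches of claims, is what made the automated, OSF-centered pipeline necessary.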
Those are two stories illustrating how OSF has helped us across the research life cycle, and now I'm handing over to Johanna.

So in my haste to get to OSF I didn't introduce Johanna properly. Johanna Cohoon was with OSF and COS from the very beginning, so she's seen 10 years of growth, and she's an expert in how OSF can be a persuasive technology. Johanna.

So I'm Johanna Cohoon, and thank you for having me here. I have been in the open science community since 2013; I was a project coordinator for the Reproducibility Project: Psychology. There we go, okay. That work got me interested in how science practice changes over time and how technology can play a pivotal role in that change. Inspired by this question and my experience at COS, I went to UT Austin to get my PhD. There I conducted a qualitative study of open science and OSF, its development, its use, and also its non-use. In a nutshell, what I saw was a strategic effort at persuasion and consensus building, and this is what I'll talk about for the next few minutes. While at COS I'd seen firsthand that open science is not universally popular; you can look on Twitter, too. And HCI research emphasizes that when users and designers have different values, there are likely to be difficulties in gaining technological adoption. We know many researchers want to keep their work private to reap rewards they might have in the future, and conflicts could also arise because they value things like their time over the possible benefits of open science. So how can a system like OSF not just avoid value conflicts but also bring about actual change in researchers' behavior? To answer this question, I observed OSF developers, interviewed them over Zoom, interviewed users and non-users, and gathered documents and artifacts like GitHub issues for analysis. What I saw is that OSF uses persuasion strategies to align user behavior with open science best practices, but simultaneously OSF also aligns itself to its users' conceptions of how open science should get done. So first we'll talk about some ways OSF persuades. A persuasive technology reinforces, changes, or shapes attitudes or behaviors without coercion or deception. Persuasive technologies can work by making the desired behavior easier to accomplish, by providing an informative experience, and by providing social cues, and on OSF we see these strategies being implemented by design. One basic social cue we're all familiar with is peer pressure, or, put more formally, normative influence: we see our peers' and colleagues' behavior and align our own to match theirs. The public nature of open science means this persuasion tactic is built into open science systems, and this normative influence is how people like U12, user 12 in my study, learned to pre-register their research. I talked with U12 about their experience on OSF, and they told me: "on OSF there's a bunch of pre-reg templates to download and they kind of guide you with what content you should put in there. I think they're just hosted on someone's own OSF page, and then I just download a Word document and I type into them." So U12 was influenced by this public example of open science and aligned themselves to it, believing this to be normative behavior. This kind of normative influence is especially powerful because, while it might be initiated through a technology, it can propagate to tertiary researchers who don't use that technology yet and bring them on board. U12, for example, had a student that they mentored, and that student learned to pre-register in that same way, with the Word document. Now, you might be more familiar with registering research on OSF through the built-in feature that we've already talked about, and that provides another really good example of persuasive technology: tunneling. Tunneling leads users through a predetermined sequence of actions or events that they might not have engaged in otherwise. By constraining navigation and creating a captive audience, tunneling prompts users to enact certain behaviors, and it also creates expectations for how you would do that behavior in the future. The registration feature on OSF guides users through a step-by-step form, asking them to disclose facts about their research study: recruitment strategy, analysis plan, et cetera. This tunneling makes registration easier for users because the process is clearly defined and facilitated. It's also worth noting that this is not the only way OSF could try to persuade users: there could be a page of text telling us about registrations, there could be a video, there could be incentives, and all of these things have been done. But there's also this design choice, a strategic one, to use tunneling to align users' behavior with open science goals. Tunneling allows OSF to establish expectations for what this particular open science practice should look like, and this creates what comes to be understood as the right way to do open science, at least on this platform. You might be thinking that if tunneling through this form is the right way to do open science, then ostensibly U12 did it wrong with the Word document, and U12 actually worried about that. Rather than use the registration feature and follow the tunnel that OSF developers built, U12 filled out the Word document they'd found searching OSF. They uploaded 12 pages' worth of information, essentially a draft of their final paper, and they uploaded it as a PDF, because U12 had already learned to pre-register based on this public example and didn't realize there was another way to register without using the PDF. I showed them the button to click to use OSF's native registration feature, and they said, "oh god, is this like to formally freeze it? I would hope that my pre-registration still counts, because I guess it's time-stamped." So through the study of OSF's use and development it's become especially clear that there isn't yet
a firm routine for what open science practices like registration should look like, and infrastructure developers and researchers alike have interests they want to protect. U12 wanted to fit in with their peers; OSF developers want users to follow a tunnel that captures their research plans in an auditable database. But negotiating these interests is far from over; the way open science gets done is still changing, and this isn't just because of what COS employees think. Something really interesting is how these persuasion efforts dovetail with the ongoing negotiation. Rather than act as a steamroller that flattens the dimensionality of science according to the operator's protocol, OSF uses persuasion to align its users while also conforming itself to meet users' expectations. Time and again I saw OSF developers turn to their users while making decisions about the platform: they conducted user research, they collected user logs, they referenced issues logged in the code repository. We can see this negotiation in the registration feature. For years, the way OSF implemented registrations required that users have an OSF project first, and D2, OSF developer 2, told me: "it's always been a sore point with users that they have to go and create a project first in order to create a registration, because that's a very us way of thinking about things." But rather than double down on their way of thinking, D2 and COS acknowledged that there are alternatives. D2 went on: "for users, a registration is not a time-frozen view of a project; it's something that goes into a registry and is its own object." In early 2021 things changed, and a way was introduced to register research without having to start with an OSF project. This better fit users' mental models, and OSF developers knew that and pursued the change because they had done user research and repeatedly leveraged user data to drive decision-making. And they continued to use this data as they evaluated the change, too. During a meeting that I observed in July 2021, D7 told the project team that these new no-project registrations were a success: they were about twice as popular as the original registrations. So even though the registration feature was designed to persuade users, users have, in a way, persuaded developers to align the platform with users' concept of how open science should get done. This kind of mutual consensus building, or ongoing negotiation, shows us that how open science gets done is still changing, and the processes like user research and data-based decision-making that COS engages in actually ensure the negotiation happens. Again, this isn't the only way things could get done: COS could develop a plan and implement it top-down with no opportunity for feedback. Instead, OSF attempts to persuade users while also evolving to better align with users' natural behavior. This technological consensus building helps OSF reduce the likelihood of value tensions, creating a better and more persuasive product. And because this negotiation is ongoing, it means U12 wasn't wrong; U12 is showing us that open science practice is more flexible than we might have realized, and incorporating feedback loops like COS does helps embrace these different perspectives to build technology that's more persuasive and more assistive. Thank you; thanks to my participants, and some of this interview data is online.

What I wanted to come back to is what Eric started with, which is really to talk about those idealists again, and how idealists are the catalysts for adoption of new technology and behavior. But technology designed only for these idealists will fail. Idealists are tolerant of friction because they are motivated by the behavior, but for most people there are competing motivations and concerns that create barriers, just like Johanna was talking about. Researchers are busy; they have existing workflows and tools that they use, and to reach early adopters you have to lower these
barriers to entry and make it possible to integrate new tools into their existing workflows. So let's talk about those barriers to entry and how we better understand them. A perfect example is what Johanna was talking about, where we use user research to focus our design towards users: user-centered design. We really want to understand who we're talking about, who these users are, and it's not the idealists in this case; we really are focused on the mainstream. We want to understand what technologies they use and what their current workflows are; that's how we better understand those barriers and build solutions with integration and interoperability. Then we watch how they use it: we want to understand what's working and what's not, and we want to test and iterate and continue to repeat that cycle. Ultimately the goal is to meet researchers where they are and take an incremental approach; optimizing to enable adoption of open science is the path. Now, there are some risks to taking this approach that can prevent the scaling of this adoption model. The first is generalization. When we think about the disciplinary silos, the opportunity for overgeneralization is present, where the solution isn't fit for purpose and won't scale. The other option is to under-generalize, where we create unnecessary silos and have redundancy, which limits standard setting and mutually reinforcing waves. The solution is to standardize when possible and customize when necessary. Here's an illustrative example, which has been a theme through the other talks with Fiona, Elliot, and Johanna, that focuses on preregistration, starting with the templates available on OSF. First we started with very unstructured formats, where we didn't say what to register or how to register. Over time we developed preregistration standards for our own large-scale collaborative research, as Tim mentioned with Many Labs. This evolved into a standard that was used for the Prereg Challenge, which was aimed at incentivizing researchers to try out preregistration. Eventually this evolved into a general standard on OSF, a template that many have used and found ways to innovate on top of, which has brought domain experts to develop new formats on OSF and other registries; again, a perfect example is the one Elliot gave. The hope is that in the future we'll continue to see experts come and create new formats, hopefully a model-based format coming soon, and others from other communities, to make it possible and easy for communities to try preregistration. Another risk to this approach is overconfidence. We need evidence to confirm that our solutions are working; evaluation is key. We need to innovate, test, iterate, and evaluate again, and we want to make sure we commit to the mission, not to solutions or our own solution. One example of this is Quick Files, a feature we released back in 2018, very excited: we had created a workflow for sharing files, could be data, could be protocols, could be preprints, whatever it was, quickly and easily without the creation of a project. But what we learned over time, looking at the usage metrics, was that this wasn't meeting life cycle open science in the way we had envisioned. It wasn't enabling those standards for metadata and the linkages between different artifacts of research; it was just single-file sharing. So ultimately we decided the right decision was to sunset this feature, and we did that in 2022. Another risk is bureaucratic burden. We know researchers are busy; we really want to empathize with that and understand what they're doing and how they're doing it, and we want to make sure that adding new practices isn't the solution, because practices that just get added on to what researchers already do are not likely to be adopted. We need to come up with better solutions that add value, that help them, that integrate into what they're already doing, so that we can lead to the adoption we want: to make open science not burdensome but just part of everyday research and how it gets done. So again, making adoption possible and easy is the foundation for a successful reform movement.