So just to start off very quickly, I know there's probably a range of experience represented in those of you who are here today, but I'll just give a very brief overview. My name is Nadia Ortelt. I'm the Strategic Partnerships Manager at the Center for Open Science. The Center for Open Science is the organization in which the OSF sits. The OSF is the Open Science Framework; we'll just refer to it as the OSF throughout this conversation. And on top of the OSF, which is a free open scholarship platform for data sharing and project management, we offer a number of products, including two that you'll hear about today: OSF collections and OSF registries. And I just want to give that brief overview because a lot of people don't actually know that there's a distinction between the OSF and COS. What we're trying to do with today's webinar is highlight what this software infrastructure platform can do for open scholarship. So with that said, if you do have any questions about that, please drop them in the chat and we'll answer them. We have a number of folks from our team here at the Center for Open Science, including Crystal Steltenpohl, and I'll just give a brief intro for her. She's the amazing training and education manager at the Center for Open Science, and she is spearheading initiatives to build awareness and skills in open scholarship through training and education. As a community psychologist, Steltenpohl is passionate about improving individual and community well-being. Her goal as a mixed methods researcher is to use the best methods to ask the right questions and find feasible action steps. And Crystal is here today in conversation with Shelby Billups, who serves on the research partnerships team at Character Lab, a nonprofit organization dedicated to advancing scientific insights that help kids thrive.
During her three years at Character Lab, she's helped facilitate over 200 studies on the Character Lab Research Network. In addition to her work in research facilitation and experiential improvements, she manages the research partnerships team's user experience research practices and leads the organization's fellowship program. She graduated from Swarthmore College with a degree in psychology and cognitive science, and her research focused on race, gender, and language perception. And we're really excited to have her in conversation today with Crystal. Just a few housekeeping things: we have folks from the Center for Open Science who are waiting in the wings to answer questions, so please drop them in the chat. Shelby and Crystal will be chatting for about 25 minutes, and then we'll open up their conversation to some questions that you may have. So feel free to drop them throughout the duration in the chat, and we'll help collate those and make sure they're accessible to Crystal and Shelby in about 25 minutes. So thank you all for joining, and I will let you both take it away. Thanks, Nadia. Hi, Shelby. It's great to see you again. I was wondering if it would be helpful to start with you telling us a little bit about your general workflow at Character Lab, what it is that you all do, and how the OSF, the collections and the registry, help you to enable that work? Absolutely. I would be super happy to talk about this. Just a little bit of backstory: the creation of the collections and registries workflow at Character Lab was one of my first projects at the organization. So it holds a very special place in my heart, and any time anybody says "OSF," or "collections," or "registries," I always light up a little. People go, "Shelby," and I go, I know, I know, let's talk about it. So that's all just to say I'm very excited to be here and talk more about it, because it is very near and dear to my heart.
So just to give a little bit of high-level background: at Character Lab, what we do is work with researchers to connect their studies with the schools that we partner with on our network. And these studies cover just a wide range of topics, but at the core of all of them is that they're all centered around supporting teen thriving and well-being in some way. That is core to our mission, core to what we do in all of our work, and that is also core to all of the studies that we help facilitate. So roughly how the process works, from the researcher side and the operational side, is that researchers will submit an application that includes information like their target sample, the study's description, what it is intended to do for youth, and the variables that they're focusing on. And we use that information, as well as the expressed interests of the schools, to essentially match studies with schools: to see how the studies can help the schools and how the schools can help the research itself. We then work with the schools and the researchers to support a seamless facilitation of these studies, then get that data back to researchers and share back what we learn from those projects. So overall, at the heart of everything that we do is accessibility and transparency: giving young people access to a space to share their voice, especially on the topics that directly impact them, and sharing back with educators and students the information that will actually benefit them in an actionable way. And these two aims, I think we can all agree, really strongly relate to, one, our core value of scientific integrity, and two, the practice of open science.
So that brings me to this project. To push these aims of accessibility and transparency further, what we wanted to do was leverage the work and the platforms at OSF to see how we could extend those aims to the studies that we facilitate; essentially, how do we share the full picture of the studies that we facilitate on our network? How can we show people, hey, here is everything that we do, here is the impact that these researchers are putting forward and the things that we facilitate on our network? And what we came up with was that we wanted to see how we could use one of these platforms to essentially house a custom registration for every single study that we've ever facilitated, one that matched the layout from our application. So essentially, how can we take an application that has all this really good, really juicy and impactful information about a study and turn it into a registration that anybody can access and learn about? That was our initial goal and our initial scope. Now, the challenge that we faced was that researchers may submit multiple applications for multiple runs of a single study. So one researcher could be working on a project on growth mindset, but they want to do a pilot, and then they're going to do a larger-scale study, and then they want to change something up. Those are all distinct registrations, but the challenge of just relying on those registrations and housing them on the registries page alone is that people couldn't see that connection. You can't really see the full story of the project: what was the journey that led to maybe a final impact or future directions? But what was great is that we were able to collaborate with OSF to come up with this unique workflow that essentially met the uniqueness of the work that we do with our researchers.
So after months of collaboration and ideation and back and forth, and honestly, I would be remiss if I didn't give a very special shout-out to Mark Call, he was an absolute godsend in all of this work, could not have gotten it over the finish line without him, we were able to utilize OSF collections and registries to set up this really awesome and unique workflow that uses both platforms to help us visualize the whole story of each project. So now, every time a researcher applies and a registration is created from that application, it's all housed on the registries page. But at the same time, we also have our collection, which is our front-facing platform, and each project in that collection represents a single study and is connected to each of that study's registrations. So I'll just give you an example to maybe better visualize that. I'd like you to imagine that you are a researcher and you have created this new growth mindset intervention. You're like, this is going to change the world. This is going to make such a great impact on students and youth. But of course, first, you want to pilot that study. You want to see if there are any tweaks that you want to make, any kinks that you want to work out. So you're going to run that with 100 kids in, I don't know, let's say New York City. And you do that study, you learn some things, you want to make some tweaks. But now you say, okay, I'm ready to run this with a larger population of 2,000 kids with this intervention with these new changes. You're going to run that study again with some new focuses, with some new edits, but still it is that same growth mindset intervention project. And then maybe lastly, you want to do a follow-up study with a different population and you want to test for meaningful sample differences. That's still a different study, but all within that growth mindset intervention project.
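To make the structure in that example concrete, here is a minimal sketch in Python of how one collection project can group the registrations for each distinct run of a study. All of the class names, fields, and sample sizes here are hypothetical illustrations for this writeup, not the OSF's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class Registration:
    """One distinct, registered run of a study (pilot, full run, follow-up...)."""
    title: str
    sample_size: int

@dataclass
class Project:
    """A collection-level project grouping every registration for one study."""
    name: str
    registrations: list = field(default_factory=list)

    def add_run(self, registration: Registration) -> None:
        self.registrations.append(registration)

# The growth-mindset example from the conversation: three distinct runs,
# three registrations, all living under one project in the collection.
growth_mindset = Project("Growth Mindset Intervention")
growth_mindset.add_run(Registration("Pilot with 100 kids in New York City", 100))
growth_mindset.add_run(Registration("Larger-scale run with updated intervention", 2000))
growth_mindset.add_run(Registration("Follow-up with a different population", 500))
```

Browsing one project then surfaces every run of the study together, which is the "whole story" view the collections page is meant to provide.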
So what happens is that we have a registration for each of these distinct runs, but they're all going to live in one project on our collection page so people can see how that project has evolved. So that's a very high-level, hopefully not too in-the-weeds overview of how we leverage both of these platforms to really meet our unique internal workflow and lean into that aspect of transparency and clarity. And honestly, into the beautiful journey of a research project, because it isn't static. It's always growing and evolving and people are learning more things. So how can we organize that information in a way that people can find and learn from? Awesome, thanks for that overview. I'm curious, to give us a sense of scope, do you know off the top of your head a ballpark number of research teams, a number of projects, any kind of sense of how big this collection is? How big is this registry? Okay, that is a good question. And in the back of my mind, I'm like, you remember that giant spreadsheet that you had to do a big old upload of when we were first launching this? Because we had a backlog of a few years of studies, and that spreadsheet honestly haunts my dreams sometimes. But it does give a little bit of background there. I would say on our network, if I'm remembering correctly, over 400 researchers currently, and there's definitely been more; some have come on, some have come off. In terms of projects, I believe we're at maybe around 200 projects. And in terms of registrations, my heart is saying in the 300s, somewhere between 300 and 400. So it's a sizable backlog. You won't get bored going through it, is how I would describe it. And because of that scope, there's always this conversation around, how do you manage all of this?
How do you make sure that all the Ts are crossed and all the Is are dotted and that things are where they're supposed to be and labeled correctly and there's good metadata and all of that kind of stuff? I know that Character Lab has dedicated staffing to managing this resource. I'm curious, how did your team decide to dedicate resources to managing this? And what does your support, what does your work look like on a weekly basis? Are there other folks that you interface with? What do you spend your time doing? Absolutely. I think the short answer is not a crazy amount, not a frightening amount, as some might assume from the size of the collection. And that is really all reliant on the setup, I would say. So when we were first launching this project, we had a very specific project team that was dedicated to the launch of this workflow. And they were explicitly dedicated to coming up with the idea of how we might utilize the OSF for this goal, to collaborating with OSF, and to building a process for maintaining and managing this in a way that was sustainable. So it really came down to the initial setup, and I think that's where a lot of the really heavy time and energy went. I would say the actual main setup of the workflow and the platform, to really get things launched, took about three to four quarters of work, in all honesty. And to be honest, I do not think that we originally expected it to take that long, but again, it really came down to our goal of making our work more transparent. And we're an organization that very much subscribes to don't do it fast, do it well. So we were willing to take more time to do it intentionally and meaningfully if it was going to lead to a more impactful outcome.
So with all of that setup work, how it actually relates to staffing and timing now: all of that initial collaboration work that we did with OSF to create these custom features meant that they aligned with our current processes. It really aligned with the things that we were already doing, that we already had to do, and now these custom features just help us along; OSF essentially met us where we were at, which I think has made a lot of these processes a lot shorter. So a couple of these features: I will name our custom registration form that aligns with our application questions, so we don't have to type in new questions or reformat. It's the questions that researchers submitted, a one-to-one fit, which has been so helpful. As well as the custom filters that we have on our collection page. And one of my absolute favorites, the thing that I think has really made the work a lot easier, is our bulk upload, which allows us to mass upload a CSV of all of the information in these applications and automatically create the registrations themselves, link those registrations to their corresponding projects, or create a new project with that registration, all in this one upload. So all of that prep work has led us into a little bit more of a plug-and-play situation. And these features are so scalable and honestly very easy for a single person like me to handle. I will say I definitely also have the rest of my research partnerships team to support when needed. But in all honesty, because of these workflows that we already created, it's been absolutely seamless. So honestly, after every data collection period, all I need to do is export our application data, put it in a spreadsheet, make a few tweaks, and then the bulk upload takes care of the rest. For a small organization, it's made our capacity a lot easier to manage.
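As a rough illustration of that export-tweak-upload step, here is a minimal sketch using Python's standard csv module to reshape an application export into a bulk-upload file. The column names, the field mapping, and the sample row are all hypothetical; the real bulk-upload template is part of Character Lab's custom OSF setup and isn't documented here.

```python
import csv
import io

# Hypothetical application export (in practice this would be read from the
# file exported after a data collection period).
application_export = io.StringIO(
    "Study Title,PI,Target Sample,Description\n"
    "Growth Mindset Pilot,Dr. Rivera,100 students,A pilot of a mindset intervention\n"
)

# Hypothetical bulk-upload columns; the real template's fields will differ.
upload_columns = ["title", "lead_researcher", "sample", "summary"]
field_map = {
    "title": "Study Title",
    "lead_researcher": "PI",
    "sample": "Target Sample",
    "summary": "Description",
}

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=upload_columns)
writer.writeheader()
for row in csv.DictReader(application_export):
    # Rename each application field to its bulk-upload counterpart.
    writer.writerow({col: row[field_map[col]] for col in upload_columns})

bulk_upload_csv = out.getvalue()
```

The point of the sketch is the "few tweaks" step: because the custom registration form mirrors the application questions, the transformation is essentially a column rename rather than a rewrite.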
So I would say day to day it varies; there's a lot more work, I think, once we finish each data collection period, but even then, I would say only a few hours a week, if that. So yeah, I would really give my flowers to the custom features that have helped things along. Now, I will also mention that a long-term goal has always been to leverage the OSF as sort of a one-stop shop for these projects from beginning to end. So not just the registrations, but also, here, let's link the publications; here's actually a news article where this was cited, or a blog post related to this. That work is still being fleshed out, so we don't know exactly what staffing will look like there. But again, I just want to give my flowers to the relationship that we have with OSF, where we can ideate and innovate together in a way that is impactful to our users and also feasible and reasonable for our own internal staffing. So hopefully I'll have more to share with you all in the near future in that regard. Gotcha. And in that, I'm hearing you talk about interfacing with the COS team, the OSF team, getting all these different features in place and so on. Can you talk a little bit about what that process was like? How did you communicate with one another and provide feedback and that kind of stuff? Absolutely. The contact times would vary based on what part of the project we were on. So when we were initially ideating, it was a few back-and-forth emails of, what do you have available? And then we would meet a few times, and then as we were ramping things up, we would have more regular meetings for back and forth. So we would talk with our team internally and bring these new ideas: what do you all think? Oh, well, actually we have these things; let's see how these might fit together. And then once we got into the building stage, Mark actually created a whole schedule of, all right, here's how we'll be working.
Here are some good contact points, here are the things on staging, and here are the times and dates when we'll have things ready for review. We can have connection meetings to touch base and also rely on async work as well. It was a very organized and very we're-here-for-you sort of collaboration. It did feel very much like a shared project, and everyone was very generous with their time and energy with us, especially as we were learning some of the intricacies of registries and collections and how these might connect together. I had many conversations back and forth internally, like, all right, let me walk you through it: these are two separate platforms, but here's how we're going to use them together. And that can take a little time for people to wrap their heads around. So I was really appreciative of the support that we also got internally to help with those conversations within Character Lab, so we could move things forward together a lot more easily. Yeah, that's awesome. That actually leads me right into my next question, and I'm also curious about this in terms of the other folks that are here today, too. How do conversations go with education researchers and others that you're working with when you talk to them about open scholarship generally, when you talk about this collection and this registry and how they work together? What are people nervous about? What conversations are you having about those worries? Or, I guess, even, what are people excited about? What does that look like for you all? Absolutely. I will say that we have been very lucky with the researchers that we work with when it comes to open scholarship. They're very understanding of its value, and we've been really lucky to just work with researchers who align with that or already have it integrated into their practices.
So I will say, overall, a lot of our open science initiatives and aims have been quite positively received, and I think we've been lucky there. But we can't say it's always been perfect, and we do ask about the apprehensions that we know exist from the greater landscape and conversations around open science, and even from things that we hear in conversation with our researchers. So a couple that I'll name from some past UXR that we've done with our researchers: one is history. This isn't how I was trained; we didn't do these practices when I was coming up; there's a little bit of a learning curve there. They aren't used to it; that's not how they came up doing research. Another is concern for more exploratory studies. There's a fear that they might be essentially stuck committed to what they said in a pre-registration. Maybe they see a surprise in the data, but they said they were going to analyze it one way, and they now want to analyze it another way. So I think there's fear of that commitment in the research, too. There's also fear of participants seeing information about the study before it's run: what confounds might that add to the data? So those are just a few of the concerns that we've heard. I'll also mention that more recently we had a conversation with a researcher who also named fears of scooping that can come with open scholarship, and we actually heard their own experience of having their work scooped; I think it was actually a dissertation. So those are very valid and real apprehensions. With all of that that we've heard and that we've learned, the way that we're trying to navigate this is really by holding two things central, and that is support and flexibility.
So on the support side, a couple of examples of things we provide are a one-pager PDF as well as an FAQ webpage that's available to all our researchers, which explains, hey, here's how you preregister, here's where you can do so, along with articles and information on why it's important and why it's impactful. We're also always available through our research email if anybody ever has any questions. And we also manage the upload of all the registrations ourselves. I think that is one thing that has sometimes been a little bit confusing: preregistrations versus registrations. The registrations are things that we post after the data collection, so they show the full picture of the intention as well as some of the outcomes. The preregistrations are things that the researchers put up themselves, with analysis plans and things like that, before data collection. So I wanted to make that clear, because it can sometimes be a bit confusing. But the registrations are things that we submit ourselves, so we take that weight off researchers' shoulders. And then there's also flexibility. We don't have very strict requirements for our preregistrations because of, again, these very valid concerns related to these open science practices. So we have no issue if a researcher chooses to put a preregistration into embargo, for example, or wants to keep it private up to a certain point. It really comes back to listening to what our researchers are telling us and seeing how we can be flexible and supportive in the work. One last thing that I will mention on this topic, especially because it was related to the unconference that you all actually hosted last year: I chatted with some organizations who were working outside of academia, and another common theme was how open science practices aren't really incentivized in academia, how the push can be more around publishing and getting science out quickly, and how that might impact things like tenure.
And what we discussed was that we all have an incentive to provide. What we as organizations need to do is really critically reflect on what are the incentives that we can provide and how we can leverage them to promote open science practices. So something that we've recently done is provide in-kind grants to researchers who need support to run on our network. And we've added a couple of requirements, one of which is that researchers who receive a grant must pre-register their projects. So again, it comes back to listening to our stakeholders; that plays such a strong theme and role in the work that we do around open scholarship, in how we facilitate these conversations, and in how we create actions following what we hear. Gotcha. Okay. Yeah. That's really helpful. I'm curious, taking a step back to the pre-registration part: I know that the OSF introduced the ability to update pre-registrations last year or the year before. Do you see a lot of use of that in your work amongst your researchers? That's a good question. I haven't tracked it closely, obviously, but I was recently looking into a registration, or pre-registration, and did see that feature leveraged. I will say that I was actually quite excited to see that feature. One, just as somebody who's uploading the registrations, there are sometimes concerns of, oh gosh, what if I put a number in wrong, and oh no, this number actually needs to be corrected. So that was very helpful, I think, just from an administrative end. But also, as we think about a conversation that we've been having: how do we talk about and push forward research as a story? Talk about it as a story that is not done, that may never really be done, because it's always growing. Research always builds on itself. It's always spawned from something else. Maybe it's run in another sample and you learn something new.
Research is this never-ending storybook, and I think that ability to make edits to pre-registrations, or maybe add new scopes, again promotes that visual of research always evolving and changing. We want to still stay committed and transparent to what we say we're going to do, but also have that flexibility and malleability, knowing that surprises happen. That's the magic of research. You're going to find things that maybe you didn't expect. So how can you still stay true to the rigor, the integrity of it, but also share those surprises, share that story, and still make that story accessible? Yeah, awesome. If folks that are here have other thoughts, concerns, or questions that they've been hearing from their research communities, we'd be excited to hear those in the chat as well. I'm curious, taking a step back and looking at how this project has evolved over the last several years and all of the work that you all have been doing: what does success look like to Character Lab in terms of this collaboration, in terms of using this tool? How do you measure whether this particular platform, or any platform, is meeting your needs and helping you to accomplish your goals? Yeah, that's a good question. And I hope this isn't a bit of a cop-out answer, but I'm going to put it all back on our stakeholders once again and hearing what their thoughts are. It really all starts with talking to them. Since talking with youth to get their voice and see how it might impact our actions is the core of the work that we do, we bring the same practices to the work that's maybe not as specifically dedicated to the actual research process. So I would say one of our biggest goals was making the information about studies easy to find and navigate, and the tags and filters on the platform have already helped with a major part of that, both externally as well as internally.
We use it as an internal searchable database as well, and that's been really great to leverage. But I will say that when we have spoken to our research partners about this platform and about this particular workflow, especially the ones who weren't aware of it, they typically get pretty interested in the ways that it can be used. Two special ones I will mention: one, just learning about the different types of research that we facilitate. Like, oh, I didn't know that Character Lab facilitated studies on... I'm not going to bring up growth mindset; we definitely do a lot of studies on growth mindset. But, you know, studies on how people navigate racialized situations and how youth can still have resilience in those situations, and things like that. And number two is potential collaborators. As people navigate this collection and learn about these different projects, they can also see people who are doing something similar to what they're doing, or innovating in some way, and they'd love to learn more or work on a project together in the future. So those have been two really exciting things that I think have been a bit of a marker of success for us, to see how we can facilitate those collaborations and promote that scientific growth in some way. I will also say that one of my favorite comments from a researcher when talking about this platform was that the two things that came to her were inspiration and accountability as the two main functions of having all of this work in a visible spot: that researchers can find inspiration for new projects, new collaborations, new futures, as well as accountability to hold themselves to, all right, here's what my study was about and here's the impact I wanted that study to have. How can I continue this work to hold myself beholden to the impact that I was intending from the get-go?
So that's just been very exciting for us to hear, and to know that this is a space that we could offer to our stakeholders. And honestly, as we continue to grow and work alongside this platform, it really comes down to the responses of our users to know if we're on the right track and if there are new spots for innovation. So again, I can't gush enough about the dynamic that we have with OSF, because we just always know that that collaborative innovation door is always open to us. So I'm always excited to see how we can continue to grow there. But yes, in summation, it's absolutely about hearing what our users are saying, because they're the ones who tell us where we need to be headed, or if we're on the right track. I'm envisioning, it would be kind of cool to see a network analysis over time: seeing who's co-authoring with whom, and maybe being able to parse out the role of Character Lab in having those connections get built. That would be really cool to see. Oh, absolutely. My heart always flutters a little bit when I review an application and they're like, oh yes, citing XYZ researcher, and I'm like, oh my gosh, we helped run that study two years ago. It just warms my heart to see those connections. And again, because research is this never-ending story and projects are always jumping off from each other, I think any way that we can help facilitate that connection and that magic is just incredibly exciting, and a priority that we definitely hold dear. Awesome. Those were the questions that I had for you. I did see Leslie's question in the chat: can you talk a little bit more about how you connect data, code, pre-prints, and all of that to pre-registrations? Absolutely. So for the pre-registrations: we manage the collection and those registrations and do all of that work ourselves, but the pre-registrations are the responsibility of the researcher.
What we have is basically this period where we're reminding researchers, hey, don't forget to submit your pre-registrations, and they will send us either an OSF link or a PDF from AsPredicted, and we will go in and add those pre-registrations ourselves to the collection. Now, with data and code, to be completely transparent, that is definitely an ongoing conversation. There are data privacy and data ownership considerations; these are the researchers' projects, not ours, but we manage the data. And because we are, again, working with youth data, there are definitely levels of privacy there that we want to be mindful of. So we aren't connecting data or code to pre-registrations, but it's an ongoing conversation to think about how we could potentially do that in a way that is safe for our participants and meaningful. Because it's not just the data, just the code; to make it actually accessible and open, it's the scripts. How are you analyzing the data? How do I understand this data? It's the notes, the explanations, things like that. So yeah, it's a beefy bulk of work that would probably be a whole other project for us to navigate. But a long answer to say: data and code, no; pre-registrations connected to the individual projects, yes. And yeah, I think it would be wonderful for us one day to get to the data-and-code realm. Yeah. And as you're saying, it is something that you don't want to just automatically do. You want there to be stakeholder involvement. You want there to be conversations with communities about how they want their data used. You want to have conversations about access: do we do a managed access route? How are we allowing people in, in what ways, and to what end do we do all of this?
We want to make sure that that's an informed decision, not an automatic, checking-off-a-box decision. Absolutely. Because we work with a lot of different people and a lot of different districts and all of that. We have youth assent, but at the same time, you know, if we're making the data public, that could be something where, okay, do we need to speak with caregivers? Do we need to speak with school districts, and how do we navigate that? And then our legal team and researchers. So it's a lot of voices in the conversation, and we want to be mindful of making sure all of those voices are elevated in a way that is thoughtful. Yeah. I see Nadia has her hand raised. That's right. I just had a question actually for you, Shelby, about process and timelines. Do you have a rough estimate of how long it usually takes researchers to add information to the OSF? And then on your end, how long does it take you to process, let's say, a registration from start to finish? I guess the bigger question is also: at what point do you have so many registrations or researchers' projects that you're helping to administrate or manage that you can't manage it anymore, or does it not reach that point? Yeah. No, that's a good question. I would say some of these larger long-term goals with more intense management are a little bit down the line, but at least how we currently have it, it's a very manageable process. And again, because of the custom work that we have done with OSF to set some of that up initially, it truly aligns with our internal timelines very well. So a researcher will submit an application during our specific application period.
To give a good estimate, this is typically about three to four months before the actual data collection period, and even then that might be a little bit under. So that's when we would get that initial registration information, and then we would put up the registrations, I would say, about a week or two after data collection is done. So we can add in any other information, like, okay, here are the actual demographics of the school that it was run with and the actual sample size that it ended up running with, so we can have the most accurate depiction of the study that we put up. But I would say long-term, maybe four to five months as the whole process. Again, that registration just aligns with our already existing internal research timeline. Now, when it comes to some of the long-term stuff, like how might we add publications that are coming out of this, or somebody did a presentation and they want to add the slides, I think that is something that we are actually doing some more work thinking about in sort of a general space. We refer to them as insights: things that people have found through Character Lab activities, and they can range from publications to just a little summary of what they found. And that is actually some work that we are doing right now, to think about, all right, what's the best way to turn that information back to educators, caregivers, practitioners. And that's also information that we could put on our OSF projects to, again, show that full story. But that is definitely something that will take a bit of time, especially if we're thinking about, what if we made some custom resources based off of this research? Or if we're making a research summary, you know, we want to have conversations with the people we send this to, to make sure the language is accessible, as well as with the researchers who made this.
And that is something that could take who knows how long, especially factoring in how long it takes researchers to even analyze data, or if they're planning offshoots of projects and they're like, this won't actually be finished until we do these two connecting projects. So that could take anywhere from one month to three years, we've seen. So maybe my answer isn't four to five months; maybe it's one month to three years. It depends on which part of the process you might be focused in on. Thanks, Shelby. That's really helpful. Listening to that, I have two follow-up questions. One is, I think, pretty short, and then one might be a longer conversation. The short one: is Character Lab doing mostly intervention research? Is it focused on one intervention? Or do you have folks doing mixed methods, some qual, some quant, some descriptive work, all that kind of stuff? And then my other question, which I think is probably the longer one: what impact do you think using these tools has had on the rigor and reproducibility of the work that has come out of Character Lab? Absolutely. Short answer to the first one: we definitely do a bit of it all. Correlational, longitudinal, some measurement validations, interventions, you know, just a really cool range. And again, our application is very open, and our core question is really, is your research going to be something that is actionable and impactful for schools? That's probably our biggest research criterion. So we definitely have a very wide-open range and we see new types of projects every round. So yeah, my answer to number one is all types, all types, which has been very cool to see over the years. Now, in terms of these tools and rigor, I think it's been interesting.
Now, I would say that, especially for the folks who have already been doing pre-registration, I know one researcher was like, at our entire university the core tenet is open science, everybody pre-registers, that's how I came up. For them, I would say probably no major changes. That's just what they've been doing. No shifts there. But in terms of rigor, I think there's more of a hope. That hasn't been something that we have measured intentionally. I think we're just of the mindset that we work with really great researchers and they always do awesome work; we trust them. But I will say it has been very illuminating, as we've been getting more and more pre-registrations, to learn more and get a little bit more of that background. So with that information, we have actually been having conversations about how we're going to be essentially evaluating, quote-unquote, the scientific integrity of different insights at Character Lab. I will say previously, when we would think about insights, we would typically think, okay, a publication. When a study is published, boom, that's an insight. But we know that that barrier to entry is very high. And there, I think, we feel pretty confident about the scientific rigor and integrity, because you have this jury of, well, maybe not jury of peers, I think that's for when you commit a crime; a panel of expert reviewers, maybe that's a better descriptor, who review this and say, great, this is good enough for a publication. That would give us sort of that, like, yes, this must have rigor.
But now we're thinking about, all right, if we aren't saying that an insight is just a publication, then an insight could be, hey, we learned these three bullet points from the research: something that's still super impactful to the place it was run in, but maybe they aren't going to publish it, or maybe it's going to take two or three years to publish and we'd love to get this back to the schools sooner. So I think things like pre-registrations are some of the steps we are trying to take to promote rigor and scientific integrity and to feel more confident in the work that we share out, without some of these higher barriers like journal publication. So maybe, in summary: not something that we have intentionally measured, but something that we are thinking about, including what workflows, as we're thinking about this incentive structure, and what things we can incentivize to push forward and continue these practices that promote rigor and integrity in any way that we can. Yeah, and that seems like such an important component to figuring out even the answer to that question, right? Is this more reproducible than it would have been if we weren't doing this? Is this more rigorous? It's the transparency aspect of it; it's the documentation aspect of it, right? We can't test if this is rigorous, we can't test if this is reproducible, if we don't know what was done. But by ensuring that people are documenting those steps, and hopefully that they're doing it clearly and in language that anybody, or almost anybody, can go in and understand, that does lead to better determinations from the audience, to be able to say, okay, I think this is reproducible, or I'm going to actually try to reproduce this, or I'm going to try to replicate this, and I can actually do that because you've set up this structure where things are transparent.
Leslie's also asking, what other types of open science practices do you think your research network might be interested in? For example, are you exploring registered reports at all? You might also think, on the qualitative end, are folks engaging in any kind of reflexive practices or anything like that? Absolutely, that's a great question. So something that we send out quarterly to our network are basically these insight check-ins that ask things like, hey, do you have any updates on any of the projects that you've run? Things like presentations, or maybe new publications, registered reports, preprints and things like that that you might want to share with us. And that's how we receive some of this non-publication information that we also have access to, and can think about how we might turn it back out to our audience of educators and things like that. Now, in terms of some of these reflective practices, another thing that we are actually doing, and I touched on this a little bit earlier as we're thinking about our insights practices, and especially how we turn the information about our studies back to educators and parents, is that we are trying to think about how we might have these things called research summaries. So how can we basically create these interesting, interactive, hopefully visually appealing... I'm not a designer, so my mock-ups are nowhere near visually appealing, but I'm like, I'll give you the information and maybe a spreadsheet. If you think a spreadsheet's nice, there you go. But that is one of the ways we're navigating how we can make the research open and accessible, written in a way that takes away the scientific jargon and tells people, all right, we ran this study, here's what we learned from it, and here are the things that you can do based off of this.
So that's a way that we make those findings, that sort of turnaround, open. And one of the ways we're trying to gather that information is potentially by creating a study reporting template that we send out to researchers after the fact. So once they get the data, after a little bit of time, we'll ask these questions of, hey, what did you learn? How did you do the analysis? What are some things you found? How might people be able to use this? Do you have any videos or graphics that might help illustrate this point? Would you be interested in having conversations with the schools about what you learned, so you can share more about the data and your findings and how people might use this? So yeah, maybe not formal open science practices, but really coming back to that vein of, how do we make things open? How do we make them public and visible, to further both the research space and the work, the application, the practice, again, for those communities that we are running this research in, and who are giving their meaningful feedback and thoughts and insights with the hope that there can be meaningful change that comes out of it. So I think that is at the heart of a lot of the openness, accessibility, and transparency work that we do. But yeah, that's a really good question. Yeah, and as a former professor, I'm always thinking of the educational component. How cool is it to have these options, where maybe a student who might be thinking about doing research in this area could read this and see how things are going. So that's really great. Thank you. I think we're actually a couple of minutes over, so I wanted to... This has been such an engaging conversation. I feel like I could talk to you about this for another two hours.
But I really wanted to thank everyone that's here and to mention, as we said at the top of the hour, that we have other webinars coming up. Love Methods and Love Data Week are also coming up, so we're going to have a lot of cool case studies and things like that that we'll cover in different webinars. Thank you for the link, Blaine: cos.io/events. You can keep up to date there with all the different webinars that are occurring. And thanks, everyone, for coming. Thank you so much, Shelby, for joining us and talking to us about Character Lab. We look forward to continuing to work with you on improving the platform and making it work for you. You guys are doing amazing work, so thanks. And thanks for sharing. Thank you so much for having me. This was such a joy, and thank you, everyone, for listening and for your questions. I had a very good time. Thank you. Thanks, everyone.