So, yeah, this session is trying to give a bit of a general introduction to a new publishing platform that myself and Alex work on. It's called Octopus (octopus.ac), as you may have guessed from the title of this talk, and it's trying to bring quite a new approach, one we think is fairly unique, to scientific publishing. So first off, Alex is going to talk a bit about what we're trying to achieve and the ethos behind Octopus in general. Over to Alex. Thank you. Right, I'm going to try and do the screen-share thing; let's see how my computer deals with that. Is that all right, Tim? Are you seeing the right version and everything? Yeah, I'm seeing the slide deck in PowerPoint rather than the presenter view. There we go. Yeah, perfect. So, I came up with the idea of Octopus, and I have a bit of a strange background in that I used to be a filmmaker and then I came back to academia. And when I came back, I thought this publishing system has some problems, and obviously I wasn't the only one thinking that. But I had an idea for a solution, which is now funded by UKRI, the UK's government science funder. We built it in partnership with JISC, the UK's digital infrastructure organization, which is where Tim is from. And we're also in partnership with the UK Reproducibility Network, a network of academics, institutions and organizations who work towards promoting best practice in research. So the sort of problem I was facing is this issue of how we can do research better, and how we can communicate it better in a way that helps us do it better.
And it's about ensuring that we have an infrastructure that maximizes access for all researchers, that's efficient at sharing work of all types, and that also provides a meritocratic system: one that actually incentivizes good research practice and makes sure that good research is rewarded, because I felt that the current system was not really doing that. At the moment there isn't a really good way for researchers to demonstrate good practice and share all of their research outputs. And although there are other outputs for research besides journal articles, there's no getting away from the fact that the main one, especially for researchers within academia, is the journal article, or the monograph if you're in a domain that uses monographs a lot. I'm from the world of STEM, which is very journal-article focused. I looked, for example, at a typical author guide from one of the leading science journals, and it really hit home that it was defining the classic high-impact publication most researchers will be familiar with: carrying a focused message and maximizing impact through readership and citations. As somebody who worked in the media industry, that struck me as exactly the wrong kind of incentive if what you're trying to do is encourage people to do the best research possible. It's all about trying to tell a good story, to get readership and be popular, which of course makes sense for most journals because they rely on a high readership. And I think the problem stems from journals and journal articles being pulled in different directions. One thing they're trying to do, which is all about their readership, is disseminating useful findings to people who can use that research. That is a really important job, and it drives a lot of journal revenue.
If you think about the most popular bits of journals, it's things like the News and Views sections, and that's the sort of thing those author guides are describing. But that aim to be really attractive and readable drives a whole lot of issues, like word counts and publication biases and questionable research practices, which you probably know all about. On the other hand, journals are also trying to be the primary research record: a full record, almost like a patent office, of what people have thought and what people have done, recording things in full detail so that they can be reproduced and people can have access to the raw data. All of those are really specialized kinds of communication, and they would almost get in the way of a good readable story that drives readership; you don't really want all these extra analyses and supplementary information in the way of your story. So these two different kinds of writing, these two different styles of communication, are pulling in opposite directions, and journals are only doing one of them really well. What I think we need to do is split these two jobs, and that's what Octopus is designed to do: to be the new primary research record, taking on that second, rather thankless job and leaving journals to do the other one properly, so that they can carry on disseminating findings in a really accessible way. Octopus is designed to maximize access to the primary research, but it can also design its incentive structure entirely around supporting good research practice. It can of course take advantage of all the 21st-century tools and be digital-first, but there's a lot more to it, because I think many of the issues in the research process today can be tackled by breaking up the concept of the paper as the main unit of publication.
The research process that we all go through naturally comes in a series of steps that we do one after the other. Each step requires quite different skills and different resources, and forcing people to get right to the end of this process before sharing any of it is, I think, causing so many problems. You're retrospectively forcing a linear narrative onto a process that is definitely not linear. And being able to publish your results only when the data supports your initial hypothesis, or when they make a good readable story: these are all the sorts of pressures that lead people to feel they have to cherry-pick data or find significant results, all those things that we don't want to be doing as good researchers. So instead, if we were designing a research record that was free of all of these dissemination incentives, which we can do if we start from scratch, we can allow a structure of publication that much more closely matches the actual process that we as researchers follow, with each publication type being linked to the one before it in the chain. There are other forms of publication that do this kind of thing; there's one called Registered Reports, where you submit an article to a journal at the stage of having a protocol and get peer review, and it's accepted by the journal at that point regardless of what the results turn out to be. So that's also designed to break the link that can push people into using their results to drive the story: you commit to the plan first. But this structure takes that to a more extreme level. For those of you wondering why it's called Octopus: there are seven types of publication there on the screen, but there is an eighth type in Octopus, and that is the peer review.
We know that peer reviewing is a really important part of the research process, because it helps us all critique each other's work and make sure we've discussed and thought about all the possible things that could be wrong, or right, with a piece of research. So the eighth type of publication in Octopus is a review. You'll notice that all of these types of publication are linked to each other, so you form these branching chains: you can't publish data without the protocol or method that the data was collected according to already being available for other people to see. A peer review can be linked to any kind of publication, of course, but it's treated as a publication in its own right, because so often peer review is something researchers do for free, as a kind of gift to the research community. It's a really important skill, and it's also often done anonymously and secretly, rather than being open and transparent so that others can learn from the critique and read those reviews. We hope that by incentivizing it, by making it as important a part of the research process as any other, we can encourage good reviewing practice. So that's a very quick overview of what the Octopus structure is, and I just want to talk very quickly about why we've structured it that way. You've got instant publication: there are no barriers to publication. As soon as you press the button saying you're happy to publish, it sends messages to any co-authors you have, they also approve it, and then it goes live once everybody has approved. That means we can act as a kind of patent office; it undermines that fear of being scooped, because as soon as you've had an idea and you've got it out there and pressed that publish button, it's there, time- and date-stamped with your name on it.
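The linking rule described here, where each publication type builds on the stage before it and a peer review can attach to anything, can be sketched as a simple validity check. To be clear, the stage names below and the strict "immediately preceding stage" rule are assumptions made for illustration, not Octopus's actual data model or API.

```python
# Hypothetical sketch of the linking rule: the seven chained publication
# types, in order, plus peer review (the eighth type) which may attach to
# any of them. Stage names are illustrative guesses, not Octopus's own.
CHAIN = [
    "research_problem",
    "hypothesis",
    "method",
    "results",
    "analysis",
    "interpretation",
    "real_world_application",
]

def valid_link(child_type: str, parent_type: str) -> bool:
    """Can a publication of child_type be published on top of parent_type?"""
    if child_type == "peer_review":
        # reviews can be linked to any kind of publication
        return parent_type in CHAIN
    if child_type not in CHAIN or parent_type not in CHAIN:
        return False
    # e.g. you can't publish results without the method already available
    return CHAIN.index(child_type) == CHAIN.index(parent_type) + 1
```

So under this sketch, results may only be linked to an existing method, while a peer review can hang off any stage of the chain.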
So it's a way of recording everything that you do as you do it, and making sure that you get the credit for it. It also means that other people can learn from it very quickly. In terms of credit, because you've got smaller publications, you've also got smaller groups of authors working together, which means you can get much more meritocratic recognition. Specialists, like specialist data collectors or people who are really good at implementing things in the real world, often don't get the recognition they deserve within the research publishing system as it is at the moment; they either get left off the author list or end up as middle authors, not getting the credit for the work they do. In this system, they get the credit for exactly what they do. And importantly, as I was saying, avoiding this need to tell a story means that you're not under pressure. It doesn't matter what the results and findings of your research are; what's important is that it's been done to the best standards at every stage. That removes those incentives and pressures for questionable research practices, but it also means that you can publish a lot more of your work: you can share things that at the moment don't make a good enough story for a journal. I think that's really important for the research community, because we can all learn much more from each other if we are all sharing much more. This equal emphasis on each part of the research process, with each of those publication types being of equal status, I hope means that each part of the research process gets the emphasis it deserves. Each part gets all the space needed to describe and share it, which makes it much more reproducible, but it also allows us to assess the intrinsic quality of each bit of research.
So it's not all about what you discovered, and novelty, and all of these things that currently drive the system. Instead, it allows us to really concentrate on solid work. You get these emergent findings, which everybody can then know are much more reliable because they've emerged from this really solid, high-quality base of research. I hope that eventually this will change the way we all approach research. It allows us to be specialists in what we're really good at and what we enjoy doing, and to feel we're getting the credit we deserve for that. And I hope that will change not just the research culture, helping people feel they're getting what they deserve and what they want out of their careers, but also build a much more collaborative, worldwide and barrier-free research process, which means that everybody who relies on research outputs can feel the whole process is generating quicker, fairer and more robust research. So those are the overall aims of Octopus, what we're hoping to achieve and how we're hoping to achieve it. I'm now going to pass back to my colleague Tim, who can show you a bit more about it. Thank you very much, Alex. Yeah, let me just get my screen share ready. What I'm going to do is talk through some of the features we've got on the platform, because the platform is ready to go and ready for people to use, and hark back a bit to some of the things Alex has talked about, in terms of how we're approaching putting these ideas for improving research publishing into practice and actually getting people able to take advantage of them. So, let's see, it's that window there. If someone wouldn't mind just giving me a nod when they can see it. Brilliant. So here we are on the platform itself.
What I'll do first, I think, is hop over to one of the publications we've got in the system, and then I can start talking about some of the things that Octopus is doing a bit differently. The most obvious thing here is these different stages listed across the top. This is what Alex was talking about: these separate stages in the research process. As we can see here, this project has started off with a research problem and moved all the way through to a real-world application, with, in this case, different people actually being credited for different parts of the work. You can see someone has done the analysis here and got the credit for it, on the back of a previously existing chain of research. We take this a little further than just letting people publish in a consecutive chain like this, and actually let people collaborate on each other's work. In this example, you can see that one author has started with a research problem and put in various publications as they go, but someone else has come in and proposed something a bit different. Initially we had a publication for a method using online surveys in this project, but someone else has decided to take this and bring in the idea of using qualitative interviews on the back of the same hypothesis. The exciting thing here is that if you have a research project where perhaps you're taking it in one direction, or you just don't have the time or resources to take it all the way through to completion, you can still put some of these earlier-stage publications on Octopus, for instance a research problem that you won't have time to investigate, and you can get a DOI on there.
So it's got a proper identifier, and then someone else can come along and work with you to see that through into some later-stage publications: they can add to the back of your work and effectively build on it. The vision we have for Octopus is that this will create a very collaborative environment, where there's really not much of a disincentive to getting your work out there early, putting up research problems and suchlike that you might otherwise not have followed through with, and seeing where they're taken by the rest of the community. And as you can see here, we have separate lists of authors for each of the different publications in the chain, meaning, as Alex said, you can get more granular credit for which piece of work was conducted by whom. So that's a brief overview of what a publication looks like in the system. We should probably talk about where this sits in the wider context of Octopus as a platform in its entirety. If we take a look at our research problem here, what we'll be able to see is the research problem above this one in the hierarchy. A fairly core principle of Octopus is the idea that all research is interconnected in one way or another, and should build off the back of other people's research where possible or relevant. So in the same way that, in this case, we have our two different methods building on a single hypothesis, research problems themselves, the initial point in a project, also have the capability to build on other items. In this particular case, we've got a higher-level research problem about how best to communicate personal risk for COVID-19, and that's been narrowed down to the more granular problem you can see up at the top here. When we expand that to the scale of the entire system, what this means is that everything stems from a common parent.
So we have what we call the "God problem" right up at the top of our publication hierarchy, which is essentially something very broad; I think it's "Why is everything in the universe the way that it is?", something along those lines. Below that we have the Library of Congress classification system, so that new research problems that don't have something to build on, or that haven't been built directly on existing research someone has put here, can be pinned onto the Library of Congress classification. But for everything else, the vision is that people will come in, look for research problems that are in some way ancestral to what they're working on, and add theirs on the back of them. So that's the hierarchy of problems, as we call it in the system. Extrapolating on that even further: if we look at other items, such as the results here, you can actually link research problems to these items as well. It doesn't just have to be research problems that sit in this hierarchy; every single publication in the system can have further research problems built from it, if someone sees what you're doing in your work and thinks it raises an interesting question they'd like to build on. Similarly, as you're looking through the system, you can find places where publications pose a new question you've thought of, and build on them using this research problems feature. So the idea is really to structure the entire system, this entire network of publications, around something that's going to help people work with each other and credit each other for where their work has come from. It's called Octopus, but as you may have noticed if you've been counting, there are only seven publication types up at the top here, from research problem through to real-world application.
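The hierarchy just described, with the broad "God problem" at the root and Library of Congress classes beneath it for problems with no closer ancestor, is essentially a tree of parent links. Here is a minimal sketch of that idea; the node names and the parent-pointer representation are invented for illustration, not how Octopus actually stores things.

```python
# Hypothetical parent-pointer representation of the problem hierarchy:
# every research problem has exactly one parent, terminating at the root
# "God problem". Node names here are purely illustrative.
PARENTS = {
    "god_problem": None,                       # "why is everything the way it is?"
    "Q_science": "god_problem",                # a Library of Congress class
    "communicating_covid19_risk": "Q_science",
    "personal_risk_messaging": "communicating_covid19_risk",
}

def ancestry(problem: str) -> list[str]:
    """Walk from a problem up to the root, returning the full chain."""
    chain = [problem]
    while PARENTS[chain[-1]] is not None:
        chain.append(PARENTS[chain[-1]])
    return chain
```

Finding the problems "ancestral to what you're working on" is then just a walk up this chain.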
And that's because the eighth is the peer review. As Alex said, Octopus is in one part very much about giving peer reviews the same weight as any other sort of publication; it's a skill in itself and therefore should be credited appropriately. For that reason, peer reviews can be added to any publication as a separate item, which sits in the system as its own publication type: it effectively gets its own page and so on, sitting below whichever publication it's been added to. Now let's take a look at how publishing actually works in the system. If I hop over to my account here and hit this publish button, I'll run you through what the process of publishing looks like. Because we have these smaller publication types, it's not necessarily a hugely lengthy process; you can publish some fairly concise items, as you can probably see here. Actually, that's quite a long one; let's look at one like this research problem instead. This is a fairly brief publication. Breaking things up into stages means you can publish smaller, more focused items instead of needing to put out an entire research paper as one piece, which means this publication process is not particularly time-consuming and is relatively straightforward. So let's get a title in here; let's just call this "a workshop", because I'm not feeling particularly creative today. We'll say this will be a research problem, and we'll go through into the publication process. So here we are with our new publication. This will be under a CC BY 4.0 license, which is an open access license: other people can distribute and remix this as long as they give you credit. Here we can add in our affiliations for any organizations we've worked with, if we choose to do so. The key feature here is looking for a linked publication, so let's just say this is a research problem that's being made off the back of this item here.
So let's see if we can find it. Okay, just for the purposes of this demo, I'm going to attach it to another one in here. This is our test site, so I don't have to worry about cluttering up our live system with the publication I've made for a demo. There we go: we've added it as a child of this publication here, meaning it's now part of our Octopus hierarchy. We can go through the process fairly normally from here, adding in our publication text, any references we need, and so on, and making sure we've recorded any conflicts of interest we may have. Let's move straight on to the co-authors feature. For co-authors in particular, we need to make sure the proper approvals process is in place. What we don't want is for someone to add in some authors and be able to publish on their behalf, crediting them without those co-authors having a chance to actually look at what's going out with their name on it. So we've added a workflow to make sure that co-authors have the opportunity to review everything and give their approval before it goes out, which I'll try to give a quick demo of here. I'll just pop in this account, which is, as you might have guessed, actually me, but I'm sure we can manage. There we go: we've added a co-author, and to make sure they're happy with this publication, we need to get their approval. We'll head up to the top here and click this button that says "request approval". You might have noticed that before I added the co-author, the button said "publish", because we didn't need anyone else's approval. But now that we've got someone else on here, we'll need to send this over to them to make sure it looks okay. So we'll ping out that request, and what I'm going to do on my other screen is quickly open this up as the other person and tell myself that it looks okay. And whilst that's loading...
This is the screen where you can review a draft publication. Obviously this one doesn't have a huge amount of content in there; as you saw, I wasn't particularly diligent with writing up my publication text. But this is the screen where you and your co-authors can check everything looks good before you actually put it out there to everyone. We have this review process in place so it can go through all the proper checks, just to make sure it's ready, really. So I've just approved this as the other author that we can see listed here, and that should come through in just a moment. There we go: my other author has said, yeah, this looks great to me, it's ready to go, which means I can just hit this button here and publish. It'll just take a moment to get that ready and redirect me to the live page. And there we go: we now have a live publication on Octopus, with our other author credited, who just very graciously went and approved things in the background for us. This publication has its DOI; this being a test site, it obviously isn't a real DOI, but it would be on the live site. The publication has its own open access license, so other people would need to credit you for using it. Overall, it's a fairly straightforward, quick process. I believe that's most of what I've got to show you at the moment. We're adding various new features as we progress through developing this platform; upcoming, we've got a versioning feature, which is quite exciting, and I'll run you through a few of the ideas around our next steps. So if you are on the Miro board... Sorry, can I just talk about red flagging? Oh yeah, sorry, I thought I was missing something. Yeah, please feel free to jump in. One of the things that Octopus is obviously trying to really support is best practice.
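The approval workflow just demonstrated (draft, request approval, every co-author approves, then publish takes effect) can be sketched as a tiny state machine. The class and method names below are hypothetical, chosen for illustration; they are not Octopus's actual API.

```python
# Minimal sketch of the co-author approval workflow: the creating author
# is implicitly approved, each added co-author must approve the draft, and
# the publish action only succeeds once everyone has signed off.
class Draft:
    def __init__(self, creator: str, coauthors: tuple = ()):
        self.approvals = {creator: True}                 # creator auto-approves
        self.approvals.update({a: False for a in coauthors})
        self.live = False

    def approve(self, author: str) -> None:
        """A co-author reviews the draft and gives their approval."""
        self.approvals[author] = True

    def publish(self) -> bool:
        # with no co-authors this succeeds immediately ("publish" button);
        # otherwise the draft stays private until the last approval is in
        if all(self.approvals.values()):
            self.live = True
        return self.live
```

So a sole author can hit publish straight away, while a draft with pending co-authors stays unpublished until the final approval arrives, matching what the demo showed.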
So obviously we want to allow people, if a peer review is not strong enough, or if they really think there's been some kind of misconduct such as plagiarism or image manipulation, to red flag something. Tim, I don't know if you can see the red flagging option on there. No, because of course it's your own publication. If you go to another publication, you can raise a red flag, which notifies other readers that there are content concerns of various types with that publication; there's a little drop-down for the type. That then allows the person who's raised the red flag to send a message to the authors, and they can respond, and perhaps it isn't plagiarism, or they can defend themselves. The person who raised the red flag can then either lower it, or not, if they don't think the issue has been sorted out. At the moment we've got this problem with things in journals that have been retracted, or have had serious concerns raised about them, with no systematic way to flag that to readers. So we hope this red flagging system sits alongside the peer review system and allows people to do that. And that's live at the moment. Excellent, yeah, I knew I was forgetting something there, thank you. So, building on that and segueing into our talk about some upcoming features: what I hope versioning will allow people to do, if they have a red flag raised against their publication saying that perhaps something needs to be clarified, for instance, is to go back in, make some changes to the publication, create a new version of it, and put it up on Octopus to address the quality concerns, or whatever is reflected in that red flag.
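The red-flag lifecycle Alex describes (a logged-in reader raises a flag with a concern type from a drop-down, the authors can respond, and only the raiser can lower it) can be sketched like this. The concern categories and all the names below are illustrative guesses, not the platform's real vocabulary or API.

```python
# Hypothetical sketch of the red-flag lifecycle: raise with a concern
# type, exchange messages with the authors, and only the raiser may
# resolve the flag once they consider the issue sorted out.
CONCERNS = {"plagiarism", "image_manipulation", "data_concerns"}  # illustrative

class RedFlag:
    def __init__(self, raiser: str, concern: str, message: str):
        if concern not in CONCERNS:
            raise ValueError(f"unknown concern type: {concern}")
        self.raiser = raiser
        self.concern = concern
        self.thread = [(raiser, message)]   # opening message to the authors
        self.active = True                  # flag is visible to readers

    def respond(self, author: str, message: str) -> None:
        """The authors defend or clarify in the message thread."""
        self.thread.append((author, message))

    def lower(self, user: str) -> bool:
        # only the person who raised the flag may resolve it
        if user == self.raiser:
            self.active = False
        return self.active
```

The key design point is the asymmetry: authors can respond but cannot dismiss the flag themselves, so the concern stays visible to readers until the raiser is satisfied.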
Versioning as a whole is very straightforward in principle; in practice there are some complexities with things like versioning DOIs and so on. But in effect, it's a feature set we'll be adding to let people publish new versions of their existing publications, and to let readers switch between those versions if they need to go back and look at what was done in a previous version. We also want to create a view of the problem hierarchy, this idea of everything being connected in Octopus. We'd like to add some visual representations of that to the system, so that people can explore different areas of research and see if there's anything there that's going to be of relevance to them, perhaps in the form of a heat map. That's something else we've got upcoming. On the similar topic of making it easy for people to find particular publications, we're also looking at improvements to our search features: various additional parameters, for instance to look for publications where the authors have filled out a statement saying that they can't have been HARKing, i.e. that they hadn't collected their data before they proposed their method. And we're looking at various integrations at the moment, with institutional repositories, CRIS systems and the like, to make sure we can start putting Octopus outputs into the repositories of organizations affiliated with the work done in Octopus. So yeah, there's quite a lot coming up. We're obviously very much open to feedback and comment on any other features people think could be of use for Octopus. One in particular that we're in the process of consulting on at the moment is the idea of adding quality metrics.
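One common pattern for the "versioning DOIs" complexity mentioned above is to keep a stable record-level DOI while minting a suffixed DOI for each version, so readers can cite either the publication as a whole or one specific revision. The sketch below illustrates that pattern as an assumption; it is not necessarily how Octopus will implement versioning.

```python
# Sketch of one common versioned-DOI pattern: a stable record DOI plus a
# per-version DOI, with readers able to switch back to earlier versions.
# The DOI suffix scheme here is an assumption for illustration only.
class VersionedPublication:
    def __init__(self, record_doi: str, text: str):
        self.record_doi = record_doi
        self.versions = [text]                 # version 1 on creation

    def new_version(self, text: str) -> str:
        """Publish a revision and return its version-specific DOI."""
        self.versions.append(text)
        return f"{self.record_doi}/v{len(self.versions)}"

    def version(self, n: int) -> str:
        """Fetch version n (1-indexed), so readers can look back."""
        return self.versions[n - 1]
```

Citing the record DOI always resolves to the latest state, while the version DOIs pin exactly what a reader or reviewer saw at the time.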
Moving away from impact and towards quality as, I suppose, the primary motivator for research is a big part of what Octopus is trying to do, and to achieve that we'd like to add some sort of quality scoring system to our publications. Looking back over here at our chain, we've got these seven different types of publication plus the peer review, and each one of those will have different characteristics which could reflect whether or not it's a high-quality publication. This is of course agnostic of things like impact and whether it tells a good narrative; it's purely about, for the results for instance, whether they're clearly formatted and seem relatively comprehensive, things like that. We'd like to consult... Sorry to interrupt you, it's just that I think this is a potentially controversial issue, isn't it, whether you have ratings of publications. So essentially the issue is: if you want to assess research purely on intrinsic quality, what does that really mean? It suggests a qualitative way of assessing quality. But I don't think we can get away from the fact that quantitative metrics are really helpful, especially in a digital world where you might want, for instance, to order search results. I think there isn't anything intrinsically wrong with having quantitative metrics; what is dangerous is having proxy metrics for quality. So what we really want to do is think about what quality actually means, and allow people to rate things on specific predefined criteria. For instance, whether there is enough description in a method for somebody to be able to reproduce it: I think that is probably a metric of quality that we could agree on for methods, and it would be perfectly reasonable to expect people to be able to rate something on that.
I also think that quality metrics are different when you're thinking about these smaller units of publication than if you were considering a paper as a whole, because I entirely agree that a paper is just far too big a thing to rate. I suppose you could have lots and lots of different criteria, but there's so much going on in a paper, whilst these smaller units of publication are much more reasonable to define predefined rating criteria for. But as Tim was saying, it's really important that we get those right, which is why we haven't rushed into putting them on the site straight off. If we create the wrong metrics, then we're going to completely undermine our whole purpose of trying to reset the incentive systems. So I think it's really important that we get the right criteria for each publication type, bearing in mind all the different domains that might be using the system, and that's why we're really keen to hear from people on what they think those criteria should be. We've got some drafts, so if you're interested in being part of that conversation, I'm sure Tim is going to tell you how to do that. Yeah, exactly, and thanks for segueing me in there very smoothly. There are lots of conversations going on about how we introduce various features at the moment, in particular these quality metrics. So if you would like to get involved in those conversations, we'd love to hear from you, whether that means just popping something in the chat now (we can open up to Q&A in just a moment), or alternatively, we have a link to join a community where we have regular catch-ups to discuss what's coming up with the platform, how we might like to implement certain things, and to have slightly wider discussions about how to promote best practice in the research community, with Octopus as the framework for doing that, I suppose is the best way to put it.
So let's see if I've got my link here somewhere. I should have it. I'll paste this in the chat now, if you did fancy getting involved with those discussions. I think we have our next catch-up in about two weeks, and we'd love to see you there. I have some questions appearing on the Miro board, so I can segue into those. So one of them is about anonymity, basically, which is a really big topic, and the concern is particularly about anonymity when it comes to raising a red flag. It's a question that's often raised when we're talking about peer review as well. So nothing on the Octopus platform is anonymous. You have to log in, and everything that you do is logged against your profile. Tim, perhaps you'd like to go to a profile page to share. So one of the things that we're hoping is that transparency, and the fact that everything is linked to a profile, will decrease the possibility of people acting unprofessionally; I guess that the transparency will encourage a greater degree of responsibility on the platform. That may be a naive hope, and it may be that people will create alternative accounts that are essentially anonymous and then use those for poor practice. But I do have some faith in the fact that cultures can change, and that the research culture can change from the one at the moment where people sometimes feel unable to critique people who are hierarchically above them. Octopus is designed to try and break down these hierarchies. I think hierarchies can arise where there's such an imbalance of power, because you've got such a narrow bottleneck towards publication, which is a step towards promotion. By increasing the meritocracy and breaking down barriers to the availability of some of these things, publication being the main one, of getting your work out there and getting it assessed.
I hope that we can actually flatten hierarchies, because there are not so many gatekeepers now. So I hope that these problems will decrease in time, but at the moment there isn't a way of doing anonymous peer review or anonymous red flagging. If you had a serious concern that you wanted to raise anonymously, there are of course research integrity pathways to do that in many countries already, and I hope that people would avail themselves of those. And then perhaps a research integrity officer would be able to raise the red flag on behalf of an anonymous whistleblower. So that's, yeah, that's my thoughts on anonymity, but I'm really interested to hear other people's thoughts on it. And the other question that's been raised is whether concerns that arise in red flags can also arise during peer review. Yes, of course. Peer reviews are open, so something that you wanted to raise as a red flag you could also echo in a peer review publication, and I expect that would be very relevant to raise in a peer review. And even then, hopefully the author who has had that critique would then reversion and say, you know, oh yeah, absolutely, I totally hadn't noticed this statistical mistake I've made, or something, and I've now corrected it in this new version. I mean, hopefully that's how critique could work. So I suppose at this point we've been through some of the major features of Octopus and the major things it's trying to achieve. It would probably be interesting if anyone did have any burning questions they'd like to go through, just to have a look at those. Whether that's putting them down on the Miro board here, or whether you've got anything you'd just like to pop in the chat, or unmute yourself and say hello, whichever way works for everyone. So we've got an idea arriving, which I'm very, very slowly zooming in on.
I'm still sharing my screen, but I can't quite read that one without diving straight in there. Okay, we'll take a look at it afterwards; that looks useful. All right, well, it sounds like we don't have too many burning questions, which is quite all right. You can always feel free to get in touch with us, or if you join that user community link that I put in the chat there, you'll be able to ping as many questions to us as you like as we continue to grow and expand the platform. And in the meantime, if you did fancy it, as I showed you, it's a fairly quick process to get something out there on Octopus. So if you did have a research problem, for instance, sitting there in a file drawer that you may not be going to get around to, Octopus is a great place to just throw it on there, get a DOI on it, and see if anything comes from it. Yes, maybe "throw" is not the best word there; to carefully apply it, perhaps. Yes, I think if you are thinking this is an interesting platform, you're interested in open research, and you want to try it out, there are a couple of things that certainly I've been trying it out for. One is research problems. Quite often we're not precious about sharing the problem that we're working on, especially a higher-level problem. Perhaps you can bag yourself the "can we cure cancer" problem, something like that. Other than that, also, as Tim mentioned, those sort of file drawer issues: I personally have some small data sets that I'm never going to publish in a journal, and I should be getting those out on Octopus when I can find the time. I mention time; in fact, it's a lot quicker to write an Octopus publication than a paper, because they're so small, so it's a good kind of incremental thing where you can find a couple of hours and write one.
And then finally, the other thing you can do, which I have also done, is if you've published open access somewhere already, you can take that content and sort of post-print it on Octopus. So if you have the copyright to it, you can put it on Octopus and share it in the Octopus structure as well, which I think is quite a good way of exploring what the different publications look and feel like when writing a publication chain, as you do in Octopus. So those are some ideas of things that you can do now, and it would be great if people want to try it and give us feedback. And yeah, keep adding things to the Miro board, get in touch with any of us, join the community. Yeah, help us try and reform scholarly publishing. I should also say that you can of course use it like a preprint server. We've been talking to several journals who are all very happy to accept material in journal article form that's been previously published on Octopus; nobody that we've talked to has had a problem with our doing that, as they consider it just to be like a preprint. Well, I think that's about everything. So yeah, as I said, feel free to join the user community; we hope to see you there if you did want to get involved a bit further. And I think we'll wrap it up there. Yeah, thank you very much everyone.