Shall we wait another minute before we start? Yeah, OK, I think we should start now, Marco. What do you say? Yes, since we have two speakers. OK, good. Wonderful. I see there are still people joining us, so let's start with the introduction. Good morning, good afternoon, good evening, everybody. Welcome to our open science seminar, jointly organized by ICTP's STI Unit and the Marie Curie Library. We're very happy to have you here, and we again have a very interesting topic today: we're going to talk about the pitfalls of open science and propose possible solutions. We have two speakers today. The first speaker is Dr. Tony Ross-Hellauer. He is the leader of the Open and Reproducible Research Group at Graz University of Technology and the Know-Center in Graz, Austria. He has a background in information science and philosophy, and his research focuses on a range of issues related to open science: evaluation, skills, policy, governance, monitoring, and infrastructure. So welcome, Tony, we're looking forward to your presentation. Our second speaker is Professor Jean-Sébastien Caux. He's Professor of Low-Dimensional Quantum Condensed Matter at the University of Amsterdam. He is a strong believer in openness in scientific publishing and is devoted to building new infrastructure to facilitate it. He's also the founder, chairman, and lead developer of SciPost. We're very happy to have you here. Jean-Sébastien, I hope that I pronounced the name correctly; forgive me if I didn't. Very good, thank you. And you're going to talk about SciPost. So we're very much looking forward to your talks, and we'll have a question and answer session afterwards. For everybody: if you have questions, please feel free to post them in the chat already, and we'll come back to them. OK, I hope I haven't forgotten anything. Marco, is there anything you want to add? Wonderful.
OK, so then let's start with Tony. Tony, the floor is yours. Can you see my slides? It should be full screen in a second, hopefully. Wonderful. We can see them, but not yet full screen. Yes, now it is. Perfect, thank you. OK. So firstly, thank you to the organizers, Marco, and ICTP for the kind invitation to speak to you today. I saw from the list of speakers that you've had and will have that this looks like a fantastic series of events, and obviously I'm sorry that I can't be with you there in beautiful Trieste. Can I just double check: can you see my full screen, or is part of it grayed out? No? Yes, it works. Excellent, thank you. OK, so today I will talk to you mainly about a project we just recently completed called ON-MERRIT, which ran from October 2019 to March 2022. It was funded under the European Commission's Science with and for Society programme. We used sociological, bibliometric, and computational approaches to study the way that open science impacts equity, and especially the unintended, negative consequences it may have for equity within the scientific system. As I'll explain, I, and I think a lot of others, have always seen equity as really one of the key aims of open science. There are many aims, obviously, and this is maybe part of the problem that we can talk about, but I've always seen equity as a very key aim of open science. A few years ago, we noticed that there were some areas where the way open science was being implemented might be problematic in terms of equity, and this was the basis of our project, which I'll describe to you today. So in a seminar series on open science, I guess I don't need to tell you what open science is, but just to say it's a bunch of different things.
So making publications open access is obviously a very different set of practices, with very different concerns and problems, than making software open source, or opening methods, protocols, and materials, or making data either open or at least findable, accessible, interoperable, and reusable. Open science is a kind of boundary term. Some people would even include elements of citizen science, which seek to bring down the barriers between the academy, the ivory tower, and wider societal actors. And there are a whole lot of ways that the peer review process can be opened up, and that the processes of researcher and research evaluation can be opened. So open science is a bunch of different practices, but it's also many, many different principles or aims which underlie these, I think. Here is a slide which I still use; it's very old now. In 2015, I think, which is really when I joined the open science community and became an advocate for it, I just asked on Twitter: what do you think the principles of open science are? And this is a list of all the different answers that I got. It's a lot of different, very good things, obviously: transparency, accountability, inclusivity, responsibility, community, collaboration, visibility, rigour, equality, and science for the public good as well. You see already that there are a lot of concepts which relate to diversity, equity, inclusion, and so on. The first thing I want to say is that I've always seen these, and maybe this is a point that we could discuss, as key goals of open science. If we look back to the Budapest Open Access Initiative, which just had its 20th anniversary, we see very utopian language; it was still affected quite a lot by the utopianism that accompanied the early stages of the web.
We see language that open access, here meaning making publications open access, would enable us to "share learning between rich and poor" and "lay the foundation for uniting humanity in a common intellectual conversation and quest for knowledge". Michael Nielsen's really seminal book, Reinventing Discovery, has a whole chapter about democratization and how open science and new web-enabled ways of doing science can democratize processes. In a recent stakeholder study, increased equity was listed as a key success factor for open science, a key factor by which the success of open science should be judged. And here's another paper from 2020, which says that open science principles of openness and transparency "provide opportunities to advance diversity, justice and sustainability by promoting diverse, just and sustainable outcomes". So equity is a key aim of open science, I argue, or it has been for many, many people at least, but it's one aim amongst many others, such as increasing transparency in the research system. If you ask funders like the European Commission, you would see the word efficiency popping up, and very often they mean return on investment for funding. What I want to point out is that many of these principles might be at odds with each other. If you want more efficient science, or you're trying to fund excellence in science specifically, then this might have implications for the distribution of resources within the system. So open science is a bunch of different practices and principles, and the definition of open science is being pushed by various groups in various ways. Whose agenda is at play in open science? Is it researchers', from lots of different disciplines? In physics, you maybe have a very different idea of what open science is or should be than in the social sciences and humanities, and in different parts of the world we'll have very different conceptions as well.
I think we definitely do. From the vantage point of research funders, as I said, efficiency often comes through as very key, and open science was sold, at least initially with open data policy and so on, on the idea that data is the new oil: research data is this unexploited resource that we can open up, and it will fire up the economy and so on. All these kinds of promises were made for what open science was and could do. Then there is the perspective of research institutions, and of course the perspective of the traditional and also the newer publishers. The traditional publishers could maybe be accused of co-opting the language of open science, of open-washing: first they tried to ignore it, then they tried to degrade it, and then they've tried to co-opt it, in my view. How do all these different agendas for what open science is shape the open science reality that we are constructing, and still are constructing? Open access is very well progressed, obviously, and for opening data and making data FAIR there's lots and lots in place already, but open science is far from a done deal. The next point I want to make is that the uptake of open science is not free. To do open science, you need a lot of infrastructure, which relies on resources. You need a lot of training and support, in terms of materials but also in terms of people and capacity within institutions to support people. And you need a lot of political will, from leaders of institutions or governments and so on. And access to those advantages, obviously, isn't equally distributed. If we think about what we said about researchers and the many different agendas at play from people in different disciplines or different regions, and then we think across those disciplines and across those regions, access to these kinds of resources is obviously not equally distributed.
Just to say where we're starting from: academia isn't equal. There are structural inequalities that persist across regions and demographics. Global North research still dominates, pushing Global South research to the periphery. Even within the richer regions, the goal of excellence has tended to further cluster funding towards the already well-funded. Women, of course, still occupy relatively fewer higher positions and achieve senior positions at later ages. I started out in philosophy and so come from the humanities, and I think it is definitely true that, because of the economic advantages they bring, the STEM subjects, science, technology, engineering, and maths, are privileged over the social sciences and humanities. So academia itself has structural inequalities. And within science, there is also a mechanism of cumulative advantage, which seems to mean that those who already have are the ones who are most rewarded. The sociologist Robert Merton in the 1960s proposed the Matthew effect in science, named after a passage from the Gospel of Matthew: "For to everyone who has, more will be given, and he will have abundance; but from him who has not, even what he has will be taken away." The basic point that Merton saw was that already successful scientists tend to receive disproportionately high rewards in comparison to their counterparts. If you already won a prize very early in your career, you're much more likely to win the Nobel Prize. A lot of this came from interviews with Nobel scientists, but it also holds in terms of grant funding: there have been studies which show that if people get one grant, they're more likely to get the next grant, and so on. And in fact, this effect of cumulative advantage is at play throughout academia.
Journals, for instance: because some have higher impact factors, researchers are clamouring to get in there, and this has its own effect of cumulative advantage. The same at the level of institutions, obviously: the rich institutions, we know, are a lot better supported and have a lot more capacity to hire. So it's at play at the level of institutions, departments, and countries, and at the level of individual attributes. In terms of gender, there's a paper on the "Matilda effect", which studies the effects of gender on cumulative advantage, and in terms of race as well, there's a paper about the "African Eve effect", studying the effects of race. And it's also at play across a range of scientific activities. In terms of citations: if you're already very highly cited, you will just get more citations, simply because you're already visible; academia is an attention economy, we know. This effect of cumulative advantage is also at work in peer review, in public engagement, and in funding acquisition. At the end, I have a bibliography with links to papers which describe these effects. So, bringing it all together: open science isn't a unified ideology, but a diverse bunch of principles and practices. Equity is often stated as a core aim, but just because things are open doesn't mean equity is ensured. Factors like region, gender, discipline, and access to resources continue to shape the possibilities of participation in an open science world. And perhaps most crucially, there are various routes to the implementation of what we call open science, and the how is crucially important, along with the why: what are the principles, the main things we are trying to achieve with open science? So although I'm not talking about the pitfalls of open science per se, I'm talking about the pitfalls of open science done in the wrong way, for the wrong reasons.
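The cumulative-advantage dynamic just described can be illustrated with a toy simulation. This is not from the ON-MERRIT project; it's a minimal sketch of a preferential-attachment ("rich get richer") model, with all function names and numbers purely illustrative:

```python
import random

def simulate_citations(n_papers=100, n_citations=5000, seed=42):
    """Toy Matthew-effect model: each new citation goes to paper i with
    probability proportional to (citations of i so far + 1)."""
    rng = random.Random(seed)
    counts = [0] * n_papers
    for step in range(n_citations):
        total = step + n_papers          # sum(counts) + 1 per paper, kept incrementally
        r = rng.uniform(0, total)
        acc = 0.0
        for i, c in enumerate(counts):   # weighted pick over current counts
            acc += c + 1
            if r <= acc:
                counts[i] += 1
                break
    return counts

counts = sorted(simulate_citations(), reverse=True)
# Share of all citations captured by the top 10% of papers; a purely
# uniform allocation would put this near 0.10.
top_decile_share = sum(counts[:10]) / sum(counts)
```

Even though every paper starts identical, the feedback loop concentrates citations in a small fraction of papers, which is the qualitative point of the Matthew effect.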
So the question that ON-MERRIT started with was: might open science be at risk, in some cases, of reinforcing existing privileges or creating new ones? The first output we had was a paper published at the start of this year in Royal Society Open Science, which was a review of the literature. We reviewed 268 relevant studies, both scientific and policy work, for the dynamics of cumulative advantage or structures of inequality that could be at work in the transition to open science, across disciplines, regions, and demographics. And here we found a lot of different concerns. On the right, I just put the table to show you it was a lot; I don't expect you to read it. On the left is a summation. The threats are: the costs of participation in open science; the fact that there are different political agendas at play; the main point that I'll move on to next, which is the discriminatory nature of the article processing charge business model; the cumulative nature of data inequalities, which means that making data open isn't enough to level the playing field, because you still need certain data skills to be able to take advantage of that open data; the platform logic of open science, which is leading to an accumulation and growing-together of services all along the research workflow, especially by the traditional publishers, along with the surveillance capitalism element, where they're able to extract analytics from these services and then sell them back to us; the lack of reward structures; the difficulties in the logics of participation; the exclusion of certain societal voices; and the resource-intensive nature of translational work. Just briefly on that last point: we did a lot of work with policymakers to see what impact open access was having.
And the fact was that its impact was still marginal, because even if the papers are open, policymakers aren't going to read them, and even their assistants aren't. They're going to wait until they have a policy brief, written in their language, digestible, and delivered to them. And writing good policy briefs is also a resource-intensive activity. So open science improves the practice of research, but not automatically, and not without new risks for inequality and other adverse effects; we must not be naive. These issues all arise as a result of the ambiguity of open science as a term and the politics behind it, the fact that it is resource-intensive and transitioning towards it needs money, and certain network effects of cumulative advantage that are at play. There is an argument that open science ignores other ways of knowing, especially from the humanities or the social sciences, so narrow epistemologies, and there is an argument that there is a logic of neoliberalism at work; I won't go into that now, but I'm definitely happy to discuss it afterwards. So within ON-MERRIT, we sought to look at these questions. Within academia, we looked especially at the effects of barriers to accessing literature, the effects of open science and responsible research and innovation practices on career progression and promotion policies, and the dynamics of cumulative advantage in training. We also looked at industry: the uptake of open science resources there, and the drivers of and barriers to that uptake, including through a scan of the European patent literature. And within policy, which is the part of the work I just referred to, we looked at the uptake of open science resources amongst policymakers, the drivers and barriers, and reflected on the ways in which open science breaks down barriers to participation in both research and participatory policymaking.
The project is now finished, and we have a lot of results which we are in the process of preparing for publication, which hasn't gone quite as fast as we thought, but it never does. On our results page, you'll find all of these studies. One example, covered especially in deliverable 3.2, "Cumulative advantage in open science and RRI: a large-scale quantitative study": in what remains of my time today, I want to discuss this particular question of the ways in which cumulative advantage might be at work in the transition to open access publishing, and especially the effects that the model of open access publishing based on charging authors or their institutions article processing charges might already be having. I should say this builds on work which predated our project: a paper by Kyle Siler and colleagues, looking at global health research. Their results found that authors affiliated with high-ranked universities and well-funded institutions tend to have more resources with which to choose paid publishing options. Their research suggested that new professional hierarchies were developing in publishing, with different open access publishing options prominent for different groups. Just as there is stratification in institutional representation, there is also inequality within access types. So, moving on to the article processing charge model: just to be clear, not all open access is funded by article processing charges. In fact, most fully open access journals don't charge article processing charges, but the highest-throughput journals usually do. And this, along with the fact that a lot of the traditional publishers have now moved to a hybrid option, where you can either publish closed access for free or pay to make your article open access, has consequences.
This means that there is an increasingly big market for articles funded by article processing charges, and these charges are increasing. At the moment, I think the average is between 1,500 and 2,000 euros, but we've seen the bar being raised: Nature Communications charges around 5,000, and with various deals that Nature has been doing, there seems to be a kind of normalization that to publish in Nature, the main journal, the fee would probably be about 10,000 euros. Just for contrast, I have a project with colleagues in Ukraine, and they tell me this is about three to four times the monthly salary of a professor there. Another paper which recently came out looked at this effect and found the same thing. This paper was different in that it looked at mirror journals. When some funders said they wouldn't pay for hybrid, what some clever publishers did was create a new journal which looked exactly like the existing journal and put the article-processing-charge papers in this other journal, called the mirror journal. But this meant that there was a very good, if not easy, comparison between who is publishing in the closed journal and who is publishing in the open journal, because it's effectively the same journal. You don't have to worry about the authors being from different disciplines, or the journals having different impact factors: it's the same journal. And they found that APC-based open access was discriminating against those with limited resources, especially those from less-resourced regions and institutions. Leading on to our own research, which will hopefully be published soon: we found exactly the same. Those from more prestigious institutions tended to publish in journals with higher APCs. Here I think I'll move a bit quicker. And there are stratification effects which seem to be getting worse: this gap seems to be increasing over time as well.
Based on this, in the final stage of our project, and we didn't only look at this issue, obviously, we looked at many others, we worked with funders, research institutions, and researchers to co-create a set of recommendations, which we've now published and which you can find at this link. From our whole project and all the issues we looked at, we identified four priority areas for action: the fact that open research can be so resource-intensive, requiring a lot of infrastructure and support services and people; article processing charges and the stratification effects we just discussed; the problems of societal inclusion in research, especially in policy-relevant research; and the need to reform reward and recognition structures, because this is really a barrier to a lot of uptake of open science. I think I was meant to talk for between 20 and 25 minutes, and I'm almost there, so let me finish with the recommendations we co-created. Funders, institutions, and researchers should collectively demand greater transparency from publishers on publication costs, regarding prices and services, and where possible support open infrastructures to collect that information. The more we know about what publishing actually costs, the less we'll feel like we're being gouged by what can seem like very high charges. Funders, institutions, and researchers should support alternative publishing models, and we're about to hear about an excellent one, where those models show potential to be more inclusive, including consortium funding models for open publishing infrastructures which support open access publishing with no author-facing charges. There are various models for what's called diamond open access, where there's no barrier to readership and no barrier to authorship.
And there is plenty of money in the system, obviously, and we must find a way of sustainably redistributing that money to sustain open publishing services. Funders, institutions, and researchers should encourage and support the use and maintenance of sustainable, shared, and open source publishing infrastructure to reduce costs and promote open standards. Institutions and researchers should ensure the accepted version or later of peer-reviewed works is always deposited in an open repository. The problem with open access that I've discussed is really a problem with what's called gold open access publishing; we still have the option to deposit our publications in repositories, and we should always take it. That can be supported by funders and institutions supporting authors' rights to self-archive their publications by implementing what are called rights retention strategies, where you write into your contract with the publisher that you retain the right to deposit your publication. I will finish there. There were a couple more slides about rewards and recognition, but I'll share the slides, and they can be distributed to everybody. Thank you. Thank you, Tony, for your presentation, and also thank you for sharing the slides. I would say we move on to Jean-Sébastien and your talk. I think this fits perfectly, since we've been talking about diamond open access, and SciPost, as Tony has already mentioned, is a very good example. So the floor is yours. Okay, very good. So thanks a lot for having me. I'll try to squeeze the talk a bit so we still fit in the time, since some of the things that I wanted to say have already been said anyway. Let me introduce you to SciPost. If you haven't seen it before, it's a kind of concretization of many ideas about open science that I wanted to implement a few years ago. So what's the plan of what I want to tell you today?
I want to take about five minutes to introduce you to SciPost, if you're not already familiar with it. I'm going to give you some examples of the best practices that the initiative tries to implement. I want to talk a little bit about the machines behind it; you know, I'm a scientist, I like machines, and for me the whole point of this thing is indeed to empower scientists and academics by giving them the tools they need to cover all their publishing needs. I want to underline some of the difficulties that we've faced and are continuously facing, and maybe also finish with a couple of lessons from the experience, certainly about the general area of diamond OA that Tony has already talked about. Okay, so what is SciPost? It's quite simply a complete publishing system. It takes care of absolutely all the different layers that you might need when you're thinking about scientific publications: from preprints (we have our own preprint server) to metadata deposition, curation, and interlinking. Who runs it? It's very much a grassroots initiative, entirely created and run by people within academia. And what does it offer? In a first iteration, it's really the academic journals that we have on there, although there are additional services to provide additional value on top of these things. If I had to summarize it in two keywords, it's really about openness and quality: the idea is that you leverage openness at lots of different levels in order to increase transparency, verifiability, and overall quality. Okay, so that's where it comes from. And what does it aim to achieve? Well, you know, when you've been in academia for long enough, and certainly when you're a scientist, it's your job to think of new things.
So we want a complete reform at all levels, or at least to offer some things that don't necessarily exist in the current landscape. We want to implement something which I personally call genuine open access, which is, if you want, the moral extrapolation of all the values that you might want to express within an open model. On the other side, we also want to greatly clean up the business model associated with publishing, because, as Tony has already mentioned, it's been somewhat co-opted by a corporate side that has very different priorities and very different long-term goals. We also want to modernize the editorial processes a little. It started in physics, and essentially the big thing that we added was the idea of open refereeing: to make it possible to have meaningful discussions based on referee reports that could really contain lots of things that other scientists could use. So that's one thing. And in the longer term, we also want to reform impact assessment, which is something I'll come back to near the end of the talk. Just a couple of comments about the activities. We have a certain number of journals; I'm just mentioning those in physics here, because these are the ones that are really up and running. We started publishing in late 2016 and have about 1,200 publications, currently at a tempo of about 400 or 500 papers per year. As for the best practices you can associate with SciPost: certainly if you're a physicist, you're not used to open refereeing, so we have an editorial process that I'll very briefly sketch. I want to spend some time talking about the consortial business model, because I do think it's an important direction for institutions and funders to support in the future, even if you're measuring that need purely on pecuniary criteria. And I want to emphasize that SciPost is also very much a community-driven thing.
So it's really meant to belong to, and stay in the hands of, academia. I want to talk again about the machines, the infrastructure, to really emphasize its openness and what that means for its sustainability. So, taking a few seconds on the editorial workflow: what are the most important aspects? Submission is performed as for other journals. Submissions come in, and they're checked for lots of things, like plagiarism, and pass internal checks. But then things start being different. The people driving our editorial processes are so-called Fellows of our editorial colleges. Benefits of an Oxford education, I guess: I've always liked the idea of a college. It's very collegial, and that distinguishes it a little from the closed-door, single-editor, very often non-professional-academic decision-making in other places. One of the Fellows of the college has to express interest in running the refereeing process on an incoming submission, so it is possible to be desk-rejected if you fail to attract the attention of the Fellows. The Fellows swim through the pool of submissions and pick out the ones that they think are really worth going through; the others, although they're a minority, are desk-rejected. The editorial process starts when such a Fellow takes charge of the submission and opens a refereeing round. The refereeing round consists of refereeing invitations sent to members of the community, but it's also possible to volunteer reports on submissions undergoing evaluation. You just have to be a registered contributor at SciPost, which means that we've checked that you are actually a working academic; that then empowers you to submit a report on any submission, even if you haven't been explicitly invited to provide one. That doesn't happen in most cases, but it's something we want to instill a little more as a practice as we go along.
Then, very importantly, publication decisions are not taken by single editors. The editor-in-charge's role is limited to formulating a recommendation as to what should happen to the submission, and the editorial college as a whole then takes a vote to determine the further process. The result can be publication, rejection, a return to refereeing, et cetera. So that's a little bit how it goes. There are lots of different ways to verify things as you go along; there's a lot of openness in the discussions and in the availability of the material built up during the evaluation phase. That's the editorial workflow we use in all our journals. Now, if I switch to the business model: the keyword that's been adopted is "diamond", which I think is really unfortunate, because again it packs lots of different things into one word. I personally much prefer metals in their variety. So how does SciPost characterize itself? First of all, there's no question of having any subscription fees or article processing charges. I have a total abhorrence of the idea of article processing charges; I will discuss that a bit later, and I think it's becoming more and more manifest to everybody what the pernicious effects of this APC-based business model are. So SciPost is a platinum publisher: no APCs, no subscriptions, no author-facing charges. All the operations are fully not-for-profit, so there's no corporate entity in the background that tries to skim off a 40% profit margin from the activities of the scientists. This is entirely open, entirely not-for-profit, based on an official legal foundation in the Netherlands. Very importantly, although we run a very stringent refereeing protocol and also a very stringent production protocol on the papers, we are able to operate at a cost much lower than the current habits for APCs, which now really average something well above 2,000.
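To pull together the editorial stages from the last couple of minutes (checks, a Fellow taking charge, a refereeing round, then a college-wide vote), here is a toy sketch. This is not SciPost's actual code: the function names and the simple plurality tallying rule are illustrative assumptions.

```python
from collections import Counter
from enum import Enum

class Outcome(Enum):
    PUBLISH = "publish"
    REJECT = "reject"
    RETURN_TO_REFEREEING = "return to refereeing"

def college_vote(votes):
    """The editorial college as a whole votes on the editor-in-charge's
    recommendation; plurality tallying here is an illustrative assumption."""
    return Counter(votes).most_common(1)[0][0]

def process_submission(passes_checks, fellow_takes_charge, reports, votes):
    """Walk a submission through the stages described in the talk."""
    if not passes_checks:            # plagiarism and internal checks
        return Outcome.REJECT
    if not fellow_takes_charge:      # no Fellow expresses interest:
        return Outcome.REJECT        # the submission is desk-rejected
    if not reports:                  # a refereeing round needs reports
        return Outcome.RETURN_TO_REFEREEING
    return college_vote(votes)       # college-wide vote decides
```

For example, a submission that passes the checks, attracts a Fellow, collects reports, and wins a majority of PUBLISH votes comes out as `Outcome.PUBLISH`, while one that no Fellow picks up is desk-rejected regardless of anything else.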
It's become the norm for certain journals to have 2,500, even 3,000, 5,000, 10,000. And it's only going up. So we believe we can bring that down by, say, half an order of magnitude as compared to the current model. The idea is really a consortial model, which means that institutions worldwide throw money in the pot for us to run the operations. If you're familiar with the arXiv preprint server, that's essentially what it's based on. So there's no invoicing of publishing charges for every single publication; we just aggregate the data and give it to the institutions so they can determine the level of support they want to give us. And I'll give you an example with the ICTP later on, if I get to it. Okay, so another important aspect is that it's very much community-driven. We have these virtual general meetings with our Colleges. The College in Physics now has about 160 Fellows active in it, from all different time zones. SciPost really views itself as an entirely international initiative, although, funnily enough, the Dutch press really likes to insist that it's a Dutch initiative; I like to remind them that I'm actually not Dutch, I'm still French-Canadian. So it's really an international initiative. It doesn't aim to have geographical localization in any way, although having activities throughout the world is really quite a challenge, but that's what we're working towards. Okay, so the infrastructure itself was actually built from scratch. When I started looking at implementing this thing, I looked at the available systems and none of them were able to run the editorial processes that I was interested in. So, you know, tough luck: I just taught myself web programming and all these things. I was already using computers on the side to do part of my work, so it was actually a pleasant experience to learn all this new technology. All our systems are built from scratch based on existing free and open-source software.
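(As a back-of-the-envelope check of that "half an order of magnitude" claim; the numbers below are just the round figures from the talk, not actual accounting.)

```python
import math

# Rough APC average quoted in the talk.
typical_apc = 2000

# "Half an order of magnitude" means a factor of 10**0.5, about 3.16.
reduction = math.sqrt(10)

per_paper = typical_apc / reduction
print(f"reduction factor: {reduction:.2f}")        # 3.16
print(f"implied per-paper cost: {per_paper:.0f}")  # 632
```

That implied cost of roughly 630 is consistent with the average expenditure of about 600 per paper mentioned later in the discussion.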
We like to be a bit on the edge of developments as well. So if you know a little bit about that, we can have an interesting discussion about single-page applications, JavaScript-driven, or this newer technology, HTMX, which I'm now trying to champion, but that's really for the nerds among you. And all these systems that we're using are really meant, through their openness, to be sustainable, to be perpetual. So there's no hidden stuff that can suddenly disappear. These are more or less standard technologies that are used all over the place, and you just need to maintain them to ensure their survivability. If you look at the kind of systems that we have, the main elephant in the room is of course the system at scipost.org, and we have a number of systems associated with that to enable all the daily operations. Again, if you're interested in these things, you can contact me and I'll tell you about it. We've also got machines for external machines to query the data, so we're trying to provide some information to those metadata lovers out there. I'm personally very keen on metadata: I believe that having a proper metadata system really empowers a lot of developments, so I'm looking a lot into that. All the technology behind SciPost is made openly accessible on our Git server at git.scipost.org. There you will find the actual codebase that is running on scipost.org, so you can look at what drives the website that you see when you visit SciPost. There's some documentation associated with that which we still have to build out a bit more, but it gives developers a bit of an idea of how it's built and where it's going. We also have a Discourse server where some of the more public discussions take place, about strategic decisions, say, or information we want to get from the community.
These are all part of the constellation of systems that form SciPost as an initiative. Okay, so what are the main difficulties that we are facing? Of course, there are lots of little day-to-day difficulties, referees not answering emails and that sort of thing, but those are, how should I say, normal and ubiquitous in the industry. I think for new initiatives there are two things of real importance, especially for one like SciPost that tries to implement a different business model. The first difficulty is the question of recognition and rewards. So, the impact factor; let me start with that. The impact factor I blame for being, without any doubt, the deadliest poison against innovation in publishing. There's no point mincing words on this: this piece of information, compiled in that particular way and run by these closed, unverifiable systems, is essentially a brake, a total showstopper for new initiatives. It will take you at the very minimum four and a half years to be included in the indices that you want, and that's if everything goes absolutely fine. And this to me is absolutely unacceptable, because quite frankly, and I've done that test, if I take the least qualified person to judge scientific quality at SciPost, say our youngest, most inexperienced Fellow, I can still vouch that this person is more qualified to judge scientific quality and impact than the most qualified person at the company providing me with the impact factor. Yes, there's a certain value in counting beans. However, the tragedy of the impact factor is in the braking that it does, the slowing down of possible developments: you have to run an operation blindly for a number of years before you can actually get your recognition. So the way I like to express it to the scientists is that, indeed, when I started SciPost, I was asking my colleagues to jump out of the airplane without a parachute.
Yes, indeed, their papers would not be listed, their papers would not be recognized for a while. However, I would provide them with a parachute before they hit the ground, because there was a certain amount of time for this. But it would be a much better world if we just got rid of that thing and used a much more diverse set of metrics instead. It's not that I don't like the impact factor itself as a quantitative measure; I'm a scientist, I'm able to see the value of a reproducible measure of things. The problem is that it's become so ubiquitous and that it's braking new developments. So anyway, that was a big problem. The second big problem, of course, is that this is the stupidest business model that you can possibly think of. I can't exactly go to the bank with SciPost's business model and ask for five or ten million in funding in order to grow. All the growth in the initiative has to come from essentially gratuitous work on the side by people who want to make it grow. And then once it's there, then you can expect maybe some sustainable funding. The problem also is that institutions worldwide are not used to dealing with systems other than the APC-based system or the old subscription model. All the efforts that are currently coming from on high are in the end only facilitating these APC-based systems, leaving only crumbs for diamond initiatives in the current landscape. Now, there are lots of knowledgeable people out there who are already very sympathetic to diamond and whatnot; those will give you some funding, and that's great. But the prospects for growth there are severely limited by what is currently happening; we'll come back to that a little bit later. That said, we have managed to obtain a large number of funders for SciPost: about 90 institutions worldwide now provide us with a certain amount of funding.
I'm very pleased, for example, to have recently welcomed the ICTP itself as one of our funders, and with an amount that almost exactly covers all the expenditures we've made on its behalf, as published on the site. So this was a top-marks exercise from the ICTP, and we're very thankful for it. OK, so what are the lessons from the SciPost experience? When you think about the reform of publishing, at least when you're starting out, you think you're just going to build the system and within a couple of years it's going to run at scale. I had given myself three years to really make this thing float and scale up by a couple of orders of magnitude. We're now six years down the line, and I haven't reached two orders of magnitude, just one. So it's too slow. More importantly, I think the current landscape is misdirected, and what I mean by this is that the incumbents are being favored by the current style of negotiations. The emphasis on transformative agreements is, I think, almost fatal to the diamond landscape; the effect it has is extremely pernicious. I really think this was the wrong way to go about it. There was an opportunity to force reform, but it was squandered by these watered-down choices made from on high. It's very interesting, because I've got many friends working in business, and when I tell them about the developments here, they say this world is completely crazy, because it feels like it's not the client dictating the terms, it's not the client making the choices. It just doesn't work; it doesn't add up; it doesn't make sense. So perhaps there's a correlation there, that people end up in academia through a lack of sufficient business savvy. Because what I observe in all the happenings of the last few years is that there has been a very dramatic exploitation of the academic side by the other side.
It did not need to be like this. And wouldn't it be nice if it changed? I'll be happy when I see change, but I haven't seen it yet. I think Tony also mentioned these really "nice" things that are currently happening as well, like the surveillance technology that's being installed in lots of these systems. You'll hear lots of denials about it, but I've done quite a lot with hacking and information and whatnot, and what you see in there, I can tell you, it's not good. It's not going the right way. Going on with lessons: what I'd like to tell funders and institutions is that if you do want diamond to scale up to its potential, you really need to consolidate the way you support it. You need to make sure that the funding is sustainable, so that long-term development plans can be implemented. And please stop facilitating APCs. Stop saying that from a certain point onwards your scientists will be able to publish "for free" in those and those and those journals because you have spent two years negotiating with them. We don't put out such press releases at SciPost, because institutions don't need to negotiate with us for two years. One has to be extremely careful here not to suffocate the potential of diamond. So really try to invest in diamond, because I think the returns would be quite substantial. And when you do talk to the corporate representatives of these organizations, you do have to tune up your bullshit filter, because a lot of the stuff that you read is factually, demonstrably incorrect. However, it gets pushed through because of the additional ease, or perhaps because it corresponds with the habits you already have in place. So, you know, just a bit more business savvy here, please. My final thought, to summarize a little bit how I feel about all of this: if you think about open access publishing, it's a bit like Linux.
And what do I mean by that? Well, quite frankly, there's a reason why all the big mainframe servers out there run on Linux or those kinds of things. It's a superior technology, a tweakable technology, a much more powerful technology than many competing things. However, Linux is a completely balkanized, disorganized community of people doing things in different ways based on different tools, competing against these huge corporations that have other things than, say, your computer's performance at heart. There's a lot of customer lock-in in those systems, it's a fragmented community, and unfortunately it's very difficult to push adoption beyond a minority. Still, that's where I stand. I was using Apple for my computers for many, many years, and at some point I said, I'll throw this all out. And indeed, leaving all these non-open systems is a bit like leaving a multicolored candy superstore and setting out into the Canadian wilderness. So yes, indeed, in your canoe you have to paddle a bit harder, you don't wash so often, and there are mosquitoes. However, the views are much nicer. So it's a good thing to try to go for. That's it for the talk; I'll just open it up for questions. That's the team behind SciPost: good people like Paola Perez, Sergio Tapiazate, Yonvin and Vainan, Aisler Huda and George Casigas, who run all the day-to-day things, while I take care of the technological aspects and a lot of the political things. Joost van Mameren and Jasper van Wezel are the other two members of the SciPost Foundation itself. And with this, I will thank you for your attention and give control back to the chair. Okay, super. Thank you very much for the great talks. We do have some questions, so let's start with a question for Tony: someone asked what you mean by training.
So it was one of the needs, right, that you pointed out in one of your last slides? Yeah, so open science obviously needs a lot of support in terms of awareness-raising about what it is, but then also a lot of training and support in terms of how to do it, what things mean, and the best way to go about things. And access to training resources for open science is not equally distributed. Within richer institutions there are many, many people in place to assist you in putting, for example, FAIR data policies into practice if you get a European grant. As I said, we were working with institutions in Ukraine, though the project is on hiatus at the moment, and they just don't have the people in place that richer institutions already have to help you take up these practices. And my worry is that the more we assume that this is the way things should be done, the more this discriminates against those from institutions without those resources, both while they're there and once they move. For example, I saw a job posting for a psychology institute that said you would be judged on your commitment to open access for that position. If you hadn't had the training and resources available at your current institution to put this into practice, and it is the case that richer institutions are publishing more open access, and consuming more open access as well, then there are just these effects of cumulative advantage at play. As a follow-up question: are there any best practices that people can look at to inspire their own training activities, for example? Well, yes: I think it's important that training materials are made open and not duplicated across institutions. There have been projects in the past, like FOSTER from the European Commission, which has made a lot of training materials open and available to everyone.
And obviously online training, which is available across the board, is great. Okay, so online material and online training. Okay, then I have a question for both that came in through the chat, asking about younger generations. Do you see a different attitude in younger generations towards open science? Tony, if you want to go first and then... I just spoke, so... Okay, okay, Jean-Sébastien. Yeah, so I actually have a little anecdote here that I really like to recount. When I first tried to attract people to use the SciPost systems, I was really aiming at the very well-established professors in my field, the big guns, because I thought, where these people go, others will follow. But of course there are lots of vested interests for those people, because they do feel that they need to get the next grant, not really for their own ego, but out of a sense of responsibility to their group; they are, after all, responsible for launching the careers of PhD students and postdocs and whatnot. So it's a completely legitimate position for a senior researcher to be hesitant. And then, on the side, I noticed that some junior researchers were starting to send their papers to SciPost, even in the early days. At some point it was so systematic that I ran a little informal survey and asked those young people: hey, why are you sending your papers to SciPost? And they all gave me the same answer. They said: look, it's obvious the older people are not the ones who are going to want to change the system; they're around for another five years or something, what do they care? I'm going to be in academia for 40 years. So take my paper. That was really quite a motivator, actually. I'd like to think that we're not letting down those people who jumped with us. And I'd like, in the future, to see some of those young people saying: hey, look, I was in the first issue.
Yeah, you came in five years later; I was there at the beginning. So I've got the kind of prophet's eye; not you, you're just a sheep. So that was very pleasant. I agree: early career researchers are obviously more... I think we're all, or most of us are, more idealistic when we're younger. And with the older generations there is also a certain survivorship bias: I made it in this system, therefore the system must be working well. I think there is, however, a big discrepancy between the attitudes of younger people and their actual practices, and one of the main reasons is that they're still very often judged just on publications, not on open science outputs. If you're producing code or data, that isn't judged. And not only the publications themselves: the other speaker spoke about the impact factor, and I do think its perniciousness goes beyond holding back innovation in publishing. The way that the impact factor of journals is still used quite a lot in researcher assessment is a major disincentive. I just saw on Twitter today that a major open science advocate, Thomas Suzy, has got himself into a bit of trouble because, as an early career researcher, he still finds it in his career interest to publish in Nature and Science instead of, for example, Open Research Europe, a kind of open publishing platform of which he is actually an editorial board member. And he's right, he's right. So changing that is... and that's why the reform of rewards and recognition is really one of the key aims, and there are lots of moves underway from the European Commission and others to make that happen. Okay, super. Please, Atish, you want to join and ask your question? First of all, I want to thank you both for very nice talks; it was very informative for me. I have a couple of questions.
So just to understand, regarding SciPost: when you say it's diamond, do I understand that diamond means that it is free for both the author and the reader? Yeah, so it's essentially about the way the finances are structured. The accessibility of the material, so that readers can access it for free and authors can send in their material without payment, we view as the first thing we don't even need to talk about. I mean, any publisher not doing that has got it wrong, because you don't want to have what I like to call "lead pollution", which is a coupling between your editorial systems and your financial systems. That's how you end up with the disaster created by APCs, these predatory journals, which was so easily foreseeable. So here the "diamond" term really applies because the funding model is a kind of consortial model, not based on transactions at the level of single publications. You're running a big infrastructure, people pour resources into it, and the system is run for everybody. I see. So what I'm saying is that an institute like the ICTP can come in; but let's say you have authors coming from universities in the developing countries. In order to make it a diamond system, you will need additional funds, because if your current processing cost is 600 euros, as you said, your model is that this will come from a sense of community, a consortium from within the community, in the same way it has worked for the arXiv, for example? Absolutely, and it's a funny calculation that you can make; it's one of these calculations that seem nonsensical at first sight from a business perspective, but actually make a lot of sense.
If you take first-world institutions and look at their expenditures on publications, we promise them a reduction by a factor of five, and that gives us enough surplus to cover the whole developing world and whatever comes from there, if we scale it up properly. That's the idea. So it is just a reality that, in terms of financial payments, there's no expectation that developing-world institutions will contribute the same as, say, the University of Amsterdam. And I will not hesitate to make the case to Western, first-world universities that they should contribute more, because they're still saving money. So what's their problem? They're looking to save money; that's what we're enabling. No, no, I think that's fantastic. I have to say I admire the idealism behind it, but it's also impressive that in the last six years you have somehow managed not to let reality intervene in the wrong way, and to keep your goals in sight. So if there is any way I, sitting in an organization like the ICTP, can help, I would be more than... Thanks a lot for your support. Community support like this is all we need, right? Because then we hear what you want, we hear the way you would like to see this thing develop, and that's where we go. And the ICTP, I emphasize, just recently signed with SciPost. They gave us 4,200 euros, which exactly covers the expenditures that we calculated for the services we provided to ICTP people. So if my fair-share calculation is correct, we should give more, is that what you're saying? Yeah, yeah, because things are scaling up. You see, this payment that recently came from the ICTP... I should have shown you on the site: if you go to the organizations page and click on ICTP, you'll see a tab with the financial calculation of our estimate of the expenditures associated with the ICTP, set against the payment from the ICTP. A beautiful balance.
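To make that consortial "funny calculation" concrete, here is a toy sketch; only the 600-euro per-paper average and the factor-of-five saving come from the talk, all institutional figures are invented for illustration:

```python
# All institutional figures below are hypothetical, for illustration only.
current_spend = 100_000            # what one rich institution spends on publishing today (euros/year)
contribution = current_spend // 5  # the promised factor-of-five saving

cost_per_paper = 600  # average expenditure per paper quoted in the talk
own_papers = 25       # hypothetical annual output of that institution

surplus = contribution - cost_per_paper * own_papers
covered_elsewhere = surplus // cost_per_paper  # papers from non-contributing institutions it funds

print(surplus)             # 5000
print(covered_elsewhere)   # 8
```

The point of the sketch: even after covering its own output at cost, each contributing institution leaves a surplus in the pot, and that surplus is what pays for papers from institutions that contribute little or nothing.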
Yeah, I think it's like minus 83 euros, so it's essentially zero. It's perfect. But of course, this is about past activity, you see. Next year and the year after, things are going to keep growing, at a rate of about 30%. So we'd like to be able to have that kind of outlook, and that's what is not yet in place. An idea like SciPost works fine if everything is stable; if you're trying to scale it up, it's tough. So actually, I have a question regarding that, if I may ask. Sorry, Marco, do I have time? One more minute. First of all, you said that you also act as a preprint server. Is this in addition to what the arXiv does, is it something different, or is it an overlay on top of it? So it's not an overlay; we are not overlay journals or an overlay preprint server, it's a different thing. We connect to the preprint servers out there, of which the arXiv is the most important, but we connect to others as well. So you can deposit your paper in other preprint servers, and when you submit to SciPost, you just need to give the identifier of your paper, and then our machines go and fish the information from the other machine. So it's really easy. But the reason why we have our own submission system is really twofold. First, there are some authors who need to do revisions on their papers and don't like to have five or six arXiv versions, so they kept asking: can I just give you the thing? So we said, okay, we need our own system: you can have submission streams that start with a preprint on the arXiv, but where the three resubmissions are on the SciPost preprint server. Second, we also have a branch doing proceedings, which are typically shorter papers where people want to go as fast as possible, so they don't bother with the arXiv; they go straight to SciPost because they just want the proceedings published in the end.
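The "go and fish the information from the other machine" step can be sketched against the arXiv's public export API. The endpoint is arXiv's real one; the parsing below is a minimal illustration I wrote for this transcript, not SciPost's actual ingestion code:

```python
import urllib.request
import xml.etree.ElementTree as ET

ATOM_NS = {"atom": "http://www.w3.org/2005/Atom"}

def parse_arxiv_entry(feed_xml: bytes) -> dict:
    """Pull the title and abstract out of an arXiv Atom feed."""
    entry = ET.fromstring(feed_xml).find("atom:entry", ATOM_NS)
    return {
        "title": entry.findtext("atom:title", namespaces=ATOM_NS).strip(),
        "abstract": entry.findtext("atom:summary", namespaces=ATOM_NS).strip(),
    }

def fetch_arxiv_metadata(arxiv_id: str) -> dict:
    """Query arXiv's export API for one identifier (makes a network call)."""
    url = f"http://export.arxiv.org/api/query?id_list={arxiv_id}"
    with urllib.request.urlopen(url) as resp:
        return parse_arxiv_entry(resp.read())

# meta = fetch_arxiv_metadata("1603.04689")  # returns a dict with "title" and "abstract"
```

From the author's point of view this is why submission only needs an identifier: everything else travels machine-to-machine.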
But it's by no means trying to compete with the arXiv in any way; it's just an additional facility. So can I ask you just one question, which I may have asked you when you visited the ICTP. What is your lesson from the JHEP experience? Meaning, JHEP was eventually taken over by Springer, right? So what were the issues that forced them to do this, and how do you foresee avoiding it? And what can the scientific community do to make that happen? Yeah, that's a very good question. I was not party to the internal happenings at JHEP. I think JHEP really was started by the community with exactly the right goals in mind, because they wanted community control of the editorial processes and whatnot. But in a sense, maybe what proved fatal for them is that they were too early, one step too early, because they adopted a financial model that slotted into the subscription system and then evolved into what it is today. I think there was a lot of personal involvement from some scientists in it, and that slowed down over the years. Maybe the one big difference I would like to emphasize with SciPost is that the expertise to run and build the machines also comes from the community. I think that's the difference, because it would be very difficult to integrate SciPost; it would actually be very difficult for a corporate entity to buy SciPost and integrate its systems. Of course they could do it, because they have infinite resources, but I can vouch that the first thing that would happen is that I'd tell everybody not to publish in SciPost anymore; unless I'm dead, in which case somebody else will say it, because I've been saying it all along. There's absolutely no point in SciPost if it does not preserve its community basis and consortial business model.
And I tell the community: the day you come back from vacation and SciPost has started charging APCs and has been bought by somebody, you just stop using it. Build another one. But you're optimistic that your financial model will hold, that if the community supports it enough, then... is that the idea? Yes, yes, yes. The one thing that is difficult to implement is a growth rate above 30% a year. You do have initiatives that mushroom by two orders of magnitude within a year, but those are typically initiatives that go and get money from investors who see a return in the end. Look at academia.edu, right? They got what, 22 million? I mean, come on. Any scientist in their right mind doesn't associate academic seriousness with academia.edu; it's a total failure in that sense. So why do they get the money? Because it's a for-profit thing. And why doesn't SciPost get the money? Because it's a not-for-profit. It's as simple as that, but it's something that we are condemned to, and there's just no point in SciPost if it doesn't keep to that. Super. I have one more question, but I will leave... Yeah, sure, go ahead. Let other people speak first. Okay, there are a few questions. One is a general one to both of you, I think, about the rewarding system. You both pointed out that this is a crucial issue, right? So what could be done in practice, asks Nicoletta. Tony, if you want to go first. Yeah, so: reform of the criteria that are used. As another part of ON-MERRIT, we looked at policies from institutions in seven different countries, just to see how much open practices, as well as responsible research practices, were actually mentioned, and open practices were hardly mentioned at all.
We also looked and found that the impact factor and problematic elements of quantification, you've got to have this many publications per year and that kind of thing, still dominated there as well. That needs to be reformed. But like I say, I think that reform is underway: the European Commission at the moment is building what has been called a coalition of the willing, and there's a groundswell of support for it. What that will look like in practice is, I think, an open question, because different institutions in different countries will still have different needs. And a lot of this policy charge is still led very prominently by the Netherlands and the UK, as a lot of open science policy has been, so whether it will definitely be in everybody's interest is open to question. I've got to get the door, so I'll pass it to you. And for me, just expanding on this: I envision a kind of divide-and-conquer strategy with metrification like this. What is the real problem with the impact factor? Let me make my perhaps controversial statement again: inherently, objectively, there is nothing wrong with the impact factor as a metric. What is it? It is a statistic, a thing that you can compute if you have the available data; in itself that is not pernicious. However, what is bad about it is that it has become synonymous with quality evaluation, which it is not. The moment the crime is committed is when people are ranked for one thing or another, be that internal promotion or grant-giving, and inequalities develop between people based on that information. It's completely ridiculous. So here's a strategy that I want to implement with a side project someone asked about online; it's a project about metadata. I want a divide-and-conquer strategy. So sure, I want the impact factor. I also want the impact factor normalized by the number of authors.
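For instance, given per-paper citation and authorship data, this whole family of variants is trivial to compute. A hypothetical sketch, with all names and numbers invented for illustration:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Paper:
    n_authors: int
    cites_within_2y: int  # citations in the first two years after publication
    cites_after_2y: int   # citations accrued only after two years

def metric_suite(papers: list[Paper]) -> dict:
    """Toy journal-level averages: the classic two-year citation window,
    a delayed window, and author-normalized variants."""
    return {
        "classic_2y": mean(p.cites_within_2y for p in papers),
        "delayed": mean(p.cites_after_2y for p in papers),
        "per_author": mean(p.cites_within_2y / p.n_authors for p in papers),
        "per_sqrt_author": mean(p.cites_within_2y / p.n_authors**0.5 for p in papers),
    }

demo = [Paper(4, 20, 5), Paper(1, 2, 30)]
print(metric_suite(demo))
```

Note how the two demo papers trade places depending on which variant you pick: that is exactly the point about a single number pretending to be a quality evaluation.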
I also want the impact factor normalized by the square root of the number of authors. I want a "true" impact factor that starts counting citations two years after publication, not within two years, because that's impact. And I want to rename the current impact factor the "splash factor", because what it captures is the immense commotion, the lots of citations, that might very well disappear; that is not impact. You want to define your own metric with citations? Then let's do it. So you then have this whole class of different metrics, and you might have 30 different numbers, and then you start asking yourself: are these numbers actually really telling me something about the paper? Wouldn't it be better to just go look at the paper? That's what I'm getting at: to change the habits again by empowering the community to do something else. Why is the impact factor used? Because people can just log onto the well-known site, click on a few buttons and get the number. That's it; there's nothing more to it. So that's my vision for this; I would like things to change that way. Super. We have two more questions in the Q&A forum here. One is about participation from the Global South in the diamond open access initiative, if you want to comment on that. Yes, super question. I'm very grateful for the chance to sing the praises of the Global South here. If you look, for example, at what's happening in Latin America, they really kick ass there. They do open access in a fantastic way. They've got wonderful infrastructure, lots of technical capability, lots of community spirit, a well-established, scaled-up system, financial support. It's got everything you might want. If you could transplant the model from Latin America into Europe just like that, we wouldn't need to have these discussions; it's infinitely better. So they are a great inspiration.
And as far as I'm concerned, collaboration with such people is fantastic. That would be really, really great. There are different approaches, and there's lots of space for different types of journals. For example, SciPost really tries to build top-down: we start small-scale, with excruciatingly high levels of quality that we can filter for, and then grow from there. Most of the other open access initiatives do it the other way around: you offer the service to everybody and try to scale it up from that. So there would be space not really for competition, but for cohabitation of all these systems at an international level. And in terms of the business model, one of the things I'm working on, which isn't out yet, is also recognition for the person power that's put into the processes. Okay, we have an expenditure of about 600 on average per paper, but that doesn't pay for everything. It doesn't pay for the time of the referees, the fellows, the people who run the process and take the votes. I would like that to be compiled. Then, yes indeed, maybe the Northern institutions give money and their academics give time, while the developing countries' institutions don't give much money, they can if they want to, but they don't have to, but they can give expertise, they can give time. And that's valuable. So if you're able to have a business model that also informs your visitors about what has been contributed in this in-kind way, that would be great. It would make them feel more welcome too, because one of the fears I have is that they come and say, okay, fine, sure, I can send my paper to SciPost and whatnot, but really my institution is going to be asking questions. There's a subtle barrier sometimes through lack of comprehension of what the thing is.
And certainly with the amount of publicity around transformative agreements, and all these celebratory, champagne-popping announcements of new agreements, if you're coming from outside Europe these things look like a complete no-go. It's just so bad. We have one more question for you about SciPost; it's about the topics. There was a question about papers on innovation and technology: are they of interest, and what are the topics? Right now it's physics and political science, right? Yeah, so what happened with political science is that I was approached by a number of academics working on migration politics. They said, we like the way you do things, can we have a journal? I said, fine, you can have a journal; I just open up a journal in the database, but then they run the whole process, they form their editorial college. So there's one requirement to start a new field at SciPost, and that's to have a group of academics willing to form an editorial college. And you want good people from throughout the world, with a distribution between senior and less senior, but really well-established academics who are able to pull people in. So if we were approached by a community willing to do this, who say, yeah, we know people and we're keen to do this, then let's just do it. We've made some attempts in mathematics, but that was very difficult, because mathematicians are quite traditionally oriented in their ways, so that really didn't take off. We're trying to make some efforts in chemistry. Chemistry is extremely difficult, because chemists are very closely embedded with their professional societies, for lots of reasons, so the competition space there is very difficult. And innovation and technology, I don't know, we'd have to look into it, but if there's a bunch of academics willing to do it, then yes, it's just a question of flicking a switch on the database and then you have it. Okay, super.
Ateesh, I think you had a last question. Sorry, I wanted to ask a question regarding this fetish of excellence that you talked about, which I think relates to merit. I think that criticism is certainly correct, especially when evaluation is done with emphasis on some mechanical metric like the ones you were talking about. But at the same time, there is a notion of excellence and quality that is undeniable in scientific publications, and which must somehow be preserved; otherwise science doesn't really exist, unless we have some kind of community validation of what constitutes a correct proof or correct work. So I just wanted to know what you think about that. And in some ways SciPost is an example that is going in that direction, so maybe you could comment on this. There's an excellent paper, I don't know if you know it, called "Excellence R Us", by Cameron Neylon, Martin Paul Eve, and colleagues. It deconstructs what we are really talking about when we talk about excellence. The notions we have of excellence and quality sometimes degrade in our hands into that notion of "you'll know it when you see it". But who knows it, and who sees it? Who is judging? Who is on the panel deciding what is excellent? Because, by the principle of homophily, very often they will be choosing people who look and sound and talk like them, who have come from the same labs as they have, through the same systems. And this is really the essence of it. So maybe one really interesting thing at the moment is the rise in discussion of lottery systems for funding. People reviewing grant proposals, for instance, are very good at telling bad from good. But at telling the good from the excellent, sometimes they are not. And so there is a lot of discussion at the moment about introducing some sort of lottery system into this.
The clustering of resources at the "excellent" institutions, whenever that happens, is this feedback loop, because the resources will end up in the same places: we could pick them out on a map, and I'm sure in physics you could pick them out on a map as well. That is all. If I may add a couple of things: indeed, I'm very close to what you were saying, that with excellence, as a scientist, you get a feeling. For example, my biggest totem is Duncan Haldane. Duncan Haldane is a very special researcher; he has a different taste, and you don't need to tell me anything about his citation record or whatnot. I know Duncan, I know his work, I know the significance of what he has achieved, the depth of it. That's not measured in metrics. In a sense, it's even anti-correlated, because some of his papers are not cited for years, because nobody gets them. But then fifteen years later somebody finally does, and then indeed you see that there's something there. So you can't capture that in metrics; there must be space for all of these things. What I really like about the modern discourse is the tendency to push people to go and look at the paper, to go back to the source, because that encourages you to really know what the content is, whether it's really good. Then you can form a good opinion for yourself; you don't need to write down little numbers and whatnot. And then, word of mouth: you show it to your students, your collaborators, and it becomes significant in one way or another. That I like very, very much. There is one thing which I would like to warn absolutely everybody against, because I am totally against it: the Twitterization of research, the Twitterization of essentially saying, oh, we've got this new preprint and we do this and that. Why? Because this completely obfuscates things.
You end up valuing people who are good at creating networks and making things look nice. That is not what we want; at least, it is not what I want. In a sense, I would sometimes much rather that scientists were banned from discussing science on Twitter; they could maybe agree to have a discussion forum somewhere else, but not on Twitter. Why? Because publishers use it to propagate things. You've got networks building up to create structures that don't exist except in the strategic planning of the information managers behind these things. This is all polluting science; you don't want that. So as far as I'm concerned, SciPost wouldn't even be on Twitter. I wouldn't even be on Twitter if I could choose; I hate the place, I think it's ridiculous. But you have to do a little bit just so you have connections to things. So a big word of warning here: this is not progressing in a way which is conducive to a well-informed academia. It serves as a communication protocol, but there's a layer of promotion in there that starts being pernicious. We'll see where it goes. Okay, thanks. Actually, I should tell you that I did publish my first paper in SciPost recently. But I just wanted to ask you one last thing: do you think it makes sense to have something like lecture notes? Springer has these lecture notes, curated and editorialized, high-quality summaries of a whole field; that yellow series of Springer was very influential. Do you think SciPost could offer an alternative for that? We have a journal called SciPost Physics Lecture Notes, and this is for research-level lecture notes. So for example, we signed with the Les Houches School, and their lecture notes are now published through the SciPost systems.
And we've also just come to an agreement with the Institut de Physique Théorique at Saclay, near Paris, where they will also have their own series of lecture notes in SciPost Physics Lecture Notes. So yes, for lecture notes this is already in place, and it's a question of scaling it up. Relatedly, ICTP has very many high-quality conferences, and okay, this could be something interesting. We would have to look into it, and if you're interested, we can talk, because indeed that would be fantastic. And with the kind of international projection that ICTP has, it would really help bring the developing world into the systems as well. That would be extremely valuable and most welcome. So, lecture notes sorted. One thing which we have not started, although I've wanted to do it for a long time, is reviews. I'm in both, actually. I'm in both, yeah. Okay, so lecture notes are sorted, but reviews are not yet sorted. And there's no reason why we shouldn't do big reviews, like Reviews of Modern Physics or something. You could try to convince the rest of my team that we should do it; the reason they don't want to is that they say I've got too much on my to-do list first, and they'll be happy to do it once I've done what I've promised to do. So it's my fault, really. We have one last question: can we develop a roadmap to SciPost-ify the publication of an existing professional society? What would be the way? So, how should I interpret the question? Would that be to republish the existing corpus, or just to do it from a certain time onwards? Either way. This is possible; there are, however, limitations, because the SciPost systems are built in-house, which means that the systems themselves intimately reflect the editorial processes that we have. So one would have to accept running through these processes, with the open refereeing and the college-based decision-making.
And if that is okay, then it can be done very quickly, because the advantage is that, in effect, you already have a college: you've got an editorial board, you've got knowledge of things. The one thing which is difficult is that the history of the journal is not automatically transferred onto the new system. So if you flip a journal like this, typically it's a bit like starting from scratch again, as far as the impact factor and such are concerned. So those are the two disadvantages: you'd have to run it the way SciPost runs its editorial processes, and you'd have to be willing to swim underwater while the transition is being made. But yes, indeed, this is entirely possible. Excellent. Eva, do you have any closing remarks? No, I've just looked at the chat, and I think we answered everything, all the questions, also in the Q&A. Wonderful. Well, thank you so much. It was great to have you here; it was very, very interesting. Good to have the participants, yeah. Praise to the participants; very nice to have you. Yeah, absolutely. And maybe we'll be able to have a seminar like this at ICTP soon, because, as we were discussing before with the other participants, it would be nice to have some Italian coffee and more time to discuss things. I see from the participants that there is lots and lots of interest, so we'll see what happens in the future. But for now, thank you for coming, and thank you to the participants for joining. It was great to have you all here. Thank you very much. Goodbye, everybody. Goodbye.