Hello, everybody, and welcome to this new webinar. I'm very pleased to moderate this session, in which we are going to present the launch of the National Open Access Monitor for Ireland. This is the first webinar in a series of trainings that we will run with the research performing organisations in Ireland, the research funding organisations and the wider community. But this is a public webinar, open to everybody, to show the infrastructure that Ireland is providing to move forward on open access. Some housekeeping notes: this webinar is being recorded. You can ask questions or post comments via the Zoom Q&A section, and you can upvote the questions you would like the speakers to address first. The recording will be made available shortly after the webinar. Now it is my pleasure to introduce today's speakers: Susan Reilly, director of IReL, the Irish Research eLibrary; Natalia Manola, CEO of OpenAIRE; Ioanna Grypari, technical project manager at OpenAIRE and the main person behind the infrastructure for the National Monitor for Ireland; and Leonidas, another technical expert at OpenAIRE who has worked extensively on this platform. And now I'm handing over to Susan. Hi, good morning everyone, and good afternoon to our European colleagues on the call as well. I'm very pleased to be here today to showcase the National Open Access Monitor and to announce the launch of the baseline report that has been released today alongside the monitor. I think most of you will know the origins of the Open Access Monitor project, but just to remind you: the National Open Research Forum (NORF) Action Plan for Open Research was launched in November 2022. It had three themes, and one of them, Theme 2, is to enable 100% open access to Irish research publications.
Under that theme, the first priority was to develop a National Open Access Monitor to promote transparency, enable progress to be tracked, and allow for the identification of gaps and targeted interventions, to ensure equity in open access in Ireland. So IReL has taken on responsibility for managing this NORF-funded project. It has been led by Catherine Verus, who can't be with us today but has worked hard to make sure that the project is open, transparent and consultative. Over the course of the past few months, we have worked on a community-agreed definition of open access and on establishing shared criteria for monitoring open access, and these criteria have been taken forward in the monitor that you see today. I've said that the process has been consultative, and it will continue to be: the development of the monitor is very much a collaboration with stakeholders in the Irish research landscape, to ensure that it meets their needs. So I encourage you to engage with OpenAIRE over the next few months to help improve the monitor. The report on open access in Ireland that OpenAIRE has produced shows that we've actually come a long way towards achieving 100% open access. Of course, we're not there yet by a long shot, and there's a lot to be done, particularly in terms of improving metadata, the consistent adoption of persistent identifiers, and improving the traceability and FAIRness of our open access publications. The report points to actions that we can address at grassroots level, but it also points to actions that need to be addressed at policy level and in terms of investment in infrastructure. So I encourage you to take a look at that report and to feed back on the monitor, to help us get to the point of having not just a world-class monitoring system, but a leading approach to achieving 100% open access.
So with that, I think I'll hand you over to our next speaker, and we are looking forward to hearing more of your views on the Open Access Monitor. Thank you, Susan. And Natalia, the floor is now yours. Thank you, Julia. Thank you, Susan. I will try to be short, although people who know me know that I'm never brief. First of all, I would like to start with two remarks. I think what Ireland is doing, and the way they are approaching this monitoring and all of open access and open science, is remarkable. It's a community effort, and I think it's an example for others to follow. We in OpenAIRE are supporting them to build the monitor, but what I need to say is that we have learned a lot from the processes, from the transparency that we have seen in all of them, and from the community engagement. Unless these things are in place, we don't believe such efforts will be successful. So thank you for all the collective effort. Let me say a few things about how we are approaching it, before my colleagues Ioanna and Leonidas take it forward. Why are we involved in this? One of the key aspects of our strategy is not just to propagate and promote open science, but also to monitor the uptake of open science policies. And what we all need to understand is that we're not monitoring to penalize, and we're not monitoring just to see how well we're faring. We're monitoring to improve, to learn from each other. This, I think, will be the great challenge in how to present it, and the great benefit of the Irish monitor or any other monitor at the country level. Also important, and we are thankful to NORF for this, is that from the beginning, for the sake of transparency, they had the requirement and the mandate to use open sources. That happened about a year ago, and now the community is rallying around this notion of open infrastructures.
I'm just pointing here to the UNESCO meeting in December on a monitoring framework based on open technologies and open data. As we all know, and we are following the developments, this is the trend now, so I think the Irish open monitor will be one of the trendsetters, if I may. Now, coming back to OpenAIRE and how we're building it, my colleagues, as I said, will elaborate on that. What we have been building collaboratively in OpenAIRE with all our members is the OpenAIRE Graph, which we consider a global asset. This is a knowledge graph, a scholarly communication knowledge graph, that takes data from many places and interlinks them together. We are trying to do that in a very transparent, very open way. And we would also like to invite the community to work with us, to make sure that whatever is being done at the country or institutional level, in institutional or national systems, is aligned with us, because this is very important for Europe. The other thing I would like to stress is that open access monitoring is just one slice: in order to assess open access, or open science more broadly, we need to have data from all over the world. So the way we can work collaboratively is to build these well-curated data sets around the world, each of us in our little corner of the earth, and to make sure that we link them to each other. And I think NORF's other efforts and initiatives, apart from the Open Access Monitor, such as the repositories initiative, are all very important, and not just important but interlinked. So all of this is very much appreciated.
Just to give you some ideas about all the efforts that Ireland is making: we're talking now about publications, because the Irish monitor is about open access to publications, but we are slowly moving towards counting other research outputs as well, like research data, software and others, and we are getting there. As a community, I think we are in a good spot for open access to publications, but we also need to start thinking about other types of research output. I will close my intervention by saying that this is a community effort, and I just want to show you the OpenAIRE Graph, which you will find on our site, graph.openaire.eu. If you can see my cursor move: we started from here, in August 2023, and then, because of the Irish monitor, we have improved so much the processes for how we do affiliations, because we had to answer a real community problem. This is how we would like to work with other countries, in order to see this trend increasing, and for all of us, at least in Europe, to collaborate and make sure this is something we are all proud of and can all build on. So, just to close: in my view, the Irish Open Access Monitor is an example for other countries to follow, and let me tell you, we are already seeing many countries knocking on our door to repeat the exercise. With that, I'm closing and handing over to the next speaker, Ioanna. Thank you, Natalia. Let me share my screen. Can everyone see my screen? Yes? Okay, thank you. So I'm going to give a brief presentation, and I'm here with my colleague Leonidas. I'm going to give a short overview of the platform, discuss how we built it, and give a demo.
Then Leonidas will present the various support options that we have and the next steps for the improvement of the monitor. The monitor is hosted at the address you see here, and, as Susan and Natalia mentioned, it is accompanied by the National Open Access Monitor Ireland report, which provides the baseline analysis, kind of the starting point, for the National Open Access Monitor Ireland, and also includes areas for improvement, strategic solutions and so on. You're welcome to take a look and give us your feedback. We discussed already that the main objective of the monitor is to provide transparent and comprehensive insights into the state of open access in Ireland, and to serve as a crucial tool for analyzing trends, identifying challenges, and guiding policy development and decision making for various types of users and stakeholders. Today we're at the pilot phase, focused on enhancing the data quality and refining the platform and its functionalities, and we aim for the final product delivery to be in June 2024. So we're working very hard on making this a production-quality monitor, and we very much welcome and need your feedback. There's a contact form available in the platform, I have provided the link here, and we will reply and get in touch immediately. Okay, so the monitor. The platform is composed of five different types of monitors: the national one, then one for research performing organisations, one for researchers, one for institutional repositories, and one for research funding organisations. I will go through them in detail when I do the demo. How did we build this? Although you know some of this already, the basic principles behind it come from the NORF strategy for open access in Ireland. It's built on openness and transparency, so we have done our best to document all the methodologies used. Please let us know if something requires more clarification.
All the data is open and public, and we adhere to FAIR principles and international standards to ensure trust in the monitor, so that the public is engaged and people actually use it. Then we have reliability and timeliness, and comprehensive coverage and accuracy: based on the OpenAIRE Graph's established methodologies, open databases and operational workflows, and its extensive coverage, we hope to provide insights that are timely and accurate. Our last principles are engagement and inclusivity. This is extremely important for us, and it's the reason why we provide these different types of monitors, so that different users feel engaged and reach out to us to improve the data, explain whatever is needed, and so on. The backbone, as you know, is the OpenAIRE Graph, which in its latest version includes 100,000 data sources, 3.5 million projects, and about 254 million different research outputs. The graph pipeline includes several phases, starting from the aggregation of different types of data sources, then enrichment by mining, deduplication, enrichment by inference, and finalization, the final cleaning. All this produces the public Graph API, on which the National Open Access Monitor for Ireland relies, and feedback on the monitor is fed back into the graph, so it creates a continuous loop of improvement. The feedback can be of several types, which we will explain later in the presentation: you can link research products with each other, you can claim research products as your own, you can deduplicate organizations, you can provide your project data in order to have dedicated text mining applied, and you can register your data sources in OpenAIRE. All of these different data quality improvements are fed back into the graph, and they become part of the graph workflow and pipeline to make sure they are constantly used.
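Since the public Graph API underpins the monitor, it can also be queried directly. Below is a minimal sketch of building such a query, assuming the publicly documented OpenAIRE HTTP search API at `api.openaire.eu` and its `country` and `OA` parameters; treat the exact parameter names as assumptions and verify them against the current OpenAIRE API documentation.

```python
from urllib.parse import urlencode

# Sketch only: parameter names below follow the documented OpenAIRE
# HTTP search API, but should be verified against the current docs.
BASE = "https://api.openaire.eu/search/publications"

def build_query(country: str, open_access: bool, page_size: int = 10) -> str:
    """Return a request URL for publications filtered by country and OA status."""
    params = {
        "country": country,                 # ISO 3166-1 alpha-2 code, e.g. "IE"
        "OA": str(open_access).lower(),     # "true" restricts to open access
        "size": page_size,
        "format": "json",                   # default response format is XML
    }
    return f"{BASE}?{urlencode(params)}"

url = build_query("IE", True)
print(url)
# The URL can then be fetched with urllib.request or requests; the JSON
# response contains a list of publication records with their metadata.
```

The same pattern extends to the other endpoints the platform aggregates (datasets, software, projects), each with its own filter set.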
So we're building on sustainability here. These data quality improvements are first reviewed in the sandbox and then in production, after they have been validated. Only RPO and RFO (research performing and research funding organisation) managers have access to the sandbox, and we will discuss later who can become a manager. The data shown in the sandbox is the pre-release version of the graph, and new functionalities and new indicators are shown here first, including the integrated data quality feedback that I was talking about in the previous step. Eventually this shows up in the production environment, which is the environment on which I will do the demo today. Okay, let's move on to the demo. Let me see if I can minimize some of these windows. Right, here, this is the home page of the monitor. Let me know if it's too big or too small, please. If you scroll through the landing page, it has some information on the benefits of the platform; you can link to the different types of monitors and the different types of user actions that are available, as well as the contact form. So, just a summary page, as usual. Now, in the monitors, you see here there is a dropdown for the different types of monitor dashboards that exist: national, RPO, RFO, researcher and repository. If I just zoom out a little you can see them here, expanded. There's also Resources and Help, which I will go through a bit later, with different links, and then here on the right there's signing in. At this point I'm not going to sign in, so everything I will show you is completely public. Now let's go to the national monitor. This is the national monitor for Ireland. There are two parts to all the monitor platforms: you have the monitor part, which includes indicators.
And then you have Browse Research Products, the research output view, where you go and see the particular publications, data sets and so on that support the visualizations we see here. Let's go to the indicators first. This is kind of like an indicator report, with different tabs here where you can select and view indicators. This one is for scholarly production, and I will go through the others in a while. Each tab has some number indicators at the top, for example all publications, peer-reviewed publications, and open access peer-reviewed publications with a license, and then some visualizations with breakdowns and benchmarking of interest that you can scroll down and view in detail. For example, here we have used our fields of science classification, level one and, within level one, level two, and one can see, for example, the number of peer-reviewed publications within the medical and health sciences and how it is distributed across different medical fields. Here we have the SDG classification system. Basically we have all the breakdowns of interest identified in the initial surveys for this project. These are quite easy to change, so if there is consensus that some are missing, or something additional needs to be done, or they should be viewed in a different way, this can be updated. Of course, all of these are directly linked to the Irish monitor database, so whenever there's an update of the graph and an update of the Irish monitor database, these are also automatically updated. So the first tab is scholarly production; the second tab is open access evolution, where we provide snapshots of different numbers for the monitor and how they evolve over time. Actually, by the end of this week we will have the next snapshot for the monitor.
So this way, you don't only see how open access changes by year of publication, but also by monthly, or near-monthly, snapshots of the entire OpenAIRE Graph, which provides additional information. Now, here I am in the access rights tab. I just want to clarify that open access is separated into open access with a license and without a license. This is because the consensus from the Irish survey was that the Budapest Open Access Initiative definition of open access should be used, which includes a license; so only the publications with a license are truly open access. And here we have the access rights trends over time for peer-reviewed publications. Let me show you some of the interactive features these graphs offer. First, here on the right, you can click to view the visualization in full screen, print the chart, or download the image in different formats; the last one keeps the interactivity. And you can download the data behind this chart. Similarly, you can click on View Data Table, in which case, when you scroll down, you see the actual numbers. Downloading these as XLS or CSV is very useful for someone who wants to create their own analysis. Over here on the bottom left we have the embedding option, which allows any user to include this frame in their own website, and whenever the graph and the database behind these numbers are updated, that visualization will be dynamically updated as well. So it's a very smart way to include quick monitoring snapshots at the level of interest, depending on the user. Now, all of these data series can be selected or deselected, and the graph adjusts automatically. Okay, I removed everything; let's say I want to see only the embargoed ones over time, and then open access with a license. The embargoed series is still there, as you see, it's just that the numbers are very low.
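The CSV export mentioned above can feed your own analysis of the access-rights trends. A minimal sketch, using a small made-up sample in place of the real download; the column names (`year`, `access_right`, `count`) and category values are assumptions, since the actual export's schema may differ:

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample mimicking the downloadable data table behind the
# access-rights trend chart; real column names and values may differ.
sample = """year,access_right,count
2021,open_with_license,5200
2021,open_without_license,1100
2021,closed,2300
2022,open_with_license,6100
2022,open_without_license,900
2022,closed,2000
"""

totals = defaultdict(int)    # all peer-reviewed publications per year
licensed = defaultdict(int)  # open access *with* a license per year
for row in csv.DictReader(io.StringIO(sample)):
    year, n = row["year"], int(row["count"])
    totals[year] += n
    if row["access_right"] == "open_with_license":
        licensed[year] += n

# Share of truly open access (Budapest definition: open with a license)
for year in sorted(totals):
    print(f"{year}: {licensed[year] / totals[year]:.1%} open access with license")
```

Replacing `sample` with the downloaded file (`open(path)` instead of `io.StringIO`) gives the same per-year shares the dashboard plots.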
Then, by selecting an area of the graph, I can zoom in, and I can click on Reset Zoom to zoom out. These functionalities can also be seen by clicking on Help here; this Help button follows you on all the pages. First, you can click on this video to see the basic interactive functions of all the graphs, or you can click on Terminology and Construction, which opens this page split into three different sections: entities, inherited and inferred attributes, which is what we get from metadata, and constructed attributes. Inherited and inferred attributes are things such as organization, country, access rights and so on, while constructed attributes include gold, diamond, the open access levels for different journals, the routes to open access, and so on and so forth. For constructed attributes we offer a definition and, since we constructed them, detailed instructions on how we build them. Again, if something is not clear here, please do reach out: you can click the Help button on any page and then click Give Us Your Feedback, and the Contact Us form opens up. Okay, then, moving on to open access routes. Here we have some number indicators on the different routes to open access, and in this particular part you can see there are additional tabs below with more of a breakdown of the visualizations. First is open access mediation: these are indicators on repository-mediated versus publisher-mediated open access. We explain here that repository-mediated is green open access with a license, while publisher-mediated is gold and hybrid. We have several breakdowns of interest here as well, and also some top-performing funding and research performing organisations.
The next tab here is unrealized open access. According to the consensus for the project, unrealized open access includes closed access, then green without a license, and bronze, where the publications are accessible but carry no license. We keep those separate from closed access here, but they are not part of the open access terrain in Ireland. And then you have some different breakdowns of interest here as well. The last tab here is open access types, where we have green versus gold versus hybrid trends. Then we have some FAIR aspects: a bunch of indicators and breakdowns for licensing, and then other FAIR aspects such as PIDs, funding references, abstracts, ORCID iDs and so on. Then we move on to the Plan S indicators, where the focus is from 2021 onwards, when Plan S was implemented. Here we have a graph of the peer-reviewed publications published in Plan S-compliant journals, meaning in diamond journals, gold open access with APCs, under transformative agreements, in transformative journals, or in other compliant journals. Of course, the definition of Plan S compliance includes additional routes, for example green open access is also compliant, but here we focus on the journal trends. As for Plan S funders in Ireland, at this point, unless we are mistaken, it's only Science Foundation Ireland, so we just show them here. The next tab covers APCs, and at the beginning we try to give an overview of the data availability: there are 23,000 publications where the authors incurred an APC, but we only have data for about 1,000 of them, from OpenAPC. There is also a note, you see this button here: APCs reported to OpenAPC by any co-author's institution. Such notes can be added as needed. So, despite this low coverage of data, we do have these numbers. And then you go to APCs versus transformative agreements.
Here we see the transformative agreements in light blue, that is, how much of the APC costs have started to be covered by transformative agreements. You can see which fields benefit the most, here in FoS level one and FoS level two. Then we have some cross-country indicators. Ireland is always listed first; you cannot see it here because, compared to the second one, the US, its production is quite low. But if you use the zoom-in function, you can see the numbers here as well, or you can download the data to see them directly. I click here and the zoom resets back to the original. So we can do some cross-country benchmarking here, and we also aim to provide this visualization stacked by percentage, so we can focus on rates. Then we have some bibliometrics here, downloads and citations, with notes that explain where we got all of these, and one can see the data over time across the different breakdowns of interest. On the right, throughout this entire report, there is a button that, if you click it, brings up the filtering functionality. Let's say, for example, I want a time range of the last 10 years, and for fields of science, I can type something here, but let's say I'm interested in the social sciences, and in publicly funded outputs. Then you can see that the entire report, if you go to scholarly production for example, has been filtered with the same filters, and if you download the data, the filtered version will be downloaded. You can clear it all here and go back to the original. The second part of any monitor dashboard is Browse Research Products, where you view the different research outputs. Here on the left we have a bunch of filters that can be applied. We have the type of publications, which, as we saw at the top, has been preselected: publications and peer-reviewed at this point.
And, as a reminder, we are in the Irish National Monitor. So we have the type, the document type, whether it's peer-reviewed or not, the access rights, the access routes, so green, yes or no, the publisher access, bronze, gold or hybrid, and whether it is in a diamond open access journal, the year range, the fields of science, publicly funded or not, and the country; we are in the Irish monitor, but because of the affiliations of co-authors of publications, we have other countries covered here as well. Then we have funders, sources and research communities. If you select a funder, let's say the European Commission, then the filter automatically expands, allowing you to choose the funding stream and the particular project. Now let's select, for example, gold open access publications. Here we see the results; they can be sorted by relevance or by other criteria, and the first 2,000 results can be downloaded. For anyone who wants the entire data set, because it becomes too big, we refer users to the data dump in Zenodo, and clicking here will take you there. Let's look, for example, at a particular result. We have the title, the publication type, the year, the countries that are affiliated, the publisher, whether it is publicly funded and who funded it, the authors with their ORCID iDs when available, the different PIDs for the article itself, the beginning of the abstract, a link to download the full text when it's available, and the different sources that provide the article; by clicking on these, you can go to the different sources. Then there's a linking functionality here that I will explain later. You can share the article, and click to get the citation to cite it. The claim functionality is not highlighted right now because I need to log in; I will show you how this works in a second. Here we have the access routes; in this particular case, we have gold and green open access.
And here you have some of the metrics: the citations and the views. Now, if we click on the particular publication, we open a more analytical view that includes the different versions we have for it, each with its own metadata from the different data sources, and we see the licensing available in each, how it is offered. Then we have the subjects, and on the right here the fields of science classification, where a user is able to suggest their own if they think something is missing or is a mistake. Further down we have the references and the related research, research products that are linked to this publication, and then different popularity and citation metrics that we calculated using BIP!. On the right you also see who it is funded by and the related research community. So, a lot of information. And the last thing to say here is that there is a basic search, and an advanced search where you select a field, and you can add additional fields to refine your search if you're looking for something specific. Now, let's move on to the other types of dashboards. We have the RPO monitors, if I click here. Individual RPOs can use this to assess the visibility of their content in the national terrain, to compare their performance with others, to ensure compliance with open access mandates, to find weak spots and so on. Now I'm on the page where I browse the RPO monitors; I can add a name here in order to search for a particular one, and let's say here they are sorted by the number of publications. So I select Trinity College Dublin, and again, as before, I have different tabs of indicators; instead of cross-country, here we have cross-RPO indicators.
And we have the browsing of research products, which in this case, as you see, has Trinity College Dublin selected along with the publications and peer-reviewed filters; all of these work like before. And of course we have changed the indicators themselves to be what we think is more relevant for RPOs. The sections are similar to before. If we go, for example, to cross-RPO, again the first one listed is always the one whose dashboard we're visiting. For example, if we scroll down here, we can see the top RPOs by average citations per peer-reviewed publication. Here we see that in this particular case, for Trinity College, the citations per publication are on average 25. Now, someone from Trinity who is listening can say this number is misleading, because we have way more publications, or something is missing, or whatever it may be, and then we will work with them to see if there's another indicator we can add that is more informative, for example total citations, or, if they believe that for some reason not all citations are counted, we can take a look together and so on. Continuing, we have the RFO monitors. These are also sorted by number of publications; this is the browse page, and you can also type a name here to find the organisation or RFO you're looking for. You can use this for benchmarking, to see open access performance, find weaknesses and so on. Similarly, Science Foundation Ireland has joined OpenAIRE, which means they have a dedicated text mining algorithm just for them, so their records are enhanced: we are able to capture additional publication-to-project links this way. Again, you have the same functionalities as before: different types of indicators, filtering, cross-RFO benchmarking in this case, and the browsing of research products. Then, moving on to the researcher monitors.
We don't have a browse page here, because there are too many, but you can type a name directly. You should know that the researcher profiles are tied to ORCID iDs, so if a researcher doesn't have an ORCID iD, they will not show up here. However, there is seamless integration with ORCID: if someone makes a change on their ORCID record, it will show up here, and vice versa. So let's see how this works. Let me type a name here. Excuse me. These are the researcher profiles that match Andrew Bowie; I click on the first one, and I see that Professor Bowie has 112 publications, 107 of which are part of the Irish monitor. What we do here is show the entire record of a researcher. The researchers that have profiles here have two things in common: they have an ORCID iD, and they were at some point affiliated with an Irish RPO or funded by an Irish RFO. So we do give the entire record here, which is also what is shown if you browse research products, but the indicators we show are for their performance on the Irish record, and we indicate this clearly by writing "Irish" when needed, just to clarify. And you can see the total performance in open access. Now, let's say I want to add more records: I go to browse research products and I see that there are things missing from my record. How can I add them? This is done through the claim functionality, and one can do it either here, by browsing, or this way. So you sign in. At this point I am a researcher with an iD, so as a researcher I have to sign in through ORCID; if I am not a researcher, I can sign in via OpenAIRE with a basic account or an institutional account. I personally have linked my ORCID to my Google account, so I sign in through Google, but I'm within my ORCID account. So now I'm signed in, and the first time you sign in, you are asked to give permission for the monitor to access your ORCID record.
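The ORCID integration described above builds on ORCID's public API, which anyone can query for a researcher's public record. A minimal sketch of constructing such a request; the `pub.orcid.org/v3.0/{id}/works` endpoint shape follows ORCID's documented public API, and the iD used is ORCID's own well-known example record, not a real researcher from the demo:

```python
# Sketch: build a request for a researcher's public works from the
# ORCID public API. Endpoint shape per the documented ORCID v3.0 API.
EXAMPLE_ID = "0000-0002-1825-0097"  # ORCID's published example iD

def works_request(orcid_id: str) -> tuple[str, dict]:
    """Return (url, headers) for fetching the public works of an ORCID record."""
    url = f"https://pub.orcid.org/v3.0/{orcid_id}/works"
    headers = {"Accept": "application/json"}  # without this, the API returns XML
    return url, headers

url, headers = works_request(EXAMPLE_ID)
print(url)
# Fetch with urllib.request.Request(url, headers=headers); the JSON
# response groups works, each carrying titles and external IDs (e.g. DOIs)
# that systems like the monitor can match against their own records.
```

This read-only public API is distinct from the authorized flow the monitor uses for claiming, which requires the member API and the user's signed-in permission, as described above.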
I have done this already, so I go to my ORCID links and I can see my entire ORCID record. Notice that I have not been affiliated to an Irish RPO, but it doesn't matter: I can still see my entire record here. Then I can discover more research products related to me by clicking the button here. I can search; some candidates have already been found for me. Let's say I pick the first one and claim it, adding this work to my ORCID record. It will take me through a series of prompts where I have to approve, and eventually it is added to my ORCID record. This means that after the next graph update, this record will show up both in ORCID and in my profile here. Okay, it's a bit stuck. No, it's okay. All right, then we have the link functionality, which allows users to link research products to other research products. For example, I can just type my name here; I could also add a title. Let's add my first name as well. Okay, this is my publication, so let's say I want to link it: I add it here. I can also upload a set of DOIs to do this at a bigger scale. Then let's say I want to link it to a particular project; the project is here and I add it. Now, I'm not going to complete this, because these are not actually linked, but once you complete the linking it will show up with the next graph update. This linking functionality is available to all logged-in users, not only researchers. And last but not least, we have the repository monitors, where repositories can go and see whether their products are accurately reflected and how they perform in terms of access rights and so on. Here, for example, I have clicked on the Cork Open Research Archive, and the results can be browsed here to see if everything is up to par. We will have a dedicated training on how data sources and repositories can be registered.
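For the bulk-upload step mentioned above, a DOI list usually needs normalising first: DOIs are case-insensitive per the DOI Handbook, and users often paste them with a resolver prefix. A small sketch of that clean-up (the exact input format the monitor accepts is not specified here):

```python
def normalize_doi(raw: str) -> str:
    """Strip common resolver prefixes and lowercase a DOI
    (DOI names are case-insensitive per the DOI Handbook)."""
    doi = raw.strip().lower()
    for prefix in ("https://doi.org/", "http://doi.org/", "doi:"):
        if doi.startswith(prefix):
            doi = doi[len(prefix):]
    return doi

def load_doi_list(lines):
    """Normalize and de-duplicate raw DOI strings, preserving order."""
    seen, out = set(), []
    for line in lines:
        doi = normalize_doi(line)
        if doi and doi not in seen:
            seen.add(doi)
            out.append(doi)
    return out

raw = ["https://doi.org/10.1234/ABC", "doi:10.1234/abc", "10.5678/xyz"]
print(load_doi_list(raw))  # ['10.1234/abc', '10.5678/xyz']
```

De-duplicating here matters because the same DOI pasted twice in different forms (prefixed, upper-cased) is still one publication.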
Okay, and then lastly, I'm sorry this took a bit longer: in the resources and help section you can see various documentation, such as the different user actions, with more details on how they can be used, and the methodology, with the logical approach and technology. Then we have web statistics and activity logs, where you can see the web analytics for the monitor as well as actions taken by users, anonymized of course, and the logs for the deduplication of names. Whenever a user action requires your permission to be published, you will be taken to a consent form, so if you are logged in, nothing will show up before you give your permission. And lastly, we have a page for engagement and training, where you can see the previous webinars and trainings that we had, and the upcoming ones. So, very quickly, I'm going to go back to the presentation to go over the support options. I'll just keep sharing my screen. Yeah, we cannot hear you. Yes, Anna. So in my part of the presentation I will be covering the support and training aspects of the project, along with the next steps following the launch. Regarding support, we have a dedicated helpdesk system for the Irish monitor; you can see the helpdesk URL on the slide. Regarding the support services, we guarantee a response to each ticket within one business day in order to initiate the resolution process. Handling times for these issues vary; they depend on the incident and the severity type. For content or data updates, I must say that the OpenAIRE graph is updated monthly, so the resolution window for any issue regarding content and data will be from one to two months, depending on the time of the month when the OpenAIRE graph is updated.
Managers can pursue these content and data resolutions in the sandbox environment in order to test and validate them; afterwards they can be implemented in the production environment. Regarding who can be a manager and what managers do: first of all, we have two types of managers, RPO and RFO, and each organization has only one primary dashboard manager. The primary manager, apart from accessing the sandbox environment alongside all the other managers, will also have access to the OpenOrgs platform in order to deduplicate the organization names and establish the parent-child relationships with the departments or schools in their organization. We have dedicated webinars for the OpenOrgs platform that will follow in the next months, so they are coming. As for upcoming training events: we have the training for the research performing organizations on the 27th, and the training for the research funding organizations on April 15. The agenda for the RPO training will be the management of the Irish monitor platform and a short presentation of deduplication in the OpenOrgs platform, and, as I said, we are going to have a dedicated session on the linking functionality and the upload-DOIs functionality, which was already shown earlier in the presentation. Registering data sources is very positive: it increases the visibility of the institution's systems in OpenAIRE via OpenAIRE PROVIDE, and helps compliance with the OpenAIRE guidelines. Regarding the training for research funding organizations (RFOs), we are also going to cover the enrichment through text and data mining techniques and the linking and upload-DOIs functionality. You can check out our community page for the latest documents regarding the project: the reports, presentations, webinars, whatever is relevant to the Irish monitor project. You can reach out via our platform.
That is the helpdesk URL of our platform. And we also have the next steps to share; let me share my screen. Actually, maybe it's better to discuss a couple of questions instead of going to the next steps. Okay. We have already answered a few of them. Yes, there are some questions in the chat. One is: problems in the analysis of open access publications are the quality of the metadata and guessing whether the publication is available in open access, as in most cases the metadata does not include this information; in most cases the license is missing. There is also a problem in identifying and normalizing authors: the ORCID identifiers of the authors can be very helpful, but in most metadata these identifiers are not given for authors, and the affiliations of the authors of the publication are not listed either. Another problem is the classification of publications into field-of-science fields, as some research results are multidisciplinary. The biggest problem is certainly that there are many publishers who publish articles in open access that are not included in any databases, for instance Crossref, DataCite, OpenAIRE, OpenAlex, OpenCitations; many results are in restricted access. How do you solve these problems? At the moment it is very difficult to get all this data. Okay, there are several questions there; let me attempt an answer, and my colleagues can correct me on anything I say that is wrong. So first of all, the easiest question, which is the classification into FOS fields: our field-of-science classifier allows multidisciplinarity, so this is not an issue. A publication can be classified into multiple fields and subfields, and we keep not only the one with the highest weight but all the ones with a significant weight, so you can say the publication is actually about them. Now, of course, all these issues are very valid. The OpenAIRE graph tries to solve this by integrating 130,000 data sources.
Both the major ones and smaller ones, in order to merge them: one publication is merged from several instances into one record, and therefore the final metadata record that comes out of the OpenAIRE graph includes the different elements of the different instances from all these different sources. Also, in particular for RPOs, we encourage them to register their data sources so they can work directly on data quality and see if everything is included as needed or if something is missing. So, in general, these are problems that we recognize in OpenAIRE and that we work very hard to try to fix. As for the ORCID identifiers: we do use them, and they are very helpful; the coverage isn't perfect, but okay. We do have the author names for each publication; it's just that they are not merged into an ORCID record, of course. And in terms of the affiliations, this is an ongoing process that we target using the metadata from the different OpenAIRE sources but also text mining. And, Natalia? If I may add, I think on ORCID this was a request from NORF and the board that is overseeing this, because it is the goal and the objective in Ireland to make sure that everything is connected and everyone is using identifiers. So even though we realize this is a shortcoming for the moment, we hope that as time progresses it is going to be rectified. Thank you. There are also some questions that have already been answered. One is: in the researcher profile dashboard, is there any way to rank the researchers based on publications or citations in a specific discipline? So this is a request for a new indicator, which is something that we can handle using the ORCID iDs. Of course, there has to be a bit of consensus on this, otherwise the dashboard will become, you know, full. But we'll definitely make a note; it's definitely something we can do.
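Returning to the merging step described in the previous answer, where one publication known to several data sources is collapsed into a single enriched record: in miniature, the idea can be sketched as below. This is a toy illustration keyed on DOI; the OpenAIRE graph's real deduplication is far more sophisticated and does not rely on DOIs alone.

```python
def merge_records(records):
    """Group metadata records by DOI and union their fields, so the merged
    record carries the richest metadata seen across all sources.

    Toy illustration only: keys on DOI and keeps the first non-empty
    value per field, unlike real graph deduplication."""
    merged = {}
    for rec in records:
        key = rec["doi"]
        target = merged.setdefault(key, {"doi": key, "sources": set()})
        target["sources"].add(rec["source"])
        for field, value in rec.items():
            if field in ("doi", "source"):
                continue
            if value and not target.get(field):
                target[field] = value
    return merged

instances = [
    {"doi": "10.1/x", "source": "repo-A", "title": "Paper X", "orcid": ""},
    {"doi": "10.1/x", "source": "crossref", "title": "Paper X",
     "orcid": "0000-0002-1825-0097"},
]
rec = merge_records(instances)["10.1/x"]
print(rec["orcid"])           # ORCID filled in from the second instance
print(sorted(rec["sources"]))  # both contributing sources retained
```

The point of the sketch is that a field missing in one source (here the ORCID in the repository record) can be filled in from another instance of the same publication, which is exactly why the merged record is richer than any single source.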
We would also need to check with Ireland in order to build such a ranking indicator for authors, because currently we have dedicated researcher profiles for each author without any ranking indicator. Yes, I see Susan smiling, but I think, you know, this goes back to my initial statement that this is not about ranking; this is about promoting learning from each other. So I'm not sure that the Irish board will agree on any kind of ranking. Yeah, just to endorse what Natalia is saying: it would be a little contrary to the NORF action plan to introduce a new way of ranking researchers. We are talking about a change of culture here, not replacing one broken way of doing things with something virtually identical that simply uses open access as the metric. So I can't envision that, but of course we would like to make sure that this monitor is useful from a researcher's perspective as well. So we encourage any sort of feedback from authors and individuals, but this is not a stick we're developing, or even a sales tool. This is to help us achieve consensus on where we need to move and what actions we need to take to achieve 100% open access, and also to support that change of culture. Thank you, Susan, and thank you all for answering this. If we can have a few minutes, even if we are over time, I will take the last question here in the chat, which is: if we want to have quality data on open access, we need to get all funders to require that research results are deposited in repositories or archives. This applies to research organizations, which unfortunately do not have policies in place to require their researchers to do so. Unfortunately, this is very difficult to achieve. How have you achieved this in Ireland? How have you achieved this is a question, and of course we don't know that until we have a monitor that works.
But I will say, you know, this isn't the only project funded by NORF working towards achieving 100% open access. We have another repositories project, which is focused on building capacity in institutional repositories across Ireland, and the question will be addressed there. I think the question is phrased in a very top-down manner, but there's a real need for investment in local infrastructure and local support to increase the capacity of repositories, so that, I think, will be an outcome of the NORF projects. And the same with the mention of DataCite: NORF is also funding the development of a persistent identifier roadmap, and there is a persistent identifier task force in place. So the monitor is not the answer to all problems; achieving 100% open access is very much a multi-faceted problem, or ambition, if you want to call it that. I think the great thing about NORF is that it's actually very joined up in terms of its approach and in funding the different pieces. We haven't gotten to the point where there is 100% adoption of repositories across Ireland, but we are aiming to increase the capacity of repositories, and there's an investigation underway looking at that. So I'd encourage you to look at that NORF project as well. Thank you, Susan. I don't know if Natalia would like to mention anything about the next steps. Thank you, Susan. I think it's a holistic approach: the monitor is just the tip of the iceberg; all of the rest is what makes the difference. Perfect. Okay. Thank you so much, everybody, for being with us today in this webinar. If you have any questions or you would like to reach out to us, please feel free to do so via our channels, social media, or the helpdesk; we will be very happy to hear about your specific case and to build the monitor for you. And we are looking forward to showing you the next steps in Ireland.
So thank you so much to Susan and Jake for the collaboration so far, and see you at the next meetings for the RPOs and RFOs. Thank you to the researchers, the universities, and the research centers in Ireland, and we are looking forward to the improvement of this monitor. Thank you so much. Bye. Thanks a lot, everyone. Take care for the rest of your day. Goodbye. Bye. Thank you.