So, hello everyone, and welcome to our second GraspOS training session. Today, Leonidas Pispirica from OpenAIRE will present the OpenAIRE Monitor service for about 30 minutes, and afterwards you will have the chance to ask questions. So, Leonidas, you can now share your screen. Thank you.

Thank you for the introduction. As already mentioned, I'm Leonidas Pispirica. I work on OpenAIRE Monitor and I'm the technical manager of the indicators for the institutional dashboards. Today's session is a first-level training on the OpenAIRE Monitor service. We will look at some general points about monitoring and research assessment and at OpenAIRE Monitor, then follow up with a demo of the service. We will also discuss a few data quality issues, so that you can have the best version of the dashboard, with the most enhanced data from the OpenAIRE Graph. Before I start my presentation, please use the following code in Mentimeter to answer three questions; we will discuss your answers during the presentation. Let me also put the voting link in the chat.

So, our first question on monitoring is: why monitor? Why should an institution monitor its research output? We have separated the answer into three parts, built around the word "thyself". The first is to know thyself: for an institution to identify the resources used to produce its research outputs, to identify the number of research outputs that its affiliated researchers produce, to understand its open science uptake via dashboards and indicators, to identify the collaborations its researchers have with other institutions or countries, to understand its visibility in the research landscape, and of course the impact of its research production.
After knowing thyself, the next step is to understand thyself: to find pathways and gain insights, and to identify opportunities for better research production and better collaborations, improving the institution's visibility and impact in the research landscape. The third part is to position thyself: monitoring assists research assessment and the decision making of strategic people inside the institution, supports reporting through various reports on research production, open science uptake, and research assessment, and of course it helps institutions tell a story about their research output, which is the number one product an institution offers to the world.

The second question is: why use OpenAIRE Monitor? The first and most important answer is simple: it's all about open science and open data. Our service offers inclusiveness, transparency, and replicability. Then we have full coverage of open science beyond publications, as we also cover datasets, software, and other research products. We also have linked science, with links between publications, research data, and other research products. Additionally, all the indicators in OpenAIRE Monitor have been co-developed with the stakeholders: the institutions involved have a dashboard in the OpenAIRE Monitor service, and our goal is to have indicators that make sense to all. And of course, OpenAIRE Monitor is fully embedded in the EOSC infrastructure, starting from the content providers and extending to the included metrics.

What are the methodological principles of our service? First of all, openness and transparency: everything is openly and clearly presented. (Excuse me for two seconds. Can you hear me now? Yes, thank you. Sorry for this.) So, after openness and transparency, we have coverage and accuracy.
All the indicators and dashboards are based on data that we collect from multiple authoritative data sources with proven, AI-driven mechanisms. We use the OpenAIRE Graph as our data source, and we will talk about this in the next slides. All the indicators are well documented and verified, and they are used by the scholarly communication community, as they have been co-developed with that community. And of course, our service is hosted on a big data infrastructure with operational workflows, which is how we guarantee timely results.

So, OpenAIRE Monitor uses the OpenAIRE Graph as its source of data. The OpenAIRE Graph is a core OpenAIRE service in which we aggregate research outputs from several data sources. Here we have a chart showing the OpenAIRE Graph pipeline, which runs every month to produce a new version of the Graph. After aggregation, pre-processing is carried out through enrichment by mining and inference, deduplication processes, and finalization processes, producing the final product, which is also used by third-party services and, of course, by OpenAIRE's added-value services such as OpenAIRE Monitor. This is the coverage of the OpenAIRE Graph: the several data sources from which we aggregate research products. And this is a simple slide showing the infrastructure, hosted at the university where we host all our services.

OpenAIRE Monitor is a dashboard on demand. We have dashboards for institutions, funders, research initiatives, and of course countries, and for each dashboard there is an OpenAIRE expert who stays in contact and interacts with the organization in order to create the dashboard. So, we talked about dashboard on demand; what does that mean? Okay, let's start from the bottom and go to the top.
As we have already stated, we have the OpenAIRE Graph, where the research products of an organization are a subset of the Graph, which is a common global asset. On top of the Graph, the OpenAIRE experts of the Monitor service have initiated, in collaboration with the scholarly communication community, community-led dashboards of indicators on institutional openness, collaboration, SDG contribution, publication production, and several other topics, which we will also see in the demo. We then create the default dashboard for each organization, and after this we interact with each organization to create its own, truly customized dashboard that it can use with its teams.

We will also go through the indicator themes in the demo. The basic themes are: research output, where we have publications, datasets, software, and other research products, broken down by interdisciplinary field of science; open science, where we have indicators on FAIRness, on access rights and open access routes, on journal business models, on article processing charges, and Plan S indicators; collaboration, with metrics on the institution's collaborations via participation in projects and via co-authorship or co-creation; and the impact theme, where we have reach and frequency via downloads and citations, and the Sustainable Development Goals. In all themes, the indicators are broken down by several fields of interest which, depending on the theme, include research product type, domains, fields of science and technology, time, countries, data sources, funders, and several others.
Regarding the features of OpenAIRE Monitor and its dashboards: we provide user-friendly dashboards with interactive visualizations, where you can zoom and see the data behind each visualization. You can export visualizations to include them in a report or even a website, and you can download the data and the visualizations in different formats. We offer filtering capabilities, for example filtering by year. We also offer browsing via direct links to OpenAIRE Explorer, where you can browse the research products used to create the indicators. And we provide partial editing of visualizations, where you can change things like the height and width of the charts, or even their order.

You can also control how you share your institutional dashboard. You have three options: themes and indicators can be in public, restricted, or private mode. Public mode is open to everyone. Restricted mode is only for team members, that is, users registered with an account on your institutional dashboard; it is mostly used to check metrics and indicators with the institution's internal team before showing them publicly to everyone. And there are also private indicators, for example indicators that are in progress and that you don't want to share even with team members; only the dashboard managers are able to view them.

All of this is customized and validated in one-to-one sessions between an OpenAIRE expert and each institution. In these one-to-one expert consultations, we discuss and finalize the dashboard, adjust dashboard features, editing, and user management, gather feedback about the metrics and indicators, and we can also discuss and develop custom indicators that you might want to have in your dashboard.
And of course, we discuss data quality and validation: the data sources that we use and the deduplication of the organization, which we will talk about in a later slide.

So, before we go to the data quality issues, let's have a demo of the OpenAIRE Monitor service. To save time, I've left out the registration process, which is very simple: you just go to the OpenAIRE Monitor website, press the Get Started button, and get in contact with us, and then we start the process of creating a dashboard. For the sake of this demo, we will use the dashboard of the University of Minho. Let's go through the indicator themes, and then we will go to the admin menu to see the dashboard functionalities.

First of all, we provide an overview theme with a few indicators giving overall information about the institution; anyone who wants to go deeper and see more indicators can go through all the other themes. In the numbers of the institution's research production we have the number of publications, datasets, software, and other research products. We also have a chart of publications over time, where we compare all publications (shown here as columns) to the peer-reviewed publications and the grant-supported publications. You can also see the interactivity as I hover with the mouse over these filters; for example, we can remove "peer reviewed" and "all publications" and keep only "grant supported". And there are also zoom capabilities for charts where you might want to see further details. We have datasets over time; publication openness over time, where all publications are compared to the open access and gold open access ones; dataset openness; and the top 20 collaborating organizations by number of co-authored publications.
We also provide the publications by fields of science and technology (level one, over time) and the publications by Sustainable Development Goal.

For the funding theme, on the overview we have the project participations of the institution and, specifically, its European Commission project participations. The project participations are broken down by funder in the next chart, then by framework programme and over time: FP7, Horizon 2020, and Horizon Europe. We also have the top projects by number of peer-reviewed publications, and the top projects by number of datasets, by number of software, and by number of other research products, respectively. We have a timeline of project participations. We have the grant-supported publications broken down by level-one field of science, by funder of the supporting grant, and by framework programme over time.

In the EC projects by framework programme, we have more detailed indicators for each framework programme. For FP7 we have the project participations broken down over time and by programme. For Horizon 2020 we have the number of project participations and the total project funding; here we can note that this amount includes the entire budget of each project, not only the budget allocated to the institution, and we plan to refine these indicators in the coming months. We have the participations broken down by pillar, the top 15 programmes, the total project funding by pillar, the top 15 programmes by total project funding, and of course the top 15 highest funded projects by total project funding. We are currently building the corresponding indicators for Horizon Europe.
In the research output theme, we have the publications, datasets, software, and other research products. We will go through the publications section, where we have the number of publications broken down by type, normalized: article, book and book chapter, conference output, thesis, and all the other types. We have the publications over time in combination with type, a timeline of publications by type. We have the top 15 data sources by number of publications, that is, the top 15 data sources from which we collect publications affiliated to the institution. We have the set of projects that produced publications, comparing all project participations of the institution to the ones that produced publications each year, and also the most productive projects by number of publications. We also break the publications down by level-one field of science, and by levels one and two of fields of science and technology. Here you can again see the interactivity, as we can zoom in to see more details for columns that are not visible at the initial zoom level of the chart. Similarly, we have indicators for the peer-reviewed publications; I will not go through them because they are the same, with one difference: here we also have the top 15 publishers by number of peer-reviewed publications and, of course, the top 30 journals by number of peer-reviewed publications.

So, moving to the open science theme. First of all, we have a few composite indicators: the openness and FAIRness (findability) scores, as a proportion and as a proportion over time, for publications. In the open access section, we have several indicators on the access rights of the publications.
We have the publications by access rights and also a timeline by access rights over time: the open access publications, the embargoed, restricted, and closed access ones, and those for which we could not get any access-rights information from the data sources we collected them from. We also break down the publications by access rights and by funder; in that chart you can also zoom in to see the publications per funder.

In the open access route section, we have the publications by open access route. Here I want to note that for the definitions of the open access routes we use the Unpaywall definitions, and all of these are documented in the Resources menu, in the methodology and terminology pages; each indicator is also documented on the indicators for research institutions page, where you can see a detailed breakdown for each theme and each indicator. Continuing, we have the publications by open access route over time. There is a very interesting chart where we present the published versus the deposited (green) open access publications over time: the open access publications that are published by a publisher, the open access publications that exist in a repository (the green open access), and the open access publications that both exist in a repository and are published by a publisher, for example preprint versions which, in many institutions, researchers deposit in their institutional repository. And we have the top journals by number of gold, hybrid, bronze, and green open access publications, respectively, and of course the breakdown by fields of science.

I will go through the collaborations theme quickly: we have the collaborations by country, the number of collaborations, the top 20 collaborating organizations by number of project collaborations, and then co-authorship and co-creation.
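To make the Unpaywall-style open access routes mentioned above concrete, here is a minimal, simplified sketch of how a publication could be classified into gold, hybrid, bronze, green, or closed. The input flags and the exact decision order are illustrative assumptions, not the actual OpenAIRE Monitor implementation or the full Unpaywall logic:

```python
# Simplified, hypothetical sketch of Unpaywall-style OA-route classification.
# The boolean inputs are assumed fields, not the real OpenAIRE Graph schema.

def classify_oa_route(in_fully_oa_journal: bool,
                      free_at_publisher: bool,
                      has_open_license: bool,
                      in_repository: bool) -> str:
    """Return one of: 'gold', 'hybrid', 'bronze', 'green', 'closed'."""
    if free_at_publisher:
        if in_fully_oa_journal:
            return "gold"    # free to read in a fully open access journal
        # subscription journal: open license => hybrid, otherwise bronze
        return "hybrid" if has_open_license else "bronze"
    if in_repository:
        return "green"       # only a repository copy is freely available
    return "closed"
```

For example, a publication that is not free at the publisher but has a copy in an institutional repository would come out as "green", matching the published-versus-deposited distinction in the chart described above.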
We have the number of co-authored publications, the top 20 collaborating organizations by number of co-authored publications, and the breakdown by country.

Additionally, in the impact theme, in the reach and frequency section, we have indicators on downloads: the total downloads and the average downloads per publication. Note that here we calculate the average only over publications that have at least one download. We break the downloads down over time both by the year the publication was published and by the year of the download. We have the total downloads over time by access rights and by access route (these are by year of download), and by abstract availability, comparing publications that have an abstract to those that do not. We also have the top 15 journals by number of publication downloads from repositories; these downloads mostly come from repositories that are registered in OpenAIRE and have installed the OpenAIRE plugin through which we retrieve the downloads. We have the most downloaded fields of science and the breakdown by fields of science. The same applies to the citations: we have the total citations and the average citations per publication, also broken down over time, by year of publication and, in the next charts, by year of citation.

While the charts load, let me go back to the open science theme, in the publications section, and also check the journal business models section, where we have the open access publications by journal business model: the fully open access publications with APCs, the diamond open access publications, and the ones published in a transformative journal. And of course we also have the top 15 journals for fully open access with APCs, diamond, and transformative journals, and then the article processing charges section.
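The average-downloads rule described above (averaging only over publications with at least one download) can be sketched as follows; this is an assumed reconstruction of the stated rule, not the Monitor's actual code:

```python
# Sketch of "average downloads per publication" as described in the talk:
# publications with zero downloads are excluded from the average.

def average_downloads(downloads_per_publication: list[int]) -> float:
    """Mean downloads, counting only publications with >= 1 download."""
    downloaded = [d for d in downloads_per_publication if d > 0]
    if not downloaded:
        return 0.0
    return sum(downloaded) / len(downloaded)

# With counts [0, 0, 4, 8], the average is 6.0 (over the two downloaded
# publications), not 3.0 over all four records.
```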
At this time, all the APC indicators include publications that have at least one affiliated co-author; these APCs are not necessarily the ones paid by the institution. We are in the process of adding indicators based on the reports that institutions submit to OpenAPC. We have the total APCs and the average APC per publication over time, also broken down by fields of science and by open access route, and of course the top 15 journals by total APCs and by average APC per publication.

So, let me go back to the presentation, because I think we are running over time, to discuss a few things about data quality and validation. One of the important steps that every institution needs to take after we create the institutional dashboard, and of course in a one-on-one session with us, is to disambiguate the organization in OpenOrgs, the organization platform of OpenAIRE. There you need to deduplicate the organization names, as an institution might appear under different names depending on the data source from which OpenAIRE aggregated its metadata, and you also need to identify any parent-child relationships regarding departments, schools, or other parts of your institution. The second step is to register your institutional data sources: the repositories, open access journals, or CRIS systems you have at your institution. By registering and validating your data sources in OpenAIRE PROVIDE, you enhance the data quality of your institutional dashboard. Here is a simple screenshot from the OpenOrgs platform: as you can see for the University of Minho, at the bottom of the screen are the different names under which the institution appears, depending on the data source from which we aggregated metadata.
And why register your repository in OpenAIRE? Apart from enhancing the data quality of your dashboard, you ensure that the open access publications from your repository are accurately reflected, and you facilitate comprehensive coverage in OpenAIRE Monitor. The prerequisites to register your repository in OpenAIRE: you need to expose your metadata records via the OAI-PMH protocol and comply with the latest version (version 3 or version 4) of the OpenAIRE guidelines. The benefits: you have visible and richer content; you gain insights from detailed usage statistics, through OpenAIRE Monitor and also through the OpenAIRE PROVIDE dashboard; you have the option to enrich the metadata of your repository records, as via the OpenAIRE Broker we can provide you with enriched metadata for your repository's records whenever we find richer metadata in the other sources from which we collect research products; you can simplify open access compliance and promote effortless content dissemination via our services; you get linked science, with links to and from several research products; and you have an up-to-date collection. Of course, everything is fully embedded in the EOSC infrastructure, and all of this helps to have accurate, high-quality metrics.

Before I close this presentation, let me note that OpenAIRE has this year been awarded a contract, as the result of an open public tender, to develop a national open access monitor, based on public open data, to analyze and track the progress towards 100% open access for Ireland. We are in the process of creating this dashboard, and of course one of its parts will be the institutional dashboards, the research performing organizations' dashboards. Thank you for your time.

So, thank you, Leonidas, for delivering this presentation.
I think the presentation and the walkthrough of the University of Minho dashboard were very helpful for understanding what OpenAIRE Monitor is capable of. Now, to our audience: if you have any questions about OpenAIRE Monitor, please feel free to raise your hand or submit them in the chat. Thank you. So, anyone? Until someone has a question, let's go through the Mentimeter questions and the answers received. Just a minute. India has a question.

Yeah, hi. I just want to thank you for this presentation. I was wondering about the figures that you give on investments in projects, these budgets. Where do they come from? Because that's something that institutions don't easily share. Is it from the EC themselves, or where do you get these figures?

They come from the European Commission, from CORDA. I can look up the detailed list of data sources, but I think the main source we receive them from is the CORDA platform of the European Commission, and also from other funders that provide information about their projects.

Okay, thank you.

Thank you. We have one question in the chat, if you can see it, Leonidas.

Thank you. Well, you can contact us: through the OpenAIRE CONNECT service we have the option to create a community in OpenAIRE, and through CONNECT we can also have a dashboard for your network. But please get in touch with us to discuss further details, as we need some information about you and the collaboration in order to set this up. This is something we are already doing: when there is a big organization that comprises more than one university, we usually create a CONNECT community dashboard. The indicators are then basically the same as the Monitor's indicators for institutions, but in that case we measure the network's openness indicators.

Thank you. Now, if you see the next question.
Yes, regarding the deduplication process through the OpenOrgs platform. As OpenAIRE, when we collect the metadata records from the data sources, we also collect the affiliations. These affiliations might come with different names for each institution. We keep them separately in the OpenOrgs platform, and the administrator of the organization can see the proposed names that the platform has identified as possible duplicates and confirm that a name is indeed a duplicate, that it belongs to their institution. It cannot be fully automated, because there are cases where a name is similar to the institution's name but is not actually the same institution; that's why we don't automate it. We also have training sessions for the institutions' administrators on using the OpenOrgs platform. And yes, the dashboard can also work for small research organizations: as long as there are research products and research output, of course it works.

Thank you. Anyone else? Okay, then let's see the results from the Mentimeter. For the first question, most of you answered that you are aware that there are research monitoring and assessment activities in your institution, but you lack awareness of the specifics, followed by those who are well informed about the specifics; the third option was for those who are not aware of such activities at all. Regarding your institutions' engagement in research monitoring: it would be good to take this possibility of having a dashboard in OpenAIRE Monitor as an opportunity to become aware of the specifics of these activities in your institution, and maybe you can involve the team that runs them so that they take part in your institution's dashboard. Regarding the indicator themes, in first place we have the impact indicators, followed by funding and research output production; then we have open science and collaboration.
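The point above about why organization-name deduplication cannot be fully automated can be sketched with a toy similarity check. This is a hypothetical illustration (OpenOrgs does not necessarily use this method): string similarity can only *propose* candidate duplicates, because similar names may belong to different organizations and genuinely identical organizations may have dissimilar names, so a curator confirms each pair:

```python
# Toy sketch: propose candidate duplicate organization names for a human
# curator to confirm. Purely illustrative; not the OpenOrgs algorithm.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Case-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def suggest_duplicates(names: list[str],
                       threshold: float = 0.8) -> list[tuple[str, str]]:
    """Return candidate pairs above the threshold; curator confirms each."""
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if similarity(a, b) >= threshold:
                pairs.append((a, b))
    return pairs
```

Note that any fixed threshold both misses true duplicates (e.g. a translated name in another language) and flags false ones (e.g. two different departments with near-identical names), which is exactly why the confirmation step stays manual.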
Okay, I believe all the themes are important for the qualitative research assessment and monitoring of your institution's research output.

On the additional metrics that you would like to see included in the institutional dashboard, we see "open science metrics that are not about publications": for now we already provide metrics for research data, software, and other research products, but we can discuss in detail any further metrics you would propose. "Research data": okay, we also provide metrics for research data. "Citizen science activities": this is a very interesting metric; we will note it down and investigate it. "Normalized indicators by research field, publication type, and year": at this time we have the breakdowns by field of science and technology, by publication type, and by year. We also have some indicators regarding FAIRness, but we are in the process of improving them.

Okay, so, do we have any other questions? No more? Then I will share here the link to a feedback form, so that you can give us your thoughts. I will also send it again tomorrow, so if any question comes up after this session, we can still address it. So, thank you all for participating in this training event, and thank you, Leonidas, again, for your valuable contribution. Happy holidays. Thank you all.