The seminar will be presented by Ioana Grigpatti, who will present the OpenAIRE Monitor institutional dashboard and show you the features and the recent developments made in this service. And Leonidas Pespedingas from OpenAIRE will do a demo of the service, for you to get acquainted with how to use it. I would also like to share some news about our next webinar, which will happen on the 13th of July. It will be on another OpenAIRE service, UsageCounts, the usage statistics service of OpenAIRE. So if you want to know more about it and the recent developments made in that service, please join us; you can go to the OpenAIRE portal, to the webinar section, and already register yourself. After the webinar we will have a Q&A session where you can address your questions and comments to the speakers. And now I give the floor to Ioana and Leonidas. Thank you so much.

Hi. Thank you, Paula. So, as Paula said, my name is Ioana Grigpatti. I'm a former econometrician and current data product owner at OpenAIRE, which means that I worry about indicators and data all the time, every day. I will give a brief presentation about the service, and then Leonidas, who is the institutional dashboard guru, is going to give you a live demo, and hopefully nothing will crash. OK, so let's start; I'm assuming you can see my screen.

The OpenAIRE Monitor can be found at monitor.openaire.eu. It is a monitoring service, big surprise. When we built the service, we wanted it to be used for the monitoring and evaluation of research activities in a reliable, consistent, and timely manner, so that users can understand the research funding pathways along different dimensions of interest; assess open science policy uptake and the impact that research activities have on society as a whole; see what works and what doesn't; identify weak spots; reveal hidden potential; implement good practices; showcase their performance, for example to funders or to other policymakers; and turn all the data that we have in OpenAIRE into insights, in an automated way of course, to lead to evidence-based decision making. I guess that for everyone who is interested in a monitoring platform, these are the key points that have to be addressed. So we built the platform trying our best to satisfy all of these, so that it can be used for monitoring and decision making, analysis, and then storytelling and reporting.

There are three types of dashboards in the OpenAIRE Monitor, and they are on demand, in the sense that you have to tell us that you want one. First, the institutional dashboard, which will be the focus today: we have 12 dashboards already and nine more on the way, and Leonidas is the co-design expert. We also have dashboards dedicated to research initiatives and to funders, where for research initiatives and research communities Alessia Bardi is our expert, and Harry Dimitropoulos takes care of everything related to funders. We separated the people among the different types of dashboards to really build expertise on the needs of each particular type of organization.

The guiding principles: of course, since it's OpenAIRE, it's all about open data, and our guiding principles are openness, usability, and replicability. So what is the Monitor? It is a data and visualization monitoring platform built on top of the OpenAIRE Research Graph, which includes linked scholarly communication records from around the world.
And it covers different research outputs besides publications: data, software, and other research products. It is based on open science principles, which means open data sources, open APIs, and well-documented metrics and indicators. We have a dedicated methodology page on the Monitor that we keep building and improving, so one can easily see how something is constructed. Also, in the one-to-one sessions that we have, we try to make sure that we cover all the aspects that are relevant for an institution, so we try to meet the community requirements for the different types of dashboards. Basically, our main aim is to build trust in our indicators and our data.

OK, so what is it exactly? We think it is a user-friendly data and visualization platform, and I'm going to list some functionalities that we have. I will summarize them here to give you an idea, and then Leonidas will present them in detail in the demo. There are exporting capabilities: the data and visualizations can be downloaded, for analysis or to put in reports, and they can also be embedded in other websites, for showcasing purposes. There are filtering functionalities, so that the entire set of indicators is filtered by a particular aspect. There is research product browsing. What does this mean? It means that there is an area in the platform where you can individually see the research outputs of the institution that are behind the indicators. Say you are viewing a timeline and there is a jump in a particular year. You can go to that section, to that particular year, and see all the research outputs that are in OpenAIRE for your institution for that year, the ones that have been used in the calculation of all the indicators, and then see if something is missing, what's causing the jump, and so on, and download that table as well. And there is partial editing of visualizations; I say "partial" because you cannot affect the data that we show, right?

So this dashboard is customized and validated in one-on-one sessions with the experts. Now, the level of contact is up to you. Our experience so far shows that we have a one-to-one session where we present the different functionalities and show the indicators, and then we get feedback on whether additional indicators or visualizations should be shown. After the dashboard manager or the user has had the chance to play with the dashboard, we usually have another session to discuss data quality. Sometimes, for example, they may notice that a particular data source is missing; then we add it, and it shows up there, and so on. And then there can be additional exchanges if needed, but usually after two or three we have a stable dashboard that people are using.

Each indicator/visualization can have one of three settings, which the users themselves decide after logging in. It can be public: these are indicators that would be used for showcasing, so you tell a potential funder, please go look at my dashboard over there, and you can see what I have done in the last years and so on. There are restricted indicators, where someone has to sign in and be invited by you to view them; these are usually used for teams, so the manager invites the team members by email, and then they have access to this restricted set of indicators. And then there are private indicators, which are for work in progress, so they're not shown to anyone besides the institutional manager.
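As an aside, since the platform is built on open data sources and open APIs, the data behind a dashboard can also be pulled programmatically, not only through the export buttons. Below is a minimal sketch against the public OpenAIRE search API at https://api.openaire.eu/search/publications; the endpoint and the `keywords`, `format`, `size`, and `page` parameters are part of the public API, but treat the exact JSON response structure assumed here as something to verify.

```python
import requests

API = "https://api.openaire.eu/search/publications"

def fetch_publications(query, pages=2, page_size=10):
    """Page through JSON results from the public OpenAIRE search API."""
    for page in range(1, pages + 1):
        resp = requests.get(
            API,
            params={"keywords": query, "format": "json",
                    "size": page_size, "page": page},
            timeout=30,
        )
        resp.raise_for_status()
        # Assumed layout: {"response": {"results": {"result": [...]}}}
        results = (resp.json().get("response", {})
                   .get("results") or {}).get("result", [])
        if isinstance(results, dict):  # a single hit may arrive as an object
            results = [results]
        if not results:
            break
        yield from results

# Example: peek at the first few metadata keys of each record
for record in fetch_publications("open science monitoring"):
    print(list(record)[:5])
```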
So, from our side, when there are new indicators, we upload them to the profiles and the manager is notified. They can change the setting of the indicators if they like them; they can change them to public to showcase them to the rest of the world, and so on. But they receive a notification. And then one is obviously able to play with the public dashboards of the other stakeholders, if they wish.

OK, in terms of the indicator themes, I'm not going to cover all the indicators that we have. Although, in the next month, besides viewing a dashboard, you will be able to go to the resources part of the Monitor and download the full set of indicators, with their definitions and construction, which we believe will be useful. Currently we have indicators with respect to funding, so grants, projects, and so on for an institution; the different types of research outputs; open science, where we cover FAIRness, access rights, article processing charges, and journal business models; networks and collaborations, where right now we have some material with respect to funded publications, but there's more coming; and impact, where right now we have indicators with respect to reach and frequency, which have to do with the downloads from different sources and so on. The roadmap for indicators that will be completed by early fall 2022 is, first of all, a set of improvements suggested by the users of the Monitor for the indicators we already have, so we're working on this; then a set of indicators on Plan S compliance; indicators that include the UN Sustainable Development Goals, so research output by goal and so on; and indicators that include the different fields of science, down to FOS level three at this point.

OK. Now, in order to really understand what's going on, reveal those weak spots, and make decisions, we provide visualizations that break down the indicators by different aspects of interest. If additional aspects of interest are required, so different metadata elements for the breakdown, this can be added very easily, because we have a tool that we use internally to create the indicators directly from the OpenAIRE Research Graph. Some of the breakdowns that we have now are the research activity type, so publications, data, software, and projects; time; different data sources; different organizations; top journals; and so on. The ones that you will see in the demo are just an example of what previous users have found interesting, but this can be augmented.

Now, the data behind the indicators; priority number one, let's make sure we show something that makes sense. For those of you that didn't know already, we show the data of the OpenAIRE Research Graph, as I said before. This picture represents the pipeline of the OpenAIRE Research Graph. There is aggregation from different major sources, which is then deduplicated, so that different records of the same publication, for example, are merged into one. And you are always able to access all of the records, the original instances as we call them; they do not disappear. Then this is enriched with additional metadata via text mining. The records are cleaned, and then they're ready for statistical analysis, which is what you see in the Monitor and in our fellow monitoring service, the Open Science Observatory, and also for indexing, which is what you see in Explore and in other services and other places. Okay.
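The deduplication step just described, merging different records of the same publication while keeping the original instances accessible, can be pictured with a toy sketch. This is an illustration of the idea only: it keys on an exact, case-normalized DOI, whereas the real graph deduplication uses much richer matching than that.

```python
from collections import defaultdict

def deduplicate(records):
    """Merge records sharing a DOI into one representative record,
    keeping every original record as an 'instance' (they never disappear)."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec["doi"].lower()].append(rec)

    merged = []
    for doi, instances in groups.items():
        merged.append({
            "doi": doi,
            # naive choice of representative title: the longest one
            "title": max((r["title"] for r in instances), key=len),
            # the union of sources enriches the merged record
            "sources": sorted({r["source"] for r in instances}),
            "instances": instances,  # original records remain accessible
        })
    return merged

records = [
    {"doi": "10.1/X", "title": "A Study", "source": "Repository A"},
    {"doi": "10.1/x", "title": "A Study of Things", "source": "Journal B"},
]
print(deduplicate(records)[0]["sources"])  # ['Journal B', 'Repository A']
```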
Extremely important for the institutional dashboard is organization disambiguation. In the metadata of research output records, we see one particular organization under lots and lots of different names, and the number of names keeps increasing as we merge more and more records. So what did we do? We developed the OpenOrgs tool, which is super easy. You go and enter your organization name, for example, and it automatically presents all the duplicates, organizations that have almost the same name; you go and say, okay, this is a duplicate, this is a duplicate, and you validate them quickly. And we have also added, by "we" I mean our friends at CNR, the ability to identify parent-child relationships, so you're able to say that the University of Athens is the parent and the Department of Economics, University of Athens is the child. The minute this is validated, when the curation is complete, it immediately goes into the next round of updates of the OpenAIRE Research Graph, which means that the institutional dashboard that is created with one organization ID will now contain the information of all the different alternative names that the organization was showing up as. In order to do this deduplication, we ask the managers of the institutional dashboards to do it as well. On June 28th at 11 am CST we will have a training on how to do this, where, if you would like to have an institutional dashboard, we will give you access and so on; the presentation includes the Zoom link, and you can see how to do it. This massively improves the data available in the dashboard, because we bring together all the outputs that belong to the right organization. Okay.

Just the last thing to say before we move to the demo: as we said before, our priority is data quality, and of course indicator quality as well. So we work closely with the users of a dashboard to make sure that all the major research output outlets are integrated into OpenAIRE, and that the project outputs are correctly identified. We have in OpenAIRE right now 25 funders, and we're increasing the number, and we're using inference, tracking acknowledgements in footnotes and so on, to track project outputs even after the project ends. And there is a quality verification with the funders themselves as well, validation and curation, let's say. We link the different research products, so one is able to see the links between a preprint and a publication and data and software, if it all comes together. And one can also verify that all their output is correctly counted in the output of the country, because a lot of users are using OpenAIRE to look at country-level indicators as well. So we put a lot of effort into data quality, and we welcome our users to work together with us on this. The monitoring dashboard makes it easy to do so, because you can see a breakdown, for example by content provider, and one can easily identify if something is missing. And we make it so that the records are deduplicated, clean, visible to you, and searchable. We really want to build trust in these indicators, to make the dashboards useful. So I spoke enough; let's move to the demo. There is a link in the slides where you can access a public dashboard as well. Thank you, Leonidas.

Thank you, Ioana, for your presentation. So now we will proceed to the demo of the institutional dashboard of the Monitor service.
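Before the demo, a quick sketch of the payoff of the OpenOrgs curation Ioana described: once duplicates are validated and parent-child links are declared, outputs recorded under any name variant roll up to the one canonical organization behind the dashboard. The names and IDs below are invented for illustration; this is not the actual OpenOrgs data model.

```python
# Invented aliases/IDs: validated duplicate names map to a canonical ID,
# and validated child organizations map to their parent.
ALIASES = {
    "Univ. of Athens": "org::uoa",
    "University of Athens": "org::uoa",
    "Dept. of Economics, University of Athens": "org::uoa-econ",
}
PARENT = {"org::uoa-econ": "org::uoa"}

def canonical(affiliation: str) -> str:
    """Resolve a raw affiliation string to its top-level organization."""
    org = ALIASES[affiliation]
    while org in PARENT:  # climb the validated parent-child hierarchy
        org = PARENT[org]
    return org

outputs = ["Univ. of Athens", "Dept. of Economics, University of Athens"]
counts = {}
for name in outputs:
    org = canonical(name)
    counts[org] = counts.get(org, 0) + 1
print(counts)  # {'org::uoa': 2} -- both outputs count for the parent
```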
Okay, here we see the homepage of the OpenAIRE Monitor. First of all, you log in with your institutional account; I'm performing the login with my OpenAIRE account. Then, by clicking browse, you get the list of the organizations that have a dashboard in the OpenAIRE Monitor service. We are interested in the institutional dashboards here, and you can find your institution. By clicking manage, as a dashboard manager, you go to the admin page of your institution, where you have three tabs. In the general tab there is information about your institution, such as the name, and some internal information that is not editable, such as the alias and the index ID that comes from OpenOrgs, the index name, and the index short name. Here you can change the name for your institution, for your profile, you can add a description for your institution, and you can upload a file or share a URL for the logo of your institution. And of course you can also set the visibility status of the dashboard, where, as Ioana mentioned in her presentation, there are three types: the public status of the profile, the restricted, and the private status. The same levels also exist in the indicators tab, which we will discuss right after.

Now, before we go to the indicators tab, let's see the users tab. Here there are two user levels, the managers and the members. The dashboard managers are the administrators of the institutional dashboard; they can perform all the functionalities that we are describing in this demo, and they can also invite other users to become managers or members. Members are users that are only able to see indicators and categories that are in restricted mode. Here, as a manager, you can invite other users from your institution, for example the team you want to create for the internal monitoring of the indicators of your institution.

On the indicators tab, we have the five main categories that Ioana mentioned in her presentation, which are funding, research output, open science, collaborations, and impact. Let me mention here that each of these main categories can be set to public, restricted, or private, as can any category and subcategory in each tab and of course each individual indicator, and the status setting follows the hierarchy. For example, if we set the funding category here to private, no one except the institutional managers will be able to see the indicators or the subcategories of the funding category, even if you set them all to public; see the sketch after this section.

So let's continue to the first category, funding, where we have the projects' information in the overview and Horizon 2020 grants tabs. In the overview tab, we have two number indicators, regarding the institution's project participations overall and the institution's project participations funded by the European Commission. Following up with the chart indicators, we have the project participations of the institution by funder, and the European Commission project participations of the institution over time, for FP7 and Horizon 2020. Let me mention here that all the graphs are interactive; here, for example, you can hide any set of information from the labels of the graph. You can also set each indicator to restricted, private, or public, as we mentioned earlier.
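The status hierarchy just explained amounts to taking the most restrictive setting along the path from category to subcategory to indicator. Here is a minimal sketch of that resolution rule; an illustration only, not the service's actual implementation.

```python
from enum import IntEnum

class Visibility(IntEnum):
    PUBLIC = 0
    RESTRICTED = 1
    PRIVATE = 2

def effective(path_settings):
    """Effective visibility of an indicator is the most restrictive
    setting along its category -> subcategory -> indicator path."""
    return max(path_settings)

# A 'funding' category set to PRIVATE hides a PUBLIC indicator inside it.
print(effective([Visibility.PRIVATE, Visibility.PUBLIC]).name)  # PRIVATE
```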
Additionally, by clicking edit on an indicator, you have the option to change the title of the graph, and you can also write a description of the graph. For example, if in a specific chart you see a large jump in the data from year to year, and you have some internal information from your institution that can justify this jump, you can write it down in order to inform your users why they see this jump in the chart; then, when you hover over the chart, you can see the description. You can also change the status, and you can change other information such as the width and the height of the chart, and also the chart type, to pie, line, or bar accordingly.

Well, let's continue with the description of the indicators. Apart from the project participations by funder, we also present the project participations and research output by the number of projects and by the type of research output produced by the projects; we mean projects with publications, projects with software, with datasets, and with other research products. We also have the project publications by open access route: green open access, gold open access, hybrid, and bronze open access. And we have four charts with the top projects by number of research outputs, for the publications, datasets, software, and other research products that are produced and affiliated to the institution.

In the Horizon 2020 grants tab, as number indicators we have the Horizon 2020 project participations of the institution, the total project funding in euros, and the average per-project funding in Horizon 2020. We present the project participations by pillar and by program, and we also present the top 20 programs of Horizon 2020. We also present the project funding by pillar, as well as the top Horizon 2020 programs by project funding, the average per-project funding over time, and the top Horizon 2020 projects by total project funding. Here I should mention that this data comes from the European Commission, that the data is current, and that there is a constant data exchange between OpenAIRE and the European Commission. We are also in the process of incorporating the Horizon Europe project metadata, which is around 1,500 projects.

So let's continue to the research output category, where we have the publications, datasets, software, and other research products tabs accordingly. As a number indicator we have the total number of publications of the institution. We present the publications by type, the number of publications over time, and the number of publications by data source type, meaning the publications that have been aggregated from and are found in a specific type of data source. We also have the top data sources by number of publications and the top journals by number of publications. Accordingly, we present the information for the datasets: by type, over time, by data source, and the top data sources. Here we have only one dataset registered for this specific institution. On the software tab, as you can see, there is no software for this institution, and in other research products we present the same information accordingly; of course, here we don't have the top journals, because we are not talking about publications.

Following up is the open science category, where we have the open access and the FAIRness subcategories.
In the open access subcategory, in the access rights tab, as number indicators we present the number of research outcomes by access rights; research outcomes here means the publications, datasets, software, and other research products. For some reason the numbers are not shown here, which is strange. But here you can see an important piece of information: the number of closed access publications for the University of Göttingen is almost the same as the number of open access publications, which means that OpenAIRE also aggregates information and metadata regarding closed access publications. Okay, regarding the chart indicators, we present the publications by access rights: open access, closed access, embargoed, and restricted. We also have the datasets, software, and other research products by access rights, and we present the same in a chart over time, for each year, according to the information we have aggregated.

In the open access routes tab we present the publications by open access route: green open access, gold, hybrid, and bronze open access; the same information over time; and also here we have the green open access separated from the gold, hybrid, and bronze open access, in order for you to see the difference between the numbers of these publications. We also have the published open access versus the deposited open access over time. The deposited open access is the green open access publications; we show the published-only, the deposited-only, and the publications that are both published and deposited open access. Then we present four charts with the top journals by the corresponding access route: top journals by gold, hybrid, bronze, and green open access publications. In the transformative journals tab, we present the publications that have been published in transformative journals over time, both open access and non-open access publications. Then, in the diamond open access and Article Processing Charges (APCs) tab, as number indicators we have the number of publications with Article Processing Charges and the number of diamond open access publications, which are the ones without Article Processing Charges. In the chart indicators, we also present the open access with APCs and the diamond open access publications over time. Here we see that there are diamond open access publications, but very few in number compared to the open access with APCs; that is why you cannot see the column in the whole chart, but the interactivity of the graph helps you separate and see the numbers that you want. We also have the top journals by number of publications with Article Processing Charges, and the top diamond open access journals by number of publications.

In the FAIRness category we have six tabs, regarding the persistent identifiers, the abstracts, ORCID iDs, CC licenses, funding references, and duplicates. On the persistent identifiers tab we present the whole number of research outcomes, and then the ones that have a persistent identifier. We have the publications by PID type, the ones that have a DOI, Handle, PubMed Central ID, and others, and the same information regarding datasets, software, and other research products.
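Stepping back to the open access routes used throughout these tabs: the labels follow the commonly used definitions, gold for articles in fully open access journals, hybrid for openly licensed articles in subscription journals, bronze for free-to-read articles without an open license, and green for copies deposited in repositories. The sketch below is a deliberate simplification with invented field names; the actual computation in the graph uses richer journal, license, and repository data.

```python
def oa_route(pub: dict) -> str:
    """Classify a publication's open access route (simplified sketch;
    field names are invented for illustration)."""
    if pub.get("in_fully_oa_journal"):
        return "gold"                 # fully open access journal
    if pub.get("free_at_publisher"):
        # free to read at a subscription journal:
        # hybrid if openly licensed, bronze otherwise
        return "hybrid" if pub.get("open_license") else "bronze"
    if pub.get("repository_copy"):
        return "green"                # deposited in a repository
    return "closed"

print(oa_route({"free_at_publisher": True, "open_license": False}))  # bronze
print(oa_route({"in_fully_oa_journal": True}))                       # gold
```

In the same vein, the diamond open access publications mentioned above are the open access ones that additionally carry no Article Processing Charges.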
At the abstracts tab we have the publications that have an abstract in the metadata or not, and the same information presented over time, for each year. At the ORCID iDs tab we have the publications with at least one ORCID iD and the ones that do not have an ORCID iD, the same information presented over time, and accordingly the same information regarding datasets and software, wherever this information is available. In the CC licenses tab we present the open access publications that have a CC license and the ones that do not have a CC license, the same information over time, and accordingly the same information for datasets and software. And the same here for the publications with a funding reference in the metadata or without a funding reference, the same information over time, and accordingly for datasets and software. And finally we have the duplicates tab, where we present the number of copies that are available for publications in the OpenAIRE Research Graph; of course, these are potentially different versions of the same publication, for example a preprint or postprint version. We have the number of publications with one copy, with two copies, three copies, and so on, and the same information for datasets, software, and other research products.

In the collaborations category we have the collaborations via project participations. On the publications tab, we present the co-funded publications over time via funders, where we have the publications that are funded by none or one funder and the ones that are funded by more than one funder. We also present the co-funded publications over time via project participations, where the separation is between participation in none or one project and participation in more than one project. And we also have the collaborating countries by number of funded publications; we present this on a world map, which is zoomable of course, and by hovering over a country you can see the number of publications with the specific collaborating country. We have the same information for datasets, software, and other research products accordingly.

And finally, the last category, the impact category. For now we have the frequency subcategory with the downloads, which come from the OpenAIRE UsageCounts service, and the SDGs and citations subcategories are soon to come and be presented here. As number indicators we have the total number of publications, the total downloads, and the average downloads per publication. In the chart indicators, first we see the downloads per publication over time by year of download, where we present the average downloads per publication by the year that the download was made. Next there are the downloads per publication over time again, but here the downloads per publication are counted by the year that the publication was published, not the year of download; see the sketch below. Following, we have the downloads per publication by project participation over time, with participation in none or one project versus the co-funded publications, which are publications that participated in more than one project. We also present the downloads per publication with abstract over time, by year of publication.
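The two time axes just described are easy to confuse: counting downloads by the year the download happened versus by the year the downloaded publication was published. Below is a toy sketch of both aggregations over the same download log; the exact denominator the service uses is an assumption here (distinct publications per year), so treat this purely as an illustration of the distinction.

```python
from collections import Counter, defaultdict

# Toy download events: (publication_id, publication_year, download_year)
events = [("p1", 2019, 2020), ("p1", 2019, 2021),
          ("p2", 2021, 2021), ("p2", 2021, 2021)]
pub_year = {"p1": 2019, "p2": 2021}

# (1) Downloads per publication by year of DOWNLOAD
downloaded_pubs = defaultdict(set)
downloads_by_dl_year = Counter()
for pid, _, dl_year in events:
    downloads_by_dl_year[dl_year] += 1
    downloaded_pubs[dl_year].add(pid)
avg_by_download_year = {y: downloads_by_dl_year[y] / len(downloaded_pubs[y])
                        for y in downloads_by_dl_year}

# (2) Downloads per publication by year of PUBLICATION
downloads_by_pub_year = Counter()
pubs_per_year = Counter(pub_year.values())
for pid, p_year, _ in events:
    downloads_by_pub_year[p_year] += 1
avg_by_publication_year = {y: downloads_by_pub_year[y] / pubs_per_year[y]
                           for y in downloads_by_pub_year}

print(avg_by_download_year)     # {2020: 1.0, 2021: 1.5}
print(avg_by_publication_year)  # {2019: 2.0, 2021: 2.0}
```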
Then we have the repository downloads per publication by access rights: the open access and the closed access publications. These are the downloads from repositories; for example, a publication that is closed access, published in a journal, but a version of which has been found in a repository as a preprint or postprint and downloaded from there. This is very important information for the institution. We also present the same information over time. Then we have the downloads per publication by open access route; similarly we have the green, gold, hybrid, and bronze open access, and the same information over time. We also present the top journals by number of downloads per publication from repositories, and the top journals by number of downloads from repositories. You also have the charts of downloads per publication by access rights, which are the open and closed access, and the downloads by access rights accordingly, and the same information as above but over time, from 2007 until 2021.

So these are the indicators that are currently presented in any institutional dashboard. I must tell you that when OpenAIRE adds a new indicator, it will be included at the bottom of the corresponding section, and it will not change the interface as you might have already customized it through the admin interface. The default setting for each new indicator is restricted, meaning that the dashboard manager and the registered members will be able to see the new indicator, and you will receive a notification in the Monitor service when you log in, in the notification window in the right part of the page here, where I, because I'm an administrator, have a lot of notifications.

I must also show you the filtering functionality, which is not visible in the admin interface but only in the institution's profile. This is the page of the institutional dashboard where a user, whether it's public or restricted, can see the corresponding information that we just described. On the right side of the page you have the ability to filter by restricting the year range in the corresponding graphs; you can put in any numbers you want, or you can select, for example, the last five years, and all the charts that have information over time will apply the year filter that we have just set. Also, with the option to search research outcomes, you are shown a search functionality over the research outcomes behind the institutional dashboard; you can find here all the publications, datasets, software, and other research products that are shown in the dashboard. And finally, we have a methodology page with the terminology and construction. It is constantly updated with information; for example, we describe each term, from research outcome, publication, and dataset, to the green, gold, hybrid, and bronze open access routes, and also the downloads. So this was the demo of the institutional dashboard. Thank you very much.

Thank you so much, Leonidas and Ioana, for your presentation and demo of this service, showing how an institution can access these specific indicators to monitor open science. I see that Ioana was very active in the chat, so most of the questions were already answered, but some of the participants are already leaving, so we are just on time.
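As an aside before the Q&A: the year filter Leonidas demonstrated can be thought of as a single predicate applied to every over-time chart's data before rendering. A minimal sketch with invented chart data:

```python
def filter_years(series, start, end):
    """Keep only the points whose year falls inside the selected range;
    on the dashboard this applies to every over-time chart at once."""
    return {year: value for year, value in series.items()
            if start <= year <= end}

charts = {
    "publications_over_time": {2016: 120, 2019: 150, 2022: 90},
    "downloads_over_time":    {2018: 3000, 2021: 4200},
}
last_five = {name: filter_years(s, 2018, 2022) for name, s in charts.items()}
print(last_five["publications_over_time"])  # {2019: 150, 2022: 90}
```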
But for the ones who are here and have some questions to address to the speakers, please do open your microphone and just say your question or comment. Does anyone want to raise a question? I guess there is a question from Ruh: how is it preserved if we create a new indicator on each indicator's page? Perhaps you can speak up and clarify the question a little bit, please.

Okay, I think the question is: if we add another indicator, does the shape of the profile remain the same? If this is the question, then the answer is that if there are new indicators, they go at the bottom of the profile, and again, they are not viewed publicly, right, because they have to be approved by the institutional manager; and then the institutional manager can move them wherever they want, or leave them private so they are not shown on the dashboard.

Yes, yeah, there were some buttons on each indicator's page, like a plus mark to create a new indicator, and when we push that button a new window appears. If we make a graph or indicator, is it preserved automatically on this page, or should we store it on a local machine?

Yes; for now this functionality is not enabled for the dashboard manager, meaning that you cannot create your own indicator. If you want a new indicator, you can contact us, we have a meeting and we discuss it, and then we create it for you. Actually, we have a redesign that is going to be pushed to production in the next month, and this feature will not be there.

All right, thank you very much.

Okay, and does anyone else have questions? There is another question, okay: how is the total number of publications calculated? Are those all publications written by authors from this institution, or only those publications which are archived in repositories? So, as you maybe saw in Leonidas's demo, for this particular dashboard we have about 50 percent closed publications, so these are taken from journal aggregators and other sources. The affiliation is in the metadata of either the repositories or the journal: if it is the journal, it is the author affiliation; if it is a repository, we assign it to the institution only if it is an institutional repository. Okay, hope this answers the question.

Then, if it's okay for Bogun: Maurici is saying that they're happy they don't have to figure out what to do with the JSON file. Yes, you don't have to do anything with the indicators; the functionality will not be there because, in any case, it's for us to deal with, no worries. Anything else?

Congratulations, very good job.

So thank you so much. I already shared a link to the evaluation form for this webinar; please give us your opinion, and suggestions and comments for future activities or public webinars that OpenAIRE will organize. Taking this into consideration, I'll share again the link for our next public webinar on the 13th of July; if you are interested, it will be about the UsageCounts service, which is also used in this Monitor institutional dashboard and in other services that OpenAIRE provides. So thank you all once again for being here, and thank you, Leonidas and Ioana, for your great presentation, and we'll see you in our next public webinar. See you soon, goodbye.