So, welcome again everyone to this workshop on national policies supporting the EOSC implementation. My name is Federica Tanlongo and I'm from GARR, the National Research and Education Network in Italy, which coordinates the EOSC Pillar project, where I am part of the work dedicated to moving from national initiatives to transnational services. Rob will show the next slide. This workshop is jointly organized by EOSC Pillar and its three regional implementation sister projects funded last year under INFRAEOSC-5b, called EOSC-Nordic, NI4OS-Europe and EOSC Synergy. We will start the workshop with four speakers who will introduce these projects and briefly present some highlights from the findings of their landscaping work, with a focus on the policy state of the art. Speaking will be Ludek Matyska for EOSC Synergy, Diana for NI4OS-Europe, Volker Beckmann for EOSC Pillar, and Troels Rasmussen for EOSC-Nordic. Going somewhere starts from knowing where you are, and that's why, when our projects started, we devoted so much effort to getting as accurate a picture as possible of the current situation. As we all had the same objective, we also agreed from the start to collaborate and make sure the information we collected would be comparable. We are in good company with this landscaping effort, and we believe the work done by the EOSC projects will nicely complement the landscape report put together by the EOSC Landscape Working Group, with whom we have collaborated in these months. It is exactly from this collaboration and discussion with them that we first thought about moving from a snapshot of the situation to living indicators, and that's why, alongside the four regional speakers, we have a fifth speaker in this session, Ignasi Labastida from the University of Barcelona, who will share the experience gained in the LEARN project on designing KPIs, in this case specifically on research data management policies.
This last presentation is intended to introduce the second part of the session, where we will be asking you which indicators could be used to measure EOSC readiness at the level of member states and to monitor its progress. We will start with a working list of possible indicators and collect feedback through a Slido poll. During these days, and actually during these years, we have heard many times that EOSC is a process and not a product, and we believe there is a need to find indicators that can help us assess where we are in this process and what we still lack to get to the next level. Throughout this event, in the sessions we attended, we kept hearing buzzwords like monitoring, KPIs, indicators and many more along these lines. So I think it's not just us: this need is a shared one, and a lot of people are working on, or giving some thought to, this topic right now. The good attendance at this session seems to confirm this. We don't want to provide an out-of-the-box solution to this need today; rather, we want to start a process and refine it with your input. And because this process won't end tomorrow, you are very welcome to get in touch with us and participate in the next steps of this work, either through the EOSC Secretariat liaison platform or by directly contacting the project that covers your country. Now I'm going to leave the floor to our speakers, but before that I have just a few words of housekeeping. Rob, the next slide. To participate in the discussion: the speakers are going to talk one after another, but we will take questions from the audience after the presentations. For this, please use the Zoom chat to ask your question or provide your feedback.
For the sake of keeping to our schedule, if we get a lot of questions and feedback we will only pick some, but don't worry: we will follow up later with all the questions and comments we get. In the second part you'll get more space: you will be asked to answer a number of questions that we will use to refine our working hypothesis. For this part we will use Slido, which by now I expect holds no secrets for you anymore, and you will see that many questions allow free-text comments, so you'll have the opportunity to elaborate if you wish to do so. If we have time, we will pick some of your feedback and ask you to comment on it. To make this easier, please add your name to the comment if you are willing to speak; I or one of the co-hosts will unmute you, so wait for us. After the event, as I said, you are very welcome to get in touch with us, and these are the links to find more info and contacts. Thank you, and Rob, please move to the first presentation. Can we add the EOSC Synergy slides? Okay, do you hear me? Yeah, fine, thank you. Good morning, I will be the first one representing one of the INFRAEOSC-5b projects: I'm representing EOSC Synergy. I'm from the Czech Republic, from Masaryk University and CESNET, which is the national e-infrastructure within the country. Please, next slide, if you can, or can I move it? Yeah, fine. Just a word about EOSC Synergy: the idea behind EOSC Synergy is to help expand EOSC capacity and capabilities by leveraging the investments, existing know-how and resources of national digital infrastructures, through piloting selected thematic services. So what we decided was to select a few thematic services which already span several countries, to make these more widely available, and also to use them as test cases for all the barriers that stand between delivering such a cross-border service and also using such a cross-border service. That is why we are interested in the landscaping.
The primary goal is to identify barriers to international collaboration, analyze the gaps, and provide recommendations for actions to be applied at the national level and eventually also at the international level, like harmonization of policies. We are covering seven countries plus Brazil, but the landscaping doesn't include Brazil; in fact, the project covers eight European countries, but Germany is covered by another INFRAEOSC-5b project, so we are not collecting data from Germany. Apart from the first goal, which is the barriers to international collaboration, the second was to map EOSC awareness and to get a better understanding of the situation in the individual countries which are partners within this project. Next slide. The actual landscaping exercise happened during the first half of this year and is now being finalized with individual reports from all seven countries. If you look at the countries themselves, you see that we cover small countries like Slovakia, the Netherlands or the Czech Republic, but we also go to medium-to-large countries like Poland and large ones like Spain and the UK. This means that we are not a homogeneous sample of countries within Europe, and it also means that in our case aggregate numbers don't reflect the situation correctly, because especially in quantitative indicators the large countries will naturally prevail, and without proper normalization the small countries will simply be hidden within the reports. That's also why we are providing individual country reports and not focusing too much on aggregates. We used different approaches in the individual countries, from questionnaires, run for example in Poland, the Czech Republic and some others, to collection of data available either publicly or from previous questionnaires run with slightly different purposes.
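The normalization point made above can be sketched in a few lines. This is a hypothetical illustration, with invented counts that are not from the actual surveys: it shows how a raw aggregate is dominated by the large countries, while normalizing per surveyed institution lets the small countries show through.

```python
# Invented counts of "institutions with an open-data policy" per country
policy_counts = {"Spain": 120, "Poland": 60, "Netherlands": 25, "Slovakia": 6}
# Invented totals of surveyed institutions per country
surveyed = {"Spain": 300, "Poland": 150, "Netherlands": 40, "Slovakia": 12}

# Raw aggregate: dominated by the large countries' absolute numbers
raw_total = sum(policy_counts.values())
print(f"raw total across countries: {raw_total}")

# Normalized view: share of surveyed institutions per country
shares = {c: policy_counts[c] / surveyed[c] for c in policy_counts}

for country, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{country}: {share:.0%} of surveyed institutions")
```

With these invented figures, Slovakia contributes almost nothing to the raw total, yet its normalized share is higher than Spain's, which is exactly why the project reports per-country results rather than aggregates.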
This also means that data about responsiveness within the individual countries is not very meaningful in such a setup, because especially for desk collection of data the responsiveness of individual partners doesn't make sense at all. What we also did was to focus on large research infrastructures and funders. Research institutions like universities were not always covered: some of the countries did cover them, some left them out, because the time span and the situation within the country were not favorable for this. Next slide. As I said, because of the heterogeneity and the different approaches to collecting the data, the results are usually not fully quantifiable, but we have some clearly quantifiable results which speak directly to EOSC awareness and the related barriers. For example, the data from the Portuguese survey shows roughly half-and-half familiarity with EOSC, and something similar for what the survey participants perceived as the EOSC effects; with the benefits the situation is better, almost three quarters of all participants see them, but the perception of their own contribution to EOSC is rather low. Similar data come from Czechia: on data FAIRness, just 17 percent consider their data somewhat or very much FAIR, and if you look at the perceived EOSC benefits they go from 58 percent in the natural sciences down to just eight percent in the social sciences, which means that in this area EOSC awareness is very low and the potential benefits are practically unknown. The qualitative findings from other countries are at the same level. So the EOSC Synergy findings can be summarized as follows: in general, in the countries we are covering, there is still rather weak awareness of EOSC, especially among the actual scientists, not to speak of some policy bodies, and
there is also low awareness of the EOSC goals. We also specifically asked about the adoption of open and FAIR data principles, and the adoption is still weak. So these are areas where we would like to focus our own future work: to contribute to increased awareness of EOSC and its benefits, and specifically to foster direct involvement in EOSC activities, because, as is clearly seen from the Portuguese results, contribution to EOSC is very low, and while this is just a sample, the same is generally true in the other countries as well. And there are huge differences: on one side the Netherlands and the UK, which are much more ready, and on the other side, for example, Czechia and Slovakia, where the real awareness is extremely low at this moment. And this is it from EOSC Synergy, thank you very much. Thank you, Ludek. We have a question for you in the chat: do the respondents state what they expect the benefit to be? Yeah, thanks for the question, to some extent yes. As EOSC Synergy is focusing on cross-border collaboration, we tried to ask them, and what came up was help with international collaboration and support, so that they will not be forced to go every time through individual harmonization of policies, security-related matters and so on, between the individual institutions which are collaborating, say, within a project or even informally. This is natural for our project, because this is also our goal, but it came even from the countries: it was always about removing barriers from international collaboration more than about getting access to some new resources. I mean, there is always a call for new resources, but this was not seen as something EOSC-specific; it was more neutral there. But it was clear that the major problem is the complete heterogeneity between countries. Okay, thank you. Our next speaker is Diana; please, you have the floor. Okay, let's just move to Southeast Europe, so I'm presenting NI4OS-Europe, and Rob, this is almost the
second slide; please go back to the first one. Yeah, and now to the second. Just a few words about our project: NI4OS-Europe is supporting the development and inclusion of open science initiatives in 15 countries in the EOSC governance, and the aim is also to build national-level bodies that articulate open science cloud work at the country level. On top of that, one of the goals of NI4OS-Europe is to provide technical and policy support for onboarding already existing services, but also to build up future service providers. Last summer all our partners were very active, so in the end we collected more than one thousand stakeholders across all our countries; the numbers are shown on the map, and this dataset is of course freely available and already deposited in Zenodo. A landscape survey ran for three months at the end of 2019, and we had 57 percent participation, which is not excellent but also not so bad. We had two goals with this survey. The first was to have a snapshot, an initial mapping of open-science-related stakeholders, infrastructures, services and policies in all our countries at the beginning of the project, so that we have something to compare against at the end of the project. The other goal was to provide insight into the local capacities and, I would say even more, the needs of the countries and partners for EOSC readiness. Please go to the next slide. One of the NI4OS tasks is also to introduce the EOSC philosophy and the FAIR data principles within the community. Our survey results show that our countries are not very familiar with EOSC, this is on the left of this graph, and of course this shows us that there is a real need to facilitate the adoption of EOSC and even more of the FAIR principles, through training, which we plan to deliver during our project. The situation is a little bit better especially
among those stakeholders whom we called supporters and open science facilitators within the country: they are a little more familiar with the FAIR principles, but not enough, so it shows us that we will have a lot to do on this task. Rob, please, next slide. Since this session is about national open science policy, I will share what our results show. When we asked about policy in our survey, we did not use "policy" only in the sense of an adopted legal document: we also asked whether they have an open science strategy or plan, or some kind of memorandum of understanding among the partners in the country, because some of our countries, and Slovenia is the best example, really have good open science strategies, and more than that, they are working according to those strategies. But what we realized during the survey, also looking at data from outside the survey, is that the Registry of Open Access Repository Mandates and Policies (ROARMAP) lists only 41 policies across our countries, which means we will have to help our countries in this policy task. We also noticed that in some of our countries, the best examples being Romania and Moldova, open science is part of the open government plans, which means they have strong support from the government, and we expect them to use a top-down approach, as was already done in Serbia, where a national open science policy has already been adopted; Montenegro plans to do it in the same way, top-down. On the opposite side there is a good example in Greece, which created a national open science task force through a completely bottom-up approach: this task force includes open science stakeholders from across the country, it is supported by the General Secretariat for Research and Technology, and it is planning to release the open science plan for Greece. So the countries
are of course different, and the good point is that all our 15 countries have a letter of support from their ministry; they had to provide it before the start of the project. We will do our best to push and ask the governments, because in the majority of our countries the ministries are the ones who take care of research, not just the policy and legislation but also the funding, and we expect some help from that side too. All the policy information we collected needs some verification, because the majority of the policies are in local languages, which are not understandable for everyone, but with the help of our partners we will try to understand them and also help make those policies better. In that direction, NI4OS-Europe has already prepared guided interviews which will make the policy situation much clearer and much more understandable. Also, in the last few months there have been a few developments: some of our countries that did not have representatives in the EOSC Governance Board have now succeeded in getting them, such as North Macedonia. We pushed all our countries to be part of the EOSC Landscape Working Group report, and all of them filled in the country sheets, and we think it would be good to include our countries also in other open-science-relevant initiatives such as OpenAIRE or the Research Data Alliance. So this is all about the activities of the NI4OS-Europe project, and I think the next slide is just a thank-you screen. Okay, we are a bit tight on the schedule, so I'd move to the next speaker, who is Volker Beckmann, and keep any questions for later. Okay, good morning. While we wait for the slides to come up: I'm Volker Beckmann, I'm working for EOSC Pillar in this context, and I work for CNRS in France. I want to show you some results from EOSC Pillar on the landscape survey. Next slide, please. So what is EOSC Pillar? It's one of the 5b projects, so it's coordination of national open science efforts, in this case across Austria, Belgium, France, Germany and
Italy. It covers the whole package of ensuring the countries' contribution to, and readiness for, the implementation of EOSC: it helps use cases bring their examples into EOSC, it works on policies, it works on business and governance models, and it also tries to help the countries align their activities with EOSC. On the landscape survey activity: we were the first among the 5b projects to start a survey, launching it already in September 2019. Thanks to AUSSDA, the Austrian Social Science Data Archive, we had a very professional approach; they really helped us get on track with a machine-readable survey, what I would call multiple choice, so you don't have mainly free-text forms but really things you can analyze in a machine-readable manner afterwards. It was very focused: we tried to keep the survey very short, under 20 minutes of response time, and that meant a lot of discussion and reducing the number of questions we wanted to ask. We also iterated the questions with many colleagues from the different target groups to make sure that everybody understood what we were asking, which is not always the case. In the end we addressed 2200 targets, that is, stakeholders, grouped into four categories: e-infrastructures, research infrastructures, universities and funding bodies, and we got 27 percent responses. The previous speaker said 57 percent is not so good, but I would be very impressed with more than 50 percent response; we didn't reach that. The analysis report is under review right now; it will be published as Anita Bortos and collaborators, 2020, and will be available under the link shown there, which is already active. I want to thank the colleagues from the University of Vienna, KIT, Ghent University and CNRS for making this survey happen. What are the next steps for our work here? We want to use these results in our activities on moving from national initiatives to transnational services,
and the results are also used individually in the participating countries to prepare national EOSC associations and actions. Next slide, please. I didn't write this here, but a general remark on EOSC readiness: FAIR data, or FAIR principle, awareness is quite good across the countries, quite high, let's say around 80 percent, whereas EOSC awareness is surprisingly low in some cases. It's not so bad for funding agencies and e-infrastructures, it's not so good for universities, and it was, for me surprisingly, quite low for research infrastructures, so we indeed have a communication issue to solve there. Now to the results on policies. We asked the funding bodies what kind of grant conditions they apply, that is, what is necessary in order to get funding from their agency: 36 percent require open access to the publications, 30 percent require data management plans, 32 percent require that the data are open, and less than 30 percent require that the data also remain available in the long term. I think the takeaway points here are that there are rules in place, and even where rules are not yet in place these practices are usually strongly encouraged; this is really an ongoing process, and I think these numbers are going to go up quite significantly in these five countries over the next few years. For universities, we asked whether written regulations and/or policies exist. If we group these together, then 21 percent of the universities require long-term availability of data, and this is more strongly requested in Belgium and Germany; 19 percent require data in certified repositories, more strongly requested in France and in Italy; 18 percent require FAIR data, and here Italy is a bit stronger in this field; and 13 percent require that the data are open, and here Italy, France and Germany have a stronger demand for that. Concerning universities, as I said before, the EOSC awareness is not so high, and also
the return in answers was not very high in all countries, so I think that's really something we have to keep in mind when we look at these numbers. Across the countries, universities have most frequently adopted policies and written regulations, and on average the percentage of universities publishing formal, written regulations or policies is almost as large as the percentage of universities with informal regulations. Okay, these are just the few points I wanted to make; if you have further questions I'm happy to answer them, and you can put up the next slide, which shows the results of the two points I mentioned as a graphic. Thank you. Thank you, Volker. Now we move to our fourth speaker, from EOSC-Nordic. Troels, the floor is yours. Yes, thank you. Hello everyone, my name is Troels Rasmussen and I come from DeiC, the Danish e-Infrastructure Cooperation, and I'm part of EOSC-Nordic. Next slide. As you can see from the map, we are covering the Nordic and Baltic states: eight countries are involved and 24 partners, including the GO FAIR Foundation and the German Climate Computing Centre, I hope that is correct. All in all there are 200 people involved, so this is a rather large project in the Nordic context. The project is about supporting alignment of national policies and practices with EOSC in a cross-border environment. This is something we are accustomed to in the Nordic context: I was involved in the implementation of GDPR, and aligning national regulations and policies while implementing new regulation is much easier than doing it afterwards, so we are always very keen on doing that in the Nordic region. Of course we are also trying to promote and support the uptake of practices across the Nordics; many institutions and scientists request that we help and support them in this, and we will do our best to accommodate that. We also want to increase the discoverability of Nordic and Baltic services through the EOSC portal, which means that
we are building tools to support the onboarding of services, and I believe that was presented in a session yesterday. In order to promote EOSC and to spread FAIR practices, we are establishing what we call a knowledge hub, a system through which we can support new services and communities with information on research infrastructure providers and their approaches; open science policies, use cases, guidelines, recommendations and user experiences will also be openly accessible on this portal. We are also looking to accelerate the progress of EOSC by piloting solutions in a cross-border environment, specifically on cross-border data sharing, which is something that is always of great interest in the Nordics: we are small countries, we usually have good data but not a lot of it, so the more we can accumulate data across borders the better, and we have experience with that. We are looking specifically at climate data and at sensitive data. We are supported by NeIC, the Nordic e-Infrastructure Collaboration, which in turn is an institution under the Nordic Council of Ministers, so we do have links to the policy level within the project, and we will communicate all our findings and results through those channels. Next slide. We have done some desk research on open science policies in the Nordics and the Baltics, and all countries are in transition towards open science, but at very different stages. I believe it's only Finland that has an adopted policy and strategy; other countries are somewhere along the line of actually implementing an overall strategy and looking at specific topics like FAIR or data management, and yet other countries are in the process of planning how to structure a strategy process. One thing to note is that stakeholder involvement varies considerably: in some countries the ministries and funders are involved, archives are involved; at least in Denmark we have had considerable involvement from
the Ministry of Science, which has actually helped get the universities together to discuss our FAIR policy and approach towards EOSC. Other ministries, because of the PSI directive, have also been inspired by what goes on with EOSC and FAIR data management in how to structure the way they can transfer their data to scientists and citizens. What we have also found is that across the Nordics there is a considerable issue regarding how to align access policies towards data and research infrastructure with how access is handled in e-infrastructure resource provisioning: there are usually legal or contractual arrangements regarding access to HPC and storage, and/or allocation mechanisms, that are not aligned with the principles of FAIR and EOSC, and that needs to be addressed. We are also looking at legal aspects of cross-border sharing of data, which we have worked on considerably for the last 10-15 years, so we do know a lot about some of the issues. Cross-border sharing of sensitive data is, at least in the Nordic context, more of a policy issue than a legal issue: the legal frameworks are very similar, but how they are handled nationally is not, and the dilemma is that either you have a very flexible system with discretion on the part of the data controllers, or you have a very clear but very rigid regulatory framework that limits the usability of the data, and those two things are very hard to balance. Next slide, please. For the landscape survey, which we adapted with help from EOSC Pillar, we sent out the survey to 200 people or institutions and we got 106 respondents, which is a reasonable response rate, but it differs a lot on the individual questions, and with four categories of respondents across eight countries there are really some places where we have very few respondents, so bear that in mind when I go through the results. 80 percent of the respondents are familiar with FAIR; that only applies to 50 percent of respondents for EOSC. 14 out of 15 funding
bodies have rules on open access for their grants, and eight funding bodies impose rules on open research data. About half of the higher education institutions, the research infrastructures and the e-infrastructures have open access and open research data policies, but that varies according to the grants; specifically, at the level of private funding or grants towards industry there will be other regulations. Usually only very few organizations have informal guidelines towards FAIR, and only a very small fraction have actual formal rules and frameworks for the scientists in those institutions. So in general, and this is also something we hear, there is not a strong coupling between open science policy and tangible guidelines for scientific practice, and this is one of the things we are trying to help with through the EOSC-Nordic project: we want tangible guidelines for scientific practice. Next slide. Yes, you can check the project through the links below, and thank you very much; I look forward to a nice discussion. Thank you very much, Troels. Unfortunately, I see there are quite a lot of questions, but we only have about 10 minutes left for this first phase of the session, so I'll give the floor to our last speaker and then come back to the questions if we have time; otherwise we will follow up through the chat or later. Please, Ignasi, the floor is yours. Thank you, thank you very much for the invitation to participate in this workshop. As announced, I work at the University of Barcelona, and I was involved in this project, LEARN, which, as was just mentioned, was a project aimed at establishing a model policy for research data management. I work in the library, I'm in charge of all the open science business, and that's why I was invited here, because I was in charge of the KPIs in this project. If we can pass to the next slide: just a brief introduction to the project. The project was funded through Horizon 2020 and lasted for two years,
from June 2015 to May 2017, and as I said, the main goal was to produce this model for a policy on research data management, especially for research institutions. A toolkit was also developed, available to everyone on the website of the project, which is still alive; the toolkit was meant to support the implementation of the model, and the model was inspired by the LERU Roadmap for Research Data. As you can see here, we were five partners, four European and one from Chile, from Santiago: three universities, the University of Vienna, University College London and the University of Barcelona, and also LIBER, the network of European research libraries. Well, when we started the project, we were always thinking that we would need some kind of indicators, some kind of evidence of how the implementation of the policy was going. But first we also wanted to provide our institutions, or any research institution, with a tool giving some indication of how ready the institution was to work on research data. If we go to the next slide: the first deliverable we produced was this survey. We didn't want a typical survey to gather information from different institutions; what we wanted was rather to provide institutions with a self-assessment tool. The idea was that every institution could go to this survey, initially in English but later also provided in Spanish, to know how they were doing on the different topics. We chose 13 topics extracted from the Roadmap for Research Data, and we decided it would be good for institutions to know whether they were ready, on the way, or not really ready, and that's why we used this traffic-light system: red, yellow or green. As you can see here, we tried to tackle all the different aspects you face when making a policy: whether you have a policy, whether you have leadership, the roles, information, dissemination,
infrastructure, evaluation, of course all the legal issues, the selection of data, training, and finally the idea of openness in the planning. That was the initial tool we had, and we received almost 400 answers, so we know that people all over the world were using this self-assessment tool, especially in Europe but also in Latin America thanks to our partner in Chile. Well, the project went on and we produced a model; the model is also available for everyone to look at, and I have to say that, for instance, the University of Barcelona and other universities have chosen that model to implement their new policies on research data management. And as with any policy, we thought that we wanted the policy for something, so we needed indicators of whether we were achieving what we were looking for with the policy. If we go to the next slide: once we had the policy, we decided to look at the indicators, the key performance indicators, that could assure each individual institution that it was succeeding or not in what it was doing with its policy. Again, with the experience we had with the self-assessment tool, we thought we would need two groups of indicators. First, indicators aligned with the work of elaborating a policy, because it's not just a matter of having a policy but also of being ready for that policy and able to implement it: we need infrastructures around the policy, we need services around the policy, we may need cultural changes around the policy, and we probably also need to measure whether we are going in the right direction. Afterwards, of course, we need indicators about the implementation: how well we are doing once the policy is in place, whether we are achieving what we were looking for, because having a policy is easy, but the most important thing is to know whether we are successful in implementing it, so that we have clear goals. If we go to the next slide, we'll see just another view of the KPIs for the elaboration.
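The traffic-light self-assessment described above can be sketched in a few lines of code. This is a minimal illustration, not the LEARN tool itself, and the topic names and ratings below are invented examples: each topic gets one of three readiness ratings, which map to red, yellow or green, and the counts per colour give an at-a-glance summary.

```python
# Map the three readiness ratings to traffic-light colours
RATINGS = {"not ready": "red", "on the way": "yellow", "ready": "green"}

# Hypothetical subset of the 13 roadmap topics, with invented ratings
assessment = {
    "policy": "on the way",
    "leadership": "ready",
    "infrastructure": "not ready",
    "training": "on the way",
    "open data": "not ready",
}

# One light per topic, plus a count of topics per colour
lights = {topic: RATINGS[rating] for topic, rating in assessment.items()}
summary = {colour: sum(1 for c in lights.values() if c == colour)
           for colour in ("red", "yellow", "green")}

print(lights)
print(summary)
```

The point of the colour summary is the same one made in the talk: an institution with many reds knows at a glance where it is not yet ready to implement a policy.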
So of course the final indicator is: yes, we have a policy. For the alignment of the policy, we propose in the model the services, the roles, the dissemination, the costs, the infrastructure; all of those are things we can measure and think about when we want to establish the policy, in this case on research data management. If we go to the next slide, we'll see the KPIs for the implementation, and here again it is about what we are aiming at. For instance, we ask in the policy for data management plans, so we need to know how many data management plans are being elaborated once the policy is implemented. We want to know how many datasets are stored, which ones are published or shared, whether researchers are engaging with the policy, whether they are using our facilities or not, whether they are attending the training sessions, the cost of monitoring the implementation, and of course whether updates and reviews are needed or required because one of these indicators is not reaching the desired values. If we go to the next slide, you'll see all these indicators in a list, and we didn't just want to have a list. As you can see, we have different KPIs and we also link them with the toolkit we have; I mean, we expose in the toolkit different best practices and different cases that could help any institution to develop research data management policies, so we link the KPIs to those themes. We also link them with the measurement: what we want, how we are going to use these indicators, what the unit of measure is; so, for instance, if we are talking about services, the number of services. We also propose some expected values, and it is open to every institution; I mean, we can put here "our expected value is to have two or three people working on that", so as to measure whether we are reaching those expected values. And we also explain a little bit what the rationale of each measure is. That was the idea for the different KPIs, and on the next slide, please, you'll see that we
also had a scorecard, and the idea was for it to be a helpful tool for the institutions. Once you have your KPIs, how you are going to measure them and what your expected value is, you can start the scoring. You can do that every six months, every three months or every year, and then, again with the idea of traffic lights, you see whether you are red, amber or green, that is, how far along you are on these KPIs, on these indicators. Once you have done the first review of your indicators, you can then update your policy or make some revisions if you want to; especially if you have a lot of reds, then maybe you have to think about making some changes or changing the strategy. We thought that this idea of the indicators is a good tool for measuring the success of a policy. As I said before, we think it is important not just to put a policy in place but to have an idea of what we are looking for with the policy and a clear measurement of those achievements; if not, it will be just a document that gets signed, with no possibility to monitor the implementation of the policy. So that's all, that's what I wanted to share with you, and I'm now happy to answer any questions you have. Thank you. — Thank you, Ignasi, for this very inspiring success story. For the moment we don't have time for taking questions, but we can come back to them at the end, because after us there is a break, so if someone is interested in staying, I suggest we take 10 more minutes for specific questions to our speakers. Now I'd like to move to the second part of the session, the one where we ask our, judging from the chat, very lively audience. So please, Rob, can you show the slides? Oh, okay, the questions, thank you. So everybody, please join Slido with the event code and select this room, which is "National policy development supporting EOSC". Let's just give some time to everybody
to join, since a lot of people were ready, and the responses are coming in and counting. Okay, I think that while people finish voting we can take one more question from the audience. There was an interesting question about international research infrastructures; does anyone want to comment? No? Okay, if you are shy, no problem. So while we are voting, maybe I can say a few words about what we want to do with your input. First of all, this question is to see whether we are on the right track or not, and it seems we are, if I look at the answers, which is a good thing. Then, in the next questions, we will propose a working list of possible indicators that we identified while we were doing our landscaping effort. The idea is that that's the view we got, but we should refine it by talking with stakeholders, so the next step would be to engage with the governance board members, the national initiatives representatives and community representatives. We will work on this hypothesis and then get back to the working groups of the executive board; some of them are very interested in this indicator approach, and we believe that, as regional projects, we are in a good position to contribute to the task of creating sensible indicators, with the community's help of course. So, just one more minute to vote, and now let's move to the next question, please. Okay, since the overwhelming majority is yes, it would be interesting to hear what the noes had to say. Yeah, there are some interesting comments in the chat, by the way, so I'm just taking some: "the indicators should be the same for each country, otherwise the picture is skewed and incomplete"; I totally agree with this approach, and that's why we are here and not discussing at home with our friends. And another comment: "indicators: what really matters is to know how much"... sorry, there are too many comments in the chat and they move.
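The chat comment that indicators should be the same for each country points toward a common, machine-readable record format that every country fills in identically. A minimal sketch of what that could look like; all field names and values here are hypothetical, not an agreed schema:

```python
import json

# One hypothetical indicator record per country and review period.
# Using the same fields everywhere is what keeps countries comparable.
records = [
    {"indicator": "national_open_science_initiative_exists",
     "category": "organisation_and_governance",
     "country": "IT", "period": "2020-H2", "value": True},
    {"indicator": "national_open_science_initiative_exists",
     "category": "organisation_and_governance",
     "country": "SE", "period": "2020-H2", "value": False},
]

# Serialize for publication; any monitoring tool can then re-parse
# the records and compare countries on the same indicator automatically.
published = json.dumps(records, sort_keys=True)
by_country = {r["country"]: r["value"] for r in json.loads(published)}
print(by_country)
```

The design choice being illustrated is simply that comparability comes from a shared vocabulary of indicator names, categories and units, not from each country publishing its own free-form report.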
Please, Rob, can we move to the next question? I think the minute has elapsed. Okay, this one is quick, because I only saw two noes; will those who answered no please elaborate on what they would use, if anything, instead of indicators? I think there were two. By the way, we also have some questions coming in through Slido. Oh yeah, would you like to briefly show those as well while we're gathering answers? Okay, thank you, let's do that. — Perhaps I can answer the second question. — Okay, please go ahead. — I mean, the results, the data, will be public, at least for EOSC-Pillar, but I think it's a general approach; of course it's in the context of the open science cloud. The data also include a breakdown by country; in the EOSC-Pillar survey we have only included this information where it is significant, because if the numbers are too small, then of course a breakdown by country does not seem to make sense. — Yeah, and if the person who asked is interested in a wider picture, as I mentioned in the chat, at this stage there is a process to put together the common information we collected through the four surveys; this is part of the EOSC secretariat co-creation activity, so it should come in the next few months, in time to feed into the next version of the landscape report, and there will be a breakdown by country as well. Well, I've answered, so can we go back to the poll now? Thank you. Okay, there we are. So we only have one... here there's a comment, okay: "they should be the same for each country", and on the question there was another comment about using them to monitor and not to grade the countries. Yeah, okay, and "real adopters": that's actually part of the indicators we are considering, although, as you may know, it is quite difficult to define what an active user and a real adopter are, but we are open to discussing this. So, Rob, will you please move to question three? So, which of the following should be
responsible for tracking the indicators: the EOSC Governance Board, the ministry at the national level, national open science initiatives, or other bodies you may want to add? Can I ask you to mute your microphone if it's not needed, please? Thank you. Okay, I see in the chat there are quite a lot of suggestions, and we'll make sure to collect them all. As a word cloud it makes for a bit of a strange response; okay, yeah, it is strange, but you can see that some things are recurrent, like "national". — Perhaps I can feed in some information from the back end. Aside from the majority, which, as you can see, is the national open science initiatives, some also added, beyond the options mentioned above, the Governance Board, the ministry, the EOSC AISBL, the EOSC Association; yes, mostly those. There is also, yeah, the part where you see "legal": that's the EOSC legal entity, which would then pass the information to the relevant stakeholders and the ministries. — Okay, I think that to use this data we will need to do a bit of polishing of what was collected, but still, there are a few words here that are significant and give us an idea of what you think. One more minute to vote, and then let's move to the next question. Okay, it seems there is a parallel discussion on who is impartial, and the answer is probably that no one is; so does this imply that we need more than one body assessing the indicators? This is a question to the audience. Meanwhile, I think we can move on. Okay, so who should have access to these measurements, in your opinion? Should they be public? Okay, the first answers seem to demonstrate that we are really in the middle of open science. While we are voting, do we have any other questions from Slido? Yes? Yeah, okay, bring up the new one. Let's give people one more minute to vote and have a look. Okay, these are more comments than questions; okay, that's fine, let's move back to
the question, and I guess we can move on, because it's pretty clear that, with some variations, everybody thinks they should be publicly available and verifiable. So, how often should we repeat this monitoring? Okay, a very sensible comment seems to be that this also depends on the particular indicator, and we have another comment: "it depends what the indicators would be used for; if, for instance, they would be used to inform strategic decisions, there might be sensitive information that's not to be shared with everyone". This is a comment from the chat. Valentina, would you like to elaborate on this while we are voting? Shall we unmute him? Yeah, okay, otherwise even if he wants to, he can't. — Yeah, sure, I mean, it was just a last-moment comment. I thought it depends on the type of indicators and what they would be used for. In principle I would like to see all indicators publicly available and accessible to everyone, but I believe that, first of all, we would need to define their scope and understand why they would be needed; that's why I thought you may want to restrict access depending on the indicators themselves and on the purpose of their use. — And can you give some examples, if you don't mind? — No, I mean, I don't have examples; I just thought, as I wrote, that if the idea is to use these indicators to shape some national strategies, maybe... I don't know, perhaps you want to collect information about member states; I really don't know, it's all very theoretical at this stage. — Okay, I think what we can conclude from your comment is that there might be some sensitive information we may want to exclude from the publicly available data. Thank you. Let's move to the next question, please. So we are here proposing some categories that we believe would
be relevant to measure country readiness, and you can select all the ones you think are relevant. Meanwhile, there was an interesting comment in the chat about machine readability, which would allow continuous updating of the monitoring; that sounds like a very good suggestion, and very much in line with the EOSC FAIR environment, thank you, very valuable. There is another comment, from Patricia, about sustainability missing from the categories; you'll have the chance to propose it in the next question, but yes, this may be a sensible suggestion. Okay, last few seconds to vote on this one. Okay, can we move to the next slide, please? So now you can add any categories you see missing from the previous ones, though there is the question of whether sustainability is not already part of organization and governance. Well, it could be, or we can consider it important enough to be a category in its own right. The next question would then be how you assess sustainability, which is a quite difficult question in my opinion. You will see that Ludek says sustainability can be taken as a synonym of funding; I'm not sure that is the case, because sustainability means stability of funding over time. Would you comment, Federico? — It was naturally a very rough simplification, but on the other hand, from my point of view and in my experience, whenever we speak about sustainability and forget the funding, we are just in the clouds. And if you look at the sub-bullets for governance and organization, the funding is covered there as well; that's why I just put these two together. Definitely sustainability is not the same as funding, but there is at least a very important relationship between them. — Yeah, there is a clear relation between the two; now the problem is how we are going to assess this, but I think it's time to move to the next slide. So now we are presenting here a group of
indicators we identified as possibly significant for the architecture category. You can select all of them if you want; please, this is a multiple choice, so select as many as you want. While we are voting, I see a comment from Costas: none of these seem to capture actual usage by end users. I think we can unmute Costas and let him comment on this. — What I wanted to say is that while all of these are really nice from the provider's point of view, or from a funding agency's point of view, they don't show at all what the actual usage is. You may have quite a few integrations, onboarded services, or SLAs available, but you should also show that the SLA is actually put into place, that it's satisfied, that it's successful, that there is actual usage behind it and the SLAs are fulfilled. Perhaps the only one that comes close to actual users is the searches, but even that is not indicative, because one can search and not do anything; it's a first step, but I would like to see adoption of onboarded services, users of onboarded services, things like that. — Okay, so, usage; but how do you define usage? Do you want to collect usage statistics, or are you thinking about, I don't know, usage satisfaction, or something of the sort, or both? By the way, meanwhile, Rob, please move to the next question so people can add what's missing. — Federica, just on usage: I think the important thing is that we can use techniques that companies already apply when they collect data on how they are paid for advertising: did your search actually end up in going to that page, going to that service? Fine, that's good, because it attracted the user. That's the first level, I would say, just to make this a usable indicator. — Okay, please... I think this session is going very well in terms of engagement, so, well, we are late, so I propose we take up 10 minutes of
your break, but we have to move quickly, so please be quicker with your fingers on your phones, because we now have to get through the remaining questions very quickly. I suggest we do it this way: for the questions where you are supposed to provide missing or additional information, we can also use the Zoom chat if we don't have time to fill in Slido. Please, Rob, can you move to the next one? These indicators are for the category organization and governance, and again you can select as many answers as you wish. Okay, suggestions are flowing into the chat, thank you for this; okay, just a few seconds more to vote. Okay, we can move to the next question, please. So, this is your chance to provide additional indicators; as I said, we'll stay here for a moment, but if you wish to continue providing input, you can use the chat. Just as a reminder, these are the category's indicators, in case you can't remember them from the previous question. Okay, we move on. Thank you; please continue to put input in the chat if need be. So these are the indicators for the policies group; again, please select as many as you like. Meanwhile, in the chat there is a suggestion that we consider stakeholder involvement as a cross-level indicator, which is probably a good idea if we are able to define exactly how we can monitor it; and I agree that stakeholder involvement should be one of the key points of EOSC, otherwise it's just old friends pushing an infrastructure. And there is a comment from Perlov about the funding sources; shall we unmute him and let him elaborate? — Okay, hi Federica. Yeah, following the EOSC week in general, I would say there is a need for a deeper discussion about funding, both at the EU level and at the national level. Also, looking back at several of the documents that have been produced during the last eight to nine months, funding is something that is seen, at least the way I
read it, as something that is simply expected to come from several sources, right? But I'm not sure whether this has actually been anchored with all of the stakeholders who are expected to provide that funding. Right, so, yeah, that's my comment. — Okay, thank you, it's a rightful concern. Shall we move to the next question now, please? So again, if you spot something missing in the policy indicators, please include it here; and this is a reminder of what we already had. So let's move on, please. Okay, I see these are the indicators for the infrastructures group. I saw there are quite a lot of infrastructure people in this webinar, so I expect to receive a lot of feedback here; again, you can select all of them if you want, and you will be able to comment on the next slide. Meanwhile, while we are voting, I see in the chat there is a parallel discussion on stakeholders operating across different countries, which is a problem and one of the reasons why it is not easy to just count them. On the other hand, we have some big categories, and in my opinion we can work with them to find a way to describe stakeholders consistently across countries, because, as we said at the beginning, indicators are not really useful if they are not the same for all countries. So that is something of a problem, but I think one we can work on. There is also a comment from Troels that many of the indicators are not aligned; so probably this is a message for us that this really is the right time to align at least some of them. I'm allowing one more minute to answer this one, then we'll move on to the next. Okay, thanks, can you go to the next one, please? As we are short of time, can I ask everybody for additional indicators, which, by the way, are for the infrastructure category and not the policy category; there is an error in Slido. Okay, thank you; if there are any other suggestions, please
use the questions feature to provide them, because we are moving to question 16 now; at some point we have to leave the room to the others. I'm very sorry, because this is getting really interesting and there is quite a lot of participation, and by the way, thank you for that, but we really need to start closing, and we still have a couple of questions. So, training: these are the indicators for training; again, try to be quick when voting. Chris, we made a note of your point on national federations, thank you for that. One more minute for training, so have your say now, or, well, you can also have your say later and contact us; we'd be happy to follow up, so this is not your last chance. Shall we move on, please? So again, as for the previous one, a few seconds for this, and then we move to the next one. Meanwhile, we got another input in the chat; thank you, Troels, it sounds like a good one. By the way, I believe that on certification we could add some more things to the indicators, so let's discuss this after the session, because it is something we barely touched. So the next question is about certification, which connects to that last comment, but we intend it here in a broader sense: the idea is whether you think we need a formal EOSC-readiness certification for services or organizations to be part of EOSC, some sort of minimum level to reach, or whether this is something they could just self-assess, and if so, who should do it. There is a question: what does EOSC-ready mean? Well, if we did our job with the indicators well, it means you have a level of proficiency, let's say on at least some of them, that allows you to jump in at day zero; but that's open to interpretation, and it is not easy to define the right set of things that can make someone or some organization EOSC-ready. That's, I think, the whole point of this session. So I'm allowing just one minute to complete this, and then, unfortunately, we won't have time to
wrap up, but we will be in touch with you if you want, because we want to do something with all the information and input we collected today. Luciano is commenting that it is very difficult to agree on these because the devil is in the details; yeah, you're right, the devil is always in the details, but we are not agreeing on anything or signing anything in blood today; we only want to know whether this is something we can work on, or whether we don't need it and can work on something else. Okay, I think we are at the end of the session, and probably of people's energy, and we probably need a coffee, so can I ask Rob to move to the next and last question and leave it open for comments, so that if people have additional comments after the coffee, they can still fill them in? Okay, this will remain open. Unfortunately we have no more time for discussion and questions, but we do have 10 minutes for coffee, so I'll thank everybody for your very enthusiastic participation, and, well, you've won a virtual giveaway, because you will receive our regional project booklet that we just published. So thank you again, everybody, have a nice day and a nice continuation of this very interesting event. Thank you, and bye. Thank you.