in particular for metrics that are already in practice, and for PIDs you may have people who have specific systems, and we have to understand how this connects to the whole world. Yeah, okay. This is where the unexpected adverse consequences can happen. Yeah, yeah, exactly. So I think I'll move us on again to the next slide. So, just for some general comments on the stakeholders or others: I don't know if anyone maybe thinks there were certain groups missing there, or perhaps some of the ways in which we're consulting or doing this work currently doesn't engage certain groups well enough, and we need to rethink, you know, how we connect with scientific communities or how we try and avoid those adverse consequences. It always takes a bit longer to think of comments you might put here. Yeah, academic publishers, that's a good point. We hadn't put publishers in that list of stakeholders. Yeah, and I think that is a good point about the RIs and the clusters there: they're useful as brokers, but not for all. I think this is one of the reasons I'd maybe try and think about the end user as well, trying to make sure we're connecting with people even if they're not working through certain infrastructures. Okay, so I'll give this just another minute or so. Global stakeholders. Yeah, I think it's important to try and make sure we're aligning, and also flagging the RDA there as well. Yes, young people are important because they are more easily knowledgeable about things which happen on the internet, but it's not so easy to find them and to get them to speak up. Coming as I am from a library as well, I'm glad to see that someone's highlighted the role of research libraries in all of this as a key part of research infrastructure. Yeah. Yeah, and I like this comment here that we've got the right stakeholders, but it's useful to dig a bit deeper to think about the specific roles and the needs.
And the management of universities is important in the metrics question. Yes, they are research organizations, so we have to take into account all the research-performing organizations, including universities. And we have a comment in the chat as well on the research infrastructures as service providers. I think with a lot of these groups there are overlaps between these stakeholders. So what we wanted to think about next is the kind of information that needs to be gathered, and we've put some general options here to choose between: feedback on implementation, the kind of challenges that might be encountered when people are trying to apply the PID policy and the metrics, any unclear or missing aspects, things we've overlooked or how we can improve on that. Well, there might be other things that you're considering as well, either from having looked at the documents already. So, most votes here coming in for challenges and recommendations. There's one person with a vote on "other"; I don't know if they want to let us know what it is, either unmute or pop it in the chat. Well, I think we've actually got a free-text question in a minute that we can pick that up on. Yeah, so most votes coming in here for the recommendations. I think, yeah, that's the most important thing really: trying to understand what's not working and how we can improve the policy and the metrics. It's built on feedback from implementation and challenges encountered anyway. Yeah, yeah. Yeah, it's interesting to see those two quite close together, because you would probably expect that most feedback on implementation people would want to give would essentially be challenges in implementation. Yeah, yeah. Or things which do not work; it can be not a challenge, just something which doesn't work. Yeah, yeah, exactly. Okay, and I'll just move us along to the best ways to gather this information.
So we've been using a number of different methods as we've been doing consultation so far. So actually, in terms of us, we've been putting the documents up for review and getting written responses, or we've presented at events and run workshops, and we're planning webinars. And other groups have done surveys. We've also had interviews when we developed the interoperability framework. And in the current EOSC structure, there's obviously the governance board as well as the executive board, so there are ways to gather feedback via the governance board members as well. It'll be useful to try and know which methods work best for people and try and focus on those more in the future. So we've got here the community workshops scoring most highly, with working groups with targeted members invited to those next. It's actually still an open question how we set up the structures for EOSC going forward; at the moment we've got these working groups, and I think that's worked very well for developing the policy. Maybe we need some kind of continuation of that for the implementation phase: groups that will address the metrics and look at reviewing these aspects. Yes, it seems clear that the combination of working groups and more open webinars is something which could be pursued. Yeah. So, Sarah, are you able to comment on the approach to inviting members to the existing working groups for this round, just to give people a bit of context? Sure. So, the way this was done: currently, the governance board. Essentially every country could nominate somebody to the different working groups, and most of the nominations came in that way. Some groups changed that slightly.
So I added a number of members for the FAIR working group because, I mean, we had a good representation from the governance board, but there were certain skill sets that I wanted to make sure were included, and also balance in geography and career stage and gender balance. So I added a number of members as well. And within the architecture working group, there are various representatives from different projects. So it'd be good to think about how we target those groups. You know, for me it's really important that people have good knowledge of the area, so around the PIDs and the metrics, and that's why Rachel and François have both been involved, and various other members. But yeah, I think it's good to get feedback on how we structure those in future as well. One thing that's intriguing me about this poll is that the structured consultation with written responses is the lowest, because that's mainly what we've been doing so far. So possibly that isn't working well for people. I know sometimes when you get a big document to read it's hard to compose your feedback. So maybe we do need to do more workshops and webinars to try and gather in the thoughts, or to pose specific questions around what we think are the most challenging aspects. I would say also that we maybe have to facilitate the answers, by allowing people to put notes in the document itself and so on. Yeah, exactly. That's been one of the challenges we've had. There's been a reluctance to put out Google documents, so when we're publishing PDFs it's a lot harder to comment directly on the text. So I think we maybe need to revisit some of those questions, because people find it a lot easier to just annotate something directly. I also see a comment from Hilke in the chat, that "other" could maybe also be data on usage, and I think that's really well put. Yeah, yeah, definitely. That's really important for the metrics. Yeah. Okay, so I'll move us on to you.
It said on mine that Anders raised his hand. Anders, do you want to just unmute and ask it? Yeah, thanks Rachel. Yeah, I just wanted to add to that discussion: I think it came up somewhere that the channels for these structured consultations were important. The PID Forum had worked well and we got loads of responses, but some of the documents that had only been communicated via the EOSC Secretariat got very little response, so that's probably something to consider. Yeah, definitely. Actually, people might have comments to give on that in this section, now that we've got a free-text answer. But one of my observations: with the PID Forum, there's already a community of PID providers, or people who are engaged in PID infrastructure, there. And I think that really helped us with the PID policy, because there's a ready community who would want to comment on that kind of document. The liaison platform is quite new. I don't know how many people here have registered for the liaison platform, but I get the sense that maybe we're not reaching people in the same way when we've put other documents out. So if you have observations on, you know, where you expect to be consulted, like where are you looking for news about EOSC, what kind of mechanisms should we be using to get these documents out to you to comment on? The existing communities you're in, like the Research Data Alliance or GO FAIR, or the research infrastructures, that we should be using more as well? So if you've got any general comments on how we should be engaging, what kind of information we're gathering, anything on these last couple of questions that you want to add, by all means add that now.
So while you're thinking about that, I was just going to mention that obviously on the previous slide we had quite a positive reaction towards the webinars and workshops, and as a little plug, on the 10th of June we'll be running a consultation webinar for the second version. I haven't put information about that out there yet because we just decided on the date a few days ago. So I will try and push information out about that workshop, but it is on the 10th of June for people to hold in their diaries; it will just be an hour, hour and a half. So look out for that. Yeah. So we've got a number of comments coming in now. So yeah, I think one of the things we really need to do is work through existing organizations, so research institutes, not just the universities, engaging the clusters or existing global community platforms. I think that's really why the PID Forum had worked very well: it's an existing platform people are already on. And I think the point about making it easy for people to comment is really important. Yeah, and I sense the frustration that there are so many documents already available. I've found this too: there are so many things to comment on, it can be very difficult with the amount of content coming out of EOSC. Shared interactive documents, practical interactions. So this form continues to remain open, so I'm going to just move us on, but you can continue to add comments in here if more things come up later. Yes, I must admit, when we were composing the questions this is one I found quite hard to think of an answer to, so it might take you a little while to reflect on this. But as we've mentioned, the current governance of EOSC ends at the end of this year, so think about which body should be in charge of maintaining the PID policy and metrics going forward. We've already identified there's a need, you know, to check this, to gather feedback.
How do we do that? You know, is there a particular existing group or mechanism, or what kind of group needs to be set up to maintain this policy and metrics? Yeah, so RDA, or possibly EOSC itself. The ASB of the legal entity, working together with W3C. Yeah, or a task force to be set up by the governance. Non-EOSC members, and the RDA, should be part of it. I think one thing I'm always concerned about is having the people who are implementing something also being the ones governing it. So there probably needs to be some overlap, but I think we also need some separation there. So yeah, the research community should be represented in this governance. Or the DONA Foundation; I think that's important on the PID side of things. I think it's really helpful for us to know which different organizations we should be considering here, because there may be groups that we should be engaging with. I think we have to set up some kind of task force to review and maintain these things; it should be some kind of working group structure, like we have currently. So the oversight of the process should be with the EOSC legal entity. Yeah. And I think that legal entity will need to form some kind of groups. And I guess my question is, where you have persistent identifier and metrics and certification policies, to what extent are they done at the same time? Because there's some overlap in terms of ideas, or how you would approach each of those. And I guess there would be benefit in doing them at the same time, but there would also be benefit in doing them separately. Yeah. It's true, as we see, that knowledge, as you said before, is important. So to have some kind of specific something to deal with each of the types of recommendations is certainly a good point. Yeah.
Now, there's one comment here which is taking us on to our next question, so I'm going to move on. Somebody's noted here that an independent, neutral body could work in close cooperation with the EOSC legal entity. Our next question is actually about those characteristics. What does this governance body or task force that we set up, whoever's maintaining the PID policy or metrics, what kind of characteristics does it need to have? So "independent" had been mentioned already there. We've also thought, you know, perhaps we need to think about representation, and if we're thinking about that, representation of what groups is it? You know, making sure that you've got relevant service providers or the scientific community; or diversity, again, in what ways might that be important? So if there are particular things that are important to you in terms of the governance of the policy and metrics, what are those aspects? So independence has come up here, in particular from the service providers or from commercial interests. Making sure it's sustainable, that's really important. International and domain diversity. This is actually something we tried to do when we were composing the FAIR working group initially, because I think FAIR itself means different things to different communities, so we tried to make sure we had people from different disciplinary backgrounds and different career stages for that reason. Not-for-profit, a bit like ORCID. And underrepresented communities. That idea of the underrepresented was something that is quite important to me in terms of the feedback. When we're looking at how the PID policy and metrics are being implemented, if we find certain communities aren't engaging, maybe we've set the bar too high, or maybe things are not applicable to them. I think we might need to probe into why those communities are underrepresented and try and understand that. So again, that's an important thing to consider.
Sorry, there is a comment in the chat on transparency, which makes sense; here it is. Okay, yeah, from Antonio. Yeah, sorry, I'll scroll down. Yeah, so again, independent from commercial interests, being transparent. I think that's one of the criticisms we had initially about the working groups: people weren't sure how they were being convened. So I think trying to be clear on the processes and how somebody gets a seat at the table is really important. And someone says informed by current usage. Yeah, and that is a good point too; there are many good points, but this one is one of them. Yeah, and I think actually this idea of how we convene this kind of maintenance board or task force, and the characteristics it needs, is something we should be putting clear recommendations on in our final document. So this is really helpful feedback. Again, conscious of time, I'm going to move us on, but by all means keep adding comments, because they will be in the Mentimeter export that we download at the end. Now, one of the questions that came up in our work is about the FAIR principles themselves, because they are like the main basis of our work. They've been around for a long time, and potentially, you know, they may change, or maybe they're fixed and won't change. But we had a discussion: do we need some kind of governance of the FAIR principles themselves? Because that's inherent in our work; it's what everybody, not just EOSC but other countries and other groups, is basing their work around. So currently most people are saying yes, they need to be governed. A few people are saying, you know, they're not going to change, so there's not really an issue there. A few people saying "other"; we're going to go on in a second, but I don't know if the people who said "other" want to put comments in the chat or speak out. Most votes here for the principles themselves being governed; or perhaps, but, you know, not urgent.
So this penultimate question we have here is around how they should be governed. There's a comment in the chat actually from Keith, saying it's not the principles themselves that need to be governed, but their implementation. Yeah, except that when there is a discussion about criteria and implementations, people begin to discuss what is written in the principle itself. Yeah, I think, sorry, to add interpretation in addition to implementation. Exactly, that was going to be my point. So I think the principles themselves are pretty static, but actually, when we've been looking at what it means for different communities to be FAIR, people have very different implementations or interpretations of what that means. So I think some of the context around the principles is really what needs to be governed. Not necessarily, well, I mean, there may well be changes in the phrasing of the FAIR principles themselves, and I think we do need to make sure that we're all referring to a canonical version. So that's important, but yeah, it's really that understanding of what it means; I mean, who determines that this text string means a specific thing, or means a specific thing in a certain context? So Anders has mentioned there are already several versions without governance. This is actually why this question emerged for us. I mean, there are not really major changes between them, but I've found it difficult to know what is the published version. There's a version on the FORCE11 website, which I think was the first publication of it, but there have been publications since, and there are slight tweaks, and I think this is really why the question had come up for us about the governance. So, looking at who should maintain the FAIR principles, most people are going for an international group. Yeah, I think that's really important, because they've gone very much beyond the EOSC context, or beyond the kind of life science community where it started.
I think it needs to be some kind of very broad group that would look at, you know, the definitions and the implementation there. Someone's put in the chat: an RDA working group. Yeah, I think it needs to be some kind of open forum, really, to look at the governance. And Rob's just put a clarification: FORCE11 was a consultation before publication. An interesting comment is the suggestion that it's useful to have an EOSC body to maintain the EOSC interpretation of the principles, which is what we are supposed to do now, and I think it's good to keep that in mind also. Yeah, exactly, because within EOSC, you know, FAIR is a set of rules that we're following, so this has really been my concern: we need to be clear on the version that we're using and what our understanding of it is. So actually, my preference is probably towards an international group, but I think we have to be clear about our EOSC context as well. There's another comment in the chat; Mark says maybe it makes sense to think about this as a FAIR culture and a set of stories that are told and retold and nuanced. Yes, it's the kind of thing we said in Turning FAIR into Reality: we need stories. Yeah. Yeah, exactly. So our final question to you is really a catch-all, to try and think of things that we haven't addressed here in terms of how we go forward with the PID policy and the metrics and the implementation of those. So is there anything that we haven't touched on in this session that you'd like to pick up on, other things that need to be addressed in terms of the governance and maintenance, or other questions generally that you had before you came to this session? Let's give people time to think; it's sometimes hard to formulate your questions or comments. Yeah, use cases: we need examples of usage and problems.
That's maybe something we could try and set up, because we've talked about how we're gathering the feedback, but, I mean, within the EOSC work plan we've focused on two specific use cases for this phase of work up to 2020. Maybe we need to ensure that we have clear use cases from a range of communities to guide that feedback on the policy and metrics. I think that the comment on certification is something which is already taken into account, since we have a separate document about certification. Yeah, so actually we didn't clarify that at the start. Well, I think you did say, François: the task force that you're chairing covers metrics and service certification as well. We're only picking up on the metrics here because there's a FAIR workshop on service certification as part of EOSC-hub Week, I think on Wednesday. Tomorrow. Oh, is it tomorrow? Okay. I think it's tomorrow morning. Here: negative examples help too, by defining what's not to be done, what's not working. Yeah. But again, about metrics and certification, one of the points is that it should take into account many different things, not only data. For instance, certification of PID services, and services generally, not only data, should probably be put on the table for certification. So this means lots of work which has to be done. Yeah. And I like to see trust, because trust is at the core of certification and of any system to share data and open science. Yes. Yeah, exactly. I was just looking at that comment as well. Yeah, I think having the community's respect and trust is a really critical element for EOSC; you know, people won't engage with and use EOSC if it's not trusted. Connecting to existing communities. Yeah. Yeah, I think there are a lot of current providers or services that would need to engage. Who funds the governance structure? Yeah, well, I mean, currently, you know, this is community-supported.
You know, the governance board members and executive board members and all of the working groups are people volunteering time. The terms of reference for such a body, yeah. We have just about ten minutes left, actually, before the break. So I'm wondering if anybody wants to raise a question or speak out as well; if you do, by all means raise your hand, if there are other points you want to pick up on that we haven't really discussed in detail. Andrew has a comment in the chat here: generally, EOSC will be a journey along the road to handling research data properly. I think this is one of the concerns I have: you know, I hope people don't expect that at the end of this year everything's fully functional. It is very much a journey. What we're doing within the working groups, the kind of policies and recommendations we're putting forward, is kind of the baseline to get us working, but then there's a whole phase of implementation and adoption that's needed. So we've developed a couple of use cases within our work plan to essentially demonstrate that EOSC works and start to build from there. But initially that will be focusing on open data; there are many more challenges around, you know, sharing sensitive data, sharing data under certain conditions, and all of this takes time to develop and get working well. Yeah, and just to say, this is why we need the governance. I think the governance is really important so that there's a forum, you know, to gather that feedback. So, François and Rachel, other things you'd like to pick up? I was just thinking about this question about trust, and actually it relates to a couple of the comments we've had, back to the question we've got up on the slides at the moment. With persistent identifiers, to some extent we can measure trust. So someone mentioned checking that persistent identifiers actually still resolve in ten years' time.
I think that's a very big element of whether people actually trust persistent identifiers: that they are persistent on the kind of timescale people are thinking of. So actually, there are ways we can measure that trust. I think the other element of trust is about evolution, and it's that when new things come along that are useful, we make sure that people can use them. And, you know, while we have policies that are very much based on existing workflows and technologies and what works now, we know that's not going to be the same in five to ten years' time. And so again, it's about this kind of willingness to evolve what we're saying from now into the future, essentially. Yeah, there's a comment... sorry. I was just going to say there's a comment in the chat from Jean-Claude Burgelman that we should concentrate on the overview and governance and let the providers do the work on the ground. And I think actually the follow-on from this, that we can't organize everything on a semi-benevolent basis, is really important. So there will be a legal entity established, and we will be looking at the funding model for EOSC, but I think, you know, there needs to be investment to run the basis of EOSC. And I would like to comment on the comments about global partners from Keith. I have seen that we have participants from far away, thanks to you; it's useful. And I think you have seen that maybe you are not the only one to care about global aspects. It's very good you were here and involved in the conversation, because when we want to share, it's global, and I hope everybody understands that. I think the discussion about having an EOSC point of view on global things is also meaningful somehow. Yeah. Yeah, and Keith also notes that the disciplines are global.
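To make the resolution check discussed above concrete: one measurable proxy for trust in PID services is a periodic sweep over a sample of PIDs, recording which ones still resolve. The sketch below is purely illustrative, not an official EOSC tool; the `fetch` callback (returning an HTTP status code) and the small set of PID schemes handled are assumptions, and `fetch` is injected so the network call can be stubbed out or swapped for a real HTTP client.

```python
# Hypothetical sketch: measuring PID "trust" as a resolution rate over time.
from urllib.parse import urlparse


def pid_to_url(pid: str) -> str:
    """Map common PID schemes to a public resolver URL (assumed subset)."""
    if pid.startswith("doi:"):
        return "https://doi.org/" + pid[len("doi:"):]
    if pid.startswith("hdl:"):
        return "https://hdl.handle.net/" + pid[len("hdl:"):]
    if urlparse(pid).scheme in ("http", "https"):
        return pid  # already an actionable URL
    raise ValueError(f"unrecognised PID scheme: {pid}")


def resolution_report(pids, fetch):
    """Return {pid: resolves?} using fetch(url) -> HTTP status code.

    fetch is injected so the sweep can be tested without the network;
    a 2xx/3xx status counts as resolving, anything else as broken.
    """
    report = {}
    for pid in pids:
        try:
            status = fetch(pid_to_url(pid))
            report[pid] = 200 <= status < 400
        except Exception:
            report[pid] = False  # unknown scheme or network failure
    return report
```

In a real sweep, `fetch` might issue a HEAD request with `urllib.request` and the resulting resolution rate could be tracked release over release, which is exactly the "still resolves in ten years" measurement mentioned in the discussion.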
Yeah, which is really important. Research doesn't know those geographic boundaries, so we definitely don't want to be building an EOSC silo; it needs to be something that works across all of these different infrastructures in different continents or countries. I think that's why, you know, engaging through groups like the PID Forum has been really important, because we've connected with the Australians and the US and people in Asia as well. So working through mechanisms like that, or through RDA or GO FAIR, is really critical to try and get that international reach. So, just commenting again on the use cases and success stories, adding an additional vote for that: I think that's really the critical way we need to gather this feedback in and make sure that the policy and the metrics continue to be relevant. You know, they are things that will need to change in future, and should change, and we shouldn't be worried about making amendments to those. Yes, and when you have success stories you also need to know what doesn't work; it's the counterpart of it. Yeah. Okay, well, if people don't have other comments they want to raise now, I'm tempted to give us an extra five minutes of coffee. Just to say a huge thank you to François and Rachel for essentially convening the session, pulling together the questions we wanted to ask about the metrics and PID policy and how we govern that in future. We will take all of your comments from the Mentimeter so they can feed into the recommendations we make in this work. And just a reminder that we do have these papers out for feedback, so if there are comments you haven't made yet, by all means do have a look and let us know if you think we're going in the right direction or what adjustments are needed. I also have a comment for Trust-IT: can you save the chat as well, because there are some comments in the chat, not only in the Mentimeter.
Yeah, yeah, exactly. I'm pretty sure we can, but yeah, we'll make sure we get that. Somebody's also asked if the Mentimeter results can be shared. Yeah, they definitely can. I think I might actually be able to share a link just now that lets you see that. Let me exit the Mentimeter. Oh yeah, here we go. Let me go to share presentation, sharing link to live results. I'll just pop this in the chat, but we will also make sure that we make this available, you know, the results with the slides and everything else. So thank you again for all of your comments and feedback and for engaging, and I think we have a half hour... well, we have a 35-minute coffee break, and then we're back for the second round of sessions. Yes, yes. Yeah, go ahead. Yeah. So, yes, if you're finished, you can leave the breakout room and come back to the main room and have your break, practically. And you will be reassigned afterwards to the new room. So please leave the room now. Thank you. Excellent. Thank you very much. Okay. Thanks everyone. Bye bye. Bye bye.