there. So by all means, if you have any questions as we go through or any issues, raise those in the chat. I'm Sarah Jones from the Digital Curation Centre and the other person you can see here is Ellen from DANS. I don't know if you want to introduce yourself, Ellen? Yeah, I'm from DANS as you say and I work on OpenAIRE, mainly on the RDM training work package, and I'm also involved in other projects, such as EUDAT. And, well, at DANS we've worked together with Marjan Grootveld and Eliane Fankhauser, who are not present here, and also Emilie Kraaikamp, who is now also a moderator. So just to mention. Excellent. Okay, so what we're going to do, we're going to speak for probably around 40 minutes. I'll give you an introduction to the survey and some of the highlight findings, then Ellen will dig into more detail on what we found, and then we'll come to the recommendations and next steps from this. So to give you a bit of background to the survey, this was conducted by OpenAIRE, which is a project that both Ellen and I are involved in, and also the FAIR Data Expert Group, and I'm here representing the two of those. We ran the survey over last summer, between May and July last year, and it was really intended to help the EC improve and iterate its guidance and approach to DMPs. We've had some questions raised on OpenAIRE webinars about how people could give feedback to the EC, and one of the tasks of the FAIR Data Expert Group is to collate and help evaluate the DMP template and to help them develop discipline-specific guidelines. 
So what we did in the survey, we asked about general attitudes towards data management plans and a number of specific questions around the Horizon 2020 template and the support that people felt was needed. We had a lot of help to circulate that survey: lots of e-infrastructure projects like EUDAT and FOSTER and others helped us distribute it, and also groups like LIBER and the Research Data Alliance. Partway through the survey we realised that we were getting more research-support responses than researcher responses, so we also worked with the European Commission's project officers to specifically target researchers with active projects, and worked through YEAR and Eurodoc, which are early-career researcher networks, to help get more responses. Overall we received 289 responses; around half of those were from researchers, and 60% of people identified themselves with research support (they could select both categories if desired). All of the results, as well as this infographic, are already available on Zenodo, and I just wanted to flag this as a quick overview of the findings. What I'll do is give you a couple of highlights from this and then Ellen will talk through more detailed results. So overall there was a positive response to the data management plans. 60% of people thought that it was a positive experience even if they had some reservations, so you can see in this first quote that people had felt it looked like an administrative exercise, but actually when they went through the process of creating the plan they found that it made them reflect on the best format for the data or how to make them available, and they found value in that process. 
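As an aside on the respondent numbers above: because respondents could tick both categories, the researcher and support percentages can sum to more than 100%. A minimal sketch of tallying such overlapping categories (the rows here are invented for illustration; the real raw data is on Zenodo):

```python
from collections import Counter

# Tally respondent roles where each respondent may tick several
# categories. The sample rows are made up, not from the real dataset.
def role_percentages(responses):
    """responses: list of sets of role labels, one set per respondent."""
    counts = Counter()
    for roles in responses:
        counts.update(roles)
    total = len(responses)
    return {role: 100 * n / total for role, n in counts.items()}

sample = [{"researcher"}, {"support"}, {"researcher", "support"}, {"support"}]
print(role_percentages(sample))
# Note the percentages can exceed 100 in total because of the overlap.
```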
There were nonetheless some people who felt it was a negative experience overall; that was just 16%, and 24% of respondents here had chosen 'not applicable'. It's not 100% clear what they mean there, but I'm taking it as them sitting on the fence: there are some positive aspects but also some negative ones. You can see we had a number of responses like this quote at the bottom here, where there were some things they thought were helpful, like it making them reflect on potential issues, but they also had some frustrations, so some comments were that the template was too long and cumbersome, or that some of the questions were too specific or too vague. Overall there was a good understanding of FAIR. We asked participants to respond to a series of statements, and the statement 'I don't understand what FAIR means' was the one that provoked the strongest reaction. People really disagreed with that, so I think the basic principle of FAIR is really well understood and is something that people appreciate, but some of the terminology poses difficulties, and I think with terms like interoperability in particular, when people are thinking about how to implement that, they get more confused. So what we found was that the language at the moment is providing a bit of a barrier. There were 40 terms overall that respondents mentioned were unclear to them, and you can see that interoperability is the primary term there that came out as an issue. The other terms that posed problems in particular were things like metadata, ontologies and controlled vocabularies, and in this quote you can see people are saying it's more the ICT type of jargon that people are having issues with. 
So I think it's really important that there's support on the ground to help people understand the requirements and how to implement them, and in this second quote you can see that one of the participants mentioned that they got help from the Swedish National Data Service, and without that they wouldn't really have been able to finish the DMP because they just wouldn't have understood or been able to clarify what the questions meant. So I think there's a key role both for the EC's funded projects and for data centres and institutional support provision. The template structure was also found to be problematic, and this came out across a number of questions. People found that there was quite a lot of overlap between the different elements of FAIR: findability, accessibility, interoperability and reusability. This meant that certain questions were repeated, or very similar questions appeared, so things like metadata came up in all of the different sections, and it also meant that the ordering of the questions wasn't always that logical. People were being asked to explain how they were going to make their data findable before they were talking about the repositories they were going to use, so things could potentially already be answered in one question and then they'd find another question asking the same kind of thing. People also mentioned that some of the questions were a little imprecise and that they would prefer more drop-down options and examples, and that's one of the themes you'll see come out as Ellen talks. 
We also asked the users to prioritise what kind of support they would like; we gave them 10 options in a kind of matrix and they listed their top five priorities. What you see here are the weighted scores, and you can see the top priorities are really about having more tailored guidance that's pertinent to that researcher's context. The top answer was suggesting relevant standards for their particular field and data type, and you can see the fifth one down is about recommending repositories that they can use. The other ones that came up at the top here were about having more examples or suggested answers, more drop-down options, and discipline-specific guidance, so really the priority at the moment for users is having more tailored support. There were also some options about data exchange across different systems, so connecting up different research tools, or maybe pulling information in so you could pre-fill aspects of the DMP, or pushing information out to share with services either at an institutional level or with data centres. These were of lesser concern to people at the moment, although obviously some people did choose those as their priorities too. So drawing on all of the results from the survey, we pulled out seven recommendations, and these are what we'll return to at the end and talk you through in detail, but for the moment I'll hand over to Ellen, who will go into more of the specific questions and the survey results. Yes, thank you. So I will go through a number of other questions and answers of the survey, and then we will return to the recommendations with Sarah again. So that's why I move on to the next slide. At the top of the slide you can see that this is about question four; that's why it's between brackets. 
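As an illustration of how weighted scores like the ones mentioned above are typically derived from top-five rankings (the exact weighting used in the report may differ; here rank 1 is assumed to earn 5 points, down to 1 point for rank 5):

```python
from collections import defaultdict

# Sketch: turn each respondent's ranked top-five priorities into
# weighted scores. The 5-4-3-2-1 point scheme is an assumption for
# illustration, not taken from the survey report.
def weighted_scores(rankings):
    """rankings: list of per-respondent option lists, best first."""
    scores = defaultdict(int)
    for ranking in rankings:
        for rank, option in enumerate(ranking[:5]):
            scores[option] += 5 - rank  # rank 0 (top) earns 5 points
    return dict(scores)

sample = [
    ["suggest standards", "examples", "repositories"],
    ["examples", "suggest standards"],
]
print(weighted_scores(sample))
```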
Experience with the DMP template: for this question, respondents were asked to indicate to what extent the statements that are also on this slide represent their experiences with the DMP template. Originally this was a five-point scale, but we summarised it in the report. Another thing that I should mention is that questions could be skipped by the respondents, and this one was only answered by 60%, so 173 out of 289 in total. But you can see in this table that the DMP template is considered by most respondents to this question as a very useful template: 105 said they agreed it is a very useful template, so it's rather positive, I think. Most of the respondents to this question also agree that there is a clear structure and the questions are grouped into helpful categories; there's also agreement with the level of guidance and provided contextual information. Now, as mentioned before, the concept of FAIR, that's at the bottom, seems clear to the majority of respondents, although responses to other questions, such as questions five to eight, make clear that it's the questions about implementing the FAIR data principles where, as mentioned, language seems to be a barrier. Many respondents would like to have more drop-downs instead of free-text answers, as you can see, and taking the average, the number of questions seems to be perceived as all right. Not the last statement but the one before: questions might be perceived as irrelevant, but more respondents disagree on that point. 
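The five-point scale mentioned above was summarised for the report; a minimal sketch of that kind of collapse (the bucketing of the two 'strongly' categories into their neighbours is an assumption about how the summary was done, not taken from the report's methodology):

```python
# Sketch: collapse five-point Likert responses into three summary
# categories as in tables like this one. The mapping is an assumed
# bucketing, for illustration only.
COLLAPSE = {
    "strongly agree": "agree",
    "agree": "agree",
    "neutral": "neutral",
    "disagree": "disagree",
    "strongly disagree": "disagree",
}

def summarize(responses):
    """Count responses per summary bucket."""
    counts = {"agree": 0, "neutral": 0, "disagree": 0}
    for r in responses:
        counts[COLLAPSE[r]] += 1
    return counts

print(summarize(["strongly agree", "agree", "neutral", "disagree"]))
```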
Let's see the next question. Yeah, so question seven was about guidance: the respondents could note any guidance that is missing or could be improved. This question was answered in comments by 81 respondents, that is 28 percent of all the respondents; naturally only people who would like to point out missing guidance would answer this question. The word cloud gives an impression of the answers given by the respondents to this question. It is clear from the answers to this question, and to question six on confusing, inconsistent and redundant terminology that was also mentioned earlier in the webinar, that guidance could be improved by providing more examples of DMPs, of repositories, of how to improve interoperability and of cost estimation, and by more subject-specific and specialised guidelines. The guidance was mentioned by some to be too complicated and technical, and by others that it could be made less vague and less generic. People also asked to provide information on where to find more specialised guidelines. Question eight was about coverage. 
The respondents could note any topics that are missing, so this one is not about guidance but about topics that are missing, or ones that should be removed from the DMP template. In this case only 15 percent, 42 of the 289 respondents, indicated missing topics, and the topics that were mentioned are listed here. They were listed as missing topics, but these aren't actually completely missing in the template, except for software: what was meant there was, what is a good repository for software, and what are the possibilities for long-term software storage in terms of metadata and licensing? Well, as most topics are already in the template to some extent, the comments are mostly about the level of detail people expected per area, and as can also be seen in the comment, it also depends on the project that you are working on. So there's a quote here on the slide: 'I think this would probably vary according to individual projects; it's possible that a one-size-fits-all approach is not practical.' Yeah, question nine was about the process, so the participants could note any issues they encountered following the EC guidelines, for example with knowing when a DMP is due, how it will be reviewed, how to include costs, and so on. Now, first of all, again, as you can see, the respondents entered only 63 issues in total, and 19 respondents took the opportunity to say they encountered no issues, but of the 289 respondents 75 percent skipped this question and did not enter any issues, so a minority entered issues. The issues that are mentioned here in the table are about cost, clear guidance and clear requirements. The issues that stand out here, we think, are about the process of updating a DMP during a project: what should be provided when, and how do you do this technically; understanding how these are reviewed, by what standards and by whom, and whether the DMPs would be monitored later on. Also mentioned is the issue of estimating and including associated costs: what costs are eligible? Now, about reviewing, 
there's a quote there that says 'processes around the assessment of the content are unclear to me', and the other quote stated before: 'clarification of what is an eligible cost regarding research data management is urgently needed for researchers and also for RDM support teams at universities'. Sorry. Now, for the next question, respondents could give other issues or provide other suggestions, and here around 85 percent of the respondents skipped this question, so there are still 48 people left who entered issues that are shown here, and again some of those said they had no issues or suggestions. The two topics that stand out from the ones who gave suggestions are the need for more guidance and a clearer structure of the template. Guidance, it is felt, can be provided by sharing examples of good DMPs or by providing discipline-specific guidance; these are all results that come out of comments that people gave. With regard to improving the structure of the template, it was suggested to create a shorter template or to provide checkboxes to tick, and also, as can be seen in the quote here, to indicate which parts should be set up already in the beginning and which can be added later: 'it would be helpful to indicate needed and nice-to-have for the starting point or for different stages'. Now, the next one is not about question 11 but about question 12, because 11 was already mentioned earlier in the webinar, and 12 is about DMP publishing. Lots of people answered this question, although not all: 134, almost half of the total number of respondents, provided an answer to the question 'would you openly publish your DMP?', and the outcome shows a clear willingness to share DMPs, which could be example DMPs for others, we think. So 48 percent of the answers were yes without any comment, and around 20 people said yes, but only if their requirements were met, such as confidentiality, or only after a project finished. Confidentiality was also a showstopper for some open DMPs, probably projects where not too much of 
the nature of the data should be revealed, or it would bring farmers into danger, as mentioned here. Now, yeah, we have one more about suggestions: what are the main suggestions you would give to the European Commission about its approach to DMPs in the Open Data Pilot? That was question 14. It was answered by a quarter of the respondents, and in the word cloud of the responses shown here we can recognise many of the topics that were mentioned before, such as the need for example DMPs, but a new one here is to involve researchers and institutions, and that's also mentioned in some of the comments shown here: talk to researchers and ask them whether the DMP requirements are realistic; involve researchers when developing disciplinary guidance. And another one is to focus on a technical exchange format for these plans so they can be shared between tools, that is, make the plans themselves FAIR. I think that's a fair point as well. Now, the last part, the last questions, are about providing feedback and reviewing DMPs. Question 15 was: have you given feedback to other DMP writers on their DMP? Half of the respondents answered this question, and of these most said no, around 50 respondents, and the rest followed up by answering the next question, question 16, on the kind of feedback. As you can see, most of these responses were spread out over the multiple-choice options that we gave, with not much 'other', so that covered most of them. Then question 18, on the next slide, is about reviewing. There was also a question 17, which is not shown here, where it was asked how supporting researchers with a Horizon 2020 DMP differed from assisting on DMPs for other funders; to that question, 45 people responded that they felt the Horizon 2020 template took longer than other templates. Now, on this slide you can see on the right-hand side the results from our survey on the Horizon 2020 template, and on the left side you can see the results from local research in the 
Netherlands by LCRDM, where the same question was asked. As you can see, it might be that the review process has taken less time for other DMPs than for the Horizon 2020 DMPs, but we cannot say a lot about it, of course, because only a few people answered this question in both cases. So I can hand over now to Sarah again to spend a little bit more attention on the recommendations. Okay, that's great, thank you very much, Ellen. So what we wanted to do to close is to work through the recommendations. We listed those seven, and I'll talk through the specifics that we've suggested in the report under each of those, really to think about what hopefully the EC or others can aspire to. So the first recommendation was about clarifying the EC's requirements for data management plans. There were quite a lot of comments that came out throughout the report where people had either had misconceptions or had been misadvised or had some confusion over exactly what was required, so we recommended that the EC collates all their data-related and DMP guidelines into a single document. They have one main document, the one that's linked at the bottom here, which is the FAIR Data Management Guidelines, but there's also guidance in other documents, like the general open access guidelines, that's not within there, so at the moment there are quite a few places for researchers to look. We suggested bringing that together and also making announcements as these guidelines change, because people aren't always aware that they've been updated and that there might be new things to adhere to. We also suggested being more explicit about when a DMP is and isn't required. I mentioned right at the outset that 16 percent of respondents had said they didn't see the point in the DMP process, and some of the comments related to those negative answers were flagging things that were actually genuine concerns: they were saying that they didn't have any data, that 
they didn't see why they had to create a DMP, and in those circumstances they wouldn't need to. So I think some of those issues that are encountered could actually be fixed by just some more guidelines and being a bit more explicit about when people don't need to create a DMP. There were also some questions about exactly what set of questions should be answered: within the guidelines we've linked to here there's a big set of questions, and then there's a table at the end with a slightly different version of those questions, and people were wondering exactly which ones to respond to. They'd also asked for a .docx version, an actual template to fill in, as well as supporting the online tools that are in place. So these are the kinds of things we've said around clarifying those requirements. As I mentioned at the outset, there were also some general points made about the template structure, with people finding that questions were being repeated or that the ordering was a bit confusing, so we've suggested regrouping the questions according to key activities. The FAIR concepts should definitely remain within that, but we're not sure that it's the way to structure the template, so potentially asking people about their data creation, about concepts like metadata, or about their plans for sharing all together. And because there are so many questions within the template, I think people often feel overwhelmed at that six-month point, so we'd suggest identifying which questions are the primary ones (what do the Commission really want answered initially) and which are potentially secondary questions that could be fleshed out as the project goes on, or could even be skipped if they're not relevant to the project. There are some questions, for example about data access committees, which maybe aren't relevant if the project isn't creating sensitive data, so some grouping that enabled a better pathway through would be better. 
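The primary/secondary grouping described above can be sketched as conditional routing through the template, where secondary questions only appear when a primary answer makes them relevant. The questions and the condition below are illustrative, not the actual EC template:

```python
# Sketch of conditional routing through DMP template questions.
# Question ids and the sensitive-data condition are hypothetical.
QUESTIONS = [
    {"id": "sensitive_data", "primary": True},
    {"id": "data_access_committee", "primary": False,
     # Only relevant when the project handles sensitive data.
     "ask_if": lambda answers: answers.get("sensitive_data") == "yes"},
]

def questions_to_ask(answers):
    """Return ids of questions relevant given the answers so far."""
    return [q["id"] for q in QUESTIONS
            if q["primary"] or q.get("ask_if", lambda a: False)(answers)]

print(questions_to_ask({"sensitive_data": "no"}))
print(questions_to_ask({"sensitive_data": "yes"}))
```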
We'd also suggested including more yes/no questions or drop-down options, or having integrations between tools so you could pull in lists of repositories or other kinds of registry information, so researchers can select relevant options. And because the EC is obviously evaluating DMPs and looking at the practice that's going on, we suggested identifying questions that will support their evaluation and formulating those in a more structured way, so they can have automated compliance checks in future. In terms of the DMP content and terminology, as I flagged, the language is a bit of a barrier at the moment, so we've suggested simplifying the terminology where possible, or, where certain technical terms need to be used, providing a glossary to assist researchers in understanding those. In particular there was a big call for example answers, and I think it's really useful to provide example answers around those problematic questions, so the ones around interoperability for example, as it helps researchers understand the terms and see them applied in context in their research area. In terms of the DMP content, we'd suggest shortening the number of questions or, as we mentioned earlier, having some kind of prioritisation about which are the primary questions and which are more optional, and so having some kind of hierarchy or routing through, so it's easier for the projects to complete their DMP and for it to be that evolving document as suggested by the EC guidelines. At the moment, because all the questions are presented at the outset, I think a lot of projects feel that they have to answer everything and get a little bit overwhelmed at the start, because it seems too much detail at that six-month stage. 
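The structured answers mentioned above are what make automated compliance checks feasible; a minimal sketch of what such a check might look like (the field names and rules are hypothetical examples, not the EC's actual assessment criteria):

```python
# Sketch of an automated compliance check over structured DMP answers.
# All field names and rules below are invented for illustration.
def check_dmp(answers):
    """Return a list of human-readable issues found in the answers."""
    issues = []
    if answers.get("will_share") == "yes":
        if not answers.get("repository"):
            issues.append("Data will be shared but no repository is named")
        if not answers.get("license"):
            issues.append("Data will be shared but no licence is specified")
    elif not answers.get("sharing_justification"):
        issues.append("Data will not be shared and no justification is given")
    return issues

print(check_dmp({"will_share": "yes", "repository": "Zenodo"}))
# Flags exactly one issue here: the missing licence.
```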
There was a big call for discipline-specific guidelines, so this is something that I think really should be added to enhance the template and the guidance that's offered. People also asked for example answers, and the more that those can be tailored to the specific disciplines and pick up on good practice in each domain, the better. Again, both Ellen and I flagged the request for drop-down options; if these can be tailored to the practice in different disciplines, that would be better, and we suggest that the EC builds on existing work going on in this area. We'll flag some of that work later on, but there are things like the Science Europe domain protocols that could be built on, and obviously they could collaborate with discipline-specific data centres and other groups like learned societies. There are already a lot of Horizon 2020 DMPs being published: over half of the participants said that they'd publish their DMP, and many are already available on the web. We think it's worthwhile for the EC to build on that traction to encourage projects to publish their data management plans and also to collate those together so that the most can be made of that information, so offering a registry service and ideally providing a library of approved examples. I know this can be contentious sometimes; people are often reluctant to say 'this is a good DMP' because they don't want it to be copied. But there were quite a lot of points of confusion and people looking for good-practice examples, so I think it would be beneficial to offer those, and again there's already work going on in this area: the EC has a number of external reviewers for DMPs, we've got all the infrastructure projects that they support that could help with this, and I saw LIBER's RDM working group has been trying to develop a registry of DMPs and, through LIBER, getting the library community to do reviews to show what a good example is. 
Costs came out in a number of questions as an issue, so we really recommend including RDM costs in the grant applications. At the moment the timing isn't that beneficial, because people are asked about their costs within the data management plan, which is first developed at month six of the project, and obviously by then it's too late to include additional costs in the proposal, so we suggest raising a few questions or alerts at the grant proposal stage. The respondents to the survey really wanted a lot more worked examples, so they know what an eligible cost is and how that should be added into the proposal, and I've just put this one quote from one of the respondents on this slide to show the level of detail people are looking for: they have quite a lot of questions about exactly what is eligible and how to cost that in. Obviously it's really useful to brief reviewers on what is eligible as well, and also on what should be expected, so that if costs aren't written into a proposal, that's flagged as a potential issue. The final recommendation that we pulled out of the survey was to explain more about the DMP review practices. The Commission has an internal assessment framework that's been developed in REA, the Research Executive Agency, and we recommend that that's endorsed and used throughout the Commission. We've been doing some training courses with Commission project officers through the FOSTER Plus project, and we recommend continuing those; there has been quite a demand internally for more help on how to review DMPs, and at the last session we discussed developing some guidelines on how to do reviews and an FAQ on the practical implementation of the pilot. So I think those would be useful things to develop as internal resources, but in terms of the review it's not just about what goes on within the Commission: it would also be useful to have a public statement on how the data management plans are being assessed, and ideally release that 
assessment framework so people know what they're going to be checked against. As I mentioned, there are a number of activities already going on that might help with some of these recommendations. The Science Europe group has developed domain data protocols; these are essentially discipline-specific guidelines or practices that could be adhered to when you're developing your DMPs. There are already a number of collections of example data management plans: the DCC has a list, but there's also RIO Journal and Zenodo and OpenAIRE, so there are lots of examples online that you can find. In terms of the RDM costs, there are already some existing guides, both from a Dutch group that coordinates RDM work, called LCRDM, who have taken some UKDA guidelines and applied those to Horizon 2020 projects to think about what time and effort is actually needed on the various activities and how to cost that in, and from the Wellcome Trust, which has guidelines in their data management requirements on the kinds of things that are eligible costs. In terms of providing drop-downs and pulling different registry content into tools, there's been some work we've done on DMPonline integrating the metadata standards directory, so researchers can find relevant metadata standards, but other tools like re3data could be plugged in as well. And there are working groups trying to develop common standards for data management plans; there's one that was set up recently through the RDA, and I think that will be a useful group for the Commission and others to engage with. Obviously there'll be ongoing support from the FAIR Data Expert Group and OpenAIRE and others that can help the Commission as they continue to develop their guidelines. But this webinar is not just about the Commission and what they do; there are obviously actions that all of us can take as a community, and we would encourage you to share your DMPs through whatever means you can: publish in journals like RIO, or deposit in repositories 
like Zenodo, or add them to the DCC list. And I think the really critical thing for all of us to do as a community is to continue to provide feedback on the pilot, on what works for you, or particularly what doesn't work. The Commission really wants that feedback so that they can improve these guidelines and feed that into future developments, and whether you're coming from the research community or the support side, I think it's really good to collaborate as you're developing your approach to DMPs and also to feed your work into wider initiatives: there are National Open Access Desks in every country from OpenAIRE, and there are international groups doing work in this area like the Research Data Alliance. So just a couple of concluding remarks and then we'll open up the questions. Obviously there are lots of recommendations we've put forward to the EC here, but I don't want that to detract from the fact that the feedback overall in the survey was very positive. I think the community really likes the fact that there is this Open Data Pilot and the FAIR data management plans, and for many, they'd found the process of developing the DMP very positive; over half had found that the template was very useful too, and around a quarter of the ones who answered the question about improvements had said that they'd had no issues following the EC guidelines. So clearly a lot of what's there already is working, but as with anything there's always room for improvement, and there were various suggestions made about how we could maybe iterate those guidelines and provide more support. As I mentioned, the EC has been asking for feedback and they've already iterated their guidelines: they started with the Open Data Pilot, they expanded that to more areas, and I think the move now to introduce FAIR data management is to try and respond to some of the concerns researchers had at the outset about not being able to open their data. So obviously they want 
the data to be as open as possible and as closed as necessary, but this introduction of FAIR is to really stress the benefits of managing data and sharing it in a way that makes it reusable to others. I think further updates are likely: the FAIR Data Expert Group has been asked to advise on revisions to the template and also to think about the development of discipline-specific guidelines, so I think these are things that are likely to happen, but I'm sure the EC will reflect on the other outcomes and suggestions from the survey. So just to leave you on this slide where you can find out more: as I mentioned at the start, all of the materials are on Zenodo, so we've got the full survey report as well as the raw data, the analysis we did, and our infographic. Ellen and I are here presenting today, but there were five co-authors, so just to acknowledge the others: Marjan Grootveld from DANS, Emilie Kraaikamp, who's also here and can answer questions with us, and Eliane Fankhauser. So thank you very much for listening, and I know various questions have come in in the chat box, so I think what we'll do now is just open the floor to those questions. Gwen, do you want to read them out, or shall I just look at the Google Doc? So there was one preliminary question just asking if the presentation will be made available, and it will: that will be available with the webinar recording, and at the end we'll circulate that to all the participants and post it online. Okay, I can see these have been put up now; I've jumped to the Google Doc, so hopefully the participants can see this document as well. The second question: isn't the 74% 'disagree' score on 'I don't understand FAIR' contradictory to the observation made at the start that terms are unclear? And it is, and this is one of the things we found within the survey. I think at a high level people do understand FAIR, but there are different conceptions about what FAIR means, and I think when it comes to 
putting that into practice and talking about how you'll make your data accessible, interoperable and reusable, people sometimes struggle with how they're going to do that, and they had some issues with some of the terminology. I noticed at the start, when Ellen was speaking, that a number of links were shared in the chat about some FAIR metrics work and about new papers coming out from the group who originally conceived the idea of FAIR. So the FAIR data and FAIR metrics question I think was answered in the chat: the metrics are really just a way to assess how FAIR data are, and that's a new paper that's coming out.

The next question I'm going to hand over to Ellen: is there any breakdown of willingness to publish a DMP along the lines of support staff versus researchers? I don't know whether we're able to differentiate those from the results. -- I think we did, but there was not a real difference between them; I'm not sure, I don't have it to hand. We tried to do that in several other places, and where there was a clear difference between the two we mentioned it in the report. -- Okay, so we should be able to track the results to identify how many came from researchers or from support staff, and as we mentioned, all the data is on Zenodo, so by all means download the data set and you'll hopefully be able to identify that. But I assume there wasn't that much of a difference, because it's not something we flagged specifically.

The next question I guess is for me: are there any plans to make the DMP online template more interoperable, to allow smooth integration of information from local systems or vice versa? Yes, definitely. This is one of the key priorities for us: we want to enable data exchange across systems and allow people to push information in from local research systems, and we'll adopt whatever the common standards group
that's running through RDA develops, so that will also help with interoperability in future.

The question about good examples of DMPs I think was answered in the chat. There is a list on the DCC website, which is linked there, but there are others as well, such as the RIO Journal; there are a few hyperlinks in the slides that you can follow. I think there are probably about 15 DMPs that have been published in RIO, and there are lists from other groups as well.

Next: the guidelines state that the Horizon 2020 template is not mandatory and that other templates can be used instead if all the main topics are covered; does anyone have experience with that? We've had one answer in the chat already from Falco... no, sorry, it was Falco who asked it. Somebody said that when they were in Brussels they were told the Horizon 2020 template is not mandatory, but that the agencies actually examine the incoming DMPs against the Horizon 2020 template. So my understanding here is that the template isn't mandatory, but I think it's advisable to use it, or at minimum to make sure you cover all of the same elements, because of the assessment framework. Although that was developed by RIO and at the moment is just being used internally in RIO, it potentially will be used more broadly across the Commission, and since that's the way plans are being assessed, it's sensible to follow the same structure so that it's easier for people to evaluate yours. But the template itself isn't mandatory; you can use other templates. On experience with that: we have actually had a helpdesk enquiry through the DCC where somebody had used another template and been told by their project officer that they were using the wrong one. So the response can be variable: it's not officially mandatory to use it, but sometimes people get different advice, so it's probably safest to go with it unless you've
got a strong reason to do otherwise.

Then the question about the six-month stage: I'm guessing that was related to comments I made as I was talking. At the moment, with the EC's approach to DMPs, they're required at month six and then updated as necessary throughout the course of the project. This is why I mentioned the six-month stage: the DMP isn't needed at the grant application stage, as it is with other funders.

Was there a question asking if filling in a DMP would change the behaviour of researchers? I don't think we asked a question like that; no, Ellen is shaking her head. We asked them about their attitude to responding to the Horizon 2020 requirements... Ellen, sorry, I thought you were going to say something. -- No, the only thing I would like to mention is that there were many comments on many questions, and some were along the lines of this one. So we didn't ask it directly, but there were a lot of responses about this in the comments. -- Yes, in that first question about whether people found it a positive or negative experience, I was actually surprised how many of the comments talked about having approached it with a negative mindset, thinking it was just going to be administrative, and lots of people said how it actually made the project reflect and that they found benefits they weren't anticipating. Whether it changes their practice is another step further, but I think they'd found the process itself useful so far.

How do we provide feedback to the EC? Aside from our having run this survey, there isn't really a formal mechanism, but what I would recommend is talking to your project officer: when you're part of the pilot, share your experiences so that they know what's working or what's not. Obviously you can also share information through OpenAIRE, through your NOADs, or with the FAIR data expert group, and
we can try and feed it back through those mechanisms as well. Part of the reason why we ran this survey was specifically because the community had asked how they could give feedback, so this was one mechanism for doing that.

And the discipline the researchers were from? I don't think we asked that. -- No, it was in the responses, but we couldn't link it to the answers, so the survey wasn't set up to let us split it per discipline. -- Yes, and question two, as you'll see if you look at the survey report, has been removed, because that was where people identified who they were and the organisation they were from. Sometimes people flagged their discipline, but we'd agreed to anonymise the results.

Has anyone got experience with discipline-specific templates or guidance? There is some being developed, as I mentioned, such as the domain protocols by Science Europe, and I know some universities within the UK have developed discipline-specific guidance within certain schools where they've got local support staff. But I think there isn't enough of that: often funders have one template and a set of guidelines that go with it, and for funders like the EC that cover a broad range of disciplines, it's fairly generic. I think this is why people sometimes want something much more specific to their research area. If you look at funders like Wellcome or other health funders, the guidance there is obviously more tailored to that subject area. We did try to... and then from Marta... oh, I think there's a delay there. -- Yes, it was a CESSDA project last year where we developed a guide; I left the link in the chat box, but it's specific to the social sciences, just as an example. -- Yes, I think if you look at some of the community infrastructures, the ERICs like CESSDA, you'll find there are more subject-specific resources,
and then Marta has a question: if a DMP is only completed at month six of the project, isn't that an issue for costing? It absolutely is. Projects can't ask for more costs within the DMP, which is why we're recommending that some questions around costs be included at the application stage. At the moment, if a project comes to month six, writes its DMP, and finds there are costs, for instance if they're planning to deposit in a repository and a charge will be levied because they have a large amount of data, they would need to find the resources from somewhere within their project, which means cutting something else or changing their plans somehow. This is why we think it's best that costing is raised at the application stage so that any costs are included then; at the moment the timing really doesn't work well.

Rob Holt mentions that in RDA there's been a Birds of a Feather group addressing the inconsistency and non-repeatability of these kinds of questionnaires, which makes the results hard to use to verify the effects of policy decisions. I guess it's about trying to synthesise things across funders. I know Science Europe have a workshop happening at the end of the month about funders' policies and approaches to data management, to try and get more coherence across approaches, so that could be a useful thing to reflect on in that group. Rob, if you want to say more about the group, by all means add more in the chat.

Do we know if there's a consensus on how DMPs will be monitored throughout the project, and whether there'll be sanctions when DMPs are not followed? At present the monitoring happens via the project officers, because the DMP is a deliverable, so plans are checked at the six-month stage and then whenever there are project reviews. Sometimes the Commission brings in external reviewers, and sometimes it
will be the project officers themselves who look at the DMP. When we've run training courses at the Commission, people have asked about sanctions. Given that this is a pilot, it's not full policy yet and things take time to bed in, so I think they're not going to be very strict on sanctions yet, but it's the kind of thing that tightens up over time. So in the longer term that is more likely, but at the moment there's a fair bit of leniency, and really what they're looking for is a best-effort response from projects. Obviously if a project says they'll deposit and then just refuses to do so in the end without a good reason, then maybe the project officer will follow up, and potentially in future there'll be sanctions, but I'd be surprised if there are any quite yet.

I'm just going to scroll down again; the questions have jumped. Sophie asks whether researchers have given feedback regarding creating the DMP at the beginning of the proposal instead of at the six-month point. There were definitely some comments about the timing of the DMP, and there were different views throughout the survey. Overall, people liked the fact that it sat within the project rather than at the application stage; some thought the six-month point was too early for certain aspects, but overall it was generally felt to be okay. I think within Horizon 2020 some people do provide some information at the proposal stage as well: there is an optional data management section that can be completed, which is similar to a DMP, so sometimes people fill that in. But I'm trying to think of other comments that came out of the survey. Does anyone remember, or do you want to reflect on that, Ellen?
What I would mainly share is what you already mentioned: the costing aspect. It would be good if people had a clearer idea of how much things will cost at the stage when they apply for a project, instead of when the project starts, because then they could still ask for funds. That's not about creating a DMP as such, but it is DMP-related. -- Yes, I think it's definitely useful to flag some things at the proposal stage, and it helps projects to think early on. When we've been asked by certain projects whether they should fill in that optional form, we've always recommended it: if data is a component of your project, it's helpful to consider these things early, and it also gives you essentially a preliminary draft of your DMP for month six, something you can then flesh out a bit further once you get to that deliverable.

I've just noticed that Anders has put in the chat that he's heard of one project where payments were held back because of a missing DMP. So because these are deliverables, there can be sanctions; it's definitely worth complying and filling them in.

On whether metadata or metadata standards are compulsory: "I find this very complex to provide", and Maria is mentioning that the standards are recommended, but the Commission acknowledges that there aren't standards in every field. From recollection, the question says something like "if there aren't metadata standards, please describe what you will capture", so they're looking for an explanation of the information that will be collected instead.
I'm just scanning through to see if there are other questions. On the timeframe for updates: the guidelines suggest that the DMP should be updated as applicable to the project, so whenever things change with the data. There might be a change in which data should be shared or whether they can be shared, or plans might change on what data will be created or how, and the DMP should be updated then; but it says at minimum at review time. Sorry, I see Paula means updates to the DMP template rather than to the DMP itself. I don't know that there are timeframes for that; we've only just delivered this report to the EC, so I think they need a bit of time to look and reflect, but probably sometime this summer or later this year there might be updates to the template. That's something we'll definitely let you know about when we hear back.

I've just noticed the time as well, two minutes to two, so we've filled up the rest of the webinar with questions. If there are any we've missed, by all means paste them in the chat if you want a quick answer. Scrolling through, it looks like we've captured most of them, so hopefully you've all heard back okay. -- Yes, I think so too. -- Yes, it looks like it; people are saying thank you as well. So hopefully we've covered everything you were looking for. By all means take a look at the report and share information back. Thank you all for attending. Goodbye. -- Goodbye.