Okay, welcome everybody to this session, this project orientation session on research impact and outcomes for the national data assets projects. Our director of data policy and services, Adrian Burton, is going to take you through our requirements around research impact and how you can develop those in your projects. After his presentation we'll have time for questions and discussion, but not a lot of it, because we're going to finish on the half hour. Next slide please, Adrian. I just want to acknowledge and celebrate the first Australians on whose traditional lands we meet and pay our respects to elders past and present. I'm in Brisbane, so I'm acknowledging the Turrbal and Yagara peoples. So I will hand over to you, Adrian. We can't hear you. I think you're muted, Adrian. Oh, we lost him. Right. I couldn't unmute whilst I was sharing; I'm still getting accustomed to the Webex interface. We'll go back again. Yes. Great. Well, welcome everyone. The specific thing we're talking to you about is a policy and guideline that we have within the national data assets program. It's on research outcomes broadly; we'll start to use those words in a bit more of a defined way as we go through. But it's a guideline that says that we're looking for the infrastructure to have actual research outcomes and broader impacts in society, and that together we will plan how to do that, monitor those things, and be able to report on them afterwards. That's the specific thing we're here to talk about. Today we will actually go back and define a bit more carefully what we're talking about when we say an outcome or an impact, and why we might be interested in those things, and then we'll talk about the planning, monitoring and reporting as far as this program is concerned. Natasha said that we think it will be a shorter presentation here.
There is a longer time if you need to ask further questions. We won't cut it off necessarily, but this is meant to be a kickoff and an invitation to help us design the application of these guidelines better. That work will happen further down the track. All right. Most of you will know that we're an NCRIS program, and NCRIS is reviewed by the Australian government every few years. The last time it was reviewed, there was an infrastructure roadmap published, and one of the things that roadmap said was that Australia's research infrastructure should support Australia's future. You'll see here on the left, you've got the research infrastructure in the middle, and it's giving benefits to Australia's research institutions and world-class universities as well as to industry. And then there's an arrow going out to the right with this idea that that's not the end of it. Universities using the infrastructure, or even industry using the infrastructure, is not the end point. The end point, as far as success for the NCRIS program is concerned, is that there are some impacts on Australia's future food security or health; they're just examples there. So that's the reason why we have this guideline and policy within our program: we're inheriting it from the federal government program. We have a program logic for the whole national data assets program that is a pretty standard program logic. It starts with the inputs, and we're past that stage here. We've been able to secure time, resources, investments, people, pre-existing data and all sorts of things from the ARDC and our partners, and they are the inputs to the different projects within this program. The next stage is where we head off with our project plans and do a certain number of activities, the work packages.
In general, to simplify what's happening across all the projects in all the programs, the activity is to build a data infrastructure. Obviously building a data infrastructure is a much more complex thing than just building a road, in that there are quite important national standards and partnerships that need to be brought into play. Acknowledging all of that, that's the activity of the project: we create a new data infrastructure. The next step is the output step. That is, there is something new and shiny that can be used. Our program is really very progressive as far as infrastructure is concerned, and it's based on this premise that data itself can be, and is, an infrastructure. If it's designed and done properly, it can be an infrastructure in the sense that it can be an asset, and it can support research into the future, multiple different types of unknown research into the future. So in our program we're really talking about building up those assets, and by the end of the projects you'll have a new or reinvigorated data asset that can be used for research. The outcome stage, then, is: okay, it's an asset, it can support leading-edge research, let's see that happening. That's the outcome stage: the data infrastructure is actually used by researchers, or by scientists in industry and governments around Australia. So that's the logic of the program after the establishment phase contracts are almost finished: the outcome, the reason we're all doing it, is so that we can support research. And then the impact, from our point of view, comes from that research. Real impact happens even beyond research: the research is used to change or improve policy, to improve economic activity, or to make positive changes to the environment.
So that's our overall program logic for the program. As you'll see, the outcomes and impacts are two thirds of what we're doing. That's why they're a really important part of the whole program logic. The astute amongst you will have noted that those really important parts can sometimes be 10 years in the making, and they happen well after the contracted build stage of an infrastructure. But they are important, because they're the whole reason we're doing it. The whole idea of everything is to realize a benefit; that's why we're doing this, to do good. The research infrastructure is there to benefit research, and the research is there to benefit society. In program management speak, you've got this benefit realization concept, where you build something like a hotel, but you haven't realized the benefits just by building the hotel. You have to fill it with people, they have to have a good time, the company has to get money in response. All those things are the benefits of having built a hotel, and we hear there are places in the world where big cities are being built but the benefits have not been realized from them. That's not what we're looking for here. So that's another term you'll hear in program management, this benefits realization, and it's about actually making sure that the outcomes and impacts actually happen. And if you're not convinced by all of that, well, we just have to do it anyway, because NCRIS requires us to report what those outcomes and impacts are, and so we have to ask you. The Australian government is doing this across the whole research sector. The ARC has full-on assessment programs for finding out what kind of research happens in Australia; that's ERA. And then they've just introduced a new program, the engagement and impact assessment, that's already done a round in Australia.
So if you belong to a university or a research organization in Australia, your organization is already being asked these questions by the ARC; they've been providing research outcomes and impact narratives to the ARC already. This is part of a broader Australian government policy on the place of universities in our society and how they are meant to contribute to broader impact. And if that's not enough, the report in blue there, from 2015, actually changed the way that funding happens in Australia, to really say that just doing a journal publication is no longer the way you get money within the research funding system. It's still there, but it's also about your engagement with society, with industry, with public policy. So that's already happened, and those changes are percolating through our research sector right at the moment, changing the funding levers. And then on the right there, that's something that's happening right now. In April the Australian government put out a consultation on university research commercialization that covered translation and then commercialization, and in general the relationship with Australian industry. They're thinking about a whole new scheme, which some people have called a new MRFF, that would provide specific funding for research translation and research-business collaboration. And the MRFF at the moment is as big as the whole NHMRC and ARC budgets combined. So if it goes in that direction, there will be a whole new funding scheme that really focuses on this kind of research translation and the impacts of research. So we have to do it, and it's good in any case. So how do you do it well? The ARDC has had multiple sessions with the CSIRO, which has a very good impact and evaluation program. We've also had all our program managers trained in program logic, and we've tried to reflect what good practice in this area is in our program guidelines.
On the right there, we even did a big investigation, over three years, looking at what the pathway to impact is for data and data infrastructure: what role do data and data infrastructure play on the pathway to impact? If you're interested, there's quite a nice report that was done by Professor Mark Reed and Dr Eric Jensen analyzing that. What they did was look at all those impact narratives that your universities provided to the ARC, and the ones that were provided in the UK. They did a systematic analysis of those impact narratives, and the first thing they identified was a very considerable number that relied on data. And they also saw that, for the kinds of impacts reported in the narratives that relied on data and data infrastructure, there was no other way to achieve that impact other than through this collaboration around data. We're trying to reflect what you care about, and that's what we'll be talking about: how can we start to monitor and report on those things. Adrian, we just lost you for a couple of minutes. Adrian, you might have to kill your video. We'll ask him if he can just go back a bit. Yeah. We've lost him altogether. Just wait and see if he can come back in. Adrian, can you hear us? Okay. Just wait a minute and see if we can get back in. It might be worth messaging Adrian, folks, just in the chat. Yeah, I'm doing that now. He's probably still talking away. Adrian. Adrian, can you hear us? Yes. We lost you for about three minutes or more. Oh, can you hear me now? We can hear you now, but we lost your video and voice for quite some time. Maybe go back to the last slide; when you talked about the impact report was about the last time we had you. The end of that conversation and anything from there on we lost. So if you're happy, just go back to that one.
Perhaps turn your video off, just for a while, and see if that helps. I'm not sure how to do that. Down the bottom. Let me continue there. I've got my phone here, Natasha; if anything happens, just give me a call. Okay, so I'm not sure what I said about the project on the right there, investigating the link between research data and impact. It was looking at the impact narratives that were recorded by all the Australian universities in that ARC evaluation program, and it looked at the UK ones as well. It identified these really important pathways that data and data infrastructure provide on the way to having an impact. I recommend you have a look at that as well. In any case, the good practice that was suggested in all of those studies is now reflected in our program, and it's summed up in the dot points on the left there: it's important to plan for outcomes and impact early; it's important to include the end users, and benefit realization itself, in the project; and if you care about these things, you have to have a way of measuring them. So that's what the activities look like within the program. We've asked you all to include research users and beneficiaries in your project, so that should already be in play, to have some kind of relationship with transition partners, and to have an explicit plan. I think you're in that planning phase now, so it'll be an explicit part of the project plan. The implementing and monitoring stage will set up the ability to monitor usage of the infrastructure, and the impacts at a later stage. And we're very keen, as part of this, to work with you all on building community awareness of your infrastructure, building community buy-in, so that we know the research community will be ready to go as soon as we open the curtains.
And then there's a reporting phase, as we noted, that's well after the establishment project phase. We apologize that that means there's a reporting phase one to two years after the project finishes, but there's no other way to measure that stuff; we just have to give it time to happen. In fact, you probably need 10 years to do it properly, and we may well look at ways of keeping in touch for longer. There is a spectrum of capability in this area. The first thing, possibly a level zero, is that we don't know how researchers are using the infrastructure: it's unmanaged and it's unknown. We'd like at least to get to the second dot point, that at least the usage of the infrastructure by researchers is known and monitored. The last two dot points are aspirational; we'll try to do as much as we can, or as much as is possible, in those areas. So, okay, you know that researchers have used your infrastructure, but what publications did they produce, or what programs were they involved in? That's harder to know, but we'll at least try to do some work in that area. And then the broader impacts achieved by those researchers are again a step away. If we don't set off in that direction and have it as a goal, we'll never measure it. Whatever improvements we can make to those last two dot points will be for the betterment of everyone. I'll just skip over these; you must be aware of them, they're already in your program guidelines. You've already put in applications where we all talked about what the impacts would be and what kind of research would be done. You're right now in the middle of doing your project plans, which have a specific plan for research outcomes and broader impact. And then after the projects, we've got that 12 to 24 months reporting.
That's the number; there's nothing more specific than that. We want to work with you to get the most we can out of all of those things, but that's the bare bones of the requirements there. If you've opted in for that 5% extra, that's to create some work packages to make some changes in your systems that would allow you to track and monitor those kinds of things. If you're doing that, then these are the kinds of things that are in scope: identifiers that might allow you to scope that change and track the usage. There's a whole set of things that are there already in your materials, and I'm happy to discuss some of those. I think that's all, except to say that through this guideline and policy, and how it's instantiated in the program, we're just setting a strategic flag that says this is what we would like to be able to do with you. We would like to help you monitor and track the research usage and the broader outcomes, for your own benefit, and we want to do it for our NCRIS program benefits. It's a great thing to be able to work with you on. We don't have all the answers yet. We're not quite sure what best practice is in all these areas. We're not even quite sure what you would think a good reporting requirement might be for those 12 and 24 month reports. We're happy to work on those with you. So we've been busy setting up a working group that would be a co-design and community of practice for sharing amongst the projects and doing the best possible things we can in this area. That's all from me. I apologize that the middle part of it was a mystery; I hope we didn't lose too many people. It was probably my daughter watching some kind of video on the internet. Okay, thanks very much, Adrian. We have about five minutes more of Adrian's time before he has to go.
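The tracking and monitoring work packages Adrian mentions aren't specified in detail in this session, but a minimal sketch of identifier-keyed usage logging might look like the following. Everything here is illustrative: `UsageEvent`, `record_event` and the event fields are hypothetical names, not part of any ARDC or NCRIS system, and the ORCID shown is ORCID's published example identifier.

```python
# Illustrative sketch only: record infrastructure-usage events against
# persistent identifiers (a dataset DOI, an optional researcher ORCID) so
# that later outcome reporting can link usage back to people and publications.
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
from typing import Optional

@dataclass
class UsageEvent:
    dataset_doi: str                 # persistent identifier for the data asset
    orcid: Optional[str] = None      # optional researcher identifier
    purpose: Optional[str] = None    # free-text "what are you using it for?"
    accessed_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def record_event(log, event):
    """Append one access event to a log (a real system would persist it)."""
    log.append(asdict(event))

usage_log = []
record_event(usage_log, UsageEvent(
    dataset_doi="10.5555/example",       # placeholder DOI
    orcid="0000-0002-1825-0097",         # ORCID's documented example identifier
    purpose="climate model calibration",
))
print(usage_log[0]["dataset_doi"])
```

The point of keying events on persistent identifiers rather than local usernames is that the same identifiers later appear in publication and grant metadata, which is what makes the downstream outcome reporting joinable.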
And then I've just been reminded that Adele Koot, who's our marketing and communications manager, will give a short, very brief talk on acknowledging the ARDC and the way that you can do that in your projects. So, are there any questions? I can't see any in the chat currently, and I cannot see everybody on screen. That's a bit better; now I can see people. Would anyone like to ask a question, make a comment, or raise any concerns about what they've heard? I'll just make a quick comment. Adrian, that's really good. I'm glad that you're putting together this activity so that we can track this. It's very difficult, as we all know. So I'm looking forward to seeing what we can all bring to this and working out what best practice is around it. Good. Thanks. Look, we're doing it because it's aligned with the whole NCRIS program. Okay, Adrian, I don't know if we've lost you again. There's a question from Daniel and then Steve. Daniel, do you want to go first? Yes, thank you. Thank you, Adrian and everyone. My question was about researchers accessing data products. I saw in some of the associated material about tracking usage that ideally we would have researchers registering to access the data. Our tendency would be to make some of our data products openly available without registration, and I was wondering whether this requirement to register to use the data is essential, or whether you're happy to have data products that are entirely open. It's a very good question, Daniel. Look, I have been working with projects to make data totally open for years. However, we also have this requirement to keep track of how the infrastructure is being used. That's why we've left this open as a kind of objective, and we'd be happy to work with each of the projects to see what the best way of doing that is.
We would probably say that we might have to compromise a little bit on both of those. You may not be able to get a very specific registration for every user. But if this impact and outcome is part of what society expects of us, then we may not be able to just have "click here and have the data in your inbox". I have seen some pretty good examples of infrastructure that has a very lightweight "by the way, if you're using our data, what are you using it for, and can you give us a contact?" So hopefully, through that working group, we might be able to share some of those ways of doing this. What we really have here is an objective everyone's on board with; now, what are the right ways of achieving it so that it doesn't disrupt our activity and cause expense and delay and all of that? How can we do this in some really efficient ways? Yes, thanks for that. I understand the desire to find out who's using the data and how. Can I just make one further comment about it? Yeah, sure. Go ahead. So, I'm quite passionate about reproducible research, and I appreciate that in a lot of areas, say medical research, this isn't possible because the data simply can't be shared. But there's also a societal push to make research reproducible wherever possible. So, for example, I tend to make the code for analyzing the data and regenerating all the figures available where I can. And if we go down this path of requiring registration to access data, it actually makes that whole process a lot harder. So I just note that as a conflicting societal demand that's also being pushed, and I would say it goes against the one to track access to data. Yes, and I see we've got some comments here about API access, which is very difficult to manage as well. But sometimes you can. Anyway, I agree.
There are two objectives here that are both being given to us, and neither of them can be at zero. I think the art of what we're going to try and do here is to find ways of doing this that are both acceptable and practical and don't compromise the open science agenda. Sounds great. Adrian, we have a couple more people who would like to ask questions, and we're on the half hour. Should we capture those questions, get them to you by email, and then share your answers with everyone? And how would you like to do the invitation to join the working group? Should we send that via email as well and look for volunteers? I think so. I'm happy to work with you, and the program managers are the best way of getting that kind of working group up and running. And if there are further questions, look, that's the idea of setting up the working group: to actually discuss how to do this. But I'm happy to take a few questions if there's a minute or two to go. Okay, great. Thank you. Steve had the first question, and then Martin. Yeah, I'll make a quick comment on the previous comment first. The reproducibility and the accessibility questions are not necessarily in conflict; there are ways of dealing with this, so I think that's a good thing for the working group to discuss. That wasn't my question, though. I'm in social science and medical research, and we've been working on how to deal with the exact conflict you're talking about, so it'd be good to talk about that. My question is about applying this to the platforms projects: are you looking at extending this out beyond the data projects? In spirit, yes. The ARDC has a commitment to impact and outcomes facilitation. We've only made a specific guideline here because, well, just priorities; we thought this was very important for this program.
So it's not formally extended to the platforms program; they don't have the same kind of guideline that we have, and they weren't set up in the same way. However, it's one organization, and it's the ARDC that's committed to those outcomes and impacts. So in spirit, yes, but not in absolute, precise form. And we do have a number of platforms that are actually aligned with the data assets projects as well. Great. Martin, over to you. In my experience, it's been getting easier to acknowledge funding from the ARC or NHMRC when publishing a paper, in a database or as a note. It might be easier if NCRIS could be recognized in the same way, so that it can just be a pull-down menu, making it much easier to acknowledge that you've been supported. I'm just wondering if there's anything we can do to help that along. Yes, there's a lot of activity, and we can't cover all of it in five minutes. There's a lot happening in the background around how to track inputs into research and all sorts of things. A lot of it relies on global information systems to do with publishers and grant applications, and they are increasingly using identifiers for all of those things. So there's an identifier for the funder, an identifier for the publication, and for the sample, and for the data set. That's part of why we would be encouraging that. And yes, it would be great to have identifiers for the instruments, or for the facilities, or for the NCRIS program, and we are introducing that thinking. All of that helps you to leverage those global information systems that are harvesting some of those things. And specifically, we should talk in this working group about some work that's happening with Crossref and the publishers about capturing references to data sets in publications.
There's been some really quite fundamental work done over the last three or four years, and I think our program has a way of interfacing with that; we have a very strong relationship with that program. So I think we might almost be able to do a test case with these projects. Great. Thank you. There's a comment there from Steve, if you want to read it, around reproducible research and restricted data, referencing Ivan Hanigan's work on that. I think that's all the questions I have, unless you wanted to add more on the API question that was asked, or is that something we'll unpack more in the working group: does tracking and monitoring data access include monitoring data access through APIs? For some of our facilities, all of their data access would happen through APIs. So we have to at least look at that and see what can be done in those areas. Okay, great. Thanks for the question, Marco. So we will leave it there with Adrian, and we will hand over to Adele to talk about acknowledgement guidelines. Thanks, Adrian, and I will follow up with the next steps for the working group. I know there's a lot of work that people are doing on this to be shared, because we all have very similar challenges around how to demonstrate the impact of the investment that we're making. So I think it will be beneficial, not just to the ARDC, but to all the partners involved in our co-investment projects. And I know there's some really good thinking going on in the CADRE project around that too, so I'm hoping Steve will be able to share that with us through the working group as well.
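On the API question raised above, one hedged sketch of how open, registration-free API access could still be counted is to tally requests per dataset from ordinary web-server access logs. The log format, URL convention, and dataset names below are assumptions invented for illustration, not any facility's real layout.

```python
# Illustrative sketch: count successful API requests per dataset identifier
# by parsing simplified access-log lines of the form
#   "GET /api/datasets/<dataset-id>/<path> <status>"
from collections import Counter
import re

# Hypothetical access-log lines for the example.
LOG_LINES = [
    "GET /api/datasets/rainfall-1900-2020/records 200",
    "GET /api/datasets/rainfall-1900-2020/records 200",
    "GET /api/datasets/soil-carbon/records 200",
    "GET /api/datasets/soil-carbon/metadata 404",
]

DATASET_RE = re.compile(r"GET /api/datasets/([^/]+)/\S+ (\d{3})")

def count_accesses(lines):
    """Tally 2xx API requests per dataset; failed requests are ignored."""
    counts = Counter()
    for line in lines:
        m = DATASET_RE.match(line)
        if m and m.group(2).startswith("2"):
            counts[m.group(1)] += 1
    return counts

counts = count_accesses(LOG_LINES)
print(counts["rainfall-1900-2020"])  # two successful requests in the sample log
```

This kind of log-side tallying gives usage-is-known-and-monitored (the second dot point in the spectrum Adrian described) without imposing registration, though it cannot by itself link usage to particular researchers or publications.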