2020. I know these have been three full days, but I hope you enjoyed them and learned new things that will be useful for your work. Before passing the floor to Tiziana, who is moderating the final session with our two guest speakers, I would like to announce the winners of the best poster and demo competition. I would like to thank all the people who spent time voting online, and all those who submitted posters and demos for this EOSC-hub Week. For the best poster: when we were thinking about the award, one of my EOSC-hub colleagues suggested that the best award would have been to frame the poster and ship it to Karlsruhe, the original venue of the EOSC-hub Week, with the following message: "I almost went to Karlsruhe for the EOSC-hub Week 2020, but all I got was the best poster award." In the end, we decided that the awards for the best poster and best demo winners would be free tickets for the final EOSC-hub event, which we are going to announce at the end of this session. So I am happy and pleased to introduce the best poster winner. The best poster award goes to the BI Enterprise Content Management poster, and I would like to invite Grzegorz Niemiec — apologies for my pronunciation. Grzegorz is a solution architect at BI Insight, and I would like him to briefly describe the content of the poster.

Yes, hello, good afternoon. First of all, a big thank you to all of you who voted for us. I have seen some of the competitors, and I have to say it is a great honour for us to win here. I would also like to thank EOSC-hub, the DIH and DEEP teams, Marcin, Agnieszka, Elisa, Alvaro. I have the honour of presenting here, but it would not be possible without the team standing behind the system that we designed, so I would like to name the heroes. I need some support, it is a long list: thank you Dorota, Magda, Małgosia, Mateusz, Vlad, Szymon, Piotr, Piotr, Krzysztof, Marcin.
Thank you, guys. So, we designed a system that helps organisations manage their unstructured data, which is naturally hidden in departmental silos and not easily accessible to all members of the organisation. The goal of the system is to create an information hub: data from external sources is centralised in a single repository, knowledge is extracted from the files automatically, and that knowledge can be shared across the whole organisation. The system can connect to many popular data store types, searching for new files, which then go through a natural language processing pipeline; the artefacts extracted automatically from the files are stored in a document database and in a search index.

Now I will switch to a short live demo of the system. Here you can see a system that has been preloaded with almost 50,000 documents generated by the Polish parliament — these are acts of parliament. We can quickly search for — sorry, the documents are in Polish, so maybe you will not understand the content, but the system is bilingual, so the menus and all the labels are in English. I am searching for the keyword "European Union" — this one is in Polish — and if we go to the document details we will see the keywords generated by the system and all the metadata that was extracted automatically, together with the summary. Let's try to find something about EOSC: we find one document, but this is just a coincidence, because the system is correcting the user's spelling mistakes. However, we can also search in external data sources that are integrated into the system, for example OpenAIRE, and here we can see something much more interesting; we can click on the link, which leads us to the source system, where we can read all the details.
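A toy sketch of the ingestion flow just described — files pass through an NLP step, and the extracted artefacts land in a document database and a search index — might look like this. This is a minimal illustration, not the actual BI system: all function and field names here are invented, and the "NLP" is reduced to frequency-based keyword extraction with a first-sentence summary.

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "of", "and", "in", "to", "is", "for", "must"}

def extract_artifacts(text, top_n=5):
    """Toy NLP step: tokenise, drop stopwords, keep the most frequent
    words as 'keywords' and the first sentence as a 'summary'."""
    words = [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS]
    keywords = [w for w, _ in Counter(words).most_common(top_n)]
    summary = text.split(".")[0].strip() + "."
    return {"keywords": keywords, "summary": summary}

def ingest(doc_id, text, document_db, search_index):
    """Store the full record in the document 'database' (a dict here) and
    add one inverted-index entry per keyword to the search index."""
    record = {"id": doc_id, "text": text, **extract_artifacts(text)}
    document_db[doc_id] = record
    for kw in record["keywords"]:
        search_index.setdefault(kw, set()).add(doc_id)
    return record

db, index = {}, {}
ingest("act-1", "The act regulates data protection. Data controllers register.", db, index)
print(index["data"])  # the document is now findable by an extracted keyword
```

In the real system the document database and search index would be dedicated services rather than dictionaries, and the pipeline would also produce summaries and bilingual labels, but the flow — connect, extract, store, index — is the same.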
I will now switch back to the presentation. We are still improving the system: we are planning to add new features, like more text analytics and advanced image processing, including search by picture and document semantic similarity networks. We are almost ready to launch a demo on the EOSC platform, and we will load some English documents that will be more meaningful, and maybe useful, to the EOSC community. I think that's it from my side; I hope I did not run over time.

Thank you, it was perfect. Thank you very much, and congratulations once again. Okay, so let's move on to the best demo. The best demo award goes to the DEEP-Hybrid-DataCloud "Developing and deploying your deep learning application", and the award actually goes back to Karlsruhe, because we have here Valentin Kozlov, a researcher in the Data Analytics, Access and Applications department at the Steinbuch Centre for Computing of the Karlsruhe Institute of Technology. Valentin, the floor is yours.

Good afternoon, everyone, and thanks a lot to the organisers for organising this EOSC-hub Week in these not so easy times, and of course a great thanks to everyone who voted for our demo. The award does not go only to Karlsruhe; it goes, of course, to all the people with whom we work in the project and who developed the solutions we are presenting, together also with my colleague Lara from Spain.
First I want to briefly introduce our project, DEEP-Hybrid-DataCloud, where DEEP stands for Designing and Enabling E-infrastructures for intensive data Processing in a hybrid DataCloud environment. This is a European project with nine academic partners and one industrial partner. The goal was to prepare a new generation of infrastructures with latest-generation technologies — leveraging hardware like GPUs and InfiniBand — to support deep learning and other intensive computing techniques that exploit very large data sources. What we worked towards is making it easier for scientists to integrate their applications and transparently run them on different infrastructures, exploiting both cloud and HPC environments; to let them easily prepare their modules and share them with other people in a catalogue or marketplace; and to let them leverage a DevOps approach with continuous integration and delivery. We have several deep learning use cases covering different fields of science: in biology, for instance, plant classification, plus applications for the classification of seeds and phytoplankton; satellite imagery; massive online data streams; and retinopathy, to detect the level of retinopathy in patients. In fact, our solutions are leveraged by quite a number of other communities beyond the use cases we started with in the project, and you can explore many of them in our marketplace. If I now switch correctly, just to show the marketplace: you can, for instance, go to an application and you have the possibility to deploy it immediately for training; you get information about the application itself; you can also access the source code and the Docker image of the application; and you can submit it for training. There is a big figure which is now loading, just to give you a
short view of our architecture — I hope the figure loads. In the project we use different technologies and solutions: there is a PaaS orchestrator, based on the INDIGO Orchestrator; there are dashboards; there is Alien4Cloud; we also leverage the Onedata storage solution from the XDC project, which was presented during this week; and we use the udocker tool — we actually developed this tool — so that a user can run Docker containers without escalating privileges, also in HPC multi-user environments, and so on. Sorry, the figure did not load for some reason, but in short, as to what was presented in the demo: there are two different possibilities for where people can deploy one of the DEEP modules hosted in the DEEP Open Catalog, either in a cloud or in an HPC environment. When we go to the cloud, we first select the module from the Open Catalog, then use the dashboard to deploy it in our testbed. For instance, one of the modules is the DEEP development environment, where you can start developing your own application; then you can do training and inference, and since this is based on Docker images, you can also do the same on your local computer. In the HPC environment we have two parts. In the first, you can log in to the HPC system using your OIDC token — so not with a login and password; we have a central IAM service, and we use these OIDC tokens to identify ourselves in different services, including the HPC system — and you can do very much the same: leverage the udocker tool, submit the DEEP development environment as a Slurm job, and then, by doing SSH tunnelling, access this environment and work directly on the HPC system. In the second HPC part we do pretty much the same, but using the PaaS Orchestrator dashboard — the web interface where you configure your deployment and submit it to the
HPC system. There we also use one of the applications from our Open Catalog, the retinopathy one, where the data for the application is hosted in Onedata; just by configuring it through the web form, you can make a deployment to the HPC system — the Slurm system — and it starts and immediately processes your data stored in Onedata. So this is more or less it. Here I just show some links where you can find more information: the main web page of the project, the marketplace, the Open Catalog; we also provide DEEP training material in the EOSC portal, and there is documentation and a description of the udocker tool itself. Thank you for your attention.

Thank you very much, Valentin, for this brief introduction to your work. I would like to thank once again Valentin and Grzegorz for their presentations, and I think it is time to move to the final session of the closing plenary, focusing on the EOSC-hub contribution to societal challenges and the role of digital infrastructures. I would like to introduce the moderator of this session, Tiziana Ferrari, managing director of the EGI Foundation and also EOSC-hub coordinator. Tiziana, the floor is yours.

Good evening, everyone. It is a pleasure for me to be here at the closing plenary of this virtual conference, and I would like to thank the many of you that I see are still connected; I hope you are going to enjoy the interviews that we are going to have in this last hour. Our objective in this closing plenary is to bring science into focus and to show how, concretely, scientific communities can address societal challenges through the European Open Science Cloud and through e-infrastructures. As the moderator of this last session, I would like to introduce our guests. First is Alexandre Bonvin, here representing structural biology as a scientific discipline. Alexandre is a full professor of computational structural biology in the Chemistry Department of Utrecht University. He is also scientific director of the Bijvoet
Center for Biomolecular Research, in which structural biology and the study of interactions between molecules are central. Our second guest is Richard Lucas. Richard holds a Sêr Cymru research chair with the Earth Observation and Ecosystem Dynamics research group in the Department of Geography and Earth Sciences at Aberystwyth University in Wales — and I have challenged my Welsh pronunciation here. With Richard we will be exposed to a completely different sector: we will look into the United Nations Sustainable Development Goals and ecology. I hope that you will enjoy this interview with our guests and also learn how, in daily life, e-infrastructures can support experiments and excellent science.

I would like to ask my first question to Alexandre. Alexandre, in a few words, can you introduce your scientific activities and the WeNMR community that you are representing here?

First, let me thank you for the opportunity to speak at this plenary; it is a real pleasure. I am representing WeNMR, which is a virtual research community in the area of the life sciences in general, and structural biology in particular. We are linked to the Instruct ESFRI project and to other ESFRI projects, and as WeNMR we serve a large worldwide community of researchers working on all kinds of problems linked to molecules and how molecules interact with each other. We provide compute services — thematic services under EOSC — for users to transform their data into models using high-throughput resources. The portals we operate are run from Utrecht University but also from Florence, so there are different geographical locations. Over more than ten years of operation, under different projects, we have accumulated more than 17,000 registered users, and these generate a lot of computing jobs on the EOSC resources. Those users come from Europe, but the majority, actually,
is from outside Europe — from Asia, the US — so we are covering more than 110 countries. In terms of impact, these days we see heavy use of some of our services to fight COVID, because this is the area of structural biology and drug design.

Yes, and we will see in just a few minutes how, practically, such a topical experiment can benefit from HADDOCK in the demo. I would like to turn to Richard. Richard, you have been, I hear from your bio, one of the main contributors to the EODESM system, and through the ECOPOTENTIAL project you have been contributing to EuroGEO. Can you say something about the project, your activity, and EuroGEO? You're muted, Richard.

Yes, thank you very much again for inviting me. Most of my experience is in quantifying and understanding the response of terrestrial and coastal ecosystems and environments to change, through the integration of Earth observation data from multiple sources. We have looked at all sorts of methods, at scales ranging from individual trees to the globe. As part of the ECOPOTENTIAL project we developed EODESM, the Earth Observation Data for Ecosystem Monitoring system, which allows routine mapping of land cover and change based on evidence, by retrieving environmental variables from satellite data. We don't really classify the images: we bring the environmental variables together and then combine them to produce land cover maps and also change maps. We are now leading a follow-on project called Living Wales, where we are significantly advancing that work. GEO, the Group on Earth Observations, has selected the United Nations Sustainable Development Goals as one of its engagement priorities, and EuroGEO is the European regional component of GEO, with a focus on coordinating and scaling up these user-driven applications being developed in Europe. And I believe we have a video that can explain that a bit more — is that right?

Yeah, thank you, Richard.
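The core idea Richard describes — classify two dates and compare them to obtain a change map — can be illustrated with a toy sketch. This is not EODESM itself: the class codes and labels below are invented stand-ins for the FAO LCCS classes the real system uses, and real inputs would be rasters derived from satellite-retrieved environmental variables rather than small lists.

```python
from collections import Counter

# Illustrative class codes; EODESM works with the full FAO Land Cover
# Classification System taxonomy rather than this small invented subset.
CLASSES = {1: "forest", 2: "herbaceous", 3: "water", 4: "bare"}

def change_map(period1, period2):
    """Compare two classified grids of the same shape cell by cell and
    return a per-cell 'from->to' transition grid plus a tally."""
    transitions = []
    tally = Counter()
    for row1, row2 in zip(period1, period2):
        out_row = []
        for a, b in zip(row1, row2):
            label = "no change" if a == b else f"{CLASSES[a]}->{CLASSES[b]}"
            out_row.append(label)
            tally[label] += 1
        transitions.append(out_row)
    return transitions, tally

# Two classified 'maps' of the same area, e.g. before and after a fire.
before = [[1, 1, 3],
          [1, 2, 3]]
after  = [[1, 4, 3],
          [4, 2, 3]]
grid, tally = change_map(before, after)
print(tally["forest->bare"])  # cells that went from forest to bare ground
```

A burnt-area assessment like the Monte Pisano example later in this session is, at heart, exactly this comparison: the same scene classified at two dates, with the forest-to-bare transitions delineating the fire scar.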
As anticipated, we have a short video that can give you a practical understanding of how important it is to address societal challenges, especially the UN development goals. So please — "Fire risk depends on many factors, including climatic conditions, vegetation type and amount, forest management practices, and other socio-economic..." — I think we need to fix the presentation layer, so let's see if we can — yeah:

"Fire risk depends on many factors, including climatic conditions, vegetation type and amount, forest management practices, and other socio-economic factors. Climate change projections suggest substantial warming and increases in the number of droughts, heat waves and dry spells, especially in the Mediterranean region. Italy's forest estate is one of the largest in Europe. Preserving and enhancing forests is also important for society and the economy; for this reason, local authorities are deeply committed to forest preservation and fire risk prevention. On the 24th of September 2018, a large fire devastated the forests on the Monte Pisano mountains near Pisa. The mayor of Calci, the town closest to the fire, highlighted that it was difficult to estimate the extent of the fire, but that it was surely a huge portion of the forest on Monte Serra. Earth observation helps in monitoring the extent of burnt areas. The EODESM model classifies land cover and land cover change. EODESM runs through the virtual laboratory VLab, which uses European cloud platforms, including the Copernicus DIASes and the European Open Science Cloud. By using EODESM within VLab, the extent of significant change near Pisa was mapped from Sentinel-2 images acquired before and after the fire. VLab enables other scientific models too, in particular the Trends.Earth model, which calculates the UN SDG indicator on land degradation. EuroGEO enabled the connection among the presented European capacities for forest fire and land degradation assessment."

So the video shows how Earth observation and Earth science modelling can provide knowledge
for decision makers, in particular for SDG 15, which is Life on Land. More specifically, it shows how European capacities may contribute to GEO: the European resources used in the showcase are, essentially, data, digital infrastructures — including EOSC — and functionality created through EU-funded research and innovation actions, with VLab and EODESM. Thank you, Richard. I would like to go a bit deeper on exactly this topic, looking specifically at how digital infrastructures, and I would say also EOSC-hub, have been enablers for the scientific activities of both WeNMR and EODESM. So I would like to turn to Alexandre. Alexandre, can you tell us a bit about how WeNMR, structural biology, and also the COVID research projects you are supporting depend on Europe and on digital infrastructures today?

If you look at our thematic services, we could not offer the amount and quality of services we do if there were no infrastructure behind them. We make extensive use of the high-throughput compute infrastructure in EOSC: our thematic services are mostly not so data-dependent, but we rely on a lot of computing, and high-throughput computing is key for that. Another important aspect of EOSC is that it facilitates our task of dispatching all the computing to the distributed high-throughput resources. As a thematic service provider, we sit between the end users who use our services and the infrastructure, and we have to communicate with the infrastructure, so tools like DIRAC4EGI are crucial for operations. We use a bit of data, but not that much, so it is really about access to the resources, making sure that things are working, and getting support when things are not working — because, you know, the world is not perfect, so there are glitches from time to time. Having a quick response to ensure that we are in operation 24/7 is also important, and we could not provide these
services if there was not an infrastructure behind them.

I think WeNMR, with your application suite, is a very good example of how the complexity of digital infrastructures can be mediated and made accessible to researchers without major IT hurdles. Same question to you, Richard: you have been very active, and we have heard about EODESM and its virtual lab, so how have digital infrastructures and EOSC-hub been supporting you?

Yes, so we are currently using EOSC as a provider of computing resources through its infrastructure-as-a-service capability, specifically to allow execution of the models. The other thing is that the digital infrastructures allow scientists and decision makers, through dedicated user interfaces, to run complex models and applications such as EODESM without needing to set up their own infrastructures. This really enables complex scenarios to be undertaken through collaborative research: we can work with people on the reproducibility, replicability and reusability of the research outcomes, so people can access and use EODESM for practical applications through this system.

Okay, thank you, Richard. In my next question I will go into the future challenges for both of you, but before that we have prepared a couple of demos. First, with Alexandre, we will show through one specific application how SARS-CoV-2 projects can be supported, and then we will go into the specifics of EODESM. We are planning live demos — we are very brave this evening — so I hope that technically everything is going to go smoothly, and I would like to give the floor to Alexandre to show in practical terms how this works.

Okay, so I will share my screen. It is always scary to do a live demo, you know; Murphy tends to pop up at these times. I am sitting in the dunes on an island in the North Sea, and I will show you how we can actually use EOSC in real time,
in real life — at least I hope so. Okay, if everything goes right, you should now be seeing my screen. The use case I am going to demonstrate is COVID-related. Everyone is of course aware of the current pandemic, and what you see on screen is the structure of a protein from the virus. The one in a kind of violet colour is a domain of a large protein of the virus — the protein you all know if you have seen a picture of the virus, with the spikes on the outside. The way the virus penetrates our cells is by interacting with a receptor, another protein sitting on our cells, and the green ribbon representation you see here is that receptor on a human cell. This is the first step in the infection process. Now, we could think of a strategy to combat the virus by competing with the receptor. For example, I could extract a small substructure from our receptor — in this case a helix — and try to design a peptide that binds more strongly to the virus than our own receptor does. If we could use such a peptide, it would compete with our own receptors and prevent entry. We are now in the field of, say, protein engineering, which is relevant for COVID, of course, but is also used in all kinds of life science applications. One way would be, for example — and you now see a yellow part appearing — to make a mutation in this peptide, replacing an amino acid. The question is: is this a good choice, a smart mutation? Will it lead to better binding of the peptide to this viral protein? That is a question we can try to answer with one of our tools, HADDOCK in particular — a tool we have been developing in Utrecht for almost 20 years by now, and which has been offered as a web portal to our users on the EOSC infrastructure for more than 10 years. So I am going to
demonstrate how this could be done. Now I am switching, and let's say I am a user: I have heard about EOSC as a European infrastructure, so I search for EOSC, and I have heard about HADDOCK as well. You see the first page of results all has to do with EOSC and HADDOCK, so that is good, we are well visible, and the first link on this page points me to the EOSC Marketplace and HADDOCK. So let's go there, and I get this page which tells me, okay, I am on the European Open Science Cloud — processing and analysis as a service — and HADDOCK for the integrative modelling of biomolecular complexes. Let me agree to get rid of this banner, and now I am accessing the service. This brings me to an intermediate step: there is no need here, in principle, to apply for the service directly from EOSC, but users have to register first to be able to use our services. So let's go to the service. This brings me — got it — to the HADDOCK portal, and what you see here is the HADDOCK portal version 2.2. For the sake of this demonstration I want to show you a brand new portal, HADDOCK 2.4. This, if everything is correct, is the entry point in Utrecht for the WeNMR thematic services in my laboratory. I scroll, and we see that quite a number of services operate from Utrecht; the one we are interested in is HADDOCK. Before being able to submit anything to these portals I need to be registered — which I am, of course — but I need to log in to the portal. I click on the login window, and you see another component of the EOSC infrastructure, the single sign-on component, EGI Check-in. Now I am going to connect to the portal, and the goal here is to use my university credentials. I work at Utrecht, and Utrecht University is here, so this brings me to my university login page. No typos — fantastic — so it has
connected me to EGI Check-in. Yes, continue, I'm fine, and if everything goes right the portal now tells me: you have been successfully logged in. Good. Now I can go to the HADDOCK portal and start trying to answer the question: was the change I made in this peptide good or not? We are going to submit a new job. By the way, you see BioExcel showing up here too: HADDOCK is a core software in the BioExcel Centre of Excellence project. So let's submit a new job, and you will see that in these steps there are quite a lot of parameters the user can change. This is the EOSC demo: I have two molecules, and I need to upload a file, which could come from a database, but in this case it is my designed peptide, and I want to use chain A from it — this was the peptide from the human receptor. Next, I give it the viral protein, which is chain B. Next: the data have been validated, and the portal presents me with a number of options; nothing needs to be changed for this particular use case. Now I get to another page offering a lot of options and parameters, and I am going to change them for the use case we want to run. We don't want to model the entire interaction from scratch; we basically just want to refine this new mutated complex and determine whether it is a good solution or not. You see a lot of clickable menus that expose all kinds of parameters. While I am changing the parameters to something sensible — I could probably do anything, because there might not be many experts here who understand what I am doing — what you see is another component of running thematic services: you need to provide training to your users. Users are not going to use the resource properly without training, so that is also a big investment in time and effort; you need people to do it, you need to put tutorials online so that people do the right things, and you also need to be
ready to answer a lot of questions. So, everything is ready and I can submit my computation to the portal. Since the beginning of April we have added an option to tag our jobs as COVID-related research, so this window appears: is this COVID-related research? Yes. Now the job is being submitted; we see it has been successfully processed, and at some point it will start running on the grid. This COVID tagging that we added in recent months basically allows us, on the back end, to send the jobs to sites and resources that specifically support COVID-related research. Through negotiation between EOSC and EGI we have gained access, for some months now, to the US Open Science Grid resources, and we have several high-energy physics sites providing resources for this kind of COVID research. Now, we don't need to wait for this job to finish, because it will take some time — these are not a matter of seconds — but I have some pre-calculated results we can go to. I go to the job information — this is the one I just submitted — and here we see a previous run, which brings you to the result page that the end user is presented with. Users also get notified by email, so it is a very user-friendly way of dealing with complex workflows and computations, which completely hides the complexity of high-throughput computing from the end user. The user can look at these results; we give proper citations of the European projects and sites supporting us; we constantly monitor the satisfaction of our users; and then you are presented with result statistics. In this case you see a score, and this score would need to be compared with the score of the wild-type protein, the unmutated one. I am not going to show you the comparison, but this one is better than our own receptor, so yes, we have improved the peptide, and maybe
we can do more mutations. These are, of course, all in silico predictions; we need experiments to validate them. Another feature of the portal is that we can look at the structure directly online in the web browser. You can see this particular residue — my connection might be a bit slow when I move my mouse — but this is the mutation that we introduced, and yes, it is there. Now, I mentioned in my introduction that we have a large user community, so I am going to bring you to a statistics page, which is online so everyone can look at it. These are the numbers of user requests currently active on our portals — we have two entry points — and you see there are about 120 currently active, with a large number queued. You can find here the total number of jobs served by the portal since it started running in 2008: it is more than 270,000, and each job submitted to the portal might translate into a thousand high-throughput jobs sent to the EOSC high-throughput resources. You can also see that we have been monitoring COVID-related research for some months now, and we can have a look at the worldwide distribution of our users. You see a map; let's sort the users. The total number of registered users across all the WeNMR thematic services is reaching almost 18,000, and we have seen an increase in registrations in the last months because of the COVID pandemic, as a lot of people are doing COVID-related projects. Altogether, Europe in aggregate is the largest community, but you see that Asia is following very closely, and the US as well. Finally, I am going to show you the real-time statistics about jobs running now — actually, these are the statistics from the last hour — running on the EOSC resources. We have more than 2,000 jobs, and in this case about one third of the jobs running over the last hours were COVID-related. Where have those jobs been running?
This is the distribution over the last hour, per site. Half of them have actually run in Karlsruhe, which is fitting in the context of the EOSC-hub Week. We don't predefine where jobs run — we use more of an opportunistic computing model — but DIRAC4EGI does all the magic for us and distributes jobs to where there are resources. Currently KIT is giving us a lot of resources, specifically also for COVID-related research, like the high-energy physics site in Marseille, which put up resources specifically for COVID, and you also see the Dutch resources very prominently represented. Finally, the last bit I want to show you: this is the number of jobs that have been running on the infrastructure over the last 30 days. The violet colour is related to COVID research — these are all the jobs tagged as COVID over the last months — and it represents a significant fraction of the jobs. With that I want to conclude this demo. I hope I gave you a real-life example of what these thematic services can mean for users, and for a real and important research question related to the COVID pandemic. Thank you very much.

Thank you, Alexandre. I think today we have beaten Murphy completely, so that was a good demo, and thanks for this journey from science to infrastructures — I think it was a really nice and clear representation of the needs of science in a specific domain. I would like to go to Richard, and we are again moving to a different domain, with ecology, climate and the environment. Richard, we would like to hear how your science works and how EODESM is practically supporting it, with use cases. Over to you.

Yes, a completely different scale: we are looking at the globe. Basically, one of the ideas with EODESM is that we focused initially on national parks, but as a result of EOSC and the facilities there, we have been able to apply it to any location in the world. There is still
a lot of refinement that can be done with the with the technique but it's been very well developed over over the last um year so basically edes is a complex and expandable system which means you can keep on building on it so it's it's not something that's going to go away and it generates land cover and land cover change maps based on the food and agricultural organization land cover classification system um what i'm going to show you now is is some um demonstration of that and as i say it was originally developed for these national parks but um one of the things that we um tried to access was the the Sentinel-2 which is the satellite data from the Copernicus program and to try and access that archive directly and produce edes and using that data without actually having to download it so the classification and the change maps are generated that way the Vlab is a technology that implements um an orchestrate for automating the configuration coordination management of the online accessible systems and it's uh it's very um in the following demonstration i'm going to show the potential of edes and the virtual map to address some of the issues that are relevant to sustainable development for the goals example limit impacts and climate and how forest can be used and we've got examples of impacts of bushfires in australia's forests and how it can door in on little harvests and i'll also show you how um commercial harvesting of forest can be monitored using eodes and to support sustainable use using an example from the metang mangrove forest reserve in peninsula melasia um but first of all we'll show you how eodes and works in matthew um sanctor is going to drive the demonstration so um there we go so basically the first thing we can do is is uh we go to the website and we select our region and we originally i say we've looked at our mountains for example and we have a choice there of the the national parks that have been predefined um like grand parodies this is where 
we originally focused our studies, in ECOPOTENTIAL. So we choose Gran Paradiso National Park and we select the model — you can choose your region of interest as well — and the model is EODESM. You then select the date — we're going to select the first of May 2018 — and you choose the image from the archive; we found a nice clear image here. Then we do the second date, the first of May 2019, and again select the image. So you've got two images; you're going to classify each of those and then look at the change between them. You review the area of interest, you select the platform — this is the EOSC — and you can then run the model. It takes a while, as with the other model you saw previously, so we're going to go back through the history of what we've done before. So we go to Gran Paradiso, and this experiment shows the result of what we are running currently. The main land cover in this particular scene is water, which appears as blue; it's actually snow, but at this level it's classified as water in the FAO system — in a later, more detailed version of the classification it can be separated out. So it's based on the FAO Land Cover Classification System, and there are eight classes which are mapped. You can see natural water — basically that's snow — and forest and herbaceous vegetation are not separated, so you're not separating out forest in this particular legend, but that can be done at a later stage as well. Legends zero and one then show you the land cover classification for period one and period two, and there's another legend, associated with result two, which shows you the change: in effect you have the eight classes we talked about before, and we basically compare them over time. Each of these changes is associated with a range of drivers, like climate and economics, with associated pressures, like demand for land or increases in temperature, and with impacts on the environment. For example, if we have a change from natural terrestrial vegetation to bare or sparsely vegetated surfaces, this can be associated with a loss of vegetation, which is the impact, while the pressure is, say, deforestation. Each of those off-diagonal squares represents a major change in the land cover, whereas the diagonal represents no change in the land cover class, although internally you could still have a change in, say, the canopy cover or the canopy height. So that change approach is very consistent, and it aligns with a lot of the policy frameworks related to the SDGs as well.

So beyond the national parks, we're taking it out to the Australian bushfires, which were large and severe in early 2020, so we'll give you another example here, in the history section, for the Australian bushfires in New South Wales. They caused extensive damage to many forests in Australia and unprecedented loss of biodiversity, and the rainforest areas were particularly badly affected; a lot of the forests, like eucalyptus, are adapted to fire, but the rainforest is not, so a lot of those areas were destroyed. So between the 15th and the 11th of February you can see this big change from the green of result one to the complete loss of forest — it's basically saying a lot of that has converted to sparse or no vegetation. Result two is the change, which is your change across the eight categories, and if you go to result two, that shows the extent of the bushfires. I actually visited there in March, and this is a very good representation of how massive that burn has been. But what we can also do is see, you
know, we saw a lot of recovery from the trunks — the trees tend to regrow from the tree trunks — so we look at what is called the recovery now. We've taken the image on the later date, the 24th of February, which was just after the fire, and we compared it with the ninth of May, about a week ago, and you can see that it's actually coming back: the cover is starting to restore, and you're starting to get some hope now in that area. You could see that in the field, and it's really nice to see. It's not all coming back — some will take decades — but at least it's going greener.

The next example is hurricane damage in the Bahamas, and there's an interesting storm coming up right now; we could look at that storm in a couple of days' time and see what the damage was. So this is what we did with Hurricane Dorian in the Bahamas. On the 26th of September 2019 a lot of the Bahamas was devastated, so we look at the Bahamas before, with our land cover classification, and then we look at it afterwards, and you can see that there's obviously a big change in the land cover. Then the result which is the change — you can see the legend there, that's result number two — shows that there's extensive flooding and damage to the vegetation in the urban area. There's been a lot of flooding in there, and that aligns very nicely with NASA's assessment of where the flooding was, but we've actually captured more than just the flooding: we've captured the change in the vegetation, and the 64 change categories are captured through that. We can also look at what's happening in the recovery of those areas — I believe we have an example in the history of the Bahamas recovery, which we picked up again about a week ago — and you can see it's still pretty sparsely vegetated, but there's actually more vegetation coming back; there's a recovery of those areas.

So what we can do is choose anywhere in the world — anywhere in Europe, or India, or wherever. We're providing a basic classification at the minute, but we'll talk later about how it can be advanced. That's the big step we've made: we can support the national parks, but potentially we can support anywhere globally with this, and we're making significant advances in the higher-level classification, which we also want to run through as well.

The third example — which should really be the fourth — is about the mangroves in Matang. The Matang Mangrove Forest Reserve has been in commercial operation for about a hundred years, so it's one of the oldest commercial forests in the world, and we've been doing quite an extensive project there involving the virtual lab, showing the Malaysian teams how it works as well. We can see how they log on a 30-year cycle, and we can see the logging coupes — we'll zoom in in a minute to see the individual coupes. They have the forest, they cut it down for charcoal — which interestingly goes to Japan a little bit — but also for poles, for construction and so on, and you can see these coupes appearing in the mangroves. They regenerate — they're left about 30 years to regenerate — and then they clear them again. So this system allows the forest managers to ask: what's being cleared this week? If a clear image comes in, what was cleared? They can track it, because they've got a lot of commercial operators there, and maybe they can check them with this. What they can then also do is track how the forests are recovering, and use that to ask whether the forest is being sustainably managed. And so we can track
all sorts of aspects in there as well. This is the very upper level of the framework, but there's a much deeper framework behind it; we should go much further in and use the power of the computing facilities that have been provided here to produce a sort of endless supply of land cover and change products, and contributions to looking at future environments and at sustainability. So it's a very exciting way of doing things, and the platform gives a great way to allow people to just look at what's happening. So if you want to see the impact of the cyclone tomorrow, just go and have a look through this platform.

Thank you, Richard — that was again another inspiring demonstration, and I think with this second demo we can also see how scientific data can support policymakers, and how impactful this is for all of us as citizens. So, a very nice presentation. I'd like to encourage our audience to perhaps post some questions in the chat. We are pressed for time, but while we wait for a few questions to appear in the chat, I'd like to ask a question to both of you, Alexander and Richard. The question is: what is coming in the future? If you can say in a few words what the biggest challenge is — this would be useful input for the infrastructures and, I hope, to define our future actions. Alexander?

Well, one challenge for us is to keep having software and services that are top of the line and at the forefront of the science they're covering, because if you have something fantastic today, you still want to have users five years from now. It means that you cannot just put a service out there and let it go; you have to constantly improve it. That's not so much for the infrastructure — maybe in part — but projects like BioExcel are contributing to that. Further, for us it's important too — I think in my demonstration it was clear — it's more about the computing: the data comes from the user, and each user comes with a different set of data. We hear a lot about data in the context of the European Open Science Cloud, but without computing you cannot process or analyse any data, so the computing is important: it has to be there, and it has to remain in the future. Should it be grid or cloud? It doesn't really matter — DIRAC for EGI is ready to access both, so, so far, life is easy. What is also important to realise, I think, in terms of the European landscape, is that all those nice services put up by the different projects to target the users are not going to run by themselves. You need an infrastructure to run them, of course, but you also need manpower — people to make sure that the service is running every day, to answer user questions, and to contact the infrastructure provider when things break down — and this is not happening by itself. Operating services does require manpower, and this is something that is not always very visible in the current funding. We speak a lot about virtual access, but we as service providers cannot make our users pay for the access — I think if tomorrow Google asked you to pay for each search you do, you would use a different service. So we are sitting between the infrastructure, which has costs associated with it, and the end users, and we need to be able to keep providing the services. And I think a fantastic infrastructure without usage is also completely useless.

Yes, thank you, Alexander; the message is clear, and we will make sure that we bring this to the Commission, but also to our partners and the future EOSC entity. So Richard, what is your request for the future?

Well, I think one of the things is that you develop something like EODESM, and we've been successful to a point in getting the funding to support
development of the system, but also the associated platform. You develop something that works: for example, in its strictest form EODESM requires the use of environmental descriptors retrieved from satellite data, and they can be continuous, like canopy cover, or thematic, like plant species. Capacity needs to be put in place to support the routine retrieval of these, which can be done through the EOSC platform. They could include things like snow hydroperiods — for example, if you classify snow on every acquisition date, then over a year you can work out snow hydroperiods — so capacity for doing that, or machine learning to map plant species; all of these can be done, some of them already in the VLab. But again, there are lots of elements that we've brought together to create something much bigger and more powerful, and this is the sort of thing that EODESM is really good for. So if you could do that, make sure that the infrastructure is maintained effectively, and get the funding coordinated around this concept, then the full functionality of the land cover and change model could be realised, and you could also bring in data from other sensors — we've got Sentinel-1 and Landsat as well, which would be a huge benefit. So there's lots of capacity to expand it, but what's nice is that you can get one group working on hydroperiods, another bringing in Sentinel-1, and so on, and it's almost like building a big system through the one system, the platforms that we're developing. EODESM was designed with this in mind, but continuing to develop the system and the associated infrastructure is needed to create and deliver the advanced products to the European, and also the global, community. So one of the things is that providing opportunities for continued, long-term funding of workable solutions is also important, as these often take time to develop and mature. As you said, EODESM is expandable; it's a workable solution, but having to continue to rely on small amounts of funding to build something which is actually quite large and could have a real impact — addressing that is really needed. So it's a combination of funding, infrastructure, as said before, and people who actually know how to use it and access it. It's an incredibly fantastic system, but it needs feeding.

Yeah, I think you have in essence a similar message: science and policymaking can be empowered by data and complex infrastructures, but this requires a lot of dedication, effort and sustained funding to make it work for the user. So thanks for these comments. I don't see more comments coming through the chat, which is good because we are over time, but for everyone who is connected: I hope you enjoyed, as I did, the science and the challenges for the environment that we have discovered today with Alexander and Richard. Before I hand over to Sara for a final announcement, I'd like to say a few things. First of all, to our project management board, with Per Öster leading it: when we discussed the conference and what to do with it, they enthusiastically supported the idea of a digital conference. This required effort from Trust-IT to put everything in place for the presenters, and we were not too sure how successful it would be — it was the first time — and I think it has been overwhelmingly successful; we have enjoyed large audiences in many sessions. So thanks for these efforts and for the support from the PMB. I'd also like to thank specifically Sara and the team at Trust-IT for their dedication to this conference, for making it happen and for making sure that all the technical parts, which are not visible, worked seamlessly. So thanks a lot for that, from the EOSCAP consortium and everyone who attended these two days. And with this, and with a virtual applause for the
audience and for the speakers, I'd like to give the floor to Sara for wrapping up the session.

Well, thank you very much, Tiziana — you were also very much behind the scenes — and I would like to add extra thanks to the communication team of EOSCAP — Julia, Rob, Diego — and my other colleagues at Trust-IT, of course. And clearly I want to thank all the chairs and the speakers, and the EOSCAP work package leaders as well, for the huge work that they have done. Before closing, as Tiziana announced, we have a final announcement that you will now see on the screen. We want to announce the final EOSCAP event, which will be organised together with SSHOC, the cluster project for the social sciences and humanities, and the FREYA project, which is the project working on PIDs. We are working on the agenda right now; the idea is to have a three-day event in Amsterdam in the week of the 16th of November. We will announce a save-the-date soon, hoping that this COVID situation will be over by then; otherwise, after this very successful experience, I think we can probably go virtual as well. The idea is that the event will open with the SSHOC stakeholder forum, targeting very much the social sciences and humanities communities, where EOSCAP will also showcase the specific thematic services serving this community. The second day will be mainly dedicated to the EOSCAP results, and the third day to the FREYA results, as FREYA is finishing in November. That said, I just want to remind you all that all the presentations are online and linked to the agenda on the EOSCAP website; the recordings of the sessions will also be available online soon, and in ten days or a couple of weeks' time we will share with all the participants the latest edition of the EOSCAP magazine, which will include a post-event article. For that article it would be great if we could take a group picture right now, so if you can all turn on your videos, we will try to make this experiment work and we will feature the picture in the EOSCAP magazine. So I'll give you a couple of minutes to turn on your videos. OK, I see many faces there, so I'll wait for the green light from my colleagues — are you taking the picture now? Please smile! Oh, it's nice to see all these very well-known faces. This takes a couple of minutes because there are still many people connected, so stay with us for another couple of minutes. I think we are done. So with this, I wish you all a nice evening — and a long weekend, for those having one — and see you in Amsterdam in November, I hope. Goodbye, everyone. Bye. Thank you, everyone. I'm done. Bye.