Thanks, everybody. The Geoscience DeVL work is work we've done in the geoscience community under the Data enhanced Virtual Laboratories (DeVL) program. The umbrella organisation there is AuScope, the national research infrastructure facility for geoscience in Australia. The funding, as we all know, came from ANDS, Nectar and RDS, now ARDC, and the partners in this particular project, apart from AuScope and NCI, were the University of Adelaide, the Australian National University and CSIRO; I myself am from CSIRO.

I'll give a quick summary of the many small sub-projects included in this Geoscience DeVL activity, all of which have done amazing work. So I'll give a short overview of the overall accomplishments so far, then drill down into one or two particular aspects that I'm most familiar with and that I hope are of interest to the broader community.

The first thing we've done is create a landing page for all the common services and data portals in the AuScope geoscience space that we look after or feel affiliated with. That's a new page available through the AuScope website, and you'll see all kinds of web portals listed there, as well as references to analytics codes and how they are linked; I'll give a little demo of that later on. Since this is a technical project I should talk about technology: it's an Angular 6 application, and the unusual thing about it is that the content is not hard-coded or kept in a database. It's actually connected to a software registry that we also developed with AuScope and ARDC funding over the previous couple of years. All the codes and portals listed there are dynamically retrieved from that service, so as soon as people register new services or new codes, they are immediately available and listed here as well.

The second activity was the launch of an IGSN minting service.
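The registry-driven pattern described here — the landing page builds itself from whatever is currently registered, rather than from hard-coded content — can be sketched roughly as follows. This is a minimal illustrative sketch, assuming a simple entry schema; the field names and example entries are hypothetical, not AuScope's actual registry API.

```python
# Minimal sketch of a registry-driven listing page: the page's content is
# rebuilt from the registry's entries on every request, so newly registered
# portals or codes appear with no redeploy. The "kind"/"name"/"url" schema
# below is an illustrative assumption, not the real registry's data model.

def render_landing_page(registry_entries):
    """Group registry entries by kind and return a simple text listing."""
    sections = {"portal": [], "code": []}
    for entry in registry_entries:
        if entry["kind"] in sections:
            sections[entry["kind"]].append(entry)
    lines = []
    for kind, title in (("portal", "Web portals"), ("code", "Analytics codes")):
        lines.append(f"{title}:")
        for entry in sorted(sections[kind], key=lambda e: e["name"]):
            lines.append(f"  - {entry['name']} ({entry['url']})")
    return "\n".join(lines)

# Example: registering a new entry changes the page with no code change.
registry = [
    {"kind": "portal", "name": "Virtual Geophysics Laboratory",
     "url": "https://vgl.example"},
    {"kind": "code", "name": "escript magnetic inversion",
     "url": "https://codes.example/escript"},
]
print(render_landing_page(registry))
```

In the real system the `registry` list would be fetched over HTTP from the registry service rather than defined inline.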
IGSN stands for International Geo Sample Number. The idea is to link physical samples from around the world — when people go out and collect rocks, or drill — with a unique number, and that number can then be used when referencing, discussing or tracking those samples. It originated, as you can imagine, in the geo community, since it's called the geo sample number, but it's reasonably broad in the way it works, so it can easily be extended to any physical sample you want to associate with a unique number. We proposed that work, and ARDC saw the opportunity there and actually offered to roll it out as an ARDC service, and they did that work, which I think is fantastic — Julia Martin and her team did great work there. So that is now available as an ARDC service, the geoscience community through AuScope is the first customer, and geoscientists can mint sample numbers there. But if you have physical samples in your community and you're interested in using that service, please talk to us or to ARDC about how you can join.

The next activity is the AusPass portal, developed by ANU with AuScope, which is a web portal to make passive seismic data available. I think ANU is hosting around 60 terabytes, and through this portal they will make it available to the community.
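The core of the minting idea — a registered namespace prefix plus a local sample code yields a globally unique, referenceable identifier — can be sketched like this. The `CSIRO` prefix, the zero-padded counter and the validation pattern are illustrative assumptions for the sketch, not the real IGSN allocation rules.

```python
import itertools
import re

# Hypothetical sketch of sample-number minting: each physical sample gets an
# identifier built from an allocator's namespace prefix plus a running local
# code. The prefix, padding and regex here are assumptions for illustration,
# not the actual IGSN syntax specification.

IGSN_PATTERN = re.compile(r"^[A-Z]{2,10}[A-Z0-9]{1,20}$")

def mint_igsn(namespace, counter):
    """Build a candidate identifier from a namespace prefix and a number."""
    igsn = f"{namespace}{counter:06d}"
    if not IGSN_PATTERN.match(igsn):
        raise ValueError(f"not a valid identifier: {igsn}")
    return igsn

# Minting three sample numbers for a hypothetical drilling campaign:
counter = itertools.count(1)
samples = [mint_igsn("CSIRO", next(counter)) for _ in range(3)]
print(samples)  # ['CSIRO000001', 'CSIRO000002', 'CSIRO000003']
```

In the real service, minting also registers metadata (who collected the sample, where, when) against the identifier so that references in papers resolve back to the sample's record.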
The Virtual Geophysics Laboratory, VGL, is a data discovery and analytics portal that AuScope operates and that has been developed over the last couple of years, and this project funded some new developments there. We're currently porting it — it was a JavaScript front end with a Java back end — replacing the front end with a modern Angular front end and adding some new features within the project. I'll talk about this a little more further on as well.

In addition to these developments, NCI did great work to add actual datasets and make them available to the community: they added something around 3,000 geophysics datasets to their repository, and through the AuScope portal or the geophysics laboratory we can access that data, process it and run analytics models on it. NCI, in collaboration with the University of Adelaide, also transferred about a terabyte, I think, of magnetotelluric datasets to NCI, and they're available through VGL, the Virtual Geophysics Laboratory, as well. I didn't spell out "magnetotellurics" on the slide because I don't know how to spell it — it took me about six months to learn how to pronounce it — but apparently that's a great achievement, and all these datasets are now available there.

In addition, there were some publications, presentations and outreach activities. I'm not listing them all here; if you're interested, the interim report for the project lists them all and you can have a look there. So that's a broad overview of the activities we are undertaking, or have undertaken, under this project, and I'll drill down into two of them a bit more and show them briefly. One is the front end, the AuScope Virtual Research Environment front end. The idea there is to make it easy for people to go to the AuScope site and see what web portals and web services are available in the AuScope community, and what analytics codes are available. If you want to try it yourself, you can just go to avre.auscope.org and you will see
this page here. Currently there's not that much on it yet — we only released it a couple of days ago — but the idea is to grow it and eventually list everything in the community here and make it available. What you see here are all the web portals currently under the AuScope umbrella: the Virtual Geophysics Laboratory, the Discovery Portal, an Underworld training course based on Jupyter notebooks in a Docker environment, and the IGSN minting service I mentioned before. So that gives you an overview of the web portals.

If you're looking for specific analytics functionality — for example, there's one here called escript magnetic inversion — and you want to know which of these applications actually support it, you can click it there and see that if you want to run it, you can go to the Virtual Geophysics Laboratory and use that code directly. There are also some more details here showing a bit of the metadata associated with these entries, but I won't go into detail. It's all backed by the software registry we run, but that's all in the background, so I'm not going to show it here — Peter, I don't think we have time for that.

What you can do is launch one of these services, in this case the Virtual Geophysics Laboratory, which allows you, for example, to look for datasets. These datasets are not hosted in the laboratory itself; it links back to repositories, I think in this case at NCI, but Geoscience Australia has repositories too. You can see where they are, and you can overlay them on the map to get a preview of the datasets — these are gravity anomaly datasets, and you can zoom in there. That's what you would expect from most data portals. Where this goes further is in the analytics capabilities it offers: if I find a dataset that looks interesting and I want to run analytics on it, I can collect the dataset and capture it directly from that portal. I
can create an analytics job — I still need to sign in here — to process that dataset directly in the virtual laboratory. For that I submit a job: the dataset I just captured is here, and I can add to it — I can upload files, or point to other web services if I need to add other datasets. The next tab connects again to the software registry, the software app store I was talking about earlier, and here we find the escript code we were talking about, so I can select it. Again, the virtual laboratory goes to that service, discovers what the input parameters for that service are, populates values, and I can select the dataset I had — let's skip that step. In the next step, all I have to do to actually execute the code I selected on the dataset I selected is tell the system where to run it. We support the Nectar research cloud, we support Amazon Web Services, and we support Raijin, which is the supercomputer at NCI. You can't see Nectar here because I logged in with my Gmail account instead of my AAF account — if you log in with AAF you can run on Nectar, but if you log in with a Google account or something like that you only have the option of Amazon or Raijin. If I wanted to run this on Amazon, which I've just selected here, all I need to do is select what type of virtual machine it should run on. If I want to run it on Raijin, the parameters are slightly different: I just need to say how many CPUs I want, how much memory in gigabytes, how much disk space, and the walltime. I click next, review it, and then I can submit the job. So this makes it very easy for people to go to the virtual laboratory: very easy to discover datasets — it's backed by petabytes, or whatever, of geophysics data that you can browse and search through, depending on facets and keywords — very easy to find analytics codes to run, and very easy to provision them. So you don't need to know how to use the Amazon cloud, you
don't need to know how to start a virtual machine and look after it, and you don't need to know how to use an HPC facility like Raijin or what a PBS system is, and so on. The virtual laboratory handles all of that for you in the background: it knows how to start virtual machines on Amazon, it knows how to put jobs into the queue on Raijin, and it will do that. You only have to select what you want to do, and it goes off and does it. That might take a little while — you can see the progress here; it's currently in "provisioning", which means it's probably somewhere in the queue, so I'll just skip to another one. Once it's finished you get an email, you can download the results of the job, you can get a preview image of the results in case the job supports that, you get log files and so on. You can then start new jobs based on those results, or download them to process locally.

That's just a short introduction to these two technologies. I think my 10 minutes are probably up, so thank you very much — are there any questions?

Thank you, Carsten. We will leave Q&A until after all four talks, for questions that may apply to all of them.
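The resource-selection step described in the talk (CPUs, memory in gigabytes, disk space, walltime) maps fairly directly onto a PBS job script that the laboratory could generate behind the scenes. The following is a hypothetical sketch of that translation — the directives follow common PBS conventions, but the actual script VGL submits to Raijin is an assumption, as is the `run_inversion.py` command.

```python
# Hypothetical sketch: turn the four resource inputs a VGL user enters into a
# PBS job script. The directive names follow common PBS usage (ncpus, mem,
# jobfs, walltime); the real script VGL generates for Raijin is an assumption.

def make_pbs_script(cpus, mem_gb, disk_gb, walltime_hours, command):
    """Return a PBS job script requesting the given resources."""
    return "\n".join([
        "#!/bin/bash",
        f"#PBS -l ncpus={cpus}",
        f"#PBS -l mem={mem_gb}GB",
        f"#PBS -l jobfs={disk_gb}GB",
        f"#PBS -l walltime={walltime_hours}:00:00",
        "",
        command,
    ])

# A 16-CPU, 64 GB, 2-hour job running a hypothetical inversion script:
print(make_pbs_script(16, 64, 100, 2, "python run_inversion.py"))
```

The point of the laboratory is exactly that users never see this script: they fill in four numbers, and the system writes, submits and monitors the job for them.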