And let me share my screen so that I can start my presentation. Okay, so let's start. Today's presentation is on remote sensing platforms, and I will specifically focus on an airborne remote sensing system for polar applications, developed specifically for Svalbard research. As was mentioned in the introduction, I work for SIOS, which is also the northernmost Copernicus Relay on the planet, through which we provide information about the Copernicus satellites and the data available for the Svalbard region. Since this is a remote sensing talk, let's start with ground-based remote sensing and a picture of how fast things change in polar regions. This is a picture from yesterday, taken while I was walking to my office. I sit in Longyearbyen, which you can see on the map; the exact location of my office is shown by the arrow, next to the cantina. This is the picture I captured yesterday, and this is today. You can see how significantly things can change in polar regions, and that is one reason why we need sensors and more information about the polar regions. So with this picture, starting from ground-based remote sensing, let me outline today's talk. First I will introduce SIOS, and then I will focus on the remote sensing activities of SIOS and introduce the new remote sensing platform we have launched in the last two years. Starting with the first topic: SIOS and the SIOS data management system. To begin with, here is why the Arctic, and Svalbard in particular, is so prominent. The first news item was in 1922, when the American consul at Bergen reported that the Arctic was warming, as observed by seal hunters, explorers and fishermen. That is the first record of the warming.
And now we have different sensors and datasets showing that the Arctic is no longer warming twice as fast, but already three times as fast as the rest of the globe. Even within the Arctic, the warming in Svalbard is high. You might remember last year's news, when in the summer of 2020 we broke a 40-year temperature record by reaching 21.7 degrees. So this is why we have an observing system in Svalbard: because of this warming and the pressing questions about Svalbard. SIOS is a Norwegian-initiated international collaboration to create an observing system focused on open, free and organized data. Our regional interest is the Norwegian archipelago of Svalbard and the associated waters, and we focus on Earth system science questions. Our vision is to be the leading comprehensive long-term observing system in the Arctic, serving Earth system science and society. Our mission is to develop an efficient observing system, share technology, experience and data, close knowledge gaps and decrease the environmental footprint of science. As I mentioned, it is a collaboration of institutions: we have 26 institutions from nine countries, and you can see the logos of the institutions and countries involved in this observing system. So it is a high-Arctic observing system, motivated, as I mentioned, by the most severe temperature increase in the Arctic. In the Norwegian high Arctic, the archipelago of Svalbard already has substantial research infrastructure: the research stations in Ny-Ålesund, Longyearbyen, Hornsund and Barentsburg, and extensive institutionalized international cooperation for more than 15 years between institutions from about 20 countries, with coordinating committees such as the Ny-Ålesund Science Managers Committee (NySMAC), the Longyearbyen science and education forum, and the Svalbard Science Forum.
In addition, we have high data transfer capacity between Svalbard and the Norwegian mainland, and in the future between Longyearbyen and Hornsund. Svalbard has the infrastructure to study both upper and lower atmospheric layers, and unique potential for the use of satellite data: a high overpass rate, on-site data collection, and also calibration and validation. We have harbour facilities for research vessels in several locations. All of this shows that we have high-quality research infrastructure for doing science in the Arctic. So what does SIOS work towards? As you can see in the picture, we have the ground-based research infrastructure dotted across Svalbard, and the satellite-based remote sensing observations, and SIOS works towards the integration of these new and existing infrastructures and data. It is a network of systematic observations designed to give better temporal and spatial coverage of key observational data, and of course reliable access to long-term monitoring data in Svalbard. In a broader sense, it is about improving research conditions for scientists working in polar science. With this basic introduction to SIOS, let me describe how the observing system's data management works: the SIOS data management system, called SDMS. We have institutional data centres, each with its own procedures and its own set of data management facilities for ingesting new data and associated metadata, maintaining the datasets, and exploring the data. SIOS does not change these datasets but bridges them from the data centres and integrates them through dedicated working groups in SIOS. The approach we follow is that the SDMS is built to be dataset oriented.
This means that the datasets and their descriptions are critical for an efficient system. The open data space approach means that we do not set strict boundaries on what is considered a dataset, as its form can vary between disciplines. The networked approach means that we do not have one central storage but a network of connected data centres that provide data and work together, and as you can see, interdisciplinarity enters the picture already with this approach. Here you can see the different institutional data centres sharing their data through SIOS-KC, and we even have data centres from outside the SIOS member institutions. SIOS focuses on Earth system science, and to support this effort, data and products describing the relevant processes are required. These can result from long-term monitoring efforts or from ad hoc efforts focusing on process studies at different institutions, but establishing a virtual data centre that offers unified access to the relevant data is the task of SIOS-KC, the Knowledge Centre, where we unify all these data centres into the SDMS. Here you can see that we have portals from the SIOS repositories and also from non-affiliated sources. The SDMS harvests metadata from the member repositories as well as from non-affiliated repositories; the non-affiliated ones are those holding data relevant for us, for example measurement data from Svalbard and the surrounding areas, that are not part of SIOS. These data are then made available through the SIOS data portal. The information about the datasets is harvested on a daily basis and wiped once a month; this is done to ensure that, if there are updates in the metadata or the datasets, we are not left pointing to stale or removed datasets.
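The daily harvest and monthly wipe-and-rebuild cycle just described can be sketched in a few lines of code. This is a hypothetical illustration only, not the actual SDMS implementation: the repository names, the record format and the `Catalogue` class are all invented for the example.

```python
"""Sketch of a metadata harvest-and-refresh cycle, loosely modelled on the
SDMS behaviour described above (daily harvest, monthly wipe). All names and
record formats here are invented for illustration."""

from dataclasses import dataclass, field


@dataclass
class Catalogue:
    """A virtual data centre: a local cache of harvested metadata records."""
    records: dict = field(default_factory=dict)

    def harvest(self, repo_name, remote_records):
        # Daily harvest: pull the current metadata from one member repository.
        for dataset_id, meta in remote_records.items():
            self.records[f"{repo_name}:{dataset_id}"] = meta

    def wipe_and_rebuild(self, repos):
        # Monthly wipe: discard everything and re-harvest, so records for
        # datasets renamed or removed upstream do not linger as stale pointers.
        self.records.clear()
        for repo_name, remote_records in repos.items():
            self.harvest(repo_name, remote_records)


# Invented example repositories (stand-ins for member data centres).
repos = {
    "repo_a": {"sea_ice_2020": {"title": "Sea ice extent 2020"}},
    "repo_b": {"chlorophyll": {"title": "Kongsfjorden chlorophyll"}},
}

catalogue = Catalogue()
for name, recs in repos.items():
    catalogue.harvest(name, recs)

# Upstream removes a dataset; the monthly wipe keeps the cache consistent.
del repos["repo_a"]["sea_ice_2020"]
catalogue.wipe_and_rebuild(repos)
print(sorted(catalogue.records))  # only the surviving dataset remains
```

The design point is the second method: a pure incremental harvest would never notice upstream deletions, which is exactly why a periodic full wipe is part of the cycle described in the talk.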
So that is how we handle data with the SDMS. On top of all the data, we also define SIOS core data. The SIOS core data have been defined to optimize the resources contributed by the SIOS research community: the core observational programme of SIOS should provide the research community with systematic long-term observations, yet be flexible enough to integrate upcoming new methods and research questions. There are three criteria for defining SIOS core data: scientific requirement, members' commitment and data availability. We have the Science Optimisation Advisory Group, with a task force that defines the process of identifying core data, and the criteria are based on standards of scientific excellence in Earth system science in the SIOS framework and on the SIOS data policy. For now, we have almost 51 variables classified as SIOS core data, grouped into atmosphere, cryosphere, terrestrial and ocean. But I will not go into the details, because this is not the focus of today's talk. If you are more interested in the SIOS core data, you can go to the SIOS website and see what kind of core data we have defined so far.
So that was basic information about SIOS, how we handle data and what is defined as SIOS core data. With this, I now come to today's actual topic: the remote sensing activities of SIOS, starting with why remote sensing matters in polar regions. This is a classic illustration I downloaded from the internet: field campaigns across the polar regions are very difficult, if not impractical, and Earth observation and remote sensing provide a cost-effective means to acquire synoptic coverage of the polar regions from space. As an example, on the left you see Heikki Lihavainen, the director of SIOS, digging a snow pit in Antarctica. Why do we need innovation in remote sensing? Because, to obtain such measurements, we do not need this scene to be repeated across the whole of Antarctica or the Arctic. This is just one example of why we need innovative remote sensing observations: to reduce the environmental footprint of science. I have talked about remote sensing in polar regions generally, but why remote sensing in Svalbard and the Arctic specifically? Sometimes remote sensing is the only way to obtain certain information in Svalbard. For instance, because of the pandemic most scientists could not reach Svalbard last year, and a much reduced number of field activities is happening this year. In the Arctic there is a lack of adequate ground infrastructure and of adequate communication systems, and remote sensing is also environmentally friendly, as I demonstrated on the last slide. But do note that Svalbard has among the best available infrastructure in the whole Arctic, and that makes calibration and validation of remote sensing data attractive here. Remote sensing can be many things: satellites such as the Sentinels, sounding rockets and balloons, radars, unmanned aircraft, remotely or locally guided drones, or fixed measurements. From the SIOS perspective, SIOS is an infrastructure endeavour, but the infrastructure must be closely connected to a clear scientific case. Ideally the science case should come before the infrastructure, but sometimes new technology enables capabilities not initially envisaged scientifically. Remote sensing infrastructure is a necessary tool for providing data to most scientific cases, and sometimes it is an additional or supplementary tool alongside other observational tools. In SIOS we have the SIOS remote sensing service, which I lead. Our remote sensing service functions as the single point of contact for satellite information for Svalbard and assists with access to remote sensing data. It also informs about the potential of satellite data and how to use it, for example through training courses, and we aim to be a forum that brings together product users and providers in order to improve the usability of satellite and remote sensing data through collaboration. Integration, in the SIOS perspective, is a buzzword: it covers a broad field of science and is based on the integration of different scientific fields or spheres. In the Svalbard perspective, integration applies to all of the infrastructure, and remote sensing covers most of the in situ measurements and thus has the broadest integration capability. For instance, we have a lot of satellite data over Svalbard and a lot of in situ observations, connected by ground measurements. Integration also implies the capability of making data available; the value of SIOS increases with the availability of all the data, and remote sensing data have come the furthest, with close coupling to the Norwegian ground segment for satellite data and to other open data sites. On integration and cal/val: in this picture you can see chlorophyll in Kongsfjorden, with the ground measurements shown as dots. Svalbard has the best infrastructure among high-Arctic sites, as I mentioned, and this is ideal for satellite cal/val; new satellite instruments should take into account that this is important for their calibration and validation. SIOS works towards this because satellite owners need the best cal/val to create the best measurements and justify their large investments, and SIOS contributes to planning, collaboration and integration by facilitating the dialogue between the satellite owners, or space agencies, and the field scientists who collect the data in the field. That was a general introduction to the SIOS remote sensing service, but now I come to today's topic: the new remote sensing platform in Svalbard. Here is the new platform from SIOS and NORCE; NORCE is the Norwegian Research Centre, which is part of the SIOS consortium. This is Norway's first research aircraft: a passenger aircraft with high-resolution remote sensing capabilities. Why is this such a good idea, and what are the arguments for this pod project, putting sensors into a passenger aircraft? First, satellite optical instruments require cloud-free conditions, which is a challenge on Svalbard even during the summer months because of the extensive cloud cover; if you browse the Sentinel coverage of Svalbard you will find many images obscured by clouds. With this platform we can fly under the clouds; even in low-sun conditions, or when there are clouds, acquisition is still possible. The second argument is that the platform is already in Svalbard and can collect data when the
conditions are right. We have fixed flight lines between Longyearbyen and Ny-Ålesund, between Longyearbyen and Station Nord in Greenland, and between Longyearbyen and Svea, and high spatial and temporal resolution can be achieved: for instance, around 25 trips per year along the same transects. That is why it is a cost-effective tool for dedicated remote sensing missions, and it can be used for collecting research data during normal passenger flights to Ny-Ålesund or logistics flights to Svea or Station Nord. In addition, it can help fill the gap in the emergency preparedness system for the High North. What is this payload about? It is the LN-LYR Lufttransport payload integration, and the important information here is the design requirements: no capacity reduction with regard to passengers and cargo, because delivering passengers and cargo to the different sites remains the aircraft's chief operational application; preparation for semi-automatic operation of the pod and the sensors inside the aircraft, flexible enough to be monitored by the pilots; and we also considered a sliding door for dropping buoys or other measuring equipment into the ocean. This diagram is for the technologists and engineers: you can see the sensors, the Phase One camera, the IMU unit and the other sensors, connected with the system within the aircraft, and a controller that acts as the brain tying all these sensors together. What is the current payload on this aircraft? The first and most important sensor is a hyperspectral imager covering the visible and near-infrared spectrum, made by Norsk Elektro Optikk. It is useful for the detection and classification of objects, for example vegetation, oil on water, or algae. It has a spectral resolution of around 3 nanometres, and we get more than 160 bands from this hyperspectral imager. In addition, we have a medium-format camera, the Phase One camera, for generating high-resolution orthorectified photographs and 3D models of the terrain. Then we have a radio to retrieve data from buoys and relay it for data sharing and coordination, an AIS receiver for better real-time coverage around Svalbard, and GNSS for accurate, direct georeferencing of the images. If you are wondering about the range of the Dornier: it has a round-trip range of around 2400 kilometres from Longyearbyen, drawn as a circle on the map. For instance, there are approximately 20 flights per year from Longyearbyen to Station Nord in Greenland, and weekly flights to Ny-Ålesund and Svea, and the plane can also be hired for charter measurements or dedicated missions. What are the opportunities for Arctic science? Here you can see Arctic mammals: detection, counting, localization and size estimation are things we can think about. There are also research applications such as sea-ice properties and dynamics, vegetation mapping for Arctic biomass and ecosystems, population mapping, and glaciology, for instance generating digital elevation models using the Phase One camera; these data can be used for studying the dynamics and mass balance of glaciers, as well as snow cover, albedo and ocean colour, for instance chlorophyll, primary production and algal blooms. These are some
of the applications of the hyperspectral sensor and the Phase One camera on the aircraft. Coming to how we plan the missions: it is important to note that the resolution requirements determine the flight altitude, which is why, when scientists ask to use this aircraft, we need more information on what kind of images they are looking for. We also determine the overlap and sidelap, as you can see from the camera geometry; these also depend on the requirements. Basically, the overlap and the altitude determine the distance between the flight lines and between successive images. The flight lines are calculated by software at NORCE, which also calculates the other parameters for the camera. We classify all missions into three risk levels depending on the distance from Longyearbyen: for instance, we can operate within 250 nautical miles of Longyearbyen without any extraordinary measures, but missions must have prior permission from the Governor of Svalbard based on the risk level. Most importantly, when applying for usage, the ground resolution is an important parameter. How do we conduct the operation? It is a semi-automatic operation. The pilot's responsibilities include planning with regard to the next flight lines and programming the flight management system: the payload is started before takeoff, after the generator starts; status control of the payload is done via iPad; and the name of the next flight line and the waypoints, the same as in the flight management system, are provided to the pilot. Then we have a ground crew that connects the ground power, switches the payload computers on, and then dumps the data to the ground computer via internet cable after the missions. NORCE is the member institution that takes care of the software maintenance, the processing and validation of the data, and the geocoding, and we then share the data through SIOS, with metadata through NLIFE on NORCE's website. There is also automatic recording: the software detects when the aircraft is lined up on an active flight line. Here you can see the red line, which is the corridor where we capture data, and the additional blue line where the aircraft turns away from the waypoint; that is calculated manually beforehand when planning the mission, based on the work area we are focusing on. We do not capture data while the aircraft is turning between waypoints, and this is why the operation is semi-automatic. Some improvements are ongoing, such as scheduling software with auto-generation of flight lines, which is currently done manually using a QGIS planner, and automatic instrument control systems for startup, checks and shutdown and for camera control, which are also semi-automatic at the moment. Finally, the data sharing and storage system: currently the processing consists of several steps using multiple different software packages once we get the data out of the aircraft, and some software is proprietary. This means it takes much time between receiving the data and making it available to scientists, and also much computing: we collect terabytes of hyperspectral data from the aircraft, and it takes a long time to transfer these data from the aircraft to the computers. That is why we are still working on making it
more automatic. With that background: we first tested this system in September 2019, when we received around 50 requests for the aircraft. These are the locations of the requests, but we could cover only a few of the areas, such as Kongsvegen, some of the islands and part of Longyearbyen. At the bottom, for instance, is the image, and this is the hyperspectral data; you can see what the typical flight lines generated over an area submitted by the researchers look like. We do not capture the whole area, only the specific strips along the flight lines. And as I mentioned, even with the aircraft we had bad weather for three days, when it was difficult to fly. This is an example of the results from these missions: a mosaic and an elevation model generated using the Pix4D software, with ground control points, and you can see an accuracy of around two to seven centimetres in x, y and z. That is quite excellent accuracy for the Dornier mission over Ny-Ålesund, but it also depends on the terrain: the Ny-Ålesund area is quite flat, and if you capture datasets in different terrain conditions the accuracy might vary, so this is still being worked on. We learned some lessons from this test campaign in September 2019. For instance, many requests were partly overlapping or close together, so when we receive such requests we now combine multiple requests into single flight lines where possible, which makes the flying easier for the pilots and reduces the cost of the flight lines. And as I mentioned, generating flight lines and importing them for the pilot is manual, with a risk of entering wrong coordinates, so that is also something we can improve on. The data processing is not streamlined yet, but we are focusing on that now, and with these two years of experience we should reach a point where we can provide the data very soon after capture. Another example of the supplementary data we can collect from this Dornier is data from buoys. We had two flights: on 15 January 2020 the attempt failed because the radio on the buoy never responded, but on 19 February we tried another one and collected around four gigabytes of data from the buoys using the radio on board. So this is a supplementary sensor that can be used for collecting data when it is not possible to travel by ship or when the buoy is not connected to satellites. As for future plans, a radar on board is planned but not realized yet. This is mainly because Norway has strong SAR expertise, and in Svalbard we have dark periods in the winter, which is where radar comes into the picture; it is also relevant for seeing through cloud cover. This was proposed by NORCE but is not implemented yet; it is in NORCE's future plans, and I think we will hear more about it in the coming year. So far I have covered only the platform, but some of you might not know about hyperspectral imaging, so I have added a few slides on what hyperspectral imaging is, and I will go through them quickly because I have limited time. A hyperspectral sensor captures data on different substances in different wavelengths; the result is a data cube.
For instance, a quick example: in this picture there is one artificial plant, and there is a Lego man hidden in the picture, but we cannot identify them with our eyes, because our eyes have limited capacity. The spectral properties of cameras come in three different modes: monochromatic, multispectral and hyperspectral. In hyperspectral mode you see the image in tens or hundreds of narrow spectral bands, and hyperspectral imaging captures the data as data cubes. Here you can see how the colours of this picture change at different wavelengths, such as 473 nanometres or 547 nanometres. Since the human eye can only see three bands at a time, band selection, and trying different band combinations, is important when working with hyperspectral imagery. Even when changing bands you cannot easily identify which part is artificial, but by playing with the band combinations you can identify the artificial plant based on its spectral signature, and you can also find the anomaly in the spectral signatures: the missing Lego man appears as the bright spot on the left. So image processing is an indispensable part of a hyperspectral sensor system; we have the camera, the processing and the presentation. Computers cannot simply visualize hyperspectral data: we have to use different band combinations to view and process the data. There are many advantages and disadvantages of these heavy datasets, but I will not go into detail; this was just an introduction for those who do not know much about hyperspectral sensors. With that, in the last few minutes, I will quickly go through the SIOS airborne campaigns. Using this aircraft and the hyperspectral sensors we ran missions in which SIOS supports scientific projects using hyperspectral data and aerial imagery, from both the aircraft and drones. This idea was developed last year, when most scientists could not travel to Svalbard and there was a threat of gaps appearing in long-term time series datasets. This is why we invested around two million kroner to support scientific projects from SIOS member institutions. As I mentioned, we provided these two cameras on board the Dornier: one that captures two images per second, with around 10 centimetre ground resolution and an 800 metre swath from 1000 metres altitude, and the hyperspectral sensor, which captures images at around 30 centimetre resolution from 1000 metres altitude. We offered this opportunity in the 2020 campaign and in the 2021 campaign, which just concluded in the first week of September. In 2020 we supported around 10 projects and spent 25 flight hours, and in 2021 we supported around 21 projects. To list some of them, and you can get more information about these projects on the SIOS website, we had a variety of projects: glaciologists, terrestrial ecologists and terrestrial vegetation scientists, and even bird researchers, got images that will be helpful for their research. In 2021, again, we have a variety of projects, from icebergs and counting reindeer to the evolution of sorted periglacial circles and mapping the surface properties of glaciers. So we have supported a variety of projects in the last two years. With this, I will quickly go through some of the applications, which are just preliminary results from these campaigns. The first is the identification of crevasses. This is
very important in swanport when you conduct the field survey and here you can see in the left side you see the image from the dawnier and the on the satellite how efficiently we can see the crevices using the dawnier images and that is quite important when you carry out the field work using the snow scooters and here another example of the same that you can see that more examples how much detail on aerial photograph can provide in comparison to the satellite data so not only this but interesting these are the reindeers seen from the aerial pictures and that's that's quite significant because the reindeer counting in swanport is still a manually laborious job to do that every year and with the new technology we with this like dawnier images of the drones that can be that can be measured remotely and with efficiently with the covering large areas so this is one example and of scientific case for terrestrial biologists and not only this we we also train the next generation of scientists to process this data and we recently concluded the science hyperspectral remote sensing training course and there were around 40 35 participants selected out of 80 applications and they also worked on the many projects and some of the many projects are presented recently and I can give some examples for instance in the derivation of the relative chlorophyll concentration using the normally data but this is this is just a preliminary analysis and it is not validated with the ground data so it it just happened it just happened in those two weeks and we are thinking to get get it get this these efforts together to make a final report of this all these efforts together and finally this is this is the the special issue we support we host on the remote sensing journal where we support such application from the dawnier data or the hyperspectral data sets and we are planning to continue until 31st of December and if you are interested to have if you have a test study of using the hyperspectral 
data in Svalbard, you are more than welcome to contact any of the editors. With this, I am on time, I guess. Thank you for listening, and I am happy to take some questions. My email is on the slide, but I am not sure it is visible, so I can put it in the chat box if you want to get in touch. Thank you very much for listening again, and over to questions.

Well, thank you, Shridhar, that was a fantastically thorough overview of everything that SIOS does in a very short amount of time, so I really appreciate that; you will need a lie-down after this. We have questions: I have a few from the panel and there is one in the chat box. Zongbo Shi asks: could the remote sensing identify potential dust source regions and actual dust events? And the second part of that question: when you mentioned surface properties of glaciers, do you mean albedo, or do you mean something else? So, two questions from Zongbo Shi. Oh, you are on mute right now.

Okay, sorry. So, dust storms: I think it is possible to identify dust storms, but I have never read thoroughly about such papers. I remember that during my doctoral research somebody worked on dust storms using satellite data, but I am not in a position to say confidently that it is possible; I would need to read a lot on that, though I remember having seen such papers. On the second part: the surface property of glaciers here is not the albedo but surface features such as the crevasses I mentioned, melting snow, and the different snow facies. Albedo is one of the properties too, but it was not the focus of this project.

Yeah, there is a lot of very interesting work going on on glacier albedo around the world, and I am sure some of that is happening in Svalbard; I know there is some interesting algae work up there, and in fact we have some growing in our lab.
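The relative chlorophyll derivation mentioned earlier in the talk is typically done with a normalized-difference index between a near-infrared and a red band of the hyperspectral cube. A minimal sketch of that idea follows; the band wavelengths, the reflectance values, and the use of a simple NDVI-style index as the chlorophyll proxy are all assumptions for illustration, not the actual processing chain used in the training course.

```python
import numpy as np

def normalized_difference(nir, red):
    """Normalized-difference index (e.g. NDVI), a common chlorophyll proxy."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    # Guard against division by zero over water or deeply shadowed pixels
    return np.where(denom == 0, 0.0, (nir - red) / denom)

# Toy 2x2 reflectance grids for a NIR band (~800 nm) and a red band (~670 nm)
nir = np.array([[0.40, 0.35], [0.05, 0.50]])
red = np.array([[0.10, 0.08], [0.05, 0.10]])
ndvi = normalized_difference(nir, red)  # values near 0.6 indicate vegetation
```

In practice such an index would be computed per pixel over the whole flight line and then, as noted in the talk, validated against ground measurements before any concentration is reported.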
So, some questions from a few members of the panel. What we are curious about is how you manage to integrate all those different kinds of data sources. When you are working with the data from the aircraft, you have very tight control over the data chain, from collection right through to how people can use it. How do you deal with data sources that are collected by different people, how do you ensure that they work in your system, and also how do you ensure that they adhere to FAIR protocols?

Yeah, so this is a good question. For instance, for the airborne datasets, the streamlined processes are, as I mentioned in my talk, not developed yet. What we are doing is developing the protocols for how to make these data available, because this is the first time we have captured so much data, terabytes of hyperspectral data, and we are also thinking about ways to make it available and how to follow the international metadata standards. This process is still going on, so I cannot answer fully, but it is an ongoing task, and I think within the next year you will get more information about this, when the data become available. But yes, we have the SIOS Data Management System working group, with experts from different member institutions who define how to do these things. I am not part of the SDMS myself, but we have a Scientific Integration and Data Officer in SIOS who takes care of them.

Great, so at the moment it is a kind of committee-led effort. Do you think there is any opportunity to automate that?
Yes, this is a dream. The process is semi-automatic at the moment: a research scientist or engineer has to travel a long way, get the data, process it, transfer it to a computer, process it further, and then make it available, so there are many manual interventions. The ideal situation we can dream of is that no engineer has to travel: the hyperspectral sensor sits on the aircraft, you just provide the area where you need the images, and the acquisition happens automatically, the data are transferred automatically to the system, and they are processed into the final product. That is the aim, but I think it will take a lot of time, because there are many technological constraints and because of the size of the data: as we saw, one hour of acquisition on the aircraft gives about a terabyte of data, which is a lot. We will get there, but we do not know when; it is just a matter of time.

Thank you. Yeah, it is a big challenge, especially working with lots of different kinds of data. Are there any kinds of things coming down the pipe in the future that you think will really enhance your science? For example, you were talking about testing of 5G around Ny-Ålesund and Longyearbyen. Do you think there are any future technological opportunities that will really enhance your capabilities?
Yes, I think sensors are developing faster in every sphere: buoys, ocean gliders, even terrestrial sensors. With the hyperspectral sensor we tested, I think Svalbard would become one of the regions with the richest hyperspectral data, and that would change a lot in the future, because with so much data the challenge is to get the derived products out of it, to work out what information we can retrieve. So what is coming up is not only new sensors but new methods to derive this information, and quickly. And I think the pandemic taught us that there are smarter ways to do fieldwork in Svalbard: even when you are not able to travel there, you think about how to work smartly so that you do not have to travel much, even after the pandemic. So the changing situation is also a motivation for many scientists to develop such things. But indeed, the coming sensors, and not only sensors but innovation in research, are key, and that is why SIOS has recently opened a new call for an innovation award. The first award will be given at the Svalbard Science Conference in Oslo on the 3rd of November, so if any of you are joining, you will see that we work on innovation and support innovation in Svalbard.

Excellent, I look forward to hearing from the winners of that inaugural award; maybe they can do a webinar for us next year.

Yeah, sure.

A final question. As a polar scientist, I know that the biggest logistical problem we have in doing polar science is weather. Is there anything else that you particularly have to deal with when managing your data collection and your data validation?
Yes, weather is the big challenge, and clouds, which are part of the weather itself; these are the biggest challenges, at least from the airborne or satellite remote sensing perspective. But another challenge is that satellite sensors are not exclusively developed for polar science; they are developed for other applications. We have very few such sensors: for instance, ICESat by NASA is for studies of changes in the ice sheets, but not many sensors are developed exclusively for polar science, and that is where the gap is. I believe that getting these priorities from polar science through to the space agencies will improve this picture and give us dedicated sensors for this region. That is the problem I feel, apart from the weather conditions.

Yeah, that is a very good point, but you know, with ICESat and CryoSat as well, we have improved coverage at the poles; people are realizing that we do need the data there. And I think programmes like yours can only help governments realize what is possible when you have a lot of data to enable science.
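The fully automated acquisition-to-archive chain envisioned in the discussion above (operator supplies only an area of interest; acquisition, transfer, processing, and publication then run without manual handling) can be sketched as a simple pipeline. Every function and step name here is an illustrative assumption, not SIOS's actual tooling.

```python
def acquire(area_of_interest):
    # In the envisioned system the operator supplies only the target area;
    # flight planning and sensor operation are automatic.
    return {"aoi": area_of_interest, "scenes": ["scene_001", "scene_002"]}

def transfer(flight):
    # Automatic transfer from the aircraft to the processing system,
    # replacing today's manual disk handover by a travelling engineer.
    return flight["scenes"]

def process(scenes):
    # Radiometric/geometric correction producing analysis-ready products.
    return [s + "_L2" for s in scenes]

def publish(products):
    # Register the products, with metadata, in the data management system.
    return {"published": products}

def pipeline(area_of_interest):
    # Chain the four stages end to end with no manual intervention.
    return publish(process(transfer(acquire(area_of_interest))))

result = pipeline("Ny-Ålesund")
```

The sketch glosses over the hard parts named in the talk, notably moving roughly a terabyte per flight hour off the aircraft, which is why the real system remains semi-automatic for now.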