So Loc-I, the Location Index, is the project I'll be talking about today. I think this has been presented previously, but more from a higher level; I'm going to go through it more as a product description: what you can do with it, some future directions, the technical architecture, that kind of thing. Essentially what we're trying to do is enable data integration via linking location data, and I'll explain what I mean by that in a little while. This project has been a partnership between the ABS, Geoscience Australia, the Department of Agriculture, Water and the Environment, and CSIRO, as part of DIPA, the Data Integration Partnership for Australia, which is a Commonwealth Government initiative. So the question we tried to tackle in this project was connecting up location data. As you can see in this graphic on the next slide, what we're trying to do is integrate different sorts of data across different data holdings. In Commonwealth Government there's quite a lot of data being captured across different aspects: from society to the economy to the environment, there are all these different data sources, and they typically geocode the data, whether in geospatial formats or in traditional databases or data warehouses. All of the datasets you can see here, from sensors to health to business activities to weather and climate, have some location embedded in them. Now, the problem is that if you've got one geography and you want to translate it to another geography, you need specialist expertise, or you have to somehow convert the data. This has been a barrier for data analysts in Commonwealth Government and in the private sector, and often you compromise the modelling because you've got to make assumptions.
We're trying to bring together gridded, address and vector data as well, so that's a feature of Loc-I. As I mentioned, location is the thing that brings all of these together. There are a couple of things we wanted to propose. First, have consistent identifiers for spatial things. We need to be able to link locations and observations: if you have a database of socio-economic data, you need a way to link the location with the observations using these consistent identifiers, and by consistent identifiers I mean globally consistent identifiers for the spatial things. Second, we need to be able to assemble differently referenced geographies: whether they're vector or gridded, and whether they're different geographies, we need to be able to make sense of them and crosswalk between them. Part of the Loc-I project was also looking at user archetypes, the types of users who would use this sort of approach or system, and quite a few interviews were done to elicit this set of users. So we've got data analysts and GIS analysts, and the people who broker between them on the right-hand side, who typically develop information products. Then you've got these enterprise data warehouse archetypes who manage the systems that provide the data to those groups. That's important, because if we're going to propose a new way of managing and accessing location data, then these people are going to be key in enabling that data preparation for the users of the data warehouses within different agencies, companies and groups. And there was an extra layer here, because this was a DIPA project: the ABS DataLab, which has a big data analytics capability with fine-grained, privacy-sensitive data. They want to be able to integrate location as well, and one of the primary drivers for this project was to allow them to do that aggregation from unit-level records into different reporting regions.
So that was one of the key things there. Speaking to this group of geospatial experts, you might ask: why can't we do this already? To a certain extent I think we can, but there are a couple of problems. Firstly, we've got not so much a geospatial problem as an identity problem. You might publish geospatial data that uses the same identifier as another geography, and if you're faced with identifiers that are the same but attached to different features, you've got to do some disambiguation. Sometimes you'll have multiple identifiers for the same feature; sometimes the same identifier is used for multiple features. So different sorts of identity problems crop up once you start to integrate these datasets. Sometimes the features don't line up: you might have the same identifier but slightly different geometries, and then there's a question about which one is authoritative, or what processing was done to produce this result. When you're integrating data, that's a problem you have to resolve. The traditional solution is to delegate it all to a GIS expert: you get them to calculate the relationships and give you answers. But that requires GIS tools, and the expertise to operate those tools, and depending on who you give it to, the outputs may vary, because of different approaches to resolving data issues. So the Loc-I approach tries to do a couple of things. The first is to standardise the location identifiers. Instead of having identifiers that are local to a GeoPackage or a shapefile or whatever it is, we want to publish that location information using reliable and consistent web identifiers. We're strongly proposing a Linked Data approach here, where everything is published via the web.
We use the protocols established by the web to uniquely identify things. You can see here an example of two geospatial features that have been created in Loc-I and given a name: this is the identifier, and it's globally unique. The naming structure is important as well, and I won't go into it at this point, but suffice to say that if I click on this link, my browser fires up and gives me a landing page for that feature, somewhere in the ACT, with some metadata and ways of getting machine-readable and other views of the data, so I can start doing some machine processing on it as well as getting the actual data. So publishing these on the web is one of the features of Loc-I, and we want to do that for every dataset; I've shown you two features from two datasets here. The second thing we want to do is standardise the linkages, because the links between different features or geographies are what make data integration possible. So if I want to go from an ASGS Statistical Area Level 2, which is an ABS product, to a Geofabric Contracted Catchment on the right, which is a Bureau of Meteorology product, typically I'd need to do that GIS operation in my desktop GIS. What we've done in Loc-I is pre-calculate that relationship, publish it, and describe it using standard semantics, using GeoSPARQL. So what we're saying here is that this Statistical Area Level 2 is within this Contracted Catchment feature, and you, as the user of Loc-I, don't have to use a GIS to understand that that is a link. Loc-I will provide you that answer in a consistent way, so if multiple users hit the API, they get the same answer. And, as I mentioned, every dataset gets a set of web identifiers.
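The idea of pre-calculated, published links can be sketched in a few lines of Python. This is a hypothetical, cut-down linkset: the URIs and the relationships here are invented for illustration, not real Loc-I identifiers or real Loc-I data.

```python
# Hypothetical linkset: each feature URI maps to pre-calculated spatial
# relationships with features in other geographies (illustrative data only).
LINKSET = {
    "http://example.org/asgs2016/sa2/801051049": [
        ("within", "http://example.org/geofabric/catchment/12104151"),
    ],
    "http://example.org/asgs2016/meshblock/80005100": [
        ("within", "http://example.org/asgs2016/sa2/801051049"),
    ],
}

def features_within(feature_uri, linkset=LINKSET):
    """Return the URIs of features this feature is recorded as being within,
    following the pre-computed links transitively (no GIS computation)."""
    found, queue = [], [feature_uri]
    while queue:
        uri = queue.pop()
        for rel, target in linkset.get(uri, []):
            if rel == "within" and target not in found:
                found.append(target)
                queue.append(target)
    return found

print(features_within("http://example.org/asgs2016/meshblock/80005100"))
```

So a mesh block resolves to its SA2 and, transitively, to the catchment: exactly the kind of crosswalk that would otherwise need a polygon intersection in a desktop GIS.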
These then get published on the web, and if you have a thing with a link to another thing, and many things like that, you start to get a network, a data graph. So that's a feature. As I also mentioned, there are APIs that provide GIS-like operations without the need for desktop GIS tools, and I'll demonstrate that in a little while. Okay, so I'll go to some demos just to illustrate what I mean. Going back to this browser: the first demo is the Loc-I Explorer. What this is, is an explorer of what's available in Loc-I. I can do a text search, for example for Acton, which was the area I showed earlier, and get information about Acton via text search, or I can drop a pin somewhere. Oh, that's a different Acton; let's see if we can find the actual Acton. There we go, we can drop a pin. The zooming is not working, sorry, that's a little bug, but I have dropped the pin where I think Acton is, even though the map is showing somewhere else. The right-hand side then shows me the results: the features that intersect with that point. So I can say, show me the drainage division, which is going to be a bit big, or show me the Statistical Area Level 1. There are quite a few results on the right-hand side here; I can page through them if I want and get all the different kinds. And it's telling me that there is this feature, mesh block 80005100, that intersects with this point. I can get more information about that either in this explorer, or by hitting this link to that mesh block. So that's the mesh block there; I can view that information here, and it pulls in metadata from that page to display here. So this is an identifier, a local identifier, and a link to a geometry.
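The pin-drop lookup we just saw, finding all features that intersect a point, can be mimicked with a toy spatial index. The real service tests true geometries server-side; here the bounding boxes, feature names and coordinates are all invented for illustration.

```python
# Toy "features intersecting a point" lookup. Each feature gets a bounding box
# (min_lon, min_lat, max_lon, max_lat); a real service tests true geometries.
FEATURES = {
    "meshblock/80005100":   (149.05, -35.30, 149.10, -35.25),  # invented extents
    "sa1/8010512":          (149.00, -35.35, 149.20, -35.20),
    "drainagedivision/mdb": (138.00, -38.00, 152.00, -24.00),
}

def features_at_point(lon, lat, features=FEATURES):
    """Return identifiers of all features whose box contains the point."""
    return sorted(
        uri for uri, (x0, y0, x1, y1) in features.items()
        if x0 <= lon <= x1 and y0 <= lat <= y1
    )

print(features_at_point(149.07, -35.28))
```

A point in the overlap of all three boxes returns the mesh block, the SA1 and the drainage division, which is the multi-geography result list the Explorer shows on the right-hand side.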
So I can follow that link to where the geometry is hosted: we've got a service in our system that hosts all the geometries for Loc-I. So this is the geometry view. I'll just pause on the geometry here: we've got a geometry data service that provides just the geometry. I can do things like, give me the GeoJSON, which might be handy, or give me the WKT, which might also be handy for different clients. There's this thing called an alternate view, so I can get things like the centroid view. If I just wanted a centroid, I can get that without having to do any desktop GIS operations, just by specifying the centroid view. So, let's go back. Here you can see a bit of a graph, and I'll explain what it is. We're viewing this feature, the mesh block, and there are two groups of relationships here: a contains relationship and a within relationship. What this is saying is that for this feature, this mesh block, you can navigate to the things it contains, which could be an address, in the highlighted box there. Or you can navigate to the things it's within, and you can see here that it's within a Statistical Area Level 3, a remoteness area, and all these other features. That's because we've pre-calculated that spatial relationship to the other features. Now we can navigate to any one of these things via this interface or via the API. So this is showing you the remoteness area; I think because the remoteness area is big, it's pulling in the geometry, so it won't update until it gets the geometry. So that's the Explorer. It highlights what you can do with the API, and there are some useful operations you can do on this site.
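The alternate-views idea, one geometry resource serving several representations, can be sketched like this. The views mirror the demo (WKT and centroid), but the function names are mine, not the actual geometry service API, and the polygon is a made-up example.

```python
def polygon_centroid(ring):
    """Area centroid of a simple polygon ring [(x, y), ...] via the shoelace formula."""
    a = cx = cy = 0.0
    for i in range(len(ring)):
        x0, y0 = ring[i]
        x1, y1 = ring[(i + 1) % len(ring)]
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5  # signed area
    return (cx / (6 * a), cy / (6 * a))

def as_wkt(ring):
    """Serialise the ring as a WKT POLYGON string (closing the ring)."""
    pts = ", ".join(f"{x} {y}" for x, y in ring + [ring[0]])
    return f"POLYGON (({pts}))"

square = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
print(polygon_centroid(square))  # (1.0, 1.0)
print(as_wkt(square))
```

The point is the division of labour: the client asks for a view (full geometry, WKT, centroid) and the service does the computation, so no desktop GIS is needed on the client side.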
But really it's to highlight that you can grab information from the API in these various ways and then start to do different operations with it, and I'll show a bit more of that down the track as well. So that's the Explorer; it's available at explorer.loci.cat. The second tool I'd like to demo is the data re-apportionment tool, called Excelerator, and there's no spelling mistake in the name. That's because what we wanted to do was support data analysts who deal with CSVs and Excel files. Much of the data out there is published in Excel files that have geocoded identifiers; for example, here I've shown the Census of Population and Housing data. If you go to this site, you can download the data as a CSV file. A lot of the user studies that were carried out showed that many policy analysts and data analysts in government use this sort of format and want to do geospatial re-aggregation or transformations on it. So we built a tool that uses Loc-I to do that: since we have globally identified spatial things and the linkages between different geographies, the API can do the transformation through Excelerator. Conceptually, what we'll see is that we have a dataset here on the left, encoded against the 2016 mesh blocks, with some observation data, which could be dwelling counts, and we want to transform it into another geography, for example the Geofabric Contracted Catchments. I've loaded an example of what it might look like, a cut-down version perhaps. In this Excel file we've got the geolocation, the mesh block code, and a dwelling count, and we've got about 25 or 27 of these. So let's go to the Excelerator site. Typically, you'd load up a GIS,
and you'd load the mesh block layer, you'd load the Geofabric layer, you'd load the CSV as another kind of layer, and try to integrate the lot. I've run this previously, so I'm just testing. What we're going to show here is that with this web interface you don't have to install any programs: you go to this website. I've got this data here, it's called mesh block dwellings, and I can drag and drop that CSV into the site. Now, it doesn't know the geography, so it's asking me what I want to convert from. I have to select mesh block, because the dataset is in 2016 mesh blocks. Then I can transform that into any of the matching geographies that a mesh block can transform into. I wanted to do Contracted Catchments, so there's the Geofabric stuff, but there are also other ASGS ones I could transform to. I'll just do that there. A feature of this site is that you can queue conversions, so you can load in as many as you want and do all these different transformations; I might also do a mesh block to a river region. That's probably enough. So, downloading that result and viewing it: what I get back is a converted dataset. It was in mesh blocks; now it's in Contracted Catchments. And it's embedded the Loc-I identifier, so any other dataset that uses this down the track can understand what 12100615 is. If I click on this link, it takes me to the landing page for the Contracted Catchment, and I can get the feature information page. That was loading, but it's got the metadata. There you go. Again, it's a landing page from which you can get different views and different formats. One thing to note is that I can then reuse the file I've downloaded, because it's geocoded for Contracted Catchments. Loading that converted file in, it reads the file, understands that it's Contracted Catchments, and maybe I want to go back to mesh blocks, so I can go that way as well. So this is supporting the use case for data analysts dealing with CSV files. All right, any questions so far? I'm happy to pause. I'll keep going. This slide just shows conceptually what's happening: you've got geocoded locations, you can transform them to other features, and then what we can do is integrate that with other data. Say I've got hydrological data in the Geofabric: we intersect that and do some analysis. So the next thing I wanted to show: we've got the simple use cases, we've got Excelerator, and we've got the Explorer, which lets you view different things and get different identifiers out of the system. It's fairly simple, but depending on the use case it can be quite powerful. We also wanted to show some Jupyter notebooks, the data science environments that the API allows interaction with. We've got a bunch of data science notebooks on GitHub, linked in the presentation here, so you can go and see all the different notebooks we have that interact with the API. I'll just demonstrate a couple. I've got one that I did recently, looking at COVID-19 data. I think in about July the DHHS was publishing their data on their website as a CSV, so a thought experiment was: can we load this into Loc-I and visualise it easily? This notebook describes how to do that. I'm loading the data here, so we've got the cases, and then there's a bit of Python coding to get things working. But once you've got that code,
you can basically bring that CSV to life without having to fire up a desktop GIS or load spatial layers into this environment. This is just querying the Loc-I infrastructure to get the LGA data, and then we can use standard Python tools to colour-code the regions with the different numbers. So this is a quick demonstration of how you could use Loc-I to power this sort of visualisation. You can imagine any other LGA-coded data being inserted here; once you've got the notebook, you can basically interchange the data and get the visualisation quickly. The second example: because we've got the G-NAF addresses in Loc-I, we can plot the addresses for different features. This notebook shows you how to plot addresses from G-NAF for any given ASGS mesh block, and this is an interactive version. Once you get past the code, there is a text field here that allows you to enter a mesh block identifier, a Loc-I mesh block identifier. I've loaded one in from, what is it, somewhere in Canberra. Yep. So let's go to the Explorer and see where else we might do that. Say I wanted to get addresses: I'll drop a pin somewhere here. Why does it keep doing that? It's a bit of a bug, sorry; live demos. Somewhere in Essendon North. So I'll look for the mesh block here. There's a mesh block here; it's a bit funny. Anyway, we'll just get this address link and pop it into our text field, and yep, there's a progress bar showing that it's querying the Loc-I infrastructure, and it's gotten 47 addresses. So it's loading all that information in here, processing each one of them, and then it will visualise it in this notebook. So, that was five seconds ago. We're not loading G-NAF into a GIS; it's already pre-cached in the Loc-I infrastructure. And we're not loading ASGS layers. Is it going to come? Or maybe not.
Oh yeah, there it is. Okay. Now, I'm not sure what this is, but anyway, for some reason it's decided to put this dot here. Well, there you go. The point is you can put any mesh block in here and get addresses. I'll go to another example. This one is doing that re-apportioning use case: using this notebook, I can take a Contracted Catchment CSV and re-apportion the data to ASGS mesh blocks. So if you wanted to transform data from one into the other, this might be a way of doing that. I'll jump to the last Jupyter notebook. Sorry, there was another one as well, around the DGGS, but we might have run out of time; if we've got time, we can look at that. I also wanted to show a notebook demonstrating how we can integrate geospatial big-data pipelines with Loc-I. I've got a couple of examples here using the Open Data Cube, which is a product being developed by GA and CSIRO that's being adopted and deployed in different countries. I've got two products here; one is a Landsat 8 fractional cover product. So I've got a notebook here, we're tinkering with it, and this shows how you can, well, we're listing a few Loc-I identifiers here, LGAs, and what we're doing is intersecting: for an LGA, give me the remote sensing data. That's the Loc-I feature, the LGA feature, and this remote sensing data has vegetation information. So we've pulled in remote sensing data from the Landsat dataset for the area I showed earlier, the Barossa Valley. It's showing the amount of green vegetation; this is a remote sensing product showing green vegetation. And this notebook shows how we can intersect, well, cookie-cut, this remote sensing data, which is gigabytes big, in a data cube,
and plot that for the LGA. Say I wanted to know how much vegetation is in the Barossa Valley at a given time using this remote sensing product: traditionally I'd have to load in the geospatial layer and the remote sensing data, but in this notebook we're showing we can use the APIs to do that job. And there are quite a few powerful Python libraries out there that let you do things like drill into the data to get different time slices. So this is showing you the 13th of February 2018, and just using the slider here we can page through different days and quite easily get a visual representation of the Barossa Valley in, say, August or February. We can also do things like statistical summaries. This brings together authoritative geospatial location data with remote sensing data, which could be anywhere on Earth, and we can easily go from the Barossa Valley to the ranges, for example; I can run the rest of this for that as well. And if I swap the Landsat dataset for the WOfS dataset, Water Observations from Space, suddenly I've got a water-oriented data pipeline. Similarly, I can use Loc-I to pull in the feature, pull in the remote sensing data, and then map it and page through the different points in time. This is just showing whether something is wet or not, so there are certain parts of the Barossa Valley that get wet. How am I going for time, Kieran? Pretty decent; we'll go for another five minutes or so and then open up for questions. Okay, all right. So I've got time for one more demonstrator. Something that Geoscience Australia has been working on is called the DGGS. Not sure if many people have heard of the DGGS; it's the Discrete Global Grid System, and there's an AusPIX product.
So, this allows you to, and there's been some work by the Loc-I team in collaboration with GA here, provide a pathway for integrating DGGS with Loc-I features. While this is running: what a DGGS is, is a discrete global grid system where you divide the Earth into cells and then nest those cells within more cells, so there's effectively an unlimited number of nested cells that can be created, and each cell has an identifier. You can take something like this spatial region, which is an outline of Black Mountain in Canberra, and put it through the AusPIX DGGS engine, which is the GA product, and get an approximation: it gives you the cells that intersect this region. You can go down different levels; I've selected level 10, but you can go to finer resolutions than that and you'll match more of these edge cases. One of the things with Loc-I is that we've got an API that allows you to find locations for a DGGS cell ID: given these cell IDs, we can get the intersecting authoritative Loc-I features. So what we can then do is ask things like: what is the overarching feature for those DGGS cells? And here the answer is Canberra, or the ACT. Or: give me the mesh block, give me the statistical area, give me the Contracted Catchment, or any of the features that we have in Loc-I; we can ask for the specific thing. So that was a simple example. We've got a more sophisticated example here, loading in a feature dataset from the Department of the Environment, as it was called before; it's the Department of Agriculture, Water and the Environment now. They publish maps of species of national environmental significance, so we can check out the datasets and the GIS data on their website.
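Both ideas here can be sketched in a few lines: in a hierarchical DGGS the cell identifier encodes its ancestry, so cell containment is a prefix test; and once a region and the authoritative features are both expressed as cell sets at a resolution, the crosswalk is just a set intersection. All cell IDs and feature names below are invented for illustration, not real AusPIX output or real Loc-I data.

```python
def cell_contains(parent, child):
    """In a hierarchical DGGS, a parent cell contains a child cell exactly
    when the child's identifier starts with the parent's identifier."""
    return child.startswith(parent)

# Invented cell sets at one resolution: the cells approximating an arbitrary
# region, and the cells covering each authoritative feature.
REGION_CELLS = {"R7850", "R7851", "R7861"}
FEATURE_CELLS = {
    "lga/one":   {"R7850", "R7851", "R7853"},
    "lga/two":   {"R7861", "R7862"},
    "lga/three": {"R7001", "R7002"},
}

def overlapping_features(region_cells, feature_cells=FEATURE_CELLS):
    """Features sharing at least one DGGS cell with the region."""
    return sorted(n for n, cells in feature_cells.items() if cells & region_cells)

print(cell_contains("R78", "R7852"))       # True: R7852 nests inside R78
print(overlapping_features(REGION_CELLS))  # ['lga/one', 'lga/two']
```

Because both sides are pre-computed cell sets, no polygon intersection happens at query time, which is what makes the "find features for these cells" API cheap to answer.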
What we've done is pull out one species, Leadbeater's possum, and map it in here; we've created a GeoJSON file excerpted from that dataset. So that's showing there. Now, one question might be: can you tell me the LGAs that are affected, that have the critically endangered Leadbeater's possum? Traditionally, you'd have to fire up your GIS, load in those layers, load in the Leadbeater's possum layer, and do those intersects. We've pre-calculated all that in Loc-I and the DGGS, and can then give that answer. So this is using the DGGS engine to, what we call, DGGS-enable the data: we get the cells for the Leadbeater's possum range, and then, based on those cells, we crosswalk to find the matching LGAs. This is showing the LGAs that overlap or intersect with those DGGS cells. We can do the same for ABS statistical areas as well, which is still processing, but it's fine. So that's taking any arbitrary geospatial feature and getting answers using authoritative geographies. Okay, I might leave it there and come back to the talk. A couple of parting remarks. The goal of Loc-I is to enable Australia's spatial data on the web. A few things there: persistent identifiers for features and geometries, which I've shown; there's a separate geometry service, so we're handling geometry in its own right and providing different APIs to interface with it. We've got linksets, which are the links between geographies. And we've got the Loc-I integrated API and graph store, which I didn't talk about too much, but I can share with those who are interested in the technical details. And hopefully I've shown in this presentation that we can have a GIS capability over the web without needing to launch traditional GIS tools. Some acknowledgments.
This was a cross-agency, cross-organisation effort, so there were many people involved in this project, and I want to acknowledge them across the different agencies. And if you'd like more information, feel free to reach out or check out some of these links online. Okay, thanks. Thank you, Jonathan. I have a question in the chat that hasn't been answered by Simon. Erin was asking whether the disaggregations you were showing in Excelerator are features that are distributed across landscapes: is it just using the Excel data to disaggregate, or was it looking at extra dwelling data online? So, if I understand the question: this aggregation from catchments to mesh blocks. I guess this is a crude way of doing it; we're doing it by spatial area and overlaps. Currently, in that particular Excelerator demonstrator, it just uses pure spatial overlaps to calculate the re-apportionment from one feature to another. We are in discussion with Data61, though, and there is a project called Your Data Your Regions which has a more sophisticated statistical covariate approach to doing that sort of redistribution. I wouldn't just say disaggregation, since sometimes you're aggregating and sometimes disaggregating. It distributes the values using covariates, so if you load in a community profile covariate set, it can calculate the distribution in a more sophisticated way. We hope that in the future those two products can converge: Loc-I providing the authoritative geographies, and Your Data Your Regions providing the capability to do that disaggregation and aggregation in a more sophisticated way. But yeah, this was a demonstrator showing what you could do with it. Thanks, Jonathan.
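The pure-spatial-overlap re-apportionment just described can be sketched as an area-weighted redistribution. The overlap fractions below are invented for illustration; in Loc-I they would come from the pre-calculated linksets rather than being computed by the analyst.

```python
# Fraction of each source feature's area that falls in each target feature
# (each row sums to 1.0; figures invented for illustration).
OVERLAP = {
    "mb/80005100": {"catchment/A": 1.0},
    "mb/80005110": {"catchment/A": 0.3, "catchment/B": 0.7},
}

def reapportion(values, overlap=OVERLAP):
    """Redistribute source observations to target features by area fraction."""
    out = {}
    for source, value in values.items():
        for target, frac in overlap[source].items():
            out[target] = out.get(target, 0.0) + value * frac
    return out

dwellings = {"mb/80005100": 40, "mb/80005110": 20}
print(reapportion(dwellings))  # {'catchment/A': 46.0, 'catchment/B': 14.0}
```

Note that a count variable is conserved (40 + 20 = 46 + 14), but the split of the second mesh block is purely proportional to area; a covariate-based method would refine those fractions, not the mechanics.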
Yeah, I was just trying to clarify what was happening with the disaggregation: whether it was fundamentally looking up some information and returning to that previous state at the finer resolution, or whether it was, as you just explained, spreading the data out across the landscape using some mathematical model. Because when you first went from mesh blocks to catchments, and then from catchments back to mesh blocks, in my mind I was asking: is that going to recreate the same data, going forward and back? No, thanks for the clarification. Are there any other questions? Hi there, it's Rob Atkinson. Hey, Jonathan. One thing you didn't really mention, maybe it was implicit but could be made explicit, is that pre-calculation of relationships is often necessary because there are sometimes complex issues. I know, for example, that if you're looking at matching objects defined at federal and state level, they often have quite different ways of handling coastlines and estuaries, and they have different resolutions, so some of those things are really not trivially easy to work out with simple polygon analysis. The ABS has always published correspondence tables where this stuff has been gone through with a great deal of expertise, care and vetting. So the technique of pre-calculation is well known. But what's being offered, with persistent web addresses backing it up, is the ability for people to find whatever pre-calculations have been done. That's a new mechanism which didn't exist before the persistent web addresses. It may leave people cold: yes, of course you could do that before. But what you couldn't do before was actually find out, in a scalable way, whether someone else had done those pre-calculations. Is that a fair assessment of one of the key issues here?
Yeah, I guess that is one of the fundamental things you mentioned, Rob. Firstly, standardising the location identifiers; and I mentioned standardised linkages, and I didn't go through too much of that, but you could imagine different linksets for different types of pre-calculations. We've pre-calculated based on spatial overlaps and intersects and that kind of thing, but that is just one process; you could have a different process that has broader community adoption or vetting, and that's an alternative option as well. So yeah, you could have a catalogue of linksets, which we do, and broker the querying based on user needs. Following that up, from the perspective of a community of practice: the registration of linksets, plus the other data attached to these objects, is necessary at the community-of-practice level in order for people to be able to share, because if you try to centralise all that information and its management, we don't have any agency with that mandate, so it doesn't scale. So has there been any update on thinking about how the community of practice would share such information? There's been quite a bit of discussion about governance of Loc-I, and we have been trialling a governance body based on the partners in Loc-I. The mechanisms haven't been ironed out yet, so it's fairly informal at the moment. We have an email address that you can get in touch with if you'd like to register a dataset or a linkset, and there is a process for evaluating the business case and then the technical requirements for getting it into Loc-I. So it kind of sits with GA, and I think they were calling it the Spatial Location Authority, the SLA.
So that's being prototyped in the Loc-I project, but I wouldn't say there is a stringent policy or a recommended governance approach at this point. We're still trialling things out, but we're happy to engage with the community to bring on board new datasets and advise on how to publish spatial data on the web, aligned with Loc-I. Great, so that's a really good suggestion about how the community of practice engages in that activity; probably something we can take offline as well. We'll leave time for one more quick question: are there any other questions anybody wants to ask? So thank you, Jonathan, thanks for the session; it was really useful, I found it useful. There's also a link in the meeting notes to the GitHub repository for the Jupyter notebooks Jonathan was showing, so if anyone's interested, you can find them there. And if anybody's got any other questions to follow up, I'm sure Jonathan will be happy to take emails and answer them. Right, so that brings us to the end of this session. A couple of other things we wanted to mention: Michael, did you want to introduce the research session? So, next week is the research conference, which has obviously moved across to an online format and should be a great event, and the community of practice was lucky to get a birds-of-a-feather session in there. The session we'll be doing is unpacking the mapping of the spatial data and services landscape, an activity we did a couple of months ago now, with the mindset to really drill down and figure out what the requirements for this may be, and what a registry might look like if we were to move in that direction. So, a bit of requirements capture.
The other part of the BoF is going to look at some of the future activities that will come out of that mapping activity, towards, I guess, the future of the geospatial community of practice into next year, and at some of the key items we could look at in that space. So, the birds-of-a-feather session will be held next Monday, the 19th of October, at 12:20 for those who are attending the research conference. For those who aren't, registration is really affordable if you'd like to come along and see some other exciting presentations and activities like the one we'll be doing. Otherwise, the working group that will be formed out of this will be kicking off more formally around early to mid November; we had a couple of delays in September, which pushed things back, so we're looking to kick off the working group more formally later on. But yeah, birds-of-a-feather session next Monday, and getting some ideas for the future. Right, so I think that wraps up this session. Thank you, everybody, for joining, and I hope to see many of you at the research session, if not the next one after this. Thanks, Kieran.