So switching gears again, we're now going to talk a bit about integration and data exchange. In 2.39 we're introducing a new service called the aggregate data exchange service. This is essentially a service that allows for data exchange, either from one DHIS2 instance to another or within a single DHIS2 instance. It's a built-in solution that hopefully can replace many of the custom scripts out there, and it provides an integrated experience for moving data between DHIS2 instances and also within a single instance, typically from individual-level data over to aggregate data.

So once again, a data exchange is essentially a transfer of data from what we refer to as a source instance of DHIS2 to a target instance of DHIS2. In the source, we utilize the analytics API. Data will be aggregated in the source instance using the analytics engine, which means that you can use data elements, indicators, program indicators, reporting rates and so on. Pretty much everything you can do with favorites in analytics or through the analytics API, you can now also do in the source instance of DHIS2.

The data is exported in the data value set format. So out of the source DHIS2, we get data in the raw data value set format, or what some people call the aggregate data value format of DHIS2. This is quite powerful, because it allows you to do data transformation and data aggregation in the source instance. If you would like to use an indicator to compute a number, or aggregate in time or up the hierarchy before exchanging data with another instance, you can now do that easily. And the way you specify the data is identical to analytics favorites and the analytics API, so it should be familiar to many.

In the target instance, we then import the data as raw data. So on the target side it looks like raw data, and you can treat it, store it, look at it and use it as you would any kind of raw aggregate data.

This can be run ad hoc, using the API or using the new web app that we're going to talk about in a minute, or it can be run as a scheduled job. So if you want this to run at a specific interval, such as 2 a.m. every night, we can now do that as a scheduled job using the job scheduler application (sketches of defining and running an exchange follow below).

Okay, so what can this be used for? Let's have a look at some of the use cases for this new solution. I have laid out four different use cases.

The first one is moving data from a DHIS2 HMIS instance over to a data portal instance. We know that many countries and organizations now have dedicated instances that act as a kind of data warehouse or data portal, collecting data from multiple DHIS2 instances. So instead of writing a custom script, you can now use this service to move data from your HMIS over to your data portal instance, which could be public facing, or at least integrate data from multiple data sources.

The second one is moving data from a DHIS2 Tracker instance to an HMIS instance. In many cases we recommend that you run a separate instance of DHIS2 if you're dealing with confidential and sensitive individual-level data. But at the end of the day, you often want to move at least a summary or aggregate of those data over to the HMIS, so that you can use them as part of your integrated analysis in dashboards, combining the different data sources as we like to do with DHIS2. You typically set this up by creating program indicators on the source side and then moving the data over as aggregate numbers in the HMIS instance.
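To make this concrete, here is a minimal sketch of what creating an exchange through the API can look like, based on the aggregate data exchange endpoint introduced in 2.39. The source request uses the same dx/pe/ou dimensions as an analytics favorite, and the target here is an external instance. All URLs, credentials and UIDs are made-up placeholders, and the exact payload fields should be verified against the current API documentation.

```python
# Sketch: create an aggregate data exchange on the source instance.
# Assumes DHIS2 2.39+ and basic auth; everything below is a placeholder.
import requests

SOURCE_URL = "https://hmis.example.org"   # hypothetical source instance
AUTH = ("admin", "district")              # placeholder credentials

exchange = {
    "name": "HMIS to data portal",
    "source": {
        "requests": [
            {
                # Analytics-style request: dx/pe/ou, just like a favorite.
                "name": "Monthly ANC indicators",
                "dx": ["Uvn6LCg7dVU", "OdiHJayrsKo"],  # placeholder UIDs
                "pe": ["LAST_12_MONTHS"],
                "ou": ["ImspTQPwCqd"],                 # placeholder org unit
                "inputIdScheme": "UID",
                "outputIdScheme": "UID",
            }
        ]
    },
    "target": {
        # EXTERNAL pushes to another instance; INTERNAL would write the
        # data back into this same instance instead.
        "type": "EXTERNAL",
        "api": {
            "url": "https://portal.example.org",       # hypothetical target
            "username": "exchange-user",
            "password": "********",
        },
        "request": {"idScheme": "UID"},
    },
}

resp = requests.post(f"{SOURCE_URL}/api/aggregateDataExchanges",
                     json=exchange, auth=AUTH)
resp.raise_for_status()
print(resp.json())  # response includes the UID of the new exchange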
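Once an exchange is saved, running it ad hoc from a script is a single call against the exchange sub-resource. Again a sketch with placeholder values; the same saved exchange can alternatively be run on a timer from the job scheduler application, which should offer a corresponding job type (check the scheduler documentation for details).

```python
# Sketch: trigger a saved exchange ad hoc and inspect the import summary.
import requests

SOURCE_URL = "https://hmis.example.org"
AUTH = ("admin", "district")
EXCHANGE_UID = "a1B2c3D4e5F"  # placeholder: UID returned when the exchange was created

resp = requests.post(
    f"{SOURCE_URL}/api/aggregateDataExchanges/{EXCHANGE_UID}/exchange",
    auth=AUTH)
resp.raise_for_status()
print(resp.json())  # import counts: imported, updated, ignored, etc.
```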
The third use case is what we call precomputation of program indicators to aggregate data. We know that program indicators can sometimes be slow to load in real time. So if you need to compute data based on individual-level data, put it on a dashboard and have it load fast, it can be a good idea to pre-compute those data into an aggregate data element and then use that data element in the dashboard (a sketch of this setup follows below).

And finally, we can now also automatically move data from national HMIS systems over to global donors. We're working with the Global Fund on a project where we would like to enable some of the reporting and data submission to happen directly through DHIS2, without having to go through out-of-band approaches like Excel or manual entry, essentially enabling reporting straight from the country HMIS.
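Here is a sketch of how the precomputation case might be wired up: an exchange with an INTERNAL target that reads program indicators through analytics and writes the results back into the same instance as raw aggregate data values. The mapping from program indicator to aggregate data element by a shared CODE is an assumption made for illustration, and all identifiers are placeholders; verify the mapping options against the documentation.

```python
# Sketch: INTERNAL exchange that precomputes program indicators into
# aggregate data values on the same instance. Placeholder values throughout.
import requests

BASE_URL = "https://hmis.example.org"     # hypothetical instance
AUTH = ("admin", "district")

exchange = {
    "name": "Precompute program indicators",
    "source": {
        "requests": [
            {
                "name": "Tracker program indicators, monthly",
                "dx": ["GSae40Fyppf"],     # placeholder program indicator UID
                "pe": ["LAST_MONTH"],
                "ou": ["ImspTQPwCqd"],     # placeholder org unit
                "inputIdScheme": "UID",
                # Assumption: the program indicator and the target aggregate
                # data element share the same code, so output is keyed by CODE.
                "outputDataElementIdScheme": "CODE",
            }
        ]
    },
    # INTERNAL target: import back into this same instance as raw data.
    "target": {
        "type": "INTERNAL",
        "request": {"dataElementIdScheme": "CODE"},
    },
}

resp = requests.post(f"{BASE_URL}/api/aggregateDataExchanges",
                     json=exchange, auth=AUTH)
resp.raise_for_status()
```

Scheduled nightly, an exchange like this keeps the aggregate data element fresh, so dashboards read precomputed values instead of running the program indicator query in real time.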