So this next feature is also incredibly powerful. The next few features we're labelling as preview features for the version 40 release. That isn't to say they're incomplete; it means they're quite big pieces of functionality that we want to get out there and let people start to use, so that we can improve them and maybe make some changes to the API or the way they work in the future. So the next three features are labelled as preview features, but I'll go through them, and they are available and fully functional in version 40.

The first of those is event hooks. Event hooks are a very exciting way to listen to DHIS2, to when something happens in DHIS2, and in version 40 you have two sources for those events: metadata changes and scheduled jobs. You can listen to events and send those events out to other systems, which can then respond when they happen. This hasn't been possible, or hasn't been easy, in previous versions of DHIS2, because you had to poll, regularly checking back in with the DHIS2 API to see what had changed, and you wouldn't get notified immediately when something changed.

So for version 40 we have support for metadata changes, which means that when any object in the metadata of a DHIS2 instance is created, updated or deleted, a notification can be sent out, and we have multiple targets for these notifications. They can be sent to the console, which means they just get put into the DHIS2 log. They can also be sent out as a webhook, a long-requested feature that is quite powerful: it allows you to send an HTTP request to another service, which can listen for those HTTP requests and then do something in response.
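To make this concrete, here is a minimal sketch of what an event hook definition might look like when registered through the DHIS2 Web API. The endpoint (`/api/eventHooks`) and the field names (`source.path`, `targets[].type`, `targets[].url`) are taken from the version 40 preview documentation, so treat them as assumptions and verify against your own instance:

```python
import json

def metadata_webhook_hook(name, target_url):
    """Build an event-hook definition that listens to all metadata
    changes and forwards them to a webhook target.

    The structure below follows the DHIS2 40 preview API as documented;
    field names may change while event hooks remain a preview feature.
    """
    return {
        "name": name,
        "source": {"path": "metadata"},  # listen to metadata create/update/delete
        "targets": [
            {
                "type": "webhook",  # other documented types: console, jms, kafka
                "url": target_url,
            }
        ],
    }

hook = metadata_webhook_hook("notify-external-service",
                             "https://example.org/hooks/dhis2")
# This definition would then be POSTed to {base_url}/api/eventHooks
# on the DHIS2 server with your usual authentication.
print(json.dumps(hook, indent=2))
```

The `example.org` URL is of course a placeholder for whatever service you want DHIS2 to call.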
Event hooks can also support more advanced queuing functionality through Apache Artemis, and some specific workflows with Apache Kafka, and we'll be exploring and sharing more recipes for how to use those tools to listen to events and perform different types of actions in software that sits outside the DHIS2 server but wants to react to events that happen within DHIS2.

Metadata create, update and delete is the first source, and scheduled jobs is the other. Probably the most requested feature here is to be able to listen for when the export of analytics tables completes, and then do something in an external service once that happens.

We will be looking at expanding this in the future. We've introduced metadata and scheduler events in version 40, but we'll be looking at data and user events as well. Data events would be when a data value set is submitted or completed, or when a tracker event is created, those types of things. User events would cover creating, updating and deleting users, and maybe also different ways to hook into the login system. But these are quite performance-intensive event sources to listen to, because there are many of them, many data events and many user events, so we're not introducing those in version 40. We're starting with metadata and scheduler, we'll expand to other areas of DHIS2 in the future, and we want to see how you use, and how you want to use, this event hooks functionality.

So this is a preview feature in version 40, and it does need to be enabled in dhis.conf. That's a protection against any performance impacts that might happen because of it. We don't expect there to be many, but because this is a preview feature we want to see what happens in real-world scenarios before we give it a stamp of approval.
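Enabling the feature is a one-line change in dhis.conf. The exact key below is the one given in the version 40 documentation, but since this is a preview feature, double-check it for your build:

```properties
# dhis.conf — enable the event hooks preview feature (DHIS2 version 40).
# Key name assumed from the v40 docs; the feature is off by default.
event_hooks.enabled = on
```

A restart of the DHIS2 instance is needed for dhis.conf changes to take effect.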
So hopefully people can use this and leverage it for their architectures in production environments, and we can see how that works and continue to improve this functionality in the future.
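As a sketch of the receiving end of such an architecture, the webhook target just needs any HTTP service that accepts the POSTed event. The payload shape here (a JSON body describing the change) is an assumption for illustration; consult the DHIS2 documentation for the exact format your version emits:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Events received from DHIS2 are collected here for demonstration.
received = []

def handle_payload(raw: bytes) -> dict:
    """Parse one webhook body and record it. The payload structure is
    whatever your DHIS2 version sends; we only assume it is JSON."""
    event = json.loads(raw or b"{}")
    received.append(event)
    return event

class EventHookHandler(BaseHTTPRequestHandler):
    """Minimal receiver for DHIS2 event-hook webhook calls."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        handle_payload(self.rfile.read(length))
        # Acknowledge receipt with an empty 200 response.
        self.send_response(200)
        self.send_header("Content-Length", "0")
        self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep the example quiet

def serve(port=8000):
    """Start the receiver on a background thread and return the server."""
    server = HTTPServer(("127.0.0.1", port), EventHookHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

In a real deployment this would sit behind TLS and verify that requests actually come from your DHIS2 server before acting on them.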