Hi, my name is Martin and I would like to tell you more about validating Prometheus rules and taking it one step further. You may know me as FUSAKLA from GitHub. I'm a DevOps engineer at Seznam.cz, and recently we did some major refactoring of our alert routing and found out that we needed to test one additional thing in our alerting stack to make sure everything works as expected. There is a lot you can test with promtool or amtool, but this one thing still isn't possible, and that was the motivation to create the tool I would like to show you: you need to somehow ensure that the labels your alerts carry are the ones you expect in the routing tree of the Alertmanager.

The tool is called promruval, and you can simply go get it. Here is a simple example of an alert which has the label severity with the value info, and the second snippet shows the validation YAML file, which is the configuration file for promruval. It consists of a list of validation rules. Each rule has a name, a scope (which can be only recording rules, only alerting rules, or all the rules of the validated file), and then the individual validators. Here I'm using the hasLabels validator, which ensures that each rule has the severity label, which we can see the alert does have. The other validator is labelHasAllowedValue: if the rule has the label, it checks whether its value is one of the allowed values. We can see that this is not the case here, so the validation should fail.

Now we can try to run the tool. You can use the pre-built binaries from GitHub and just run promruval validate, passing the configuration file (the validation YAML you saw on the previous slide) and the rules YAML with the rules you actually want to validate. It prints out a human-readable description of the validations that will be performed, the result (which we can see is invalid) in a form that is again human-readable and easy to fix, and some statistics.
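To make the slides from this part of the talk concrete, here is a sketch of what the two snippets could look like. The structure (validationRules, scope, hasLabels, labelHasAllowedValue) follows promruval's configuration format, but the concrete names and values are illustrative assumptions, not copied from the slides:

```yaml
# rules.yaml -- an alert that carries severity: info
groups:
  - name: example
    rules:
      - alert: HighErrorRate
        expr: rate(http_requests_total{code="500"}[5m]) > 0.1
        labels:
          severity: info

# validation.yaml -- the promruval configuration
validationRules:
  - name: check-severity-label
    scope: Alert
    validations:
      - type: hasLabels
        params:
          labels: ["severity"]
      - type: labelHasAllowedValue
        params:
          label: "severity"
          allowedValues: ["warning", "critical"]  # "info" is missing, so validation fails
```

You would then run something like `promruval validate --config-file validation.yaml rules.yaml` (flag spelling taken from the project README, so double-check it against your version); since `info` is not among the allowed values, the run should report the rules file as invalid.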
I showed you just two of the validators; these are all of the currently supported ones. There are validators for labels and for annotations. I would point out, for example, the validation that an annotation value contains a valid URL, which can be handy if you put playbook links in your annotations; it can even actually resolve the URL. You can also validate your expressions. There are just three of those validators, but I would point out, for example, the one that forbids using older data than some limit. I have bumped into this myself many times, writing an alert which queries data older than the retention of the Prometheus instance; by forbidding this you can avoid such issues, which is really handy. You can also make sure that the queries are not using a range vector selector shorter than your scrape interval, which is also good if the users creating the alerts are not actually aware of the scrape interval. The tool embeds Prometheus code, so it really parses the PromQL and analyzes it.

You can also disable the validation rules temporarily. You can do it with a well-known annotation, passing the names of the validation rules separated by commas, or by using a command line flag.

The last feature: the validation configuration can grow a lot and is not easily readable, so if you want to provide your users, or anyone creating an alert, a more human-readable form of the validation, you can use the command promruval validation-docs, pass it the configuration file, and set the output format; currently HTML, Markdown, and plain text are supported. It looks like this.

So that is all. There are some future ideas, so make sure you check out the GitHub repository, and if you have any ideas for additional validators, I'd be really happy to add them. You can also read a short blog post I wrote about the motivation. Thanks!
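As a sketch of the temporary-disable feature mentioned above: promruval's well-known annotation is, to the best of my knowledge, called `disabled_validation_rules`; treat the exact name as an assumption and verify it against the repository for your version:

```yaml
# A rule that opts out of one validation rule by name.
- alert: NoisyAlert
  expr: up == 0
  annotations:
    # Comma-separated names of validation rules to skip for this rule only.
    # Annotation name assumed from promruval's README; verify for your version.
    disabled_validation_rules: check-severity-label
```

Similarly, the human-readable documentation could be generated with something like `promruval validation-docs --config-file validation.yaml --output=markdown` (again, flag and output names assumed from the README; the talk mentions HTML, Markdown, and text outputs).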