Hello everyone, this is Juan Jose Martos from Bitnami, and during this session I'll be talking about how Bitnami tests thousands of releases per month. This session will be really useful for those who are already familiar with Bitnami, because you will learn how important it is for us to provide you with the latest versions of the Bitnami solutions in the different marketplaces. If you are not familiar with Bitnami, I'll explain what we do and what our infrastructure looks like. I'll also cover some testing practices we have implemented in our pipelines, as well as the lessons we have learned over the years. At the end of the session, I'll talk about the Bitnami enterprise offering, which allows you to customize the Bitnami catalog to meet your requirements.

But let's start with the basics. What is Bitnami? Bitnami is a catalog of open-source software that includes more than 180 packages. Those packages can be easily deployed in any environment, in any format, on any platform. You don't need to worry about the configuration or the requirements; we do that for you. Regarding environments, we allow you to deploy the solutions locally, in a public cloud, or in your own data center. Regarding formats, we provide native installers, virtual machines, containers, and deployment templates. Regarding platforms, we currently support the major cloud providers.

But maintaining the catalog is a really hard and complex task. These are some raw numbers about the work we do. We track 500 components and their upstream projects to ensure that we are always using the latest versions. To test them, we have created 2,500 tests that run in our pipeline to ensure that each solution works as expected. Those numbers are big, but if we take into account the number of releases we perform every month, they get even bigger. We release 12,000 containers every month using the tests we have created, and that's only containers. On top of that come the 75 charts we also support and maintain, as well as the single-tier and multi-tier virtual machines and the native installers.

It also requires experience, and we have that experience. We have been providing the community offering on the different cloud providers for the last 10 years, and we have been creating chart solutions for the last four. I mention the community offering because we now also have the enterprise offering. It allows you to take the open-source components, the Bitnami packages, and any other commercial application you want to package, and customize them based on your needs. I'll talk more about this at the end of the presentation.

But nothing I said before would be possible if we didn't follow our pillars. They define what users will find when using our solutions on any of the platforms we support. They will find up-to-date and secure applications, because this is a must for us when building them. They will also find a unified configuration across all those solutions, which simplifies developing on top of them. The documentation we provide is really useful, because users can find the most common operations to perform on top of our solutions by browsing our docs.bitnami.com site. Finally, if they have any questions, we have a high-quality support system through which our engineers help users with whatever they ask. So let's start with the technical part now.
As you can see in the title, automation is key in this process, and the steps you can see in this slide represent what the Bitnami pipeline looks like. It's really important to say that the Bitnami pipeline itself is deployed on top of a Bitnami chart solution, which makes Bitnami the first consumer of its own solutions. Let's dive into every step now.

The first two steps are the version checker and the downloader. The version checker tracks new versions in the upstream projects and ensures that we are using the latest versions of those components. Then the downloader gets the source code of those applications and stores it in the Bitnami database.

After that, we have a build step with configurable add-ons that allows us to perform additional actions when preparing the Bitnami package. For example, if we are building a Node.js application, we will run the npm install command to gather all the npm modules, and the same goes for a Ruby application. Those processes are customizable, and we can add any other action that the application requires before preparing it. For example, Drupal requires Drush and PHPMailer to run, so we fetch those components before packaging the source code and all the dependencies it needs.

Next come the container and chart build steps. Before building them, we perform some unit validations that I'll talk about later. The container and chart build steps ensure that we are using the latest versions of the Bitnami components as well as the latest system packages, and the chart build step automatically picks up the latest containers we have released to update the chart packages. This is what I just explained: when building the Dockerfile, we run the apt-get update and apt-get upgrade commands (sketched below), which allows us to include the latest versions of the system packages inside the Bitnami container. Once we update the chart solution, it will include the new container we have just created, so the chart will be using the latest packages as well.

The last part of the pipeline is the container validation and the chart validation. I mentioned before that we also perform some unit validation when packaging the application; that unit validation ensures that we are using the correct version and that the component we have packaged is ready to be used inside the container and the chart. The container and chart validation is performed by the Bitnami validation system, which I'll talk about now. This validation system consists of a set of tools, scripts, and files that define the actions to perform when validating the different applications on all the supported platforms. Those tools and scripts include scanning tools, which check that there are no vulnerabilities in the containers and charts, and testing tools like Puppeteer or CasperJS, which allow us to perform different actions against the different solutions. As I just mentioned, we scan the different containers and charts to find any vulnerability affecting them; if we find one, we automatically update those containers to mitigate the issue. To be sure that the charts can be deployed correctly, we deploy them using the default parameters as well as the most common ones. After that, we also perform upgrade tests using the helm upgrade command to ensure that users can upgrade the solutions from a previous version (also sketched below).
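To make that concrete, here is a minimal sketch of the system-package refresh the session describes; this illustrates the idea and is not the actual Bitnami Dockerfile step:

```sh
# Conceptually, the RUN instruction in the Dockerfile executes the following,
# so every rebuild pulls in the latest versions of the system packages:
apt-get update && apt-get upgrade -y
```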
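And here is a rough sketch of what the chart deployment and upgrade checks could look like from the command line. The release names and the choice of the DokuWiki chart are illustrative; the real pipeline code is internal to Bitnami:

```sh
# Deploy the chart with its default parameters and wait until it is ready
helm install default-test bitnami/dokuwiki
kubectl rollout status deployment/default-test-dokuwiki

# Verify the upgrade path: install a previous chart version first,
# then upgrade it to the latest one with helm upgrade
helm install upgrade-test bitnami/dokuwiki --version <previous-version>
helm upgrade upgrade-test bitnami/dokuwiki
```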
Using Puppeteer or CasperJS, we can access the application externally and ensure that we can log in to the application, install a plugin or a theme, or create new sites or jobs, depending on the application. We also have some shell scripts that access the application internally; that is, we can enter the container and ensure that a file exists or that a binary can be executed correctly.

The applications we test are the applications we have in the catalog. For example, we test WordPress, Joomla, and Redmine, but we are not only testing the application itself; we are also testing all the components included in the deployment. In the case of WordPress, for example, we test that the database is working properly and that we can access it without any issues. The supported platforms are the ones I mentioned before: we check that all solutions can be deployed correctly on all the major cloud providers, no matter whether you are using a chart or a container, a virtual machine or a native installer, a single VM or a multi-tier solution.

If we take a closer look at the pipeline code, the code I'm showing here takes care of checking that the chart solution works properly after installing it with no parameters, or after setting the default parameters explicitly during its installation. We also check that the upgrade works properly after upgrading the application from a previous version.

I'm now going to deploy a Bitnami DokuWiki chart solution and perform other types of tests. In order to install the solution, I went to the GitHub repository and got the latest version of the Bitnami DokuWiki chart. You can easily install it using the commands I'm highlighting there (reproduced below). Once you run those commands, you will need to wait for the application to be ready. Let's get its status now. As you can see here, the application is fully deployed, and we can also get the URL and the username and password to access the application. If we access the URL, we can check that we can log in to the application using the credentials we obtained. As you can see here, we log in properly using the username "user", and we can log out properly as well.

Let's see how Bitnami tests this login/logout functionality. As you can see here, Bitnami has created a login/logout script that uses CasperJS to access the application (a sketch follows below). The login function requires some credentials that we need to provide, while the logout function doesn't require them. Those functions are defined in an external file. The login function waits for the selectors to appear and uses the inputs we have provided to the tool to log in to the application. Once we have logged in, we confirm that the login was successful by waiting for the logout button to appear. In the case of the logout function, we wait for the logout button to appear, click on it, and wait for the login title to appear again. In this case, we have logged out from the application successfully.

Let's run the Bitnami validation tool now. We need to pass the name of the application, some inputs that I'll show later, and the name of the test we want to run (a hypothetical invocation is sketched below). The inputs.json file contains the credentials and the information needed to access the application: for example, the host and the port, and the username and password to use. If we run the tool now, we will get the results.
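For reference, installing the Bitnami DokuWiki chart typically boils down to commands along these lines. The release name is illustrative, and the exact secret and service names come from the chart's own notes, so double-check them against the chart README:

```sh
# Add the Bitnami chart repository and install the DokuWiki chart
helm repo add bitnami https://charts.bitnami.com/bitnami
helm install my-release bitnami/dokuwiki

# Watch the pods until the application is fully deployed
kubectl get pods -w

# Retrieve the auto-generated password for the default user
kubectl get secret my-release-dokuwiki \
  -o jsonpath="{.data.dokuwiki-password}" | base64 --decode

# Get the external URL from the service
kubectl get svc my-release-dokuwiki
```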
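The actual script lives in Bitnami's internal test library, so this is only a minimal sketch of what such CasperJS login/logout helpers could look like; the selectors (DokuWiki's dw__login form, the do=logout link) and the function signatures are my assumptions:

```javascript
// Hypothetical CasperJS helpers; selectors and signatures are assumptions.
function login(test, credentials) {
  // Wait for the login form selectors to appear, then submit the credentials
  casper.waitForSelector('form#dw__login', function () {
    this.fillSelectors('form#dw__login', {
      'input[name="u"]': credentials.username,
      'input[name="p"]': credentials.password
    }, true); // true submits the form
  });
  // The login is considered successful once the logout button appears
  casper.waitForSelector('a[href*="do=logout"]', function () {
    test.pass('login succeeded');
  });
}

function logout(test) {
  // Wait for the logout button, click it, then wait for the login form again
  casper.waitForSelector('a[href*="do=logout"]', function () {
    this.click('a[href*="do=logout"]');
  });
  casper.waitForSelector('form#dw__login', function () {
    test.pass('logout succeeded');
  });
}
```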
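The validation tool itself is internal, so treat the following as a hypothetical sketch of the invocation and of the inputs.json file the session describes; the tool name and its flags are assumptions:

```sh
# inputs.json carries the connection details and credentials for the app
cat > inputs.json <<'EOF'
{
  "host": "203.0.113.10",
  "port": 80,
  "username": "user",
  "password": "<password-from-the-chart-secret>"
}
EOF

# Hypothetical invocation: application name, inputs file, and test name
validation-tool --app dokuwiki --inputs inputs.json --test login_logout
```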
As you can see here, the tool is using the inputs we provided, and it shows that the login functionality works properly. We just ran the external validation: as you saw, we logged in and logged out of the application properly using our browser. Let's run the internal validation now. As you can see here, we perform some tests from inside the container to verify that everything is working as expected (a sketch of equivalent checks follows at the end of this section). For example, we confirm that the persistent directories are created properly; those persistent directories allow you to upgrade the container and the chart without losing any data. Apart from that, there are some files that need to be created for DokuWiki to work properly, and we ensure that those files are generated after passing the setup wizard, so the application is ready to be used.

If we take a look at the validation tool command now, we can see that we are again providing the name of the application and a JSON file with some inputs, plus the name of the pod and the container where we want to run those tests. The content of the JSON file is the following: we again provide the host and port and the username and password of the application. There are some cases in which those inputs are not required, but we need to specify them in all cases. If we run the tool now, we can see the output. As you can see here, the tests ran successfully: we confirmed that the directories are persistent and that the files we were looking for were created. We can also see other tests that we perform to ensure that the solution and its whole environment work as expected.

If I go to the Bitnami pipeline now, I can get the results of all the tests we performed when testing a solution. Let's open that job. As you can see here, some files are generated when testing the solutions: there you can see the antivirus scanner and the CVE scanner, which check for vulnerabilities. Let's open those files. The antivirus scanner shows that all the files we include are correct; it confirms that there are no infected files. If we open the CVE scanner now, we can see all the vulnerabilities affecting our solution. You can see a lot of vulnerabilities here, but none of them has a fix available yet; if a fix existed, the report would include the version that fixes the issue. Let's now get the results of the external validation we were talking about before, and of the internal validation. As you can see here, we could log in to the application properly, create a post, and even upload an image to the application; those tests passed successfully. Let's check the internal validation now. As you can see there, the information is the same as the one I was showing before. All these tests run when testing a solution, and all of them confirm that the solution is up to date, secure, and ready to use.

But apart from automating the process, you also need to simplify it to get things done faster. In our case, as I mentioned at the beginning, we perform some unit validation when packaging the product, to ensure that what we have created is ready to be included in our containers. From that point on, we only need to update the containers with the latest versions of the system packages or any other component that the application needs. This setup allows us to release a new container every day without having to worry about any other delays in the pipeline.
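As a minimal sketch of what those internal checks amount to, here are equivalent commands run with kubectl exec; the pod name, container name, and file paths are illustrative assumptions based on the Bitnami DokuWiki layout:

```sh
# Hypothetical pod and container names; the real ones are passed to the tool
POD=my-release-dokuwiki-xxxxxxxxxx-yyyyy
CONTAINER=dokuwiki

# The persistent directory must exist so upgrades keep the data
kubectl exec "$POD" -c "$CONTAINER" -- test -d /bitnami/dokuwiki \
  && echo "persistent directory OK"

# The configuration file written by the setup wizard must exist
kubectl exec "$POD" -c "$CONTAINER" -- test -f /bitnami/dokuwiki/conf/local.php \
  && echo "configuration file OK"
```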
As you can see here, we released a new version of the Bitnami DokuWiki solution three hours ago, but we also released another version a day ago and another version two days ago.

Now that you know everything about how Bitnami tests the entire catalog, I'm going to share some lessons we learned during all these years. Security must be your top priority, and we take security seriously at Bitnami: no matter whether you are building your asset for internal or external use, you need to ensure that there is no security vulnerability affecting that code. You also need to maintain your code quality. It's good to release the solution as soon as possible, but it's also important to maintain the quality of the solution you are creating; to do so, you need to set up processes to find and fix any issues in your code. Finally, automation comes up again, because I want to be sure you understand how important it is: you need to automate your processes as much as you can to get a robust system.

But those are not the only lessons we learned. We also learned that we need to review the library of tests regularly, to ensure that we are testing the right functionality in every application we test. It's really important to balance quality versus quantity, because no matter if you have hundreds of tests, if you are not running the right tests against the right functionality, you will release a broken application. Documentation is also really important: you need to generate internal and external documentation to allow your teams and your users to use the solutions you are creating. Finally, support is one of our pillars: you need to give people a way to get answers when they have questions.

So, what's next? You can go and get any of the charts we support from our GitHub repository. As I showed before, you will find the information about how to install them in your own environment. You will also find great tutorials on our docs.bitnami.com site. If you need to deploy any other solution, you can go to our main site, bitnami.com, and get the different solutions from there. If you want to get more from our catalog, you can visit tanzu.vmware.com to learn more about our commercial offering. That commercial offering allows you to customize the Bitnami solutions to meet your requirements; this way, you can benefit from the Bitnami pipeline to build your own assets. This is the end of the session. I hope you enjoyed it, and thank you for watching.