The GitLab platform includes a built-in feature called GitLab CI/CD that allows pipelines to be executed in close relationship with repositories. IBM Z DevOps solutions and GitLab can complement each other to provide a complete experience for a modern development methodology. In this video, we'll show you how to integrate some of the IBM DevOps solutions for mainframe into GitLab CI/CD pipelines.

We'll begin by logging into GitLab and browsing a defined project. In the project's repository, a specific file is used to define the pipeline. This file is called .gitlab-ci.yml. Let's take a look at its contents using the integrated Web IDE provided in GitLab. At the beginning of this file are some variables that will be used throughout the pipeline; they define parameters used in the pipeline stages.

There are five stages in this pipeline. The first stage, called preparation, prepares the workspace on the target system where the build will occur. The second stage is called build, and its purpose is to trigger IBM Dependency Based Build (DBB) to build a COBOL-based application called Mortgage Application. The third stage triggers an IBM Developer for z/OS feature called Code Review, which inspects the quality of the source code and checks that it respects a set of defined rules. The fourth stage calls a packaging script that leverages the build report created in the build stage; this script packages the build binaries into a zip file and sends it to Artifactory with the help of UrbanCode Deploy. Finally, the last stage triggers the deployment of the created package to a target environment on z/OS.

This project is enabled with the Auto DevOps feature, which triggers the execution of the pipeline for each commit, but you can also manually run a pipeline when needed. When the pipeline is running, the GitLab web interface automatically refreshes to display the status of each stage, and you can click on each job to check the details. When checking the execution of the preparation job, you'll notice it removed old artifacts and created a brand-new work directory.

The build stage executes an IBM Dependency Based Build process to build the entire Mortgage Application. The summary of the build shows that nine files were successfully built. At the top of the log, you can see the exact command that was executed on the z/OS machine. Let's take a look at the DBB web application to check the build report that was generated for this build. In this report, we can check some properties related to the build process and view a detailed report on the artifacts that were just built. For instance, it shows the target data sets where the BMS maps and load modules were generated.

Now we'll go back to the pipeline execution and check the results of the code review stage. This process leverages the information collected in the build report generated by DBB and executes the code review process on the COBOL files that were used to build the application, submitting a JCL job to call the Code Review function. At the end of the process, we can check the return code it provided.

The next stage is the packaging phase of the pipeline. To perform this step, we're using the UrbanCode Deploy agent running on z/OS and a provided utility called buztool. By listing all the artifacts that were generated during the build stage, this process creates an input list for the buztool script, which creates a component version and uploads it to Artifactory. Sketches of the pipeline file and of this packaging step follow below.
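To make that walkthrough concrete before we look at the deployment, here is a minimal sketch of what a five-stage .gitlab-ci.yml along these lines could look like. The stage names mirror the demo, but the variables, the runner tag, and the wrapper script names are illustrative assumptions rather than the demo project's actual contents; the DBB invocation follows the conventions of IBM's zAppBuild sample.

```yaml
# Sketch of a five-stage pipeline; names and paths below are assumptions.
variables:
  APPLICATION: "MortgageApplication"       # assumed application name
  WORK_DIR: "/u/build/${CI_PIPELINE_ID}"   # assumed work directory on z/OS UNIX

stages:
  - preparation
  - build
  - codereview
  - package
  - deploy

prepare-workspace:
  stage: preparation
  tags: [zos]   # assumed tag for a runner that reaches the z/OS system
  script:
    # Remove old artifacts and create a brand-new work directory
    - rm -rf "${WORK_DIR}" && mkdir -p "${WORK_DIR}"

dbb-build:
  stage: build
  tags: [zos]
  script:
    # Trigger IBM Dependency Based Build (flags follow the zAppBuild sample)
    - ${DBB_HOME}/bin/groovyz build.groovy --workspace "${WORK_DIR}" --application "${APPLICATION}" --fullBuild

code-review:
  stage: codereview
  tags: [zos]
  script:
    # Hypothetical wrapper that submits the JCL calling the IDz Code Review function
    - ./scripts/submitCodeReview.sh "${WORK_DIR}/BuildReport.json"

package:
  stage: package
  tags: [zos]
  script:
    # Hypothetical wrapper that builds a ship list from the DBB build report
    # and calls the UCD buztool utility (see the next sketch)
    - ./scripts/package.sh "${WORK_DIR}/BuildReport.json" "${APPLICATION}"

deploy:
  stage: deploy
  tags: [zos]
  script:
    # Hypothetical wrapper that requests the UCD deployment process
    # for the integration environment
    - ./scripts/requestDeployment.sh "${APPLICATION}" integration
```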
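Similarly, here is a hedged sketch of what the packaging step could do with buztool. The ship list format and the createzosversion command come from the UrbanCode Deploy z/OS agent, but the data set name, member, deploy type, and paths below are made up for illustration, and the exact options for uploading to an external repository such as Artifactory depend on the UCD version in use.

```sh
#!/bin/sh
# Hypothetical package.sh: turn the DBB build outputs into a UCD component version.

# A ship list enumerating the build outputs; in practice this would be
# generated from DBB's BuildReport.json rather than written by hand.
cat > shiplist.xml <<'EOF'
<manifest type="MANIFEST_SHIPLIST">
  <container name="USER.BUILD.LOAD" type="PDS">
    <resource name="EPSCMORT" type="PDSMember" deployType="CICSLOAD"/>
  </container>
</manifest>
EOF

# buztool.sh ships with the UCD z/OS agent (the path is an assumption here);
# createzosversion creates a new component version from the ship list.
/u/ucd/agent/bin/buztool.sh createzosversion \
  -c MortgageApplication \
  -v "build-${CI_PIPELINE_ID}" \
  -s shiplist.xml
```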
The created package is now available in the artifact repository and ready to be deployed by the next stage in the pipeline. Next, we'll have a look at the deployment job. As shown in the log of this process, UrbanCode Deploy was instructed to deploy the latest version of the Mortgage Application component that was just packaged in the previous step. The deployment is now marked as done, so we can check in the UrbanCode Deploy administration interface that the last package was correctly deployed on the integration environment. By checking the details of this package, we can observe that the artifacts originate from the z/OS data sets where our application's DBRMs, load modules, and maps were just built. That concludes this demo.