Hello, my name is Daniel Helfand and I'm a developer advocate with Red Hat's OpenShift Container Platform. In this video, I'm going to walk you through the OpenShift Pipelines tutorial. OpenShift Pipelines is OpenShift's cloud-native CI/CD solution based on Tekton. For those unfamiliar with the concept of cloud-native in the context of CI/CD, what we're really referring to is the ability to run each step of a CI/CD pipeline in its own container. This lets us address issues of scale associated with CI/CD, but it also lets us improve the performance of a particular build or pipeline run, because steps of a pipeline can run in parallel based on the dependencies defined between them. On a broader scale, OpenShift Pipelines takes a serverless approach to CI/CD, meaning there is no central CI/CD server to maintain. It also makes all of the resources you define as part of your CI/CD process, the pipelines and the tasks that run on them, highly reusable across the projects you might be working with. In order to go through this tutorial, you'll need a couple of prerequisites. The biggest one is an OpenShift 4 cluster; if you don't have one, you can sign up at try.openshift.com and then set up an OpenShift 4 cluster on AWS. You'll also need the Tekton CLI; there are instructions in the Tekton CLI GitHub repository that walk you through installing it on whatever operating system you're working on. The last prerequisite is either the OpenShift CLI, kubectl, or the OpenShift web console; you'll use any of these three tools to create Kubernetes custom resources as part of this tutorial. In this video, I'll primarily be using the OpenShift CLI.
In this portion of the video, I'm going to go over some high-level concepts of Tekton that I think will be helpful for anyone going through this tutorial. Tekton is based on Kubernetes custom resources, which are extensions of the Kubernetes API, and we define these custom resources to capture the aspects of CI/CD we'd like to use with Kubernetes. The first resource to think about when working with Tekton is a task. Imagine a pipeline that builds, tests, and deploys your application; each of those stages would be captured by a task. On OpenShift, a task runs in its own pod, and each task consists of a series of steps, for example the steps necessary to build an application. Each step runs in its own container, which helps us address issues of scale as well as overall pipeline performance, meaning how long it takes a pipeline to execute. The next resource we think about defining is a pipeline. A pipeline is a series of tasks that work together to automate some aspect of your application development process. Next, we think about defining pipeline resources. Pipeline resources are the inputs and outputs associated with your CI/CD process. An example input is a Git repository, the source code your CI/CD process works with; an example output is an image pushed to an image registry as the result of running through all the tasks on a pipeline. Finally, we think about task runs and pipeline runs. A pipeline run is an actual execution of a pipeline, including its result, whether it succeeded or failed, and each task along the pipeline gets its own task run as well.
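The task concept described above can be sketched as a Kubernetes custom resource. The following is a minimal illustrative example rather than a task from the tutorial itself, and the apiVersion and image reference are assumptions that may differ by Tekton release:

```yaml
# Minimal illustrative Tekton Task; names, apiVersion, and image are assumptions
apiVersion: tekton.dev/v1alpha1
kind: Task
metadata:
  name: echo-hello
spec:
  steps:
    # each step runs in its own container inside the task's pod
    - name: say-hello
      image: registry.access.redhat.com/ubi8/ubi-minimal
      command: ["echo"]
      args: ["Hello from a Tekton step"]
```

Creating this resource only registers the task in the current namespace; nothing executes until a task run or pipeline run references it.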
Later in this tutorial, we'll use a pipeline run resource to trigger the pipeline that deploys our sample application. I hope these high-level concepts are helpful for anyone watching this video and proceeding with the tutorial. In this portion of the video, I'm going to show you how to install the OpenShift Pipelines operator, which is how we install OpenShift Pipelines on our OpenShift 4 cluster. We have this documented in the Pipelines tutorial repository on GitHub, but I'm going to take you through it step by step. To start, come out to your OpenShift 4 cluster and create a project; in this case, I'll call it pipelines. Next, head out to OperatorHub by clicking on OperatorHub, then click on Integration & Delivery and scroll down until you see the OpenShift Pipelines operator. Click on it, click the Continue button, then Install, leave all the default values, and hit Subscribe. What happens now is that the "0 installed" count will change to "1 installing", and the operator is officially installed once "1 installed" appears here. Note that what we're doing here is not directly installing OpenShift Pipelines: we're installing the operator, and the operator will install OpenShift Pipelines and also handle upgrades of OpenShift Pipelines on our cluster. I'm going to go ahead and click on "1 installed", and you'll see the OpenShift Pipelines install option, which lets us actually install OpenShift Pipelines on our cluster. I'll click Create New, and the last thing you'll do here is leave this resource definition as is and click Create.
Now that we've gone through these steps, we've successfully installed OpenShift Pipelines on our OpenShift 4 cluster. In this portion of the video, I'm going to go over the steps needed to set up the sample application we'll deploy as part of the OpenShift Pipelines tutorial, including creating a service account for OpenShift Pipelines. The sample application we'll be deploying is Spring PetClinic, a simple Java Spring Boot application; if you'd like to learn more, you can head over to its GitHub repository. In order to give OpenShift Pipelines the appropriate permissions to use tools such as S2I and Buildah to build the container image for the Spring PetClinic application, we'll need to create a service account and give that service account the appropriate permissions to use those tools. To do this, I'm going to run all the commands listed here as part of the tutorial steps. First, I'll create the project we'll work with, which is where the Spring PetClinic application will ultimately be deployed; I'm going to call it pipelines-tutorial. With the project created, the next step is to set up our service account. First I'll create the service account, which I'm going to call pipeline, then I'll give it the permissions needed for this tutorial, and I'll run one more command here. Our service account is now set up with all the appropriate permissions, and the last step is to create an image stream for the Spring PetClinic project.
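The service account setup narrated above might look roughly like the following. The project and service account names follow the narration, but the exact policy commands are assumptions based on typical S2I/Buildah permission requirements, so check the tutorial repository for the authoritative versions:

```yaml
# Sketch of the service account the narration creates
apiVersion: v1
kind: ServiceAccount
metadata:
  name: pipeline
  namespace: pipelines-tutorial
```

The equivalent imperative setup would be along the lines of `oc new-project pipelines-tutorial`, `oc create serviceaccount pipeline`, and then policy grants such as `oc adm policy add-scc-to-user privileged -z pipeline` and `oc adm policy add-role-to-user edit -z pipeline` to give the service account the permissions mentioned.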
This will define some properties needed to host the application on OpenShift. To do that, I'm going to head back to the terminal and run the following command, which pulls the raw content from GitHub that has been defined for the PetClinic image stream. You'll see here that this created an image stream, a deployment config, a service, and a route, all of which are needed for the Spring PetClinic project. If we head back to our OpenShift cluster, we can search for the project we just created and see that our deployment config is set up, so we should be ready to proceed with the next steps of the tutorial. In this portion of the video, I'm going to show you how to create tasks and then add those tasks to a pipeline that will deploy the Spring PetClinic application as part of the OpenShift Pipelines tutorial. As mentioned previously, when tasks run on OpenShift they allocate their own pod, and each step within a task runs in its own container. To show you one of the tasks we'll use in our pipeline, I'm going to take you out to the pipelines catalog repository under the OpenShift organization, where you can find tasks for several different use cases. In our case, we're going to look at one called s2i-java-8. As you can see, this is defined as a custom resource, and we call it s2i-java-8 because it uses Source-to-Image to build an image from Java 8 source code. We can specify an input for this task; in this case, we give it a name of source and a type of git.
What we're saying is that this task will take a Git repository as input. Next, we define the outputs associated with the task, giving one a name of image and a type of image; in other words, this task will produce an image built from Java 8 source code. Next, we define all the steps associated with the task. As mentioned, each step runs in its own container, so for each step we can specify which image that container should run. The first step is named generate, because we use S2I to generate the artifacts for building an image of the Spring PetClinic application, and it uses an image from Quay that has S2I on it. The next step is named build, because it actually builds the image using Buildah, and we specify that by using an image that has Buildah available on it. Ultimately, this task takes the Spring PetClinic Git repository as input and produces an image that we can use on OpenShift. To add these tasks to our OpenShift cluster, we use the OpenShift CLI to run the commands specified here. I'm going to go out to the terminal now and run the following commands to create the two tasks that will be used as part of our Spring PetClinic pipeline. First, I'll create one called openshift-client; this task provides the OpenShift CLI, which will be used to deploy the image created by the s2i-java-8 task I just showed you. Now that I've added these tasks to our OpenShift cluster, I can use the Tekton CLI, tkn, to list the tasks available in the namespace I'm working with, in this particular case the pipelines-tutorial project.
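An abridged sketch of the s2i-java-8 task described above might look like the following; the apiVersion, image references, and omitted step commands are assumptions, and the full definition lives in the catalog repository:

```yaml
# Abridged sketch of the s2i-java-8 task; apiVersion and images are assumptions
apiVersion: tekton.dev/v1alpha1
kind: Task
metadata:
  name: s2i-java-8
spec:
  inputs:
    resources:
      - name: source     # the Git repository to build from
        type: git
  outputs:
    resources:
      - name: image      # the container image this task produces
        type: image
  steps:
    - name: generate     # uses S2I to generate build artifacts from the source
      image: quay.io/openshift-pipeline/s2i   # illustrative image reference
      # ... s2i invocation omitted; see the catalog for the full step
    - name: build        # builds and pushes the image with Buildah
      image: quay.io/buildah/stable           # illustrative image reference
      # ... buildah invocation omitted
```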
I can see that I just added these two tasks to this namespace. Next, we need to add these tasks to a pipeline. As mentioned previously, a pipeline is a combination of tasks that automate some part of your application development process. To show you the pipeline we'll be using in this tutorial, I'm going to take you out to the Pipelines tutorial repo we've been working in, under the resources folder. Here you'll see that we have another custom resource: we specify that this is a pipeline and give it a name, deploy-pipeline. We can specify the resources associated with this pipeline, which are the inputs and outputs of running it. First, we specify one called app-git of type git; this is where we can later bind the specific Git repository this pipeline will use. Then we specify app-image, the image created as a result of running this pipeline. Next come the tasks that run along the pipeline. The first is called build and uses the s2i-java-8 task; we specify that its input resource is app-git and that it will ultimately produce an output resource called app-image. The next task along our pipeline is deploy, which uses the openshift-client task, and with the runAfter property we specify that it should only run after the build task has completed. So ultimately what this pipeline does is combine the tasks to build our image, push it to a registry, and then deploy that image using the OpenShift client. The last step is to add this pipeline to our namespace, just like I did with the tasks.
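Putting those pieces together, the pipeline described above can be sketched roughly as follows; the resource and task names mirror the narration, while the apiVersion and exact field layout are assumptions for the v1alpha1-era API:

```yaml
# Sketch of the deploy-pipeline described in the narration
apiVersion: tekton.dev/v1alpha1
kind: Pipeline
metadata:
  name: deploy-pipeline
spec:
  resources:
    - name: app-git      # input: the application's Git repository
      type: git
    - name: app-image    # output: the image built by the pipeline
      type: image
  tasks:
    - name: build
      taskRef:
        name: s2i-java-8
      resources:
        inputs:
          - name: source
            resource: app-git
        outputs:
          - name: image
            resource: app-image
    - name: deploy
      taskRef:
        name: openshift-client
      runAfter:          # deploy only runs once build has completed
        - build
```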
So I'll come out to my terminal again and run the following command; you'll see that I've created deploy-pipeline, and when I list the pipelines available in my namespace, you'll see that deploy-pipeline has been added. So using the OpenShift CLI, we've been able to take these custom resources, tasks and pipelines, and add them to our OpenShift namespace so that we can ultimately use them to deploy the Spring PetClinic application. In this last portion of the OpenShift Pipelines tutorial video, I'm going to show you how to trigger a pipeline that will deploy the Spring PetClinic application to OpenShift using OpenShift Pipelines. As mentioned previously, we need to provide the inputs and outputs associated with our pipeline, and the way we do that is by defining what are called pipeline resources. To show you the pipeline resource definitions we'll use for this tutorial, I'm going to take you out to the resources folder in the Pipelines tutorial repo. You can see that we start by specifying a kind of PipelineResource, give our resource a name, petclinic-image, specify a type of image, and then specify the URL of the image registry location we're pushing this image to. The next pipeline resource we define is a Git repository: we call it petclinic-git, give it a type of git, and specify the URL of the Git repository.
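The two pipeline resources described above might be declared roughly like this; the names follow the narration, while the apiVersion and URLs are assumptions (the image URL in particular depends on your cluster's registry):

```yaml
# Sketch of the image and git PipelineResources from the narration
apiVersion: tekton.dev/v1alpha1
kind: PipelineResource
metadata:
  name: petclinic-image
spec:
  type: image
  params:
    - name: url
      # assumed internal-registry location; adjust for your cluster
      value: image-registry.openshift-image-registry.svc:5000/pipelines-tutorial/spring-petclinic
---
apiVersion: tekton.dev/v1alpha1
kind: PipelineResource
metadata:
  name: petclinic-git
spec:
  type: git
  params:
    - name: url
      value: https://github.com/spring-projects/spring-petclinic
```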
You'll notice that we never explicitly defined the Git URL anywhere in our pipeline, and that's because we want tasks and pipelines to be highly reusable: you create these pipeline resources as simple inputs and outputs, but the pipelines and tasks themselves never hardcode them. To create these on our OpenShift cluster, we can run the command here, so I'll go back out to the terminal and run it to create the inputs and outputs associated with our pipeline; you can see they've been added. There isn't currently a Tekton CLI command that lists pipeline resources, but one will be available in a newer version of the Tekton CLI. So next, now that we've created these resources, the last thing to set up is a pipeline run, which is how you actually kick off a pipeline with the resources we just defined. To show you this, under the resources folder of the Pipelines tutorial repository, you'll see that we have a kind of PipelineRun, and we specify that it should use the deploy-pipeline we created and the service account we created earlier. Here is where we bind the resources to the pipeline: app-git refers to the petclinic-git resource we just created, and app-image is associated with the petclinic-image resource we just created. To launch our pipeline using this pipeline run, we'll do it just like we've done with all the other resources: using the OpenShift CLI, we run the command shown here. So I'll go back out to the terminal and run this.
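The pipeline run described above might be declared roughly as follows; the names mirror the narration, the apiVersion is an assumption, and note that the v1alpha1-era API used a `serviceAccount` field where later Tekton releases use `serviceAccountName`:

```yaml
# Sketch of the pipeline run that binds resources and kicks off the pipeline
apiVersion: tekton.dev/v1alpha1
kind: PipelineRun
metadata:
  name: petclinic-deploy-pipelinerun   # illustrative name
spec:
  pipelineRef:
    name: deploy-pipeline
  serviceAccount: pipeline             # the service account created earlier
  resources:
    - name: app-git                    # pipeline placeholder ...
      resourceRef:
        name: petclinic-git            # ... bound to the concrete resource
    - name: app-image
      resourceRef:
        name: petclinic-image
```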
So this will actually kick off the execution of the pipeline we've defined and deploy the Spring PetClinic application to OpenShift. You'll see that we just created a pipeline run, and we can view it through the CLI: the pipeline run has a status of Running, because we've just kicked it off. To view the logs of this pipeline, we can run the following command with the Tekton CLI: tkn pr logs, where pr stands for pipeline run, followed by the name of the pipeline run. With this command we can watch the logs of our PetClinic application deploying to OpenShift; it might take a second to start up, and you can see some of the log output coming through now. While this deploys to OpenShift, I'm going to go ahead and pause the video. Our pipeline run has now concluded successfully, and we can verify that with the following Tekton CLI command: you'll see that our pipeline run has a status of Succeeded, and it also reports how long the pipeline run took to execute, in this case a duration of seven minutes. The last step is to head back out to our OpenShift 4 cluster and verify that the application is up and running as expected. We can click on our pipelines-tutorial project, click on the deployment config, then Resources, and then click on the route associated with the application. We can see that the Spring PetClinic application has successfully deployed, is up and running on OpenShift, and has been deployed by OpenShift Pipelines. So thank you for taking the time to watch this video and go through the tutorial with me.
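The CLI checks narrated above might look like the following command sketch; the flags and the run name are assumptions, and the commands require a live cluster with the tutorial resources in place:

```shell
# List pipeline runs; STATUS shows Running while the run is in progress
tkn pipelinerun list

# Follow the logs of a specific run ("pr" abbreviates "pipelinerun")
tkn pr logs petclinic-deploy-pipelinerun -f

# After completion, the list shows Succeeded along with the run's duration
tkn pr list
```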
If you have any feedback, we'd love for you to open issues in this repository, and also feel free to open pull requests if you think there's anything you can contribute toward making this tutorial better. Thank you so much again for your time, and we hope this was helpful in explaining the concepts of OpenShift Pipelines.