Hey there. In this video, I want to give you a quick introduction to using service binding with a Node.js application and Red Hat OpenShift Streams for Apache Kafka. As you can probably see, I have a Node.js application running on OpenShift. The application isn't particularly healthy, though: it's in a CrashLoopBackOff, which means it can't start. To start, this application needs connection information telling it how to reach an instance of Apache Kafka running on Red Hat OpenShift Streams for Apache Kafka. In other words, it wants to connect to a managed Kafka instance, and it can't, so it's failing to start.

I do have an instance running over here in my OpenShift Streams for Apache Kafka account. I have this Node.js binding example that I'm going to use for this demo, and I even have an orders topic created that the Node.js application will be using. If you want to follow along and recreate this demo, head over to Red Hat Developer, find the OpenShift Streams for Apache Kafka page, and click the Create Kafka Instance button to get signed up. You get a free 48-hour trial, no credit card required. The same goes for OpenShift: if you'd like to try this out, we have the Developer Sandbox, same situation, totally free, no credit card required. You get an OpenShift environment for 30 days with a decent amount of RAM and storage, and you can run applications there.

So let's see how we can get my Node.js application to connect to this Kafka instance. I could do it manually: use this bootstrap server, create a service account, and provide the information the Node app needs that way, using Secrets and ConfigMaps. But there's another way to do this, and that's service binding, which automatically retrieves and injects the configuration into the Node app. Let me show you how.
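For context, the manual approach mentioned above might look roughly like the Secret below. This is a hedged sketch: the resource name, key names, and values are hypothetical placeholders, not the exact ones a real managed Kafka instance would hand you.

```yaml
# Hypothetical Secret holding Kafka connection details for the Node app.
# All names and values below are illustrative placeholders.
apiVersion: v1
kind: Secret
metadata:
  name: kafka-credentials        # placeholder name
type: Opaque
stringData:
  bootstrapServers: my-kafka-instance.kafka.example.com:443  # placeholder host
  saslMechanism: PLAIN
  securityProtocol: SASL_SSL
  user: srvc-acct-example        # service account client ID (placeholder)
  password: example-secret-value # service account client secret (placeholder)
```

You would then mount this Secret (or expose it as environment variables) in the deployment yourself, which is exactly the bookkeeping that service binding automates.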
I have the OpenShift CLI installed, and I'm already logged into this cluster, so you can see I'm on the right project and the API for the sandbox environment. I'm also logged into the rhoas CLI here, but if I wasn't, I could use the login command; it would open a browser-based flow and authenticate me. So what I'm going to do now is make sure I'm logged into the right rhoas account, and you can see I am. We can see that Node.js binding Kafka instance, and we can see it's ready, just like we did in the UI. So I'm going to select it now and use it as my context for all commands going forward.

Now, to connect my Node.js application to this Kafka instance, it needs to know the bootstrap server URL, the SASL mechanism, the security protocol, things like that. The first step is making that available in my OpenShift project, and the rhoas cluster connect command does exactly that. The command is interactive and guides you through the process, so I'll say yes, I want to continue. It tells me I need an OpenShift offline token from cloud.redhat.com, so I'll go ahead and fetch that token. You can see my token here; naturally, you shouldn't share this, and I'll revoke it after the video, but I'll paste it in here, and you can see what happens.

This command does a few things. It creates a custom resource known as a KafkaConnection. It also creates a Secret that contains the token I just pasted, and a service account that the KafkaConnection, together with the operator you'll see in a minute running in the background on this cluster, will use. There's a lot going on there, so let's start by taking a look at the KafkaConnection. I'll describe it, and you can see it has things like the SASL mechanism, the security protocol, and the bootstrap server URL: all stuff I would need to connect to my Kafka instance.
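For reference, the KafkaConnection custom resource that rhoas cluster connect creates looks roughly like this. This is a sketch: the API group/version is my assumption, and the names, IDs, and host are illustrative placeholders rather than the exact values from the demo cluster.

```yaml
# Hypothetical KafkaConnection as created by `rhoas cluster connect`.
apiVersion: rhoas.redhat.com/v1alpha1   # assumed API group/version
kind: KafkaConnection
metadata:
  name: nodejs-binding                  # placeholder: the Kafka instance name
spec:
  kafkaId: example-kafka-id             # placeholder instance ID
  accessTokenSecretName: rh-cloud-services-accesstoken       # Secret with the pasted token (assumed name)
  credentials:
    serviceAccountSecretName: rh-cloud-services-service-account  # SASL username/password (assumed name)
status:
  bootstrapServerHost: my-kafka-instance.kafka.example.com:443   # placeholder host
  saslMechanism: PLAIN
  securityProtocol: SASL_SSL
```

The spec points at the Secrets holding the credentials, while the status carries the connection properties you would otherwise copy out of the UI by hand.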
It also has the service account secret name; that's the Secret containing the SASL username and password. So this basically has everything my application needs to connect to Kafka. But if we go back to OpenShift, my application is still not connected, because I haven't told it how to use this yet. Doing that is also pretty straightforward: we just use the rhoas cluster bind command. This gives me a choice of which application deployment I want to associate the KafkaConnection information with, and that's my Node.js application, so I'll go ahead and pick it. Once I do, the application starts to spin up a new pod, as you can see over here, and this new pod should be healthy and stay running because it will establish a connection to my Kafka broker.

If we wait a moment, the container should spin up. And if we take a look at the deployment over here, you can see a SERVICE_BINDING_ROOT variable in the environment; that was added by the binding we just performed. If we look at the YAML, we can see there's also a volume mount that places the KafkaConnection details under the binding directory that SERVICE_BINDING_ROOT points to. All of this is being automated by the Service Binding Operator and the OpenShift Application Services operator: the two of them took the resources the CLI created and dynamically updated my application's deployment.

Now the new pod has spun up, and I can open the application. You can see it's a pretty basic demo application; a little cute, though. You can order ice creams: strawberry, mint, banana. I'll order one mint ice cream and place the order. When I place that order, the application produces the order record to my managed Kafka instance, associating an order ID with it, along with my email and a few other pieces of information. I can also place more orders, so maybe I'll place one for my friend here.
He'll order five strawberry ice creams because he's got a bunch of kids or something. Place order. And there you go. Now, let's verify those orders are in my topic. I'm going to run a kafkacat command I have in my history here, and hopefully we see the orders. Yeah, there you go. You can see my order here, one order of mint, and my friend's order, five strawberry ice creams, and the order IDs are there as well.

If you want to try this yourself, you can head over to my GitHub and find the sample application. It even has a little bit of code showing you how easy it is to use service binding in your Node application. You just use the kube-service-bindings module, and once you've created the binding like I just showed you, you call the getBinding function and tell it you want the Kafka binding. In this application, I use the KafkaJS client for connecting to Kafka, so I tell getBinding to return the connection information in that format, pass it to KafkaJS, then create a producer and return it after it has connected. Then I can use it anywhere in my application. There's also a little overview of how the service binding works, the different resources, and the operators.

And yeah, you should definitely try this yourself. Head over to Red Hat Developer, sign up for Red Hat OpenShift Streams for Apache Kafka, sign up for the Developer Sandbox, and have some fun with this. I hope you enjoyed the video.
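One last reference before you go: the getBinding-then-KafkaJS flow described above can be sketched like this. In the real sample, the app would call require('kube-service-bindings').getBinding and the KafkaJS Kafka constructor directly; in this sketch both are injected as parameters (my assumption for illustration) so the wiring can be exercised without a cluster or the modules installed.

```javascript
// Sketch of the producer setup: fetch the binding, hand it to the client,
// connect a producer, and return it for use anywhere in the app.
async function createOrderProducer(getBinding, Kafka) {
  // Ask kube-service-bindings for the Kafka connection details,
  // already shaped for the KafkaJS client.
  const config = getBinding('KAFKA', 'kafkajs');

  // Hand the config to KafkaJS and connect a producer.
  const kafka = new Kafka(config);
  const producer = kafka.producer();
  await producer.connect();
  return producer;
}
```

The app then calls something like producer.send({ topic: 'orders', messages: [...] }) whenever an order is placed.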