Hi, my name is Bernard Tizon. I am a technical marketing manager in the application services business unit at Red Hat. Today, I want to show you how to get started with Red Hat OpenShift Connectors.

Red Hat OpenShift Connectors is a new cloud service offering from Red Hat. These are pre-built connectors for quick and reliable connectivity across data, services, and systems. Connectors are delivered as a fully managed service. They are tightly integrated with OpenShift Streams for Apache Kafka, which is Red Hat's managed cloud service for Apache Kafka. At the moment, we have more than 50 pre-built sink and source connectors available. Source connectors consume data or events from a data source and publish the data to a Kafka topic on Streams for Apache Kafka, while sink connectors consume from a Kafka topic and push the data to a data sink. The connectors include source and sink connectors for a variety of cloud services, such as AWS S3 and Google Cloud Pub/Sub. These connectors are based on the awesome Apache Camel K technology, but we also offer source connectors for databases based on the popular Debezium project for change data capture. Deploying connectors is very simple, thanks to a code-free and user-friendly user interface.

With that, let me show you how easy it is to create and deploy connectors. For this getting started example, I'll use a couple of connectors that require minimal setup.

The first step is to log in to the Red Hat Hybrid Cloud Console at console.redhat.com. I log in with my Red Hat account ID. Once in the console, I can navigate to Application and Data Services. OpenShift Connectors is tightly integrated with Streams for Apache Kafka, so the first thing I need to do is provision a Kafka instance. I can deploy a trial instance free of charge, which will remain available for 48 hours. To do so, I go to Streams for Apache Kafka, Kafka Instances, and Create Kafka instance. I give it a name, select the region, and click Create instance. My Kafka instance is being provisioned; this is going to take a couple of minutes.

In the meantime, I can create a service account, which I will need to connect to my Kafka instance. I go to Service Accounts, Create service account, and give it a name as well. I am then presented with a client ID and a client secret, which I need to copy because those are the credentials for my service account that I will need later on. Once I have copied the client ID and client secret, I can close this window.

Let's go back to my Kafka instance. Now that my Kafka instance is ready, I need to give the necessary permissions to my service account. The service account needs permission to read and write the topics on the Kafka instance. I can define these permissions from the Access tab of my Kafka instance. I click on the instance and then on the Access tab, where you will see the default access control list. I can change that with Manage access. I select my service account and assign permissions. I need to be able to consume from topics and produce to topics. I will use a wildcard here, and for the consumer groups I will also use a wildcard. That means my service account can consume from every topic and use every consumer group. Then I need another permission to produce to a topic, and again I'm going to use a wildcard. So now my service account is set up with sufficient permissions to read and write to every topic.
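As a side note for readers following along: the same service account credentials can be used by any Kafka client, not just connectors. Below is a minimal Python sketch that checks connectivity by listing topics with the confluent-kafka package; the bootstrap server value is a placeholder you would copy from the instance's connection details, and SASL/PLAIN with the client ID and secret is an assumption (the connection details list the supported authentication mechanisms).

```python
# Minimal sketch: verify the service account credentials by listing topics.
# Assumptions: bootstrap server copied from the Kafka instance's connection
# details, and SASL/PLAIN accepted with the client ID/secret as credentials.
# Requires: pip install confluent-kafka
from confluent_kafka.admin import AdminClient

config = {
    "bootstrap.servers": "<bootstrap-server-url>",  # placeholder, from the instance's connection details
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<client-id>",      # service account client ID
    "sasl.password": "<client-secret>",  # service account client secret
}

admin = AdminClient(config)
metadata = admin.list_topics(timeout=10)
print("Brokers:", metadata.brokers)
print("Topics visible to this service account:", list(metadata.topics))
```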
Finally, I need a topic for my connectors to produce and consume messages. So let's do that. I click on the Topics tab, Create topic, and give it a name. I accept the default values for partitions and retention, click Finish, and my topic is created.

I'm ready now to create my first connector. For this, I navigate to the Connectors page. Let's start with a source connector. A source connector generally consumes messages from an external system, but for this example we're going to use the data generator connector, which simply produces messages at a configurable interval. I first need to find my connector. I can do that by searching for "data", which gives me one choice, the Data Generator source connector, which I select. Next, I need to select my Kafka instance; that's the managed Kafka instance I created previously.

Next, I need to create a namespace to deploy my connectors in. In the context of the service preview, you can create a preview namespace, which will remain available for 48 hours, and in that namespace you can create up to four connectors. I click Create preview namespace and then Create. This creates a namespace; it takes a couple of seconds for it to become available. When the namespace is available, I can select it and click Next, which brings me to the configuration of my connector.

My connector needs a name. Let's call it data source. Then I need to paste the client ID and client secret from the service account I created previously. The next screen shows the connector-specific configuration details. I leave the data shape set to octet-stream. The topic name is the topic I want to produce my messages to; that's the topic I created previously, which I called "connector". I leave the content type set to text/plain and send a sample message like "Hello World" at an interval of 10,000 milliseconds. So every 10 seconds, my connector will produce a message to the connector topic.

On the next screen, I need to configure the error handling policy. I have a couple of possibilities: the connector can log failures, it can stop when it encounters a failure, or it can send messages to a dead letter queue in case of failure. The default is stop, and that's the one I'm going to stick with. The last screen gives an overview of my connector configuration. When I click Create connector, it deploys my connector into my namespace. My connector is being deployed; this is going to take a couple of seconds.

Now that my connector is ready, it starts producing messages to my topic, and we can verify that in the message viewer of my Kafka instance. Let's go back to my Kafka instance. I select my topic, and in the Messages tab you will see that messages are being produced roughly every 10 seconds with the Hello World value.
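Besides the message viewer in the console, the same check could be done with a small consumer of your own. A minimal Python sketch, assuming the topic is named "connector", SASL/PLAIN authentication with the service account credentials, and the confluent-kafka package; the bootstrap server and group ID are placeholders:

```python
# Minimal sketch: consume the Hello World messages produced by the
# data generator connector. Assumes a topic named "connector" and
# SASL/PLAIN authentication with the service account credentials.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "<bootstrap-server-url>",  # placeholder
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<client-id>",
    "sasl.password": "<client-secret>",
    "group.id": "getting-started-check",  # any group the wildcard ACL allows
    "auto.offset.reset": "earliest",      # start from the beginning of the topic
})
consumer.subscribe(["connector"])

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1 second for a record
        if msg is None:
            continue
        if msg.error():
            print("Error:", msg.error())
            continue
        print("Received:", msg.value().decode("utf-8"))
except KeyboardInterrupt:
    pass
finally:
    consumer.close()
```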
Let's now create a sink connector. For this, I'm going to use the HTTP sink connector, which consumes from a Kafka topic and calls an HTTP endpoint for every message it consumes. An easy way to get an HTTP endpoint is with a free HTTP webhook testing site such as webhook.site. When I navigate to webhook.site, I get a unique URL that I can use as the HTTP endpoint.

So let's create the HTTP sink connector. I can search for it by typing "http", which gives me the HTTP sink connector, which I select. I select my Kafka instance and the namespace where I want to deploy the connector. Again, I have to give it a name and paste my service account credentials.

On the next page, I see the connector-specific configuration. Again, I leave the data shape set to octet-stream. I leave the method set to POST, and for the URL I copy the URL from webhook.site and use that as the endpoint. The topic I want to consume from is called connector. I keep the same error handling policy, review the overview screen, and finally create my sink connector.

Once my HTTP sink connector is ready, I can head over to the webhook.site page, and you will see that roughly every 10 seconds my HTTP endpoint is called with the Hello World message. So I now have an end-to-end connection between my source connector, which generates messages, and the sink connector, which consumes those messages and calls the HTTP endpoint. From the Connectors page, by clicking on the three dots here, I can stop my connector, view its details, edit the configuration, or delete it.

So that's it. In this video, I wanted to show how easy it is to get started with OpenShift Connectors. I showed a really simple example, so stay tuned for more videos where I will demonstrate more realistic use cases for OpenShift Connectors. Thanks for watching.
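For readers who would rather run their own endpoint than use webhook.site, here is a minimal Python sketch of an HTTP service that logs each message the sink connector posts to it. This is only an illustrative sketch: the connector runs as a managed service in the cloud, so the endpoint must be reachable from the internet (for example through a tunneling service); a server listening only on localhost is not enough.

```python
# Minimal sketch of an HTTP endpoint that logs every message posted by the
# HTTP sink connector. Must be exposed at a publicly reachable URL for the
# managed connector to call it.
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        print("Received:", body.decode("utf-8", errors="replace"))
        self.send_response(200)  # acknowledge the delivery
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), WebhookHandler).serve_forever()
```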