Before you get started, run the `--version` command in a terminal session to verify that you have the following software installed: JDK 11 or higher, Apache Maven 3.8.3 or higher, and Git 2.30 or higher. After verifying your versions of Java, Maven, and Git, you're ready to get started with the Red Hat OpenShift Streams for Apache Kafka service.

Once the service account is created, you're ready to connect an application to the Kafka instance. To do that, go to github.com/redhat-developer, find the app-services-guides repository, locate the sample code URL, and copy it. Open a local terminal session and run the `git clone` command, optionally adding the `--depth=1` flag to speed up the cloning process.

Back in the Red Hat Hybrid Cloud Console, the Kafka instance should be ready. To connect to it, expand the instance menu and click Connection. Here you'll see the bootstrap server URL and the token endpoint URL, both of which are required whenever you connect to the Kafka instance using your service account. Copy the bootstrap server URL and save it in your terminal session in a variable named `BOOTSTRAP_SERVER`; likewise, copy the token endpoint URL and save it in a variable named `OAUTH_TOKEN_ENDPOINT_URI`.

Now you're ready to open the Java application. Change into the app-services-guides directory you cloned, go to the code examples folder, and find the Quarkus Kafka quickstart folder. This folder contains the code you will use for today's exercise. Open your IDE, expand the source directory, open the resources folder, and select the application.properties file. This is a basic Java application that generates price records and writes them to a Kafka topic in real time. Here we can see various connection settings and configurations, specifically for outgoing and incoming messages.
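The terminal steps above can be sketched as follows. The repository URL matches the redhat-developer organization named in the walkthrough, but the connection values are placeholders: copy the real bootstrap server and token endpoint URLs from the Connection panel of your own Kafka instance.

```shell
# Shallow-clone the sample code; --depth=1 skips the full history
# to speed up the clone (uncomment to run):
# git clone --depth=1 https://github.com/redhat-developer/app-services-guides.git

# Save the connection details so the application can read them from the
# environment at startup. These values are placeholders, not real endpoints.
export BOOTSTRAP_SERVER="my-kafka-instance.kafka.example.com:443"
export OAUTH_TOKEN_ENDPOINT_URI="https://identity.example.com/auth/realms/rhoas/protocol/openid-connect/token"
```

Exporting the variables (rather than setting them for a single command) makes them visible to any process started later in the same session, including the Maven development-mode run described below.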
The variables you set earlier, such as `BOOTSTRAP_SERVER`, are referenced here and are read from your environment on application startup. The application has an outgoing channel named generated-price and an incoming channel named prices; both channels are persisted to a Kafka topic named prices. Outgoing prices are serialized using an integer serializer and are deserialized as integers when read back into the application. These channels are referenced in the Java application code base.

Start the prices application in development mode and connect it to your Kafka instance using the `mvn quarkus:dev` command, as explained in the application.properties file. The application will then start up. To see it running in a web browser, click the localhost link and navigate to the /prices endpoint. You can watch prices update in real time within the application; these prices are being produced to, and then consumed back from, the Kafka instance.

That's how easy it is to get started with Kafka using OpenShift Streams for Apache Kafka. There's no need to install, run, or maintain your own Kafka instances: just provision Kafka instances using the Red Hat Hybrid Cloud Console, connect securely using SSL, and authenticate using SASL. For additional courses and self-service learning paths, be sure to check out the other available resources.
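The channel setup described above might look like the following sketch of application.properties. The property names follow the SmallRye Reactive Messaging convention that Quarkus uses for its Kafka connector; the exact entries in the quickstart project may differ, and the environment-variable references are assumptions based on the variables set earlier.

```properties
# Connection settings read from the environment at startup (assumed names)
kafka.bootstrap.servers=${BOOTSTRAP_SERVER}

# Outgoing channel: generated prices are written to the "prices" topic
mp.messaging.outgoing.generated-price.connector=smallrye-kafka
mp.messaging.outgoing.generated-price.topic=prices
mp.messaging.outgoing.generated-price.value.serializer=org.apache.kafka.common.serialization.IntegerSerializer

# Incoming channel: the same topic is read back and deserialized as integers
mp.messaging.incoming.prices.connector=smallrye-kafka
mp.messaging.incoming.prices.topic=prices
mp.messaging.incoming.prices.value.deserializer=org.apache.kafka.common.serialization.IntegerDeserializer
```

Note that the channel names (generated-price, prices) are what the Java code references, while the topic property maps both channels onto the single Kafka topic named prices.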