Hello everyone, my name is Urvashi Munani and I'm a Principal Software Engineer on the OpenShift Containers team at Red Hat. Today I'm here to give you a super quick talk about improving data science collaboration with CI/CD. As we know, AI and data science have really skyrocketed in popularity recently. Data scientists are still doing exactly what they've always done: exploring and massaging data, creating models, and updating them. The thing that's changed, though, is the need to collaborate faster than ever before. Of course, the old ways of collaborating, like sharing files manually or via GitHub, still apply here. But honestly, that can be pretty tedious, and I'm sure our data scientists would appreciate not having to deal with all of that. So what if we take advantage of the fact that Kubeflow Notebooks works with containers? We all know containers are a portable and very easy way of sharing applications, so why not apply them to data science? Well, one problem with that is that most data scientists probably don't understand, or really care about, containers. Asking them to build a new container image every time they make a change is probably worse than just collaborating via GitHub. So is there a way we can simplify this? The answer to that question is GitHub Actions. Given that most collaboration today happens on GitHub, we can use GitHub Actions to automatically trigger container image builds whenever a notebook is updated or a new notebook is added. We can tag these container images with the SHA of the commits to help keep track of all the changes going into these notebooks. Once built, we can push these images to registries. Other data scientists around the world can then pull an image down and run it directly with Kubeflow Notebooks, or with a local container engine such as Podman or Docker. So I have a demo of this working in action.
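To make that concrete, here is a minimal sketch of what such a workflow file could look like. This is not the exact workflow from the demo repo: the registry, image name, and secret names are placeholders you would swap for your own, and it assumes Podman is available on the runner.

```yaml
# Hypothetical workflow sketch: build and push a notebook image
# whenever a notebook changes. quay.io/example/notebook and the
# REGISTRY_* secrets are placeholders, not the demo repo's values.
name: Build notebook image
on:
  push:
    branches: [main]
    paths:
      - "**.ipynb"   # only run when a notebook is added or updated
jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build the image
        run: podman build -t quay.io/example/notebook:latest .
      - name: Tag with the commit SHA
        run: podman tag quay.io/example/notebook:latest quay.io/example/notebook:${{ github.sha }}
      - name: Log in to the registry
        run: podman login -u "${{ secrets.REGISTRY_USER }}" -p "${{ secrets.REGISTRY_PASSWORD }}" quay.io
      - name: Push both tags
        run: |
          podman push quay.io/example/notebook:latest
          podman push quay.io/example/notebook:${{ github.sha }}
```

The `paths` filter is what gives you the "only build when a notebook changed" behavior without a separate check step; splitting the check into its own workflow, as the demo does, works just as well.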
I have a repo here with a notebook that I'm updating with something very simple, like listing the colors in a rainbow. Once I merge this pull request and we go to the Actions tab, we see that the first of the two workflows is triggered. The first workflow checks whether a notebook was added or updated, because we only want to trigger a container image build when a notebook is changed, not when any other files in the repo are changed. The next workflow that gets triggered is the build notebook image workflow. This is the workflow that actually builds the container image, copying in the new content that went into the notebook and setting up all the requirements you need for running that notebook. This workflow is using Buildah under the hood, but you can use basically any container build tool, like Podman, Docker, et cetera: anything that can build a container image. Once this image is built, we tag it with the latest tag as well as the SHA of the commit. This is just taking a bit. Over here on the bottom, we can see that the image was built and tagged, and the next step is to push it to my Quay repository; you can use any container registry. Now, I'm going to be using my local Podman container engine, because I didn't set up a Kubernetes cluster with Kubeflow Notebooks, but this will work exactly the same in that environment. Using Podman, I pulled down the image. Sorry, given time, I'm just skipping through the video quickly. I pulled down the image, and I'm going to run it, exposing the port so that once my server is up, I can access it in my local web browser. I have the URL there, and I'm just going to my web browser to open this up. And there you go: we have JupyterLab up and working. My notebook is there with my changes, and when I run it, I see that list of rainbow colors. So yeah, that's basically it. Oops, yeah, next slide.
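Running the resulting image locally looks roughly like this. The image name is a placeholder, and 8888 is assumed to be the port the notebook server listens on inside the container; with Kubeflow Notebooks you would instead point the notebook spec at the image and let the cluster pull it.

```shell
# Pull the image that CI built and pushed (placeholder name/tag)
podman pull quay.io/example/notebook:latest

# Run it, publishing the notebook server's port so JupyterLab is
# reachable at http://localhost:8888 in a local browser
podman run --rm -p 8888:8888 quay.io/example/notebook:latest
```

The same commands work with Docker by swapping `podman` for `docker`, since the image format is the same.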
So this is just a simple way of setting up automation to build container images immediately whenever changes go in. This workflow stops at pushing the container image to a registry, but we can definitely configure it to automate things further. For example, if you're using Argo CD to manage the notebooks available on your cluster, you can configure the GitHub Action to also update the YAML that is applied to your cluster. And this is the repo I was doing the demo in. The repo has the basic setup for having this workflow in place, and I have also added documentation explaining the different components. So if this is something you're interested in trying out and would like to configure further, definitely feel free to clone it and adapt it to fit your needs. Yep, that's all I have for you today. Thank you.
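One way that Argo CD hand-off could look, as a hedged sketch: an extra workflow step rewrites the image tag in a manifest Argo CD is already watching and commits the change back, so the sync picks up the new notebook image. The file path and image name here are hypothetical, not from the demo repo.

```yaml
# Hypothetical extra step appended to the build workflow's job.
# manifests/notebook.yaml and quay.io/example/notebook are placeholders.
- name: Point the manifest at the new image
  run: |
    sed -i "s|image: quay.io/example/notebook:.*|image: quay.io/example/notebook:${{ github.sha }}|" manifests/notebook.yaml
    git config user.name "github-actions"
    git config user.email "github-actions@users.noreply.github.com"
    git commit -am "Update notebook image to ${{ github.sha }}"
    git push
```

Because Argo CD reconciles the cluster against that repo, no step in the workflow ever needs cluster credentials; the commit itself is the deployment.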