Good afternoon, everyone. Thank you so much for coming out. This is my first KubeCon, and I'm super excited. I'm Junaid, a software engineer at Atlan. I work primarily on the platform team, with a lot of Argo Workflows. Today I want to talk about a little tool we built to make our lives a bit easier when working with Argo Workflows. The talk is titled "Shipping Argo Workflows as Packages." A few questions before we get started. How many of you write Argo workflows as part of your day-to-day? Like every morning you're going, "oh, I have to write some YAML." Cool, OK. How many of you manage multiple workflow templates, where you have to deal with a bunch of YAML files and you're always confused? Cool. So I want to talk a little bit about reusability in Argo Workflows. This is the very famous whalesay example. Think of it as some standard workflow template that you're constantly reusing in your org; it could be some processing step, an API call, anything. Now, if I want to reference this workflow template from something like a hello-world workflow, we normally do this: we add a templateRef pointing at the whalesay template, and we pass in some parameters. For this workflow to run, the whalesay template has to already be present on the cluster. So immediately you can see a dependency graph forming: you need other workflow templates to be present on the cluster so that the workflow you're writing can run smoothly. This is exactly what we ran into at Atlan when we started our journey with Argo Workflows. It became a mess, so we started working on a solution. But first, a little introduction to Atlan. Atlan is what we call a modern metadata catalog.
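The whalesay/templateRef pattern mentioned above looks roughly like this. This is a sketch modeled on the upstream Argo Workflows documentation; the resource names and parameter values are illustrative, not the exact manifests shown on the slide:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: whalesay-template          # the shared, reusable template
spec:
  templates:
    - name: whalesay-template
      inputs:
        parameters:
          - name: message
      container:
        image: docker/whalesay
        command: [cowsay]
        args: ["{{inputs.parameters.message}}"]
---
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-world-
spec:
  entrypoint: main
  templates:
    - name: main
      steps:
        - - name: say
            # templateRef only works if whalesay-template
            # already exists on the cluster
            templateRef:
              name: whalesay-template
              template: whalesay-template
            arguments:
              parameters:
                - name: message
                  value: hello world
```

The `templateRef` here is the implicit dependency: submitting the hello-world workflow fails unless the whalesay WorkflowTemplate was installed first.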
It has all the features you need to build a data catalog, along with collaboration features: you can have your tables and columns crawled in, and you can add READMEs to your tables. We aim to be a home for data teams, just like GitHub is the home for engineers and Figma is the home for designers. We're trusted by data teams across the world. We're a B2B company, with a deployment model where every customer gets their own tenant. Each tenant has its own deployment of Argo, which runs all the Argo workflows; those are basically the data pipelines that pull in all the metadata. This is the first screen you'd probably encounter when you start using Atlan: you open it up, click "New," and start setting up your integrations, a Snowflake workflow, a Redshift workflow, and so on. So our challenge was to build this ecosystem of data pipelines so that all our customers could use them, and to make it a self-service platform where you set up your pipelines however you feel comfortable. Just like any other data pipeline, it's ETL: extract, transform, and load. But if you imagine this in the context of an Argo workflow, you suddenly realize that every new integration means declaring all of those steps again and again. This is where reusability became a huge problem for us. So our idea was: why can't an Argo workflow just be a package? Just like we do for JavaScript, Python, or Go, why can't a workflow be a package in itself? That's how we built argopm. It's inspired by npm: a package manager for Argo workflows that enables developers to distribute and consume Argo workflows as npm packages. It's public, it's open source.
You can install it right now and start using it. It has a bunch of features, very similar to npm: you can scaffold a new package, install it to the cluster, and so on; I'll come back to these points. There's dependency management, and support for annotations and labels. You can also add standard Kubernetes resources like Secrets and ConfigMaps to your package. We've also added the ability to bundle Grafana dashboards, so your package becomes something that carries your workflow as well as its observability components. There's support for static data, too; sometimes you just need some artifacts to be present on your object store. Coming back to the earlier example, this is how we'd package those two templates as argopm packages. You scaffold a package; it's very simple. As you can see, there's a templates folder that holds your workflow templates, and there's a package.json file. Now that we're talking about packages, of course we need a registry: somewhere you can publish your packages and make them easily consumable by other developers. So let's say you work for Pied Piper and you're setting up their argopm registry. This is your whalesay package: you set it up and add the template, and then you build your hello-world package. The main thing I want to show is the package.json for both of these packages. If you look at the whalesay package, it's just a simple package with no dependencies; it doesn't depend on anything, so developers can use it directly. When you build your hello-world package, you know you have to use the whalesay template, so you add it like any other JavaScript dependency. Once you have this set up, you basically run npm publish, because it's npm-compatible, and it just goes out and becomes a package on npm.
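Concretely, the hello-world package's manifest can declare the whalesay package as an ordinary npm dependency. This is an illustrative sketch, not the exact files from the slide; the `@pied-piper` scope and version numbers are made up, and the whalesay package's own package.json would simply carry an empty `dependencies` object:

```json
{
  "name": "@pied-piper/hello-world",
  "version": "1.0.0",
  "description": "Hello-world workflow that reuses the whalesay template",
  "dependencies": {
    "@pied-piper/whalesay": "^1.0.0"
  }
}
```

The package directory itself holds this package.json next to a `templates/` folder containing the workflow YAML, so the cluster dependency from earlier is now an explicit, versioned entry that any npm-compatible registry understands.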
All of a sudden you have the ability to push your workflow templates just like JavaScript packages. Now, when you run argopm install on the hello-world package, you'll notice that it walks through the dependencies object, installs all the dependencies, then goes into those dependencies and installs theirs, and so on. So when you install just the hello-world package, you end up with both templates on your cluster. All the headache around getting your dependent templates onto the cluster is gone; argopm takes care of that for you. Then you can just run it, and it says "hello world." That's argopm in a nutshell. Now, when you run argopm init, just like npm init or yarn init, it scaffolds a package for you. You can see config maps (standard Kubernetes resources, as I said), cron workflows, secrets, and static data; we're adding support for pipelines as well. We also support annotations and labels in the package.json itself: you can define them there and they'll be attached to your workflow template as well. This becomes very handy in a monorepo setup where you want certain annotations and labels to be present on the cluster. These are the argopm commands. Install is very simple; info gives you some information about the package; run triggers the workflow template; and init, uninstall, and list are pretty standard, much like what npm or any other package manager offers. We have some flags as well. The most interesting ones are -r and -c. -r changes the target registry.
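The recursive install described above can be sketched in a few lines. This is not argopm's actual source (argopm is a JavaScript tool), just an illustration of the idea: walk the dependencies object depth-first, so that each template lands on the cluster before anything that references it.

```python
def resolve(name, registry, seen=None):
    """Return an install order: all transitive dependencies first, then the package.

    `registry` stands in for the npm registry; in argopm the packages
    and their package.json files would be fetched remotely.
    """
    if seen is None:
        seen = set()
    if name in seen:          # already resolved; skip duplicates and cycles
        return []
    seen.add(name)
    order = []
    for dep in registry[name].get("dependencies", {}):
        order.extend(resolve(dep, registry, seen))
    order.append(name)        # install the package itself last
    return order


# A toy registry mirroring the talk's example:
registry = {
    "whalesay": {"dependencies": {}},
    "hello-world": {"dependencies": {"whalesay": "^1.0.0"}},
}

print(resolve("hello-world", registry))  # ['whalesay', 'hello-world']
```

Installing just `hello-world` therefore puts `whalesay` on the cluster first, which is exactly why the templateRef from earlier resolves without any manual setup.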
You could just use npm, or if your org needs its own private registry behind a VPN or something, you can point at that. -c is a flag that installs the package as a cluster workflow template instead of a namespaced workflow template, and -h is just help. So yeah, it's designed to be very similar to npm. It's actually built with npm primitives and the JavaScript client library for Kubernetes. Our intent is to get it as close to npm as possible, so that building Argo workflows feels as seamless as building a JavaScript package. Functionally, the install command is very simple: it loops over the dependencies, recursively resolves each dependency, and installs them on the cluster. So all of a sudden you have a package-management tool that makes sure your dependencies are always versioned and always on the cluster in the right form. And because we relied on package.json and the npm primitives, we can suddenly leverage a whole host of tools from the JavaScript ecosystem. For example, internally at Atlan we use Lerna for monorepo management: all our packages live in one repo, and Lerna takes care of the version upgrades and so on. We also have our own private registry, packages.atlan.com. That's where we push all our packages, and from there our developers can browse them, see what's available, and read the documentation. When it comes to updating, you can set this up your own way, but what we do is run an update workflow in every deployment of Atlan: a cron workflow that runs every 30 minutes and pulls the latest versions, and we can disable it depending on whether we want that tenant to get the latest package versions or not. On top of this, we've also built capabilities for tenant-based deployments.
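An update workflow like the one just described could be expressed as an Argo CronWorkflow. This is a hedged sketch, not Atlan's actual setup: the image, the package name, and the use of `argopm install` to refresh to the latest published version are all assumptions for illustration.

```yaml
apiVersion: argoproj.io/v1alpha1
kind: CronWorkflow
metadata:
  name: package-updater            # illustrative name
spec:
  schedule: "*/30 * * * *"         # every 30 minutes, as in the talk
  workflowSpec:
    entrypoint: update
    templates:
      - name: update
        container:
          image: node:18           # assumed: any image with argopm installed
          command: [sh, -c]
          # Re-installing the top-level package pulls the latest
          # published versions of it and its dependencies.
          args: ["argopm install @pied-piper/hello-world"]
```

Disabling updates for a tenant is then just a matter of suspending or removing this CronWorkflow in that tenant's deployment.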
So I'm just giving a hint at how, because we rely on an existing ecosystem, there's a whole host of tools you can use to build your own setup in your own org and use argopm in a way that makes sense to you. So yeah, that's all from me. Any questions?

Audience: Actually, I'm curious whether you've thought about deploying these templates in a GitOps manner, referencing a remote version.

Junaid: Yeah, you can have GitHub Actions that, whenever your code is merged, publish the package: they run lerna publish or npm publish after doing some validation checks. So you can have that kind of setup as well.

Audience: I mean in GitOps, updating the Git tag so that Argo CD, for example, goes and fetches the right version.

Junaid: I'm not sure, but I think package registries have their own tagging system, so maybe you could build something on top of that to fetch the necessary tags and have a setup like that.

Audience: Thank you.

Junaid: Anyone else? Thanks.