Hello, my name is Brett Weaver. I'm an engineer at Intuit, and I'm here to talk to you about an exciting new project we are developing, Argo CloudOps. Intuit developed Argo CloudOps to manage our Cloud resources via GitOps, leveraging open-source frameworks across multiple Cloud providers. So what does that actually mean?

When you deploy a new service, you probably start with a configuration management framework to manage the resources for that service. For example, with Terraform, you create your main.tf and then run terraform apply. At some point, you may need to do some configuration prior to running Terraform, such as staging artifacts or setting context variables. To perform this pre-work, you may create a bash script to run prior to terraform apply. Later, you'll likely need to do something once Terraform has completed. You might be running all of this on your laptop initially, and things work great locally until you attempt to run Terraform with minimally privileged credentials on a secure system. This needs to run against both dev and production, which have different configuration and access credentials.

Over time, new services are developed which require their own configuration. At this point, you've probably started running your configuration management framework on a build or Cloud instance. An acquisition or two happens, and you have configuration done in different frameworks that you'd like to support with a single solution. As the number of projects increases, you need to provide varying levels of access to the teams responsible for managing those projects, as well as capture logs and an audit trail. You get this all working, and eventually your business requires running services on additional Cloud providers. How do you onboard these new services with unique configuration management solutions for other Cloud providers?
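The pre-work, apply, and post-work pattern described above can be sketched as a small wrapper script. This is a minimal illustration, not Intuit's actual tooling: the helper script names (stage-artifacts.sh, post-apply.sh) and the per-environment variable files are hypothetical, and the sketch defaults to a dry run so it only prints the commands it would execute.

```sh
#!/bin/sh
# Illustrative wrapper around terraform apply. Helper script names and
# variable-file layout are hypothetical.
set -e

ENVIRONMENT="${1:-dev}"      # dev and production use different configuration
DRY_RUN="${DRY_RUN:-1}"      # default to dry-run so the sketch is safe to run

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "would run: $*"     # print instead of executing in dry-run mode
  else
    "$@"
  fi
}

# Pre-work: stage artifacts and set context variables before Terraform.
run ./stage-artifacts.sh "$ENVIRONMENT"

# Apply with environment-specific variables (and, in practice, credentials).
run terraform apply -var-file="vars/${ENVIRONMENT}.tfvars" -auto-approve

# Post-work once Terraform has completed.
run ./post-apply.sh "$ENVIRONMENT"
```

As the transcript goes on to describe, a wrapper like this quickly outgrows a laptop once you add minimally privileged credentials, multiple environments, and multiple frameworks, which is the gap Argo CloudOps is meant to fill.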
This is the situation where Intuit found itself after almost a decade of running our workloads in the Cloud. We developed Argo CloudOps to solve the complex problem of managing our Cloud resources across Clouds and frameworks. It is guided by the following principles. Argo CloudOps is adaptable: you can get started in minutes, and it can be expanded to support known and unknown Cloud providers and frameworks as your requirements evolve. It provides a GitOps interface for managing resources. Argo CloudOps is open source, available in the Argo Project Labs organization. Finally, it is built for Enterprise scale, providing support for multiple projects and teams with role-based access control, while removing the need for direct access or admin permissions to manage Cloud resources from continuous integration pipelines.

Why did we align on GitOps as the interface for Argo CloudOps? GitOps provides a consistent developer experience regardless of the framework or Cloud: engineers engage via a known GitOps interface instead of leveraging multiple pipelines and tools for different Cloud providers and frameworks. GitOps pulls all configuration management from a Git repo with security controls and a verifiable audit trail. We are on a journey to consolidate on GitOps as our interface for managing Kubernetes and our Cloud resources, providing developers a single interface to their entire service configuration.

In the following demo, we will download this example manifest from a trusted Git repo and use it to deploy the following resources with the Amazon Cloud Development Kit. Here's a quick demo of the experience using Argo CloudOps to configure Cloud resources. Argo CloudOps has the concept of projects, which contain targets, and access to projects is granted per user. In this demo, we have created a single project, project one, and are going to use it to apply changes to target one. We start by running the Argo CloudOps sync command.
Sync accepts four options, which tell Argo CloudOps where to download the configuration within a trusted Git repo for that project. This configuration will be used to configure the target's resources. In this example, we are using the AWS Cloud Development Kit to configure resources in the target. You'll see that the sync command returns an ID for the configuration being applied by Argo CloudOps, which runs in a container managed by Argo Workflows. You can retrieve the logs for that workflow via the logs command. You can see that Argo CloudOps acquires short-lived credentials, downloads the configuration from Git, and runs the configuration command. You can use get to fetch the status of a sync, which can be used to signal success or failure to a system calling Argo CloudOps. This was just a quick introduction. For a full setup guide and details on Argo CloudOps, please check out the Argo CloudOps project in the Argo Project Labs GitHub organization. Thanks for listening. We would like everyone to check out Argo CloudOps and give us feedback. Please reach out and let us know if you think it can help your organization manage its Cloud resources.
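The demo flow described above might look roughly like the following session. This is a sketch only: the flag names, repository URL, and workflow-ID placeholder are assumptions for illustration, and the real command names and options should be taken from the argo-cloudops project documentation.

```sh
# Hypothetical sync invocation: options point Argo CloudOps at the project,
# the target, and the configuration's location in a trusted Git repo.
argo-cloudops sync \
  --project project1 \
  --target target1 \
  --repository https://github.com/example-org/service-config.git \
  --path cdk/target1
# -> returns an ID for the run, executed in a container by Argo Workflows

# Retrieve the logs for that workflow (ID shown is a placeholder).
argo-cloudops logs <workflow-id>

# Fetch the status of the sync, e.g. so a calling system can react
# to success or failure.
argo-cloudops get <workflow-id>
```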