Welcome to Amsterdam and KubeCon + CloudNativeCon Europe 2023. Join John Furrier, Savannah Peterson, Rob Strechay, and U.P. Scott as theCUBE covers the largest conference on Kubernetes, cloud native, and open source technologies, together with developers, engineers, and IT leaders from around the globe. Live coverage of KubeCon + CloudNativeCon 2023 is made possible by the support of Red Hat, the CNCF, and its ecosystem partners. Hello everyone and welcome to KubeCon Europe. We are here in beautiful Amsterdam. In fact, it is so sunny that we've got natural light inside the building. They're even letting me keep my shades on, although I will say that's probably because my co-host, Upe, and our first guest, Amit, are so bright and brilliant. I've got to keep my shades on for that reason. Upe, are you going to bring shades tomorrow? I need to. It is so bright in here. It is so bright. It might be Amit's fault. You're a CUBE veteran. We loved having you on the show in Detroit. Now we're over here in Europe. How are you doing today? How are you feeling? Fantastic. Couldn't get better with the sunlight here. I know, people won't believe it. We're not under fluorescent lights. That in itself is pretty exciting. I haven't seen the sunlight for two weeks. So we'll see you outside in a few minutes. Precisely. Yeah. So we obviously know who you are and what you do, and we love having you. Love your energy. We, as a collective, have actually dubbed you the best hustler in the community, and we say that as a compliment. We love your hustle. Jeez, okay. I'll take it. So just in case the audience isn't familiar, give us just a quick pitch. Sure. Now that you're feeling awesome. I'm feeling amazing. So Kubiya, we've branded ourselves as ChatGPT for DevOps.
What that actually means, without all the buzzwords, is we're offering a conversational AI experience for end users, who can go and consume DevOps functions and platform engineering functions very easily using natural language, prompting in Slack, Teams, or any type of chat interface, and very easily attributing their natural language to automations, knowledge bases, and so forth. So very easily you can go and consume DevOps functions that typically would require a human in the loop, through a chatbot, a smart chatbot that's very conversational, tells jokes, very lighthearted. So that's... Just like us. Wow. Just like us. I feel like a chatbot now. Yeah. Are you a chatbot? Maybe. Yeah. So when we came out, that was kind of our initial thesis. Since then we've rolled out, and we're actually announcing this today at KubeCon, the operator experience. Right. Tell us more about that. Sure. Absolutely. One of the biggest challenges for platform engineering, DevOps, and SRE teams is rolling out and maintaining a platform, adopting a new platform, whether it's an internal developer portal, a service catalog, or really any type of workflow automation: how do you go and create and maintain these workflows? And we've created LLM embeddings with all of the new technology. Now you can go and prompt a workflow using natural language, it will go and generate that workflow with all of your business logic, with all of the access controls baked in, and now you can very much fine-tune it, drag and drop it in a low-code interface, play it back, simulate it, and then commit it to your organization within minutes. Okay. So you said a magic buzzword, low code. So this used to be a thing. Years ago everyone was talking about this, right? I mean, I think a lot of people are still talking about it. People are still talking about it.
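The flow described above — a natural-language prompt generating a draft workflow with access controls baked in, which an operator then reviews before committing — can be sketched roughly as below. This is a hypothetical illustration only, not Kubiya's actual API; the template table stands in for the LLM, and the names (`generate_workflow`, `WORKFLOW_TEMPLATES`) are invented for the sketch:

```python
# Hypothetical sketch: a natural-language prompt is mapped to a structured
# workflow whose access controls are part of the generated artifact. A real
# system would call an LLM here; this stand-in matches known intents.

WORKFLOW_TEMPLATES = {
    "restart service": ["lookup_service", "request_approval", "restart", "notify_slack"],
    "provision vm": ["validate_quota", "request_approval", "create_instance", "notify_slack"],
}

def generate_workflow(prompt: str, requester_role: str) -> dict:
    """Return a draft workflow for the prompt, with access controls attached."""
    prompt_lower = prompt.lower()
    for intent, steps in WORKFLOW_TEMPLATES.items():
        if intent in prompt_lower:
            return {
                "intent": intent,
                "steps": list(steps),
                # Access control is baked into the generated workflow, not bolted on.
                "allowed_roles": ["platform-engineer", requester_role],
                "committed": False,  # operator fine-tunes / simulates before committing
            }
    raise ValueError(f"no workflow template matches: {prompt!r}")

draft = generate_workflow("Please provision VM for the staging env", "developer")
```

The key design point from the interview is that the draft stays uncommitted until the operator plays it back and simulates it in the low-code interface.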
An interesting part here is there was a concept called the citizen developer riding that same wave, and I see it making a comeback in the cloud native space as well, where a lot of the stuff that we do is still fairly complex. You need to know a lot about Kubernetes, about storage, about networking, about cloud, the list goes on, to be able to even just operate something from day one. And so that complexity is still something that is sometimes too mind-boggling for users in an organization, especially if you factor in self-service, you know, giving people the empowerment to do it themselves. So if I understand correctly, this is kind of the problem space that you're solving, that you're looking at? The two sides of the coin are really the end user experience, how do you go and extend things very naturally to them so they don't have to either flag one of the operators in the loop and get their help, or otherwise use no code or low code in order to interact with the system? And for the operators, it's really velocity. How do you go and extend that without having all the maintenance and overhead that low-code systems would typically carry? Even if you want to go and create, as a citizen developer, if you use that term, let's use it, prompt engineering is the new citizen development if you want to be honest about it, but that's a different story. If you want to go and talk about low code as a citizen developer, you still need to have all the business logic, all the organizational knowledge and domain expertise. Even if you're not technical in terms of coding, you still need to have that operational knowledge. What better way to go and bridge that gap than to express your intent and have the system smart enough to recognize all that, align it, and, with the reinforcement learning aspect of it, let you just fine-tune it for your own needs. Yeah, exactly.
Yeah, that makes total sense, especially if you factor in kind of the amount of work that still needs to be done with boilerplate code. That's exactly what I was just thinking. There is just so much work that still needs to be done every time, and it doesn't necessarily add value, and like you said, the velocity of it is kind of the key aspect. Yeah. It's not necessarily, you know, an insurmountable mountain, but it is a lot of work, and just taking away that boilerplate, I think, is very, very valuable. Sure. So what are the use cases that you see your solution being used for specifically? What's kind of a... Oh yeah, talk about trends. Yeah. Yeah. We'll give a few flavors. One of them, really the low-hanging fruit, is just access to organizational knowledge. You can go and prompt it, so we train on Confluence, Notion, your own documentation, structured and unstructured data. It's really a data catalog, if you want to call it that. Now, we're doing our own embeddings using a vector database, and we can go into that in a little bit, but the concept here is, without having to know anything, without having to be thrown into wikis that may or may not be outdated, you just have a conversation with the interface, almost like a ChatGPT-like experience, only here it's Kubiya, or Kubi, as our mascot is called, and then it will either throw the documentation at you with links or give precise, very concise answers, even with actionable prompts. So you really don't have to know anything. Having a conversation with a human being, that's the closest thing you can have. You're expecting short, concise answers. You don't want to have a wiki thrown at you and read the Bible, that's not the point here. Or have to go to somebody with the legacy knowledge in an organization every single time. Data that exists in silos. Yeah. Exactly. So I love the velocity. I'm glad you just brought that up.
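The retrieval idea described here — embed the documentation, store the vectors, and answer a question with the closest concise snippet instead of a whole wiki page — can be sketched in miniature. A real deployment would use an LLM embedding model and a vector database; this stand-in uses bag-of-words cosine similarity from the standard library, and the document snippets are invented examples:

```python
# Minimal retrieval sketch: embed docs, index the vectors, answer a question
# with the nearest snippet. Bag-of-words stands in for real LLM embeddings.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: token counts (a real system would call an embedding model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

docs = [
    "To request a new Kubernetes namespace, open a ticket with the platform team.",
    "Deployments are rolled back with the pipeline's revert stage.",
]
index = [(doc, embed(doc)) for doc in docs]  # the "vector database"

def ask(question: str) -> str:
    """Return the most relevant snippet as a short, concise answer."""
    q = embed(question)
    return max(index, key=lambda pair: cosine(q, pair[1]))[0]
```

The point being illustrated is the interface contract: the user asks in natural language and gets back one concise answer, not the whole knowledge base.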
And ever since we met, I've thought about it multiple times: you save people a lot of time. Yes. My God. Yes. Can you talk more about that? How much time do you think you're really saving? If you can quantify that, I feel like you might know. When people try to quantify transactions, they usually look at machine-to-machine interactions. How much time did it take to trigger a CI/CD job? Right. That's actually the smallest portion of the time saving, typically 2 to 3 or 5% of it. Where is the time saving? The context gathering. Gathering and stitching different transactions and sub-workflows into each other, human in the loop. So take the example of one of our customers, who's in the digital media services space. What is he doing? He's doing more of a legacy type of transaction, where his project managers are bridging the gap with the technical team: having to go through file storage servers, getting an encrypted file, unencrypting it, generating a proxy, a small thumbnail of it, delivering it over Slack in a secure manner, uncorking it. All these things can take days because of the human element to it. Here you can do it within minutes, with approval flows and everything. Wow. That's nice. And there's another aspect of this that I'm kind of interested in. Yes, you optimize existing workflows, but I'm sure this unlocks some new potential too. There are new use cases, new things that we haven't thought of before that organizations now can do with this. Tell me more about that aspect. Great question. We're going straight to the punchline of where this is going, so let's start there. It's always good to start with the TL;DR and work backwards. What's the utopia in the world of large language models? You know what? I'm going to come with a statement here. What have large language models actually done for us, beyond all the buzz and all the cool little demos you're seeing?
It's bridged the gap in human-to-machine interaction for the first time, and this is where the thesis lies. Humans and machines can understand each other, and rather than having the human need to know code or low code to interact with the machine, the machine is now smart enough to interact in natural language and convert that into code. So that's uncorked all of the business potential. Now where are we going with this? We can now bridge all the gaps and all the little gray matter that still exists in there. A human will be able to go into a chat interface and interact as if they're talking to a colleague on the other side. Hey Fred, help me do XYZ. Now whether it's an existing or non-existing workflow, the system will be able to build that workflow, fine-tune it, possibly with the operator and with all the access and permissions, and then serve it in almost real time. That's incredible, and it's all optimized. I mean, it's pretty amazing. Precisely. Wow, okay, so you mentioned that your booth is an experience. Tell us a little more about that. Savannah, don't get me in trouble. I'm not going to get you in trouble. You said that we can take photos and stuff, you've got t-shirts, you were a big part of our swag segment in Detroit. I'm going to keep it PG here, don't worry. I understand the rules. We have a dancing robot. Oh, that's what you're going to... okay. That's not what's going to get me in trouble, but we do have a dancing robot. We do have... Are you the dancing robot? I wish I could say I could dance like the robot, but my kid would tell you otherwise. We have t-shirts, we have a very fun experience. Just show it off, give us a little chest here. Yes baby, love that. And we do have quite a unique experience with swag; all I can say is, just be protected, okay? You're stealing my line, so I love it, I love it.
I mean, we all want everyone to stay safe here on theCUBE, and let's just say, if you need to stay safe, folks, be sure and drop by the Kubiya booth. I'll be dropping by that booth later today, I can assure you of that. You probably should too; in fact, the whole staff should. Only if you tag it, so. I love it. What's next for you all? I mean, you obviously got to make an announcement today, but give us a little vision down the pipeline. So, one of the very interesting ideas that we've kind of accumulated with all the human interactions and human-to-machine interactions. What an interesting message. It goes beyond DevOps, so it goes beyond just cloud operations. Today that's really our focus. But a lot of the end users, we're discovering, are actually our customers' customers, so customer success and solution architects are actually using this to interface with their customers and to give them a better experience. So that's kind of next in line: how do we go and allow our ecosystem to leverage conversational AI, highly curated conversational AI, in their own workflows and really partner up. That's actually where we're going with this. I love that. So actually, now I'm curious since you mentioned that: has there been a use case, or users, that have surprised you, that you weren't expecting? Oh, gee. Yes. But see, this is really where it comes down. Every organization has their own domain expertise. So whether it's in the digital media space, as I mentioned to you, or in the dev tool space, a lot of customers are really seeing the benefits of the embeddings of the large language models and how that interacts with the user base. Beyond the humor that exists already in the bot, we're actually seeing a lot of internal organizational workflows. Other people are interacting. Their CFO is interacting with them.
So for example, one of the use cases is, and I have to be very careful because I don't want to call myself out for replacing or displacing other people's income, but cost savings around license management of an ISV. Oh, yeah. So, I'm not going to use their name, because then they're not going to partner with me any longer, but imagine they're not getting any free press. Imagine my users are saying: my next tier is going to cost me, instead of $20 per user, $100 per user, and I need to roll this out across 20, 30 different users. Can we go and use Kubiya to be the interface, and keep the licensing to a lower tier? Now, with an approval flow and attribute-based access, the request will go through a single point of contact with a smart approval flow, and that very easily optimizes the workflows and saves many thousands of dollars. I was just going to say, cost optimization is such a value. So that actually extends into the FinOps movement as well. We're now in a phase where cost is starting to matter. We're past "cloud can cost anything"; we're rationalizing. And so instead of doing that work manually, there is a big opportunity for you to go in and help customers automatically save and control costs. So think about how you can go and provision a machine through Slack, right? Like you would with the AWS interface. But all of a sudden, if there's a budget breach above $50 per day, that has to go through another approval flow. So now it's really guardrails and hygiene that you can implement without having to overextend yourself going to different consoles, context switching, all the things that typically happen in an organization. Too much effort means people aren't doing it, right? And that's the idea. We're trying to democratize that and give that type of sentiment across the organization. Well, we certainly see that you're democratizing a lot at your booth this time around. Long live democracy. Well, so you're now a KubeCon veteran.
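The guardrail pattern described in the provisioning example — self-service requests go straight through unless they breach a budget threshold, in which case they route to an approval flow — can be sketched as follows. The $50/day figure comes from the interview; everything else (function name, approver, policy shape) is an invented illustration, not Kubiya's actual policy model:

```python
# Hypothetical sketch of a FinOps guardrail: auto-approve cheap provisioning
# requests, route budget breaches to a single approval point of contact.

DAILY_BUDGET_LIMIT = 50.0  # dollars per day, per the example in the interview

def route_request(instance_type: str, est_cost_per_day: float) -> dict:
    """Decide whether a provisioning request auto-approves or needs sign-off."""
    if est_cost_per_day > DAILY_BUDGET_LIMIT:
        return {
            "instance_type": instance_type,
            "status": "pending_approval",
            "approver": "finops-team",  # the single point of contact
            "reason": (
                f"estimated ${est_cost_per_day:.2f}/day exceeds "
                f"${DAILY_BUDGET_LIMIT:.2f}/day budget"
            ),
        }
    return {"instance_type": instance_type, "status": "approved"}
```

The value claimed in the interview is that the policy lives in the chat workflow itself, so nobody has to context-switch into a cloud console to enforce it.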
We'll hope to have you back on the show when we're in Chicago. Well, will we see you in Chicago? Will we be there? 100%. 100%. I love this. So obviously you've got the big announcement. What else are you excited about at the show this year? Oh man, you know, there's so much going on. Even though we're in a very tough economy, and I think everybody can relate to that, right, nothing's going away in terms of the customer needs. The customer needs, if anything, are only extending themselves and increasing: the need for provisioning infrastructure in a smarter way, the need for access requests and controls, tightening the hygiene, the DevSecOps movement. None of that's going away. Even with the ChatGPT movement, all of that gray matter is starting to float up, right? How do you go and fine-tune models? How do you deal with the regulatory issues? All of these things we're really going to go and start attacking front and center. Question for you. And I know we're not going to play buzzword bingo, but I'm curious about your particular angle on this. Do you think that ChatGPT has made the conversation, and I realize what you do is very different, but has it made the conversation easier for you to get the message across? 100%. Yeah. It's done all the market education for me. It's also given all the tailwinds for the bandwagon, which increases the noise, and now you really have to rise above that noise, but really it's done more good than harm. Yeah, a little bit of noise in the space right now, just a little bit. It's very easy to separate yourself when you actually show, not just a demo environment, but actually prove it, and we have the credentials to prove it to folks. And we have a new playground we just announced yesterday. Tell us more about that; I like playgrounds, let's hear about it. The playground is the end user experience.
It's not yet the generative AI experience of the workflow automation, but you can go and really sign up. We had, in the first hour, a couple hundred sign-ups. People are playing around with it, having a conversation with the chatbot, rated PG of course. It has some jokes, but it's very sensitive to today's political issues and so forth. We'll have to check that out. Yeah, yeah, yeah, we're gonna have to test that. Yeah, that feels like a test for us, for sure. For sure. Is there anything, this is actually, now that we're in this category, is there anything that makes you nervous about all the buzzword bingo that's going on right now? I think, not the buzzwords per se, but this is the first time I can remember, and I'm sure others can attest to that, that large organizations, the Microsofts of the world, AWS, Google, have moved at the speed of startups. This is unbelievable, it's unheard of. So. That's a really good point. That's an excellent point. Meanwhile, we're still ahead of the curve, because we've already been in this space, six months ahead of the trend. Nice timing, just go ahead and dust your shoulders off on that one, well done. There's always a pause for hesitancy: do you partner, what do you expose, how much do you reveal, how much do you keep in-house in terms of domain expertise and knowledge, when you integrate and overlay on top of these platforms. So that's always going to be something we're going to have to play the trade-off game with. I think that's a really, I love that just as a benchmark: the competition and the market have demanded the acceleration of some of the largest organizations on earth. It's an arms race, and it's between themselves. They don't see startups as being a factor yet, they're just competing with each other. Well, maybe they should start looking at the startups as... If they view theCUBE, they'll know. Yeah, yes, I love that. There we go, there we go.
On that note, on that note, wow, what a good closing note. On that note, thank you for being here next to me. And thanks for coming on again, Amit, I love this. I just, I feel like it's not KubeCon unless you're here with us on theCUBE, talking about Kubiya. So, I mean, if that's not theCUBE, I don't know what is. If that's not meta, I don't know what is. Wow, thank you all for joining us here in Amsterdam. My name is Savannah Peterson, and we are the leading source for high-tech coverage here on theCUBE. We'll see you this afternoon.