Welcome to theCUBE's coverage of KubeCon EU 2024, live from Paris, France. Join hosts Savannah Peterson, Dustin Kirkland, and Rob Strechay as they interview some of the brightest minds in cloud-native computing. Coverage of KubeCon + CloudNativeCon is brought to you by Red Hat, the CNCF, and its ecosystem partners. theCUBE's coverage of KubeCon EU 2024 begins right now. Good evening, nerd fam, and welcome back to KubeCon + CloudNativeCon here in wonderful Paris. My name is Savannah Peterson, joined by a really brilliant trifecta of men here. I've got Rob Strechay, who's been killing it in the chair all day. We've got Dustin, who we are so lucky to have, and I feel blessed to match. And our fabulous analyst guest, Sanjeev. Thank you for being here. How's the show going for you so far? It's been outstanding. I am so happy to be here in Paris, of all the places. Of all the places for a work trip. I know, it's the best. Yeah, so the funny thing is, everybody was so jealous that I was going to Paris, except once I came here, Nvidia GTC took off in San Jose, and people were asking me, why are you in Paris? Why aren't you in San Jose? I'm like, come on. Paris? I mean, San Jose? San Jose, that's an easy call. Yeah, exactly. My mother, actually, in one of the more flattering things she's said to me recently, wondered why I wasn't interviewing Jensen this week, and I thought, well, you know, I've got to go to Paris. Someone's got to go to Paris. Someone's got to. I mean, somebody's got to do the hard work, you know? Yeah. But I think it's been great from an atmosphere perspective. We've really been seeing people lean into how they're helping build the communities here. We had Dynatrace on earlier today, and they were talking about how they've built a secure distribution of OpenTelemetry that you can use without having to use Dynatrace. You can just use it.
And I think people are leaning in on that type of stuff. As I was saying before we went on, the Data on Kubernetes day zero event yesterday was super interesting, because we always talk about the intelligent data platform, and they were showing how all the different open source pieces fit together and how to make it easy. Now, we both know it's not that easy, but that sharing of knowledge this week, even just into day two, or day one official, I guess you could say, has been fantastic. Yeah. I feel Kubernetes' time has arrived. This is a very unique moment in its history. First of all, it's been 10 years; in June, it'll be 10 years since it was announced. But what's happened is that, with AI, I think Kubernetes is in love, and being in Paris is the right place for it; there's a romance going on. Kubernetes is in love with AI, you know? This is an amazing analogy. I did not know where you were going with this. I hadn't associated Kubernetes with love, although I kind of have a love story related to Kubernetes, which is ironic, and now I'm just going in a little circle, but I think you're right. Dustin and I just had the opportunity, when we were chatting with Priyanka, to talk about how Kubernetes is having its Linux moment. Yep. And I think, yeah, it's cool to see it become this ubiquitous. I mean, even the difference from earlier KubeCons: I started working in container management with Kubernetes in 2020, and just in the last four years, I mean, back then it was like, maybe we're going to do this, everyone's still using Docker, we're using all these different tools. And now it seems like there's been a whole industry acceptance and agreement, and, to our earlier conversation, standardization, essentially, that we're using Kubernetes. Yeah, one of the interesting things about all the AI/ML workloads that we're talking about is the flexibility of Kubernetes, the platform itself.
I mean, 10 years ago, Kubernetes was the sexiest way to run web apps and databases, and to move off some of the more closed source platforms-as-a-service onto something you could self-host: you can run it in your own data center, you can run it in a different cloud or some other cloud. That was great for the RESTful web app, but that same system is now also working with GPUs, able to time-share and slice and stack workloads on those GPUs, all through a scheduled container. I mean, that's pretty remarkable. It is, it is. Yeah, so what's happening is that because AI works on tremendous amounts of data, which requires GPUs, Kubernetes now has a new purpose in life: how do you provision massive infrastructure and automate self-healing, all the things Kubernetes was good at? And now it's found a purpose, which is helping with GPU farms. And to me, that's the Linux moment. Linux to me is ubiquity. It's running on phones and laptops and desktops and servers and space shuttles, everything you can possibly imagine. For something to have a Linux moment, it's got to be as ubiquitous as that in terms of its applicability. So I think one of the interesting things that we haven't hit on yet, and that hasn't really been talked about too much, we only briefly hit on it, is that Kubernetes is getting helped out a little bit by the VMware Broadcom acquisition as well, because you have KubeVirt, which runs virtual machines on Kubernetes, natively. And that one is really lying there under the covers here today. I haven't heard a lot about it this week, but I think that is where they're not cloud native apps, they're actual VMs living in harmony with cloud native apps. And I think that'll be interesting to see how that plays out over the course of the week, because that to me is the...
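The GPU flexibility described here comes down to the scheduler treating accelerators as requestable resources: a pod asks for `nvidia.com/gpu`, and with the NVIDIA device plugin's time-slicing configuration, several such pods can be packed onto one physical card. A minimal sketch, building the manifest as plain Python dicts (the `nvidia.com/gpu` resource name comes from the NVIDIA device plugin; the pod and image names are made up for illustration):

```python
import json

def gpu_pod(name: str, image: str, gpus: int = 1) -> dict:
    """Build a minimal Kubernetes Pod manifest requesting `gpus` GPUs."""
    return {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {"name": name},
        "spec": {
            "containers": [{
                "name": "worker",
                "image": image,
                "resources": {
                    # Extended resource exposed by the NVIDIA device plugin.
                    # With time-slicing enabled on the node, one physical GPU
                    # can satisfy several of these one-GPU requests.
                    "limits": {"nvidia.com/gpu": gpus},
                },
            }],
            "restartPolicy": "Never",
        },
    }

if __name__ == "__main__":
    # Hypothetical names; you'd apply this manifest with your client of choice.
    print(json.dumps(gpu_pod("llm-infer", "my-registry/infer:latest"), indent=2))
```

The point of the sketch is that the workload never names a node or a physical device; it states a resource need, and the scheduler does the placement.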
It's an interesting observation, and I'm glad you brought that up in one of our earlier segments, because it is interesting how, this is a grassroots-style community, everyone here is a builder, a contributor, across hundreds of projects, but we don't always highlight how the enterprise decisions, the big business decisions, affect projects and communities like the CNCF. So I really thought that was an astute observation on your part. Just as an energetic thing, I mean... Oh yeah, high energy. Right, I mean, it is 6 p.m., it is the end of a long day with many interviews, and I'm still grooving. It's hard for me to believe it's already six; I wouldn't know it if there weren't cocktails happening behind us right now, which I'm also looking forward to. The vibe here, I wish I could bottle it up and share it with y'all at home, it is so unique, because the energy is, it's people who are building the future. And so I'm curious. You've been an analyst for a hot minute, you've got some experience and some depth, you don't look it, and I meant that as a venerability compliment, in terms of your brilliance. But what are some of the projects, or use cases of some of the projects, that really excite you? One interesting thing that I've seen is that we know there's a scarcity of GPUs, and GPUs are very expensive, if you can even buy the GPUs, you know? I was going to say, I don't know what you're talking about. Is everyone going to talk about GPUs this way? Yeah, it's such a scarcity. So now there are different options: doing LLM inference on chips that are not GPUs, doing it on the edge, like you mentioned, like on your laptop, you can run the whole LLM. Which is wild. It's wild. It is, yeah. The amount of processing we can do on the edge now is a huge leap that I don't know we've talked about enough, maybe just because I'm excited.
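Running a whole LLM on your laptop, as mentioned, is typically done today through a local inference server such as Ollama, which exposes a REST endpoint on port 11434. A hedged sketch against Ollama's `/api/generate` endpoint; the model name assumes you've already pulled it, and the daemon must be running for the actual call to succeed:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model: str, prompt: str) -> dict:
    """JSON body for Ollama's /api/generate endpoint (streaming disabled)."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """One inference round-trip against a locally running Ollama server."""
    body = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama pull mistral` and the Ollama daemon running locally.
    print(generate("mistral", "In one sentence, why run inference at the edge?"))
```

Nothing here leaves the machine: the model weights, the prompt, and the completion all stay local, which is exactly the edge story being discussed.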
But I think that also highlights the fact that Ollama and Mistral were on stage this morning, which we didn't really talk about in our opening, and that people are looking for these models to be open. And I think that's a really interesting point as well, because you look at Meta, you look at some of the others that are open sourcing, and it makes a lot of sense, because you have to trust the model that you're then going to train, fine-tune, and use for inference. And since the CNCF is all about open source, having open source models and the CNCF come together is actually the right thing to do. And the other interesting thing I picked up on is that when the whole DevOps movement started, the developer and the operations communities came together, and I see some semblance of that happening here now, because you've got the AI engineer, you've got the operations people, you've got people doing research, and they're separate people with separate tools. But the thing is that the AI engineer doesn't give a damn what infrastructure is being used, or how to even provision that hardware. Yeah. So there were some announcements from Intel and Mistral about dynamic resource allocation, so you don't have to hard-code and say, give me two NVIDIA A100s. That's kind of the flip side: yes, there is a GPU scarcity problem, acquiring the GPUs, getting the hardware, but there's also a GPU underutilization problem. I was going to say, the optimization is a really big challenge. Yeah, and in just the technical track this week, there are a number of changes that have happened in Kubernetes itself around the schedulers. The NVIDIA-sponsored part of the keynote this morning talked about four different layers of abstraction that allow for denser packing of workloads and higher utilization. So I think both of those are part and parcel. They go together. Correct. Yep.
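The dynamic resource allocation idea being described contrasts with hard-coding device counts: classically, a pod pins an exact number of GPUs as an extended resource, whereas with DRA the pod references a claim that a resource driver satisfies at scheduling time. A rough sketch of both shapes as plain dicts; note the DRA API was alpha around the time of this show (`resource.k8s.io/v1alpha2`) and its schema has since evolved, so the class and claim names here are purely illustrative:

```python
def hard_coded_pod() -> dict:
    """Classic extended-resource request: the pod names an exact GPU count."""
    return {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {"name": "train-fixed"},
        "spec": {"containers": [{
            "name": "train",
            "image": "train:latest",  # hypothetical image
            # "Give me two GPUs" is baked into the spec.
            "resources": {"limits": {"nvidia.com/gpu": 2}},
        }]},
    }

def dra_claim() -> dict:
    """Dynamic resource allocation: a standalone claim that a vendor
    driver and the scheduler resolve to a concrete device later."""
    return {
        "apiVersion": "resource.k8s.io/v1alpha2",  # alpha-era API group
        "kind": "ResourceClaim",
        "metadata": {"name": "gpu-claim"},
        # The class, not the pod, encodes what kind of accelerator is wanted,
        # which is what lets the platform pack and share devices more densely.
        "spec": {"resourceClassName": "gpu.example.com"},
    }
```

The practical difference is who decides: in the first shape, the author of the pod spec; in the second, the driver and scheduler, which is what opens the door to the denser packing and higher utilization mentioned above.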
Yeah, no, I think it's a good point that you bring up. And coming back to this Linux moment thing, it feels like we're about to, as a community, hit a hyperscale moment. Yeah, I see that. I feel like our conversation in Salt Lake, for example, when we're at KubeCon in the U.S., will potentially be quite different. We will have a lot more of these applications out in the wild, serving real people, not just us nerds sitting here hanging out and talking about it. Yes, I think you hit the nail on the head. What benefit the business derives, for an Allianz or a Mercedes-Benz, is the key to success, not which technologies, because we love technology. We spend all day tinkering around, but if it doesn't serve the business, then what's the point? It just costs money at that point, too. Getting into AI is not cheap. And we even had ABB on earlier today, and we were discussing using Kubernetes at the edge in disconnected ways, and looking at how, for packaging and distribution, a container is a really fabulous way to do that. But then you also have to have supportability, upgradeability, reliability, and security, which is actually under-talked-about here this week. I think we'll hopefully do a better job with that tomorrow, with some of the people we have on, but I think it's one of those things where they have CloudNativeSecurityCon, now slated for June, so they've kind of pulled back on the security messaging here. I think security has to be built in from the beginning, especially with this, and we did talk SBOMs and supply chain and all of this earlier today, which I think is going to be another theme for the rest of the week as well. I think that's a good observation, I do. I do think we'll have more conversations about it. I think the other thing is, everybody in this room knows security matters, so it's less of a conversation, if that makes sense.
There's kind of this baseline knowledge. Yeah. I was walking by and saw a line about a block long of people trying to get into a session, so I asked them, what session is it? It was pitfalls or something about security. Yeah. Oh, yeah, that's an interesting observation. It is table stakes, but it's table stakes that, when they aren't there, boy, it manifests itself in big ways. So let's talk about that. I mean, with the ubiquity of where we're at with Kubernetes now, a security problem in Kubernetes affects the world's 10 largest brands. Priyanka was sharing that. It's a super good point. You can't take it for granted. We really can't take it for granted. And things get messy real quick when we're moving at this velocity and scale, especially with things like AI. The tiniest breach could lead to absolute chaos pretty quickly. Right. Yeah. There's a lot of discussion and questions on ethics and responsible AI here. Yeah, yeah, yeah. That's a very important topic. And in fact, I've not seen a lot of coverage of AI governance. AI governance to me is a huge topic, and I have not seen that yet. Like, what is Kubernetes' role in ensuring AI governance? Missing. I think it's like security; there's less emphasis on that topic. Well, quite possibly. I mean, here we are, a community of open source people who have agreed to a certain code of ethics and standards. It's exactly what we're going to need to do with AI. And we're in Europe, which tends to come together to create standardization in a very different way than the United States does. So that's a really interesting point you just brought up. I'm going to be noodling on that all week. Yeah. And building off of that, you just had the passage of the AI Act. Yes. In the EU. 10 days ago. Yeah, exactly. You start to look at how people, and we've actually had a couple of conversations about it, but it was, how do you have AI for good?
That was one of the quotes that came out of this morning. And I like it, because people here are very positive about AI, versus being scared about AGI or what have you. Yeah, one place I think the CNCF and the Linux Foundation, this group in particular, could offer real guidance is on some of the licenses. What are the license implications around the generated code, the data sets, the inferences? Right now it's kind of the wild west, and it can get a little messy. I mean, Priyanka was talking about that in our last segment. No, I think that's really good. So, since I come from the data side, I see things slightly differently. I see two gaps. One we just talked about: security and governance; I think there's not enough. The second thing I think they've missed is the importance of metadata. Why I say that is because the CNCF, through a working group, published a platforms paper, and Priyanka mentioned it with a QR code: please read our platforms paper. I went through it line by line just to prepare myself. They talk about the advantages of Kubernetes as a platform, and the layers, just like we talked about the data, but there's no mention of metadata. And I think Kubernetes has a huge opportunity to bubble that up, because once you have metadata end to end, you control it. You have full transparency, full visibility into that ecosystem. And they missed it in that paper. And what's interesting is, if you go around and talk to some of them, from Spark to Trino to Presto, that paper talks about that kind of being the layer that does that, and they kind of leave it at that. And I think, as we know with an intelligent data platform, you're going to have silos of data, but you have to have federated metadata and governance across them, and that, to your point, I don't know will get solved here.
And I think that's one of those things where it's going to be interesting to see if something does start up around that, because a lot of the organizations that have sponsored these are looking at that as the value add they're bringing, specifically on the governance and the metadata management. So this is my theory: any company that owns the data owns the customer. So AWS, Snowflake... Snowflake wants everything to come into Snowflake, because once data is in Snowflake, where are you going to train? You have to train where the data is, because at petabytes, exabytes, you cannot move data around. Now, in the world of Kubernetes, you don't own the data. Kubernetes is not going to own the data, but it can own the metadata. And that transition has not happened in the CNCF, because they're not data people. So hopefully, if they watch this, they'll be like, ooh, let's add metadata to our platform. Someone's going to make a great business model out of everything you've just talked about. It'll be the 115th sandbox project. So. Yeah, there you go. Well, the sun is quite literally setting here on Dustin's face and on the end of day one. Rob, such a pleasure to sit here with you all day, Dustin as well, and so nice to have your insights. Such great call-outs. I've been following your videos. I was hoping you'd come with your Rubik's Cube earrings. First thing. I'm glad I didn't let you down. Yes, you didn't. Thank you so much for having me on. Yeah, anytime. It's absolutely great. We've got an absolutely power-packed day for you on day two. We've got Docker. We've got Microsoft. We've got a Wasm panel. More guests from Red Hat and so much more. Thank you all for tuning in to theCUBE's live coverage from KubeCon + CloudNativeCon here in Paris, France. My name's Savannah Peterson. You're watching theCUBE, the leading source for enterprise tech news.