Hi, this is your host, Bheem Bhartiya, and welcome to TFiR: Let's Talk. Today we have with us George Castro, developer relations at CNCF. You're at the AI.dev conference. I was supposed to be there as well, but I had to change my plans at the last moment, so I'm missing the conference. I want to hear from you: how has the conference been so far?

It's been amazing. It feels like I've learned six months' worth of technology in about a day and a half, not even counting the hallway conversations. It's always good to see large organizations that are doing AI at scale share their expertise. There were interesting talks this morning from NVIDIA and Amazon, and Hugging Face was on earlier. Just seeing the scale at which people are doing cloud native deployments with AI on top has been really striking. You know it's happening, but when you get into the technical details and someone in a talk starts walking through the numbers, once you see all the zeros at the end you start to understand the level of scale we're talking about. Even in a cloud native context it's still much larger than we've seen in the past, and that's always very exciting.

Today's main topic is generative AI, of course, because of this conference. When I was at KubeCon + CloudNativeCon in Chicago, generative AI was a major discussion point there. A lot of companies are using generative AI in products and in production. I want to hear from you: what does generative AI mean for cloud native, and at the same time, what does cloud native mean for generative AI?

First of all, there are organizations that have built their AI on top of Kubernetes: Bloomberg, CERN, OpenAI, NVIDIA. I already mentioned Hugging Face. So there are organizations out there that have been doing that, and it's due to Kubernetes' extensible nature. We're always talking about it being API-driven, being able to extend outside the core primitives, and things like that.
Now that these things have been in production for a few years (I think CERN's demo two years ago was touching on all of this), and now that these end-user organizations have figured it out at that level of scale, because you can't figure it out unless you're doing it at scale, now is the time when there's interest in bringing those primitives and lessons learned back into Kubernetes, to give people the more out-of-the-box, batteries-included experience they expect from invisible infrastructure. So in Kubernetes itself you're seeing a bunch of KEPs, which are Kubernetes Enhancement Proposals, over the past year, and you're going to keep seeing more of them, around dynamic resource allocation, batch scheduling, and things that remove a lot of the complexity of getting high-throughput, low-latency workloads in there. AI is a natural fit for that. As Kubernetes turns 10 next year, we're moving from the mindset of web apps to workloads that look more like AI. Clayton Coleman said that inference is the new web app. That was mentioned during one of the keynotes at KubeCon + CloudNativeCon in Chicago, and it has become one of the mantras that core developers and people involved in the core keep repeating to themselves: as AI has come to take over, end users are pushing cloud native in this direction through the extensibility of the API. For us, it's about grabbing those common patterns and putting them upstream in a way that commoditizes them, so that end users can get the economies of scale that made them choose cloud native to begin with. So yeah, it's a very exciting time right now.

Time passes by so fast. It's been almost 10 years since Kubernetes came to exist, it's been used in production, and there are so many awesome use cases, like CERN and thousands of others.
I feel that Kubernetes is a technology of the same magnitude as the Linux kernel. It's a very mature foundation for organizations to build their products and services on top of. Can you talk about this maturity of Kubernetes and the CNCF-hosted cloud native technologies?

Yeah, so consider Kubernetes like the kernel. Kernels are very interesting in the early days, and then eventually people care about the application. As Kubernetes gains these primitives and fades into the background, we start to look at other things in the cloud native landscape, projects like Kubeflow or Volcano. There are 174 projects in the cloud native landscape, and all of them are now starting to find new niches and new uses for what AI users are looking for. Everything from Kyverno to OPA, that's policy you can run on your cluster, and you're going to want that for your AI workloads as well, right? For a lot of them, once the bottom kernel-level primitives are handled, they're already leaning forward in the saddle, and many of these projects are being used in production. Kubeflow just had a release in November and has been used in production for a very long time; it's one of the earliest AI projects on top of cloud native. And now AI in general is hitting production in so many places, so now is the time for those projects to shine. For us, that's where the importance of healthy community processes comes in, to ensure that these projects remain sustainable from a contribution perspective, because the more production users you get, the more issues you're going to find. That virtuous cycle of contributors and consumers, we need to make sure it stays healthy, and that's pretty much my day job, right?
Make sure the contributors are finding places where they can express themselves, contribute, learn, and do what they want to do, what they get out of open source; that's the reason you contribute. But also be able to provide value to those end-user companies that want to consume the projects, and do that together in a way that's fair, neutral, fun, and inclusive for everybody. That's kind of the mission.

And as organizations are embracing generative AI with cloud native technologies, what are some of the pain points they have to deal with, whether it's hardware, infrastructure, scalability, or security? And how is the CNCF community helping them, or how can it help them?

There are three areas. The first is hardware enablement: GPUs, TPUs, dynamic resource allocation. There's a Kubernetes Enhancement Proposal for that which is very busy; that's where a lot of activity is happening right now. There's also been a lot of work in batch scheduling that has landed over the past 18 months, and that's been kind of quiet. I was surprised when I discovered it; it felt like it was flying under the radar, and it's important stuff. Making that stuff simpler for people is something Kubernetes has always struggled with, right? Making it simpler while keeping it powerful and extensible sometimes feels like a pair of mutually exclusive goals. And the last bit is the applications themselves. I think things like Kubeflow are just the beginning of applications on top. In the Kubernetes space we're still concentrating on keeping the field fertilized and ready to go, so that when the seeds come they're healthy and ready to grow. We're still in the early days of this, where we need to get the hardware enablement stuff sorted out in a way that, well, you shouldn't have to be CERN to get this stuff, right?
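To make the dynamic resource allocation work mentioned here a bit more concrete, the sketch below shows roughly what a DRA-based GPU request looked like under the alpha `resource.k8s.io` API. This is a hedged illustration, not the final API: the shape has changed across Kubernetes releases, and the resource class name and image are placeholders.

```yaml
# Sketch only: DRA alpha API (resource.k8s.io/v1alpha2); field names
# differ in newer releases. "gpu.example.com" and the image are placeholders.
apiVersion: resource.k8s.io/v1alpha2
kind: ResourceClaimTemplate
metadata:
  name: gpu-claim-template
spec:
  spec:
    # A vendor driver (e.g. a GPU DRA driver) would define this class.
    resourceClassName: gpu.example.com
---
apiVersion: v1
kind: Pod
metadata:
  name: training-pod
spec:
  containers:
  - name: train
    image: registry.example.com/trainer:latest
    resources:
      claims:
      - name: gpu        # references the pod-level claim below
  resourceClaims:
  - name: gpu
    source:
      resourceClaimTemplateName: gpu-claim-template
```

The idea, compared with the older `nvidia.com/gpu: 1` extended-resource style, is that the claim is a first-class object the scheduler and a vendor driver can negotiate over, which is what enables the smarter placement the KEP is after.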
And one of the great things about the model is having end users participate in open source as closely as they have been. A lot of these organizations have been deeply involved with open source, so you get this stuff faster, and for us it's about making that process as efficient as possible. That's why we do it all in the open. So when I mention a KEP, anyone in the community can participate. You can listen in; there are no secret meetings, as far as I know. You can just participate in the meetings and all the issues and things like that, and that's what people want to see. When there's a transformative technology like this, there are always people who understand that it needs to be open and collaborative, and that that's the way to move an industry forward. The way we do that in open source is just by doing it, right? Here's the GitHub issue, you can subscribe to it and participate. And the organizations leaning forward in the cloud native space are the ones really able to accelerate their own efforts, so they can build what they want.

It becomes a bit tricky when you talk about open source as well, because it's not as simple as the LAMP stack, where there were four or five components and they were all open source. The Biden administration also came out with an executive order to help ensure that generative AI is safe for human consumption. So can you talk a bit about open source and generative AI?

I think it's a whole new set of people with a whole new skill set. Here at the cluster level, I come from an ops background, container nerds doing that kind of thing. But I've met people at this conference, students, data scientists, an entirely different set of people who bring an entirely new outlook on how to solve a problem. Because we can't really design a thing by ourselves, right? We can't just make a thing and say, here, consume it.
That's a process that involves both the end users and the people producing it, at different levels: from the cluster hardware level all the way up to the people consuming the projects I talked about earlier. And I think it opens things up for a bunch of people who are less technical when it comes to the operational stuff. We want them to do their science, not worry about the scalability of their cluster and things like that. So I think it's a tremendous opportunity to bring a new, diverse set of skills into open source, and that's always exciting. You and I have been around a while, right? Open source started with a bunch of Linux nerds and things like that, and the question has been: when is open source going to reach the rest of the scientific fields in the world? With data science, that's happening right now. Being able to see people who might not have any knowledge of the cloud native stuff, but have deep knowledge of AI, it's a good melding: they have a place they want to go, we have the tools, and we just need to figure out how to get them there. And it's the same problem for them, except the other way around. That leads to a lot of interesting discussions, and to seeing familiar faces but also a bunch of new faces who have never experienced open source. If this is their gateway into open source, you know me, I'm all about that. That's how we move forward.

When we look at generative AI, we do talk about developers, and even at KubeCon + CloudNativeCon we had a lot of discussions about the whole developer experience. When we look at CNCF and your role in developer relations, what plans do you folks have to enable, empower, and engage developers with all these open source projects you host?

Yeah, so for me, it's all about efficiency, right?
We built a lot of the open source processes, and things like Kubernetes and cloud native, in that first decade. Now it's about making them even more efficient. Before, you could afford to hang out only in the cloud native area, but something like AI brings in all sorts of different foundations and software projects. Now PyTorch is involved, a totally different foundation. Apache is an important player in this space. There's the CNCF itself and the rest of the Linux Foundation projects. All of a sudden you have this common thread of AI tying it all together, and each part of the stack is getting exercised now. It almost feels like in the first 10 years you had different pockets of technology evolving at different rates, built around that open source substrate. But now, because AI has to run at scale to succeed, you can't do this without scale, I'm starting to see a lot more cross-collaboration, people wearing multiple hats across multiple organizations and trying to tie it all together. You might see someone with one foot in cloud native and one foot in the OpenSSF, or one in AI and one in cloud native, and then another person with one in all three, all working toward the same mission: how the technology can serve end users in a way that's sustainable for contributors. A lot of it is common patterns, and it comes down to human organization more than the technology itself. That's our generational challenge, I think. Being able to scale that across multiple organizations and multiple tech stacks in a way that's sustainable is our next 10-year goal. All of the content here, all of the projects I've talked about, all of the talks here are available online. You mentioned growing up with the LAMP stack, right?
I remember struggling for days trying to figure out your LAMP stack. These days it's the opposite problem: you're overwhelmed by the amount of software and content out there. The amount of expertise available on the YouTube channels for this conference alone is immeasurable. If you're a student, or you're just getting started in this technology, we're here, you know? We're documenting everything the best we can, and we're just looking for more people to dive in and get involved. All of these projects are always looking for contributors, writing things like contributor guides and running mentorship programs. That's always front of mind for us. So don't be afraid to dive in. It's always a learning process, and we strive to be a place where you can learn and do what you want to do. Build what you want.

I don't even have to mention it, but the fact is that cloud native is complicated; it was not meant to be easy. When it comes to generative AI, how do you see the whole CNCF ecosystem helping customers deal with this complexity?

That's a hard problem, right? You see all this stuff and you want it. Sometimes you go to a conference and all the cool kids are using this stuff, and you wonder: are we behind? How are we supposed to consume this? A lot of it is just having the foresight to dive in. What I like to tell people is that being involved doesn't mean you have to figure it all out by yourself. In a lot of places, just being in the room, keeping track of where things are going, and being smart about where you put your time and where you want to contribute goes a long way.
So what I tell people is: find out what's most important to you and just get involved. You'd be surprised how much you learn after a while just by reading the release notes of software you might be interested in. And coming to a conference is one of the great places to get up to speed with people who have been in your shoes before. Everyone at these conferences has been a novice at some point. We've been around long enough, and know enough about open source sustainability, to know that the only way this works is by ensuring that everybody is learning from each other. So don't be afraid to say when you need help or when you want to spin up on a technology; there are plenty of people available to help you with that. Because this really is one of those things where, as an industry, the health of the whole ecosystem and how people consume this technology is part of the open source model. We can't have a healthy ecosystem unless this is front of mind, and that's why we take it seriously. If you're not having a good time consuming this stuff, then I'm going to work a little bit harder to make sure you can. When everybody has that attitude, we move forward, we iterate, and we move as fast as we can.

You folks have special interest groups, or SIGs, and working groups. Talk a bit about whether there are any special interest groups focused on generative AI.

A lot of the things in the CNCF are there to support those AI workloads, so there are a few SIGs and working groups we should mention. SIG Node in Kubernetes is a hot topic right now; that's where people are looking. There are two batch working groups, one at the Kubernetes level and one at the CNCF level.
And I sit in on a lot of CNCF project meetings, and there's an AI topic being discussed in almost every one of them, probably every single day, and I'm only attending a fraction of those meetings. So it's one of those things where you find the thing that interests you, and if you sit in a meeting, someone is going to bring up AI at some point. That's where you can start, right? That's the thread you can pull on to get started.

When I look at the adoption of generative AI, and the apprehension and fear around it, I think it was the same in the early days of the Linux kernel, or when Docker containers came out, or when we started talking about everybody moving to the cloud, with all the apprehension about cloud or Kubernetes. So it feels like we're just going through the same cycle with generative AI, and things will mature over time. George, thank you so much for taking the time today to talk about generative AI in the context of CNCF, Kubernetes, and cloud native. Thanks for all those great insights, and I would love to chat with you again soon.

Thank you. Yeah, hope to see you in Paris. Thanks.