Host: Hi, this is your host, and we are here at KubeCon in Chicago today. We have with us Michael Maxey, VP of Business Development at ZEDEDA. Michael, it's good to have you on the show.

Michael: Thank you for inviting me. It's my pleasure to be here today.

Host: I have covered ZEDEDA in the past, but it's a good idea to remind our viewers: what do you folks do?

Michael: ZEDEDA provides edge orchestration and management. What that means is that you can deploy workloads to edge devices running outside your data center, and we manage the whole life cycle of that application.

Host: Excellent, thank you. You folks made an announcement here?

Michael: We did; we had a very exciting announcement this morning. We announced a managed Kubernetes service. We've had customers deploying Kubernetes on the edge for about three or four years now, but mainly in a bring-your-own fashion. Now we're offering it as a service: you can buy it from us, we support it, and it's fully integrated into our edge platform. So we're super excited about this new offering.

Host: If you look at managed Kubernetes, there are so many solutions out there. What is new that ZEDEDA is bringing to the market?

Michael: Yeah, so for us, we're bringing managed Kubernetes in a very small footprint. Our customers run maybe one to three devices at the edge, and we're delivering a really lightweight, K3s-based Kubernetes distribution. It's the runtime; we still partner with great orchestrators like Rancher and Rafay and others around the building here, but that runtime is now supported directly by us.

Host: If I remember ZEDEDA's early days, we started talking about edge computing as cloud computing that sits closer to the users. So talk about how much evolution you have seen in edge computing.

Michael: Yeah, so the evolution of the edge market?
Michael: I think it's starting to accelerate. We've been talking about "the year of the edge" for about three years now, but it's starting to feel a little bit more like reality. What we're seeing pretty consistently is a couple of applications running on a single device, often something old running right next to a new Kubernetes cluster. So they'll have maybe a Windows app that controls a robot or connects to a car, or something to that effect, and then they'll stream data into a Kubernetes cluster and do AI or inference on the edge. So we're deploying these full solutions on a little piece of hardware from Intel or AMD.

Host: Let's also talk about the role of Kubernetes, because there are also low-footprint Kubernetes distributions; there are K3s, k0s, and a lot of others out there. So talk about the low-powered version of Kubernetes for these kinds of use cases.

Michael: Yeah, first of all, it's required, because it's hard to run the entire ecosystem on one box. We see a lot of K3s, which is a pretty well-known smaller distribution that is really designed to run on one to three individual devices. It has a smaller feature set; you're not doing elastic scaling when you have one device, so things like that are removed from the distribution. But it still brings the full API and the full experience, and people can deploy to it just like they can in the cloud or in their own data center.

Host: And how different is full-scale Kubernetes from this low-footprint version? Because the complexity we talked about cannot be carried into the edge use case.

Michael: Sure. The difference is in its size and CPU utilization; they really built the binary to run in smaller environments. But that means you're ripping out a lot of the projects.
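To illustrate the point that K3s still exposes the full Kubernetes API, here is a minimal sketch of a Deployment manifest that would work identically against a one-node K3s edge cluster or a full cloud cluster. The application name and image below are hypothetical placeholders, not anything from the interview.

```python
# Sketch: the same Deployment object works on a single-node K3s edge
# cluster as on a full cloud cluster -- the API surface is identical.
# The app name and container image are hypothetical placeholders.
import json

def make_deployment(name: str, image: str, replicas: int = 1) -> dict:
    """Build a minimal apps/v1 Deployment manifest as a plain dict."""
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,  # typically 1 on a single edge device
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

manifest = make_deployment("edge-inference", "example.com/inference:latest")
print(json.dumps(manifest, indent=2))
```

The same manifest could then be applied with any standard Kubernetes tooling, which is the "deploy to it just like the cloud" experience described above.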
Michael: I don't know how many Kubernetes projects there are; a couple hundred, there's an eye chart full of them. But what you end up running on the edge is the core services. Think about etcd, think about the container runtimes, those core components, designed to run as a really lightweight package.

Host: Right. The reason I was asking was to get to managed Kubernetes in this space. Once again, how different is it from managing full-scale Kubernetes, and what value are you bringing there?

Michael: Sure. The managed piece is that smaller runtime; we're not delivering full-scale Kubernetes on a single edge device. It's based on K3s, and we're partnering with SUSE on this. The benefit for us is that at the edge, it's really the cloud on your hardware; that's how we view it. We've always supported Terraform and cloud-native patterns, but we've had the customer bring their own runtime. With this new managed service offering, they can buy it directly from us. We support it, we're going to upgrade it, we're going to keep it current, and they can take their eye off the ball on that piece and concentrate on the applications up top.

Host: When we talk about Kubernetes, the discussions are that it's too complex and too complicated; it consumes not only a lot of developer time but a lot of company resources. There is discussion, on Twitter and elsewhere, about whether there is something after Kubernetes that is really simple. If you have noticed that discussion, what are you seeing there?

Michael: Yeah, great question. We're seeing shops that love Kubernetes and have Kubernetes skills embracing Kubernetes on the edge.
Michael: We're also seeing customers that aren't using Kubernetes try to do it, and to your point, it's very difficult. What we've often seen in that case is switching to a different runtime. We partner with the likes of Avassa, you can do native containers, there's an IBM offering; there are lots of solutions that aren't Kubernetes but that you can run containers against. And then, looking even further forward, I think Wasm is starting to be interesting. We're not hearing it directly from our customers yet, but we think that really lightweight execution model will be a big player in the edge in the long term.

Host: This may be totally off-topic or irrelevant, but if you look at the whole old Linux landscape, there were big players, and then Arm came along; its developers wanted changes, and even the Linux community was not happy at first, but now everybody is benefiting from it. Similarly, when we look at K3s and a lot of other offerings, do you think they might help in reducing some of the complexity of Kubernetes, or do you think it's a totally different world altogether?

Michael: No, I think it helps, and I think as we productize it more and build simplicity into it, that will help a lot. We're big fans of open source; our native EVE-OS is open source, so anybody can come contribute and build on it, and we've had customers help improve their own product. I think the same thing will happen in the K3s and Kubernetes space.

Host: I want to throw in this question; it depends on whether it's relevant to you folks or not. These days almost everybody talks about generative AI. So I want to ask: what does generative AI mean for ZEDEDA? It could be people running generative AI workloads, or generative AI could also help write the Kubernetes configuration.

Michael: I think what we're seeing on the edge is not building big large language models on the edge.
Michael: That's really done centrally, in a cloud or a data center. But what we do see a lot of is data aggregation, tagging, and cleaning. We talk to customers who say, "Hey, I want to do edge AI," and we ask, "Okay, how does your data look?" and they say, "What data?" So a lot of customers are just getting started with the practice of collecting, cleaning, and tagging all their data so they can build these large language models. The other two scenarios we see: there is a fair amount of inference at the edge, where they'll take that large language model, or any model really, and do things like video inference at the edge. And the last thing I would mention is machine learning; distributed machine learning is pretty common. You'll see data coming off a robot, or out of a factory or a PLC, into a machine learning model that then eventually pushes it up to a cloud for an aggregated data pool. So we're in the data collection space for large language models, and at the other end of the spectrum we're doing inferencing on the edge.

Host: One more thing that is different: in a full-scale data center you have all the resources there, but an edge device may be at the top of a tree, on top of a mountain, or on a remote vehicle. You cannot send your teams out for repairs all the time, so these devices have to do a lot of self-healing. What are the challenges associated with that? I'm not talking about edge in general, but in the context of Kubernetes.

Michael: Well, I think you hit one of them, which is that people are expensive.
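Stepping back to the distributed machine-learning flow Michael described, where data comes off a robot or PLC into a local model and aggregated batches are pushed up to a cloud, a minimal sketch might look like the following. The model and uploader here are stand-in stubs, not part of any real product.

```python
# Sketch of the edge data pipeline described above: samples come off a
# device, a local model scores and tags them, and batches are pushed
# upstream for aggregation. The model and uploader are hypothetical stubs.
from typing import Callable

def run_pipeline(samples, model: Callable[[float], float],
                 upload: Callable[[list], None], batch_size: int = 4) -> int:
    """Score samples locally, tag them, and ship them upstream in batches.
    Returns the number of batches uploaded."""
    batch, shipped = [], 0
    for s in samples:
        batch.append({"value": s, "score": model(s)})  # local inference + tagging
        if len(batch) >= batch_size:
            upload(batch)              # e.g. POST to a cloud aggregation endpoint
            batch, shipped = [], shipped + 1
    if batch:                          # flush the partial final batch
        upload(batch)
        shipped += 1
    return shipped

sent = []
n = run_pipeline(range(10), model=lambda x: x * 0.5, upload=sent.append)
print(n, [len(b) for b in sent])  # 3 batches: 4, 4, and 2 samples
```

The design mirrors the interview's point: inference happens on the device, while the cloud only receives the cleaned, tagged, aggregated batches.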
Michael: It's hard to find Kubernetes workers and then tell them they have to drive to remote Texas every day; that's very difficult. Not that Texas is bad, but the cost of people visiting these sites is really high. So to your point, you have to bake in failover and failback. You have to think about network failover. You have to think about how to make sure the operating system can be upgraded; you don't want to send someone out with a USB stick every six months, you want to do that whenever security patches come out. So you have to think full stack, more than just Kubernetes, and that's where security, access, and people come in and collide. That's the stuff we've been solving for the last five or six years in open source, so go take a look at EVE and you can see some of the techniques we use around that.

Host: What kind of use cases are you seeing? As you said, the market is still catching up, but there are a lot of folks who are using it already.

Michael: The use cases we see are pretty bespoke; everybody has their favorite data flavor and these sorts of things. But we see a lot of computer vision, be it for watching people or watching, say, gas flares on an oil rig. We see a fair amount of computer vision, and then the data case I mentioned earlier, collecting data and starting to build that AI pipeline, is a pretty common use case as well.

Host: Michael, thank you so much for taking time out today and, of course, for talking about ZEDEDA's new offering. I really love the way you explained the difference between the full-scale data center and the edge. The market is still catching up, but with 5G, 6G, and more on the way, a lot of things are going to change, which also means we should talk more. I look forward to our next discussion, but I really appreciate your time today.

Michael: Likewise, I really appreciate it as well. Thank you very much.
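The failover-and-failback pattern discussed in the interview, where a remote device must keep itself healthy rather than wait for a site visit, can be sketched in a few lines. The probe functions here are hypothetical stand-ins for real network or OS health checks, not anything from EVE or ZEDEDA's platform.

```python
# Sketch of a self-healing check: probe the primary path, fail over to a
# backup, and only flag for human intervention when every path is down.
# The probes are hypothetical stand-ins for real health checks.
def supervise(probes):
    """probes: list of (name, check_fn) tried in priority order.
    Returns the name of the first healthy path, or 'needs-intervention'
    if every path is down (i.e., a truck roll would be required)."""
    for name, check in probes:
        if check():
            return name
    return "needs-intervention"

# Simulate a primary-link outage: traffic fails over to the backup path.
result = supervise([("primary", lambda: False), ("backup", lambda: True)])
print(result)  # backup
```

In a real deployment this loop would run continuously on the device, which is what keeps the "drive to remote Texas" visits down to genuine hardware failures.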