Hello everyone, welcome back to theCUBE's main stage coverage of DockerCon 2022. We've got a great guest from Intel here: Ajay Bungara, Senior Director of Edge Software and AI at Intel, talking about cloud native and AI workloads at the Edge and building a better developer ecosystem for the Edge. Cloud native, compute moving to the data, data as code: these are things we've been talking about. So Ajay, welcome to theCUBE.

Thank you, John. I'm really happy to be here at DockerCon, and everything we do, Docker makes it better.

Well, you've done a lot in your career, and looking at your background, the Edge was manufacturing, the old-school IoT stuff. Now that's converged completely with cloud native IP technologies, and everything's happening at the Edge. This is where the problems are now shifting and being solved, because the goodness of the cloud, and what cloud operations, essentially distributed computing, has done, is making the Edge the battleground for where the innovation's happening. Could you share with us your view of why the Edge is so important, and why it's different from what we've been seeing in pure cloud and on-premise data centers?

Yeah, you know, 75% of the data getting generated of late is happening at the Edge. So there's a lot of value being generated at the Edge, because we want to move most of the compute closest to the data. Latency issues, bandwidth issues, security issues: all of those things are getting people to move compute, storage, and data toward the Edge. There's also one big shift from a developer point of view, where 51% of all the developers in the world have deployed cloud native, Docker-based solutions somewhere or other. What we are seeing is the combination of cloud computing, networking, and Edge computing all coming together.
And that is where it's pushing the envelope from the Edge perspective. One of the big drivers is AI at the Edge as well, right? The Edge inference workloads, with the camera as one of the sensors, are really driving that compute. And to your question about what's so different about it: the challenges at the Edge are compounded because it brings together the operational technology, the information technology processes, and cloud computing environments, along with networking, all together. So when a developer wants to build a solution for the Edge, they have to figure out what part of that workload sits in the cloud, how they're going to move that workload toward the Edge using some form of networking, and how they're going to protect the data in transit as well as at rest, because Edge devices can get stolen. So there are all these challenges about how you figure out the trade-offs between price, performance, functionality, power, heat, size, and weight. Everything matters when you talk about the Edge. So anyway, that is where we see those differences.

It's interesting. You go back in history in distributed computing and the movie's still the same. Remember back in the day when I was breaking into the business, memory was the bottleneck and storage was the scarce resource. You had to swap memory out, and as a developer you had to deal with that. Then memory became abundant and storage was the problem. Now you've got networking as the latency problem. So again, these are challenges that developers have to weave through. I was going to ask why the Edge is important for them, and what's in it for the developer. I think what you were saying is that there are design decisions going on around how to code. Can you elaborate on what's in it for the developer? Why should they care about the Edge?
Developers really have to care about the Edge because when you're building a solution, you cannot move all the data and make all the decisions in the cloud; it's too late, right? Your latency, your bandwidth costs, your solution costs are going to increase. And because of security and privacy concerns, sometimes you have to make those decisions at the Edge itself. You have to figure out how to take data to the cloud only strategically, where it makes sense, okay? And that is the reason developers have no choice: they have to focus on the combination of cloud, networking, and Edge. And that's where we are seeing a large-scale set of deployments happening today.

Yeah, and I can see the business value too, which is one of the big themes of DockerCon this year; there are tracks on that, people talking about that. You're seeing trends like headless retail, which is basically not a Shopify managed service; it's more that you build your own stack and you put the head on it, which is the application and business model. So again, that's an example. There's also manufacturing, there's automotive, all kinds of use cases where there are money-making opportunities, right? So there's business value there. The developer is going to be pulled to the Edge because they're on the front lines now. So this is about making the Edge ready, and I want to hear your thoughts on what Intel's doing to make that developer environment ready for the Edge, because we know developers are on the front lines today, and that front-line vanguard will be the Edge. What does it look like?

Exactly right. So what we have done is we have created this environment for developers, which we call Intel DevCloud. This DevCloud is a Kubernetes-based environment where we support all of the Docker workloads, and it's based on Red Hat OpenShift. And we thought about this a little differently.
What we did is build a cloud environment where you can use a browser to do all of your development, build, test, and all of that. But we also took a whole range of these Edge devices and made them available in the cloud. So as a developer, you don't have to have an Edge device sitting at your desk; you have an Edge device, or a plethora of Edge devices, sitting in the cloud. So you have one environment where you have cloud, you have network, and you have all these Edge nodes. You can start building your cloud native or Edge native solutions, test them, benchmark them, and figure out what combination you actually need for your final solution, as you said, in retail, in smart cities, in healthcare, any of these vertical markets, and get your solution closer to being deployment ready.

Yeah, and I love your description, by the way. It's called a container playground. That just comes across as fun. And I think with this idea of having these nodes available, you guys bring a lot of expertise to the table. It's almost like your localhost for Edge devices, right? You can work with it in a safe environment. Am I getting that right?

You're getting that right. And in fact, during the pandemic, when we were all working remote, nobody had access to the labs where you have all these Edge devices available to you, where you could play with all these network simulators, everything. With all these developers spread all over the world, you don't have access to as many of those Edge devices. So now, with a browser, with this container playground, you can develop any of your Docker Compose, Docker-based container workloads and try them on all of these Edge devices, which from Intel's point of view may range across CPUs, VPUs, GPUs, anything, right?
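The workflow Ajay describes, packaging a Docker-based workload and pointing it at a chosen Edge target, can be sketched as a minimal Compose file. This is an illustrative assumption, not an Intel DevCloud specific: the service name, model path, and volume layout are placeholders, and the image shown is Intel's public OpenVINO runtime image on Docker Hub.

```
# Hypothetical sketch; paths and service name are placeholders.
services:
  edge-inference:
    image: openvino/ubuntu20_runtime:latest   # Intel's public OpenVINO runtime image
    volumes:
      - ./models:/opt/models                  # model exported from the cloud (e.g. a converted TensorFlow model)
    devices:
      - /dev/dri:/dev/dri                     # optional: pass through an integrated GPU
    # benchmark_app is OpenVINO's stock benchmarking tool; -m names the model,
    # -d selects the target device (CPU, GPU, MYRIAD for a VPU, ...)
    command: benchmark_app -m /opt/models/model.xml -d CPU
```

Switching `-d CPU` to `-d GPU` or `-d MYRIAD` retargets the same container at a different class of Edge silicon, which is the kind of price/performance comparison a playground of cloud-hosted Edge nodes is meant to make easy.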
We know there's a lot of compute at the Edge, which always helps Intel, but your North Star is about making it easier for developers as you invest in cloud, network, and Edge in the cloud native world. That's the goal. How do you do that, and what should developers optimize for? It sounds like they're going to learn that with this playground you have, the DevCloud. What are you seeing that they'll learn to optimize for? I used the old-school example of memory optimization, swapping memory out and that kind of thing, but what are the new issues that need to be optimized for if you're a developer?

If you're a developer, you've got to optimize for your Edge AI workloads, right? That means for AI inference workloads, you have to look at how you can take a model that was developed in some type of cloud environment, like a TensorFlow model or a PyTorch model, bring it onto the Edge, and then run inference workloads. You need to understand, to do this inference, what type of compute you need, what type of storage you need, what type of memory you need. And we give you the options to optimize those types of AI inference workloads; you can actually do that. Then you can also decide what decisions you want to make at the Edge and what decisions you want to make in the cloud. We give you the options and flexibility to build those solutions. One last point I'll make: there are a lot of legacy applications that have been developed as traditional embedded applications. We also want to teach developers how to take those applications and containerize them, how to take advantage of the cloud native DevOps type of paradigms, which make your life easier when it comes to scaling your solution and deploying your solution worldwide.

All right, Ajay, thanks so much for coming on theCUBE. DevCloud, a container playground. Now back to you at the main stage at DockerCon.