Hey everyone, welcome back to theCUBE. We are live at VMware Explore 2023. This is our second day of coverage. Lisa Martin here with Dave Vellante. We're at the Venetian Expo in the hub at the event. We're very pleased. We're going to be digging into Edge in this conversation. We're pleased to welcome back one of our CUBE alumni, Sanjay Uppal, the SVP and GM, Service Provider and Edge Business Unit at VMware. Great to have you. Thanks for joining Dave and me today. Thank you very much. Nice to converse with you again. Absolutely. So you had a keynote yesterday. We had to leave to come here to start the show. For the audience, in case they were not able to catch it, what were the top three takeaways that you shared, presented, announced? Yep. So really, all this stuff about the Edge is because endpoints are producing and consuming data at rates that they were not doing before. So of course you have billions of endpoints and a lot of conversations and sessions and communication happening. Now, in order to make sense of all of this, you need a software-defined Edge, meaning you need digital infrastructure for running the applications that make sense of this information and then act upon it. And so what VMware was announcing is a critical component of our product line called the VMware Edge Cloud Orchestrator. And what the orchestrator does is three things, and that's the important part that we were announcing. The first thing is that it right-sizes the infrastructure. And I like to say, honey, I shrunk the stack. So that's really it: you're able to shrink the stack. Secondly, it does it with zero-touch orchestration, which is pull-based, since it's many dispersed locations. And the third thing is we make the Edge programmable, so that the application can make its needs known, and we understand what those needs are and we program the network. So that's essentially what I said. And I said, come by the floor.
Take a look at the manufacturing floor, the retail store, the cell site, the F-150 that we have there. All of those are examples of the Edge. And when you shrink the stack, what do I lose? Essentially what you lose is that at a single point you cannot run a very large number of workloads. But you don't need that at the Edge anyway. Sometimes, like in that F-150 that we have on the floor, you're essentially running two to three applications. You're not running a thousand applications. So you don't lose much that's specific to the Edge, but you can't take that stack and expect it to run in the data center. And if you had to start over with a blank sheet of paper, would you have designed something that's purpose-built for the Edge, and would that have been worth it relative to your shrink-the-stack approach? Yeah, so what happens at the Edge is that it's actually very vertically oriented. So today, what the manufacturing folks like an Audi would want at the Edge is different from what the retail folks would want at the Edge, which is different from what's in that F-150. So what we do first is we solve the problem individually for these customers. These are our lighthouse customers. They're the ones who have the vision to see how it affects their particular industry. We bring that back and then we articulate a horizontal infrastructure. That horizontal infrastructure for us today is the Edge compute stack, but it was built first by finding the lighthouse customers. Had we done it the other way around, had we come up with this horizontal infrastructure and tried to sell it to everyone, it would have been a case of "if we build it, they will come," and people would not have come. So we have this specific approach: find the lighthouse customers, build it for them, and then come back and build the horizontal infrastructure.
Can you share a little bit about some of those lighthouse customers and what you were able to learn from their use cases and their needs? Absolutely. So the first thing we did is we started off with telecom. Telecom has an Edge use case, which is really cell sites. That's the primary use case. So when you look into a cell site of a telecom operator, it used to look very much like it's all about the network. There are radios and there are antennas and all of this, and the software that was in there was locked together with the hardware. That's how cell sites used to be. But after we got in with this O-RAN architecture, which is Open Radio Access Network, we disaggregated the cell site. So now it looks like a mini data center. You go in, you find commercial off-the-shelf platforms, there's VMware software running on top, and yes, of course, you still need the antennas and the radios, but it looks very much like a computer in a room rather than a network. So that was the first case. We leaned in with Dish, we leaned in with Vodafone. Those were our lighthouse customers. But what we did for telecom, we're now saying we can do the same thing for manufacturing. We can do the same thing for retail. We can do the same thing for utilities. And that's really what we're going about: finding the lighthouse customer and then expanding from there. So you start there: telco, manufacturing, retail, probably healthcare is there as well. And then, I'm curious as to how you map from that very specific vertical use case into a horizontal infrastructure, because you're right, if you did it the Windows Phone way, that wouldn't have worked. So how does that map? Did you find situations where, uh-oh, we maybe have to change the stack a little bit so it can be horizontal, or did you just get lucky? Explain that to us. No, it's actually a combination.
You of course have to rely on some element of luck, but it's really a combination of finding those customers who, in my terminology, will hand us a flashlight. So if you find the right customer, you find an Audi, and you know that there's a visionary within Audi. Then when Audi selects the technology, and initially it's not a product, we just give them prototypes, right? We give them something that they can use in a POC on a factory floor where they're not producing a whole lot of cars. But when they use it and they give us that feedback, it's like handing us a flashlight. And that flashlight illuminates the path ahead for us so we know what to build next. Our responsibility is to iterate very fast on those flashlights and then to give technology back. Now, if I find a number of these lighthouse customers and I hand them flashlights, and the flashlights are all pointing in different directions, then we've got to start from step one. But if they're kind of pointing in the same direction, then we know we're on to something, and that informs the roadmap that we have going ahead. So yes, there's of course some element of luck, but it is also how the visionaries in the customer base talk to the technologists at VMware, and when that comes together, that's when the magic is created. And you discovered that there's enough overlap, take those three verticals, telco, manufacturing and retail, enough overlap that you discovered the ability to build that horizontal stack, and it evolved from there. And that is exactly how the VMware Edge Cloud Orchestrator came to be. Because the first thing we did is that we solved the problem for SD-WAN, then we expanded it for SASE. Then independently, we went and solved the problem for Dish in their cell sites.
And then we took a step back and looked at it and said, wait a minute, there is some commonality here, particularly when somebody like an Audi came in, and there was a project in our CTO office called Project Keswick. They were also doing something very similar. So when you take the step back and you say, here are three seemingly disparate things, but you put them together because there's the commonality of what the requirements are, then you can articulate, oh, by the way, this is actually a product. And it's the VMware Edge Cloud Orchestrator. It's not like somebody went out and wrote 10 pages of requirements and came out with it. No, this is exactly how it happened, through this amorphous process of discovery. And Audi is a manufacturing example, not an automotive example, correct? Audi is a manufacturing example. What they want to do on the manufacturing floor is to have a smart manufacturing floor, putting in disaggregation and virtualization so that they can not just reduce their costs, but produce cars faster and have features on the factory floor that then will reflect in features in the car. What about that automotive example on the consumer side, essentially? Is that just an outlier where that horizontal stack doesn't fit, or can you play there as well? No, you can play there as well. Now, of course, VMware, and, of course, Broadcom as well, we don't have a channel to consumers. So we rely on customers to reflect that. Like even with our telecom operators: Dish, of course, sells the service to consumers. AT&T sells the service to consumers. VMware doesn't, but then the technology that we provide to them helps consumers in the end. Even with hospitals, like I have one really key customer in Texas, MD Anderson Cancer Center, they solve problems for cancer patients. We don't call on cancer patients, but they do. And so there's this indirect kind of model.
Well, when we think of the edge, you've talked about some great use cases and great verticals, and there's so much more that's going on there. And of course it continues to grow amorphously. Where's the security angle? Because obviously that's critical for organizations as the threat landscape is changing so much. How is VMware enabling customers to orchestrate at the edge in a secure fashion? Yeah, I think security is absolutely critical, because really what distinguishes the edge from what's happening in the data center is the dispersed number of locations that are usually spread across a very wide geography under different administrative domains. So you could have an edge that is sitting in a retail store, in an agricultural field, on an oil rig. These are all edges. And they could be in really unsafe locations. People could break into that edge and get access to it. There's a gas station in Northern Africa where we have to really protect the edge because someone could break into it. So the bedrock of security here is zero trust. It's a zero trust architecture. We cannot have any binding that says we trust this particular edge. So it has to be based on zero trust, which is really what it's all based on. I think the next piece of this is what happens when AI and ML come in, right? Because this whole generative AI thing is critically important. It's got a trillion parameters and the like, these large language models. But what is happening at the edge is also very interesting, because these sensors and all the information that they produce, sense can be made of them with a machine learning model as small as 256 kilobytes of RAM.
And that also has a security angle to it, because what you need to do is make sure that the training and the model are kept within that zero trust architecture and not have people break in, because there's all kinds of information you need for that training and then for the inferencing as well. So to me, security and what's happening at the Edge with AI go hand in hand. And these customers want to work with you because they want to map back to their sort of data center operating model, or is it more, hey, VMware's innovating at the edge and they're helping us solve problems? Is it a combination of the two? Really, when you look at the data centers and what VMware has traditionally addressed, it's the IT part of the house, right? Most of the people at our event are IT practitioners. They're the CIOs. What's happening with the Edge? Yes, there is an IT angle to it, but it's really about OT. It's about the operational technology. It's about the lines of business. It's about the head of manufacturing at Audi, or the people at the PLC manufacturers, Siemens, Rockwell and all these folks. It's the network part of what's happening in telecom. So it's different from what's happening in data centers. Really, a lot of these Edge applications should run locally at the Edge, because there's no time for that data to go all the way back to the data center and come back. So there are a couple of angles that are different between data centers and the Edge. One is clearly the IT-OT division, and the second is that they need to run applications not in a small number of large locations, but in a very large number of small locations. So when we think of the VMware Edge Cloud Orchestrator, you mentioned three pillars, and the third was programmability. Yes. What is the persona of that developer at the Edge? Is it developers that we know and love, or that we see running around here or at the cloud shows? Is it an emerging developer?
Is it engineers who are becoming developers? Yeah. So what we need to do is make things simple for the developer. If we expect the developer to know the intricacies of every network that they have to run on, it's not going to happen. So our responsibility is to come up with an intelligent overlay that sits on top of the network but can understand the application. So as far as the developer is concerned, they say, I want this type of connection because I have important information to send, as an example, from my ventilator machine. So if the sprinkler system doesn't come on in the hospital, it's fine. But if the traffic from the ventilator doesn't make it through, not fine. So when they write those applications, they should say, this is important traffic. And then our responsibility is to understand that, independent of the network that is running underneath, and then to look at runtime and say, oh, by the way, right now I have this 5G network, I have an MPLS or I have a satellite network. And let me make the best use of all of them to send this ventilator traffic with the best possible class of service. So we don't go back to the developer and expect them to know the intricacies of every network. We just expect them to tell us whether this application is important or not. How much of a topic is ESG at the edge, in your customer base? Particularly low power; obviously low cost and high performance, everybody wants that. But how about the whole ESG piece of it? It's absolutely important. 70% of the power consumption in a telecom operator is in the radio access network. You add all of those cell sites together and all the spectrum that they have to run with the power that they need, and that's 70%. So any difference that we can make in terms of reducing that consumption helps. Similar with the factory floor.
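The ventilator example above, where the developer only declares that traffic is important and the overlay picks among 5G, MPLS, or satellite underlays at runtime, can be sketched roughly like this. This is a minimal illustration with hypothetical names and a made-up selection policy, not VMware's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Link:
    name: str
    latency_ms: float  # measured at runtime by the overlay
    loss_pct: float    # measured packet loss
    up: bool

def pick_link(links: list[Link], critical: bool) -> Link:
    """Choose an underlay for a flow based only on declared importance.

    Critical traffic (the ventilator feed) gets the lowest-loss,
    lowest-latency path; best-effort traffic (the sprinkler telemetry)
    is steered onto the worst usable link, leaving good paths free.
    Hypothetical policy, for illustration only.
    """
    candidates = [l for l in links if l.up]
    if not candidates:
        raise RuntimeError("no underlay available")
    if critical:
        return min(candidates, key=lambda l: (l.loss_pct, l.latency_ms))
    return max(candidates, key=lambda l: l.latency_ms)

links = [
    Link("5G", 30, 0.5, True),
    Link("MPLS", 12, 0.01, True),
    Link("satellite", 600, 1.0, True),
]
print(pick_link(links, critical=True).name)   # MPLS
print(pick_link(links, critical=False).name)  # satellite
```

The point of the sketch is that the developer's API surface is just the `critical` flag; the network-specific measurement and selection live entirely in the overlay.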
When I was in Europe last winter, of course they were going through, because of the war, all these issues of manufacturing floors actually coming to a standstill because there wasn't enough power to run them. So when we come in with disaggregation and virtualization, one of the benefits is energy reduction. And in fact, in the RAN itself, in cell sites, we have this technology, the RAN Intelligent Controller. One of its responsibilities is to make sure that the energy consumption is optimal based on the current traffic. So you may not have too many people going through that cell site at a particular time. So why do you need all of the spectrum? Why do you need all of the antennas? Just shut them down at runtime so that you have the best energy consumption. And of course that helps with the sustainability angle as well. So Telco is interesting, because you've got a very entrenched, you described it very well before, the hardware and the software, all that functionality is fossilized in a system, and the network needs to disaggregate. It is disaggregating, but Telco moves slowly. This is interesting because it has to leapfrog from a strategy standpoint and tap developers and new applications and tap that energy. But at the same time, Telcos are really good at connectivity and reliability of a network, so they're slow to move because they're afraid of things like Open RAN and intelligent RICs. And so how is that playing out? Obviously you've got to be patient in order to get the return, but the market is enormous. So I wonder if we could double-click on Telco a little bit. Oh yeah, absolutely. I mean, we're playing the long game. As you rightly pointed out, Telcos don't make snap decisions. On the other hand, the spend in the RAN is $40-plus billion every year.
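As a toy illustration of the RIC energy idea above, shutting down spectrum and antennas at runtime when traffic is light, one could imagine a policy along these lines. The numbers and function names are purely hypothetical; a real RAN Intelligent Controller works from far richer telemetry:

```python
def carriers_needed(active_users: int,
                    users_per_carrier: int = 100,
                    min_carriers: int = 1,
                    max_carriers: int = 8) -> int:
    """Keep only enough carriers powered to serve current traffic.

    A safety floor (min_carriers) keeps the cell reachable even when
    the site is idle; max_carriers is the installed capacity.
    Hypothetical policy and numbers, for illustration only.
    """
    needed = -(-active_users // users_per_carrier)  # ceiling division
    return max(min_carriers, min(needed, max_carriers))

# At 3 a.m. with 40 users, one carrier suffices; at rush hour, scale up.
print(carriers_needed(40))    # 1
print(carriers_needed(650))   # 7
```

Everything above `min_carriers` can be powered off at runtime and woken again as load returns, which is where the bulk of the 70% radio-network power figure can be attacked.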
So any improvement that you can make helps, both in terms of how Telcos are modernizing that infrastructure, therefore reducing the costs, but also, very importantly, in monetizing the services that run on top. So our strategy is very straightforward for the Telco. You modernize the infrastructure, but then you have to monetize the services. Without the monetization of services, as we've seen in Europe, they have this entire fair-share argument going on. And you see here in the United States, the telecom operators are not exactly doing as well as what you would find on the compute side of the house. So with these challenges, the telecom operators have to play this long game. And Dish, of course, came in with the greenfield approach, but to their credit, they came in and rolled out a nationwide O-RAN-based network in the shortest possible period of time using compute-based technologies. Not looking at it by saying, oh, I'm going to engineer everything to the last bit, and then I'm going to get it done. No, you come in with compute-based technologies, and then you see how cloud-based, Edge-cloud-based technologies can come in and make an entire network run in a modern way, and then you monetize the services on top. So it's a shining example which other folks are following as well now. It's fascinating. We were at MWC this year. We'll be there again next year. And the fair-share thing, I don't think Netflix is really going to step up and offer their help. You're right. So that was an interesting argument at last year's show. Yes, it was. I want to talk a little bit about VMware's differentiation, the why VMware. You talked about sort of that common denominator from a horizontal perspective that allowed you to develop the Edge Cloud Orchestrator that will be an advantage across verticals. Talk a little bit about, is there a common denominator, a why-VMware pitch at the Edge that really resonates across telco, manufacturing, retail, healthcare?
Yes, absolutely, great question. So just like VMware did the software-defined data center, starting with server virtualization and then moving out to the network and storage, for the software-defined Edge there are really three layers. For manufacturing, for retail, for healthcare, we're talking about supporting the workloads that run at that top layer. It's the application that runs the sprinkler system. It's the application that runs on the manufacturing floor. That's one layer. Then the layer underneath that is the network and security services. This is where the software-defined WAN runs. This is where you get the SSE, the security service edge, that's running. So these are the set of services that run at the next layer, which make sense of the workload so that you can then program the network. Now, VMware also sells into the network. This RAN Intelligent Controller and all of this, we sell into the network. So the one place that you can go to with one digital infrastructure, with one type of software-defined Edge that understands the workload, programs the underlay, and actually runs in the underlay is VMware. There isn't another place today that you can go to that does all three layers of the software-defined Edge. And there's a massive advantage to doing that, because if you come in, you get the best set of ecosystem partners. We are not saying that VMware is going to develop all those workloads and applications. No, we are saying that we are the ones that develop the digital infrastructure so that everyone can bring their ecosystem and their applications to run on top of us. And that has a benefit to the telecom operator, but the telecom operator serves the needs of all these businesses and consumers. So we tie this thing together from a GTM standpoint, but we also tie it together through the three layers of the software-defined Edge stack. And can you talk a little bit about AI? We touched on it, but what does the AI play at the edge look like?
I mean, obviously there's a lot of AI conversation going on here. ARM is ubiquitous, low-cost, low-power, massive, but in very small quantities in each place. Automotive, and I don't mean manufacturing, I mean the consumer side, the driving, autonomous vehicles. Where does AI fit? Have you been able to piece together that puzzle? Absolutely. I was showing on stage yesterday a little device. It's actually a collection of sensors. And that collection of sensors, if you stick it next to a machine, just through noise, vibration, and temperature, you can intelligently predict what that machine is going to do, in terms of when it's going to fail and when it's not. That same set of sensors you can wear in a wearable on your hand. And instead of telling it that you're bicycling or you're running or you're doing weight training, it will tell you, oh, by the way, it looks like you're doing bicep curls right now. And all of this is done with really small machine learning models that you can program into those tiny little devices. And while you pointed to the phone and said this is a small device, not a data center, even the phone is an order of magnitude more than what you would find in those little sensors. So what's happening with AI at the edge is the opposite of what's happening with these large language models. The AI at the edge is all about inferencing, and inferencing with a small footprint but large amounts of data that you've got to make sense of. And you've got to make sense of it very, very quickly. So you don't have the ability to go query somebody with natural language processing at the back end. You just have to understand that, oh, this is vibration from this machine, and I better act on it now. Otherwise, in the next maybe 10 minutes the machine might fail, and that might cause a lot of cost and problems for the manufacturer. So it's all of these data points, and the data points abound.
There are just so many use cases that we are addressing right now, because we can make use of these small machine learning models that you can put down. Like, you know, TensorFlow has a TensorFlow Lite variant just for microcontrollers. So you use the same software development package, but you know now that this is a small little microcontroller that you've got to program. And then you program that down, you distribute it all the way around, and then you have our software-defined edge to help you make sense of all of this. So I think edge AI is absolutely fascinating. It's been happening for a while, and of course we will link to the back-end LLMs and make use of generative AI as well, but just the machine learning models that you can run to make use of sensor data, that's a fascinating place to be. You know, years ago, it must have been three years ago now, we wrote a piece where we said most of the AI being done today is modeling in the cloud, and in the future it's going to be inferencing at the edge. I'm looking at the graph that we drew; we predicted it out into the decade. It's probably going to happen a lot faster than that. It's AR, VR, it's autonomous vehicles, it's power grids, it's new payment systems, manufacturing, machine diagnosis, smart factories, on and on and on. The amount of data, we think, is going to be massive and just kind of overwhelm anything we've ever seen. Yes, and then of course one has to make sense of the data. And by the way, you are absolutely right. Just in 2022, only 5% of edge compute had a machine learning model attached to it. By 2026, it's going to be 50%. That's a 10x improvement, because it's just become very easy for developers now to lay the machine learning model down on these small microcontrollers and then to make sense of it.
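To make the sensor-inferencing idea above concrete, here is a deliberately tiny stand-in for the kilobyte-scale models Sanjay describes: a rolling-statistics vibration detector rather than an actual TensorFlow Lite for Microcontrollers model. The class name and thresholds are hypothetical, chosen only for illustration:

```python
from collections import deque
from statistics import mean, stdev

class VibrationMonitor:
    """Flag readings far outside recent history, entirely on-device.

    Stands in for the kind of tiny local inferencing described in the
    interview: no back-end query, act on the data point immediately.
    """
    def __init__(self, window: int = 50, z_threshold: float = 3.0):
        self.window = deque(maxlen=window)  # bounded memory, edge-friendly
        self.z = z_threshold

    def observe(self, reading: float) -> bool:
        """Return True if the reading looks anomalous vs. recent history."""
        if len(self.window) >= 10:  # need some history before judging
            mu, sigma = mean(self.window), stdev(self.window)
            anomalous = sigma > 0 and abs(reading - mu) > self.z * sigma
        else:
            anomalous = False
        self.window.append(reading)
        return anomalous

m = VibrationMonitor()
for r in [1.0, 1.1, 0.9, 1.05, 0.95] * 4:  # 20 normal vibration readings
    m.observe(r)
print(m.observe(5.0))  # a sudden spike is flagged: True
```

The whole state here is one small ring buffer, which is the spirit of the 256-kilobyte figure: the decision happens next to the machine, and only the alert needs to travel anywhere.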
And the improvement that you get in terms of understanding how the sensors are producing information and what to do with it is just staggering. There's just so much benefit from it that it makes complete sense to get it done. And from VMware's standpoint, you don't care what the underlying silicon infrastructure is; you'll work on whatever it is. It's going to be ARM in our view, and we can run on anything. My question is, it seems as though historically in the industry the consumer applications, or the volume applications, in this case maybe not consumer, eventually find their way back to the data center. And we wonder out loud, will there be a new economic model that emerges as a result of the edge, all the power that's in these small devices, that will drive new economics that eventually finds its way back to the data center? We'll see. Yeah, I think that's a very interesting question, and you're right, we will see. People talk about the pendulum swinging between what's in the data center and what's on the terminals and things like that. But going forward, there will always be the need for a set of applications to make sense of what's happening at the edge. But equally, not everything can happen only at the edge. You absolutely need what's happening in the data center as well. I mean, even if you take the case of manufacturing, eventually that information has to feed back into what kind of cars you're making and how fast you're making them. And that's running in very large applications that are in the data center. So yeah, there's a good marriage to be had. That's a virtuous cycle. Yes. Last question for you, Sanjay. This is day two of the event. After your keynote yesterday, I'm sure you had lots of time, or no time, to talk to customers and partners and prospects. What's been some of the feedback in terms of VMware's story and capabilities at the edge?
Yeah, so yes, I have been quite busy, but people have been coming to the show floor and checking out the future of retail, checking out the future of manufacturing, checking out that F-150 that we have over there. And I think the feedback has been, some of it is just that it's opened people's eyes, because a lot of the people here are more from the IT part of the house. But this IT-OT divide, what used to be a divide, is going away, for a number of reasons, whether it's in telecom or manufacturing or retail. And so I think these practitioners that are coming in are learning about this new way, which is the software-defined edge. And it's very exciting for them, because this is new technology that can help them. It's not just about cost efficiency; it's about a lot more than that. And that excites them. And it excites us, because then we think we can add some value. Absolutely, it's the software-defined edge. So much going on there, Sanjay. We so appreciate you coming back on theCUBE and really dissecting the value in what VMware is delivering to edge customers across many verticals. We'll have to have you back on, because I think we're just scratching the surface here. Yes, I absolutely agree. Thank you so much for the opportunity. Our pleasure. For Sanjay Uppal and Dave Vellante, I'm Lisa Martin. You're watching theCUBE. You can catch all of our content from VMware Explore day one and what we've done so far on day two on theCUBE.net, and all of our editorial on siliconangle.com. We'll be back on both sets after a short break. See you soon.