From around the globe, it's theCUBE, with digital coverage of VMworld 2020, brought to you by VMware and its ecosystem partners.

Hello, and welcome back to theCUBE's virtual coverage of VMworld 2020. I'm John Furrier, host of theCUBE. VMworld's not in person this year, it's virtual, on the internet. A lot of content, check it out at vmworld.com: a lot of great stuff, online demos, and a lot of great keynotes. Here we've got a great conversation to unpack the NVIDIA AI news and all things cloud native, with Krish Prasad, who's the SVP and GM of the Cloud Platform Business Unit at VMware, and Manuvir Das, head of Enterprise Computing at NVIDIA. Gentlemen, great to see you virtually. Thanks for joining me on the virtual theCUBE for the virtual VMworld 2020.

Thank you, John. Pleasure to be here.

Quite a world. And I think one of the things we've been talking about all year since COVID is the acceleration of this virtualized environment, with everyone working remotely from home, which really puts the pressure on digital transformation. It's been well discussed and documented. You guys have some big news, obviously, on the main stage with NVIDIA CEO Jensen Huang, and of course big momentum with AI and GPUs and all things computing. Krish, what are your announcements today? You've got some big news. Could you take a minute to explain the big announcements?

Yeah, John. So today we want to make two major announcements regarding our partnership with NVIDIA. Let's take the first one and talk through it, and then we can get to the second announcement later. In the first one, as you well know, NVIDIA is the leader in AI and VMware is the leader in virtualization and cloud. This announcement is about us teaming up to deliver a jointly engineered solution to the market to bring AI to every enterprise.
As you well know, VMware has more than 300,000 customers worldwide, and we believe this solution will enable our customers to transform their data centers, with AI applications running on top of the virtualized VMware infrastructure they already have. And we think this is going to vastly accelerate the adoption of AI and essentially democratize AI in the enterprise.

Why AI, why now, Manuvir? Obviously we know GPUs have set the table for many cool things, from mining Bitcoin to providing a great user experience, but AI has been a big driver. Why now, why VMware now?

Yeah, and I think it's important to understand this is about AI even more than it is about GPUs. This is a great moment in time where AI has finally come to life, because the hardware and software have come together to make it possible. Just look at different industries and different parts of life and how AI is impacting them. For example, if you're a company doing business on the internet, everything you do revolves around making recommendations to your customers about what they should do next. That is based on AI. Think about the world we live in today, with the importance of healthcare, drug discovery, finding vaccines for something like COVID: that work is dramatically accelerated if you use AI. And what we've been doing at NVIDIA over the years is, we started with the hardware technology, with the GPU, the parallel processor, if you will, that could really make these algorithms real. And then we've worked very hard on building up the ecosystem. We have two million developers today who work with NVIDIA AI. There are thousands of companies using AI today. But then if you think about what Krish said about the number of customers VMware has, which is in the hundreds of thousands, the opportunity before us now is: how do we democratize this? How do we take this power of AI, which makes every company and every person better, and put it in the hands of every enterprise customer?
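As an aside, the recommendation systems Manuvir mentions can be sketched in a few lines. This is purely an illustrative toy (not VMware or NVIDIA code, and the data is made up): a user-based collaborative filter that suggests items a user hasn't seen, weighted by similarity to other users.

```python
# Illustrative toy recommender, pure Python; all data hypothetical.
from math import sqrt

# Hypothetical user -> {item: rating} data.
ratings = {
    "alice": {"gpu": 5, "nic": 3, "ssd": 4},
    "bob":   {"gpu": 5, "nic": 4, "ssd": 4, "cpu": 2},
    "carol": {"nic": 1, "ssd": 2, "cpu": 5},
}

def cosine(u, v):
    """Cosine similarity over the items two users both rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    nu = sqrt(sum(u[i] ** 2 for i in common))
    nv = sqrt(sum(v[i] ** 2 for i in common))
    return dot / (nu * nv)

def recommend(user):
    """Score unseen items for `user`, weighted by user similarity."""
    scores = {}
    for other, theirs in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], theirs)
        for item, r in theirs.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice"))  # "cpu" is the only item alice hasn't rated
```

Production recommenders use learned embeddings rather than raw cosine scores, and that training step is exactly the kind of workload GPUs accelerate.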
And we need a great vehicle for that, and that vehicle is VMware, right?

Guys, before we get to the next question, I just want to get your personal take on this, because again, we've talked many times; both of you have been on theCUBE on this topic. But now I want to highlight: you mentioned the GPUs, that's hardware. This is software. VMware has had hardware partners, and still software is driving it. Software is driving everything, whether it's something in space, an IoT device, or anything at the edge of the network. Software is the value. This has become so obvious. Just share your personal take on this for folks who are now seeing it for the first time.

Yeah, I'll give you my take first. I'm a software guy by background. I learned a few years ago, for the first time, that an array is a storage device and not just a data structure in programming, and that was a shock to my system. Definitely the world is based on algorithms. Algorithms are implemented in software. Great hardware enables those algorithms.

Krish, your thoughts? We're living the future right now.

Yeah, I would say, look, the developers are becoming the center. They are actually driving the transformation in this industry, right? It's all about application development. It's all about software. The infrastructure itself is becoming software-defined, and the reason for that is you want the developers to be able to craft the infrastructure the way they need it for the applications to run on top of it. So it's all about software, like I said.

Software-defined. Yeah, I just want to get that quick self-congratulatory high-five amongst ourselves, virtually. Congratulations. Exactly.

Krish, last time we spoke at VMworld, we were obviously in person, and we talked about Tanzu and vSphere. You had Project Pacific. Does this announcement expand on that offering?

Absolutely.
You know, John, over the past several years, VMware has been on this journey to define the hybrid cloud infrastructure, right? It is essentially the software stack we have that enables our customers to provide a cloud operating model to their developers, irrespective of where they want to land their workloads: whether they land them on-premises, or on top of AWS, Google, or Azure, the VMware stack is already running across all of it, as you well know. And in addition to that, we have around 4,000 to 5,000 service providers who are also running our platform to deliver cloud services to their customers.

As part of that journey, last year we took the platform and added one further element to it. Traditionally, our platform has been used by customers for running VMs. Last year, we natively integrated Kubernetes into our platform. This was the big re-architecture of vSphere, as we talked about. That was delivered to the market, and now customers can use the same platform to run Kubernetes containers and VM workloads. It's the exact same platform, and it is operationally the same, so the same skill sets, tools, and processes can be used to run Kubernetes as well as VM applications, and the same platform runs whether you want it on-premises or in any of the cloud services we talked about before. That vastly simplifies the operational complexity our customers have to deal with, and this is the next chapter in that journey: doing the same thing for AI workloads.

You guys have had great success with these co-engineered joint efforts, VMware and now NVIDIA. It's going to be very relevant and it's very cool, so it's cool and relevant, check, check. Manuvir, can you talk about this? How do you bring that vision to enterprises?

Yeah, John, I think it's important to understand there is some real, deep computer science here between the engineers at VMware and NVIDIA.
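The "one platform, same tools for VMs and containers" idea Krish describes can be pictured in code. This is a hypothetical toy model, not the vSphere API (all class and method names here are invented for illustration): both workload kinds live in one inventory and answer to the same lifecycle operations.

```python
# Toy model of a unified VM/container platform; names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Workload:
    name: str
    kind: str            # "vm" or "container"
    state: str = "stopped"

    def start(self):     # same lifecycle verbs regardless of kind
        self.state = "running"

    def stop(self):
        self.state = "stopped"

@dataclass
class Platform:
    """One inventory and one set of operations for both workload kinds."""
    inventory: list = field(default_factory=list)

    def deploy(self, name, kind):
        w = Workload(name, kind)
        self.inventory.append(w)
        w.start()
        return w

    def running(self):
        return [w.name for w in self.inventory if w.state == "running"]

p = Platform()
p.deploy("legacy-db", "vm")            # classic VM workload
p.deploy("ai-inference", "container")  # Kubernetes-style workload
print(p.running())  # one operational view covers both
```

The operational point is that an admin's tooling never branches on workload kind, which is the simplification being claimed for the vSphere re-architecture.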
Just to lay that out, you can think of this as a three-layer stack, right? The first thing you need is clearly the hardware that is capable of running these algorithms; that's what the GPU enables. Then you need a great software stack for AI, all the right algorithms that take advantage of that hardware. This is actually where NVIDIA spends most of its effort today. People may sometimes think of NVIDIA as a GPU company, but we are much more a software company now, where over the years we have created a body of work of all the software it actually takes to do good AI. But then, how do you marry this software stack with the hardware? You need a platform in the middle that supports the applications, consumes the hardware, and exposes it properly. And that's where vSphere, as Krish described, with either VMs or containers, comes into the picture. So the computer science here is to wire all these things up together, with the right algorithmics, so that you get real acceleration. As examples of early work the two teams have done together, we have workloads in healthcare, for example in cancer detection, where the acceleration we get with this new stack is 30X: the workload is running 30 times faster than it was running before this integration, on CPUs alone.

Great performance increase. And again, you guys are hiring a lot of software developers; I can attest to knowing folks in Silicon Valley and around the world. So I know you guys are bringing the software chops to the table on a great product, by the way. So congratulations.

Krish, democratization of AI for the enterprise. This is a liberating opportunity, because one of the things we've heard from your customers, and from VMware's customer successes, is that there are two types of extremes. There's the "I'm going to modernize my business." Certainly COVID is forcing companies, whether they're airlines or whatever, with not a lot going on.
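A 30X end-to-end number is worth a quick sanity check with Amdahl's law: the overall speedup is limited by whatever fraction of the pipeline stays on the unaccelerated path. The numbers below are hypothetical, chosen only to show the shape of the argument, not taken from the cancer-detection workload itself.

```python
# Back-of-the-envelope Amdahl's-law sketch; all numbers hypothetical.

def overall_speedup(accel_fraction, accel_factor):
    """Fraction `accel_fraction` of the runtime is sped up by
    `accel_factor`; the remainder runs at the original speed."""
    return 1.0 / ((1.0 - accel_fraction) + accel_fraction / accel_factor)

# To see 30X end to end, nearly the whole pipeline must sit on the
# accelerated path: even a 3.5% CPU-bound remainder caps you below 30X,
# no matter how fast the GPU portion gets.
print(round(overall_speedup(1.000, 1000), 1))  # fully accelerated
print(round(overall_speedup(0.965, 1000), 1))  # 3.5% left on the CPU
```

This is why the stack integration matters as much as the GPU itself: leftover serial work in the platform layer would quickly erode a headline speedup.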
They have an opportunity to modernize, to move to modern apps, and they're getting a tailwind from this accelerated digital transformation. How does AI democratize this? Because you've got people and you've got technology, right? So share your thoughts on how you see this democratizing.

Yeah, that's a very good question. If you look at how people are running AI applications today, you go to an enterprise and you would see that there is a silo of bare-metal servers on the side where the AI stack is run, and you have people with specialized skills, and different tools and utilities, to manage that environment. And that is what is standing in the way of AI taking off in the enterprise. It is not the use cases; there are all these mission-critical use cases that companies want to pursue, and worldwide that has been the case. It is the complexity that is standing in the way. So what we are doing with this is saying: hey, that whole solution stack Manuvir talked about is integrated into the VMware virtualized infrastructure, whether it's on-prem or in the cloud. And you can manage that environment with the exact same tools, processes, and skills you have traditionally used for running any other application on VMware infrastructure. You don't need anything special to run this. And that's what is going to give us the acceleration we talked about and deliver the democratization of AI.

That's a great point. I just want to highlight and call that out, because AI is in every use case. You could almost say theCUBE could have AI, and we do actually have a little bit of AI in some of our transcription work. But it's not just about use cases; it's not just saying it, you've got to do it. So taking down that blocker, the complexity, certainly is the key. And that's a great point; I'm going to call that out afterwards.

All right, let's move on to the second part of the announcement, Krish.
Project Monterey. This is a big deal. And it looks like kind of an elusive architectural thing, but it's directionally really strategic for VMware. Could you take a minute to explain this announcement and frame it for us?

Absolutely. John, you remember Pat got on stage last year at VMworld and said we were undertaking the biggest re-architecture of the vSphere platform in the last 10 years, and he was talking about natively embedding Kubernetes in vSphere, right? Remember Tanzu and Project Pacific. This year, we are announcing Project Monterey. It's a significant project with several partners in the industry, with NVIDIA as one of the key partners. What we are doing is reimagining the architecture of the data center for next-generation applications. At the center of it, we are going to re-architect vSphere and ESXi so that ESXi runs not only on the CPU but also on the SmartNIC. What this gives us is the ability for data center infrastructure services to be offloaded from the CPU onto the SmartNIC. So what does this provide the applications? First, the applications will perform better. And secondly, it provides an extra layer of security for next-generation applications.

Now, we are not going to stop there. We are going to use this architecture and extend it so that we can finally eliminate one of the big silos that exists in the enterprise: the bare-metal silo. Today we have virtualized environments and bare metal, and what this architecture will do is bring those bare-metal environments under ESXi management as well. So ESXi will manage environments that are virtualized and environments running a bare-metal OS. That's one big breakthrough and simplification: further elimination of silos, and further elimination of the specialized skills needed to keep it all running.
And lastly, but most importantly, where we are going with this touches on the question you asked earlier about software-defined infrastructure and developers being in control. Where we want to go is to give application developers the ability to define and compose their runtime on the fly, dynamically. Think about it: they dynamically describe how the application should run, and the infrastructure essentially attaches compute resources on the fly, whether those resources sit in the same server or somewhere out in the network as pools of resources, brings it all together, and composes the runtime environment for them. That's going to be huge. They won't be constrained anymore by the resources tied to the physical server they are running on. That's the vision of where we are taking it. It is going to be the next big change in the industry in terms of enterprise computing.

Sounds like an operating system to me. Yeah, runtime, assembly, orchestration, all these things coming together. Exciting stuff. Looking forward to digging in more after VMworld. Manuvir, how does this connect to NVIDIA and AI? Tie that together for us.

Yeah, it's an interesting question, because you would think, okay, NVIDIA is this GPU company or this AI company, but you have to remember that NVIDIA is also a networking company, because our friends at Mellanox joined us not that long ago. And the interesting thing is that there's a yin and yang here, because Krish described the software vision, which is brilliant, and what it does is impose a lot of work on the host CPU of the server. So what we've been doing in parallel is developing hardware, a new kind of NIC, if you will; we call it a DPU, a data processing unit, or a SmartNIC, that is capable of hosting all of this.
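The dynamic composition Krish describes can be sketched as a toy allocator. This is a hypothetical illustration only (none of these names come from Project Monterey): a developer declares what a runtime needs, and a composer draws matching resources from pools anywhere on the network rather than from one physical box.

```python
# Toy resource composer; all pool names and sizes are hypothetical.

# Free-resource pools, keyed by (resource type, location).
pools = {
    ("cpu", "server-a"): 16,   # free cores on one host
    ("cpu", "server-b"): 8,
    ("gpu", "server-b"): 2,
    ("gpu", "rack-pool"): 4,   # disaggregated GPUs out on the network
}

def compose_runtime(spec):
    """Greedily satisfy a spec like {"cpu": 4, "gpu": 6} from any
    pool; the application no longer cares which box resources live on."""
    allocation = []
    for rtype, needed in spec.items():
        for (ptype, loc), free in pools.items():
            if ptype != rtype or needed == 0:
                continue
            take = min(free, needed)
            if take:
                pools[(ptype, loc)] -= take
                allocation.append((rtype, loc, take))
                needed -= take
        if needed:
            raise RuntimeError(f"not enough {rtype}")
    return allocation

# Six GPUs exceed what any single server here holds, so the composed
# runtime spans a server and the network pool.
print(compose_runtime({"cpu": 4, "gpu": 6}))
```

The point of the sketch is the decoupling: the spec names quantities, not machines, which is what frees applications from the physical server boundary.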
So, amusingly, when Krish and I started talking, we exchanged slides and we basically had the same diagram for our vision of where things go: the infrastructure software being offloaded, data center infrastructure on a chip, if you will. So it's a very natural confluence. We are very excited to be part of this Project Monterey program with Krish and his team, and we think our DPU, the NVIDIA BlueField-2, is a pretty good device to empower the work Krish's team is doing.

Guys, it's awesome stuff. And I've got to say, I've been covering VMworld now for 11 years with theCUBE, and I've known VMworld since its founding; just the evolution. And just recently, before VMworld, you saw the biggest software IPO in the history of Wall Street: Snowflake, an enterprise data cloud company. Enterprise tech is so exciting. This is really awesome. And NVIDIA, obviously a well-known, great brand, also a chip company, with processors and data and software. Guys, customers are going to be very interested in this. So what should customers do to find out more? Obviously you've got Project Monterey, a strategic direction, framed perfectly, and you've got this announcement. If I'm a customer, how do I get involved? How do I learn more? And what's in it for me?

Yeah, John, I would say...

Sorry, go ahead, Krish.

No, Manuvir, I was just going to say that a lot of these discussions are going to be happening at VMworld: there will be panel discussions and presentations. So I would encourage customers to look at the sessions around Project Monterey and the AI work we are doing with NVIDIA, attend those sessions, and be active. We will have ways for them to connect with us through our early access programs and so on. And then, as Manuvir was about to say, Manuvir, I'll hand it to you to talk about GTC.
Yeah, right after this we have the NVIDIA conference, GTC, where we'll also go over this. And I think some of this work is a lot closer at hand than people might imagine, so I would encourage watching all the sessions and learning more about how to get started.

Yeah, great stuff. And just for the folks watching at vmworld.com: Cloud City's got 60 solution demos. Go look for the sessions; you've got the expert sessions. Raghu, Joe Beda, and a bunch of other people from VMware are going to be there, and of course a lot of action in the content. Guys, thanks so much for coming on. Congratulations on the news, big news with NVIDIA, on the main virtual stage here at VMworld, and of course here on theCUBE. Thanks for coming on, appreciate it.

Thank you for having us.

Okay, this is theCUBE's coverage of VMworld 2020 virtual. I'm John Furrier, host of theCUBE virtual, here in Palo Alto, California, for VMworld 2020. Thanks for watching.