Hey everyone, welcome to theCUBE, live at VMware Explore 2023, theCUBE's 13th year of covering VMware customer conferences. Lisa Martin here with John Furrier. We've come from the keynote: lots of talk about multi-cloud, lots of talk about gen AI, and big names on stage. We're going to be digging into a lot about the ecosystem, which, as you know, for VMware is incredibly important and deep. We've got two guests from Wipro here. Mahesh Chandra, the SVP and head of FullStride Cloud, joins us, along with Ramu Padmanaban, VP of FullStride Cloud Solutions at Wipro. Guys, welcome to the program.

Thank you, thank you very much.

Give us a little bit of background on each of you. Mahesh, we'll start with you, and then Ramu, we'll go to you.

Thanks, Lisa. First of all, thanks for having us here. My name is Mahesh Chandra. I head FullStride Cloud for Wipro in the Americas, and I'm responsible for all the cloud business. Just 20 seconds on FullStride Cloud: we formed it in April of this year, bringing together two of our key service lines, cloud and infrastructure services, and putting digital and cloud together to give one integrated cloud to our customers and partners alike. And VMware is one of our key cloud partners. That's why we are here, to demonstrate our partnership in full scale.

Good, we're going to dig into that. But Ramu, first tell us a little bit about you.

So let me go next, Lisa. My full name, Ramachandra Padmanaban, is pretty long, so I go by Ramu, and that's what you can call me. As Mahesh said, I'm also part of the FullStride Cloud services team, but my job is really to drive client solutioning. I bring business value to our customers. You heard a lot in today's keynote about ecosystems and how you really bring them together. That's my job at Wipro: work very closely with customers and partners, and truly orchestrate and bring business value. So that's my role.
You know, Raghu's always been a great Cube guest. He's been on theCUBE many times, even before he was CEO. He's a technologist, and seeing the CEO of NVIDIA up there — Jensen and Raghu, they're smiling like they're kids, like they just discovered something, and they're giving away a GeForce graphics card, kind of a throwback. It's memorabilia. I mean, we're at an age of tech where it's there. But one of the things I loved about Raghu's speech was his focus on AI; it was awesome. We believe in the same thing, but he laid out the waves. The PC wave — oh, that's our generation, right? PC wave, PC apps; the web wave, web apps. And then he jumps to mobile, which I thought was interesting. That's fair — mobile was a great ecosystem; the iPhone created the world of mobile apps. And then he jumps right to AI apps. This is what's happening in the enterprise right now. Every single enterprise sees it across departments. He laid out the chart up there too that showed the impact — not just a ChatGPT consumer thing. There's real departmental impact across the board for every single enterprise. That means new apps are coming, and they've got to land somewhere. This is a huge opportunity and a challenge as well. This is the market for enterprise AI. This is what you guys do. What's your reaction to that? You guys are like, okay, rolling in the money. Is business good? I mean, it's got to be great for you guys.

Yeah, I think you're absolutely right. It was a great keynote, and we are very excited by the six innovations that are coming. And actually, we are also very honored and proud to be recognized by VMware and Raghu as a key partner; it means a lot to us. And let me take one step back. Earlier this year, our CEO, Thierry Delaporte, as part of our journey to becoming an AI-first organization, announced an investment of $1 billion. To actually do what?
To actually increase our capability in AI, okay? As part of this, we have launched what is called AI360, an ecosystem that brings together Wipro's technology and advisory across our four global business lines — platforms, research and development, and our IPs — along with partners and talent, all under one umbrella. That's the beauty of it, right? And how are we going to do it? We have an innovation hub, the Wipro innovation hub, which we call Lab45, and it will play a key part in this. And as we speak today, all 250,000 employees of Wipro are getting trained and certified on GenAI 101 and responsible AI. This is very serious for us: from our CEO and senior management down to an engineer, everybody is doing this course, everybody is excited. They're saying, hey, we have done it, because we believe gen AI has that capability. I just want to touch upon, John and Lisa, our relationship with VMware. It's very unique. And why is it unique?

Well, hold on, before you start there, I wanted to point out that you guys are recognized as a key partner.

Yeah.

And congratulations on that.

Thank you.

And the brand is well recognized; it's on the charts and everything. So congratulations. But to explain more, how'd you get there? What was the key to success?

Yeah. So as you heard today, VMware has been around for 25 years, and our partnership with VMware has run for 21 of those years.

Wow.

It's been very successful. We have go-to-market solutions together, on virtualization and on the cloud. As I said, they're a key cloud partner. Today, we are the number one partner for VMware for all Tanzu-based application modernization projects. That's number one. We have 500-plus joint clients together. And five of our FSC centers, the FullStride Cloud centers, are actually built on VMC.
And as I said, we have a strong 21-year-plus relationship, which is really built on trust and executive-level presence. And we have made some early inroads: we have built industry-specific use cases, and we are taking them to our customers. So we are a little bit ahead of the curve. And you know what? When Hock mentioned that of the two billion, one billion will go to the partners, we are very confident — I think we are strongly positioned to get a major share of it.

So a strong, long-standing partnership, as you talked about — 21 of VMware's 25 years of existence. It sounds like a deep, broad relationship. You talked about gen AI; you've got some use cases there, which hopefully we'll get to. But Ramu, I want to bring you back into the conversation and really get your perspective on what enterprises are doing with gen AI today. What do you think the next three to five years look like?

Great question, Lisa. If I really look at it, this year, 2023, has been the breakthrough year for a lot of us. At every conference you go to, gen AI has been the hot topic. And what happened? Obviously, ChatGPT came in and everybody started utilizing it. We all started experimenting with those public large language models. And truly, if you look at it, that's all about how, as users, we are able to unearth the power of what AI can actually bring. But is that something that enterprises will truly get into? Let me unravel that — and you heard a lot about this in today's keynote as well. It is an ecosystem play, and we need a lot of the ecosystem players for enterprises to be successful. So let me put that in perspective. At the bottom of the pyramid are all your infrastructure and orchestration vendors like VMware; that's where the large compute, storage and networking come in.
You saw the power of NVIDIA and VMware coming together. You have the AI chipset vendors truly bringing in the compute power of the GPUs, which makes it much easier and more flexible for everybody. Then you have the large language model providers. That's the brain behind a lot of the investment that will go in from enterprises, because they want to make it much more private; they want to make it unique to them, which addresses a lot of the privacy concerns you actually saw today, right? Then the next layer is where developers like us need to get involved: the engineering practices of how you develop these models, how you deploy them, how you monitor them, and how you truly bring responsible AI into practice. All of that needs a good foundation, and that's when you'll see the uniqueness of the applications — whether it's the horizontal applications, larger in size, with use cases that every large enterprise will adopt, or vertical, domain-led ones. You asked me what kind of areas we are really looking at: domains like healthcare, CPG, manufacturing. That's where you see what unstructured data can actually bring. So in my view, the next two to five years are an inflection point. You'll see many of those AI apps working in conjunction with the existing vendors that are going to be there. Everything will be a maker-checker model, right? And that's where it's going to be a very interesting time.

Can you give an example of where customers are at right now in their journey? Are they kicking the tires? Are they coding? Because you're seeing a lot of acceleration even in the startup world, for instance, as a comparison.
It used to take 10 people, mostly engineers, to get a company to a certain level beyond the founders; now you can do it with three people. You're up and running. So the speed is there. Are customers that far along? Are they playing with it? What are their concerns? Is it more architectural scoping? What are some of the activities customers are going through right now?

In my view, John, customers today are in their labs looking at the right use cases. They're looking at every part of the enterprise where there are large volumes of data. We're working very closely with our customers on this. We have these AI labs where we've built a lot of use cases; a good number of knowledge miners are already available now. So we're actually focusing on that. And this is not something new that happened this year. As a company, we've been there, and Mahesh will talk a little more about some of the work we've been doing with our customers, as well as some of the research we're doing. But briefly, in terms of where customers are heading: this year is where the foundations are being put in place. A lot of architectural decisions, in my view, will happen in the next six to eight months.

I'm sure you guys probably can't talk about the intellectual property, but you're known for executing operational workflows on behalf of customers. I'm sure you're using AI internally to go faster.

Oh yeah. Just to add to what Ramu was saying: the technology is there. I work with a lot of banking customers; they know their technology very well. But it's the productionization of it — similar to what we had with self-driving cars. The technology was there 10 years back, but how do you bring it into production? It takes time. So I believe when it comes to gen AI, the technology is very much there.
I think we've got to take it responsibly. That's where, as Ramu said, the next two to three years are going to be key. So I think it's going to be very exciting as we embark on this journey as well.

Definitely not a dull moment. Mahesh, can you double-click on something Ramu was saying in terms of the work Wipro is doing with customers? How are you working together with them on gen AI use cases? How are they really helping to influence the direction Wipro is going with respect to gen AI?

So it's in two aspects, right? One is, if you go back about 10, 15 years, automation was a very big thing, and a lot of automation platforms and tools came. But what is important is the use cases. The approach we are taking is working with a lot of our partners and customers, building their use cases, because if you have the right use case, that is when they can see productivity, that is when they can see innovation. That is one facet of what we are doing. And just to go back to what Ramu was saying: for any enterprise to be successful in gen AI, you've got to look at the four layers of the stack. I'll tell you why Wipro is uniquely positioned in the gen AI journey, and why our relationship with VMware is going to succeed in gen AI as it has in cloud and other areas. If you look at the foundation layer, as Ramu mentioned, it is about compute, storage and network. We have great relationships with the hyperscalers as well as with orchestration players like VMware. We have a lot of solutions where we have gone to market together, and we will continue to do so. Then you have your model players. What do they do? They are the ones actually giving access to both your commercial and your open-source foundation models — like NVIDIA as well. NVIDIA is a customer of ours; they have a partnership with us. So we play a key role in that as well.
Then one level up the stack is where AI engineering comes in. This is where the entire model lifecycle lives — where you deploy, where you do development on gen AI models. And Wipro, over the last few years, has invested heavily in AI, and now, with the new investment from our CEO, we are going to build a lot of capabilities and practices, and we will be at the forefront of this. And lastly, I mentioned the gen AI apps. Just like Ramu said, moving from one to the other: we have a lot of institutional knowledge in the apps. We will ensure the use cases are relevant to customers, and we take gen AI and harness it for the customer. That's where we play a unique role, and our partnership with VMware will definitely take us to the next level.

That's a great point about that whole ecosystem and the partnership with VMware that you have. You know, one of the things Raghu didn't put on the chart was the cloud apps, but I can understand — he's an on-premise guy. I get that. I would have put cloud apps in there: gen-one cloud apps, and we'd say gen two is here. If you look at what cloud did for APIs, I see a similar thing with AI, and this comes back to the ecosystem and, Ramu, your piece, which is the technology. LLMs and foundational models will be data-driven, because data is AI. So data and APIs, people sharing data, the role of ecosystems — not just VMware, but connecting things. If you look at the LLM model, it's not one model to rule the world. We reported that first, but now everyone's agreeing; that's obvious now. You've got different LLMs. The on-premise dynamic with data is there. You've got public cloud AI, with use cases for that. Edge is going to have some action next. Runtime is the multi-cloud — that's what was said on stage. So, Ramu, what does the LLM picture look like?
Because you've got hyperscalers with LLMs, you've got proprietary LLMs, you've got open-source LLMs. How do you guys look at that? How do you advise customers about the LLM power dynamics — or, as we call it, the power law?

Absolutely. And this is where a lot of our work comes in, right? We call ourselves an orchestrator now, not a system integrator, because the whole focus is about composability and how you actually create the engineering to figure out which LLM you really should be adopting. Because, as you saw today in the keynote, there are four key aspects: the choice of the model, the cost, the performance, and how you make it affordable as well. So if you really look at it from that perspective, as you rightly said, a lot of research happened on the public LLMs that are now becoming commercialized. There's a lot of R&D going on with the open-source LLMs that are out there. But truly, if enterprises need to throw in those huge data sets, we need proprietary LLMs, and these proprietary LLMs need to address and guardrail a lot of it. To that point, John, we're trying to create that decoupled model. We're trying to make sure that, as an engineering practice, we have the ability to choose. And for us, this is not new. As a company, we invested in AI 10 years back. For the last four or five years, even before OpenAI came to the market, we've been working with academia on building algorithms that address this — with the University of South Carolina, and we've made a lot of investments with the Indian IITs — and we've tried to bring all of this together. So that's where I see LLMs becoming that linchpin, as I call it, to make an enterprise successful.
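The decoupled, composable model choice Ramu describes — code against one interface, keep the ability to swap in commercial, open-source, or private LLMs by policy — can be sketched roughly like this. This is a minimal illustration, not Wipro's actual engineering; the class names, cost figures, and routing policy are all hypothetical.

```python
# Hypothetical sketch of a decoupled LLM abstraction: the application talks
# to one interface, and an "orchestrator" picks a backing model based on
# policy (privacy guardrails, cost). All names and numbers are illustrative.
from dataclasses import dataclass


@dataclass
class Completion:
    text: str
    model: str


class LLMProvider:
    """Common interface every backing model implements."""
    name: str
    cost_per_1k_tokens: float

    def complete(self, prompt: str) -> Completion:
        raise NotImplementedError


class CommercialLLM(LLMProvider):
    """Stand-in for a hosted commercial model (cheap, but data leaves the enterprise)."""
    name = "commercial-hosted"
    cost_per_1k_tokens = 0.002

    def complete(self, prompt: str) -> Completion:
        return Completion(f"[hosted answer to: {prompt}]", self.name)


class PrivateLLM(LLMProvider):
    """Stand-in for a private, fine-tuned model kept inside the enterprise."""
    name = "private-finetuned"
    cost_per_1k_tokens = 0.005

    def complete(self, prompt: str) -> Completion:
        return Completion(f"[private answer to: {prompt}]", self.name)


class Orchestrator:
    """Routes each prompt to a provider by policy: privacy first, then cost."""

    def __init__(self, providers: list[LLMProvider]):
        self.providers = providers

    def route(self, prompt: str, sensitive: bool = False) -> Completion:
        # Guardrail: sensitive data never leaves the private model.
        candidates = [p for p in self.providers
                      if not sensitive or p.name.startswith("private")]
        cheapest = min(candidates, key=lambda p: p.cost_per_1k_tokens)
        return cheapest.complete(prompt)


router = Orchestrator([CommercialLLM(), PrivateLLM()])
print(router.route("summarize this claim", sensitive=True).model)   # private-finetuned
print(router.route("draft a press blurb", sensitive=False).model)   # commercial-hosted
```

The point of the sketch is the seam: because the app only sees `LLMProvider`, swapping models becomes a policy decision rather than a rewrite, which is what makes the "ability to choose" an engineering practice.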
Just to add one analogy, if you don't mind. You spoke about multiple LLMs, and I'll also say how a player like Wipro, being in AI engineering, can help. I'm talking about fine-tuning. We had two models: one is GPT-4, and the other is PaLM 2. A use case was created — I can't get too specific, but I'll give you a very high level. When you ran the prompt, GPT-4's success ratio in generating the content was 76%, whereas on PaLM 2 it was around 16%. With a little bit of fine-tuning, guess what the success ratio went to: 86%. That's the beauty of fine-tuning, and that's where I think we will play a key role.

Good point. Excellent. Guys, thank you so much for joining us on the program today, talking about Wipro, what's new there, what you're doing with VMware, and why this partnership, with respect to gen AI, is going to be really attractive to customers. You said you have 500-plus joint customers; I'm sure that's only going to grow. We thank you so much for joining us on theCUBE today, guys.

Thank you. Thanks for having us. Our pleasure. Thank you very much.

For our guests and for John Furrier, I'm Lisa Martin. Stick around, there's more great content coming from day one of our live coverage of VMware Explore. We're going to be talking with VMware executives, partners, customers, analysts, you name it. Don't go away. You're watching theCUBE, the leader in live tech coverage.