Hey everyone, good afternoon. Lisa Martin here with Dave Vellante. Welcome back to theCUBE's coverage of day two of Dell Technologies World 2023, live from Mandalay Bay. Dave and I have had some great conversations so far, and one of our alumni is back with us to break down some great topics: Chuck Whitten, Dell's co-COO, who was on the main stage yesterday. Chuck, great to have you on theCUBE. Thanks for joining us.

I'm always happy to come by. Thanks for having me.

So this is, as you said yesterday in your keynote, your second time on the main stage. Talk about what's new compared to last year. A lot of advancements, a lot of announcements, a lot of news, but compare and contrast this year with last year at this time.

You know, I hope that people walked out saying, wow, Dell delivered what we promised last year. I think this was the year of our say-do ratio proving itself. Last year we stood on stage and said, look, there's an enormous number of problems that CIOs are facing today, and we happen to sit in the middle of them. We talked about multi-cloud. We talked about the edge. We talked about security. We talked about AI. And of course you have to talk about the future of work, which was evolving. Flash forward a year, and we just delivered an enormous payload of innovation, particularly around the multi-cloud topic yesterday. So a year down the road, I felt like we delivered two years of innovation in a year.

You were saying yesterday that in a year you've spoken with hundreds of customers and partners, and they're articulating some of the same challenges; you just mentioned a few of them. Can you walk us and our audience through some of those main challenges, multi-cloud being one of them, that customers are facing and that Dell is saying, we are here to help solve?

Yeah, absolutely. So look, multi-cloud is a reality.
You know, if I go out and talk to customers, they will tell me, I'm running anywhere between four and eight cloud infrastructures, and the problem I'm trying to solve is how do I make it all act as one single platform? One unified cloud.

I think "supercloud," coined by my friend here.

That's the challenge we're in the middle of solving in multi-cloud. In security, we're being told, I have a complex ecosystem of security partners; can you help make this simpler? That's what we're talking about today with Project Fort Zero. In AI, look, you can't go into a boardroom today and not get a question about what we're doing to transform our business, our business processes, our products with generative AI. The problem to solve is, if I'm a company, I need to apply proprietary models to my proprietary data. How do I do that securely? How do I do that ethically? If you go to the edge, we talked a lot about the edge last year. We announced Project Frontier, which was our vision to simplify the orchestration of applications and infrastructure at the edge. We sort of paid that off with our announcement today of NativeEdge, but that was the problem we sat in the center of solving: how do I make it all work in remote and harsh conditions with no IT intervention? And then look, the future of work is simply this: we all pivoted from working in an office to working remote, and now the world has settled on hybrid. Maybe in your company it's two days a week, maybe it's three, maybe it's four, for very few it's five days a week. Somebody's got to make that work: in office, out of office, mobile. Those are the problems we're in the center of solving.

Taco Tuesday.

Taco Tuesday is a great way to bring your employees back for a day.

So presumably 100% of the conversations you're having with customers involve some conversation around AI.

Absolutely, yeah.

What was the progression like?
If we go back, because it's clearly the year of AI, if we go back a year ago, how much of the conversation was AI? And was it OpenAI and ChatGPT and Microsoft Bing that catalyzed it? Everybody says, hey, this is really not that much new, but it sure is new to me when I ask ChatGPT. So what was it like back then?

Yeah, look, AI is not new. Companies have been working with it for years. We've been putting it into all of our products and solutions for years. We've been applying it to our business for years, whether it's our forecasting, our supply chain, or our services organization. What's happened with ChatGPT is it's upped the visibility and the amplitude of how companies are looking at it. It's raised awareness. Look, I like to joke that when my kids are asking me about AI, you know we're at a certain point in the hype cycle. And I think that's great. But the difference now is that companies are saying, okay, there are these large language models, these large, unbounded, public-data models. How am I going to do that on a subset of my data? Where the conversation is evolving is from narrow, ML-focused conversations inside a company about AI to, okay, there are these great unbounded public-data models I can use; how do I get that onto my data to solve my specific business problem? We have some great examples on the floor today of applications of AI in those environments.

Well, it created massive awareness for anybody who can speak a language. You made a comment about your end-to-end leadership, and you called it, I think I got it right, "built for this moment." What did you mean by that?

Well, look, I think this moment in technology is fundamentally about data. It is about controlling it, securing it, and most importantly extracting value from it. We're the leader in data storage. We're bigger than number two, number three, and number four combined.
We have the IP and the capabilities to solve multi-cloud, but also to solve a lot of these data problems. So that's one thing. And then I think we're positioned as a natural partner in a very complex ecosystem. One of the things that makes us different is not just our asset position as leaders from the PC to the core data center to the cloud and all the data that spans across it, but also that we're natural partners. We partner with everybody, and you've seen that in the announcements: Microsoft, Red Hat, VMware, Databricks. Because we are not confused. We're not a cloud trying to compete with the other clouds. We're following customers, and that's the DNA Michael has always had as he's evolved the company over the years.

I wonder if you can comment on this, Chuck. It's all about the data, absolutely, but I feel like most companies don't have their data act together. They're not able to put data at the core. I remember talking to Jen Felch about this. Like, how do you deal with it all? You've SaaS-ified your company, so you take the data out of the SaaS apps and stick it into an enterprise data warehouse. That's how you put it at the core, because that's the best you can do today. So do companies have to figure out their data architecture in order to solve for AI, or is AI going to solve that for them?

Yeah, I think it's both, right? The reality is you have to start with the business problem: what am I trying to solve, and what data do I need to solve it? Because we are generating data so fast, and so distributed, as you said, Dave, that you have to make choices. You have to make choices in your infrastructure. So certainly the objective is not to put all of the data in a single unified lake or warehouse; that's not practical, not cost effective. The art is going to be, how do I get the right data in the right place at the right time?
That's what we're solving with the multi-cloud architectures we're talking about.

Talk a little bit about that, from the multi-cloud-by-design lens. For customers to be able to get value out of data, they've got to know where it is. Every company is a data company; every company has to have intelligence. How is Dell positioned now, with the announcements, to solve that problem? And maybe share a favorite example of a customer that was multi-cloud by default last year and this year has gotten that much more strategic, by design, with Dell.

Absolutely. Well, with the announcements yesterday, what we've effectively given customers is the core building blocks to make their multi-cloud work. Today you wake up with those four to eight clouds sprawled across your estate. Now look at what we announced yesterday: our APEX cloud storage offerings, originally called Project Alpine, putting file, block, and object storage in the public clouds. Let's just stay in an Azure environment for a minute. Say you have APEX Block Storage in Azure, and on premises you now have our APEX Cloud Platform for Microsoft Azure. What we've created with our IP is a common data layer, a common storage layer: our underlying software and storage assets in the public cloud and on premises. That lets me enable mobility and put the data in the right place. Now, we did something extra that I think is really exciting and maybe got a little underplayed yesterday, which is APEX Navigator. I called it our air traffic control: if you're in the cloud and you're on the ground, you need air traffic control. What Navigator allows you to do is see where all of that data is across your multi-cloud estate, which is exactly what you need to unlock that data mobility. And over time you can imagine us putting lots of capabilities into Navigator.
FinOps, AIOps, things that allow you to truly optimize cost, performance, whatever you're solving for, across multi-cloud. So we've basically given customers the tools to take this sprawl of clouds and start to make them coherent around our storage IP, in effect.

So Chuck, companies like Dell have been using AI for years. Customers don't even necessarily know about it, or maybe they do because you talk about it, but it's there by default.

By default, sure.

So that's cool. But when I look at the data in terms of where customers are spending on AI, or deploying resources, it's the big three clouds, it's Databricks, and now of course OpenAI pops up in a big way. Do you see that dynamic changing, where you offer large language models as a service inside of APEX, where you're actually selling AI?

Well, yeah, and let's start with the announcement today, because I agree with you: today a customer's choice is, build my own large data center or go to a public cloud. Today we're trying to make it simpler inside their data center, with Project Helix and NVIDIA. We bring the infrastructure and the services; they bring obviously the accelerators, but also the AI models and the software, in an integrated stack. That makes it very simple for a customer who wants to apply a large language model to their proprietary data: my proprietary data in a proprietary setting. Now, over time, could you imagine us offering that as a service as part of APEX? We'll follow the customers. If you look at where we've built out APEX over the last year, it's been through customer pull, right? Customers saying, hey look, I want APEX PC-as-a-service included in my console experience; I want bare-metal compute, the APEX Compute we announced yesterday. So do we go there? There are a lot of companies offering AI as a service out there.
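The pattern Chuck describes, applying a pretrained general-purpose model to a company's proprietary data without that data leaving its environment, is commonly implemented as retrieval-augmented generation: retrieve relevant internal documents at query time and hand them to the model as context. A minimal sketch in plain Python, with hypothetical document and function names (not Dell's or NVIDIA's actual Project Helix stack):

```python
# Retrieval-augmented generation (RAG) in miniature: instead of training a
# proprietary LLM from scratch, retrieve the most relevant internal documents
# for each query and ground the model's prompt in them.
# All names here are illustrative, not any vendor's actual API.

def score(query: str, doc: str) -> int:
    """Crude relevance score: count of shared lowercase words."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents most relevant to the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble a prompt that grounds the model in retrieved context.
    In a real deployment this prompt would be sent to an LLM endpoint."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical proprietary documents that never leave the data center.
private_docs = [
    "Q3 supply chain forecast: component lead times improved 12 percent.",
    "HR policy: hybrid work is three days per week in office.",
    "Security runbook: rotate service credentials every 90 days.",
]

prompt = build_prompt("how many days per week in office?", private_docs)
```

A production version would swap the word-overlap scorer for vector embeddings and a real model endpoint, but the shape is the same: the proprietary data stays put, and only retrieved snippets reach the model.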
Certainly we're in the business where you can buy our infrastructure, you can subscribe to our infrastructure, and increasingly you can subscribe with a managed service. I could certainly see AI going there.

Your strategy is not to build a large language model that's Dell's large language model and sell that, maybe along with ecosystem partners. Or is it?

It's clearly got to be ecosystem partners.

That's your ecosystem.

It's ecosystem partners, and if you listened to Jensen today, they know a lot about these large language models and the open-source models that are out there. What you want to be able to do is give that to a company that maybe doesn't have the skill set to write its own large language model, which is most companies in the world, right? They want to be able to take a reference model and put it on sort of a standard stack, and that's what we're really good at doing.

So you've got two children, right? Two boys, in their teenage years. I've got kids who are a little bit older. A lot of schools are saying you can't use ChatGPT. Do you agree with that, or do you think we should be teaching how to prompt it?

I don't agree with it. This is not a Dell position; this is a Chuck Whitten position. I think this is a thoughtful dialogue society is having, but look, it would be like telling me, when the internet came out, don't use the internet for research. Anytime there's a new tool, you can use the tool for good or you can use the tool for bad. So what I tell my boys when I talk to them about this is: this is an amazing instrument. Don't cheat with it, but use it to enhance your creativity, enhance your productivity. That's the AI conversation the world's having. Then there's the conversation about whether swaths of jobs are going to be eliminated. Some jobs will probably be eliminated.
The cost of cognitive labor is coming down, but what's really happening is we're putting a force multiplier on productivity with AI, and that's how I think you have to think about it.

Well, I was watching Michael on CNBC, and they keep coming back to this, but I don't understand what people want him to say. Yes, it's going to affect jobs, but machines have always replaced humans. Now it's cognitive, so I think that's why people are a little more scared. But there's a big discussion around debt levels; of course, I talk about it all the time, with debt at 120% of GDP. Some people think it's going to 200 before it goes back to 100, which is probably true. But if you look at productivity, there's been a productivity lag in the US for the last 10 years. Cloud didn't change that. It's probably because cloud, mobile, and social media distracted us from being more productive. You know, this is the Dave Vellante theory. How do you think AI is going to affect that? Can it get us out of the debt issue, the concerns about interest rates and recession, or is it just too early in your view? I know it's hard to predict.

Yeah, without stepping into the grand unified theory of economics and interest rates and what they do, I think fundamentally this is a productivity accelerator, and it will be for society. I think we're just starting to scratch the surface of what we can imagine AI is going to do. So whenever I look at pessimistic forecasts about our productivity or the world's productivity, and then I look at the capability of these tools, I think you start to sleep easy at night and think it's going to be an accelerant.

Do you think it can exceed, I mean, obviously the industrial revolution was sort of the GOAT of productivity improvements, but the PC did pretty well. The internet did okay. Do you think this will exceed that?

What can I tell you? I do. I think it's a massive transformation.
Again, like all new things, and I mentioned the hype cycle earlier, we will probably overestimate the near term and underestimate the long term. That's sort of my view of where it's at.

I've got to get my security question in too, because you said customers are telling you, don't be another security vendor; change the security industry. What did you mean by that? That seemed to provoke a bit of a reaction.

I think I also called the security industry broken. I called it a do-over. But look, the world out there has layer upon layer of security, right? That is the way the security industry has been built, and that is not to take away from our many great partners. You need good endpoint security; you need all of these security assets out there. But they are in individual silos. The power of the concept of zero trust is this: zero trust is not something you can buy, but we think it is the gold-standard framework around which the world is going to organize to build security into infrastructure. That's what I think customers want. So when you hear us announce Project Fort Zero today, what we're effectively doing is taking a customer zero, a very sophisticated customer zero, the Department of Defense, and saying, we're going to work with you and 30 other partners to create a full-stack zero-trust infrastructure, and then we're going to industrialize it for the world. That's what the world needs. They don't need more point-specific security solutions; that's not working today. So that's what I mean: we've positioned ourselves uniquely, given our position in infrastructure and our position with the US government, to be at the center of those conversations. It can be made simpler. It's never going to be made foolproof; there's always going to be a security challenge.
That's why we offer our data protection business, our cyber vaults; all of that has to help you recover. But it can be done better, and that's what we're off trying to do.

And where do security ecosystem partners play? Is it bring your own identity? I know you have preferred partners, presumably, like your deal with CrowdStrike for small business, which I think is really interesting. Where do those guys play? Are they part of that ecosystem?

They absolutely have to be. Zero trust is simply going to organize all of us around common frameworks. If you approach security today as trust-but-verify, zero trust says never trust, always verify: what is the framework around which we're going to do that? And of course, whether it's identity, whether it's endpoint security, whether it's network security, we all have to play together to be able to do that. But it needs to be organized around a single principle. That's zero trust. That's what we're trying to do with Project Fort Zero.

So Chuck, take us home. So much momentum, so much has gone on in the last year alone. Next year, when you're sitting in this seat or on the main stage, what are some of the things you think you'll be reflecting on in the next era of Dell?

I love being here every year and saying we paid off what we told you last year. So look, the work is never done on multi-cloud. You've heard us make big announcements with Red Hat OpenShift, with VMware, with Azure, and I would expect us to be making more announcements next year. I would like to see us standing here next year saying, wow, Project Fort Zero paid off; we can see fully integrated, industrialized zero-trust solutions that customers are applying.
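The never-trust-always-verify principle Chuck contrasts with trust-but-verify can be sketched as a per-request policy check: every request is re-evaluated against identity, device posture, and an explicit policy, with no implicit trust granted for being "inside" the network. A toy illustration in Python, with hypothetical field names and policy (not Project Fort Zero's actual design):

```python
# Zero trust in miniature: no request is trusted by default, even from
# inside the corporate network. Every access decision re-verifies identity,
# device posture, and authorization. Field names are illustrative only.

from dataclasses import dataclass

@dataclass
class Request:
    user: str
    mfa_verified: bool        # identity proven this session
    device_compliant: bool    # endpoint passed its posture check
    resource: str

# Explicit allow-list policy: which users may touch which resources.
POLICY = {
    "alice": {"payroll-db"},
    "bob": {"build-server"},
}

def authorize(req: Request) -> bool:
    """Never trust, always verify: every check must pass on every request.
    Contrast with perimeter ('trust but verify') models, where being on
    the internal network would grant implicit access."""
    return (
        req.mfa_verified
        and req.device_compliant
        and req.resource in POLICY.get(req.user, set())
    )

# Same user, same resource: access hinges on fresh verification each time.
ok = authorize(Request("alice", True, True, "payroll-db"))      # True
stale = authorize(Request("alice", True, False, "payroll-db"))  # False
```

The point of the sketch is architectural, not the particular checks: identity, endpoint, and network products from different partners all feed signals into one explicit authorization decision, which is the "organized around a single principle" idea above.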
I think next year we'll be sitting here talking about an AI business that isn't just oriented at the largest, most sophisticated large language models, but at the spectrum of opportunity we started talking about over the last couple of days: from a Precision workstation, which can do a lot, to our standard servers, all the way to that eight-GPU XE9680 we're offering today. So I think we're going to continue to chip away at those big problems I called out on stage yesterday.

Project Helix, if you can get GPUs.

Project Helix, look, I think the conversation on GPUs and Project Helix is really important. But what we're trying to remind the world is that there is a spectrum of applications in AI, and you can do a lot with a standard server. It's all compute; sometimes it's a latency and time trade-off. As an example, we have a demo running over here that takes all of our proprietary data on APEX and puts it into a virtual human interface. That's running back in our Palmer labs, orchestrated by a standard server. Can you cut the latency down if you ran it on a 9680? Sure you can. But there's a lot of room for AI across the infrastructure stack, not just GPUs.

Thank you. Well, we will be watching this space closely, as SiliconANGLE and theCUBE do. Chuck, thank you so much for joining us on the program.

Thank you, it's a pleasure.

All of the exciting things, the challenges you're helping customers solve. We can't wait to see you next year.

I look forward to it. We'll see you there.

All right. For Chuck Whitten and Dave Vellante, I'm Lisa Martin. Up next, Varun Chhabra and Kari Briski join us, talking about generative AI and some great news that came from Dell and NVIDIA this morning. Stick around; we'll see you in a minute.