Welcome back to SuperCloud 4, where we're exploring the impact of generative AI. In particular, we're interested in the disruption potential of AI, the importance of quality data, and how organizations can protect their IP and still keep pace with market movements. With me is Jeff Boudreau, the chief AI officer at Dell Technologies. Jeff, good to see you. Thanks for coming into the studio.

Thank you for having me.

You're welcome. So, new role. Why did Dell decide it needed to appoint a chief AI officer? You must be a popular guy.

Yeah, popular guy right now, a lot of interest after we announced it. Just stepping back on how we're thinking about AI, and specifically gen AI, as we go forward: it's a defining technology of our era. We truly believe that. If you think about where the industries are going, what gen AI is being used for, how it's disrupting our everyday lives, how people use the technology and how businesses operate, we think it's extremely important for us to have focused and dedicated leadership in this space, specifically as boards and C-suites are shifting money toward AI projects. We thought it was important to make sure we had the proper leadership and focus in place at Dell.

No question. We've reported on this. It's not like the top line of IT spending is growing; organizations aren't just spending like they used to in 2020. So AI is really stealing from some of these other initiatives. But what excited you about this role? Why Jeff Boudreau?

Sure. Before I go into that, to hit your last point: yes, I'm definitely seeing some, as you would say, stealing from one project to another to fund this and get it going. But over the long run, I see a huge opportunity for growth and expansion of our TAM, and also of our set of services.
So from hardware to software to services, it really brings it all together and expands our TAM and our opportunity to help our customers.

Now, what got me excited about the role? It's all about tech. I have a good mix of business and technology background, and a long track record of leadership. Think about the last four years of leading ISG: merging the Dell and EMC storage teams together, taking over compute, storage and networking, and driving industry-leading technologies and infrastructure across those domains, while running large businesses; the team was over 25,000 strong. So I've driven plenty of transformation initiatives. The opportunity to take tech and business, merge those together, and provide a better outcome for Dell, and also for our customers, was really, really exciting.

What does a chief AI officer do?

It's a great question. As a baseball person, I'd say it's early innings. We're in the early days, so the definition is still evolving across the industry. In a heavy technology company, it's usually a very technology-centric person in the role. In other industries, maybe in commerce, you'll see more of a business and marketing slant. So the role is evolving depending on the industry and the type of company you are. For me specifically, my role is to define and refine our AI strategy. It's about setting governance, policy, practices and guardrails, and I'm also responsible for the prioritization, development and implementation of the AI projects that cut across the company. This is not BU by BU; this is a true pan-company initiative, and that's where the focus and direction come from.
So, I believe I'm correct that Dell historically hasn't had a chief data officer; I think Jen kind of took care of that, so you've kind of leapfrogged it. My question is, are you seeing other organizations institute similar chief AI officer roles? Because it gets kind of funky, right? You've got the chief data officer, which has always overlapped somewhat with the CIO. What are you seeing at other organizations?

It's a great question. Do I see others appointing a chief AI officer? Absolutely yes, it's happening, and in multiple industries, so I won't, and can't, take credit for being the first, because I definitely wasn't. But the role is a morph. As I said before, there are people coming into it from the CTO side, people coming from the CIO side, and people coming from the data side. Again, we thought it was important, since it's such a big initiative, to have dedicated focus and support, and someone who could actually drive the transformation across the company. As part of that, I have to collaborate with all my internal stakeholders across every function, whether it's legal, finance, sales, services, go-to-market or supply chain; I've got to cut across everybody, working with Jen and Dell Digital to provide a great architecture and set of services for our own internal use so we can be more effective there. It's a big role that cuts across a lot of different functions, and I look forward to working with Jen on moving our data strategy and our technology strategy forward so we can realize the potential of AI.

Great. At DTW you put forward a framework; I think there were four pillars to Dell's AI strategy. Can you review that, and what's changed since May?

Sure. The strategic framework you're referring to from Dell Tech World has four key areas.
It's AI in, AI on, AI for and AI with, and I'll unpack that a little. AI in is about embedding AI into our offers so they're more intelligent; a simple example would be something like Dell Optimizer, making our solutions more efficient and autonomous by nature. The second is AI on. Think of that as us building world-class infrastructure so customers and partners can run their AI and ML workloads on our solutions. An example would be Project Helix, where Dell and NVIDIA brought hardware and software together to build a stack that helps our customers deploy AI and gen AI use cases in their on-prem or cloud environments. AI for is two-pronged for me. The first prong is AI for ourselves: it's early innings here, but how are we using AI capabilities and tools to improve Dell Technologies' internal and business processes, give our team members a better operational experience, and be more effective? The second prong is AI for our customers: taking those lessons learned, those best practices, and potentially a new set of service offers we could bring to market to really help them on their journey. It's about sharing knowledge with our customers. And lastly, AI with. You've known us for a long time; we have embraced an open ecosystem, and this is no different. AI with is about having strategic partners up and down the stack, from hardware through the platform layers into software, aligning ourselves with strategic partners to bring a better set of solutions to our customers.

So that might be software companies, LLM providers...

Yeah. So think of the stack, and I'll start with hardware, real simplistically.
Think of the three layers of a modern AI stack. At the hardware layer you've got infrastructure: think of what we did with NVIDIA and the GPUs in our XE9680s, having world-class silicon in our compute nodes as a foundational element. In the software, you start thinking about things like the abstraction layers and the OSs; it could be things we're doing with Red Hat and some other OSs, if you will. Then you come up the stack into the platform layer, where the tools and the models live, really to make the day in the life of the developer much easier, so they can develop AI and ML applications on that infrastructure for their enterprise. We do partnerships there, like with Meta on Llama 2, or Falcon; in full transparency, there are a lot of folks in that bucket. Then you get up into the higher-level application layer, and this is where we've done partnerships with folks in the data area like Snowflake, or recently with Starburst, but there are so many more up there. All the way through the stack, those are just some examples; we're going to be partnering to provide world-class solutions to our customers. The unique thing is you have to wrap all of that in services, because right now these are all kind of independent piece parts, and our opportunity, where I think we're uniquely positioned, is being the integrator that brings a lot of those different pieces together in that open environment.

And you've made some announcements since May. I think there was a reference architecture, and I saw some things on Apex at the financial analysts meeting.
Yeah, we have. Going back to where you were a minute ago, DTW was really about setting up the foundation for us to enable our customers to embrace and drive AI in their own enterprises and organizations. The key things we did: one, we expanded our Apex offer, which was about business agility, workload flexibility and a whole bunch of other things, a great set of work by the team. In addition, there was the introduction of Project Helix, the work with NVIDIA bringing hardware and software together for our customers. The other big one was around cloud-native edge. I know you're going to be talking about edge later in the conference, but it's really about building that software platform and delivering it in a very easy and secure way for our customers to access and use data at the edge. And lastly, there was a whole suite of AI and gen AI solutions that cut across multi-cloud, across data and data preparation, across security, so many different areas. A lot of great innovation by the team to help our customers take the next step in this journey.

In just a few months.

Yeah, things happen fast.

I want to come back to how Dell is using AI and what you're learning. A lot of customers are experimenting; they're using things like ChatGPT for marketing copy, writing code, stuff like that. But they're really trying to figure out where the value is. Where are you seeing the value?

Sure. As I think about the customer landscape and what's going on in the industry right now, from the people I'm talking to across a lot of different industries, you're exactly right: they're in the assessing, learning, POC-ing kind of phase. We're all learning together.
With that said, there are three or four key use cases that are really developing and maturing. One is around customer operations, both pre- and post-sales: how do we enable our agents and our customers to have a better experience? There's a whole area around content creation and management. And then there's a lot around software developer productivity: how do we make team members more efficient, so our developers, who are highly thought of, highly paid, very talented people, are working on the most important things versus the noisy things?

Now for Dell specifically, we're leaning into a bunch of use cases. In the spirit of AI for, we have a modernization effort going on around improving our entire operation. One focus is sales and go-to-market enablement: how do we enable our sellers to be more productive, and how do we provide a more personalized experience for our customers as we go to market? The second one is our customer operations on the post-sale support side, where we're doing a lot of great and interesting work. If you talk to some of our customers, they'll tell you, hey, we love your support teams, we always get the right answer; unfortunately, sometimes it takes too long to get to the right person or the right answer. We're doing some creative work with AI and gen AI to flatten that entire ecosystem. Think about giving agents a ChatGPT-style prompt where they can put an error log in, and in seconds, drawing on our knowledge bases, our best practices and our white paper guides, it tells them: here's your error.
Here are the specific actions you need to take, and here's a link to the knowledge base showing exactly what the problem was, if you want a reference. There's also an opportunity for an agent to click a button and say, I want to send the customer a communication at a regular interval when something happens. So we can take a lot of that latency and noise out. Those are some of the big ones we're focused on as a team.

One real example, a great example from my early days: one of our products, and I won't use the name, had a memory leak. The issue had persisted for a period of time, and it was causing frustration with some of our big customers because they were feeling the pain. In addition, it was costing us a bunch of money, because I was throwing resources at the problem to help one of our customers, because I wanted to do the right thing by them, but that meant taking our talented people and saying, hey, we've got to go figure this out. And our teams were asking: did I write the code wrong? Did I have a syntax issue? What was the problem? This particular product is part open source and part our controlled IP. We took the open source part, put it into one of the models, and said: just walk it and tell us if you find any issues. Within two minutes it came back and told us we had an order-of-operations issue; two of the lines were simply in the wrong order. There was no syntax issue, so what our team had written was right. It was just the order of operations, do this first and then that, that was causing the memory leak. Something we'd been working on, I won't give you the whole timeline, but for a long time, with a lot of money and a lot of frustration, and we were able to find it in less than two minutes.
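The error-log triage workflow described above, pasting a log into a prompt and getting back the matching knowledge-base article and next steps, is essentially retrieval plus generation. A toy sketch of the retrieval half follows; the article names, keywords and error strings are all hypothetical, and a production system would use embeddings and hand the top hit, plus the raw log, to an LLM to draft the agent-facing answer:

```python
import re

# Toy retrieval step for an error-log triage assistant.
# Entries and scoring are illustrative only; real systems would use
# embedding similarity over the full knowledge base.
KNOWLEDGE_BASE = [
    {"id": "KB-1001", "title": "Resolving controller memory leaks",
     "keywords": {"memory", "leak", "controller", "oom"}},
    {"id": "KB-1002", "title": "Recovering from RAID rebuild timeouts",
     "keywords": {"raid", "rebuild", "sync", "timeout"}},
    {"id": "KB-1003", "title": "Firmware upgrade checklist",
     "keywords": {"firmware", "upgrade", "version"}},
]

def triage(error_log: str) -> dict:
    """Return the article whose keywords best overlap the error log."""
    tokens = set(re.findall(r"[a-z0-9]+", error_log.lower()))
    return max(KNOWLEDGE_BASE, key=lambda art: len(art["keywords"] & tokens))

hit = triage("ALERT: controller OOM, suspected memory leak on node 3")
print(f"{hit['id']}: {hit['title']}")  # KB-1001: Resolving controller memory leaks
```

The keyword-overlap scoring is the simplest possible stand-in; the point is only that the "seconds, not days" experience comes from automating the lookup an agent would otherwise do by hand.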
We made the change that same day, did test and validation, and had a fix back in the customer's hands in less than two weeks. That, to me, was the aha moment.

Wow, that's amazing. And back to what you were saying before about the customer and the service capability: the customer who doesn't get an immediate response might be Googling it, unsure whether what they find is accurate, waiting in frustration. So the value there is obviously customer satisfaction, but the reduced cost of delivering that service also drops right to the bottom line. Those are the types of use cases I'd expect to really start driving AI to gain share. In other words: I'm spending on AI, I'm getting value, now let's spend more. It might even trickle back to some of those other initiatives; to your point, it raises all ships.

It raises all ships, it absolutely does. We definitely see an impact on customer experience, which we think will have a direct impact on stickiness and revenue opportunities going forward. We definitely see productivity benefits for our customers and for ourselves, and then they can figure out how to redeploy those resources, right? Some of that could go back to the P&L to be more profitable; some could go back into future innovation, or into higher-value projects versus the noisy projects. So there's a lot of opportunity there.

I want to bring up the power law of gen AI that we developed a couple of months ago. The vertical axis is the size of the model, so you've got the big cloud guys, the Llama 2s, OpenAI, on the left-hand side. The horizontal axis is model specificity. The orange line is a historical example from the music industry, where very few labels dominated and there was a really long tail.
Our premise is that all this open source, the red lines, is pulling to democratize AI, and in the middle you see on-premises, specialized, highly domain-specific models. That's what you just described: using your data. Now my question is, will that happen on-prem? Our premise, and of course it's self-serving, is that it will. Are you talking to customers who say, I don't want that IP leakage, I want to do this in my own estate, I think it could be more cost-effective, safer, more secure? What are you hearing from customers?

The big thing I'm hearing from customers is, as I said, it's early innings and it's still immature by nature, and the enterprises truly haven't seen the benefits yet because they're working through this. The biggest things I'm hearing are concerns, a handful of them in full transparency. One is complexity. Two is talent: do we have the right talent to do the data prep, build the model, train the model, tune the model, prompt the model, what have you? In addition there's the concern about security: is the attack surface bigger or not? And the biggest thing they're talking to me about is IP control, where you just ended. So they're worried about all of that. And they're realizing there's an opportunity here: they could do this a lot more cost-effectively, with a lot more control over their data, their security and their IP, if they use smaller, on-prem models with some of the open-source tools that allow them to do that. That's something they're all leaning into. So what I'd say to you right now, as we've talked about in the past: all data is not created equal. It is a hybrid world, so I do see a hybrid AI world as well.
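A hybrid posture like the one described here is often implemented as a simple outbound guardrail: anything flagged as sensitive stays with the on-prem model, everything else may go to an external API. A toy sketch follows; the patterns and tier names are illustrative placeholders, not any real policy, and production systems would layer in DLP tooling and trained classifiers:

```python
import re

# Hypothetical markers for content that must stay on-prem.
SENSITIVE_PATTERNS = [
    r"\bconfidential\b",
    r"\binternal[- ]only\b",
    r"\bSRC-\d{4}\b",  # a made-up internal source-tree identifier format
]

def route_prompt(prompt: str) -> str:
    """Return which model tier should serve this prompt."""
    for pattern in SENSITIVE_PATTERNS:
        if re.search(pattern, prompt, flags=re.IGNORECASE):
            return "on_prem_model"  # keep flagged content in the estate
    return "external_api"           # nothing sensitive detected

print(route_prompt("Summarize this public press release"))          # external_api
print(route_prompt("Review the confidential design doc SRC-0042"))  # on_prem_model
```

The interesting design question is where the routing threshold sits per use case, which is exactly the cost-versus-control trade-off customers are weighing.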
Depending on the use case, the value of that use case, and where you can get the best service, whether in a cloud, in a colo or in an on-prem environment, I do think we're going to see a hybrid model. I believe it even more because of the critical IP; I think more than 80% of the world's most critical IP is still on-prem. That plays to our hand, and to your point, it's a huge opportunity.

So you described earlier that modern AI stack. I'm comfortable that for the core infrastructure and the OSs, there's plenty you can bring on-prem that's world-class. Are you confident that with the tools, the models and the LLMs, you'll be able to create as rich an experience as the cloud guys?

The answer is, we have to, right? That's the quick-and-dirty answer, but we have to make it simple on-prem for our customers so they can do what they want to do, which is keep their critical IP on-prem. So the answer is, I think we can, with the open models. You made the comment about democratizing certain things, and as you and I were talking before, one of the things I have in my head is that one of the defining technologies of our time was the World Wide Web in 1989, and it democratized access to information. What I'm seeing with AI, and specifically gen AI, is that it's changing the game by democratizing everyone's opportunity to use AI. Beyond that, it's democratizing our opportunity to use that information, that data, in a more impactful way for our organizations and for each other in our daily lives. I think that creates tremendous opportunity, and with all these different tools, you're going to see innovation continue to happen. It's still early. I'm seeing innovation in the silicon layer and the hardware.
Think about NVIDIA, AMD and Intel, and there are another 30 or 40 startups I'm tracking in that space, all doing unique and interesting things depending on the use case. Some of that is on the big stuff we're doing with large language models; other work is going on around small models; some is for inferencing, especially at the edge. Think RAG and inferencing. So there's a lot going on in that space. You come up the stack and you think about the closed models and the open models: what Meta did with Llama 2, some really cool stuff going on there, and obviously what OpenAI has been doing. There's just so much innovation happening. There are a lot of startups doing really cool things on what I would call, going back to my old storage days when we talked about data efficiencies, where you would compress, de-dupe and thin-provision things; they're doing that to the models, making them thinner and easier, so you don't need big, huge resources to drive some of this. We're in the early innings; you're going to see so much innovation.

An innovation explosion. Jeff Boudreau, congratulations on the new role. You've got to be super excited. I'll give you the last word.

First, thank you for having me; let's start with that, it's the most important thing. I just want everybody to understand that I do believe this is an important period in time. I think AI is going to change the game in how we work and how we live. As we say at Dell, I'm a technology optimist. I truly believe it can help us make the world a better place and really drive human progress. And I want to be very clear: Dell will not be sitting on the sidelines. Just like every other technology transition, we will be at the forefront.
We will be standing side by side with our customers, and we will help them bring this open ecosystem together so they can benefit in the way that's most beneficial to them and their customers.

Fantastic. All right, Jeff, thanks again.

Thank you.

Thanks for watching SuperCloud 4. We'll be right back with more live and on-demand content from our studios in Palo Alto with John Furrier, Rob Strechay and myself. More great conversations around AI. Keep it right there.