Hey everyone, welcome to theCUBE, the leader in live tech coverage, covering Supercomputing 2023 from the Mile High City, Denver, Colorado. I'm Lisa Martin with Dave Nicholson. Dave, this is my first Supercomputing. You were here last year. I'm so excited to learn about what's going on in HPC, quantum, AI. We're going to have some great conversations over the next three days. Looking forward to it, looking forward to it. We've got an alum back with us. We have Shreya Shah, portfolio manager at Dell Technologies, and another Dave. We've got a two-Dave quorum here: Dave Kane, the co-founder of Denver DataWorks. Welcome, both of you. Thank you so much for joining us. Thank you. Thank you. Shreya, talk to us about the state of the market today. To say it's dynamic is a massive understatement. What are some of the things that you guys are seeing, and then we'll get into the partnership with Denver DataWorks? Absolutely. Yeah, one of the biggest trends that we're seeing in the industry right now is that power needs are going up significantly. And this is primarily because of TDPs, the thermal design power of silicon. That is exploding, right? So when we say silicon, we're talking about CPUs, we're talking about GPUs. And what we're seeing across our customer set is that we have folks that are sitting at single-digit kilowatts per rack of power. And there's a spectrum, and at the high end of the spectrum, we've got folks that can support north of 100 kilowatts per rack. And so, you know, as we think about being able to harness the power of AI, and as those computational needs grow, we're seeing that there is a deficit in supply relative to demand from a data center aspect: cooling, power, computational needs.
And so in the next couple of years, we're seeing customers quickly trying to pivot their infrastructure to be able to realize, productize, and support these systems in their data centers, whether it's existing facilities or, you know, new colos, whatever that may be, right? And so this data center journey or transformation, call it evolution or revolution, wherever you may be, that's already in effect. And I'm kind of thinking about it as datacenter.next. And so how Dell and Denver DataWorks come together to help answer and solve some of these problems that are emerging so quickly is what we're really excited about. Dave, give us a little bit of a backstory on Denver DataWorks. Ironically, it's Denver DataWorks and we're in Denver, so I thought I had to bring out that irony right there. Denver, and Dave and Dave, right? Yeah, but you're based in Alberta, correct? We are. We're a Calgary, Alberta-based company in Western Canada. What is the mission and vision? What was the catalyst to launch the company back in 2017? Yeah, so when we founded the company, we realized this collision course was setting up. The compute's getting hotter, it's getting faster, and for AI, you need clusters of computers, right? So the single server isn't a thing anymore, right? To actually train AI models, you need hundreds of servers in some cases. So as you put that pressure on the infrastructure to support hundreds of very hot computers, we started to look at the infrastructure and figure out where we could go to make a difference. And Shreya, talk about why Dell has chosen to partner with Denver DataWorks to really tackle some of the challenges and address the trends that you're seeing. Well, it comes down to partnership. Again, if we look at our customer side, there's a huge breadth of who we need to support. We want to provide flexibility, we want to provide diversity.
Diversity in terms of silicon choice, diversity in terms of your interconnect, diversity in terms of being able to meet the customer in their journey, wherever they may be. And so when we think about partnership and that last mile, or how we actually make everything work end to end, this is where we're very excited to work with Denver DataWorks. You mentioned that you're not talking about just single servers doing these things. You mentioned interconnect. There are a lot of folks who are talking about this as the connectivity-centric era, with connectivity right alongside the GPU, CPU, et cetera, in importance. So you're setting up very, very complex environments. You're partnered with Dell, building world-class technology. That technology could be deployed in someone's data center. What's the pitch for relying on your expertise instead of having people rack and stack it in their own data center? I assume Dell is fine either way. We're hearing from a lot of CIOs and CTOs that they just don't have the time to do that. Call it FOMO, the fear of missing out, call it the desire for time to market, but talk to us about the value proposition that you're bringing. Because, yeah, in theory, I could cobble it together on my own, right? Yeah, but it's hard, right? It's complex. So what Denver's done is we've focused on a hybrid approach where we can bring the data center to your data. So one of the CIO's challenges is also security, right? Sovereignty of their data. If I have healthcare data or something, do I want to send it up to the cloud or somewhere? I'm probably not even allowed to. So what we've done is we've built modular data centers that are fast to deploy, and we offer them as a service, so customers don't have to get into large capital expenditures or worry about where their data's going, because we're bringing the cloud to them. So in a modular way, we bring things forward, and there's an environmental angle.
We've built our data centers so they don't use water, for example. So there's something completely different that we're moving towards. And in the meantime, we're also deploying extremely large clusters with Dell and Dell Professional Services, because it's tough to put these things together. We're deploying large clusters in Houston, for example, with extremely large language models being trained there on traditional data center clusters, all put together with that shared expertise. Over time, we'll bring those to you instead of you coming to the data center. Oh, go ahead. Yeah, no, no, so would you consider these multi-tenant or single-tenant environments, or a combination of the two? It's a combination, yes. We offer cloud as a service, right? So it could be a multi-tenant environment where you're running on an NVIDIA SuperPOD, for example, and there could be others sharing it with you, or it could be your own. So it's very flexible. We build it in a flexible way so you can come and go. And to make it work with all the pressure that there is on the application set, we follow all the best practices and standards, right? So this is an enterprise-grade solution, not typically something that most companies have the ability to build themselves. It's hard. Shreya, share the impetus or the catalyst for Dell engaging with Denver DataWorks. Was that customer-driven? You talked about meeting customers where they are in this journey, in this transformation, in this revolution. Was that customer-driven? I think there were a lot of factors, right? One of the things, and I go back to this data center transformation, one of the capabilities that Dave and team have kind of brought to light is immersion cooling, okay? If we look at the journey that customers typically take, you have your free air cooling that you start with, a much simpler way to deploy.
Then you have your LAAC, or liquid-assisted air cooling, which is self-contained cooling within the node. You have open LAAC, which is self-contained cooling within the CDU, the cooling distribution unit. And then you have your rear-door heat exchanger, where you're starting to do the plumbing for your facility water. And that's where it gets very, very complicated. And then you get into your liquid cooling, hybrid liquid-and-air and complete liquid cooling, and then you have your immersion cooling. And so with that spectrum and the breadth that we have to provide to the customer, with the MDC capability and with the immersion cooling capabilities, we're very, very excited that we can serve customers not just at the low end and the midrange, but also at the high end. That prep is impressive. Yes, it's a good thing. So we always do our due diligence before coming in and talking to folks in this context. You have a PhD on staff, and her job title is Immersion. Yes. And I will freely admit that I looked at that and I thought, huh, I didn't know what that meant, and now I do. Yeah, so Amy Short is fantastic, right? And she is, in fact, a PhD. She's a chemist as much as she is a technician and a technical expert. And so Amy runs our immersion plans, because we absolutely submerge Dell servers, right? They're swimming in the hot tub, and we cool them that way without the use of water. And it's a place where we can get, well, you mentioned some rack densities in your opening. Today we've been running for 15 months with zero hardware failures in immersion at production scale. And we can run sort of 150 kilowatts per rack easily, and our technology can cool another 30 or 40% above that for the next generations of whatever's going to be launched next year by NVIDIA and AMD and Intel and Dell, right? We're ready for it from an environmental and a cooling perspective.
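The headroom claim above is simple arithmetic; here is a quick back-of-the-envelope sketch. The 150 kW baseline and the 30-40% margin come from the conversation; everything else is illustrative, not vendor data.

```python
# Back-of-the-envelope check of the immersion-cooling headroom claimed above:
# 150 kW per rack today, with 30-40% additional cooling margin for
# next-generation silicon.

BASELINE_KW = 150.0            # per-rack density cited in the conversation
HEADROOM_MARGINS = (0.30, 0.40)  # additional cooling margin claimed

for margin in HEADROOM_MARGINS:
    capacity = BASELINE_KW * (1 + margin)
    print(f"+{margin:.0%} headroom -> {capacity:.0f} kW per rack")
```

So the claimed margin would put cooling capacity somewhere around 195-210 kW per rack, which is the range the speakers suggest next-generation GPUs will demand.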
From an environmental perspective, a cooling perspective, share with us some of the main customer pain points or challenges you guys together are taking off the table. Do you want to go first? I can go first, sure. I mean, a lot of data centers can't handle the heat, right? So literally you can only put one or two of these brand-new servers, and the Dell XE9680 is one of our favorite products. We've been buying them hand over fist from Dell this year, as many as we could get, actually. And so you can only put one or two of those in a rack. So think of a seven-foot-high rack, and there's only two servers in it, because the building can't handle any more heat density than that per square foot. And you're running out of refrigeration power because air just doesn't move enough heat. So you can't put three or four or five or ten. In our racks, we put 16 side by side because of our capability. So you're going from sort of two in the space of this table to 16 in the space of this table. And that matters because to run a large language model you may need a thousand servers. So now we need a football-field-sized building with 700 or 800 racks for all the networking, connectivity, and the servers. So we take a football field and we compress it down into something that's under 900 square feet. And so there are some pain points there, right? It's expensive to operate a football-field-sized building like the auditorium we're in today and try to keep things cool. And we think we found a better way. Anything that you would add, Shreya, in terms of the challenges that you're really knocking off the table? Yeah, I think with this generative AI craze, everybody wants to harness the power of AI, like I said. But that's actually in conflict with some of the carbon emission reduction goals that we have. So how do we bring those together? And I think there's a lot of innovation in this space that will help us get there.
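The compression described above (two servers per rack with air versus sixteen with immersion) can be sketched with simple ceiling division. The per-rack counts and the thousand-server figure come from the conversation; the rack counts here are just the arithmetic, not facility data.

```python
# Rough illustration of the floor-space compression described above:
# ~1,000 servers at two per rack (air-cooled limit) versus sixteen per
# rack (immersion). Uses ceiling division since a partial rack still
# occupies a rack.

import math

SERVERS = 1000

for per_rack in (2, 16):
    racks = math.ceil(SERVERS / per_rack)
    print(f"{per_rack:>2} servers per rack -> {racks} racks")
```

At two per rack, the servers alone need 500 racks, before networking and connectivity, hence the "700 or 800 racks" ballpark in the conversation; at sixteen per rack, the same fleet fits in roughly 63 racks, which is how a football field compresses toward something under 900 square feet.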
One of the things that we've talked about, and I believe we talked about this previously as well, is heat capture. How do you optimize that? And then how do you reuse that, so that you have that circular loop where you can sort of minimize your footprint as you're going up in your computational needs? You want to be able to grow some strawberries as well. Actually, greenhouses are one of the first places, I'd say, to look at how you can recycle that energy, how you reuse that energy. But the sky's the limit here. How can you power or cool your industrial systems and your buildings and houses and pools, whatever that may be? Well, Dave mentioned hot tubs earlier. I imagine that the natural solution would be to just have, like, a spa on the outside wall of the data center. It would be perfect. A little less secure environment, but you know. Exactly, who knows, right? We'll see what the future holds. On the surface, think of how inefficient it is to heat a hot tub with GPUs. But with all that excess heat available, it starts to make sense. So it's a very serious consideration when you talk about heat dissipation. And it's not as simple, if I'm hearing this right, it's not as simple as, well, I have a data center, I've got a raised floor, I have power, I have cooling, maybe I'll just rack this stuff up. It's not that simple, if I'm hearing correctly. That's exactly right, because it's a classic engineering and physics problem. Air can only move so much heat. So does that then mean, we've seen this mix of IT infrastructure being deployed on-premises and in the cloud, or off-premises, however you want to define that. Do you think that AI, and sort of the wind behind the AI sails, is going to drive more people, because of dynamics like this, to do things as a service? Are you seeing at Dell that as-a-service is the default method for folks seeking to do modeling, at least to start out? Or are you still seeing hybrid?
What, you know, I know you can be the arms dealer and provide partners hardware that you build, and that could be a mix of directly to customers and through partners, but what are you seeing in that regard? I don't think the answer is either-or; it's an and, and it's hybrid, right? Depending on your workload and where you are in that journey, because training doesn't necessarily have to be at the largest model size. You could be doing some fine-tuning, or you could be doing some training with much smaller models. And depending on, you know, some of the things that they brought up, security, HIPAA for example, right? That will force you to consider a hybrid approach, and I think going forward that's really not going to change. And so how you make them work together is going to be the key to success. That makes sense. Let's talk about sustainability. We can't have conversations about power and cooling and heat dissipation without really understanding how you can enable organizations to get there. What's your vision for sustainable AI in the future? Question to both of you. Yeah, I think from a Denver point of view, we've focused on a few things. One of them is water use, right? Outside of the carbon-neutral and net-zero and all the other slogans, data centers use a lot of water, right? And you can loop it around a few times and do other things, but at the same time, when you pull water from an aquifer or a river, it's no longer in the aquifer or the river, right? It's now gone to a different place. So we've really focused on that as one of the key things. And then energy efficiency and land efficiency, right?
If you imagine, it's a little different in North America, we have it a little easier here with more land, but if you imagine Europe and massive million-square-foot data centers, given the density of people and land and the environmental conscience there, this is a real challenge for them to join the AI wave and this technology set, because it's hard to build football-field-sized data centers that meet the new environmental regulations everyone's focused on. So for us, it's a whole package, right? It's a combination of things, right down to land use. What's the most efficient use? Should we put a bunch of servers all over the place, or can we compress that in some way, right? I don't know that Denver has all the answers for that, but at least we're trying, right? We're taking a shot at it. And what you're doing isn't happening in a vacuum, obviously. You know, Dell is doing work to optimize hardware. You're optimizing at the data center layer. But if you look at it holistically, the hope is that, yes, we're using resources, we're using energy, and that's going to drive carbon up. However, if the insights gained by doing the work with AI prove to be what we believe they will be, the efficiencies gained are going to be able to drive carbon down in other areas that you may never personally be aware of, right? So you've got data scientists and teams working on things, and what you know is, wow, my data center is nearly on fire, but since it's not on fire, we're doing a great job. But what you don't know, maybe, is that they are, in fact, doing materials science that could not have been done otherwise, that is transforming the way that other production things are being done. So I like to be an optimist in that regard. Feel free to join me. Well, sure, right, because, yeah, there's the Terminator view of AI, right? Is that a problem? Or is AI a beneficial thing, and there's AI for good, right? And we're on your side, the optimism side.
We want to make sure that it's also good for the planet and its resources, right? So we need to take a stab at that as well. So now you can get a compounding effect, right? Do you guys have a favorite customer example that really articulates the value you're delivering to customers together? Even mentioning it by industry works. Yeah, I think we're very active together in a few places. The first is the large language model groups, right? So everyone's jumping on that, and you see all of the top 20 startups that everyone's well aware of in the large language model space. So we're working very closely together with some of those. But what's interesting is we're seeing a lot of collaboration between Denver and Dell in the enterprise, right? So enterprise, whether it's a digital twin of a factory, right, or it's optimizing operations and those sorts of things. The technology's starting to come there. It's early days for the enterprise, but obviously Dell is, if not the world leader, certainly one of the world leaders in that space in the enterprise, by market share or any other measure. And so we're thrilled to partner with Dell, because we think there's a second wave here that you'll see beyond the research institutes and the large language model, you know, billion-dollar fundings we're seeing. We're going to see regular rank-and-file enterprises show up, and we're starting to see that. We're seeing a lot of activity from cities. Interesting. Right, optimizing transportation, right? Or the sewer system. Or citizen access with large language models. Call into the helpline, and if you can only speak Portuguese and the operator speaks English, you can now talk, because a large language model's in the middle. So we're seeing customer service applications like that that we're working on collaboratively all the time.
I think to add to that, one of the things that I want to also call out is that we've all been in the AI craze, but let's not forget HPC, high performance computing, and modeling and simulation that has been around for so long. It's not going anywhere. Oh Shreya, that's so 2022. Didn't you see the llama walking around outside? But in all seriousness, you know, AI has been brought up on the backbone of HPC, is what I say, from a data center aspect. And you'll still be doing inferencing, and you'll still be doing training on just CPUs, as an example, right? And so this breadth of customers and workloads, it's really not going anywhere. AI has taken off on a trajectory of its own. So how do you service all these different customers with different needs? That's where, you know, we're positioned really well to go and attack the market. Yeah, we often see conversations about AI where, if you look just under the covers, you realize this is really machine learning. It's not strictly AI. And then if you look specifically at the projects that we see CIOs and CTOs involved with, you can apply that sort of 80-20 rule, at least in the enterprise, where 80% of it is sort of garden-variety optimization of processes that are the lifeblood of running a business. So if you're a CIO responsible for keeping the lights on and innovating, a lot of your resources are going towards this keeping-the-lights-on, running-the-business activity. It's never going to make the cover of the Wall Street Journal that you drove efficiency up by 13% in some process, but a lot of what is called AI that turns out to be machine learning is on the optimization side. The headlines are always going to be the really sexy, cool things. That's right. And especially the kinds of things that you see here. When you walk by, we've got the NSA down the block here, NASA, all of these institutions of higher learning.
But where the rubber meets the road in the enterprise, where Dell has so much experience, that's what we're seeing. We're seeing a lot of optimization, and there's nothing wrong with that at all. Yeah. So with all this momentum and excitement, you're moving fast and furious. What are some of the things that are next that we can expect to see from Dell and Denver together? I think one of the big things that we hope to see is more open standards. The cooling that we talked about, Open Rack V3 as an example, giving customers the flexibility and the choice. We're going to continue working on that. We're going to continue partnering on that, and you'll see a lot more from both Denver and Dell together. Well, we will be keeping our eyes on this space. Shreya, Dave, thank you so much for joining Dave and me on the program, sharing your insights and what you're doing together to really enable sustainable AI and enable a lot of optimization for enterprises across the board. We will definitely keep watching this space. Thank you. Thank you. Thanks for having us. Our pleasure. For our guests, I'm Dave Nicholson. I'm Lisa Martin. You're watching theCUBE live from the Mile High City, Denver, Colorado, at SC23. We'll be back after a short break.