Hey, guys and girls, welcome back to theCUBE. We're live at Dell Technologies World 2023 from Mandalay Bay in warm Las Vegas. Lisa Martin, Dave Vellante here. We're going to be talking about some cool stuff next, pun intended. David Hardy joins us, PowerEdge product manager responsible for power and cooling solutions at Dell Technologies. And Brandon Peterson is here as well, SVP of Product at CoolIT Systems. Guys, welcome to the program. Thank you. Hi, thank you.

Brandon, talk to us, give the audience an understanding of CoolIT Systems. I know you date back to 2001 and the catalyst to start the business had to do with the gaming industry, but give us the context there and how the company has evolved since then. Absolutely, so CoolIT's been around for over 20 years, as you said. We were founded in a garage in Calgary, Canada. The company's grown quite a bit since then. We got our start in desktop liquid cooling, and when the data center industry started to heat up, we started to transition our technology over for more commercial applications in data center servers. And so we've been focused on that. We've been partnered with Dell to develop and validate liquid-cooled servers direct from their factory.

So you started in desktop? We did, yeah. So we have desktop liquid coolers that we still design, manufacture, and ship today as well. You've got to explain that. So the desktop was what, running like supercomputers on the desk? For gamers, and so they'll have a small radiator and a small pump inside of their desktop computers, and we adapted our technology for use in those as well. The gamers always want to overclock, they want to run it as hot as possible, break all the rules. Yeah, my kid's a gamer. Every time I go into his room, it's like so hot. I get it. They'll talk to me after, we'll get him one of those devices. There you go. Cool.

Doug, talk a little bit, sorry, David, from your perspective, a little bit about the partnership and what you guys are doing together. We've talked so much at this show about generative AI, power issues, challenges, efficiencies, opportunities, but talk to us about why Dell is partnering with CoolIT Systems and what's the value in it for you? Sure, and there's a very clear reason why we decided to partner with CoolIT. Back in the mid-2010s, we saw where this was going: processing power was really going to continue to increase and get hotter and hotter. The need to do enhanced cooling made us investigate a lot of different technologies. Direct liquid was one that we figured out was going to have the highest performance and best longevity. CoolIT, as the leader in this space, was a natural for Dell to partner with: leader partnering with leader, being kind of best of breed for the combined solution.

So what, I mean, I've been around a while, so I remember looking at IBM mainframes before they went to CMOS. It was ECL, emitter-coupled logic, and they were hot, and it was all liquid-cooled. You couldn't cool those things with a fan. You couldn't do it with ambient air, hot aisle, cold aisle, all that stuff. So what's the state of liquid cooling today? What's the substance you're using? What are the, is it chemicals? Is it, you know, how should we think about that? Do you want to take that one? So the state of liquid cooling today, there's a number of different technologies on the market.
We separate those into direct liquid cooling, which will have what we call single phase, which is one of the technologies CoolIT offers, and two phase, which just basically means the fluid's boiling off. Those same two options exist in another version of liquid cooling, which is immersion cooling. So there's a dielectric fluid where the servers are dunked right into those tanks, and it may stay as a liquid or it may boil off, so single phase or two phase. And then there's also indirect liquid cooling, so rear door heat exchangers or in-row coolers, where we're actually bringing water closer to the racks, just not into the IT equipment like you are with cold plates. So we offer both direct liquid cooling, single phase, and rear door heat exchangers that work integrated with the rest of the system.

And what are the advantages of those relative to some of those alternatives? Why did you choose those? Yeah, so there's two main thermal challenges that we see in servers and in data centers. One of those is the local cooling problem at the chip level. So direct liquid cooling gets cooling right to the chip. It targets the hottest spot on the processors to take that heat away. The other challenge is more at the rack level or the data center level. And some of those technologies will deal with one or the other. Direct liquid cooling can deal with both. Rear door heat exchangers help at the rack level or the data center level, but don't get cooling quite to that chip. So DLC is more versatile, if I understand that, is that correct? Very versatile, yeah.

And what are the economics like in terms of the different approaches? Yeah, so when we look at different deployments across the world, we would look at an ROI somewhere between six months to two years, depending on power costs. If you can offset some of your infrastructure costs, some of your cooling equipment at the building level, you may not have to invest as much up front in chillers and pumps and fans and HVAC systems. And then you're obviously having an energy savings. You're not running the server fans as hard, and you don't have to run your chillers. You don't have to run your HVAC systems. So DLC is more expensive, but you get the payback faster. Is that correct, or not necessarily? It is a cost adder that has a strong ROI. Right, okay.

And so you presumably know a lot about this, right? And you consult with customers to help them make the business case, how they're going to offset some of those costs, whether they're replacing chillers, maybe running them at lower capacity, or maybe just getting better efficiency out of them. Is that right, David? Yeah, by and large, but all customers are unique. So they've always got a particular use case that we help them solve for. It doesn't do a lot of good to throw general TCO numbers at customers; they have use cases where they bring us in not just to sell them servers, but to help with their entire environment. So when customers are focusing on performance, which direct liquid cooling is uniquely situated to support for the top-end CPUs and GPUs, it may not necessarily be an efficiency game. It may not be a when-do-I-get-my-payback, my-cost-break-even question. It's, I want to go faster and not blow up. Others have more of a nuanced position.
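As a rough illustration of the payback math described in the conversation above, here is a minimal sketch. The function name, dollar figures, and energy savings are hypothetical placeholders, not numbers from Dell or CoolIT; real deployments depend on local power costs, climate, and how much chiller and HVAC capacity can actually be offset.

# Minimal sketch of a direct-liquid-cooling payback calculation.
# All inputs are illustrative assumptions only.

def simple_payback_months(dlc_premium_usd: float,
                          kwh_saved_per_month: float,
                          usd_per_kwh: float,
                          avoided_infrastructure_usd: float = 0.0) -> float:
    """Months until the DLC cost adder is recovered from energy savings."""
    net_upfront = dlc_premium_usd - avoided_infrastructure_usd
    monthly_savings = kwh_saved_per_month * usd_per_kwh
    if monthly_savings <= 0:
        raise ValueError("No energy savings means no payback")
    return max(net_upfront, 0.0) / monthly_savings

# Example: a $40k cost adder per rack, $10k of avoided chiller capacity,
# and 25,000 kWh/month saved on fans and chillers at $0.10/kWh.
print(f"{simple_payback_months(40_000, 25_000, 0.10, 10_000):.1f} months")  # 12.0 months

With these made-up inputs the payback lands at about a year, which is consistent with the six-month-to-two-year range quoted in the interview; changing the power price or the amount of avoided infrastructure moves it inside that band.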
They're like, we're willing to take on this added investment with a little bit of complexity because it gives us higher performance, but we're paying attention to that payback and we're going to monitor what our cost savings are, because we not only are delivering performance for different functional groups in our company, but they're answering to the C-suites who are now asking about sustainability metrics. How is this reducing our carbon footprint? Things of that nature.

So even with some of the complexities within DLC, the benefits, it sounds like, far outweigh the complexity that's still there. I mean, that's the beauty of liquid cooling, to be honest. I'll start by saying we help our customers make the transition when it makes sense to them. That's a big part of being a trusted advisor, not just selling something. But when they're ready to accept it, it not only is higher performance, but it's so much more efficient at capturing heat that it's one of those, you know, it's a win-win, basically. It enables higher performance, enables greater efficiency, reduces the power burden. It's just a great solution.

Who are your customers? I mean, I don't necessarily mean if you can name a particular person. I'll answer from the Dell perspective and Brandon can answer from the broader perspective, but for us it's been high performance users, be it in financial companies, electronic design companies, national labs, things of that nature, but they've been really the heat seekers, trying to maximize compute per rack. Right. And yeah, we have a similar perspective, so we've seen government research supercomputers be more of a traditional use case for direct liquid cooling, but the transition that we've seen in partnership with Dell the last few years is a transition to more enterprise applications like telco, financial, medical applications and so on.

Are the co-los involved in the sort of decision process, or are they more sort of, you can put whatever you want in our systems? That's a great question. And again, I'll answer from the Dell perspective. Obviously a lot of our customers have gear in co-los. Co-location providers have a lot of smart people on their staff behind the scenes, so they know what's going on. They see this as a need that maybe isn't quite ready for prime time, and so they're not proactively rolling it out, but they've studied this and they know exactly what it's going to take to stand it up on their floor, because they know they're going to have to in the next couple of years. Yeah. And by way of doing lots of deployments, we've been in a lot of co-location environments, and there are the same questions up front as in any data center: just help us through the process, how do we do this right? And we are able to walk them through that, and with Dell have deployed in multiple co-locations.

I love walking through data centers these days. It's just amazing, isn't it? I mean, you guys obviously do that all the time. And so, I think I'm inferring, you're going to see more and more liquid cooling in these co-los over time. Well, one of the things about using liquid cooling in a co-lo that you'd especially like is that because it requires less air movement and the fans are spinning more slowly, it's so much quieter than a traditional data center, especially running high performance gear where those fans can be really screaming loudly.

David, can you comment on, you talked about a number of industries that are being impacted here positively.
We talked yesterday with JJ Davis. We talked a lot about ESG and sustainability, and one of the interesting stats she mentioned was that 95% of RFPs coming in have a sustainability or ESG requirement, so customers need to work with companies like Dell and CoolIT Systems who are going to help them achieve their sustainability goals. Talk about DLC as a facilitator of that for your customers.

Sure, sure. So one of the attributes of direct liquid cooling as we've deployed it is that the heat exchange is a coolant capturing the heat out of the IT and exchanging that heat to another liquid loop. What you are able to do with the solutions as designed by CoolIT is use a very warm, a high temperature, for heat capture and transfer. What that allows you to do is eliminate running chillers, which consume a lot of power overhead. In some cases it allows you to eliminate even evaporative coolers, which in certain environments, like say Las Vegas, can be very energy efficient but consume a lot of water. In a place like Las Vegas where water's scarce, you want to try to do something about that, and really the most efficient is to do basically a dry heat transfer, which you can only do if your solution relies on warm water heat exchange. And Brandon, I don't know if you want to add anything to that. No, that's perfect, and we've seen that for many years out of Europe, the RFPs with requirements for efficiency, and we've seen customers successfully respond to those because of direct liquid cooling and being able to reduce the energy used to cool those data centers.

What's the broad trend? I don't know if you can answer this, but what's the broad trend across the whole landscape in PUE? Is it trending toward one, or is it going in the other direction? Well, we know things that are definitely trending. Power per component is continuing to go up. Power per rack and per unit of floor space is continuing to go up. So because liquid is a more effective means of heat capture, that heat is going to be captured to liquid. The question is exactly how that is going to be deployed. We're using it a certain way right now. I'm sure that's going to evolve, but exactly what that ends up looking like in five years, it's going to be different, and we can't exactly predict. Jointly, we're on standards bodies for the industry to help push things in certain directions that we feel strongly about, but what the exact implementations are going to be, yeah, you can't really say yet.

But if I were to, I don't know, somebody must do this, there must be some data center research company that does it, but if I were to take broadly the average PUE across all sites, I'm inferring that it's trending in the wrong direction, and that's a good thing for you guys, because it's got to be at least stabilized or reversed, and the only way you're going to do that is with liquid cooling. Is that a fair assumption? Well, I would say a fair assumption is definitely that rack densities are increasing, that to David's point the chip heat is going up, and on top of that the temperature you have to hold those chips at is actually coming down, so they're hotter and you actually have to keep them at a cooler temperature, and so some of those traditional technologies are just not going to be able to keep up with that.
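For readers who want to pin down the metrics the conversation keeps returning to, here is a minimal sketch of how PUE and the water usage effectiveness metric mentioned just below are typically computed. The example numbers are made up for illustration and are not figures from any Dell or CoolIT deployment.

# PUE (Power Usage Effectiveness): total facility energy / IT equipment energy;
#   an ideal facility approaches 1.0.
# WUE (Water Usage Effectiveness): site water use / IT equipment energy (liters/kWh).
# All example values below are illustrative assumptions only.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

def wue(water_liters: float, it_equipment_kwh: float) -> float:
    return water_liters / it_equipment_kwh

# Hypothetical comparison: a chilled, air-cooled hall vs. a warm-water DLC hall.
print(f"air-cooled  PUE = {pue(1_700_000, 1_000_000):.2f}")          # 1.70
print(f"warm-water  PUE = {pue(1_150_000, 1_000_000):.2f}")          # 1.15
print(f"evaporative WUE = {wue(1_800_000, 1_000_000):.2f} L/kWh")    # 1.80
print(f"dry cooler  WUE = {wue(50_000, 1_000_000):.2f} L/kWh")       # 0.05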
I think that watching a metric like PUE helps, and I think if you're measuring something you're going to look to improve it over time, so we hope to help bring that PUE down, and I think as more data centers choose efficient cooling technologies we'll hopefully see that number come down. I mean, are people actively measuring that, or is this just sort of a fun metric to throw around? Oh, they're measuring. Yeah, okay, so they're still serious about it. I mean, the last time I was involved in this space, that was the holy grail, right? The search for a PUE of one. Yeah, so PUE has some limitations as a lone metric to measure data center efficiency, so people are using it in combination with things like water usage effectiveness, or utilization effectiveness. There are other metrics that are trying to combine the different pieces into a more holistic efficiency metric. It's a little bit like talking about sustainability in a broad sense, where it's not just the materials you make your system of or your recycling plans, it's what's the source of the energy that's going to power that system over the life of the system. It's the same thing: we're having to look, or we're helping customers look, more holistically from the server to the rack to the entire data center to try to maximize the efficiency there.

What happens to the liquid? Is it sort of sustainable, does it maybe evaporate, do you have to replenish it? Well, it's closed loop. Yeah, so it's closed loop, period, the end, right? So we circulate a water mixture with propylene glycol to the IT rack. That's a closed loop, and that fluid, maintained properly, will last over 10 years, and it's pretty environmentally friendly relative to some of the other fluids. I think the big thing in the industry this year that everyone was watching was the PFAS fluids and how those would eventually be banned, and we've seen a lot of the two phase vendors have to move away from certain fluids that were not environmentally friendly. With us it's just a mixture of water, and it can be disposed of per local standards very easily.

What's the maintenance protocol? What's the best practice? Do you have to inject certain other fluids to keep it clean? Yeah, so we check in on the systems every six months. We'll take a sample of the fluid and we'll analyze that and make sure everything's within acceptable limits. If it's not, we send out a chemical adjustment package that gets added into the system, and everything's running happily and normally after that. Just put a little Tang in and stir it up. You're good. Exactly.

David, first to you and then Brandon, as in our final minute here, talk to us about the maturation of the partnership between Dell and CoolIT Systems, and some of the things, if you could kind of peek into a little roadmap, that the audience might get to have their hands on any time soon. Well, that's a sneaky question. You're trying to make us divulge secrets here. I have my ways. But I'm actually going to say something that may sound a little boring, but our customers have historically been really high performance, kind of the exciting big projects. We're working on making this more consumable for the mainstream enterprise customer. We want to make it pretty boring. We want them to be able to order it, know that it's going to show up like a traditional Dell server with ProSupport wrapped around it, have it feel like a mainstream transaction, and have the management of that server in their environment feel like all of their other air-cooled gear.
That's our goal, very, very boring. And I'll be a little bit boring as well. I'll be backwards looking. So we've been working with Dell for a very long time, and we launched product all the way back at 14G. We've been expanding on that effort ever since. So we work very closely with Dell. We're developing and validating liquid-cooled servers that their customers can trust and that can be deployed with ProDeploy and ProSupport. Trust it, make it more consumable.

Guys, thank you so much for joining Dave and me on the program and really giving us the lowdown on DLC and the opportunities and benefits for customers. We really appreciate your time. Thank you. Thank you very much. For our guests and for Dave Vellante, I'm Lisa Martin. Up next, two CUBE alums join us, Doug Schmidt and Satish Iyer. They're going to talk about how customer priorities for services are changing and the five ways in which Dell is responding. You won't want to miss it.