Good morning, high performance computing fam, and welcome back to Denver, Colorado. We're here at Supercomputing 2023. My name is Savannah Peterson, joined by my fabulous co-host, David Nicholson. How are you doing, man? Doing well. You came in all juiced up this morning. I've been very excited coming off of a couple of my Wharton classes and meeting with someone from Cambridge. What could be better? You're feeling intellectual and academic. I'm going to do my best. That must be a first. I'll do my best. I'll try to hang. Well, I'm really excited for our next segment. We have a CUBE veteran, Andrea, from Dell. Thank you for being here. Absolutely. And we've also got Paul from Cambridge representing the intellectual academic side of things. It might be. Yeah, I don't do the intellectual bit after a long week. I mean, that's fair. It is day four. It is day four. We're at elevation. It is arid. My vocal cords are certainly showing it, but I'm really excited to talk to you two. There's been a partnership between Dell and Cambridge for 17 years, which is awesome. We've got projects to talk about. Paul, I want to turn it to you first. How's the show going for you? I imagine you've been to many Supercomputings. I've been to 22 Supercomputings. You might be our record holder on the show this week. And it was much easier 22 years ago, believe me. Really? Yeah. Interesting. I heard this is the largest ever. Yeah, it feels like it. It feels like it. Over 10,000 attendees is what we were seeing. I think we've got the COVID thing out of the way, and everyone wants to come out and chat and talk. It's good. Yeah. They were whittling microprocessors out of wood back then, right? No. 22 years ago. I am that old. Wooden supercomputers. Floppy disks. The Beowulf clusters, I was right there at the start. That's when I got involved, at the very beginning of the Beowulf clusters. That's 20 years ago.
I was walking down the kind of history panels out front, and that made me feel old. Are you on one of those history panels out there? No. Well, we'll have to work on that. Andrea, we had the pleasure of talking to you last year. Big announcements for Dell coming out. I know that Cambridge implemented some of what you announced last year. Tell us about what's going on. Yeah, so last year we used SC22 for our huge launch of our XE portfolio, where, as I said at the time, we focused on a couple key things. One being versatility and vendor diversity within our products. The other being able to meet customers where they're at in regards to thermals, with both liquid and air cooling solutions. Now we get to see it in action, which is so incredibly cool, because University of Cambridge took our first XE9640 right off the line. It's our liquid-cooled variant. And it's also our Intel Ponte Vecchio variant. So it hits both parameters. I love that you just said that, Andrea. Application is what we're starting to see here at the show. It feels like we've tipped over the hypothetical, and there are so many different use cases. I can imagine, Paul, being on the research side of this hot topic, you're on the front lines. What are you seeing? What are the students and staff doing? So this machine is designed to be a converged AI and simulation platform. So we want to run simulations. We want to run AI. And we actually want to run simulations that are informed by AI. And so we wanted a GPU system, and we went with Intel mainly because of the oneAPI programming environment that allows us to develop codes across platforms. So this is actually the fastest AI system in the UK. We've got 20 petaflops. The system's up and running now. We should be doing early science in December. And one feature of this system that's actually quite interesting is that from cardboard boxes in my loading bay to LINPACK took us just three and a half weeks. To stand up a supercomputer. That is super fast.
Boxes to HPL in three and a half weeks was kind of stellar. And it shows what can be done if you focus. Of course, we've been working with Dell for a long time to get to that stage in co-designing the box. Yeah. We've been working with Intel on the software environment. And it's rocking. And there are a lot of applications already that we can put on that system that are going to make some real impact. So I had the pleasure of joining the Dell HPC Summit, and I saw your presentation. I'd like to hear about, essentially, your kit. If I remember correctly, it was something around 100 million pounds of investment. And particularly I want to know how you maintain security and keep out the mongrels from the filthy industrial town of Oxford. Oh yeah. Okay. In particular. But in all seriousness, tell us about the size and scope of what you're doing. Yeah, so the size: this is 258 XE9640 servers. Each one of those servers has four GPU cards. So there's 1,024 cards in total. Or, I can't do my maths, 1,032 cards. Each server also has four HDR200 links, because we really wanted this to be as scalable as it could be. That's why we actually chose this 2U Dell server, because we can get four full-width PCIe cards in there for the networking. So there's one InfiniBand link per card. There's one NVMe drive per card. So you've got a large local NVMe capacity in the servers, and then you can get out to a large solid-state storage pool. So in total, it's a really powerful machine. Yeah, it sounds like a very powerful machine. Tell us a little more about Project Dawn. Yeah, so Dawn was funded by a co-design partnership between Dell, Intel, and the UK government, and it's really designed to kickstart the UK's AI capability. The UK has been underfunded in this area for some time, but the current administration recognized this and we're ramping up our federal funding dramatically, and this is the first of a series of AI-focused machines in the UK.
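The card count Paul corrects on air is straightforward arithmetic; here is a quick sketch of the cluster totals as described (the server count and per-server counts come from the conversation, the rest is just multiplication):

```python
# Dawn totals as described: 258 Dell XE9640 servers, each with four GPU
# cards, one HDR200 InfiniBand link per card, and one NVMe drive per card.
servers = 258
cards_per_server = 4

total_cards = servers * cards_per_server
total_ib_links = servers * cards_per_server    # one link per card
total_nvme_drives = servers * cards_per_server  # one drive per card

print(total_cards)  # 1032, hence the on-air correction from 1,024
```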
And we are targeting it at three key science use cases to begin with. I think the first one is the fusion community in the UK. So the UK has got quite a large activity to try to develop a fusion reactor putting energy on the UK grid by the 2040s. And fusion is a huge problem. The simulation domain is multi-physics and multi-timescale. It's a really big simulation problem. So we work with the UK fusion community quite heavily. The second domain is pushing AI into clinical medicine, actually into the clinic for patient processes. Of course, most clinical research is in the lab, but pushing it into the clinic is a different matter, and we've been doing that for some time. So there are a lot of activities in pushing HPC and AI for various domains in clinical medicine. And the third domain is climate science. So we support a lot of climate activities globally. We have a lot of climate science research staff and software development staff. And we have this machine really to push the boundaries on climate science. You know, it's a fascinating loop when you think about using AI to advance energy technology, so that enough energy will be available to power the machines. It's a virtuous circle, man. But yeah, clean energy is a huge one. And using AI to shortcut some of the computational cost of simulations is going to really revolutionize what we do. Because a lot of simulations are solving large partial differential equations, right? At the heart of many simulation codes is a PDE; across fluids, electronic structure, materials, they're all PDEs. And we solve them by brute force, and that's really expensive. So we need exascale machines. But if you can use AI to approximate that PDE, you can get an effective exaflop on a petaflop machine, right? So that's where I think we're going to see huge gains in the coming years in science. It is really exciting.
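The surrogate idea Paul sketches, training a cheap model on the outputs of an expensive solver and then using the model in its place, can be illustrated with a toy example. Everything here is hypothetical: a one-dimensional heat-equation step stands in for a real multi-physics code, and a linear least-squares fit stands in for a neural network:

```python
import numpy as np

# "Expensive" solver: one explicit finite-difference step of the 1D heat
# equation u_t = alpha * u_xx, standing in for a real simulation code.
def heat_step(u, alpha=0.1):
    return u + alpha * (np.roll(u, 1) - 2 * u + np.roll(u, -1))

rng = np.random.default_rng(0)
n = 32

# Generate training data by running the real solver on random states.
X = rng.normal(size=(2000, n))
Y = np.array([heat_step(u) for u in X])

# Surrogate: a linear least-squares fit (this toy update happens to be
# linear; for nonlinear physics this is where a neural network would go).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# The surrogate now predicts the solver's output without running it.
u = rng.normal(size=n)
error = np.abs(u @ W - heat_step(u)).max()
```

Once trained, each surrogate evaluation is a cheap matrix-vector product rather than a full solver step, which is the mechanism behind the "effective exaflop on a petaflop machine" claim.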
I mean, supercomputing as an industry, high performance computing, started with a lot of simulations in weather and climate. And it's amazing to see us scale and evolve, orders of magnitude essentially, in what's possible. Andrea, I love that you just touched on this, Paul: the collaboration between government, academia, and enterprise partners like Dell. How important are these partnerships for you? And I'm curious, since you had the big announcement last year and AI is so hot, are there more and more countries and government organizations and people reaching out to try and collaborate like this? Absolutely. I would say one of the key focuses we have is actual early engagement with really, really strategic partners. When we're looking at our next-generation product designs, their feedback is so imperative for us. You know, when we can have the early feedback in regards to, hey, this is where our head's at, this is what we're thinking from a design perspective, and we can actually get live feedback that can influence the design, we make sure it's exactly designed for what our customer needs. That's one of the top tenets of how we design and articulate what we need to go do. Well, and I love that. I mean, you're designing with your community in mind and making sure that you're optimizing some of the hottest hardware on the planet, quite literally, to do so. Are there any trends in those conversations? I realize you're very early in the process sometimes in these strategic dialogues and probably can't disclose, but are there any trends that you're noticing? We've had a real sense of FOMO here on the show as everyone's trying to catch up. Are you experiencing that? I think one of the biggest trends, and we touched upon it at last SC, was that everyone was talking about, oh my gosh, the TDPs, and thermals. Now, a year later, it's exploding way more, especially in the accelerator space and these types of products and designs.
I think one of the most critical things when we're talking to customers now is that the conversation changes: we need to be educating them and bringing them along the journey of five years out, because for a lot of these products, from a thermal perspective, liquid cooling is a when now, not an if. And we actually have to plan the CAPEX investment now. And there are two key parameters. One is going to be how you plumb a facility for water. The second is making sure you have the right rack power, and even the right weight that can be supported by the floors in some of these locations. There are different time scales to that infrastructure build-out. So for your IT infrastructure, you might be looking at a one-year time scale from inception to completion. But for the data center around it, you've got to do big work in your data center; you're talking two years, three years. So you need to be thinking two or three years ahead for your data center planning. Otherwise you're not going to be able to buy that computer. So there are different time scales here. I mean, the customers need to be talking with Dell so they can find out, what data center am I building for the equipment you're going to be selling me in three years' time? Yeah, and that loop is quite a difficult one. Customers are not used to that, because we've been in a pretty static air-cooled environment for a long time. And now there's a big transition going on and you've got to prepare, and looking three or four years ahead in the IT space is quite difficult. If you look at the rate of change we're going through at the moment, it's difficult to predict. I'm not sure I'd want your job, actually. Yeah, I guess.
And the other key thing, I think, is, as you mentioned, geographic: especially with energy prices and consumption in Europe predominantly, and in a lot of other areas, sustainability is so critically important as we're looking at the future, and in different ways to also be able to protect against the cost of the power and how to optimize from that standpoint as well. Because every megawatt's costing you two and a half million bucks a year, right? So that's a lot of money. So small improvements matter: a 10% improvement in energy efficiency in your cooling saves you a quarter of a million bucks per megawatt. You're looking at two people from California, so we're extremely familiar with, you know, residential rates of 60 cents a marginal kilowatt hour. This is home. You know, running a megawatt in your basement. Right, no. No, I generate megawatt-hours on my roof, over time. I think that's, we had atNorth and Dell actually on the show earlier this week talking about taking liquid cooling, taking that water, and actually heating homes in Copenhagen. Yeah, we're planning something similar. Which is so cool. It was one of, I feel like there's a moment, every show, thanks to you brilliant, wonderful humans for sharing your intellect, where it all clicks for me with some new layer of this. And that was where I got how this becomes sustainable and renewable and actually makes so much more sense. And so, yeah, I mean, mind-blowing, and I love talking about it. It's your aha moment. 100%. Well, and I, you know, I mean, we're all technology advocates. It's why we do what we do, and we're passionate about this. What I do feel, and I mean, even with travel, I feel a little guilt about my carbon footprint sometimes. And I feel guilt about the types of activities I promote, because I'm not sure that that's the most climate-friendly activity, even if we have to run the climate model.
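Paul's back-of-the-envelope figures hang together; here is the arithmetic spelled out (all numbers are the approximate ones from the conversation):

```python
# Rough energy-cost arithmetic from the conversation.
hours_per_year = 24 * 365                  # 8,760 hours
cost_per_mw_year = 2_500_000               # "$2.5M per megawatt per year"

# Implied blended electricity price for 1 MW running flat out.
implied_price_per_kwh = cost_per_mw_year / (1_000 * hours_per_year)
print(round(implied_price_per_kwh, 2))     # ~0.29 $/kWh

# A 10% efficiency gain on that megawatt is the quarter-million saving.
savings_per_mw_year = 0.10 * cost_per_mw_year
print(int(savings_per_mw_year))            # 250000
```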
But now, I feel like we're finally at a place where we can see, as much as none of us have a crystal ball, I feel like we can finally see that that future is possible, and that we will build the tools to make that future possible. And the flip side of the coin is that the technology helps with global weather modeling. Exactly. To be able to predict what the future will be. So it hits both paradigms, ironically. Which is awesome. Exactly. I mean, you just mentioned it. It's this full-circle, holistic situation, and it's so great to see partnerships like yours. Paul, I've got to ask, since you've been to 22 Supercomputings, which is impressive: do you feel like hardware is in the spotlight more broadly now? Do you feel like it's having a moment? Well, obviously the AI boom is really focusing people's minds on dense accelerated systems. And the liquid cooling thing has really lifted off. When did you ever see plumbing at supercomputing shows? You never used to see it. I know. Yeah, pipes and tubes. So you know, for the hardware junkies, it's an interesting time, right? Very. But again, I would say that we should really put more focus on the software. Because with hardware, you might get 1x or 2x performance gains. With software, you can get 10x or 100x. So the big gains are coming from the software. The hardware is a necessity, and a lot of us are hardware junkies and we love the cables and all that kind of hardware stuff, because it's what we like. But the real gains are in the software. Well, on a happy note on that subject, I mentioned off air that I do guest lecture work in Cambridge's CTO program. When we talk about the AI stack, well, exactly, it's all software. We casually mention TPU, GPU, CPU. Let's talk about the software stack. And so it is a focus. But we're always harping on the idea of just how critical the hardware is. Yeah, yeah. It's a critical enabler, but the big gains come on top of it.
So we bought a whole lot of A100s three years ago. And those things are just getting faster and faster and faster by themselves. I think the performance on that has trebled since I bought that same bit of hardware. So without you even noticing, that same piece of hardware is just getting faster, faster, faster in the AI space, because they can take advantage of lower and lower precision, right? And again, the HPC space has got to cotton on to that. The HPC community, if it's going to thrive, has to learn how to adapt and use AI technologies. So we've got to get into lower precision. We've got to get into using that AI hardware, and that's how the simulation community is going to benefit from this focus on AI. So we mentioned how so much has changed since we chatted a year ago. How much will the technology have advanced when we have your wonderful self on the show next year at Supercomputing? What are we going to be talking about then? What do you say? I don't think I can say. Well, I mean, we're always here for the scoop, Andrea. But I think the next ecosystem is how we are going to go and address liquid cooling in a way that makes it easy for our customers to adopt. I think that's the ease of implementation, and our designs are really, really focused on that, because everyone needs the easy button. And this is such a hot topic. I'll tell you what, I was in some Dell meetings this week, and for the first time ever, they surprised me. And they've never surprised me in 17 years, and it's really exciting what they've got planned. I can't say any more than that; I can only tease you. But I was surprised and excited by the stuff that they had. Talk about a 12-month cliffhanger, Paul. Yeah, just wait for 12 months and then you'll see. It's looking really, really good. Okay, so no one, you know, it's just us here, right? We're just in our, it's just us friends. Just us, just us. Very well illuminated.
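Paul's lower-precision point is easy to demonstrate in miniature. A minimal sketch (not the A100's actual tensor-core path, just the general trade-off): the same matrix in float16 takes a quarter of the memory of float64, which is part of why AI-oriented hardware moves and multiplies low-precision numbers so much faster, at the cost of rounding error that simulation codes must be adapted to tolerate:

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.normal(size=(256, 256))

# Same values, different precisions: 2 bytes per element versus 8.
a16 = a.astype(np.float16)
a64 = a.astype(np.float64)
memory_ratio = a64.nbytes // a16.nbytes
print(memory_ratio)  # 4

# The trade-off is accuracy: compare a matrix product in both precisions.
err = np.abs(a16 @ a16 - a64 @ a64).max()
print(err)  # small but visible rounding error in the float16 result
```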
So blink twice if there's any truth to the rumor that Dell is planning to acquire Starbucks and co-locate data centers with Starbucks, using the latent heat from processing to brew coffee. Is there any truth to that? How did you know? I don't know. Is there any truth? Okay. There's a rear-door heat exchanger that goes directly into the machine. Right. Well, so Savannah thinks it's cool that we're all using this heat. I feel like it makes us look like apes. And we get free coffee out of the deal because, hey, it's a strategic partnership. But eventually, shouldn't we be efficient enough that it's not very warm? How did he know? Wow, okay. Well, we're going to have to edit that out. You've got a leak. Yes. No one needs to lose a job over this interview. Taking us in our final direction for our chat: I know you mentioned that Blake, your stepson, is going to be walking the show floor with you this afternoon. What are the younger folks most excited about in supercomputing? I think they just love to come and geek out on the technology, right? So he goes to DU, for computer science. And he's all up to speed all the time. My boy does the same. He's in computer science at Manchester. Imagine that. Yeah. How did he get into that? I don't know. Yeah. AI is the thing that my boys are really interested in. Today, if you're young and you're in that space, it's going to be AI. It definitely seems like a gateway conversation to the beautiful high-performance computing hardware behind us. But even with what they learn in their universities, I think coming to an event like this and seeing the future of technologies, and then, you know, I'm a very visual person, so when you can actually see it, touch it, feel it, it resonates in different ways. And then it also is enormous in terms of brainstorming and idea-generating. And then, you know, that younger generation is going to fly. Here's, how about this for a brainstorm: an internship at Cambridge. Hey, Paul. Exactly. That's good.
I have my boy in my group this time. Yeah, yeah. How about we swap? He's like, Daddy, I learned more in eight weeks in your internship than in the whole year at Manchester. And I got paid. So why am I at university? Oh, that's a good thing to know. Exactly. That's the conversation with my kids. On that note, congratulations on your smart offspring. And thank you both for being here on this fantastic episode, Andrea and Paul; it's been so insightful. David, thank you for being here and always making some creative analogies during our little adventure in getting the secret scoop. And thank all of you for tuning in at home. We are live from Denver, Colorado at Supercomputing 2023. My name's Savannah Peterson, and you're watching theCUBE, the leading source for emerging tech news.