Good afternoon hardware fans, and welcome back to beautiful Denver, Colorado. My name is Savannah Peterson, joined by John Furrier here for our four days of coverage of Supercomputing 2023. John, what's the coolest thing you've seen the last three days? I think I love the GPU that Nvidia has, that's pretty hot. I love the Ethernet conversations we're having, faster Ethernet. And I just like the whole cloud vibe coming into the semiconductor boom, this whole smashing together of innovation between semiconductors and cloud computing, and the renaissance of the data center. On-premise and edge are going to be exploited with net-new use cases, and AI is absolutely shining a light on it. To me the coolest thing is that the reality of AI is having an impact. Totally agree. We're kind of at the epicenter of the nerd Venn diagram of everything that's cool, and I'm really excited to welcome our next two guests to the show from HPE. Joseph and Andrew, thank you so much for being here. Joseph, when we were just chatting you said you've been to roughly 14 Supercomputings. Approximately, yes. It's casual, not that anyone's counting. Of course, of course. How is the show going for you? It's fantastic. Honestly, when you asked John what the coolest thing he saw at Supercomputing was, I was surprised I wasn't on the list. That was last night, that was not at Supercomputing. Yeah, that's right. But you know, that was in the cloud. This show is amazing this year. It's a great presence, the feel, the people, the technologies; there's just so much great stuff happening at this show. So much positive energy. Andrew, would you agree from the lab side? Absolutely. I think John hit on a lot of those fantastic things that are all coming together and showcased here on the floor. Every year, it's hard to imagine that much advancement has happened year over year. That's the thing that always strikes me. I'm so glad that you said that.
We were at Supercomputing together last year. The conversation was much more focused on quantum. A little bit, and obviously performance-cost optimization, always a hot topic. Sustainability as well, but now we are all things AI, all chaos, and everyone's looking for the right partners and the right players. Top500, Green500. Joseph, what are those titles? Yeah, we're very proud to say, I'm just going to say that we're very proud of that. Own it. Yeah, in the top five spots, we have number one, number two, and number five. Number one and number five are with our partner AMD; number two is with Intel. And on the Green500, four of the top five are HPE systems. Like, come on. That's those shoulders. Thank you, thank you very much. How is it that I'm not the coolest, John? You guys are cool. We always love... We just hadn't had this conversation yet. He'll re-toggle the list after this. That's right, yes. We can do it in edit. The thing about HPE: I worked there nine years back in the heyday. You know, supercomputing started for me when I graduated college in 1988. And I remember in my early days at HPE, pre-HPE, it was all about the processors, and then how can I get faster I/O, adapter cards. Now we're back to the systems mindset. You mentioned systems. The theme to me is, this is a systems mindset. The developers have to have it. Infrastructure platform engineering is emerging and changing. So how do you stand up AI stuff? Hardware, systems, clouds, edge. This is kind of the new architecture. Kind of the same game, but different stuff. But it's still the same stuff: networking, storage, servers, processors. What does the new picture look like? Yeah, I think that's a great observation. And, you know, with AI really being, like you say, the hot topic, if you will, especially with everything around generative AI this year.
I think the thing that is really resonating with everyone is how it becomes a supercomputing problem when you're doing AI at scale. And so naturally, you know, we think we're pretty good at that, just like Joseph was mentioning. So yeah, a lot of those advancements at the system level are all coming to bear, not only for traditional HPC, but as we further integrate AI and analytics as part of that, we're really seeing that convergence come together. And if I can add, I think a lot of the challenges that we've helped our customers address with supercomputing, some of the same concerns are starting to pop up now as they're looking at artificial intelligence. The good news is a lot of these aren't new challenges. Some of them are, but a lot of them are not. And we know how to manage them at scale. And I think that's what really matters. You know, we've covered you guys a lot over the years; you guys were on our early CUBE events, early days. I remember covering the storage. Storage became the 3PAR acquisition. Then you got the servers continuing to be the servers; you always had great servers. But the thing now is, with AI, it's like, okay, you've got these models now that are going to be the abstraction layer, the new interface. The new interface is going to look a lot more like what people want now, a ChatGPT. OpenAI just shut down their new registrations because there's so much demand on capacity. That's a whole green conversation. But that's the expectation of the user, that interface. So now you've got large language models, foundation models, in between. So the question is, how do you manage the data? Right, so now it's like, okay, data management's upside down. But if you've got a large language model, you want to have breadth, and, to your HPC problem statement here, there's also precision for personalization.
So you have the large broad swath of data, and then getting precision super fast is what we're looking at with AI. That's kind of the HPC-meets-AI problem. It absolutely is that convergence, and in fact, there's an entire progression, if you will, on this journey. Okay, if I'm going to do something like a large language model, I have to produce that model, right? And so there's a certain system, certain data management, a certain development platform that goes along with doing that training. And then you have another phase that may be the personalized tuning. Maybe it's language-specific or geo-specific. And then ultimately putting that model to use, the inference part, has its own system architecture and requirements. That's the great thing about HPE as a company: we're working across the breadth of that entire training, tuning, inference progression. My favorite quote yesterday, Savannah, I want to get their reaction here, is from the CEO of Groq, who's got the amazing new LPU chip. They had a great llama out there yesterday, too. The llama was phenomenal. The llama's also been a star of the show this week. They're going to get a swag award, but it's not exactly swag, it's more of a marketing thing. It was an actual llama. Someone's going to win the llama, man. They're going to win the llama. For the people who are watching: there was an actual llama. So the CEO, who invented TPUs at Google on his 20% time, changed the game on, as we know, AlphaGo, that whole project that he did. What he said to me on theCUBE here was: training's a cost center, inference is value extraction. Correct. On stage at KubeCon last week, Tim Hockin, Google engineer, CUBE alumni, said inference is the new web app. Okay, so digest that. What's your reaction to those statements? Do you agree? If so, what does that look like in the future? What does that turn into?
Yeah, look, I think the short answer is, if you look at how sophisticated the models are becoming, especially those foundation models, it's costing more and more to produce them. You've got to ingest more data because it makes the models better. That's where you get to a supercomputing-class solution in order to do that training. But then it's all about, like you said, the deployment side of things. And if you're spending all that money producing that model, you want to be able to deploy it in as many environments as possible. Edge, cloud, on-prem, you name it. It's universal. That user experience is something that we've been talking about a lot, obviously something that matters a lot to you guys. And you've got a lot of different instances across verticals in which people are building and doing things. I'm curious, since you sit at the forefront and you get to talk to the coolest players, let's just be honest about that: are there any industries that you feel are really pulling ahead, embracing AI and this increased processing better? What are the trends? Give us the trend report. Joseph, I see you've got it. Yeah, I'll start, and Andrew, please chime in. I don't know who's really ahead, but here's what I have seen. I think we're all still kind of in a Mario Kart race; we're all still trying to figure all of this out. Yeah, there'll be a banana popping up out of nowhere. Right. Yeah. I haven't seen that. Bottle that banana. Right. There's a metaphor there somewhere; we'll get there. But I'll say, in the healthcare industry, it's happening at the edge, right? That's really important. There are emergency rooms, there are doctors that are looking at x-rays and all sorts of tests that need answers for patients right now. They don't have a week, they don't have a month to send it all off and get it analyzed. So having that right then and there is critical.
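The economics behind the Groq CEO quote John raised, training as a cost center and inference as value extraction, can be sketched with simple arithmetic. The dollar figures below are invented purely for illustration, not numbers from the guests:

```python
# Hypothetical, illustrative numbers only: amortize a fixed training cost
# over inference requests to see when a deployed model "pays for itself".
def breakeven_requests(training_cost, revenue_per_request, cost_per_request):
    """Number of inference requests needed to recoup the training spend."""
    margin = revenue_per_request - cost_per_request
    if margin <= 0:
        raise ValueError("inference must earn more per request than it costs")
    return training_cost / margin

# e.g. a $10M training run, earning $0.01 and costing $0.002 per request
n = breakeven_requests(10_000_000, 0.01, 0.002)
print(f"{n:,.0f} requests to break even")
```

The shape of the formula is the point: training is a one-time fixed cost, while every inference request carries its own margin, which is why deploying the same model across as many environments as possible, edge, cloud, and on-prem, improves the return.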
Manufacturers who compete with each other have to manufacture products that are high quality, that meet the needs of the customer base, as fast as possible. So what I'm finding generally is that when it comes to industries, they're trying to get to the rightest answer as fast as possible. And that's not just to compete, but to serve patients, and for financial markets and things like that. So I think industry is looking at artificial intelligence less as a robotic thing that's sitting back in a data center or a co-location center somewhere, and more as: how can I serve my end customer faster, solve a problem, get somebody healthier? That's what I'm seeing a whole lot of right now. "The rightest answer," I'm going to use that term from now on. I know, I love that. You have to emphasize "the rightest answer." You had the emphasis definitely on the right syllable there. Andrew, what about you over there on the trend side? Yeah, look, I think the answer is we're just seeing it deployed across every vertical. There's no one area that I would say greatly stands out more than another. Healthcare, finance, and obviously the scientific community, which has a rich history in applying the latest concepts and techniques. I would say even some of our systems on the Top500: look at the amazing work that's been done on those. General Electric, on the Frontier system that we deployed at Oak Ridge, had something here this week talking about the advancements in studying wind turbulence, for example. And so there are just many, many examples. I think that's the thing to take away: the application of AI like that will be the next big story, even next year coming into this event, to see those amazing breakthroughs that have happened. And you and I are on stage next year with you guys, we'll have this discussion. I mean, just imagine a year from now. I'm really excited just to think about it, honestly.
Yeah, just think about it. Think of all the advancement, the technology, but then the applications in all these industries. What we're going to see people make happen out of AI, it's going to be amazing. I think it's a great point. I think the innovation of these net-new workloads is going to come out. You're going to see some low-hanging fruit: I've got some data laying around, let's turn that data exhaust into gold, the classic big data cliché, that'll happen, check. And then, wow, I've got other data opportunities, either data that's going to be generated or data I could focus on. So I think it's going to be what we don't see now. That's right. That's going to come. So I've got to ask you, because you guys are kind of set up, as we talked about last night when we saw each other: GreenLake's been out for a while, okay? That's cloud-based. That's pre-hype, pre the ChatGPT kind of awakening around the world. How has that changed your game? Because one, the tech innovation is coming with AI, but also the education on the market side. How has that impacted the traction, the product, and whatnot? Give us the update. Yeah, I'll say right now, I think there is an intelligence among the user base that says, okay, we understand what cloud can offer. I think three years ago when you and I talked about this, it was: move what you've got here and move it somewhere else. And I think the market, all of us, realize that's not really what cloud is. We realize that you can have a cloud management model that's happening on the edge. We know you can have a cloud management model that's happening in your data center. We know we can have a cloud management model in co-location data centers. And we know we can have cloud management through the hyperscalers in the public cloud. What I have found with customers is they're having a deeper conversation, a longer investigation into what they do. Where is their data? What are they doing with it? How much access do they need to it?
How much is it going to change over the next year? How much do they anticipate that change coming? And then they're figuring out where these things need to reside. So I feel like there is a strong intelligence emerging among our community. And I'll say, just coming to Supercomputing, there are a lot of users here, and this is the conversation I'm hearing from them. This is a very smart audience that's seeing the same things that we're seeing. It's not that you got lucky; GreenLake was very well thought out, by Antonio, the vision. It was a company bet by HPE. Early, yes. Early, and it was a business model shift as well as technology. Okay, now fast forward a few years. You're going to have essentially large-scale change coming on the tech side. And productivity and personalization come up as two hot areas, always on theCUBE. Better personalization, precision from a broad set of data, but then also that other aspect of it. How do you get that focus and precision and personalization? Do you want to start? Yeah, so it goes back a little bit to that model I started earlier, around that tuning piece of it. So the idea, even in a cloud or hybrid model, is that eventually there's going to be some layer of data that maybe you own, and you need to do that fine-level tuning and personalization with your own data. And so that's where the whole data management question comes back into play, as well as the platforms that enable it. Productivity? Your vision on productivity? Skyrocketing. We're already seeing elements of it now. Oh yeah, yeah. I think before that, you're going to see all these markets, all these customers, healthcare, manufacturing, finance, et cetera, see the opportunity that artificial intelligence brings them. And then you're going to start seeing some major innovations in each of these industries, where we're doing things better and different and faster. I think that's the innovation you're going to see first.
Better, faster, stronger. I mean, it feels like it could be a Daft Punk song if we're not careful here. It might already be, I don't know. You know, maybe that's a great idea. We should talk to them about that. That's right, let's talk about it. I think we should definitely talk to them about that. Joseph, while we were chatting, you mentioned that you have a 15-year-old son, Elijah, who is a Rubik's Cube master, a speed solver. I look forward to having him on the show. Since you have a pulse check with the teens, what's the buzz like? I'm giving you a lot of credit. That's quite a stretch, yeah. I don't know that I even have it with my son. Inference, right? Isn't that just an example of inference? That's right, exactly. It's not a hallucination. Yeah, no, it's not a hallucination. It's just a little inference. Wow, we could really get the AI buzzwords going. What are teenagers saying about AI? What are your conversations like with your son and his friends? So I will say, for both my kids, what I've found is they have learned to program in R and Python, and it's not a big deal. Like, I found out almost accidentally. Absolutely, as one does. Because when I was in engineering school and we were doing C++ and Java, we would talk about it: yeah, we just wrote this program, et cetera. I accidentally stumbled into knowing that both my kids know how to program in these things, and what's interesting is that it's not like a tool to go get a job. Like with my daughter in college, she was like, I'm in a statistics class, and so I wrote this... This is so cool, like it's second nature. Yeah, I wrote this R program to kind of help figure this out. And I had to say, hold on, time out. You did what? You wrote an R program? She's like, yeah, no big deal, wrote an R program. So I think it's an extension of what we just talked about.
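As an aside, the kind of quick class-assignment script Joseph describes his daughter writing in R looks, in Python, something like the sketch below. The exam scores are made-up sample data, and the normal-approximation confidence interval is just one plausible guess at what such a homework script computes:

```python
# A Python take on a quick "help me with my statistics class" script;
# the scores below are invented sample data, not from the interview.
import statistics

scores = [72, 85, 90, 66, 78, 95, 88, 70, 81, 84]

mean = statistics.mean(scores)    # sample mean
stdev = statistics.stdev(scores)  # sample standard deviation
# Rough 95% confidence interval for the mean (normal approximation)
margin = 1.96 * stdev / len(scores) ** 0.5
print(f"mean={mean:.1f}, 95% CI = ({mean - margin:.1f}, {mean + margin:.1f})")
```

A few lines of standard library calls replace what would once have been a by-hand calculation, which is the "second nature" point being made.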
Artificial intelligence is a way to actually solve these things, and we are giving creative tools to this next generation to say: this is the tool, go build amazing. And I think we're starting to see that. I love that, go build amazing. Because that creative class coming to tech is going to be off the charts. Just solving problems, jumping over the barriers to get going, versus setting things up. What's the skill development? You get the first three shots out of the gate from AI that you can potentially iterate on, and it gets faster with chips and systems. And I've got a couple of boys as well, of college age, and we have this conversation all the time, because working in the labs environment, we're working on, hey, you know, what's next, even. Yeah. And we've got a great quote, adapted from a colleague, one of our researchers: AI is not going to replace that scientist, or that teacher, or that publisher; but the flip side is that the scientist or teacher that uses AI will replace those that don't. So that's the thing: embrace it, use it. It's a tool, you know; let's not get too carried away. We had a quote on theCUBE a couple of months ago that said, AI scales intellect, if you have data in your head, too. Absolutely. So again, that's the creative plus what's available. And so we're going to get back down to the real time. So in this new model, you've got the interface, the foundation model layer, and then infrastructure just self-forming. So it's going to be a whole new infrastructure system? Yeah. What we need from this market, from the customer set, is just: don't be so worried about the tech. We're working on it. We're doing amazing things. That's your job. Yeah, we're doing some great things. But think big, like: how do we see more patients, and how do we get them healthier faster? That's the problem, right? How do we actually make sure that financial transactions are secure?
Those are the things we need. And I think we're now giving a set of tools to these great minds to go and say: I see these big humanity problems, and here's how we're going to solve them. And a great example of that, I want to plug our group just a little bit on this. We'll allow it; you've been nice to us. Getting back to one of those areas where AI is going to make a difference: we're working on two projects in partnership with the Department of Energy, applying AI to fusion-related research. We know for our generation and our kids' generation, energy efficiency and sustainability are a huge topic. And so this is groundbreaking work that we're involved in. It is. And you can tell just how excited and passionate you both are. Thank you for sharing your insights with us. We can't wait for this to be the most watched segment from Supercomputing 2023. I think it will be. I have a feeling. Shout out to Elijah. I'm looking forward to him fixing my earrings. Yes. And I just want to give you guys... That's all he's going to fixate on when he watches. That's not solved. I know. I'm sorry for those of us who are a little more OCD, or worried about balance in nature. I also want to give you guys both a shout out. I'm not sure what the very loud noise is that's going on right now. Are we in danger? But you have managed... I've been wondering; it sounds like there's a fire alarm. I'm not sure what's going on. There's always a lot of exciting stuff going on on the floor. John seems very calm, so I was kind of feeding off that. But you have managed to keep dropping serious knowledge on the audience while dealing with all that; hats off to you. John, always a pleasure to co-host with you. Naturally, that's when the beeping stops. Of course. Thank you for tuning in to this thrilling episode of theCUBE, here live in Denver, Colorado, at Supercomputing 2023. I'm Savannah Peterson.
You're watching theCUBE, the leading source for emerging tech news.