From around the globe, it's theCUBE, with digital coverage of Exascale Day, made possible by Hewlett Packard Enterprise.

Welcome, everyone, to theCUBE's celebration of Exascale Day. Shaheen Khan is here. He's the founding partner and analyst at OrionX and, among other things, he's the co-host of Radio Free HPC. Shaheen, welcome, thanks for coming on.

Thanks for having me, Dave. Great to be here, how are you?

Doing well, thanks. It's crazy doing these COVID remote interviews. I wish we were face-to-face at a supercomputer show, but hey, this is working, we can still have great conversations. And I love talking to analysts like you, because you bring an independent perspective and you have a very wide observation space. So like many analysts, you probably have a mental model or a market model that you look at. Maybe talk about your work, how you look at the market, and we can get into some of the megatrends that you see.

Very well. Let me just quickly set the scene. We fundamentally track the megatrends of the information age, and of course, because we are in the information age, digital transformation falls out of that. The megatrends that drive it, in our mind, are IoT, because that's the fountain of data; 5G, because that's how it's going to get communicated; AI and HPC, because that's how we're going to make sense of it; blockchain and cryptocurrencies, because that's how it's going to get transacted and how value is going to get transferred from place to place; and finally quantum computing, because that exemplifies how things are going to get accelerated.

So let me ask you: I spent a lot of time at IDC, and I had the pleasure of having the high performance computing group report in to me. I wasn't an HPC analyst, but over time you listen to those guys and you learn. As I recall, HPC was everywhere, and it sounds like we're still seeing that trend, whether it was the internet itself, certainly big data coming into play, defense obviously. But is your background more HPC, or these other technologies you're talking about? It sounds like you're an expert watcher of the high performance computing market who then sees it permeating into all these trends. Is that a fair statement?

That's a fair statement. I did grow up in HPC. My first job out of school was working for an IBM Fellow doing parallel processing in the old days, and it went from there. I worked for Cray Research, I worked for Floating Point Systems, so I grew up in HPC, but then over time we had experiences outside of HPC. For a number of years I went and did commercial enterprise computing and learned about transaction processing and business intelligence and data warehousing and things like that, and then e-commerce, and then web technology. So over time it expanded, but HPC is like a bug: you get it and you can't get rid of it, because it's just so inspiring. So supercomputing has always been my home, so to say.

Well, the reason I ask is that I wanted to touch on a little history of the industry. There was kind of a renaissance many, many years ago, and you had all these startups. You had Kendall Square Research, Danny Hillis's Thinking Machines, you had Convex trying to make mini-supercomputers, and there was just tons of money flowing in. Then things consolidated a little bit and got very, very specialized, and then with the big data craze, we've seen HPC really at the heart of all that.
So what's your take on the ebb and flow of the HPC business and how it's evolved?

Well, HPC was always trying to make sense of the world, to make sense of nature. And of course, as much as we do know about nature, there's a lot we don't know, and you can classify the problems in nature into basically linear and non-linear problems. The linear ones are easy; they've already been solved. Of the non-linear ones, some are easy, many are hard. The non-linear, hard, chaotic problems are the ones you really need to solve to get closer. So HPC was basically marching along trying to solve these things. It had a whole process with the scientific method, going way back to Galileo: experimentation was part of it, and between theory and experiment, you looked at the data, you theorized things, and then you experimented to prove the theories. Then simulation, using computers to validate things, eventually became a third pillar of science, and you had theory, experiment, and simulation. All of that was going on until the rest of the world, thanks to digitization, started needing some of those same techniques. Why? Simply because there's too much data. There's too much data to ship to the cloud, and there's too much data to make sense of without math and science. So now enterprise computing problems are starting to look like scientific problems, enterprise data centers are starting to look like national lab data centers, and there's a convergence that has been taking place gradually over the past three, four decades, and it's starting to look really real now.

Interesting. I want to ask you about competition; I always like to talk to analysts about the competitive landscape. Is the competition in HPC between vendors, or between countries?

Well, that's a very interesting question, because our other thesis is that we are moving a little bit beyond geopolitics to technopolitics, and there are now imperatives at the political level that are driving some of these decisions. Obviously, 5G is very visible as a piece of technology that is now in the middle of political discussions. COVID-19, as you mentioned, is itself a global challenge that needs to be solved at that level. AI: who has access to how much data, and to what sort of algorithms? And it turns out, as we all know, that for AI you need a lot more data than you thought you did. So suddenly data superiority becomes really important, and it can even lead to information superiority. So yeah, that's really all happening. But the actors, of course, continue to be the vendors, which are the embodiment of the algorithms and the data and the systems and infrastructure that feed the applications, so to say.

So let's get into some of these megatrends, and maybe I'll ask you some Columbo questions and we can geek out a little bit. Let's start with AI. When I started in the industry, AI and expert systems were all the rage, and then we had this long AI winter, even though the technology never went away. But there were at least two things that happened: you had all this data, and the cost of computing came down so rapidly over the years. So now AI is back, and we're seeing all kinds of applications getting infused into virtually every part of our lives: people trying to advertise to us, et cetera. So talk about the intersection of AI and HPC.
What are you seeing there?

Yeah, definitely. Like you said, AI has a long history. I mean, it came out of the MIT Media Lab and the AI Lab that they had back then. And it was really, as you mentioned, all focused on expert systems. It was about logical processing; it was a lot of if-then-else. And then it morphed into search: how do I search for the right answer, the needle in the haystack? But then at some point it became computational. Neural nets are not a new idea. I remember we had a researcher in our lab who was doing neural networks years ago, and he was saying how he was running out of computational power, and we were wondering what was taking all this time. And it turns out that it is computational. So when deep neural nets showed up, about a decade or more ago, it finally started working, and it was a confluence of a few things: the algorithms were there, the data sets were there, and the technology was there, in the form of GPUs and accelerators, that finally made this tractable. So you really could say, as indeed I do say, that AI was kind of languishing for decades before HPC technologies reignited it. And when you look at deep learning, which is really the only part of AI that has been prominent and has made all this stuff work, it's all HPC: it's all matrix algebra, it's all signal processing. The algorithms are computational, the infrastructure is similar to HPC, and the skill set you need is the skill set of HPC. I see a lot of interest in HPC talent right now, in part motivated by AI.

Awesome, thank you. And then I want to talk about blockchain, and I can't talk about blockchain without talking about crypto. You've written about that, and obviously supercomputers play a role; I think you had written that 50 of the top crypto supercomputers actually reside in China. A lot of times the vendor community doesn't like to talk about crypto because of all the fraud and everything else, but it's one of the more interesting use cases, and actually the primary use case, for blockchain, even though blockchain has so much other potential. What do you see in blockchain, the potential of that technology? And maybe we can work in a little crypto talk as well.

Yeah, I think one simple way to think of blockchain is in terms of so-called permissioned and permissionless. The permissioned blockchains are when everybody kind of knows everybody, and you don't really get to participate without people knowing who you are and, as a result, having some basis to trust your behavior and your transactions. So things are a lot calmer, it's a lot easier, and you don't really need all the supercomputing activity. Whereas for AI the assertion was that intelligence is computable, and with some of these exascale technologies we're getting to that point, for permissionless blockchain the assertion is that trust is computable. And it turns out that for trust to be computable, it's really computationally intensive, because you want to provide an incentive basis such that good actors are rewarded and bad actors are punished, and it is worth their while to actually put all that effort towards good behavior. That's really what you see embodied in a system like Bitcoin, where the chain has been safe over the many years. There have been no attacks, no breaches. People have lost money because they forgot their password, or because the custody of their accounts has not been trustworthy, but the chain itself has managed to provide that. So that's an example of computational intensity yielding trust.
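That "trust is computable" idea can be sketched in a few lines. Below is a minimal proof-of-work loop in Python; it illustrates the hash-puzzle idea behind Bitcoin-style mining, not Bitcoin's actual implementation, and the block string and difficulty value are made up. The asymmetry is the point: finding a valid nonce takes many hash attempts, while verifying one takes a single hash.

```python
import hashlib

def mine(block_data: str, difficulty: int = 5):
    """Brute-force a nonce so that SHA-256(block_data + nonce) starts with
    `difficulty` hex zeros. Expensive to find, trivial to verify."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

# Each extra zero of difficulty multiplies the expected work by 16;
# that knob is what makes honest work cheaper than cheating.
nonce, digest = mine("block 42: alice pays bob 1 coin")
print(nonce, digest)

# Verification is a single hash, so honest nodes can check work cheaply.
assert hashlib.sha256(f"block 42: alice pays bob 1 coin{nonce}".encode()).hexdigest() == digest
```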
So that suddenly becomes really interesting. Intelligence, trust... what else is computable that we could do, if we had enough power?

Well, that's really interesting, the way you described it: essentially the confluence of cryptography, software engineering, and game theory, where the bad actors are incentivized to mine Bitcoin rather than rip people off.

That's right, that's right. Because their ROI is better.

So, okay, make the connection. I mean, you sort of did, but I want to better understand the connection between supercomputing, HPC, and blockchain. We get crypto for sure; I can mine a Bitcoin, which gets harder and harder and harder. And you mentioned there are other things we can potentially compute beyond trust. What are you thinking there?

Well, I think the next big thing we're really seeing is in communication. It turns out, as I was saying earlier, that these highly computationally intensive algorithms and models show up in all sorts of places. Like in 5G communication, there's something called MIMO, multiple-input, multiple-output. And optimally managing that traffic, such that you know exactly what beam it's going to and what antenna it's coming from, turns out to be a non-trivial partial differential equation. So the next thing you know, you've got HPC in there where you didn't expect it. Because there is so much data to be sent, you really have to do some data reduction and data processing almost at the point of inception, if not at the point of aggregation. That has led to edge computing and edge data centers, and there, too, people now want some level of computational capability. Like, you're building a microcontroller, which traditionally would just be a small, low-power, low-cost thing, and people want vector instructions there. People want matrix algebra there, because it makes sense to process the data before you have to ship it. So HPC is cropping up really everywhere. And then finally, when you're trying to accelerate things, GPUs have obviously been a great example of that. Mixed-signal technologies are coming, to do analog and digital at the same time. Quantum technologies are coming. So you could do the usual analyst's two-by-two, where you have analog, digital, classical, quantum, and then see what lies where. All of that is coming, and all of that is essentially resting on HPC.
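One concrete slice of the MIMO math mentioned above: recovering the transmitted streams from what a bank of antennas receives is, in its simplest linearized form, a least-squares solve. Here is a toy zero-forcing detector in Python with NumPy; the 4x8 channel, QPSK symbols, and noise level are illustrative assumptions, not any particular 5G stack.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 4x8 MIMO link: 4 transmit streams observed by 8 receive antennas.
n_tx, n_rx = 4, 8
H = (rng.normal(size=(n_rx, n_tx)) + 1j * rng.normal(size=(n_rx, n_tx))) / np.sqrt(2)

# Send QPSK symbols; the antennas observe y = Hx + noise.
bits = rng.integers(0, 2, size=(n_tx, 2))
x = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
y = H @ x + 0.05 * (rng.normal(size=n_rx) + 1j * rng.normal(size=n_rx))

# Zero-forcing detection: solve the least-squares system H @ x_hat ~ y.
# This dense linear algebra, repeated per subcarrier per time slot,
# is the HPC-flavored workload hiding inside the radio.
x_hat, *_ = np.linalg.lstsq(H, y, rcond=None)
print(np.sign(x_hat.real) == np.sign(x.real))  # detected bits match sent bits
```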
That's interesting. I didn't realize that HPC had that position in 5G with MIMO; that's a great example. And then IoT, I want to ask you about that, because there's a lot of discussion about real-time inferencing, AI inferencing at the edge, and you're seeing new computing architectures potentially emerging: Nvidia's acquisition of ARM, perhaps a more efficient, maybe lower-cost way of doing specialized computing at the edge. But it sounds like you're envisioning actual supercomputing at the edge. Now, of course, we've talked to Dr. Mark Fernandez about space-borne computers; that's the ultimate edge, the nanosatellites. Yeah, you have supercomputers hanging from the ceiling of the International Space Station. But how far away are we from this sort of edge? Maybe not space, that's an extreme example, but do you think of factories and windmills and all kinds of edge examples where supercomputing is playing a local role?

Well, I think initially you're going to see it on base stations, antenna towers, where you're aggregating data from a large number of endpoints and sensors that are gathering the data, maybe doing some level of local processing, and then shipping it to the local antenna, because it's no more than 100 meters away, sort of thing. But there is enough there that that thing can now do the processing, do some level of learning, and decide what data to ship back to the cloud, what data to get rid of, and what data to just hold or... Now, those edge data centers sitting on top of an antenna could have half a dozen GPUs in them; they're pretty powerful things. They could have one, they could have two, depending on what you do. A good case study there is surveillance cameras. You don't really need to ship every image back to the cloud, and if you ever need it, the guy who needs it is going to be on the scene, not back at the cloud. So there is really no sense in sending it, certainly not every frame. Maybe you do some processing and send an image every five seconds, or every ten seconds. That way you have a record of it, but you've reduced your bandwidth by orders of magnitude. So things like that are happening, and making sense of all of that means recognizing when things changed: did somebody come into the scene, or did it just become night? That sort of decision can now be automated, and fundamentally, what is making it happen may not be supercomputing, exascale-class, but it's definitely HPC. It's definitely numerically oriented technologies.
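The camera arithmetic above is worth making concrete: a camera producing 30 frames a second that ships one frame every ten seconds has cut its traffic by a factor of 300, before any compression. Here is a minimal sketch of that change-detection idea in Python with NumPy; the frame sizes, noise, and threshold are made-up illustrative values, not any real camera pipeline.

```python
import numpy as np

def should_upload(kept: np.ndarray, frame: np.ndarray, threshold: float = 3.0) -> bool:
    """Ship a frame only if it differs enough from the last frame we kept.
    Mean absolute pixel difference is the crudest possible change detector;
    a real system might use background subtraction or a small neural net."""
    diff = np.abs(frame.astype(np.int16) - kept.astype(np.int16)).mean()
    return float(diff) > threshold

rng = np.random.default_rng(1)
scene = rng.integers(0, 30, size=(480, 640), dtype=np.uint8)  # static night scene
kept = scene

for t in range(300):  # ten seconds of 30 fps video
    frame = scene + rng.integers(0, 5, size=scene.shape, dtype=np.uint8)  # sensor noise
    if t >= 150:
        frame[200:280, 300:360] = 255  # somebody walks into the scene
    if should_upload(kept, frame):
        kept = frame
        print(f"frame {t}: change detected, uploading this one")
```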
Shaheen, what do you see happening in chip architectures? Because you see the classical Intel approach, trying to put as much function on the real estate as possible, and we've seen the emergence of alternative processors, particularly GPUs, but even FPGAs, and I mentioned the ARM acquisition. So you're seeing these alternative processors really gain momentum, you're seeing data processing units emerge, and there are kind of interesting trends going on there. What do you see, and what's the relationship to HPC?

Well, I think a few things are going on there. Of course, one is essentially the end of Moore's law, where you cannot make the cycle time any faster, so you have to make architectural adjustments. And then, if you have a killer app that lends itself to large volume, you can build silicon that is especially good for that. Graphics and gaming was an example of that, and people said, oh my God, I've got all these cores in there, why can't I use them for computation? So everybody got busy adding 64-bit capability and some RAS capability. Then people said, oh, I can use that for AI. And when you move it to AI, you say, well, I don't really need 64 bits; maybe I can do it in 32, or 16. So now you do it for that, and then tensor cores come about. So there's that progression of architecture basically trumping cycle time. That's one thing. The second thing is scale-out, decentralization, and distributed computing, and that means the inter-communication and intra-communication among all these nodes now becomes an issue, a big enough issue that maybe it makes sense to go do a DPU, and maybe it makes sense to go do some level of edge data centers, like we were talking about. And then the third thing, really, is that in many of these cases you have data streaming. What is really coming from IoT, especially at the edge, is that data is streaming. And when data is streaming, suddenly new architectures like FPGAs become really interesting and hold promise, so I do see FPGAs becoming more prominent, just for that reason. But then, finally, you've got to program all of these things, and that's a real difficulty, because you now need to get three different ecosystems together: mobile programming, embedded programming, and cloud programming. And those are really three different developer types. Maybe you can hire somebody who's good at all three, but there aren't many of them. So all of that is among the challenges that are driving this industry.

You kind of referred to this distributed network, and a lot of people will refer to the next-generation cloud as this hyper-distributed system, when you include the edge and multiple clouds, et cetera, even space; maybe that's too extreme. But to your point, at least as I inferred it, there's an issue of latency; there's the speed of light. So what is the implication for HPC? Does that mean I have to have all the data in one place? Can I move the compute to the data, architecturally? What are you seeing there?

Well, you fundamentally want to optimize when to move data and when to move compute, right? Is it better to move data to compute, or to bring compute to data, and under what conditions? The answer is going to be different for different use cases. It's like: is it really worth my while to make the trip, get my processing done, and come back, or should I just develop processing capability right here? Moving data is really expensive, and relatively speaking it has become even more expensive: while the price of everything has come down, the price of moving data has dropped less than the price of processing. So it is now starting to make sense to do a lot of local processing, because processing is cheap and moving data is expensive. The DPU is an example of that. We call this in-situ processing: let's not move data if we don't have to. Except that we live in the age of big data, so data is huge and wants to be moved, and that optimization, I think, is part of what you're referring to.

Yeah, so a couple of examples might be autonomous vehicles, where you're going to have to make decisions in real time; you can't send data back to the cloud. The flip side of that is when you talk about space-borne computers: you're collecting all this data, and at some point, maybe a year or two after it's lived out its purpose, you ship that data back on a bunch of disk drives or flash drives, load it up into some kind of HPC system, and then have at it, do more modeling, and learn from that data corpus. I mean, those are...

Right, exactly, exactly. I mean, driverless vehicles is a great example, because it is obviously coming fast and furious, no pun intended. And it dovetails nicely with the smart city, which dovetails nicely with IoT, because it is mostly in urban areas, where you can afford to have a lot of antennas, so you can give it the 5G density that it wants and the latencies it requires. And there's a notion of: what if my fleet could communicate with each other? What if the car in front of me could let me know what it sees, that sort of thing? So vehicle fleets are going to be an opportunity. All of that can bring everything we've talked about to one place.

Well, that's interesting. Okay, so the fleet's talking to each other; kind of the Byzantine fault tolerance problem that we always talked about.

Right, exactly, yes.

That's kind of cool.
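The move-data-versus-move-compute tradeoff Shaheen describes comes down to simple arithmetic. Here is a back-of-the-envelope cost model in Python; every number in it (the sensor's daily output, the uplink speed, the per-gigabyte egress price, the fraction an edge filter keeps) is an illustrative assumption, just to show the shape of the decision.

```python
# Back-of-the-envelope: ship raw data to the cloud vs. filter at the edge.
# All numbers below are illustrative assumptions, not measured values.

raw_gb_per_day = 86.4      # a sensor producing roughly 1 MB/s around the clock
uplink_mbps = 50.0         # available uplink bandwidth
egress_cost_per_gb = 0.09  # assumed per-GB transfer price
edge_keep_fraction = 0.01  # local processing keeps 1% of the data

def transfer_hours(gigabytes: float, mbps: float) -> float:
    """Hours needed to push `gigabytes` through an `mbps` link."""
    return gigabytes * 8000.0 / (mbps * 3600.0)

for label, gb in [("ship raw", raw_gb_per_day),
                  ("filter at edge", raw_gb_per_day * edge_keep_fraction)]:
    print(f"{label:>15}: {transfer_hours(gb, uplink_mbps):6.2f} h/day on the wire, "
          f"${gb * egress_cost_per_gb:.2f}/day egress")
```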
I want to sort of close on quantum. It's hard to get your head around sometimes; when you see the demonstrations of quantum, it's not a one or a zero, it can be both, and you go, what? How can that be? And of course, it's not stable. It looks like it's quite a ways off, but the potential is enormous. And of course it's scary, because we think all of our passwords are already not secure and every password we know is going to get broken. But give us the quantum 101, and then let's talk about what the implications are.

All right, very well. So, first off, we don't need to worry about our passwords quite yet; that's still ways off. It is true that an algorithm came up that showed how quantum computers could factorize numbers relatively fast, and prime factorization is at the core of a lot of cryptography algorithms. So, factorizing: if you get the number 21 and you say, well, that's three times seven, and those three and seven are prime numbers, that's an example of a problem that has been solved with quantum computing. But if you have an actual number with like 2,000 digits in it, that's really hard to do. It's impossible for existing computers, and even for quantum computers it's ways off. However, as you mentioned, qubits can be somewhere between zero and one, and you're trying to create qubits. Now, there are many different ways of building qubits: you can do trapped ions, trapped atoms, photons, sometimes supercooled, sometimes not, but fundamentally you're trying to get these quantum-level elements or particles into a state of superposition and entanglement. And there are different ways of doing that, which is why the quantum computers out there are pursuing a lot of different approaches. Somebody said it really nicely: quantum computing is simultaneously overhyped and underestimated. And that is true, because a lot of the effort is still ways off; on the other hand, it is so exciting that you don't want to miss out if it's going to get somewhere. So it is rapidly progressing, and it has now morphed into three different segments: quantum computing, quantum communication, and quantum sensing. Quantum sensing is when you can measure really precise, minute things, because when you perturb them, the quantum effects allow you to measure them. Quantum communication is working its way in, especially in financial services, initially with quantum key distribution, where the key to your cryptography is sent in a quantum way and the data is sent the traditional way. There are efforts to do a quantum internet, where you actually have quantum photons going down the fiber-optic lines, and Brookhaven National Lab demonstrated, just a couple of weeks ago, going pretty much across Long Island, at like 87 miles or something. So it's really coming, and fundamentally it's going to be brand new algorithms.

So these examples you're giving, these are all in the lab, right? They're lab projects, or are they actually...

Some of them are lab projects, some of them are out there. Of course, even traditional Wi-Fi has benefited from quantum computing, or quantum analysis and algorithms. But some of them are real, like quantum key distribution: if you're a bank in New York City, you very well could go to a company and buy quantum key distribution services and ship your keys across the water to New Jersey, and that is happening right now. Some researchers in China and Austria showed a quantum connection from somewhere in China to Vienna, even as far away as that. When you then add the satellites and the nanosatellites and the bent-pipe networks that are being talked about out there, that brings another flavor to it. So yeah, some of it is real, and some of it is still kind of in the lab.
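Coming back to the factoring example: the gap between 21 and a 2,000-digit number falls out immediately if you write down the classical brute-force approach. Here is a trial-division factorizer in Python, purely to illustrate the wall that makes quantum factoring interesting; it is the textbook classical method, not anything from the conversation.

```python
def trial_factor(n: int):
    """Classical trial division: test every candidate divisor up to sqrt(n).
    For n = 21 this is instant; for a 2,000-digit n, sqrt(n) has about
    1,000 digits, so the loop would need on the order of 10**1000 steps.
    That gap is why factoring-based cryptography holds up classically."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # no divisor found: n is prime

print(trial_factor(21))  # (3, 7), the example from the conversation
```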
Now, I said I would end on quantum, but I just want to ask you: you mentioned earlier the geopolitical battles that are going on. Who are the ones to watch, who are the horses on the track? Obviously the United States and China, and Japan is still pretty prominent. How's that shaping up in your view?

Well, without a doubt, it's the U.S.'s to lose, because it's got the density and the breadth and depth of all the technologies across the board. On the other hand, the information age is a revolution, and the information revolution is non-trivial. When revolutions happen, unpredictable things happen, so you've got to get it right. And one of the things that these technologies, these revolutions, enforce is not just technological and social and governance change, but also cultural change. The example I give is that if you're a farmer, it takes you maybe a couple of seasons before you realize that you'd better get up at the crack of dawn, and you'd better do it in this particular season, or you're going to starve six months later. So you do that two, three years in a row, and a culture has now been enforced on you, because that's what it takes. Then when you go to industrialization, you realize that, gosh, I need these factories, and then I need workers, and the next thing you know, you've got nine-to-five jobs, and you didn't have that before. You didn't have a command-and-control system; you had it in the military, but not in business. Some of those cultural shifts take place and change. So I think the winner is going to be whoever shows the most agility in terms of cultural norms, governance, and the pursuit of actual knowledge, not being distracted by what you think, but by what actually happens. And gosh, I think these exascale technologies can make the difference.

Shaheen Khan, great guest. Thank you so much for joining us to celebrate Exascale Day, which is on 10/18. Really appreciate your insights.

Likewise, thank you so much.

All right, and thank you for watching. Keep it right there; we'll be back with our next guest right here on theCUBE. We're celebrating Exascale Day. We'll be right back.