Hello and welcome to theCUBE Pod, episode 51. I'm John Furrier with Dave Vellante. We're here in the Palo Alto studio for this podcast. Dave, kicking off episode 51, what a week. It's been a whirlwind couple of weeks, three weeks. We've been in studio for Supercloud 6. We had the big week at GTC. Our CUBE team is up in Paris, France for KubeCon + CloudNativeCon, the CNCF event. Just an amazing time right now in the industry, and NVIDIA's conference really set the tone and raised the bar in terms of where these AI systems are going. NVIDIA's stock price and the financial performance on the business fundamentals have everybody in the industry on notice, like, wow, it's mind-blowing performance. It's a technology shift that's categorically new. And the CEO, Jensen Huang, is saying this is a new category of how things are going to get done, and it represents a cultural inflection point. And I think this week, with all the content going around the world with KubeCon and then NVIDIA GTC, and of course we were at the Broadcom financial analyst meeting yesterday morning for a few hours getting the scoop on their entire process for custom silicon and new packaging techniques based on open, foundational standards, you're seeing the rise of a generational shift in the computer industry, and the impact is something, from an experience standpoint, I've never seen before. I've seen great shifts in my time coming out of high school and college: the move from mainframes to PCs and minicomputers, the rise of local area networks, internetworking, the internet, mobile computing, social media. But now we're at an era that is just so different, and it's pumping on all cylinders. I mean, every theater is on fire in a good way. We've got an industrial revolution, what we call the tech revolution again, the systems revolution. The computing paradigm has changed radically.
Applications are going to change radically, and then ultimately you have the physical world of robotics connecting in with digital. So you have the perfect storm of innovation happening, and I think NVIDIA's event is a flash point because it gets everyone's attention: the greed, the power, the technology, its social impact. I mean, the money being made around this new era will be fantastic, a great opportunity. And a conference like NVIDIA's is about chips. It's a developer conference for the alpha nerds, yet hedge funds were there. Venture capitalists were there. You had VCs hanging around the bar in the lobby. Where's the action? I mean, people can smell the money, they can smell the opportunity, they see the value. Also, some people are scared, too. So I think this week represents what we've been talking about on theCUBE for some time as that inflection point, and it's real, and you can see the quantification of where it's going to go. You start to see visibility into the money, the applications, the systems revolution, specifically the tech innovation and the societal impact, and of course new things like robotics. So robots, we're going to be in a robot era. I mean, it's incredible. Yeah, so to me, John, this was the single most important event in the history of the computer industry. I mean, in terms of a show. I've really been thinking about this, and I was trying to say, okay, what else could match this? So I go back to the early Comdex days when Bill Gates was presenting on the future of the PC. This was bigger. I think about the early re:Invents, maybe even the first or second re:Invent, when Andy Jassy stood up. So when I say that, I don't mean in terms of size. There were, whatever, 20, 25, 30,000 people there. There are many bigger events. But in terms of the industry impact, I think this was the biggest event in the history of the computer industry.
And the reason I say that is, to your point, it was a developer conference, but everybody was there. All the partners, we're going to talk about it, there were just so many press releases that went out. Every company did a press release, super-gluing themselves to NVIDIA. But also there were industries there. I put out a tweet of one of the slides that Jensen showed in his keynote. And these were speakers: JPMorgan Chase, MITRE, Honeywell, Foxconn, Deloitte, Disney Research, eBay. I mean, these were presenters at the conference across industries. So we've talked for years about how every company's a technology company and every company's a software company. Well, every company's now an AI company. We're all experts at AI. But the reach was really tremendous. Everybody's talking about it. All the news shows and the business news channels talked about it, not just as a one-hit, "hey, we stopped by CES and we did a couple interviews." This was week-long, and people are still talking about it. And then the other thing, the really positive thing in my mind, is despite the tough macro, the market's broadening. It's not just the Magnificent Seven, it's not just NVIDIA. Jensen talked about ANSYS, Synopsys, Cadence. Dell got a big lift. I mean, of course, you know, HPE was there and Lenovo was there and Supermicro, and on and on and on. Snowflake and Databricks did deals with them, and everybody is, you always said, a rising tide lifts all ships. It is happening, finally broadening in the stock market, which I think is a good thing. And the other thing is Micron announced earnings last night, and they announced that their high-bandwidth memory, John, is back-ordered until like 2025. And we heard yesterday at the Broadcom financial meeting, and I know we're going to talk about that a lot, how Broadcom is developing the communications technology to talk to all the XPUs and all the memory. And so they got a big lift for other reasons as well. We'll talk about it.
They got a third big custom customer. But this, again, I think was the single biggest event in the history of the computer industry. Well, I think the Micron news is actually a good announcement. Supply is going to be on allocation. The demand is so high for components, and the supply chain is challenging. I'm going to watch whether that causes a change in pricing at some of these companies. That's going to be something to watch. But back to your NVIDIA point, and we do want to touch on Broadcom, because we had a lot of time with the execs there and we got the full picture. And it's important to explain the custom silicon trend. Right now they have only three customers, but I predict you'll see chips for the masses come out of this process, the way they've managed to open up the technologies. But if you go to NVIDIA, the thing that jumped out at me, besides what I said earlier and what you just said, is that every single company that we've been following pretty much has a deal with NVIDIA. Everyone's a partner, you know, flexing on Twitter and LinkedIn: "our strategic partnership with NVIDIA." And everyone's got a strategic partnership with NVIDIA, like, what the hell do they have? So when you peel back the onion, there are really not a lot of partnerships. It's pretty much Barney deals, reference implementations, and/or "we love each other," more intent than commitment: we're intending to do something. So I didn't really see a lot of NVIDIA deals that I could point at and say, wow, that is a real partnership. The Dell AI Factory was kind of, you know, a buzzwordy name, but, you know, they're building end to end. Jensen said nobody is better at building end-to-end systems than Dell. I mean, he stood up on stage and said that. I was shocked that he said that. I wonder what the true partnership is.
I mean, it almost feels like everyone is revealing that they're in and they're working on it. It's not like it's out yet. There's a lot of behind-the-curtain activity. Even with Dell, and we follow Dell pretty closely, I do think they have a good relationship. I think Jensen is telegraphing a little bit to Dell, the way he called Michael Dell out in the keynote. Michael had all his lieutenants there, Sam and others. Their booth was on fire. I thought they had great booth presence there. Right, I mean, I thought Dell Technologies stole the show from all the vendors. They're writing big checks. And so they had good activations. I mean, they were showing RAG in action, and you can see where they're going, right? Your point about their booth, their booth was packed. I mean, a lot of booths were packed too, but Dell is getting inundated with inbound now. I mean, you know, Dell generally is still a low-margin hardware company, but they're so relevant now, like overnight, with this AI wave. And we saw them at Mobile World Congress, and so they're doing some things. And then HPE, with the supercomputing acquisitions that they've made, all of a sudden they're relevant overnight. And if you look at the show floor, I think there could have been a lot more booths there. HPE was there, a bunch of other ones, and new names were out there. Obviously, Supermicro is rising to the top. I mean, the Supermicro brand has become significant in its brand value. It was a great brand as a, quote, supplier of servers, and anyone who's done anything over the past decade has probably bought some. Yeah, a key ODM, for sure, right?
Supermicro boxes, assembled into your own data center. But I think now we're living in an era where, with this big AI shift, every time you've seen movements like this, and again, like I said, I think this is bigger than anything I've ever seen by a multiple factor, 10 to 100x more than the PC revolution and other waves. Anytime you have a shifting of the winds like this, okay, whether I think of it as a big wave or the changing of the wind, there's opportunity. And so companies like Dell and others will capture that. And there will be winners, big winners. There'll be existing incumbents that will take advantage of the wind shift, or the tide turning, whatever metaphor you want. And then you're going to have people who won't, and we're going to see a lot of losers come out of this. So if you look at the market, there are going to be very clear winners and losers in the arena here. I'm telling you right now, you're going to start to see the telltale signs of what a winner looks like and what a loser looks like. And if you're a loser, you'd better start thinking quickly: I'd better get on this new AI infrastructure, because the infrastructure is changing fast. And what came out of the NVIDIA show, one of the things that wasn't really talked about on some of the mainstream outlets like CNBC and others who were there, kind of going gaga over NVIDIA, the real story coming out of NVIDIA is that a brand-new infrastructure is being set. The infrastructure to power AI will be a lot different than the infrastructure that powered the last generations of the computer industry. That means pretty much everything will be old, fast. And so if you have old stuff and you don't make it new and cool like the new stuff, you will be on the losing side. And like I was saying yesterday on theCUBE, if you're not software-defined, you're in trouble.
If you're embedding compute in, say, your flash arrays or your servers and you're not decoupling that, you're in trouble. So physical plus software-defined will be a big part of it. And I think that's what I also learned at Broadcom: that foundational technologies based on open standards will be the winning hand. And we're going to see how people configure their infrastructure. So I want to pick up on something you said about a lot of Barney deals, and it's true. There were a lot of Barney deals because people want to get attention. But I think, John, there's also a lot of creativity going on, where people are saying, okay, I can get my hands on these GPUs and I can do things with them that NVIDIA is not going to do, although I'll come back to that, things that give us unique value in the marketplace for customers. And I'll give you an example. One of the more interesting conversations we had, I thought, was at that VAST luncheon with Genesis. You see all these alternative clouds, these GPU clouds coming out, AI clouds, like CoreWeave, Genesis, and there are many others. VAST is sort of running the table on those guys because they're doing petabyte-scale object storage and they've got their file system. But these companies are competing with the cloud players, who of course have so much money. Their premise is: we're going to be more agile. We're built for the AI era. We're not like traditional general-purpose, multi-tenant workloads. We are building infrastructure that's specifically for AI. Now, we asked the Genesis guy, you know, how are you competing from a CapEx standpoint? The hyperscalers are obviously well-capitalized. But these companies have a bunch of dough behind them too, and they are able to compete, it seems anyway, with the hyperscalers. I don't know what kind of lead they have, whether or not the hyperscalers will close that gap, but John, it's a classic innovator's dilemma, right? Well, I think the PC is changing.
I think that's Dell's opportunity. NVIDIA has been bundling Tensor Cores into millions of PCs. They've been running CUDA, in a way, with the graphics processors. So I think the AI PC is going to be a real deal. I think that's going to have legs. I think it'll look a lot different. We'll see how they bundle in, say, LLMs and language models into these PCs, but they'll have the hardware, they'll have some NVIDIA in there. And I just think that this idea of the AI factory that Jensen and NVIDIA have talked about, again, points to the fact that the infrastructure outside of the big cloud guys is going to be reset. So the classic modern enterprise is going to look at new architectures. And it will include AWS and Azure and Google Cloud and Oracle Cloud, as well as on-premises activity. And edge. So I think you're going to see a distributed computing paradigm continue to emerge. And mobile devices will just be an IoT kind of edge. And you'll see small language models on devices. Oh, like Qualcomm at MWC. We had Qualcomm on, and they were talking about how last year at MWC they had a billion-parameter model running on a smartphone, and this year they had seven billion, and I think they're going to be, you know, tripling that in the future. So that's pretty amazing. You know, the other thing: David Floyer came to me in 2021 and said, Dave, we've got to do something on NVIDIA. This company is going to dominate the data center. So we wrote a Breaking Analysis on how NVIDIA is going to dominate the data center and run the table on AI. And what struck me at the time, he obviously knew and understood well, as we've been reporting, CUDA and the impact of CUDA and what an enabler it was. He said at the time that NVIDIA could give away the hardware, which they're not doing, and then just sell and license the software, and they could have a great business.
Now, of course, they're not giving away the hardware. Supposedly Blackwell, which is the new chip they announced, is going to be $50,000 a pop, which is roughly double the previous generation, but we'll see what the actual pricing is. But NVIDIA did announce an enterprise license at $4,500 a year per GPU for building digital twins, and they demonstrated some really powerful software. So that's a whole other vector of growth for them. And when we were in the analyst roundtable with Jensen, somebody asked, how big can that market be? He said, well, let's assume we have a million GPUs out there, and it's 4,500 bucks apiece. That's a $4 or $5 billion annual revenue stream that can grow. And so you're starting to see these business models. There are concerns, I will tell you, from private conversations I had. Jensen said: we start with this concept of a system and a data center, you've been talking about clustered systems, and then we break it up and sell it in parts. Well, he's showing racks. I mean, he's showing supercomputers, basically saying, we'll sell this to you. So a lot of the people who sell that equipment are looking at that going, huh, buy it from us. And he's got his own clouds going, right? You can get the DGX Cloud. He's selling systems, he's selling parts, he's selling to everybody. And so I think that has some people a little bit concerned. Now, they don't have the service infrastructure that a Dell or an HPE or a Lenovo has. But that's something people are talking about, and I think it has people concerned. Yeah, I think one of the things people are also talking about is the flip in how chips are being built. Old-school conventional wisdom was: make the chips smaller and faster, pack them all on a motherboard.
Now, the model, as Jensen pointed out with Blackwell, is: we want to make the largest chip possible, because the benefit of one big chip, with the power delivery and high-speed interconnect built in, is that it works better. And you mentioned NVLink. NVLink and NVSwitch, these things can't go down. He also said the AI chips need to be bigger, and you get low-energy links to bridge them together. That's the logical layer that makes it simple: chips talk to each other, and you can have chips talking to each other on these Transformer Engines. But the bottom line is, he basically said, we build the data center and we just sell it in parts. So they're essentially building the data center of the future, which could be renamed cloud, or a set of machines. I mean, as Larry Ellison once said at the Churchill Club, remember that big video years and years ago: the cloud, what's the cloud? It's just a bunch of servers, right? And he's kind of right, because Amazon is a collection of servers. James Hamilton and the team there innovated around how to host, and Amazon to me is the poster child of innovation around how to build large-scale data centers, as are Google and Facebook, which have also been building these large systems. So Jensen's essentially building a data center for NVIDIA with all their stuff, and he's basically renting out the parts. You want some GPUs? They charge per GPU. And the controversial statement, I will say, is that they're not in the software business. They're in the GPU business, the chip business, and they happen to sell a license for CUDA on a per-GPU basis. He said, it's very simple: we sell at $4,500 per GPU. Put a million GPUs out there and do the math, right? That's their business model, but it's pegged to the GPU, not a software renewal license. You're right. So it's like they're in the chip business. They're in the hardware business. They are a hardware company. I mean, there's no question about that.
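The "do the math" Jensen invited is simple enough to sanity-check in a few lines. The $4,500-per-GPU price is NVIDIA's stated figure; the one-million-GPU installed base is Jensen's round-number assumption from the roundtable, not a reported number:

```python
# Sanity check of the per-GPU software licensing math from the analyst roundtable.
# The installed-base figure is Jensen's hypothetical, not a reported number.
license_per_gpu_per_year = 4_500    # dollars per GPU per year (NVIDIA's stated price)
assumed_installed_gpus = 1_000_000  # Jensen's round-number assumption

annual_revenue = license_per_gpu_per_year * assumed_installed_gpus
print(f"${annual_revenue / 1e9:.1f}B per year")  # → $4.5B per year
```

That lands squarely in the "$4 or $5 billion" range cited in the conversation, and the key point stands: the revenue scales with the installed GPU base, not with a traditional software renewal cycle.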
I mean, of course the software is key, but the chips themselves, that is the enabler, the ability to program these GPUs. That was the enabler. I mean, you listen to Ilya talk about that: when they got their hands on CUDA, they were like, okay, now we can really actually do things with this system. And to me, the two most interesting, we talk in these ways, oftentimes it's the picks and shovels. People talk about picks and shovels, meaning the infrastructure companies that are going to do really well, using the analogy that the people who made all the money in the gold rush were the ones selling the picks and shovels, not necessarily the miners. But anyway, to me, the two most interesting AI companies right now, both sort of by accident and by design, are NVIDIA and Broadcom. And they have two dramatically different philosophies and strategies. Obviously NVIDIA is making giant chips, giant GPUs, and building huge systems. Broadcom doesn't make GPUs, right? Broadcom makes all the technology that connects the GPUs and the CPUs and the neural processing units and the LPUs, whatever XPU you want to use, and all the memory, all the high-bandwidth memory. They connect all that stuff together. Why is that important? Because you have this processor renaissance going on. In the old world of x86, x86 managed everything: all the memory management, I mean, everything. All the peripherals around it, x86 was the center of the universe. And what's happening now is all these XPUs want to do stuff. And so they need access to memory. They need access to resources. So Broadcom makes the technology to enable that. And you can't do AI without that. So they're just sitting in an unbelievable position. As well, NVIDIA not only makes the GPUs, they make all the interconnectivity, both within the system and across, you know, switching. And the acquisition of Mellanox enabled them to do InfiniBand.
And then there's a sort of urinary Olympics going on between InfiniBand and Ethernet, right? Jensen essentially said Ethernet's useless for AI. Yesterday we heard from Broadcom that that's bullshit. Everybody's doing Ethernet. All these hyperscale companies are doing Ethernet, and they're doing AI. So that's nonsense. And then Ultra Ethernet is what allows them to scale. And so, you know, there's that interesting debate. And then of course Broadcom says, by the way, NVIDIA is our fastest-growing customer, in one of their divisions, who knows? So, yeah, well, and then of course they announced, I'm switching to Broadcom here, but they announced the third big custom silicon customer. We know Google's been a customer for 10 years. We know Meta has probably been a custom silicon customer for the last four or five years, and Hock Tan sits on the board of Meta. And we suspect that the third is ByteDance. You suspect that. I do suspect that. You don't agree? Well, I think Facebook must be in the mix, because obviously Hock Tan is on the board of Meta. Yeah, that's the second one. There are three; they announced a third. We know one is Google. How do you know it's Google? We know, trust me, I know. Okay. And it's always been, they've been the biggest customer. They were Broadcom's first custom silicon customer, and Meta was the second. And I really do think the third is ByteDance. An analyst thought it might be Amazon. Well, let's talk about it. So the reason I really think it's ByteDance is because there was a question asked: are there restrictions in terms of you selling to China? So that was an attempt. And I remember Pierre, a really sharp analyst in front of us, said, nice try. And I was like, yeah, nice try, but I'm really interested in what the answer is. He was asking a different question.
He was basically saying: is the custom silicon customer ByteDance, and are you restricted from selling to them because of the China restrictions? And Charlie Kawwas said, at this time there are no restrictions on selling. The question was clearly trying to see if the answer would eliminate ByteDance. It did not eliminate ByteDance. So by process of elimination, you have to keep ByteDance in the mix. Now, if we unpack this and decode what Charlie Kawwas was saying, he was very clear that in consumer AI, the ROI and the business case are very clear. With bigger clusters you can do better AI, you can get more people to click on your stuff and stay on your platform longer. And so it's a no-brainer, he's saying, for these consumer internet companies, i.e. Google Search and Meta and all the social media companies, to build bigger clusters: a very clear business case. The business case in the enterprise is not as clear. So that's why I feel like it's got to be ByteDance, because they can make more money. Let's talk about Broadcom. So we were at the Broadcom financial analyst investor day, where they bring in all the top investors, which is essentially all the hedge funds and buy-side and sell-side analysts, and they had a few industry analysts. We were there, and only two other industry analysts were there. So it was like four industry analysts. We're technically kind of in between. We're like above industry analysts, below financial analysts, because theCUBE gives us a little bit more view there than the industry analysts. So we're kind of in the middle between those two categories. But we got access. So we saw Charlie Kawwas, Jas Tremblay, and all the execs by division, and they laid it out. They basically presented: this is Broadcom's semiconductor strategy and the set of key products and technologies that we think will dominate our business for the next decade.
We already had a lot of visibility into what they were doing, but we didn't have a lot of the details. So for me, I learned a lot and connected many dots. My notebook was full. Clearly it was great to level up, but it's very clear to me that Broadcom is essentially laying it out: look, we're an open book. Yeah, and they were polite about the NVIDIA comment. I thought that was clever. Hey, of course they're a customer. Everyone's a customer. But Broadcom, I think, is doing something so interesting around custom silicon, and that's what I really wanted to focus on: the interconnect answer to NVLink, okay? And PCIe, the role of Ethernet. What will the interconnects be? And what Charlie Kawwas and team presented was the clusters of the future, the AI clusters that are needed to run AI at scale. They laid out the architecture. It was a magical picture. I loved it. And then they also talked about what the AI chip needs to look like. And what they meant by that is: this is the kind of chip that's needed for AI-like workloads in these clusters, and this is how the clusters have to behave. I thought that was really right on point. And the second thing is, they showed the role of Ethernet and the role it plays with power and cooling in these new clusters at scale. So they really laid it all out. And then finally they really showcased what I think is their competitive advantage, which is their custom silicon process, which will allow them to take custom silicon from design to ramped-up production in less than 12 months. What that means is, notwithstanding supply chain constraints, like HBM, high-bandwidth memory, and other components, they literally could be creating custom silicon for the masses. And that's a game changer. That's just never been done before. So you have these open, foundational technologies that they've built on: Ethernet, PCIe, and others.
They've essentially prefabricated an automation factory for the whole design side. So all you've got to do is swap out your processor. As processors change, they're going to build the apparatus around that system for these specialized chips that work in mega clusters, which we've been calling clustered systems. That's the PC of the future. That's the server of the future. We're used to PCs and servers: you rack the server, you use PCs to work on. The chip of the future will work in a new kind of combination. I think Broadcom is really forward-thinking on this, and I was blown away. And finally, their other competitive advantage will be their packaging. Their unique ability to do packaging of the processors and the chips is really going to be a major differentiator. So if you're in competition with Broadcom, good luck. And they weren't totally grandstanding, but they were proud. You can see it. They're like, look at what we've got. Good luck to the competition. NVIDIA is more about bragging about their competitive position, but Broadcom was kind of humble. They're like, good luck to the competition trying to ramp up here. By the time competitors even get to design wins, Broadcom is already shipping the next chip. So I think Broadcom is really misunderstood by a lot of people. And I think the purpose of this, by the way, this was their first financial analyst event, I think ever. Really? Yeah. And Hock Tan wasn't there. Okay. So he obviously talks on quarterly calls and he goes to industry conferences, but he wasn't there. This was Charlie Kawwas's show. They have, I think, 26 business units, and 17 of them are silicon. They obviously have some software as well. They've got CA, they've got VMware and others. But it was really, I think, designed to educate people on their business. And what struck me is... And we did a good job on it. I thought they did a great job. Absolutely.
And we learned this when Charlie laid it out at Mobile World Congress, when he was on our show. There were three things he said. We start with the market, and we always look for markets that are durable, meaning we don't really care if they're on a 20%, 30%, 50%, 100% a year growth rate. That doesn't matter to us. What matters to us is: will this market be here 10 years from now? Is it durable? And the second thing he said was technology: we look for areas where we can be the technology leader. And the third was execution. The old saying: execute or be executed. And they're really good at execution. So that's great, okay, that's their strategy. But what impressed me was every speaker that stood up talked about their markets and the history of their markets. They demonstrated a deep understanding of the market, where it came from and where it's going. They also demonstrated technology leadership. How many times yesterday did we hear, well, we were first with this PCIe chip, we were first with this AI chip in 2014, first, first, first, first, first. And then they showed their roadmaps. By the way, they said "first" a lot, but they also said, we started way back here. And me personally, breaking into my career with IBM and HP, nine years at HP in the '80s and '90s, the old Hewlett-Packard was very instrumental in a lot of Broadcom's success. And they actually called HP the grandfather of some of these chips. So that made me feel, with the combined presentation, like Silicon Valley. It felt like old-school Silicon Valley: we're engineering killer products. And it wasn't fly-by-night, hey, we've got some new shit. It was real, engineered systems. We're proud of this. They weren't too loud, they weren't too cocky. Like I said, they were kind of humble, but they were like, look, who else has got this? Well, and then the third piece was their execution.
And then, again, they demonstrated they've got leadership in all these different markets. And then they took us to these rooms and they showed us the tech. We couldn't take pictures. They wouldn't allow us to take pictures, but it was really impressive what they did. And, you know, we saw the custom silicon. There were no photos allowed, just for you guys watching, there were no photos allowed; we would have had photos. The only photos were some staged PR stuff that we weren't involved in. We were too busy getting the demos. They had a breakout after the presentations, five rooms. We spent all of our time in room five. That was the custom silicon room. That's Frank Ostojic's area. He was awesome. I loved his talk. I think Jas has got the killer strategic piece with interconnect. So I think Jas Tremblay and Frank on the team had the two kind of killer products. These interconnects, and how they're engineering them inside the chip, are key, but the custom silicon was great. Room five was all about custom silicon, and what they're doing is amazing. And their custom product packaging with optics, the CPO product. Dave, they have this thing called CPO, co-packaged optics, where they embed the optical components into the chip. So all you need is two connectors and you're done. In the old way, you had to put multiple connectors in there for the optical transceivers, and they're prone to error. You've got human error, cable error, connector error, but co-packaging eliminates that. Again, this is going to show the advantage of Broadcom: it's going to be a silicon war over who's going to have the best product. And again, they did great. The other thing you saw was a real emphasis on getting as much out of copper as possible, because it's lower cost, more reliable, and less complex. And so basically there's a sort of optics avoidance.
I mean, you need optics to go distance, but it's optics avoidance — maybe not at all costs, but use copper and your process technology, and only go to optics when you really have to and bring in that complexity. The other thing they did a really good job of explaining — one of the speakers used the analogy of surfing waves. He said, I'm a surfer; my daughter and I surf. We can surf the one- to two-foot waves, no problem. You start getting into the five-, six-foot waves, it gets hairy — maybe you can try that. Those 15-foot waves? Whoa, you don't want to do that, because you're going to get hurt. He also used a skiing analogy. Right, skiing, which maybe resonates better with people. But he basically said, we're really good at surfing the 15-foot waves. The scary stuff — we do it, and we do it really well. And to your point, they weren't really bragging about it; it was more like, hey, who else can do this? They did a good job of bragging without bragging. I mean, first of all, it's not bragging when it's true, right? So they laid it out. The skiing version of the analogy was, you're going to do the black diamonds, so you want to make sure you're ready. The black diamonds with the skeletons. That's what they're really good at — those bump runs, the chutes out west. Kind of like you and me out in the valley that time. The old days, yeah. I mean, I can do one black diamond run now; I'm kind of retired to blue squares. But I wrote on LinkedIn in my summary of the event — these XPUs, they call them XPUs because it could be a CPU, GPU, TPU; there are all kinds of new processing units coming out, NPUs and more. And they don't care. They're like, whatever you want to use. Well, I think the genius of their strategy is not to compete with the processors.
And the way they built their architecture, it's like Lego blocks — you just plug the processor in, very modular in the design. And shortening the design cycles will enable Broadcom to potentially build chips for the masses. And custom workloads are becoming more prevalent in the enterprise — I think that's going to be interesting to watch. Because, first of all, it's important to say Broadcom came out at this event and said, we're talking about consumer AI, not enterprise AI. And I think their main reasoning is that all the money right now is in consumer AI, because those companies — Meta, ByteDance's TikTok, the Amazons of the world — are the hyperscalers serving applications at massive scale. And they sell to Apple, as we know. So that's where the action is. But okay, fast forward: consumer is always a predictor of enterprise. If enterprises get the ability to do custom workloads for distributed computing — public cloud, on-prem, edge — you're going to see highly specialized enterprise workloads that potentially have the opportunity for a custom chip. Why wouldn't you? It's like having a server dedicated to a workload the customer knows is running their business. So I think you're going to see enterprise workloads that will be candidates for custom silicon, and that's the new thing that will emerge. It brings up an interesting point: what are the decision points at which Broadcom and its partners decide to go down the custom route? And Charlie Kawwas, I thought, did a good job of explaining this. He said, look, if you take merchant silicon and apply it, it's great, but it doesn't give you that differentiation. But these big internet companies have very specific workloads.
And if you can build custom chips, specifically purpose-built and designed for those workloads, you can save a ton of money and a ton of power. He was talking about Meta and Google — these are huge hyperscalers. If you can save 80 watts across millions and millions of chips, you're saving a lot of money. And of course, power is a huge problem for data center operators. So that's where they decide to do it — that, plus the volume, obviously, and of course whether they can make money at it. So that was really interesting. Then the other piece: they talked about software they've developed. They're an engineering company — it's very clear, engineers everywhere, not a lot of sales and marketing. Hardcore engineers. But they've developed software as part of their workflow so they can get to tape-out fast. He gave two examples — I think one was seven months and one was nine months to tape-out. And then they showed, to your point, this packaging about the size of this laptop, and the customer can just pop in XPUs. They can put in XPUs, they can put in high-bandwidth memory, they can configure it any way they want, and Broadcom's IP is the connectivity between all of it. One of the most powerful slides they showed: they put it up there and said, okay, we're the red on the slide — all these different components, we're the red, all the hard stuff that nobody wants to do. And the parts that weren't red were the XPUs — that's Intel, that's NVIDIA, that's Qualcomm, whomever. That's not us; we don't do that. We do everything in between, and it's really lucrative. Just look at their numbers — they're a highly profitable company. So I got a note from VJ after we left, because we were in room five when Charlie Kawwas came around with a surprise for the people there.
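To put that 80-watt figure in perspective, here's a back-of-envelope sketch. The fleet size and electricity rate below are our own illustrative assumptions, not numbers from the Broadcom presentation:

```python
# Back-of-envelope: what 80 W saved per chip means at hyperscale.
# Fleet size and electricity rate are illustrative assumptions only.
watts_saved_per_chip = 80        # per-chip savings cited in the talk
chips_deployed = 1_000_000       # assumed fleet size ("millions of chips")
usd_per_kwh = 0.08               # assumed industrial electricity rate

total_watts = watts_saved_per_chip * chips_deployed        # continuous draw avoided
kwh_per_year = total_watts / 1000 * 24 * 365               # watts -> kWh per year
annual_savings_usd = kwh_per_year * usd_per_kwh

print(f"Continuous savings: {total_watts / 1e6:.0f} MW")
print(f"Energy avoided:     {kwh_per_year / 1e6:.0f} GWh/year")
print(f"Cost avoided:       ${annual_savings_usd / 1e6:.1f}M/year")
```

Even with these rough assumptions it works out to tens of millions of dollars a year — and that's before cooling overhead, which multiplies the number by the data center's PUE.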
They demonstrated the CPO — co-packaged optics, an optical interconnect for AI systems — where one CPO switch, a custom processor with the optics built in, replaces 128 400-gig optical modules. Okay, let's put that in perspective. A chip about this big, fully self-contained with optical interconnects — an AI chip with all the optics in there — replaces 128 400-gig optical modules. Wow. Talk about instant ROI. This is why chips are getting bigger, not smaller: there are a lot of benefits inside the chip architecture, which Broadcom laid out and which NVIDIA has encapsulated in its model as well, as a monolith. You can do a lot more in the chip if you engineer it this way, and that essentially lowers the energy envelope — the power envelope. Power and cooling are the new constraints. And again, AI systems — we've been saying this on theCUBE for over a year now, Dave — there will be a new infrastructure for AI. There has to be, and it's not what the old one was. So you start to see a glimpse. That's why I think the NVIDIA event and the Broadcom financial analyst day were well timed: you have the curtain being lifted on the requirements for the new infrastructure. When that bit gets flipped, pun intended, you're going to see software snap into line. You're going to see a huge renaissance, a Cambrian explosion of software development. Amazon has already kind of laid it out with cloud, and NVIDIA with CUDA shows you can run a great software stack on top of existing hardware. And then as these AI clusters and systems and chips come out, the AI chips and AI clusters will enable a new AI software model. I think that's going to be a very, very fun computer science boom in AI. And today, again, we go back to the picks and shovels. So who are the picks and shovels?
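The module math is easy to sanity-check: 128 modules at 400 Gbps each is 51.2 Tbps of aggregate bandwidth, which lines up with the 51.2-terabit class of today's top-end Ethernet switch silicon. A quick check, using the demo claim as we heard it:

```python
# Sanity check on the CPO demo claim: one co-packaged-optics switch
# standing in for 128 pluggable 400G optical modules.
modules_replaced = 128
gbps_per_module = 400

aggregate_gbps = modules_replaced * gbps_per_module
aggregate_tbps = aggregate_gbps / 1000
print(f"Aggregate bandwidth: {aggregate_gbps} Gbps ({aggregate_tbps} Tbps)")
```

So one co-packaged part carries the full bandwidth of a rack's worth of pluggables, with two connectors instead of 128 failure-prone ones.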
Obviously NVIDIA, clearly Broadcom, some of their customers — we see Dell, Supermicro, and HPE selling systems; Synopsys and Cadence, companies like that, who do the design software. But that's all sort of infrastructure. And then, to your point, the next wave is software. All SaaS is going to be injected with AI, so you're going to see intelligent apps emerging. ServiceNow, Snowflake, Databricks, Adobe, Workday, Oracle — on and on — their apps are going to get more intelligent, so they're going to add more value. But the other piece, John, is the end customers of those products, right? The industrial manufacturers, the healthcare companies, the pharmaceutical companies, the automotive companies — they're all going to be using AI. And collectively they're probably going to create way more wealth than the computer industry suppliers. That's when the rising tide lifts all ships. But I will say this: right now the macro is softening. We talked last week about how spending is back-loaded, and that was preliminary data. I've talked to ETR, and it's still preliminary, but it's looking pretty solid: IT spending growth, across 1,700 IT decision makers, is going to go from an expected 4.3% down to maybe 3.5%. So it's softening, and it's even more back-loaded — Q1 and Q2 look a little softer, and AI continues to steal from other budget buckets. The reason I bring this up is because we've got to start seeing ROI. Two-thirds of customers say they want to see ROI within 12 months, and I don't think that's happening across the board right now; the ROI is still pretty elusive. The other thing is 44% of customers tell us they're stealing from other budgets to fund their AI. So there are still a lot of proof points needed, and Charlie Kawwas made this point yesterday when he was talking about the clear business case in consumer.
It's not as clear in enterprise. People keep asking, what are the use cases? Well, helping us write code, summarize documents, write marketing — all good, all wonderful. But the real killer use cases that are going to throw off cash and allow gain-sharing really haven't emerged yet. And your point about that is interesting. If you look on social media, Michael Dell just shared something where Jensen was at the Dell booth and someone caught it on camera: if you want to buy it from IT, buy from Dell. So he's out there. What I love about Jensen is he's on the floor going to all the booths, getting photo ops — and obviously it's his show. He's like a rock star. It's good marketing. There were hundreds of people following him, and he went to every booth. I think that's a genuinely cool thing to do, and props to him for doing it. He's a cool guy. Why wouldn't you go? He spent a lot of time on the show floor. I mean, if you're coming to my show, I'd love to go to your booth, give you an acknowledgment, a thumbs up, a shout-out, right? Why wouldn't you? It's good politics, and it's good to do the rah-rah, because they'll come back and get a booth next year. But you mentioned Dell. Dell announced this idea of an AI factory — they call it the Dell AI Factory, not the NVIDIA AI factory. Smart marketing. So look at Dell: Dell has been a supplier of PCs and servers, okay? That's their core competency. Like I've been saying, this is a refactoring of the computer industry. You're seeing AI systems being refactored not as servers but as clusters. So what Dell is doing is refactoring their value proposition from servers and PCs to AI, which means their customers will be deploying AI factories. Now, if Dell can provide that server resource as a service, does it matter? It's still a product and service from Dell.
I mean, remember the first servers were Pentium-based? And then they got better and better, and then blades, and then scale-out. So this is just another evolution of servers. If you're Dell, of course you want to be in the AI factory business. You don't have a cloud, okay? You're not Amazon. Remember when everyone tried to be Amazon for a few years? Yeah — I mean, John, Dell is remarkable to me, and Michael Dell is like Midas. Think about what he's done. Think about the risk he took with VMware, if I may, for a second. The timing of that was just unbelievable. You can say it was lucky — maybe it was — but his instincts were amazing. He was able to raise money when interest rates were so low. He was able to recapitalize his companies when interest rates were so low. He was able to take his private company public when interest rates were so low. And he just leveraged that; his timing was unbelievable. Now, I've always said Dell is a low-margin company, and I'd like to see them get more into software. It was never going to happen, but I would have loved to see them spin in VMware and become the Oracle of infrastructure, kind of like Broadcom is doing now: consolidating down and really getting focused. I think that would have been an unbelievable strategic move by Dell, and it would have propped up their margins. But of course, they just had the opportunity for such wealth creation. We had IBM on theCUBE — but hold on, so, okay, that's cool. I have been saying for years that Dell, despite being a low-margin company, was absurdly trading at like 33 cents on the revenue dollar, and I've always said they should be at least 1x. Frankly, I think they should be higher than that in terms of just the simple revenue multiple.
Well, because of their supply chain and their dividend, they throw off so much cash that they're returning to shareholders. And now, with the AI wave and this Dell AI Factory, Dell is basically very close to trading at 1x revenue, which is where I've always felt they should at least be. You see the momentum when you go to their booth — you saw it at MWC, which is a longer-term play for them where the payback is going to take a while. But the AI play is real, and it's now. You saw it at their booth; you can just feel the momentum. And the street loves the stock — it's cranking. Like I said, Michael Dell is just remarkable: his timing and his Midas touch. Who wins with NVIDIA? Dell's a great example. I think HPE can win with NVIDIA. IBM can be a great winner. I mean, look at IBM and the way they're poised — they came on theCUBE; talk about that. They have all the pieces there. They have watsonx, which is essentially a reboot of Watson — and what better time to recycle Watson than right now? Because they can take all that work they've done, leverage modern AI techniques with generative AI, and bring the benefits of the old Watson into the new watsonx. They've got a beautiful storage and server portfolio, the workstations, the professional services. So IBM could really be the AI factory company for the enterprise. I think Dell and IBM are world-class companies, and HPE too — world-class suppliers of IT solutions over decades, with great built-in channels of distribution, partner networks, ecosystems. All of them have ecosystems. So I look at those three on the track right there and I go, wow, those guys could transform themselves. They missed the hyperscale wave — they all tried to be AWS, okay? VMware is now part of Broadcom; that's a separate animal.
We'll talk about that separately, because that software business has a huge opportunity. Oracle kind of did it in their niche. Yeah — I think Oracle's kind of above that, not just a supplier; they're more cloud. They have a cloud opportunity — that's what I'm saying, their OCI stuff. Your comment about AWS — I was just responding to that. Oh yeah, yeah. So they're the fourth cloud, fifth cloud, whatever you want to call it; I'd say they have the opportunity, but they're still a supplier to IT with the database and everything else. But those other companies that missed the wave to build hyperscale clouds — backing off was a smart move, because they couldn't win. Now they could get into the game with a DGX-like approach. NVIDIA, being such a great system, could essentially be the ingredient of the AI factory, meaning they can enable their partners — Dell, HPE, IBM, and others — to be the AI factory suppliers. And companies like VAST, DDN, WEKA — specialized storage, great object stores, great scale — become key subsystem components in the AI factory. And then you look at the other players, NetApp and Pure: they could miss it, right? So I'm watching those two companies specifically right now, and I'm asking myself, are NetApp and Pure in particular positioned for AI? Now, Pure's stock is up, so that's an indicator, but are their products set up for AI? That's a question we'll have to ask those guys — that's a follow-up I want to do with Pure and NetApp. I think they'll survive. Again, are they an ingredient of the AI factory, or an integral part of a subsystem? That's going to be the question with these AI factories, and we'll unpack it in other pods for sure. I want to come back to IBM.
IBM is back. There's no question in my mind that what Arvind has done is refocus the company on, I think, some of its original roots, which are product and innovation. For so many years — Gerstner made the decision not to split the company in two, and then they bought PwC Consulting, and it became a situation where the consulting services were the tail wagging the IBM dog. Under Palmisano and Ginni, they were really services-led, in my opinion. And by the way, during that time it was almost as though complexity was a friend of IBM, because they had really great services capabilities, like Accenture, and they had product, so they could put the two together — that was their strategy to make money. But it didn't work well from the standpoint of IBM's position in the industry. What Arvind has done is focus the company on hybrid cloud and AI, leveraging consulting very effectively. During those services-led years, the R&D that IBM did — IBM Research, which is an American gem — never turned into product, and that was always my big criticism of IBM. But now they are very clearly focused on that. Dario Gil, who runs IBM Research, and Rob Thomas, who we know well — he's got software chops, M&A chops, and go-to-market chops — are very, very focused on getting the investments made in R&D and IBM Research to market in the form of leading products. When was the last time IBM had a product you could say was number one? Sure, they're up in the magic quadrants and so forth, but now they have an opportunity to really be an AI leader.
They learned a lot with the original Watson. They made a lot of mistakes; they tried to put it in places where it didn't belong. But now they have really legitimate, deep AI learnings and AI capabilities across the board, from analytics to governance to language models to silicon. And we'll see what happens with quantum — Jensen made some interesting comments about quantum, but irrespective of that, IBM, in my opinion, is really back and has strong days ahead. Well, one of the things we're going to chronicle is the winners and losers of the AI factory model — the AI chips, the AI infrastructure, the AI software stack. It's interesting how NVIDIA is using the word "super": SuperPOD, superchips, and what was the other one we heard — super switch. So supercloud has traction; I see that sticking. You're starting to see this all converge, and I think this is why I like Broadcom so much right now on the semi side. I think VMware is going to have to catch up to Broadcom a little bit on this — their AI is a bit more private AI, a bit of a holding pattern, in my opinion. Broadcom gets that the biggest problem facing the industry — and the biggest opportunity — will be solving the infrastructure from the chip to the app for what I call the distributed computing problem set. The distributed computing problem set is this: distributed computing is well-known computer science theory, which they're very familiar with. If you're in computer science or have been an engineer, you know distributed computing is a multi-decade computer science problem. AI has to run on a distributed computing architecture, and what's being done is a refactoring of the chips and systems to support the most robust, scalable, highest-performance distributed computing platform that's global and open. It's not one mainframe. It's not one company.
Everything has to run in a distributed way, and that's going to be an amazing opportunity — it's going to be new, something magical. You'll see distributed computing become the fabric of life. That means everything will have to be re-engineered for distributed computing, and I think that's what I heard from Broadcom. If you connect the dots between the Broadcom meeting, NVIDIA, and the work we've been doing with clustered systems and the research at theCUBE Research, everything revolves around the distributed computing problem and the opportunity it yields. Private cloud, public cloud, distributed edge, mobile, cars, robots, AI, intelligence, data access — all of it has to be refactored for distributed computing. The chips come out first, then the architectures for the clusters, then the abstraction layer for software enablement, and then the invention and creativity happen. That's where generative AI hits, and I think Jensen nailed it by saying generative AI is a new category. Life before generative AI was prerecorded. Everything in life was prerecorded — how we get our news, how we get our opinions, everything was set up for us: databases, query, response, prerecorded, get an answer. So I think generative is a more robust environment. The distributed computing paradigm is going to be not a one-company thing but an industry thing. Servers move to clusters, clusters move to systems, global systems, and everything goes on from there. So that's the exciting moment we're in. One of the things I wanted to mention before we break is Reddit. Reddit IPO'd today. They're going out at $34, and they're saying it's going to open about 30% or 35% above that. What was their valuation before they went public? I don't know, but I think they'll end up at like a $5 or $6 billion valuation. They priced their IPO at $34 a share. There you go.
Okay — $519 million raised at nearly a $6.5 billion valuation ahead of the market debut. That was the headline. And they're allocating, I think, 7% or 8% of the shares for retail, for their Reddit army, which is kind of interesting, because you go on Reddit and they're like, that is a shit stock. Their own people are trashing them — but that's what Reddit's all about. Reddit's amazing. I love Reddit. Reddit's everywhere, right? Reddit has something for everybody. You've got marital problems? Go to Reddit. Looking for a job? Go to Reddit. Looking for advice on hair? Go to Reddit. It's just amazing. And they're small in the grand scheme of things, but they're not extracting your data and monetizing it — that's what I really like about Reddit. Hopefully this sets off a little IPO action, John. We could see Databricks potentially, maybe Arctic Wolf going out — will the window be open? A couple of companies we've been following. And then the other news this week was Intel secured an $8.5 billion grant from the government, from the CHIPS Act, and I think another $11 billion in secured loans — so it was like $20 billion. So Intel got a bailout. Is that a bailout? Yeah, Intel's getting a bailout. Pat Gelsinger thanked everybody except the US taxpayer, and I understand why, because it's controversial to say it, but it's true: the US taxpayer is bailing out Intel and covering for its mistakes over the past 10 years. But I'm for it. I'm happy to take my little piece of the tax money and put it there. I think the question is, John — the question is, can they win? Well, that's the big question. Can Intel win? Their goal is to be number two in foundry by 2030. And the whole concept here — you listen to the Commerce Secretary and President Biden and others — is to bring manufacturing back to the US.
And that's a tall order. So if your objective is to do that for national security reasons and supply chain control, you could make an argument, John — irrespective of the fact that Intel is a US-based company, and I would much rather see US-based companies get my money — that you'd have a higher probability of success giving that money to TSMC and Samsung, because they're way ahead of Intel. But that, again, is controversial, right? We have a lot of government intervention going on here. You've got Intel getting fed money by the government; some say they don't deserve it. I'm questioning: is that a good use of taxpayer money? Can they even win? The way they design and ship chips — is that going to work when you see what Broadcom's doing and how fast they're going? I'm just thinking maybe it's time for Intel to rethink that. Who knows — we've got to get more data. And that $20 billion, by the way, maybe gets you two-thirds of a factory. So the government is making decisions on who the winners will be by subsidizing Intel. It's a subsidy, a bailout, whatever you want to call it — it's money from the government. Now the government's involved, and somebody there made that decision. Good job by Pat: get the cash. The other issue going on now is Apple got sued by the DOJ, alleging it's a monopoly and blocking competitors from accessing iPhone features. So Lina Khan's like 0-for-5 on that front — but this is the DOJ, this is the Department of Justice. Yeah, okay, this is not Lina Khan. Well, that's bigger; that'll have more teeth then. The department said Apple leveraged its dominance to block software and limit the functionality of competing devices. I think they do, John. I actually think Apple does.
I'm actually kind of in favor of the DOJ getting Apple in a little bit of a headlock and saying, come on. Well, do we want the government to basically have their hand in designing technology? Apple's products are great, so get the hell out of the way. You know how I feel about this: market forces have always been much more successful at moderating monopolies than the government. Having said that, if a company is breaking the law and violating antitrust rules, it should be investigated. Now, the problem with Lina Khan is she's trying to rewrite those laws. But if the DOJ has determined that Apple violated the law, then they should investigate — I'm all for that. But let's not drag it out for 10 or 12 years like they did with Microsoft and IBM; that's just a waste of money and a waste of time. I mean, I see what Apple's doing from a competitive strategy standpoint. They have monopolistic scale — and first of all, I'd say the same thing about Amazon, AWS, Google, and others. When you have that kind of scale, you've got leverage. But at the end of the day, they live in a highly competitive market. They're making a boatload of bank, and they deserve it, because they made a great product and they have a good model. It's not like it's not competitive; there's plenty of choice. But it's the App Store, right? The App Store and the iPhone — that's where the leverage is. I mean, you can get an iPhone, or you can get a smartphone from Samsung — they make great smartphones. Yeah, and you can switch plans, use Android. So it's not like it's not a competitive market. You know, one thing Apple does really well, and we take it for granted, is privacy protection. I love Apple's biometric stuff — the facial recognition for getting into passwords.
Sometimes it's almost too good. I forget my password — I'm on another machine and I'm like, oh shit, what's my password? I can't remember it. There are so many Apple passwords. I'm using the fingerprint biometrics on my Mac, the face login on the phone. I love Apple too. I mean, come on, it's just a superior product on security. Now, okay, so I can't use the Epic Games feature or whatever — I just think it's a little bit over the top. I get nervous when government gets involved. All right, I've got to go. Well, that's a wrap. Dave's got to catch a plane — the Uber's here. All right, Dave, have a good trip back to Boston. Great week. And shout out to the folks in Paris for KubeCon — Rob Strechay, Savannah Peterson, and Dustin Kirkland — and our community. Wish I could have been there this year; I'll see you in North America. I love that show — it's one of my favorites of the CNCF's. I think KubeCon is going to morph into CloudNativeCon. Open source is booming. Open source AI models are booming. Cloud AI, AI systems, AI factories — episode 52 is going to be insane. I can't wait for next week. Dave, good to see you. Thanks, John. Thanks for watching. Thanks, guys.