Hello, everyone. Welcome to theCUBE Pod. I'm John Furrier with Dave Vellante, our weekly podcast where we riff and talk about the hottest stories in technology, Silicon Valley, enterprise, and emerging tech. Tons of stuff going on, Dave, and we break it down. We got a lot of comments last week on the whole Israeli position. I thought we set it straight, up and to the point, and a lot of people have come around to that position now. But this has been a very busy week in the technology scene. Obviously, it's been 10 days since the Israeli war started, and the fallout from our last pod was interesting. A lot of support. I think we made the right call in terms of how to position it as a human conflict. There's not a lot of solutions being thrown around, a lot of people bitching and moaning, and it's just this terrible situation. So it continues to go on. Our hearts go out to all the folks there, and a lot of people are scared. People we work with are sending emails; they're not in a good place. Let's hope they can get to a conclusion and stabilize the situation. It's hard to see an easy solution. I listened to the All-In pod guys after our pod last weekend, and they were like, yeah, a two-state solution is the only answer. It's like, a two-state solution is very difficult now. I mean, there was a time, perhaps, when that would have been easier, but it's not so simple as, oh yeah, let's just do a two-state solution. So much has to happen before even that can be contemplated, and Israel's going in, basically hand-to-hand combat. People are scared. Can you imagine living there? It's like, I've got to do a Zoom call and have a normal life and have a family, and you're wondering if you're going to be defending yourself while a war is going on. It's just an incredible time, and fingers crossed. The best we could do is share our position, and this whole two-sides thing, look, I totally call bullshit on people trying to play it down the middle here.
There's only one side. It's anti-terrorism, and the fact of the matter is, whether you're Palestinian or Israeli, it doesn't matter. What happened in Israel was a terrorist attack, period, full stop. That's the bottom line, and anyone who's trying to justify it and push this you've-got-to-choose framing is ridiculous. So that's kind of my take on it, and I really don't want to get into it. I'm not an expert in politics, but we'll do our best at SiliconANGLE to help where we can. Talk to Israeli companies and startups, help them if they need funding or amplification. A lot of companies are in rounds of funding and their staffs are being called away to war. I mean, imagine if you were an Israeli company today. Your staff is either getting called into service or they're defending their lives. It's just an incredible time, and I've never experienced anything like it personally, except when our country got attacked on 9/11 by terrorists. From what I felt then, I can only imagine what's going on there. So a lot of our clients were called up, and I think this is very much like 9/11. I thought Biden made some good comments when he said, look, we learned a lot, we made a lot of mistakes after 9/11. It's a different situation, obviously, completely different, but there are similarities. Terrorism is terrorism, and it's absolutely unacceptable and requires a response, because if you don't respond, they just escalate. That's so true. The flip side of that is there's just no easy answer. Look, we never should have gone into Iraq, but even if we hadn't gone into Iraq, we might have been mired in Afghanistan for a long time. Who knows? Maybe we could have gotten Osama earlier. But it's just like George Bush the elder said: if you don't have an exit strategy, it's a scary thing when you go in. Well, what's the exit strategy? Is it wipe out Hamas, and what does that mean? There's no easy solution.
It's just terrible; our hearts go out to people. And interesting, on the tech side, I've never seen anything like this before in my life. One of the famous event organizers, the entrepreneur who started Web Summit years ago, Paddy Cosgrove, got slammed hardcore for making some, I guess, side-choosing or insensitive comments early on. He tried to walk it back and he just imploded; he didn't read the room at all. And apparently Web Summit has an event in the Middle East and he's been paid a lot of money, like $30 million or more, to hold an event there, where everyone goes for cash from the VCs, the blood money, if you will. So a lot of VCs are saying, I will not support you, I'm canceling my trip. Israeli AI startup founders are saying, hey, I'm not coming. In fact, our keynote speaker from an AI startup at SuperCloud basically said he's not going, because Cosgrove is beholden to the Middle East money; they look at it as payola. Massive boycott, almost blacklisted, Dave. That's how fast that went down. So, you know, there was major drama there. It was inside baseball, where you had VCs like First Round Capital and entrepreneurs all over the world calling it out, because the comments were insensitive. The fact is, that was a killing, that was a terrorist attack, and he was trying to play the old classic two-sides narrative when the reality was not the same. So the Israelis kind of freaked out, and it was just terrible. Again, he stepped on himself. But we've got to move on with our lives. We're going to do our best, again, for the folks listening. Our hearts are out there. I even floated the idea of helping start a VC fund for these startups, Dave.
You know, anything we can do with our SiliconANGLE platform. Any startup that needs help, we'll do some PSAs. Let us know, we'll do whatever we can to help get the humanization back and the terrorism wiped off the face of the earth. Okay, so big news going on this week. Broadcom-VMware is starting to come to a finality. The high-level merger is happening, the Broadcom takeover. You've got the US-China AI chip export ban potentially coming; NVIDIA and Intel, two huge names, kind of positioning themselves. We wrote this up on SiliconANGLE: controls on exports to China. Okay, misinformation and war. Big story in the New York Times, a headline around the whole hospital bombing that didn't happen. It was a rocket that was launched on site; they claimed it was an Israeli hit. Sparked huge misinformation. What's the role of media? Of course, we know tech, we know media, so we have an opinion; that's going to be my rant for today. These Hamas terrorists use hospitals as human shields. So what happened was a missile blew up, they blamed it on the Israelis, the New York Times ran a headline to that effect calling it a strike, then they walked it back and said a blast, and then they watered it down completely. So we'll talk about that in one of the segments: misinformation, the role of media, and how in times of crisis like a war the media's impact can trigger biases and whatnot. It's a huge conversation. We've got SuperCloud next week. AI models are looking good. The VC market's changing: Carta reported that since January, 543 startups have shut down so far. Cybersecurity action is up. And the big earnings are coming out next week from the big cloud guys. So tech innovation is here. You did a big report on the sixth data platform, which was killer. And we've got the big generative AI event here in Palo Alto. So we've got a lot to talk about. Welcome to pod 34. So, your Broadcom comments.
So it looks like China's trying to hold it up, right? That's the big news, that China might try to scuttle the deal at the last minute. So there we go, right? More delays. I mean, you remember when Dell acquired EMC, China required, if I recall, like a five-year separation between Dell sales and marketing. And they had other restrictions as well. I don't remember the specifics, but they threw in these sort of last-minute, no-merger-for-you-unless-you-do-this conditions. So there may be something similar with Broadcom, or maybe it's just retaliatory because of the no-chips-for-you act. Yeah, yeah. And so... I think it's a complete paper tiger, so to speak. You know, this is bullshit. I don't think it's gonna happen. I mean, the narrative, look, I've got some sources inside the company. Here's what I'm hearing. First of all, Broadcom, according to sources, and this has been publicly reported, either on a Slack channel or whatever, so it's a number that's out there, had an all-hands meeting last Monday. Hock Tan said that they're gonna lay off 800 marketers. Okay, 800 are gonna be let go. And they're gonna focus on their core SDDC and look for synergies on the upside. Well, this is kind of what I'm reporting: if you look and count the number of marketers at VMware, they don't have that many. Okay, so I then did a follow-up with some of my sources inside the company, and basically what you've got here is that Broadcom is not looking at this as a takeover of VMware. Hock Tan is saying that he's gonna integrate VMware to reshape Broadcom. So it's basically a high-level merger. I won't say a merger of equals, because Broadcom is clearly in charge, but Broadcom is clearly now signaling publicly that VMware will reshape what the new Broadcom will look like.
So, I think we nailed it at VMware Explore when the narrative in our summary was: it's the closing of the chapter of VMware and the opening of the new chapter of Broadcom plus VMware, where Broadcom never really nailed it on software. They had CA, they got Symantec, but with VMware, and you pointed this out in one of your Breaking Analysis pieces, this is the crown jewel of software. So, the word coming out of the VMware community, and what some of the insiders at Broadcom are telling me, is that our narrative of chips to apps is absolutely the key. And if you look at all the AI conversations we're gonna have next week, and this came out of our previews leading up to it, all the action on inference and training is at the chip level, and the cost of what it takes to do the inferences. As you look at the small language models emerging, the chips are gonna be more important. We pointed this out early on when the deal first happened; you and I were like, there's a good theory that Broadcom's actually thinking chips with VMware, or maybe they just want the cash flow. And I think we might be right. And you brought up EMC. If you remember, Dave, when VMware was bought by EMC, you said it was the steal of the century, the best deal ever. Remember that? You were very vocal on that. Oh my God, it was like $630 million for VMware. It was a song. Joe Tucci was a genius, but think about EMC at that time. What happened to the company? They had storage, which was good, not great, and declining, and new markets were emerging. They bought VMware to help them sell and refactor storage. Broadcom is buying VMware, in my opinion, and I think this is the story that's gonna come out of this, to help them sell more chips for the machine learning and AI workloads. Absolutely, and the security side. So you've got security, machine learning, and AI. VMware will absolutely be the Broadcom stalking horse to establish their chips for those workloads.
And look at the trends: all of it's on-premises and in the cloud. So even if Amazon and Azure stock up on chips and silicon, VMware and Broadcom could own the enterprise on-premises and the edge. I know it's very nuanced, but this is interesting. The new Broadcom is a thing, and it's not the old Broadcom, or them just swallowing up VMware. If the numbers of 800 layoffs are true, that means Broadcom people are getting laid off too. And I hear rumors there's a new CMO gonna be announced for VMware, not someone from VMware or someone from Broadcom, someone net new. So I think Broadcom and Hock Tan look at this as a new entity, a NewCo version of Broadcom. So I think the China thing will be just a sideshow. Well, we'll see. China's, like I said, probably just retaliating for the lack of high-end chips and all the tensions. But think about it this way: after the VMware acquisition, about 50% of Broadcom's revenue is gonna be software. I mean, this is a chips company. Of course they bought CA, but yeah, half the business will be software. Now, to your other point about that sort of hardware and software integration, I guess is what you're alluding to, that would be a huge pivot for Broadcom, because generally it runs itself as a collection of whatever, 17 or 20 or 23 different business divisions. It tends not to allow one division to draft off the success of another division and rely on that division's performance. They tend to be very independent in terms of being able to hit their targets. But having said that, there are some really good examples of hardware and software integration over the years. Look at Apple. Look at Larry Ellison with Oracle after they bought Sun and did engineered systems. Bye-bye, HP-UX. Remember, Oracle and HP were the gold standard, and then once Oracle bought Sun, forget it, it was all engineered systems. And by the way, they absolutely destroyed everybody on the planet.
I was talking to someone, and this was a long, long time ago so they could spill the beans, but it was a story of a company that was competing against Oracle. Oracle came in with Exadata to a long-time customer of this company, a loyal customer of years and years, and they sat down with the executive and said, hey, good to see you, we're dumping you. And the vendor said, well, wait a minute, we've had 15 years together, you gotta give us one shot. He goes, okay, we'll give you a shot. And they gave him a shot, and Oracle just smoked him because of the engineered system and Exadata. So that's an example of hardware and software coming together. And then Tesla is another example, the software-defined car. So there are examples, and so maybe Broadcom's thinking we can bring the hardware and software together. That would be a major pivot for the company. It's absolutely the case. I think you just pointed out the key to this. And I remember you and I got, I won't say laughed at, but people were poo-pooing my conspiracy theory that Broadcom actually had a master plan here. If you look at what's happening with the enterprise, we just put a post on SiliconANGLE. Paul Gillin wrote a great article on on-premises action in AI. If you go look at SiliconANGLE right now, the article is still up there, featured, I believe. No, that's the David Strom one. Paul Gillin, he's in one of the top posts here. Let me pull it up. Yeah, Paul Gillin does some great work. He basically said all the action is going on-premises for machine learning and AI. So if you look at that, and that's true, if you look at all the stuff we've been digging in on the AI side, this is a generational shift for the entrepreneur. Wait, wait, wait. What's he saying? Because I don't know if I buy that, John. I see it mixed, kind of mixed 50-50. I mean, all the action, frankly, is in the cloud right now, but... He's specifically talking about generative AI, right?
So let me pull this up. That's a good point. Yeah, but look, the cloud has already won. Dude, we've been shouting from the mountaintop: developers won in the cloud, next-gen superclouds are happening, and we can still do more of that. The developer side of it is all next generation. What's interesting is a new generation of IT coming around the corner, which is what Gillin's pointing out. And I think this is where VMware and Broadcom might have a genius situation here: on-premises, it's cheaper to do a lot of this stuff with your own hardware. This could be a boon for, say, Dell Technologies. I'm actually chatting with Varun over there now about this, because I want to get some more stories out there. This could be a complete reset for Dell, another generational shift, because the old generation of PCs and servers got decimated by the cloud. Okay: "AI model training rekindles interest in on-premises infrastructure." Yeah, for sure, because people don't want their IP leaking into the cloud. That's number one. Number two is the data's on-prem, or not all the data, but there is data on-prem. So bring AI to the data. And by the way, the survey data shows that it's about 50-50 where that AI work is going to occur. But having said all that, all the good shit's in the cloud, right? Well, define good shit. I mean, right now, I'm talking about LLMs: Bedrock just went GA, Vertex AI, OpenAI. Not really, Dave, not really. I mean, the power law says, yeah, the large language models are in there, but if you look at the uptake from a development standpoint, people are getting killed on costs, right? David Strom just wrote a story on how companies are scrambling to keep control of their private data from AI models. If you look at Amazon's top message for Bedrock, it's all about VPCs and managing all the data.
So what's happening is that, yeah, the large language models are on the cloud, but by the way, Amazon has Anthropic, and that's not theirs; they have Titan. OpenAI is not really the cloud's either. That's not Azure, that's OpenAI; Azure has a piece of it. So there's not a lot of good shit in the cloud right now. Okay, wait, wait. So let me be clear. If you're OpenAI and you have access to GPUs, you can stick them in your data center in Ohio, and they're doing that and they're kicking ass, and there's a lot of action there building and training very large language models on-prem. No question, I agree. But if you are a financial institution or a manufacturer or a healthcare company, you're still doing... You definitely want that on-premises. You do, but today they're mostly doing experimentation in the cloud: summarizing text, helping write code, the basic stuff that we're doing with ChatGPT, because it's early days in order to get the infrastructure, the hardware and the software, the LLM capabilities and the GPUs on-prem to actually do what you want to do with that existing data. So right now there's a lot of experimentation going on, but... Actually, I don't think you're accurate on that. Hold on, to your point, you're blending two things. To your point about the power law, people definitely want to do stuff on-prem, but the data suggests it's not happening in a broad-based way yet, other than typical ChatGPT types of work. It's happening in a major way. So here's what's happening. The data that you're looking at is survey data from enterprise buyers, so that's measuring production workloads. Unless you have other data that I don't have, you don't see some of the data that I'm getting, which is more developer trends: everyone's doing stuff with AI, and for some of them, the data is small.
So if you're talking about massive petabytes, you're not going to be doing that, whether it's retrieval technology, RAG they call it, or vectorizing with embeddings, or doing kind of indexing. At best you can do what they call an AI wrapper around GPT. That's not real AI data work. That's basically: I have content, I'm going to throw it at an LLM like OpenAI's and put that as a wrapper around my data and make it do things. That's called an AI wrapper. That's one use case. The other use case that's emerging aggressively, in experimentation that's almost going to production, is the idea of AI native. AI native is when you actually take your data and create embeddings, and you've got tokens, you've got a context window. When you look at that, I just read a paper from Databricks this week: this stuff can be done on a Dell server on your desktop. Okay, and by the way, if you put it in the cloud, why pay the cloud the money when you can do it on-premises? So there is a model saying you can take some of the large language models, which are going to be small, walled-garden, distributed data sets, and still apply the technology to them. It's not just GPUs, it's inferencing too, which is compute. So it's complicated, but I guess what I'd say is that on-premises is a safer bet, because everyone who's in the cloud is reporting that their inference costs are getting killed, especially as their models get bigger. So dumping a knowledge base that's got petabytes into the cloud, you're looking at a major bill. And that's the blocker right now. So I agree, but still, most of the work is training, and that's happening in the cloud. And I agree with you, we've written about this over time: the dominant workload is going to be AI inferencing. And I think it's going to be at the edge, by the way.
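To make that wrapper pattern concrete, here's a minimal sketch: embed your documents, retrieve the closest ones to the question, and stuff them into the prompt's context window. Everything here is a toy stand-in, the bag-of-words "embedding" and sample documents are placeholders (real wrappers use learned dense-vector embeddings and an actual hosted-LLM API call, which is omitted):

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; real systems use learned dense vectors.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank the private documents by similarity to the query, keep the top k.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # The "wrapper": retrieved private data goes into the context window of
    # a prompt that would then be sent to a hosted LLM (call not shown).
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "VMware virtualizes servers in the data center.",
    "Broadcom designs networking chips and custom silicon.",
    "Espresso is brewed under high pressure.",
]
print(build_prompt("Who designs silicon chips?", docs))
```

The point of the sketch is that the wrapper never trains anything; the private data only ever rides along in the prompt, which is exactly why it stays cheap, and exactly why the "AI native" approach of building embeddings over your own data is a different animal.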
And I think it's going to happen on very attractive performance-per-watt systems, which are going to be dominated by ARM. But that's very diffuse right now. When you say all the work's being done in the cloud, do you mean AWS and Azure, or is OpenAI its own cloud? You're bringing up a good point. So there's a lot of activity going on, a lot of experimentation by everybody, and most of that's happening in the cloud today. There's also big training going on; big models are being trained, like OpenAI's, and they're not necessarily doing that in the cloud. I agree with you. But I also agree the real opportunity is AI inferencing at the edge, and I think that's going to largely be done on ARM-based processors on chip. My point on the whole Broadcom thing, circling back, we went on a little tangent there on AI, which, you know, there are different perspectives on depending on how you look at it. If you're looking at it from your perspective, from buyer data, or from developer data, you're going to see it differently. I'm just saying Dell and HPE right now are not making a lot of money on gen AI servers. Because... And they will. Because no one's calling them saying they don't have a solution. Right, that's what I'm saying. If they're smart, they will. And as a result, most of the action right now is in the cloud, because you can go to the cloud and you can play with this shit. Or you get a bunch of GPUs and you put them on-prem, but most people can't do that today. That's all I'm saying. Okay, I appreciate that. What I'm trying to circle back to is Broadcom's bet on the enterprise with VMware. When they announced the intention to buy VMware, the whole point was: let's get on-premises, let's get the data center. That data center paradigm is perfect for AI and machine learning on-premises.
Okay, and as we always said, the data center is a fat edge. But the point is that this could be a genius strike by Broadcom, because they're not going to be able to compete at scale with AWS; they'll probably be a supplier to them at some level. But what they can own is the chips and the software on the enterprise side of AI. Because if you believe the power law that we put out, which you do because we put it out together, then there'll be a smaller set of language models or foundation models that are going to be, I guess, walled gardens of data. And it's proven in AI today, in today's state of the art, that the higher the quality of the data, the better the AI. The big proprietary large language models, OpenAI and Cohere and Anthropic and Stability AI and all those guys, are trained on everything, and that's where the hallucination comes in; the data is not as clean as it could be for, say, a specific domain. So the vertical nature of the data implies that there are going to be tons of data sets that have to integrate together. And so this is the opportunity you brought up with the edge: inferencing at the edge, but it also has to include models, because the whole move-the-compute-to-the-data concept was pre generative AI. So now, if you add generative AI, you have to not only move compute to the data but have compute at the data, right? And so there's so much that could be innovated with the chipset. So again, EMC bought VMware to make storage better. I think Broadcom bought VMware to make the chips better for machine learning and AI workloads and security. So it could look like a genius move, if the data coming in from the open source world as well as these developers shows that you can train a model and do inference on small compute and GPUs while leveraging the cloud's cost structure for training.
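The "inference costs are getting killed in the cloud" argument is really a breakeven calculation: at steady, sustained inference volume, cumulative rental eventually crosses the cost of buying the box plus its electricity. Here's a back-of-envelope sketch; every number in it is an illustrative placeholder, not a real quote:

```python
def breakeven_hours(server_price: float,
                    cloud_rate_per_hour: float,
                    power_kw: float,
                    kwh_rate: float) -> float:
    # Hours of steady inference at which cumulative cloud rental spend
    # equals buying the server outright plus the electricity to run it.
    net_saving_per_hour = cloud_rate_per_hour - power_kw * kwh_rate
    return server_price / net_saving_per_hour

# Hypothetical numbers: a $250k 8-GPU server vs. renting equivalent
# capacity at $16/hour, with a 6 kW draw billed at $0.12/kWh.
hours = breakeven_hours(250_000, 16.0, 6.0, 0.12)
print(f"breakeven at ~{hours / 8760:.1f} years of 24/7 inference")
```

Under those assumed numbers the crossover lands around two years of around-the-clock use, which is the shape of the argument: bursty experimentation favors the cloud, while a petabyte knowledge base serving inference all day favors owning the iron.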
So in other words, rather than training on my own data from scratch, I'll just go to the big models who already have the training done. So it's going to be a completely different paradigm around how people do software. And that's what's going to come out of SuperCloud next week. I think we're going to end up validating a lot of the Broadcom moves as well as the things you brought up around AI, because everyone's pointing to the fact that the economic model of AI has to improve, otherwise it won't get to production. And the production blocker is the combination of the right capabilities and costs. So what's the size of the model? Should it be big, should it be small, should it be a collection of small models? That's what's going to come out, Dave. I think at the end of the day, it's the silicon and the apps, powered by data. Yeah, so there are a lot of examples we can point to today of AI inferencing happening at the edge. Every time you do face recognition, right? What Tesla's doing in its cars with its neural processing unit and its ARM-based system is an example, and there are others. For SuperCloud 4, we have the CEO of SiMa.ai coming on. They're doing a system on chip, basically not trying to compete with NVIDIA and GPUs; rather, they're doing robotics, hardware and software integration, something we just talked about a little earlier, where they're basically building these AI machines for robotics and drones. And in factories, I think that's gonna be an absolutely enormous market. The chip content alone in that market today is like $40 billion. That's just the chip content, let alone all the other value-add on top of that. And by the end of the decade, it's gonna be five to 10 times that $40 billion. And again, that's just the chip content. So think about layering in all the other components of the value chain: it's a trillion-dollar market.
And I think the economics of that market are already, I would argue, finding their way back into the enterprise with ARM-based processors, and they're going to keep doing so. And people talk about other alternatives like RISC-V, et cetera, which are all wonderful, but volume wins. Volume is the killer app in semiconductors. I think you're right on. I think the other thing that's coming out is that you're gonna start to see specialized silicon come out where you have use cases that need certain things, right? And so one of the things that came up in the SuperCloud 4 preview, I was talking to the CTO from Box, and he said, it's like coffee: you can either get flavored coffee or you can get the big brands, the Starbucks, Philz, or Peet's. Right. And the other side of the coin is like wine. Name a winery that you like. There are all kinds of wine you can buy; it depends on your taste. So his point was, with AI and chips, the relationship between horsepower, performance, throughput and AI is gonna come down to one of those two directions. So that's the conversation in the Silicon Valley circles: is it wine or is it coffee? Because in wine, can you think of a major brand? It's not as obvious as coffee, where you've got the big brands: Starbucks, Peet's, Dunkin' Donuts, Philz from the West Coast. And then if you want flavored coffee, you can do whatever you want. So AI is gonna have a lot of these specialisms to it. So think about areas that have domain expertise. Why would you wanna pollute data if you have clean data in, say, healthcare or some other vertical? The data in the vertical will be directly related to that domain: very clean, very precise, very organic. Why would you wanna blend that in with other data and dilute it?
Because that won't be more powerful for AI. So I think the chip thing is gonna be a very important conversation. Like, when do you use what? Is ARM gonna win the edge? What are the core chips that are gonna offload, say, compute or GPUs? That's my question. Well, I think there are a lot of wasted cycles going on today in the data center doing things like networking, networking management, storage, storage management. They're being done by general-purpose processors, and those tasks are gonna be done by specialized processors. Again, I think they're gonna be highly efficient processors, many of them ARM-based, embedded inside of these storage arrays or networking systems and dedicated to those specialized tasks. And as well, there are gonna be accelerated computing workloads that are gonna require a very wide spectrum of GPUs and specialized processors. First of all, I think NVIDIA has got an awesome lead. I do think they've built a moat with their architecture and their software architecture, but I do think there will be alternatives. I mean, Intel's not just gonna sit still, and other competitors are gonna come out, and you're gonna have all these startups, and most of them are gonna fail, but still some will make it, and there will be alternatives, especially for that inferencing and some of the low-cost work and some of that specialized work. But I think in general, it's gonna be dominated by a couple of architectures, and it could be somewhat more fragmented than the x86 market was. AMD's gonna have their solutions, Intel will have their solutions, and clearly NVIDIA, but it's still gonna be a handful. It's gonna be an oligopoly in terms of who really dominates in chips. And then of course you're gonna have a lot of specialized, custom-designed chips like you have today with Qualcomm. And I'm really interested to hear what you think about SiMa.ai, a very, I think, forward-thinking company.
And again, who knows who's gonna make it. But there's a lot of VC money going in and a lot of people trying to sort of disarm the monopoly, essentially, that NVIDIA is building. I think there's gonna be interesting action there. So that's a great point. It's come up a lot in my SuperCloud previews, preparing for next week: the game is on. It's definitely a shift, and it's not even comparable to crypto. People are like, oh, crypto is hype. I've debated this on the pod before: crypto, or blockchain, is an infrastructure shift. That distribution is gonna happen. The thing about this AI wave that's coming is that clearly it's gonna be applications. And I was down on this whole AI wrapper thing, just wrapping GPT around data, but I now think that's gonna be a viable category, and I'll tell you why. I saw an analogy on Twitter from Tren Griffin. He's an old-timer, worked with McCaw at McCaw Communications back in the telecom days. He's a telecom guy, knows his telecom, old school. We were having a conversation around how telecom, the internet, and the NSF in 1995 laid out all the plumbing, the connectivity, which created the internet, okay? NSF, 1995. That created the World Wide Web, which is a collection of sites. Websites became the application for the web, right? Search engines helped you find more websites. So look at that dynamic: telecom, the connectivity, empowered the web. Put that together, connectivity, telecom plus the World Wide Web as a fabric, and it created websites. That to me is what AI wrappers are right now. Websites are things that sit on top of the existing infrastructure. That allowed for the internet, and then the picks and shovels came with that. That became the web boom, and the bubble that followed, as everyone knows, was the dot-com bubble. The web would not have happened if the NSF grants hadn't laid down all that fiber. And that fiber made everything happen.
So that enabler made it happen. With AI, our SuperCloud narrative essentially is that the hyperscalers have set the table for large-scale AI because they have more horsepower, they have more compute, they have more ability to do development. So you've got the combination of open source, NVIDIA GPUs, CPUs, power, and cloud enabling the AI piece, hence OpenAI. So now, what's the application of AI? That's kind of where we're at now. And I think this points to our debate about which inflection point AI looks more like, and it validates, at least in my opinion, our thesis together that the web is the most accurate analogy for AI, because the early days of the web had the same kind of clunkiness to it, Dave. Where's the money going to be made? In fact, Jim Clark, who founded Netscape, was quoted in the tweet Griffin wrote. This is what Jim Clark said in 1994, quote: what I recognized after talking to Marc Andreessen, who created the Mosaic browser as a student at the University of Illinois, was that the web was to networks in 1994 what the PC was to computing in 1982. Of course I knew what the internet was, but I hadn't thought about the implications in terms of its growth rate. In other words, it was tiny, Dave. Tiny. So I think a lot of people are going to squint through the analysis and see probably a dead cat bounce relative to the performance of the earnings, but it's still way early on this. And I believe this is a generational shift at the developer level and at the infrastructure level. So super cloud is going to power super apps, which are AI-driven.
And that's why your sixth data platform post is what's interesting to me, and there's all this conversation around that. Think about it. There's a story we've been running about how Parquet and Iceberg are going to change the data warehouse market. We're doing a big report on that. Rob Strechay is doing that, right? So that's that. And then Carta has statistics that say 543 startups have shut down since 2023, okay? The failure of the startups pre-bubble means the transition's happening, because from those ashes, everyone knows, when startups fail, that creates the fertilizer for the next batch. And that's what's going to happen. So the sixth data platform, whatever that is, six, seven, eight, is going to probably create startups, because Databricks and Snowflake can't hold on to the lead. So I think super cloud as a substrate is going to power the infrastructure of AI, so you're going to see the large language models, you're going to see tools come out, picks and shovels. And then, like what the website was for the web, you're going to see these AI apps emerge, and it's going to be very, very fast. So I thought that analogy kind of points to what we're going to squint through on the earnings coming up next week. So again, was the web more important than, say, the mobile platform? Well, the way I look at it is, I see this new AI wave as like the PC wave from a productivity standpoint. It was personal productivity. We all started using PCs, we all got our own PCs, and it created this massive productivity boom. And I think it's like the internet in that everybody's going to be able to take advantage of it. It's going to be one of these rising-tide-lifts-all-ships types of things. Because you remember with the internet, we all thought, oh, well, Yahoo. You mean the web, not the internet? Yeah, I'm using internet and web interchangeably. But yes, the web, absolutely. But of course, it ran on the internet. But telecom.
Yeah, it was telecom infrastructure. I'm just being a historian. No, it's a good clarification. But to the point, you were able to get over-the-top providers. You were able to get startups like eBay and Amazon. You were able to see companies like Dell go from mail order to internet order. So to me, this AI wave is like both: the personal productivity impact of the PC and the transformative industry paradigm shift of the web and the internet. Exactly. And by the way, I totally agree. And that's why I think it's a bigger inflection point than either of them individually, and maybe combined, because it brings the best of both productivity waves. That's why I wanted to bring that up. Because remember, we had that big debate where we were yelling, no, no, you're wrong about the web. And you said internet. Which one, John? I can't remember. It was a debate we had about this one topic. And this is why this telecom thing is an interesting debate, because it's the telecom companies that laid it down. In fact, Tren Griffin worked for Craig McCaw, who built internet services. And internet services were proprietary. If you remember, back in the day, I think it was called NextLink. They had all these telecom services. The telecom companies were like AT&T and the phone companies, and they had laid all this fiber down. They were building basically proprietary services, like terminals, that weren't standard. It was the web, the World Wide Web, that created the HTML and HTTP standards, which created websites. The standardization, combined with the network of the internet, made the web grow. That's your point. So what I realized was that we were both kind of arguing the same thing, from the same position. Telecom was the pipes. And the web was the standards that enabled the websites.
And then from the websites you had search engines, from the websites you had eBay, then you had other native web apps come out. And I think that's what's happening with AI now. You've got AI wrappers, which I call the websites. You're leveraging the best of AI, say ChatGPT or OpenAI, and taking data, like we're doing with vector embeddings, and making it better. And then you start using these embeddings for retrieval. Now you start getting into AI-native type services. More picks and shovels are coming online to help developers. Hence, the functionality is going to increase radically. And that's the same progression the web had. And you add in your PC argument about productivity. You're already starting to see productivity. I saw a developer online today shouting from the Twitter mountaintop saying, oh my god, I can't believe I just did this open source project in like three weekends. And it's a huge success. He was working part-time on the weekends and he hit a home run. Why? Because it was easier. He could do it; productivity's up. So this is a first-generation problem that we've never seen before, and it's an opportunity that's being solved. So I think that's the key point. And again, it's not really a rant. It's more of an awakening, a validation for us, saying that with SuperCloud 4 next week, the generative AI conversations are going to be very broad and targeted around these areas. So I'm pumped looking at some of the early interviews, Dave, from SuperCloud 4. That should be a must-watch, must-listen event. We have so many people lined up for that. Well, and we've got some great people coming in studio, John, live. So as always, looking forward to that. And next week's a big earnings week, right? I mean, it's all the cloud earnings. Yeah, let's get into this. So next week Amazon, Microsoft, and Google are going to report Q3 earnings. Obviously the focus is going to be on the cloud platforms: AWS, Azure, and Google Cloud.
Dave, you've got data on this. We've just had a little bit of a mini debate on what the AI impact will look like. Is it good or bad? I'm saying it's okay that the earnings aren't popping yet, because there's not a lot of production usage. That's going to come out of our SuperCloud 4. But there's a lot of buzz around these companies, especially the forerunners, you know. You're going to bring up the Fitzy thing, but Amazon got hammered, even Jassy got hammered on that. But compute power and GPUs were constrained or costly. Is it a cautionary tale right now? The hype is there. What's the buying cycle? What are you expecting? I'll give you some interesting stats. So I've been covering this since probably 2013, 2014, trying to do quarterly estimates. For the first time ever, in Q1 2023, the sequential revenue of the big four clouds, AWS, Azure, GCP, and Alibaba, did not grow. So for instance, Q1 2022 was 33 billion, then it went to 34 billion. This is sequential quarterly revenue for the big four. Then to 35 billion, then a big jump in Q4 2022 to almost 42 billion. And in Q1 2023, it declined to 40 billion. First time ever that we didn't see a sequential uptick in the quarter. Then it grew modestly, and 41 billion is the forecast for Q3. So I have AWS growth basically flat, up a little bit in Q3. I've got Azure basically flat at 27%. I've got Google in the 26, 27% range. And I've got Alibaba pretty low because they're going through a lot of transitions. But I have growth rates accelerating in Q4. So my expectation is Q4 is really where you start to see measurable impacts of AI for the cloud vendors. I've got AWS popping up to 14%, Azure 28%. That's up from 12 last quarter, right? Correct, right. So I don't think you're going to have a huge incremental impact this quarter, because look, AWS just recently made the Bedrock capabilities and Titan generally available.
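Dave's sequential figures can be sanity-checked with a quick script. This is a minimal sketch using the dollar amounts as quoted on the pod; the Q2 2023 value is my own rough fill between the Q1 decline and the Q3 forecast, not a number quoted here:

```python
# Big-four cloud revenue (AWS, Azure, GCP, Alibaba) in $B, as quoted on the pod.
quarters = {
    "Q1 2022": 33, "Q2 2022": 34, "Q3 2022": 35, "Q4 2022": 42,
    "Q1 2023": 40,   # first-ever sequential decline
    "Q2 2023": 40,   # rough assumption: roughly flat quarter
    "Q3 2023": 41,   # forecast
}
labels = list(quarters)
# Walk consecutive quarters and print the quarter-over-quarter change.
for prev, cur in zip(labels, labels[1:]):
    delta = quarters[cur] - quarters[prev]
    pct = 100 * delta / quarters[prev]
    print(f"{cur}: {quarters[cur]}B ({pct:+.1f}% q/q)")
```

Running it shows exactly the pattern Dave describes: steady upticks through 2022, a big Q4 jump, then the first negative quarter-over-quarter number in Q1 2023.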
So it's not going to hit revenue in a big way, I don't think. And even Microsoft's been conservative on its guidance, as has Google. So we'll see if there's any meaningful, measurable impact in Q3. We'd better see it in Q4, or this AI hype is going to run out of steam. Yeah, well, first of all, that's great analysis. By the way, you were right on your market share. You should publish that more on a chart so I can post it on Twitter and Threads. I will, I'll publish it after they announce next week. I'll update it. And by the way, I always say: this is what I forecast, this is what came in. I was right here, I was wrong there. Yeah, if you listen to this pod, we get the best data out there on this stuff. Other people have thrown in, you know, janky data. But let me just tell you what I think is going to come out of the earnings, and what I would squint through if I were a stock analyst. At the end of the day, if you're a cloud player, you've got to look at the infrastructure. This is an IaaS game, back to square one, Dave. If you're a cloud provider, if you're Amazon, you've got to nail the infrastructure solutions. Everybody's concerned about costs and their data privacy and their data security, because their data is now the IP. Again, you brought this up in the data platform post you just wrote in your Breaking Analysis. It's right on. From my research and from my analysis, everything's pointing to democratization of data. The data world that we knew is over. The data world of 10 years ago, data is the new oil and there's going to be a bunch of refiners, that's over. Everything we assumed about data and databases is over. There's a whole new layer, a whole new level, going on now. Data is going to change. The script will be flipped, the assumptions will be tested, and you're going to start to see that. And this is why I like Rob Strechay's analysis that you're doing with him on this whole Snowflake, Databricks, dbt thing.
This idea that SQL is the language. SQL stands for structured query language. Well, guess what, Dave? What's the hottest thing in AI right now? I'll give you a clue: there are L's in it. Language, large language models. SQL could be the lingua franca of data the way the LLM is for language, and I think you're going to see AI totally take SQL to the next level, and you're going to start to see machines talk to machines at a layer with data. Now, okay, that sounds like fantasy. But if you look at what happened at Databricks' event, they said they're going to support Iceberg and Parquet across everything. If you take that one step further, imagine that some entrepreneur can get in the game, be democratized, and potentially be in the same business as the big players on this open, level playing field. All they've got to do is change their formatting and they're in business with the open data model. So I think open sourcing the data is going to be a big trend, and whichever infrastructure can power this trend, that's the model that's going to win support. So Amazon, Azure, and Google have to nail the best speeds and feeds at the silicon level and at the GPU compute level. And then the second thing in AI is, how do the models all work together? This is your power law, right? And finally, which AI works best with developers, being a copilot for developers writing code. This is going to change how users interact with information, how software is generated, and ultimately what can be reduced from a waste standpoint: the heavy lifting, the toil. That's going to be the action. Whichever company can do that, their numbers will go through the roof, because the table is all set right now. It's all about beachhead position, not about numbers. I'm not judging the cloud providers right now by their earnings numbers on how much they've gotten out of their AI, because if they try to squeeze the monetization too early, Dave, that's the wrong signal.
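John's point about SQL as the lingua franca, with an LLM sitting in front of it, can be sketched in a few lines. This is a toy illustration, not any vendor's product: the `nl_to_sql` function here is a hard-coded stand-in for what would really be a text-to-SQL prompt against a large language model, and the table names are hypothetical.

```python
import sqlite3

def nl_to_sql(question: str) -> str:
    """Stand-in for an LLM call (a real system would prompt a model
    with the schema plus the question). Hard-coded mapping, purely
    for illustration."""
    templates = {
        "how many customers": "SELECT COUNT(*) FROM customers",
        "list customers": "SELECT name FROM customers ORDER BY name",
    }
    for phrase, sql in templates.items():
        if phrase in question.lower():
            return sql
    raise ValueError("no template for: " + question)

# A tiny in-memory database standing in for the enterprise's data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex"), (3, "Initech")])

# Natural language in, SQL out, answer back -- machines talking to machines.
sql = nl_to_sql("How many customers do we have?")
(count,) = conn.execute(sql).fetchone()
print(sql, "->", count)  # SELECT COUNT(*) FROM customers -> 3
```

The interesting part is the shape of the loop, not the stub: the model generates SQL, the database executes it, and the user never writes a query by hand.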
Now, I can see some numbers, obviously, if there's adoption, but I'm not expecting exponential growth, because I don't think it's gettable. I mean, if I'm Azure, if I'm AWS or Google, I'm going to lay down the best hardware on the table, which is performance, and enable the data centers or edges to be highly productive. I think that's where you're right, when we were talking about the edge and everything working together with the cloud. It's a cloud operating model, with AI now as the new tailwind, I would say, for how data is going to be managed. That's to me the squint-through on the data. How do you look at the numbers? And I'm not expecting them to show a lot of AI performance in the earnings. Yeah, I mean, again, I agree with you. I don't think in Q3 you're going to see a lot of, you know, AI in the data. The quick question I have is, are people going to switch clouds and cloud providers, you know, to get to AI? I mean, we could argue that Google has better AI, or you could argue that Microsoft and OpenAI have better AI. I don't know, do they? I don't know, we're using them all, right? Well, I mean, every single one of those big hyperscalers has been hiring machine learning talent. Okay, AI has been around for a while. Every company says, we've been doing AI forever. Okay, well, what they mean is they've been dealing with data and machine learning. Machine learning isn't generative AI. See, when we talk about generative AI, as you pointed out, entropy kicks in, from our friend at IBM. Yeah, Jeff Jonas. He points out it's been around for a while, but the advances around generative AI are what everyone's going crazy about. And that's why I'm excited by the whole vector database discussion, because that's pointing to these new kinds of ways to do embeddings, which is essentially how you handle data sets. And I think, you know, words that were once taboo, like proprietary and walled gardens, are coming back in vogue.
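The embed-and-retrieve pattern that keeps coming up in this conversation can be sketched in pure Python. This is a minimal, assumption-laden toy: a bag-of-words count stands in for a real embedding model, and cosine similarity does the retrieval that a vector database would handle at scale.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding' -- a stand-in for a real
    embedding model, purely for illustration."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by similarity to the query -- the retrieval
    step that feeds context into an AI wrapper."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "GPUs accelerate model training",
    "the web ran on telecom fiber",
    "vector databases store embeddings",
]
print(retrieve("where are embeddings stored", docs))
```

The real systems swap in learned dense vectors and approximate nearest-neighbor indexes, but the shape of the pipeline — embed the corpus, embed the query, rank by similarity, hand the winners to the model — is the same.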
If you look at the top conversations we're having in Silicon Valley and in the AI world right now, it's the word proprietary. They call OpenAI's models the proprietary models, even though they're actually kind of open, because they happen to be owned by them but it's the internet that they've crawled. So if you look at the companies that are doing AI, they're treating their data sets as walled gardens because they don't want to leak them, to have IP leakage. Right, that's the conversation we've been having. Don't let your data leak into the LLMs. Well, guess what they're doing? They're creating walled gardens of their data. That's proprietary intellectual property. So that's actually a good thing. That's a feature, not a bug. Well, but again, this is what I'm getting to. So as I started to say, our engineers are playing around with all these LLMs. You ask them, hey, how's Llama 2? Like, oh, Llama 2 is looking pretty good, but, you know, OpenAI's tools are good, but... And they're just getting hands-on with some of the stuff in Amazon that's gone GA, and they look pretty good. And so it's like, remember when we were coding HTML by hand? Like, hey, how's that new thing? Yeah, and remember how we've tested every translation and NLP tool out there. We've been doing this for years. We know which ones are good, which are bad, what they're good for, what they're bad for, you know, transcription, translation, some are good, some are bad. And so, if I've invested in AWS, let's say 75% of my cloud estate is AWS, or if I'm a Microsoft shop and 50, 60, 70% of my workloads are running on Microsoft because I love their collaboration and their tools or whatever, am I really gonna switch? Am I gonna say, okay, now I'm going to Google because they've got a little bit better AI? I certainly don't see it in the spending numbers yet. You know, they certainly talk a good game.
They're marketing it. You got Fitzy throwing all his FUD at Amazon. But do you really see it? The data shows Amazon losing a little bit of ground in market presence, a little bit. Momentum's still really good; frankly, it's better than Google's, and Microsoft is ubiquitous. So are you really gonna switch clouds and go through all that? There's gotta be a good reason to do it. You're gonna maybe add on some stuff, but I guess my point is this. What's your point? Amazon has built up a massive install base of cloud customers, as has Azure. I just don't see those guys picking up their tent and leaving. I just don't think it's that easy or that attractive to do so. Well, I think that's exactly the right point. The question about the data is, can you move the data? So let's take something that's really in the weeds. Vector embeddings. Vector databases are all the rage. We know that a vector database has embeddings and indexes that are specific to that vendor. So you can't just switch vendors there. But if your data is that small, you can just move your data over and re-index it. So that's easy, unless you consolidate the vector database, because it's gonna become a feature of a database. Even if, even if... No, no, no, but the question is, consolidating means throwing away your old embeddings and creating new ones on the new system. That means moving your data and consolidating it. But if you've got petabytes of data, that's hard to do; it's gonna cost you. So I think the big switching cost is gonna come down to, how big is the data set? What's the cost on the workload for the training and inference dollars? That's why everything's gonna come down to: can people code it? What's easy to code for my developers and easy to use for my users? What's the cost to move data around? And what's the cost of the hardware to use it?
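The switching-cost question — how big is the data set, and what does it cost to re-embed it on the new system — lends itself to a back-of-envelope sketch. Every rate below is an illustrative assumption (rough bytes-per-token and a hypothetical per-million-token embedding price), not any vendor's actual pricing; the point is only how the cost scales with corpus size.

```python
def reindex_cost_usd(corpus_gb: float,
                     tokens_per_gb: float = 250e6,        # assumes ~4 bytes/token of text
                     usd_per_million_tokens: float = 0.10  # hypothetical embedding price
                     ) -> float:
    """Back-of-envelope cost to re-embed a corpus when switching
    vector databases. All rates are illustrative assumptions."""
    tokens = corpus_gb * tokens_per_gb
    return tokens / 1e6 * usd_per_million_tokens

# Small data is a rounding error; petabytes are a budget line item.
for gb in (1, 1_000, 1_000_000):  # 1 GB, 1 TB, 1 PB
    print(f"{gb:>9,} GB -> ${reindex_cost_usd(gb):,.0f}")
```

Under these made-up rates a gigabyte re-embeds for pocket change while a petabyte runs into eight figures, which is exactly the "if your data is small, just move it" versus "petabytes are gonna cost you" divide in the conversation above.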
That's what it's gonna come down to. It's classic IT, Dave. It's like, what's the throughput? What's the time to first token? What's the context window? What's the total cost of ownership? I think we're moving into an era of a generational shift in IT. And that's why I think it's a huge opportunity for the old incumbents like Dell and HPE, who have hardware that can stack up on a rack in a room. Okay, there's your test bed for your programming. And then when it's done, you move the shit to the cloud. Or you build your own, like OpenAI did, or have your own data center. I don't know what the answer is, but it's an opportunity. And I think it's too early to tell on that piece, but from the cloud standpoint, let's see what the war is, right? The cloud wars are here. And I expect AWS to flex hard at re:Invent. We'll see what they got. Yeah, well, like we've said before, they get the last word, right? Because re:Invent's the last conference of the year. And you know it's gonna be good. They're gonna have a good story. I mean, when has re:Invent ever shit the bed? It hasn't. It's always really good. Jassy was always a high point. Adam's got his own style, and he's really good too. Every re:Invent, the messaging is strong, the fire hose of announcements. I mean, it's been a number of years now, and so maybe it gets a little bit old, but it's still really, really good. It's still one of the best conferences out there. So I expect it's gonna be a really strong showing. And I know the Fitzy stuff is just all FUD. He's just a Microsoft fanboy. Well, I don't know if you've got a hard stop, but it's almost... I do, I got a hard stop. I got like a cement hard stop right now. All right. Well, this is over, Dave. We didn't get to our rant section, but we'll get to it next week.
I wanna riff, in the one minute we have left: my rant next week will be around news and information around the war. That's a huge red flag that I think is gonna change how the press is done. And also the whole VC and startup market's changing. So we'll hit that next time. So check out SuperCloud next week. I'll see you out here in Palo Alto for episode 35. Are you gonna hang around, or head back before the next episode? What's your flight? No, I'm there for the week. I'm flying out Thursday. All right, so you're gonna fly back. Maybe we do the podcast Thursday next week. All right, Dave. If you're watching, check it out. Or what can we do on Friday? I'll be back on Friday. I'll be back for Breaking Analysis, and I'll do the podcast Friday. Next week, SuperCloud 4, live in studio in Palo Alto. Check it out. We've got huge content, a massive response from our community, major players stepping up, deep dives. These aren't shallow hot takes; these are deep analyses. We are gonna unpack generative AI with respect to the cloud, this next wave of infrastructure, the picks and shovels that are gonna power the apps, from AI wrappers to totally AI-native to whole new applications emerging. SuperCloud 4 will have that next week. Check it out. We'll see you next week on Pod 35. Dave, we'll see you next week. Thanks, John. See you. Thanks, everybody.