Hello everyone, welcome to SuperCloud Six. I'm John Furrier here in Palo Alto with Dave Vellante and our entire CUBE team, presenting our sixth episode of SuperCloud. The topic is AI Innovators. We're featuring the hottest founders and startups, as well as leading enterprises, who are setting the agenda for the next generation of infrastructure, software, and applications around generative AI and the future of software. Dave, it's great to see you. Thanks for flying out and doing this in person. Again, great lineup. This AI Innovators theme is resonating because who's not an innovator? No one wants to not be an innovator. Of course. I mean, it's always great, John, first of all, to be back in Palo Alto, but this AI thing is real. It's AI everywhere. It's happening. We're talking about how it's powering all the trends, all the markets, and it starts at the bottom of the stack, right? We talked about this in Barcelona on our CUBE Pod with companies like NVIDIA and Broadcom and other semiconductor players, but it's really all the way through that stack. It's an orthogonal slice through it. You know, this is our 14th year doing theCUBE with our team. It's like you get caught up in the wave, right? We are on a growth curve that we've never seen before. We've been chronicling the evolution of big data going back to 2010 during the Hadoop days. Cloud scale comes in, and we're like in the middle of the storm here. And then all of a sudden, this huge growth wave comes in called generative AI. We were talking about next-generation cloud just about a year and a half ago when we started talking about super cloud. But what it's morphed into is really super chips, super applications, super infrastructure, and the thing that's going on now with generative AI is that everybody has realized that this is a new game. 
The game is still the same, but it's being played out under new conditions: new infrastructure, new software abstractions, new kinds of chips and server and component configurations, like we covered with Broadcom and others like NVIDIA. But the developer action is super robust. The Linux Foundation with KubeCon and the CNCF, you've got massive open source development. So all this is going on at the same time. So you have this perfect storm of innovation, and the founders we're going to feature today, and the big companies like Uber, have literally crafted what looks like the next generation of AI systems. And I think that's the theme we're seeing: a whole systems revolution, a systems mindset. And if you look at the trends, we're going to get into it, and I want to get to the data you have, the developers are moving so fast that the infrastructure changes need to catch up. And that's where the power dynamic is. Well, there are so many similarities, we talked about this, between this wave and previous waves, specifically the dot-com wave, forgetting the cloud for a second. Back then you had a lot of hype. Of course you have a lot of hype today. You had a lot of CapEx build-out. The CapEx build-out back then was done by a lot of companies that went out of business, companies like Enron and the like, that took on a lot of debt. Today, it's the hyperscalers that are building this out. But nonetheless, it was a situation where it was everywhere, right? AI is everywhere today. The internet was everywhere. The big difference to me, John, some of the things that you were talking about, is you really can't have good AI without good data. So there's not only an AI transformation going on, there's a data transformation going on, where everybody's trying to put data at the core of their business because it's an enabler for quality AI. 
And then you have this other piece: it's not as much the Wild West, because you have legal and compliance concerns, and they're serious, and the potential reputational damage is much higher than it was back in, say, the Hadoop days, where anybody could do what they wanted and there was not a lot of governance, and then people said, okay, hey, we've got to rein this in. Now they're reining it in from the start. One of the observations that's come out over the past few months, and just recently, is that the growth of a new market like this normally tends to be bottom up. Developers show some things, momentum grows bottom up. But with generative AI, because data's involved, to your point, the crown jewels of the company have to be involved in understanding what generative AI means. What that means is that startups and innovators have to deal with two things: a moving train in terms of the growth trend of generative AI, the new technology, the new infrastructure, and then dealing with the sensitive data in the company. So it's not like throwing an experiment out there and getting a dev team going; you've got to do both. You've got to hit the innovation on the development and infrastructure side and bring the data, the crown jewels of the company, into the fold from the start. That's unique. I haven't seen that move in a long time. I think that's a power dynamic that frankly gives startups a challenge, because you've got to go in and make essentially an enterprise sale to get the data. Because without the data, you can't actually show any value with generative AI. They can't just set up a cluster and say, hey, we're set up for generative AI, now what? You've got to get the data. And I think that's going to be a challenge, and an opportunity for whoever can crack the code on that. And I want to, if I may, set up the macro. We have some data on this from our partner ETR. The first slide that we wanted to show you was the kind of AI-everywhere picture we were talking about. 
What we look at, as you'll see when we bring that up, is these sectors that are affected by AI. The vertical axis here is spending momentum, where people are spending; the horizontal axis is like the size of the market. Think of it that way. But look at ML and AI. It bottomed one month prior to ChatGPT and then escalated. And if you go back in the time series, all these other platforms and sectors were much higher; AI was kind of waning. And then look what's happened. Everything got pushed down below that red line, which is like the magic line. And so the point is that AI is stealing from other budgets, John, okay? And so people are going to have to start showing ROI or else they're going to get some pressure from the CFOs. It's not like the top line is growing. Okay, let's leave that slide up there. I want to just comment. If you look at the red line, container orchestration, container platforms, cloud computing, what stays above it is essentially not impacted negatively by the trend. This points to the data we were just commenting on, that the infrastructure is where the action is right now. And if you look at what will happen next, just connect the dots: container orchestration is essentially cloud native stuff, so put that into a cloud native bucket. Cloud computing will grow and continue to be a power dynamic there. Then as generative AI gets better from a scale standpoint, once people figure it out, everything below that red line will lift up. So you look at robotics, you've got marketing, you've got servers, telephony. We commented about Cisco on our last CUBE Pod, IP telephony rounding out the bottom. So all of that stuff will rise up. And I think that's where the action is going to happen. It's kind of below the line right now because people don't know what to do. They don't yet have visibility on the implementation of the use cases. 
And I think that's going to be the opportunity. Right, so they're experimenting now, and then once they figure it out, that has to be the way it plays out. It's going to be injected into all these other sectors, and that's going to be the rising tide. If you change the label ML/AI at the top there, where it's highlighted, and just call that generative AI, that will rise up, and that will pull everything up. Right on. I think the pull will happen. And again, this is going to be the classic rising tide that floats all boats. Great research. Yeah, so thank you for sharing that. So, okay. And then you've also got this other premise. Think back, John: it was just a little over a year ago we were sitting here talking about this thing. Okay, what is this OpenAI thing? Are they going to be able to maintain their advantage? You said they would be able to; we'll talk about that in a moment. But it has accelerated so fast, certainly faster than anything we've ever seen before. And then we put forth this idea of the power law. That's the other piece of it, and we're starting to see evidence that people are beginning to do things on-prem. Why do you think that is? Let's talk about that. Well, when we first came up with the power law, if you throw the power law up there, I want to just show that. Yeah, that's the second slide there. Yeah, put the second slide up there, the power law. We saw at the beginning, we had predicted, that ChatGPT and OpenAI would set the agenda as a large language model and be the mainstream consumer view of how AI generally plays out from a consumer standpoint. Even with all the hallucination flaws, it was clear that that was going to be a monster opportunity. 
And then what we were seeing in the data was that as you go down the power law, the models get smaller and more specialized in scope. That's where the specialty models come in, where the data is strong. So you had high-quality data in the long tail of the power law that would interact with the existing large models. That's exactly what's played out with RAG, retrieval-augmented generation, which is the hottest thing happening right now. Retrieval is becoming the killer app. And Jensen said on stage last week at Stanford that he sees that same thing, that these specialty models will be where the inference IP will be. The value around inference will come from the specialty models. And that's that long tail there with all these industries, right? That's what you're talking about here. There's a bunch of that being on-prem, right? Yeah, and so what's going to happen is people will have a choice between using a managed service like OpenAI or Anthropic, hosting their own Llama or Mistral, or having their own language foundation model. So the hosting will be the challenge. And right now it's more expensive to host than to just use a managed service, in the short term. So most startups are going to start to experiment with hosting models on what I've been calling clustered systems as the scale gets higher. So you're going to see a huge growth in specialized AI models that are going to be, I won't say proprietary in the sense of not open, but they're going to be intellectual property, as data, for the enterprise or the startup. So data quality will be the new intellectual property. And I think you're going to see people license models through API access, metering it, things like that. So I think you're going to see a surge there. Right now the curve has a big head and almost no neck or torso before the tail. I think you're going to see a fattening up of that mid-range of the model there. 
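The retrieval-augmented generation pattern being described, where a company's high-quality proprietary data interacts with a large language model, can be sketched in a few lines. This is a minimal illustration, not any vendor's API: the keyword-overlap scorer, the sample corpus, and the prompt format are all made-up assumptions standing in for a real retriever and a real model call.

```python
# Minimal RAG sketch: retrieve the most relevant in-house documents,
# then prepend them as context to the prompt sent to a language model.
# (Hypothetical data and scoring; a real system would use embeddings.)

def score(query: str, doc: str) -> int:
    """Naive relevance: count query words that appear in the document."""
    words = set(query.lower().split())
    return sum(1 for w in words if w in doc.lower())

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the top-k documents ranked by keyword overlap."""
    return sorted(corpus, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Assemble the augmented prompt: retrieved context plus the question."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Hypothetical enterprise "crown jewels" documents.
corpus = [
    "Q3 churn rose 4% in the enterprise segment.",
    "The holiday campaign lifted retail revenue 12%.",
    "Support ticket volume doubled after the pricing change.",
]

prompt = build_prompt("enterprise churn trends", corpus)
print(prompt)  # the churn document ranks first in the context
```

The point of the pattern is the one made above: the model itself can be generic, but the retrieved context, the enterprise's own data, is what makes the answer valuable.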
You'll see the long tail extend out and go higher, but you're going to see the neck increase and the belly, or the torso, increase at the red line. And I want to explain that, because that was the huge contribution, in addition to other contributions you made to this: that red line pulling up. And you made the point early on that open source is really going to pull that torso up and to the right. In a lot of power laws, that doesn't happen; they're just dominated by a few names. But now we're certainly seeing it. I mean, Meta estimates that maybe half of the Llama 2 deployments could be on-prem. The data that we have suggests at least 30%, and of course, when you get inside of three-letter government agencies, it could be much, much higher than that. But the other point that you made that I really want to emphasize is that domain specificity, that model specificity: AI is going to go to where the data is, and there's still a ton of data in healthcare and government and retail and manufacturing that's on-prem and is never necessarily going to move into the cloud. Not that there isn't going to be a ton of activity in the cloud. There is; that's where a lot of it is today. But there's no reason to try to shove all that data back into the cloud; rather, bring the AI to the data. And if the tools are there, that's going to be a very successful model. The thing about that is that it's all cloud operations. So the big trend is that it's not just cloud, public cloud; it's on-premise cloud, it's edge cloud, it's really cloud operations. That's why we saw container platforms and container orchestration high up on the list, with generative AI pulling them, on the other slide. You're going to see that interact. And with the power law, what's going to happen, we believe from the research we're seeing, is that you're going to see specialty models in the torso and the tail interact with the large language models. And you're going to see an interaction between models. 
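The head, neck, torso, and tail shape being discussed can be illustrated with a toy Zipf-style distribution. The exponent and the number of models here are made-up numbers chosen purely to show how the shares split between the leader, a handful of mid-range specialty models, and the long tail; they are not real market data.

```python
# Toy Zipf power law: the k-th ranked model's share is proportional
# to 1/k**s. Illustrates the head/torso/tail split on the chart.
# (Parameters are illustrative assumptions, not measured data.)

def zipf_shares(n: int, s: float = 1.0) -> list[float]:
    """Normalized market shares for n ranked models under a 1/k^s law."""
    weights = [1 / (k ** s) for k in range(1, n + 1)]
    total = sum(weights)
    return [w / total for w in weights]

shares = zipf_shares(100)       # 100 hypothetical models
head = shares[0]                # the leader (the top LLM)
torso = sum(shares[1:10])       # the next nine mid-range models
tail = sum(shares[10:])         # the long tail of specialty models

print(f"head {head:.0%}, torso {torso:.0%}, tail {tail:.0%}")
```

Even with a single dominant leader, the torso and tail together hold most of the distribution, which is the "fattening mid-range" argument: small, specialized, high-quality-data models collectively matter as much as the head.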
You're going to see models talking to each other and sharing data. You're going to see a model for security and a model for other things. So I think the power law will end up happening. The ironic thing was, Jensen actually acknowledged the power law in that Stanford video. Everybody is like, hey, it's playing out. Well, that's where the value is. We were really early on that. Once people see the value of generative AI, they go, I see value, because they have the proprietary data. When they have their own data in good form, they can see instant benefits. And that's why they're going to see a lot of interaction with the bigger models. Because with models working together, they can see instant value. And they go, wow, that's transformative. It literally is a step-function change, and they're going to throw more money at it. The next question is going to be, what's the spend? How do I manage my spend? What's it going to cost me? And do I get more value out of it revenue-wise, or cost reduction? So you're going to start to see questions like: am I saving enough? How much am I saving, and how much am I making? And those two things are going to play out, and you're going to cross-connect that with how much it costs. Well, that's part of the reason why we've invited Uber in today, to really understand how they're using AI, Walmart as well. These are two big examples of companies that are actually getting ROI out of AI. I think most companies are not, to be quite honest with you. But the other thing I want to bring up, and you and I have talked about this a lot, of course, is that Elon is in the news again, going after OpenAI, calling them closed AI, and there's a lot of kerfuffle there. But the quality of these models, while they're increasing very, very rapidly, OpenAI still has a lead. And I want to bring up the third slide now, if you could, just to give you a sense of how big that gap is. On this one, the vertical axis is intent to engage. 
So it's activity and engagement. And the horizontal axis is mindshare. Look at the upper right. I put a red bar around it because they're literally off the charts. Look at the gap between OpenAI and Anthropic and Cohere and Character AI and all the others, many of them open source. And look at that giant gap. I inferred where Llama would be from some other data that we have from ETR. But that gap is enormous. And so a lot of people are saying they're waiting for the next generation of NVIDIA chips, the H200s, to train, to get beyond GPT-4. But OpenAI is really doing great right now. And a lot of people are engaging with them. Not that these other models aren't ultimately going to catch up, but they've got the lead right now, and they have a lot of data to lean on. What are your thoughts on this gap? Well, I think, first of all, they're running away with it, mainly because they had the lead and they're going to continue to push it. And Microsoft. And the Microsoft factor: they're monetizing quickly. So I think that's going to be a great sign. I think Anthropic's going to do well too. I think Llama and Mistral will rise. I think those will be the big three. Cohere will be in the mix. Jasper, I'm not too sure about, but Anthropic, because of the AWS relationship and their other clouds, you're going to see them pull in there. So I think they'll move in fast on OpenAI, but for OpenAI, it will be, how can they extend that lead? And I think in this generation of generative AI, everyone's talking about what's the moat, right? What's the moat of the company? Well, the moat is speed. So if you look at OpenAI, and we said this last time, the way they're going to win is just by continuing to keep the distance between them and number two. So in their rear-view mirror, they're going to be looking at Anthropic and Llama and Mistral. 
Llama mainly is dangerous because Meta is probably going to be a powerhouse, a hoster at scale. And I think Amazon, Google, Microsoft, and Meta will be the infrastructure of choice. Maybe even Oracle; we mentioned that last night at dinner. These are going to be the infrastructures that people are going to run on. And the power that they have is going to be in proportion, like how we think about electricity. And I think that's going to be a really interesting conversation, because that's going to be where the discussion is. Are they too big? Yeah. And will that stifle innovation? My feeling, and the power law and those graphs show this, is that there will be a rising tide. So I think it's not going to stifle innovation, but it does beg the question, what are they enabling? And do they have a lock-in on that? And if you look at the CapEx spend, again, a big difference between the dot-com bubble and now is that the CapEx is really being driven by the hyperscalers. In fact, if you look at NVIDIA's last quarter, their quarterly revenue is about half of the CapEx spend in the quarter, which is pretty amazing. Because obviously people are spending a lot. These hyperscalers are spending a lot on GPUs. But having said all that, John, you've got real innovation going on. Amazon's going for optionality. They're going for tooling. They're going for features. And that playbook has worked for them. And they're going to try to eliminate the undifferentiated heavy lifting in AI. Google, despite its troubles, is obviously putting a lot of investment into AI. It's got great tech. And then Microsoft made just a brilliant move there. And then you have this whole other ecosystem saying, hey, we want innovation alternatives to NVIDIA, we want innovation alternatives to OpenAI and Microsoft. So the entire industry is funding this thing. And that's why the innovation is going to come. Well, there are a couple of factors. 
The funding, the resources required to do it, are going to be constrained and contested. So that's why... You mean like GPUs? Well, GPUs, but also that's why you see all the clouds taking equity positions in these deals, basically as barter for the cost. And they're going to trade their resources. I was talking to a startup we're going to have on here, talking about cloud insurance, like flood insurance: what if you can't get your GPUs? We'll guarantee performance. So you start to see a whole financial modeling around not just cost optimization, but cost management, cost strategy. So that's going to be the action on the cost side. Look at who we have coming in from the startup side, mainly around data: Venkat from Rockset, Kyle from Onehouse. We've got NVIDIA, a big name, coming in, and Snowflake. We've got Uber, Zscaler. We've got Neo4j on the database side, SingleStore, MongoDB Ventures, WhyLabs doing what I would call a new observability category that's emerging around AIOps. I won't call it observability, because it's not true observability. And then just a ton of startups innovating. This is an innovators' market, and the innovators will win. And what we're looking for is, how are these innovators and their customers applying AI? That's what we're going to hear from Uber and Walmart. It's really interesting what Walmart's done with their cloud native platform, their super cloud, if you will. They've built an abstraction layer over it called Element. That's where all the AI and ML happens. Same thing with Uber and how they're evolving their platform. Yeah, it's going to be really an amazing day here, John. AI innovators, as we see them, are the folks making it happen. They're the ones who are investing and on the cutting edge, and we're going to categorize them all out here and continue this at SuperCloud Six. And AI Innovators, we're going to continue on the 19th. We're going to have an addendum to this program. So stay with us. 
As we kick off SuperCloud Six, we'll be right back after this break.