Welcome back, everyone, to the closing wrap-up of day one of SuperCloud 4. It's our fourth event, where we unpack the hottest technologies in the cloud — cloud next generation. Obviously, Gen AI is our topic. We have a boatload of content. We just dropped a bunch today, and tomorrow, a whole day, all day, we're going to be streaming here on theCUBE.net. Check it out: SuperCloud.world. That's the place to go. I'm John Furrier with Dave Vellante. Here on the closing session with me and Dave is Sarbjeet Johal — Cube alumni, analyst, expert, influencer. Sarbjeet, good to see you. And Rob Strechay, Cube analyst with theCUBE Research. Rob, great interview you did with the NFL. We're wrapping up day one. We had some live in-studio performances and another successful SuperCloud — guys, congratulations.

Yeah, and it's not over yet. Day two tomorrow, like you said.

Rob, what's your take? We just did a riff with Howie Xu's great panel. What did you see today? What was the big takeaway?

I think, again, to what we've been discussing for months now: building your own ChatGPT with large language models is going to be really far to the left on our power law. It's going to be very few players making those huge investments — the huge data investments and those massive models — and segmented LLMs, or SLMs, are going to be where the entire business is at, be that on-prem or in the cloud. It's going to have a cloud operating model behind it.

I know you and your team at theCUBE Research and Advisory, formerly Wikibon, have been doing a ton of briefings with customers the past couple of weeks. Dave, Rob, and Sarbjeet, what are you guys hearing — what did you hear today that connects the dots? What are companies saying? Is everyone the frog in boiling water, or are people actually getting it? What's the state of the market, and what did we hear from these experts today? What ties it all together?
I mean, just kicking it off, I think what we've been hearing, and what we continued to hear today, is that one platform is not going to fit all. You're going to see things consolidating from a data-services and data-features perspective in those data platforms, but one platform will not dominate it all, and your data won't all be consolidated down into one place. You're going to have to have governance and lineage and a number of other things across all of these different data platforms.

I'll just say the one thing we've consistently heard — and we heard it again today, and we've been making fun of it — is "we've been doing AI forever," right? And yet I think we're going to start to see the AI haves and the AI have-nots. We're now seeing it — we just had earnings numbers; I know we're going to talk about it a little bit. Microsoft seems to be an AI have, right? Google has AI, but we're not seeing it in the numbers. Of course, it could be advertising, but I think we're going to see a bifurcation of the contenders from the pretenders.

Alphabet had solid results on Google. I think there's a little bit of manipulation going on to make their numbers. I would not count them out yet. We'll see what Amazon does this week.

Well, the stocks are down. I mean, sorry — Google's way down. Yeah. Microsoft's way up — up three and a half percent after hours. Google's down six and a half percent. Amazon, people don't know what to do with, because Amazon's earnings come out later this week.

Well, this brings up a conversation I've been seeing so far — and Rob, let me get your reaction after. We've been hearing people say nothing's going to production yet. We're still hearing that drumbeat, certainly on the enterprise side, but yet we heard from Intel that there's a productivity boost from AI in production. There's a lot more in production than people are letting on.
So I kind of see this as classic enterprise versus new AI — I won't say bolt-on, but there seems to be an AI adoption rush on the low-hanging fruit, while the more problematic or challenging AI workloads just can't be bolted on and thrust into production. What do you think?

I think the chatbot thing is out there, right? A lot of the low-hanging fruit was, "I need to build an HR chatbot that allows people to set up their direct deposit and figure that out real quick." That was the low-hanging fruit of taking all of my HR policies and procedures.

And valuable, despite you poo-pooing it.

What? Chatbots — that's valuable.

I'm not poo-pooing chatbots.

Yeah, you said that's low-end.

Well, it's been-there-done-that; it's been figured out. I mean, on the spectrum —

Try calling your cable company and testing out their chatbot. It sucks. Their version of production is a whole new ballgame.

Okay, Dave, don't even go there. The point was that you have chatbots on one end — easy to figure out. Co-pilots are now the hot thing, because people can see direct benefits. The third wave is the more complex, cloud-native, AI-native, predictive stuff. That's a lot harder. So when you map that out, to Rob's point — yeah, if you count production as the low-end, entry-level stuff that's easy to figure out with data. But again, we'll have some enterprises on tomorrow that will talk about how they've gone and implemented it.

I think Aaron from the NFL was talking about the fact that, hey, we need to know the impact of actually rolling this AI out. Because, you know, the NFL has unions they have to deal with. They have franchises that aren't under their jurisdiction. You start to look at all of the people you have to bring along in any corporation — there's a lot of politics that needs to be dealt with in there as well.
And Jeff Boudreau from Dell said, we've got all this data in our services organization. We're LLM-ing that, and then we're going to give real-time answers to customers. That's a huge product play there.

I think the media is hyping the fear of AI to the masses — the Hollywood writers' strike, that big debacle. All in all, I think the productivity boost is the number one use case. So if you go by people-process-technology, that frame of thinking, people are getting more productive. We need productivity — productivity has been going down in the US for quite a number of years now. But now this AI will let us come back and ramp productivity up. That's number one. The number two pattern I've seen is that a lot of people are doing POCs — a lot of proofs of concept and smaller projects. But the last mile of AI is hard. So for an enterprise to —

Describe the last mile. That's come up multiple times. What does that mean?

So that means having the true, real use cases where you have predictability and accuracy coming from the large language model. The app that's fully in production — that last mile is the final piece of the puzzle.

Last mile is production, right?

It's in production, actually, you can say. So that is hard. Of course, we have been doing AI for a long, long time — but that's generic AI, if you will, not generative AI. Generative AI has hallucinations and problems.

I think I agree with you on that, and everyone else. I think the changeover that's happening — at least what I'm seeing, and I'd love to get your thoughts on this, guys — yes, it's a radical shift to the new model. There are adjustments going on at the data layers. Even the startups we had in here, the really young startups, are feeling that the adoption is not there, because the people aren't ready for it. I mean, who's really ready for generative AI?
It's like changing out the airplane engine at 34,000 feet, Dave — a whole new engine you've never been trained on before. You've got to reset your data, rethink your data.

I just want to comment on productivity. Global productivity — the best number I could come up with is 1.2% growth. That's anemic. But U.S. GDP this quarter is supposed to be like 5%, and it's because of full employment; it's not due to productivity, right? So unless we see a big boost in the productivity numbers — Brynjolfsson says 3% minimum; I'll be disappointed if it's not 4% or higher — unless we see that, then this thing could be overhyped. I don't think that's the case. I think we will see it, but to your point, people aren't ready.

Well, Gen AI is new. I mean, the comment on Howie's panel — someone from Intel in the room pointed out — was, "I don't want to give up my co-pilot." They didn't have it last year; now they don't want to give it up. It's like when you first started surfing the web and used Yahoo: okay, I don't need to open up a book anymore.

But I think that is the underlying thing: they didn't build the co-pilot. The co-pilot may be trained on some of their own code per se, but it's not necessarily their code — they didn't have to go build it. That goes to the last-mile thing. It's like, okay, I can believe this because it's only been trained on my data, plus a little bit outside of it on how to program or what have you. And I think that's a lot easier to get adoption on. I think the problem is still that the use case is not well defined — how and what the SLA will be for the answers. We're all fine with ChatGPT kind of lying to us every now and again, hallucinating and what have you. But when you start to get into certain cases, like financial services — going and saying, "hey, I recommend this stock to you" — there's a human in the loop on that, because they can't trust it to be fully autonomous in that way.
But having said that, I've seen some vendors doing a great job at it. These are mainly technology vendors in this case, right? — though there are banks and others as well. So VMware is training their models on smaller machines — you know, an A400 or the like — and within a few hours they trained their model on their APIs, their SDKs, their scripting language, so that practitioners of their technology can benefit from it, right? I think that is a great use case for technology providers: first train these models on their own technology, so that when programmers are using their API, they just have to say it, or just type the first three letters, and it completes the rest.

I think building their own co-pilot for themselves makes total sense. For technology vendors it's a much clearer use case, especially around supportability and usability of their products. But some other vendors are not doing it. You know, I was at HashiConf — I went there for half a day and was watching them — and they are kind of negating it. It's like, oh, we'll do this.

Yeah, maybe they're trying to be contrarian, because everybody else is doing it.

Well, they should be the ones who are actually there. "Terraform, talk to me, and I'll deploy your infrastructure." Terraform could be at risk with AI.

All right, guys, we're on the analyst panel. We're going to have two more segments coming after the end of the day — Google Public Sector, as well as a healthcare expert. They're going to unpack a few nuances of AI. But guys, final question for this analyst angle on SuperCloud 4: what's your takeaway from SuperCloud 4 — what you just mentioned, or go a little deeper? And what are you going to take away to advise clients?
Rob, as you talk to more enterprises every day — Sarbjeet as well, and Dave — what are you going to walk away with from here that you'll have in your repertoire to advise clients and companies that are looking for leadership and direction on how to formulate the AI conversation?

I mean, it sounds so obvious, but I'm going to say it anyway: you just can't get caught up in the hype. I mean, you can — it's great, we love it, it's sort of intoxicating — but you've got to get down to the business case. You've got to figure out the use cases, and you've got to make money at this thing. And to me, making money is all about labor productivity. If you can get there, then it's going to pay for itself and you're going to win in the market against your competitors. And if you don't — if you're just running experiments all day — you're going to fall behind.

Assuming that Gen AI has legs — and I think it does, right? — the number one thing you want to do is train your practitioners, and train your decision makers and your leadership as well, on what it is and how you can leverage it. And I see a lot of variation; it's a spectrum. At some companies people know what it is and how they can leverage it, and at some companies they're like, "oh, what is it? We don't want it." They're kind of afraid to even go there, right? So train, train, train — that's my advice.

Yeah, and I think the advice is: have a clear use case, have focus, do a lot of experimentation around that, but it has to be core to your company's mission, so that you can get to an ROI and define what success means at the end of the day. Because you don't want to be just blindly doing experiments all over the place. Figure out how we get there, and once we are there, what is the value that the company or the customer is going to extract from it? It's the consummate "do more with less."
I mean, I'll end with my little Moneyball quote — Dave, you know what I'm going to say next. There's a scene in Moneyball where the owner of the Red Sox says to Billy Beane that any team that isn't working to your model — the baseball model, the big-data model, as they say — is going to be dinosaurs. And to me it was clear from day one that the same is true if every business and entrepreneur — because we have a lot of entrepreneurs here, and companies as well — isn't rethinking how to execute with data. I think data is the new middleware: you have great infrastructure, and the middleware is going to be refactored around data. At the end of the day, it's the apps — the North Star, what you're trying to do. If they're not refactoring with data and thinking about how they're using their data as a strategic asset, they're going to be dinosaurs. So that, to me, is my advice: focus on your data, figure out how it works with the models, what the end-to-end workflow is, what the application is — because this is a generational shift.

Hatteberg over Peña.

Yeah, Hatteberg over Peña.

All right, we'll be back with two more segments to end the day — Google Public Sector and a healthcare expert — here at SuperCloud 4. See you later.