From theCUBE Studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. The AI gold rush is on. The paths to monetization are seemingly endless, but the most obvious converge on making humans more productive or supercharging existing business models like search advertising, and maybe some others we haven't thought of yet. Much of AI adoption in enterprise IT is hidden. Our research shows a very high overlap, around 40 to 60%, between AI adoption in enterprise tech and embedded AI inside software from the likes of Salesforce, ServiceNow, Workday, SAP, Oracle and other major players. But a rapidly emerging group of independent AI firms is gaining traction, catalyzed of course by the OpenAI and Microsoft partnership. These pure plays are positioning themselves to ride the AI wave, while new AI startups are being formed daily with much less capital than in previous cycles. Hello and welcome to this week's Wikibon Cube Insights, powered by ETR. In this Breaking Analysis, we review the state of AI spending in the enterprise and look at the positions of several key players in the space that offer AI tools and platforms. To do this, we invite Andy Thurai, a CUBE contributor and vice president and principal analyst at Constellation Research. He's an AI expert. Andy's going to help us unpack the hits and misses from this past week's Google I/O conference and give us his perspectives on what it takes to catch the AI wave and avoid becoming driftwood. Andy, hello, thanks for coming on theCUBE. Thanks for having me on. Always good to collaborate with you. Before we get started, I want to set the context on the overall IT spending environment, and then we'll get into it. Let's show a chart here. This is ETR survey data going back to January 2021. The survey reaches more than 1,500 IT decision makers; the last one, which ended in April, had almost 1,700. This is a quarterly survey.
The lime green bars show the percent of customers adding new platforms. The forest green shows the percent of customers increasing spending by 6% or more relative to the previous year. The gray is flat spend, and the pinkish area is spending down 6% or worse. And that bright red is the percentage of customers that are retiring platforms, or churning. You subtract the reds from the greens and you get net score, which measures spending momentum and is shown in the blue line. You can see that coming into 2022 there was a lot of optimism, which deteriorated throughout the year. And while the entry into 2023 brought a more optimistic outlook, we got hit by interest rate rises and earnings revisions. That put a damper on the enthusiasm, and the trend line has continued to decline. No surprise there, but it sets the backdrop. Now let's take a look at the same data for AI, and we'll bring in Andy. Same colored bars, but the pace of decline of the blue line is less steep. The other really important point is that the blue line has consistently stayed above the 40% mark shown by that red dotted line; 40% is considered a highly elevated level. So we're talking about spending momentum in AI being 22 percentage points higher than overall enterprise tech spending. The other key point is the yellow line, which shows how pervasive AI is in the dataset; that's a measure of AI's relative share of the total survey. You can see it peaked in the April 2022 survey and then declined last summer. But since the launch of ChatGPT it's been steadily increasing and reaching new highs. Andy, I wonder if you have any thoughts on this. Absolutely. That's a great survey, by the way. My sampling set is not as big, but my conversations with C-level executives are almost in line with what your survey finds. If you could put the slide back on, I want to point something out.
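The net score arithmetic Dave describes above, subtract the reds from the greens, can be sketched in a few lines. The category values in the example are illustrative only, not actual ETR figures:

```python
def net_score(adopting, increasing, flat, decreasing, churning):
    """ETR-style net score: percent of customers spending more minus
    percent spending less. Inputs are percentages of respondents in
    each category and should sum to roughly 100."""
    greens = adopting + increasing    # adding new platforms + spend up 6% or more
    reds = decreasing + churning      # spend down 6% or worse + retiring platforms
    return greens - reds              # flat spend (gray) does not affect the score

# Illustrative figures only: 8% adopting, 40% increasing, 38% flat,
# 10% decreasing, 4% churning -> net score of 34
print(net_score(adopting=8, increasing=40, flat=38, decreasing=10, churning=4))  # 34
```

A reading above 40 is the "highly elevated" threshold marked by the red dotted line in the chart.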
In that chart, before ChatGPT became wildly famous, and even though the GPT models have been around for a while, the successful POC that Microsoft has been running took off early this year. As you can see in the chart, it started ramping up around January, and you can also see the AI adoption curve moved much faster. So there are two things I see happening here. One, as I tell people, ChatGPT is the most widely successful POC anybody could ever dream of, so everybody jumped on the bandwagon. Two, because of that, enterprises are looking for ways to minimize their costs using any AI use case they can, whether that's productivity gains from more efficient code, having AI increase developers' productivity by writing two or three times more code, or introducing AI into IT operations to reduce incident management. In other words, overall IT spending is shrinking, but spending on AI within IT is not; they're trying to make up for the spending cuts elsewhere by putting more money into AI. That's what I'm seeing. You're right on; it's definitely getting greater mindshare. Nobody's really figured it out yet, but virtually 100% of the people I talk to are trying to figure it out. I use ChatGPT virtually every day. Our developers were able to develop a CUBE AI demo in less than a week, and it's actually very good, and it keeps getting better and better. Okay, it's Triple Crown season, so we've got to line up some of the AI horses on the track. This next chart shows ETR survey data, as I say, from almost 1,700 IT decision makers in the April survey. The vertical axis is net score, or spending momentum, and again, that red dotted line at 40% marks highly elevated spending velocity; anything over that is considered highly elevated.
The horizontal axis is pervasiveness in the data set. The first thing to note is the big three hyperscalers: they're dominant in this space, along with Databricks, which as we've reported has very strong market momentum. You'll also notice how the position of Microsoft has changed since its announcement with OpenAI and the Bing integration. Anaconda is an AI platform for data scientists, and there's a pack of companies including H2O.ai, DataRobot, Dataiku and some others that we'll talk about in a moment, but you can see both Oracle and IBM Watson show up. First, Andy, do you have any comments on this, and does anything surprise you here at a high level? So, as you can see in the chart, the jump in Microsoft after ChatGPT is, as one would expect, phenomenal, while the other two either lost share slightly or went up a little bit. Microsoft, as you can see, went from the lower left all the way up to the upper right, which is a dream chart for anybody. That's all riding on the ChatGPT hype. And also, I actually sat in and listened to Satya Nadella's Q3 earnings call, and some things jumped out at me. One is that he's talking about the new AI wave of solutions expanding the TAM, and he called it a new wave, which means this is a workload that never existed in Microsoft Cloud before. Now they're trying to bring this AI workload over to them. And this was actually an afterthought for them, because it never used to happen for them before. When people thought of AI workloads, Google or AWS was number one or two for the longest time. Now they're throwing Microsoft into the mix: you know what, maybe I should do that. That's one. And two, if you listen to the call again, you'll hear him claim that he has the largest, most powerful AI infrastructure, and he wants people to train their models on it.
Moving away from ChatGPT: know what it can do for you, and then train your own large models on that infrastructure. So that's another market they're going after. And the third thing from the conference call that surprised me: do you know how many customers they have for OpenAI right now, in spite of all this hype? 2,500. That's nothing. That's a rounding error, right? But if you look at Azure Arc, which is a hybrid management, deployment and security solution, that has 15,000 customers. So they're barely scratching the surface of AI, and they're hoping against hope, by throwing in things like Cosmos DB and the combinations they offer, that when people think of AI, they'll think of Microsoft first instead of looking at other companies. Well, I mean, it's completely changed the way people think about technology and about interacting with it. I've often said that AWS turned the data center into an API, and now with ChatGPT, people are rethinking how we interact with technology. Let's dig into some of these players in a bit more detail. But before going there, can I make another comment, please? So, to the point about Microsoft AI: in spite of all the hype, people think Microsoft wants to capture this newer AI workload, but its initial goal is not to get that workload. It's to make a dent in Google's search business. If you can prove that AI can do all this for you, Microsoft is hoping to revive its search business. You know what a huge market that is? Google has pretty much owned it for years. They're hoping that with AI-powered search, Microsoft can get some of that. That's what they're going after, more than AI workloads. I mean, it's the most profitable business there is. It is. It is. Look at the pie chart. That is a huge market.
If you look at the pie chart of search advertising, maybe a sliver is "other" and then Google has, I don't know, 98% of it. It's unbelievable. All right, bring up those talking points, because I want to first talk about IBM. IBM, Andy, they could have had this thing sewn up. They had all the mindshare back at the beginning of the last decade. Alex, I wonder if you could play the clip. "Now we come to Watson. We're looking for Bram Stoker. And we find: who is Bram Stoker? And the wager, hello! $17,973, $41,413, and a two-day total of $77,147." I remember that moment thinking, wow, Ken Jennings was such an accomplished Jeopardy player, and this is incredible: a machine beats a human at Jeopardy. Deep Blue beat Garry Kasparov at chess years earlier; this was like that times 100. What happened? Why did IBM Watson fail? Well, I do remember that moment. I was at IBM at the time, and it was a very proud moment for all of us IBMers. But the problem is, and this goes to prove a point, no knock on IBM, that good research companies may not be able to figure out a go-to-market at times. So they could have owned it, but there were limitations. There were technology limitations at the time: usability, and cloud was not a big thing yet. There were limitations on how you could use the technology, the implementation, the cost; the value you'd get out of it for the money wasn't feasible at the time. So IBM kind of delayed a little bit. And also, again, I'm not speaking from internal knowledge, this is an overall observation: the AI-related regulation, governance, and ethical and responsible AI practices were not mature then. Companies are only now starting to do it, and it's still not mature.
But if you look at the announcements they made, some of them were pretty good; they caught my attention. There are a lot of things like the AI studio, the data store, governance, even things like GPT as a service. All of those things, they're catching up to the market. They are so far behind, but they're catching up. A couple of things did stand out to me. One, they have this center of excellence with over 1,000 AI experts. Imagine that. That's huge; none of those other companies have that. And these folks are in front of customers all the time. If you're able to work that long sales cycle with customers, hand-holding them as a customer advocate, IBM has a way to maybe gain some traction. That's one. And two, they also have what's called an Environmental Intelligence Suite that will give you information about the carbon footprint: how much it takes to train your models, how expensive it is, the whole calculation. For companies that are very environmentally sensitive, because training large models is not cheap; we're talking about tens of millions of dollars, though the cost is coming down fast, toward a few hundred thousand now. But still, there's a cost involved. So if a company is very worried about the environment, it'll give you an idea of how much it will cost, and where, and all of that. So a couple of those stood out to me, but overall, they're still catching up. I feel like IBM's greatest strength is its biggest hindrance, in that it's a very services-oriented company. When they brought out Watson, Ginni Rometty was the CEO. She was the understudy of Sam Palmisano, both with a very services mindset. They've got a phenomenal services organization, and I think a lot of IBM said, hey, we can use services and drive services revenue to implement all this AI. And it's the wrong scale model.
And now, what other brand have they got? What are you going to say, IBM AI? Watson is a well-known brand, right? In spite of not gaining that much traction, it is a household name. Right, so they're not going to walk away from that. All right, how about Microsoft? I've said a number of times that they were in third place in AI technology relative to Google and Amazon, and then all of a sudden they cut the line with OpenAI and that deal they struck. And Satya has got that Cheshire cat grin. We were talking about this before: he's only got a fraction of the search ad market, so even a small share gain is going to both hurt Google and drop to Microsoft's bottom line. But word is Microsoft is actually talking to Firefox about embedding Bing Chat into Firefox. They're probably going to pay a bunch of dough to do that, and it's because Bing Chat really hasn't moved the needle. What are your thoughts on this? Well, there it goes; we were talking about search, right? At the end of the day, right now the money is not in the AI workload. That's the innovative, futuristic play. I'm thinking 10 or 20 years down the road; again, things could change fast. But right now, the major workload they can get is search, search advertising. And Bing almost never existed in that market. So now they think they can revamp that through a combination of Bing, Firefox and AI infusion. I've got to tell you, before this, had you ever used Bing? Hardly anybody uses it, right? No, and even now, I've used it and I'm like, yeah, it's okay. I'd rather use ChatGPT, honestly. That's the point. By infusing ChatGPT into Bing, they're hoping to revive their search. But again, we'll talk about Google in a minute.
By doing certain things, Google has either caught up, in my view, or catapulted ahead. So it's going to be a two- or three-horse race when it comes to AI, and Microsoft and Google are at it. I actually think Bard is really good. I've used Bard a bunch, and I find its accuracy is sometimes better. But anyway, you wrote a piece called "Google's Generative AI Strategy from Google I/O 2023: Hits and Misses." We'll put it in the show notes so people can get access to it. I thought it was very good. It was a fire hose of announcements at Google I/O, it was unbelievable, and in this piece you laid it out really nicely, and I agree with you. Workspace and email with Help Me Write; the photo editing is pretty cool, we've seen that before in the commercials for Google Pixel, the phone; and the Bard uplift I thought was really impressive. I've personally had better experiences with Bard than with ChatGPT from a quality standpoint, believe it or not. But you also had some misses in there. You didn't think Codey went far enough; it didn't have enough scope in terms of the languages it supported. You were disappointed with the industry breadth, even though they did have Med-PaLM, the medical model. And it doesn't sound like you were overly impressed, nor was I, with Google Cloud Platform's announcements and integration. But maybe you could explain what Google did with PaLM and your take on those announcements. Okay, let me talk about a couple of items in the hits, which I thought were pretty good. One, which people probably didn't catch, is the AI-enabled cognitive search that Google introduced. In my view, that's going to be a game changer for them in the search market, snatching it back from Microsoft. Yeah, that's the Bing competitor, right?
It's a Bing competitor, big time. But here's the thing: the ChatGPT equivalent, if you do a search, the ChatGPT models are not real-time. We're talking about training the model incrementally. When the hype first hit, ChatGPT was essentially a two-year-old model; people were wowed even with data that was two to three years old. But Google claims, again, to be proven, that this is real-time cognitive search. Imagine ChatGPT giving you answers in real time versus two-year-old data. That's what Google is going after. That's one thing I thought was pretty good. The training cutoff was 2021, right? So that's one. The other thing is the Bard-synthesized AI content in the search results. They don't want to miss out, because a lot of people love the original way Google shows all the documents and lets you pick. If you look at the demo they did at the Google I/O show, they give you both: the Bard-synthesized search results on top, equivalent to the synthesized results you'd see from ChatGPT, and then, at the bottom, the regular Google results. I think they don't want to lose the existing market and they want to get the new market, so they mixed it up. The innovator's dilemma. So that's one. And the other one I thought was pretty good could be useful not just in search, but for moving into more of an e-commerce market. When I'm browsing stores and such, I could search using it, which ChatGPT cannot do, by the way. I could search across pictures, images, combinations thereof; I'm looking for something, I search for it. And that enables e-commerce stores, which is not an option with Bing, I mean with ChatGPT, right now.
So those two actually stood out to me, in addition to the Google Workspace items we talked about: the image editing, multiple LLM models, and all of that. I'll bring this up later, but I think the point you're making about e-commerce and shopping is really interesting. You can do some basic shopping at Google, at least from a search standpoint, but you don't do the transaction there; you go to Amazon or some other site, or maybe directly to the vendor's site. But what if you could bypass Amazon's warehouses, go directly to the site, and have them drop-ship it, no warehouse needed? That's kind of the model Alibaba uses, so that could be very disruptive. Yeah, it could be. That's why it stood out to me; I was watching that, and I don't know if people got it. The other one they were talking about was maps. When it comes to maps, it's Google, that's it, no one else, right? And they were talking about not only mapping the coordinates to see a 3D view of where I want to go, a total view, for example, if I want to go running or walking, but on top of that, they could also add things like what the traffic is going to be, what the air quality is going to be, what the temperature is going to be, projected into tomorrow morning or tomorrow afternoon. It's like the Back to the Future movies. That one wowed me. But the flip side of that is: Google, if you're watching this, people who do maps, make it better. They should be using AI already, and I'm sure they are. But if I go into, say, Waze, which is owned by Google, and I want to pick a better time to go: let's say I live out in the sticks, so if I want to go to Boston with no traffic, it's 45 minutes.
If there's traffic, it's an hour and a half, or an hour and 45 minutes. And it will tell me, leave at seven o'clock for an eight o'clock meeting, and I'm like, there's no way I'm going to get there in an hour. It'll say yep, yep, yep, but then when I go, I can see the traffic building up and building up, and I can predict it as a human better than the machine can. So they should be able to do it. That's the demo they were showing. I'm hoping they will, because they'll be able to predict, and I'm hoping this comes out. That's what got me excited. I'm skeptical, because I'm like, why isn't that intelligence already in there? The only answer I can give, and I'm ranting, is that they're optimizing for ads. Maybe they want to keep you in the car longer. All right, let's talk about AWS's Lego block approach. I'm a fan of targeting builders, which both GCP and Microsoft also do, but those two definitely have other consumer and advertising aspirations, whereas AWS does not, though of course there's always Alexa. So you've got Titan, you've got Bedrock, which is a large language model as a service, you've got CodeWhisperer. What do you think of AWS's Lego block approach and their chances? So each one has their own approach, as we discussed. And Amazon's approach, as we've talked about many times, is: I'll give you all the building blocks you need, the Legos. It used to be that they gave you much of it at the infrastructure level. Now they're moving up the stack a little bit because of the pressure they're getting. So you have Bedrock, which includes the foundational models, as they call them, multiple models, both for text and for images, and they've got Anthropic in there, AI21 Labs, Stability AI, and Amazon's own Titan. They're all accessible via APIs.
That's all the base, the table stakes. But the thing that impressed me is they're also making a claim, and I don't think it's available yet, that whenever it is, you can privately customize the models using your own organizational data. That could be a differentiator, because basically you take whatever model is offered to you, train it on your own data, and keep a private instance of it. What that means is, if I want to do a support chat, I'd take the knowledge corpus I have and train the model on my data, and all of a sudden I have a super support person available to talk to at any given time, based on the data that's available. That I really like. The other thing, well, I wouldn't say I like it: Amazon CodeWhisperer is decent, but it's not at the level of Copilot yet, because Copilot has been around for a while now, so they're up and running much better than these guys are. And then I think they hit it out of the park with the Hugging Face partnership, because if I'm going to provide the building blocks, then I've got to give people the option to get models from Hugging Face as well and start using them. I thought that was huge. And then, obviously, as you know, they have their own EC2 instances that are customized; both Microsoft and Google have also talked multiple times about having customized or optimized infrastructure. Yeah, they're following. Not only silicon, but also EC2: they have specific instances for training the models, with their own chips, and Inferentia for inference-based workloads. Yeah, so Titan is the new chip? Titan is actually the model, the foundation model. I see, okay. So Inferentia is the new inference chip; Trainium was for training the models. Trainium and Inferentia.
For inference of the models. Which are Arm-based silicon from the Annapurna acquisition. So I mean, Amazon's probably five years, at least, ahead of those guys. In certain areas, yeah. Look, going back to this: Microsoft's proposition is, I'll give it to you at the top level. You ask for what you want, and I'll cook the whole thing and give it to you, as a chef would, the final finished product. But Amazon is like, you know what, I'll give you all the ingredients, the best of the best, and you can build what you want. And there's nothing wrong with either model; you've got to choose what you want. But aren't Google and Microsoft going to do the same? The difference is they're also competing on the full stack with the completely integrated offering. I mean, Google's going to go after search, right? They're going to go after some e-commerce models, potentially, and Microsoft's going to be infusing it into their application software. Whereas Amazon, I think, is generally saying: here, go build it, and then go compete up the stack. They're not, today anyway, getting deeply into building application software. They've got certain verticals. Right, think about their call center product. But that's the thing. It's not like they just give you infrastructure at the chip level; they'll go up to the foundational models, even up to applications if you want, and integrate with the full corpus of knowledge and data, whatever you want. So they can go up the stack a little bit, one step at a time. But again, their model is: I'll give you the components you want. And Amazon's primary goal is to drive traffic to their cloud. It's not about selling AI tools and whatnot. That's their model.
All right, let's take a look at some of the emerging companies that aren't yet public. This is data from ETR's ETS, the Emerging Technology Survey. It exclusively tracks privately held companies and plots net sentiment, which is a measure of the net percent of customers that intend to engage, i.e. either evaluate or adopt, on the y-axis. That's plotted against mindshare, which is measured based on presence in the survey of these 1,200 IT decision makers. And note that OpenAI wasn't even in the data until last fall, when ChatGPT was announced; now they hold the number one position on both dimensions, by far. Databricks is very prominent, and always has been, with its ML and AI tool chain, along with some of its other products that may be seeping into the sector, but the survey is intended to be clean on AI/ML. There's a whole other section for data platforms, databases and data warehouses, so Lakehouse should fit in there, though they also show up in that section, as I say. This one is intended to be AI only. And you can see the other emerging companies, like Anaconda, which is a platform for data science, DataRobot, Dataiku, and Hugging Face, which we had at our AWS startup showcase doing partnerships with AWS, and this month they announced with IBM and others. So what do you make of this data? Are there any surprises here for you? Yeah, a few surprises, right? One is, well, obviously OpenAI is not a surprise; everybody knows them. I'd be surprised if they were not rated that high. They came out of left field, as they say, and took everybody by surprise. So that one is fine. But the one that's most surprising to me is Hugging Face, because Hugging Face has been all over the place. If you know OpenAI, you pretty much know Hugging Face. All over the place in a good way.
And they have been forging relationships everywhere: they actually have a relationship with Azure, with Amazon, with IBM recently, and with what they released with Google. They're all over the place in terms of partnerships and relationships. They have become the de facto model repository for model distribution. Essentially, when you're a data scientist and you create models, you need a place to share them, and Hugging Face has become that. And it's not just that; they also provide a platform for you to train models. It's well known in the model management, model training and model repository area. I'm surprised it's rated that low; I don't know why, or how this is measured. So that's a surprise. Well, there's a bias toward large companies, large U.S. companies. I don't know if that makes a difference, potentially, because Hugging Face is comparatively much smaller size-wise. That's possible. So that could be it. Maybe a lot of this is hardcore enterprise IT, right? It's big banks, insurance companies, large manufacturers, the Global 2000. That's kind of where it's at. So maybe they're the fat middle and the later adopters; they're probably not so much the early adopters, although some of the financial sector is going to be early adopters. As well, in certain use cases, yeah, it's useful. The other thing is Databricks. Databricks has become more of a company that provides all the tools, almost taking the Amazon way. Remember, they had their own employees collect a data set and showed how to train an LLM on it. Basically, they're trying to showcase the same thing Amazon is doing: hey, you can build a model with me. Why are you going somewhere else, right?
Because at the end of the day... Wasn't that a little bit of GPT-washing by Databricks? Yeah, actually, I wrote an article about that as well. We talked about that. So at the end of the day, if you're going to pick everything up and move to another cloud, if you have no need for me, then I cease to exist. So I'll show you how to do it in mine. I mean, yes, that's valuable, and I'll show you how to do it with mine, and then you stay with me, right? So, all right. Okay, so it's one thing to have mindshare, but to really get ROI you've got to have adoption and show real business value. This next data set aims to do just that. It plots the mindshare data we showed earlier, that's the blue, against the percent of those customers who are familiar with the AI platform, i.e. aware of it, and who have also evaluated the platform and intend to use it or expand their existing usage. So it's a measure of adoption. Databricks has the highest adoption rate, 26% as I just defined it, followed by Anaconda, then OpenAI at 13%. We can talk about that: everyone is using ChatGPT personally, but what about adoption in the enterprise? Then you see Hugging Face, to your point, at 13%. It's not off the charts, but it's still very solid. Then DataRobot and Dataiku in the low double digits, and then the rest. Does anything here surprise you? What do you make of this? So pretty much all of them in that list are in line except numbers one, two and four, right? The first one, Databricks, as we talked about: they're trying to have companies stay with them rather than go somewhere else. I don't think Databricks will ever be in the business of providing AI tools. They're more of an AI platform: I'll figure out a way to help you create models using my platform. That's the goal, and I think they're succeeding at it, which your data shows.
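The adoption measure defined above, the share of respondents familiar with a platform who have also evaluated it and intend to use or expand it, is simple arithmetic. The respondent counts in the example below are hypothetical, chosen only to reproduce a 26% rate like the Databricks figure:

```python
def adoption_rate(aware, evaluating_or_expanding):
    """Adoption rate as discussed above: the percent of platform-aware
    respondents who have evaluated the platform and intend to use it
    or expand existing usage. Returns a percentage rounded to 1 place."""
    if aware == 0:
        return 0.0  # no aware respondents, so no meaningful rate
    return round(100 * evaluating_or_expanding / aware, 1)

# Hypothetical counts: 500 respondents aware, 130 evaluating or expanding
print(adoption_rate(500, 130))  # 26.0
```

The point of the metric is to separate mindshare (awareness) from actual intent to deploy, which is why a household name can still post a modest adoption number.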
And the second one, Anaconda, is a little bit surprising to me, because Anaconda is not necessarily top of mind for a lot of people as an AI platform. It's a data science platform, and a very good one, but as an AI platform I'm not sure, because it's missing a lot of the components that need to be in place. And of course Hugging Face is there, as I was telling you earlier. All right, let's close with some final thoughts. Okay, your thinking on this. So enterprise AI is different. That was really the first point we want to make here. IT is rigorous. They need guardrails. They're concerned about IP leakage; I'm concerned about IP leakage. They want super strong governance, privacy, security, transparency, explainability, bias controls, et cetera. These are hard things to do. And the second point here is most enterprises are not going to build AI; rather, they're going to buy it, embedded within their enterprise apps. It's Salesforce Einstein. We didn't talk about Oracle, we kind of skipped over Oracle, but they're embedding AI into everything. It's AI-powered infrastructure, which could be physical infrastructure, whether it's Dell or HPE, they're infusing AI, but up the stack is really where you're going to touch it. ServiceNow is going to be big in this. What do you think about those two points, Andy? First, that you've got to have more rigor in AI, and second, that it's going to be largely purchased embedded through your application vendor, as opposed to, I'm going to go build it. Yeah, so there are two schools of thought. One is, if you are a vendor, in order to drive efficiency, you have to infuse AI into all of your applications. We talked about AI increasing productivity. We talked about email: when I'm writing an email, I just choose short, medium, or long, and boom, it gives you what you need.
So that kind of productivity, or coding efficiency, all those things, would be built in as part of the applications to make them more efficient, right? In that sense, if a vendor does it, the vendor probably takes the responsibility and liability for it, right? All of the issues that come with that. But if you were to do it on your own, we talked about the governance, the security aspect, the ethical aspect, the responsibility aspect. There are so many issues there that CIOs need to start thinking about. As a matter of fact, in all of the advisory calls I've had over the last few months, since the beginning of this year, as you know, everyone is talking about ChatGPT. How do I use it? How do I use ChatGPT in whatever I have? Am I liable if something were to happen? More importantly, how do I reduce my risk? So when it comes down to it, if you're an enterprise and you're going to use this, my recommendation would be to start with your business case first. Yes, there's so much hype. Don't get caught in that. Start with your business use case first. What is the problem you have? What do you want to solve? Work backwards from that. And once you've figured out the use case that's going to work, build all the rigor and governance and controls that we talked about onto it, and then you'll have a good application. So I'm basically putting forth the premise that it's going to be embedded. At the same time, I think a lot of companies will think, okay, we have this corpus of data, how can we apply AI to it? Most certainly they're going to be using open source tools. They're not going to be building AI tools themselves, but they are going to be using AI tools. So I'm kind of contradicting what I said earlier.
I can see specific use cases for companies; searching documentation would be an obvious one. Again, at theCUBE we apply it to our own corpus of data, and we're doing that with open source tooling and certain APIs. But as I say, I was just trying to figure this out. The last time Andy came on Breaking Analysis, in December 2022, the premise was that AI goes mainstream but ROI remains elusive, and I think, Andy, we got that right. ChatGPT is very intriguing and it's kicked off responses from all sorts of competitors. We're seeing AI washing everywhere; we just talked about that. We've seen Google's code red, IBM has responded, AWS, everybody's announcing AI. As well, everybody's thinking, okay, how can I apply it to my own corpus of data? We're hearing a lot about, oh, we're not going to need BDRs anymore, or analysts who build dashboards, they're going to disappear. Everyone's trying to figure out the ROI and the right business model. And it's pretty clear people see a path to better productivity with humans in the loop, but radical new business models are not as clear. We're going to pave the cow path with search and ads and the subscription license models that we know, but I think people should expect new, radical business models to emerge. Imagine a disruptive e-commerce model that goes after Amazon's massive warehouse infrastructure, which has been a competitive advantage. What if Google enables better shopping and direct shipping from the manufacturer, more along the lines of what Alibaba does? And what about industry-specific use cases and business models that could emerge? Something new is going to come out of this that's going to surprise a lot of people. Kind of an iPhone moment. It's like ChatGPT was the new software iPhone, and now it's, here, figure out how to apply it, and everybody's scrambling to do so.
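The "search your own documentation" use case mentioned above can be sketched without any AI stack at all. The minimal retriever below scores documents by keyword overlap with the query; in a real deployment you'd swap that score for embeddings from an open source model, but the retrieve-then-rank shape is the same. The corpus and function names are illustrative, not anyone's actual tooling.

```python
# Minimal sketch of retrieval over a document corpus. Real systems
# replace the keyword-overlap score with vector embeddings, but the
# retrieve-then-rank flow is the same.

def tokenize(text):
    """Lowercase and split into a set of words."""
    return set(text.lower().split())

def search(corpus, query, top_k=2):
    """Rank documents by how many query words they share."""
    q = tokenize(query)
    scored = [(len(q & tokenize(doc)), doc) for doc in corpus]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]

corpus = [
    "Breaking Analysis covers enterprise tech spending data",
    "The survey measures net score and spending momentum",
    "Conference keynotes announced new AI products",
]
hits = search(corpus, "enterprise spending survey")
```

The retrieved passages would then be handed to an LLM as context, which is roughly what "open source tooling and certain APIs" over a private corpus looks like in practice.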
I think in addition to ChatGPT and OpenAI, everyone is learning what to do with this. What are your thoughts? You're exactly right. Again, I don't know if it's fair to compare ChatGPT to the iPhone, because ChatGPT is a much bigger moment than the iPhone moment, in my view. Wow. Think about that for a second, right? I mean, I tend to agree. It's like bigger than the internet. It is. I could even go that far. It is very big. So the bottom line is this: just because you have a Swiss Army knife, or as I say, if you've got a hammer, don't go looking for a nail. Yes, all these capabilities are built in, but again, go back to your original point. What are the use cases? What are my problem areas? Where am I inefficient? For example, in the IT operations area, I see that it's not necessarily ChatGPT, but a combination of AI, some of the chatbots, and ChatGPT-equivalent LLM models. There are a lot of applicable use cases. All of these AIOps companies, IT ops companies, service ticket companies are trying to reduce resolution time or reduce incidents so your systems will be up and running all the time. People don't realize that an incident can cost you upwards of millions of dollars for every hour you're down. So if you're able to efficiently manage that, manage your AI-optimized infrastructure, you can keep your business running all the time. So, let's see. Andy, you should follow Andy on LinkedIn. You're constantly posting on LinkedIn. You do a lot of great research, you talk to a ton of people. So I want to thank you for coming in here. Really nice job today. Thank you, appreciate it. Thanks for having me. All right, that's a wrap, folks. Many thanks to Andy Thurai for his outstanding collaboration and input to today's episode. He and I talk all the time, we brainstorm, and he's just a great friend, a wonderful collaborator, and a super AI mind. I want to thank Alex Myerson, who's on production and manages the podcast, and Ken Schiffman as well.
Kristen Martin and Cheryl Knight help get the word out on social media and in our newsletters, and Rob Hof is our editor-in-chief over at siliconangle.com. He does some great editing. Thank you. Remember, all these episodes are available as podcasts wherever you listen; just search Breaking Analysis podcast. We publish each week on wikibon.com and siliconangle.com, and don't forget to check out thecube.net for all our events and videos and where theCUBE is going to be next. If you want to get in touch, email me at david.vellante at siliconangle.com, DM me at dvellante, or comment and reach me on LinkedIn. Pitch me, I'd love to hear ideas. If you've got a good one, I'll respond. If not, don't take offense, we just get a lot of inbounds. Please do check out etr.ai. They've got great survey data, the best, I think, in the enterprise tech business. This is Dave Vellante for theCUBE Insights, powered by ETR. Thanks for watching, and we'll see you next time on Breaking Analysis.