From theCUBE Studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. At Cloud Next, Google showcased its strong leadership position in AI. In our view, Google's messaging, its demos, and tech-centric narrative have broad appeal for developers and next-generation startups. As well, the company's focus on solutions contrasts with the typically disjointed services that we've seen coming out of AWS over the past decade. Google also showed off an expanded ecosystem of GSIs and smaller cloud service providers, encouraging the broad use of Google's kit globally. While Google remains a distant third in the cloud race within IaaS and PaaS from a revenue perspective, it's one-fifth the size of AWS, for example, it is playing the long game and betting the house on AI as a catalyst to its cloud future. Hello and welcome to this week's Wikibon Cube Insights powered by ETR. In this Breaking Analysis, we unpack the key takeaways from Google Cloud Next with Rob Strechay and George Gilbert. We'll share ETR data that positions Google's AI relative to other leaders, and we'll contrast Google's data-centric strategy with traditional architectural models. We want to start with the key takeaways from Google Cloud Next. John Furrier called it the Google trifecta: developers, solutions and ecosystems. They showed off a very rapid pace of advancement with gen AI, co-pilots everywhere with low-code and no-code solutions and frameworks for developers, and Google accelerated compute, which brings more optionality in a world of GPU shortages. This enables gen AI across all of its products, and the caveat, of course, is you've got to use the Google stack. Google introduced the most advanced integrated data and AI platform that we've seen to date, showing a data-centric versus a DBMS-centric vision, which we'll discuss. 
We also saw a major focus on security, full DevSecOps, just as Microsoft is embarrassed again with another security failure. Let's start with you, Rob. You were at the show, 20,000 attendees at Moscone, sold out. How was the show? What would you add? So, first off, the show was fantastic. It had a ton of energy, and if I contrast it to the week before with VMware Explore, the show floor was packed every moment of the day. All the demos, all of the different showings, like they had a Wendy's drive-through where you could actually see the AI in action. It did give me the wrong thing; it added extra bacon to my order when I did that. Which is good; if it's gonna go wrong, adding extra bacon is always the thing to do. And part of it was, they talked about their process for putting the demos together. We actually had the CTO for Google Cloud on, and he was talking about how that may have been on purpose. So I'm good with that. Overall for me, the drumbeat of the developer, to John's point about being developer-centric, was key to where they were going, talking about pluggability, open source and APIs, and also reuse of extensible layers, or extensibility layers. I think those were the three things they kept coming back to, and it was killer. Now, George, you were watching remotely, as was I. We're gonna go deeper into some of the thoughts on architectures, but any other takeaways that you had? Well, my big takeaway is that gen AI is the biggest accelerant the tech industry has ever seen. We've gone through big platform transitions, GUI, the browser, mobile and the cloud, but we've never had a technology that accelerates both the demand for applications as well as the ability to build, and essentially add to the supply of, applications. Most of them have accelerated demand, but this turbocharges your ability to build new applications, as we'll get into. 
And really, we're only constrained right now by the supply of infrastructure, Dave. As you mentioned, that's been an NVIDIA bottleneck, but more than anyone, I would say that Google, having anticipated the need for acceleration, built out an infrastructure, not just of now fifth-generation TPUs, but an infrastructure that knows how to put them together. And that's why they have the capacity to put gen AI across all their products, whereas Microsoft internally was rationing access to GPUs. And as we'll get into, I think that's why gen AI was pervasive; we saw it everywhere within their products. Okay, so it's well known that Google is a distant third, I think I said one-fifth the size of AWS, but let's take a look at some ETR data that shows the overall Google spending profile. Let me explain this chart in some detail. This shows the granularity of ETR's proprietary net score methodology. Net score is a measure of spending momentum. The lime green boxes show new customer adds. You can see in the July survey, 8% of the customers surveyed were new to Google. And then the forest green represents customers spending 6% or more; that's 37%. The gray is flat spending. That's a big chunk of Google's profile, as it is with most vendors these days. And then the red, sorry, the pinkish area at 7%, that's spending down 6% or worse. And then the 4% bright red, that's churn. You subtract the reds from the greens and you get a net score of 34.8%. That's that blue line that's sort of been descending but then popped up after the April survey. The yellow line is pervasiveness in the data set. In other words, it takes the total N of Google divided by the total N of the entire survey, which is around 1,700. And that measures sort of the share within the survey. But watch what happens if you isolate only on Google's AI performance. 
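The net score arithmetic described above can be sketched in a few lines. Note the rounded bucket percentages read off the chart sum to 34 rather than the reported 34.8; we assume ETR computes the published figure from unrounded survey data.

```python
# ETR net score: (% new adds + % spending up 6%+) minus (% spending down 6%+ plus % churn).
# Flat spenders count in the base but neither add to nor subtract from the score.
def net_score(new_adds, spending_up, spending_down, churn):
    """Return net score in percentage points."""
    return (new_adds + spending_up) - (spending_down + churn)

# Rounded figures from the July survey chart, not ETR's raw data.
google_overall = net_score(new_adds=8, spending_up=37, spending_down=7, churn=4)
print(google_overall)  # 34, vs. the reported 34.8 from unrounded responses
```

The same function applied to the AI-only cut of the survey would yield the 52.1% figure discussed below, given the unrounded bucket values.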
So you see here, net score bottoms in October 2022, a month before ChatGPT is announced. We were seeing the slow deceleration of AI/ML spend post-pandemic. And then Google responds with Bard and other AI tooling throughout the year, and you can see its net score jumps to 52.1%. Now, we should note that despite Google's AI prowess, it still ranks lower than the other big cloud players, Microsoft and AWS. We're going to show you some comparison data in a moment, but Rob and George, Google's momentum is headed in the right direction, and based on what we saw at Google Next, they intend to make inroads. Rob, your thoughts? Yeah, and I kind of said this a little bit while I was out there doing some of the wrap-ups, but I think that their ability to put AI everywhere is because of their architecture and how their compute layer, their storage layer, all of their applications and solutions are put together. And I think that allowed them to put it across Workspace, similar to what Microsoft is doing with Office 365 and OpenAI. But I think it was even more pervasive across even more of their solutions, because they own all of those solutions, and they are solutions on a single platform that has been architected in a way to take advantage of it. I think that is something that I don't know how AWS is going to tackle, because basically all of the 300-plus, 380-plus services they have are kind of individual children, and they have to have some common layer. So maybe they can go and put it in the console, you can have it with tech docs and stuff like that. They'll probably have to build something into it; each service will probably be required to adopt it. But I think it's going to be hard for AWS to really step up and be either number one or number two in this part of the market. Well, I mean, it definitely has played off of its SageMaker expertise. 
There was an article in The Information this week about sort of inside how AWS is behind in AI, and Charles Fitzgerald came out with his sort of weekly blog kind of crapping on AWS and the PR people there, calling it the "gen AI-less company" or something like that. But I think that sort of actually understates AWS's AI chops, and I do think it's early in the game. But George, your point was that because of Google's infrastructure, Google is not as constrained on GPUs, not as reliant on GPUs for training and developing some of these models and using its AI. Can you explain your premise there? Yeah, Dave, I want to build, though, on something that Rob said about AWS having hundreds of these independent services. That's a different mindset that AWS has compared to Google and Microsoft. And I think that's why AWS was caught a bit flat-footed and why they're behind in at least gen AI, because both Google and Microsoft saw this coming, and they considered all their other services part of a software platform. They have a software platform mindset, and so they replatformed those services all to take advantage of gen AI. Whereas AWS has a hardware platform mindset and hundreds of services to make the hardware more useful, but when a new software technology came along, they did not see it as a strategic opportunity to replatform all the services. And so, one, they are constrained now in the infrastructure they have available, the GPU accelerators that they have available, but they also are behind in the core tooling that all those services would use to embed the equivalent of co-pilots. So they're still struggling to get out Bedrock, their basic gen AI development and customization tooling. And without that, they can't really get started embedding it in all the other services. So there was the mindset problem, there was the delay problem, and then there's the infrastructure rationing problem. 
You know, John Furrier on theCUBE this week talked about how most developers who are in their 20s grew up using Google Docs and Gmail, and they have an affinity toward Google. I think Google said that, I don't know, I think it was 70% of the unicorn AI vendors are using Google. There was sort of this undertone of AWS as the boomer cloud. I believe he even said that, that it was the boomer cloud. And to George's point, having been at AWS and seeing how all these services are put together and all layer-caked on top of each other, I think it's going to be difficult for them to catch up in this. And again, AWS also doesn't play for third place; they always play for first. So it'll be interesting to see how they really approach this going into re:Invent in the next couple of months here. Well, speaking of the horses on the track, here's a comparison with the AI leaders, well, not all of them, but many of them that are in the ETR data set. The chart shows net score, or spending momentum, on the vertical axis and the penetration in the data set, or overlap, on the X axis. There's that inserted chart on the bottom right that informs how the dots are plotted. And the key points are, I mean, first of all, look at OpenAI. It's got an 88% net score and it's got 314 citations. So these are IT decision makers saying we're using OpenAI tooling, and that is second only to the ever-ubiquitous Microsoft. And of course Microsoft's AI is OpenAI, so you could probably double-count that. But this is really people responding directly on OpenAI and then directly on Microsoft. So you could probably add them together, and most of that is OpenAI. But Microsoft has catapulted itself into a leadership position with that relationship. AWS, as I said, has been a major player with SageMaker, but that's kind of like yesterday's AI. 
And now with Bedrock and Titan and other gen AI tooling, it's still showing strong momentum. It's got a big install base and people using it. Databricks shows very strong in this chart. They've got spending momentum, with a net score above that 40% red dotted line; that's always an indication of a highly elevated net score. And you can see Google's progress with that squiggly orange line, a very impressive move to the right, although I would say Microsoft made a big move to the right as well. And then you see the rest of the pack: SparkCognition, DataRobot, Dataiku, Anthropic, who got some press at Google Cloud Next, H2O.ai, C3 AI is in there. And interestingly, you see IBM and Oracle in the mix. They're obviously lower on the momentum axis, but they show up in the survey. So guys, the point here is that while Google is strong, it still has a lot of work to do to catch up in market performance. Will Google's strong technical position in AI, in your opinion, change the game in cloud? If not, why not? And if so, Rob, when? Yeah, I think what's interesting about Google is instead of trying to play the IaaS and PaaS game, like AWS was forcing Microsoft to kind of play, they've kind of said, let's go to our home field advantage, which is AI. And they went back to solutions; they're doubling down on Workspace. I think it does help them. I think that it helps with their momentum. I think this was a big coming-out party for, hey, we're the AI cloud. I think you and I had talked about this; the only people who'd really talked about LLMs as a service was HPE and the one cloud they're building. This was LLMs as a service. I mean, the Vertex AI stuff that they're doing definitely was a way for them to push in on that and, I think, strengthen their position. I think it will definitely help them in cloud, and I definitely think it will help them in the IaaS and PaaS stuff as well. 
George, let's get into some of your graphics here. I want to dig into some of the announcements and the architectures Google showcased at Next. Here's a slide that George put together. The pyramid up top underscores the evolution of data apps. We've often used Uber and Amazon.com; these are leading examples of highly advanced companies that have a lot of engineers and deep technical expertise. And George, there's a lot of data on this chart. Maybe you could explain it, and both George and Rob, address whether you think gen AI will accelerate the industry more than, for example, George, you were talking before about the GUI, the browser, mobile phones, the cloud, and how you see Google relative to the competition. George? Okay, so let me start by drilling a bit into why this is an accelerant like we've never seen before. On the demand side, simplifying the user interface takes applications into contexts and use cases the industry hasn't reached before: GUI, browser, mobile. Only this time, we've really only seen part of the advance on the demand side; by the end of the year we're probably gonna see agents. These go beyond the chat interface, where they can actually take action on behalf of the user. It's like a Siri that works and does stuff for you. But on the supply side, as we've talked about, it turbocharges software development unlike anything we've ever seen before. And just in terms of where Google is relative to the other vendors: first, gen AI favors the hyperscalers, because they have hundreds of building blocks that gen AI can glue together as part of platform engineering. Eventually we'll be able to take a multi-vendor stack and accelerate that as well, but that happens only once the market agrees on a multi-vendor stack. Where Snowflake and Databricks fit in is they've been the data and AI layer on Amazon and Azure for the most part, because those vendors have relatively weak data and AI layers. 
Specifically on AWS, we talked about how they're constrained with their infrastructure capacity, and they're really primitive with their gen AI development tools. They've not even shipped them yet, and there's been reporting in the press that even the reference customers they cited are saying it's not ready for prime time. Compared with Azure, Google is way ahead in terms of development tools and security for gen AI, and we'll talk about this more when we get into the data and AI layer. But Microsoft emphasized at their Build conference more the embedding of co-pilots in desktop productivity and low-code development tools. And they had it somewhat in their Fabric data layer, but Google was much more pervasive in putting gen AI as an accelerator in the coding of all their services. That might be because they're less capacity-constrained on the infrastructure side, or it might be that they're just farther ahead in development. So Rob, on this chart, George has a notation on BigQuery. Obviously Google wants to push BigQuery as a primary data platform. And we reported last week, we talked about how Snowflake's got momentum in AWS and now Microsoft, but probably not Google next, to play on the name of the conference, because Google really wants to push its own data platforms. Having said that, that's probably why you don't see as many database companies at Google, although you do see maybe some smaller booths from Snowflake and Databricks, as you mentioned. But then there's this other notation here on governance. What was that governance ecosystem like? Yeah, I think the governance ecosystem was actually pretty robust. And I think that was one of the things, the data quality aspects. Even though Google has its own tooling and was talking about that, I think you saw a more robust ISV ecosystem, I guess you could say, around data quality and data governance, and data security is still there. I think when we talked to the Googlers about it, where is the white space? 
And I think that is definitely still part of where the white space is. I think also with George's pyramid there, with PaLM at the bottom and Looker and BigQuery and Vertex AI, it's not just PaLM that they have there; it's Cohere, it's AI21 Labs and their models. So I think it's beyond just their own models, and they've brought a lot of the pieces together, and so they are building an AI ecosystem. To your point, Databricks was really late to the game going to GCP and being able to sit on top of their object storage, I guess you could say. They only launched, I think, towards the tail end of last year, just getting onto Google. So I think, again, it's probably because of demand, and George's point was really key and actually made me think just now that, hey, maybe this is an opportunity for Databricks with AWS to really be that platform layer for AI for them, because I could see that being a huge play. And if I were Databricks sellers, I'd go be buddies with my AWS sellers even more so than I am now. Yeah, I mean, if Google is really kind of pushing, for instance, Snowflake and maybe Databricks away, it opens the door for both Databricks and Snowflake with AWS to come up, to provide a stronger product perhaps than they do with the Microsoft cloud, with Azure, where you get better security and so forth. But it's an interesting dynamic, and Google very clearly is doubling down and wants to really push its data stack. All right, this next graphic is sort of the flywheel chart and talks to how Google's product portfolio across infrastructure, data and AI reinforces itself. And guys, let's talk about the various tools Google showcased at Next, in particular Vertex AI, the framework, and Duet, which is more of the solution-oriented chat capability, and how they fit and are embedded within Google's growing portfolio. 
So George, start by explaining the key points of this chart, and in your view, did Google make a strong case that its data and AI platform is more advanced relative to the competition? In a word, yes. And let me hit the high level of the four points and then drill down on one, and we can go point by point interactively. But most important, we're seeing a profound shift to a data-centric architecture among many of the platforms, where the data is center stage, there are no pipelines, and multiple compute engines all talk to one system of truth. This is a profound change that we're seeing. And part of the reason is cost effectiveness. That's the thing we missed, or I missed, before, which is that it's too expensive to go through one DBMS as gatekeeper, even if that DBMS is giving you transactional integrity that multiple compute engines can't offer with equivalent functionality. The big universal storage here is BigLake. It actually manages all data types, structured and unstructured, and it runs also on Amazon and Azure. And the fact that you have universal storage means you have no pipelines and no silos; all the different engines, whether it's BigQuery, Vertex, Dataproc or Spark or the streaming or third-party engines, can all access, manipulate and enrich this one system of truth. And what's going on is we're unbundling the DBMS, and it's a profound shift in data applications architecture, where all the compute engines share one storage engine and the DBMS is no longer the gatekeeper. Now, there's an advantage to having a DBMS-centric architecture as that system of truth: you get greater integrity. But customers seem to be choosing choice and cost effectiveness and losing some of the transactional integrity. Let me stop at that point, and then we can go through this and the other points one by one. So Rob, anything you'd add to this? 
Yeah, I think, again, this really shows off where Duet was one of the big advances for them: this chatbot across everything, this simple, easy way to get from Workspace all the way to Vertex AI to Colab notebooks, and really have a simple way of using the tooling. And I think this plays to the position it was in on the pyramid as well: again, it's more a solution for the masses versus just being for experts. If you're an expert, you can go in and use it basically in expert mode, and you still get a lot of the differentiated value that Google is providing even if you're not as advanced. And I think that to me was the key: they know there are limitations to what Duet can do. So they had Duet working in one of the demos, Duet working in coordination to help build an app inside of Workspace that then built an app inside of Vertex to do some image recognition using the PaLM model. So I thought it was really good how they were bringing this all together. I think the story just hits. And on George's comment about having that common storage layer, they actually didn't hammer it as much as I thought they would, and they weren't hammering it as much in their keynotes as well. I was kind of surprised about that. We had the head of storage on, and he hit it. He talked about files; I guess they had an announcement around files that wasn't in their prep docs or anything like that as well. So yeah, they buried that. So, as I said on the thing, it made my head explode in a good way when he actually mentioned storage for the first time. But I think, again, their goal is offer a solution, offer an app, not IaaS or PaaS. Although you could call Vertex AI a PaaS, I guess. But the low-code, no-code movement, I always felt like it was kind of elusive, but it really was in full force at Google Next this past week. 
All right, bring up that slide again, Alex, if you would. George, you've got some other key points here I think that you wanted to hit. Go ahead. Yeah, let me build on something both you and Rob said, one about Duet and one about low code, and they're related. Again, Duet AI is their co-pilot; Microsoft has sort of popularized the concept of the co-pilot. It's a coding accelerator. It generates code, it explains existing code, it synthesizes help. It's a tremendous productivity accelerator for developers. And Dave, to your point, they emphasized the low-code tool AppSheet with Duet very heavily in the keynote. And I think they're probably feeling some pressure, because co-pilot was used so pervasively throughout the low-code Power Platform. And Power Platform for Microsoft is now essentially the application-building layer that's low code, and it's used for Dynamics, it's used for Office and it's used for Azure. That was a very strategic bet by Microsoft, that you would have this application development environment for non-professional developers, and they are now accelerating it with co-pilot. So we saw heavy emphasis on AppSheet, which is a good deal more primitive, but Google feels like they need to showcase this capability. But taking Duet, they also showed it within BigQuery. So where you're doing things like cleaning and preparing and discovering and visualizing data, you might be using SQL or Python in a Spark notebook; all of that can be accelerated by Duet as their code generator. Databricks showed something similar with LakehouseIQ, but that was more of a demonstration, a very advanced demonstration, but not shipping yet. So they're trying to show something different, where they understand more about that data that an LLM has been fine-tuned on, and that can help you build not just queries, but full applications. 
You know, just a quick aside, watching from afar, they were trying to convey that they were kind of the cool kids, right? There were the demos, there was a lot of enthusiasm, which oftentimes you don't see at Google, you know? A lot of times you see really technical people out there and they're not too excited, but they had a lot of energy, as you pointed out. I think the other thing too is, in some of the early keynotes, they sort of depositioned the legacy players, I think positioning Amazon and Microsoft as the old guard, to use an Amazon term. And Kurian even said that in their conversations with customers, they want to work with the leading-edge technology. So it's almost like, you know, if you can't fix it, feature it, right? Because Google's been criticized for the lack of sort of go-to-market prowess; they just sort of always focus on the technology piece of it. They're really trying to turn that into an advantage. Yeah, no, I think that they really put full force into, hey, where you want to build your apps, your kids are gonna want to build their apps there. I even said it: my son, who's at Arizona State getting his CS degree, he's almost never used Amazon. He actually would say, hey, that's your boomer cloud over there that you worked for. But I think it's actually really interesting. I know he's used Git, and so again, I think it's gonna be a fight for the hearts and minds, but I don't think he's ever used Word on a laptop, you know, independently. So I think, again, you get to these: Workspace, AppSheet, how the AI and Duet tie together. I did ask, because in George's slide there where it had Looker and BigQuery with Duet, what was interesting is in the demo they showed us, on a little tour they gave us of the demos of Looker and BigQuery with Duet, what was funny is that I asked the question: did you have to have Looker on top of BigQuery to get the advantages of Duet? 
And one person said yes, and one person said no, and I was like, so I think they're still trying to figure it out. I think, to that go-to-market point, there's still some work to be done on what pieces need to be there to get full advantage, and I think that'll be key. Well, they're obviously pushing Looker, although the ETR data shows that while they're pushing it, the momentum is dropping. At the same time, BigQuery is the dominant platform. We were looking at the ETR data, some of the double-clicks that we're not showing here, but BigQuery is the dominant platform today versus some of the operational databases, and in terms of adoption, it's very, very high. George, were there other parts of this chart you want to hit on, or should we move on? I do want to drill down quickly on bullets two, three and four, but first I want to add to what Rob said about boomer clouds. I think what's going on with the energy, Rob, that you observed at the show is that Google is becoming the go-to platform for tech-centric companies. Amazon still doesn't have a data platform, and if tech-centric companies were using Amazon as a platform, they were either using Snowflake or Databricks. But Google's data and AI platform now is so strong that a tech company that in the past decade would have grown up on Amazon now will grow up on Google, and they'll probably move their data-centric workloads to Google. All right, George, hit on the points that you want to make on this slide. Let's wrap this up. Okay, just really quick, as an example of how strategic and coherent the platform is: BigQuery uses Vertex, Vertex uses BigQuery, and Dataplex, the governance layer, underpins it all. Really quickly, within BigQuery, it can call out to Vertex without moving the data, without any pipelines. It can call gen AI models that might generate a personalized email, or might summarize documents or extract structured information and then enrich a structured record. 
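As a rough illustration of that last point, BigQuery ML exposes remote Vertex AI models through SQL, so an enrichment like the document summarization George describes can stay inside the warehouse. This is only a sketch of the pattern, not Google's demo; the project, dataset, table and model names are hypothetical placeholders.

```python
# Sketch: enrich rows in place by calling a Vertex AI model from BigQuery SQL,
# with no pipeline moving the data out of the warehouse.
def summarize_query(table: str, model: str) -> str:
    """Build a BigQuery ML.GENERATE_TEXT statement over a documents table."""
    return f"""
    SELECT doc_id, ml_generate_text_result AS summary
    FROM ML.GENERATE_TEXT(
      MODEL `{model}`,
      (SELECT doc_id, CONCAT('Summarize: ', body) AS prompt
       FROM `{table}`),
      STRUCT(0.2 AS temperature, 256 AS max_output_tokens))
    """

# Hypothetical names; running this requires a GCP project and credentials, e.g.:
#   from google.cloud import bigquery
#   rows = bigquery.Client().query(sql).result()
sql = summarize_query("my-project.demo.documents", "my-project.demo.summarizer")
print(sql)
```

The point of the pattern is that the model call is just another SQL function over the shared storage layer, which is what makes the "no pipelines" claim concrete.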
This is how the pieces fit together, and this is something Amazon can't do, and frankly, so far Microsoft hasn't been able to show it, although they're a little closer. I'll leave it at that, and we can move on. Great, thank you, George. A couple of things we didn't hit on, Rob. I mean, there was a big focus on security with Mandiant. Mandiant has its own show coming up in a month or so in DC; Google acquired Mandiant late last year. So that was a big focus, and then the other is multi-cloud. I think it was either Sundar or maybe it was Thomas who talked about their networking, cross-cloud networking across any clouds, a networking supercloud, if you will. Touch on those two and anything else that stood out. Yeah, so I thought the Duet AI in Mandiant for threat intelligence, again, bringing the co-pilot to the SOC, the Security Operations Center, I guess you could say, made a lot of sense. Especially when we talked to the folks from security, the VP of security for Google Cloud, and we had this discussion about how there's still a skills gap, and that you need to bring these AI tools because the bad actors are using these types of tools as well and going faster, so you have to be able to respond faster. AI is helping with that, bringing it to Chronicle, bringing it to the Security Command Center. They also had Mandiant Hunt for Chronicle, and they have some more stuff. And we'll be at Mandiant's event, in fact, I'll be at their mWISE conference in DC in a couple of weeks, and we know that there's gonna be even more coming out at that. I think it's critical that they brought the AI, and this goes back to George's original point. I mean, it's a platform with AI in it, and it's not just a strung-together bunch of co-pilots. I also thought, to your point on the networking thing, it's very interesting. I think they're looking at it as: how do we move data between places faster? How do we help that? 
It was kind of, I guess you could say, an homage to the AWS Cloud WAN stuff that Amazon did last year, or two years ago even, and I think that networking is still one of the biggest problems in cloud; helping solve that is definitely gonna help. I was actually surprised there weren't more partners in that announcement, and I think maybe that's still coming, and we'll start to hear some more about that. The other thing, Google Distributed Cloud, I personally think was one of the ones that flew under the radar. They had AlloyDB Omni, which can actually run disconnected, not in the cloud, on any Docker container instance, so you can run it on your laptop. The fact is that they're bringing more to the enterprise, and it's not just hardware that they're bringing, but a full stack. They talked about GKE Enterprise. I kind of said, well, that kind of spells the death of Anthos. We'll see how that plays out. I think GKE Enterprise, multi-cluster, is still gonna have a tough time competing in that market. I personally think that Red Hat is just so far ahead with OpenShift, but we'll see. I mean, if anybody's gonna be able to do it, it's the guys who created Kubernetes. And you also saw, it's about time, but you saw a bigger presence from the GSIs: PwC, Deloitte, Slalom, Cognizant, probably missing some. TCS, Wipro, Infosys, yeah, everybody was there. And most of them were on with us, and from Deloitte to PwC to HCL, we had deep discussions about why now, why invest now, a billion dollars. PwC, Deloitte, Infosys, they all invested, and I think everybody uses a billion dollars. I guess that's the nomenclature now, to build a dedicated business unit. Over one year, two years, five years, 10 years, or is it a billion? It's a billion. It's a good number to throw out there. 
So I thought that it was good to see them there, and I think that really helps with the ecosystem, because this, to me, brought it back to what they're strong at, which is the people and the process. Technology's gotten easier with AI. So we had the head of AI and data from PwC on, and we talked about that, and about how they can now focus on bringing really strong processes and people to that AI to make companies move faster. And I think that was key to what they were doing there. All right, George, we'll give you the last word. Bring us home. I think, reinforcing the theme we've been talking about for a while, we're seeing a shift from building cloud apps to building cloud data apps that are intelligent. In other words, you're using data coming from the real world to program, and then to be informed by, AI models that control or instrument people, places, things and activities. And I think we're witnessing the emergence of this platform. So it's a molting of the basic infrastructure and the emergence of a data and AI platform as the new place to build applications. And I think Google has a very coherent and powerful story. I think it's probably the leader among tech-centric companies. I think they're limited by their go-to-market reach in terms of reaching mainstream enterprises, but I would say it's now a three-way battle between Google, Databricks and Snowflake, and then Microsoft and Amazon are still in catch-up mode. Guys, thanks so much. George, and Rob, flying the red eye in for the Breaking Analysis, good job. I really appreciate you guys' time and insights, and I look forward to the next show with y'all. All right, many thanks to Alex Meyerson, who's on production and manages the podcast, and Ken Schiffman as well. Kristen Martin and Cheryl Knight help get the word out on our social media and in our newsletters. Rob Hof is our editor-in-chief over at siliconangle.com. Remember, all these episodes are available as podcasts. 
Wherever you listen, all you've got to do is search Breaking Analysis Podcast. We publish each week on wikibon.com and siliconangle.com. If you want to get in touch, if you've got a pitch, email me, david.vellante at siliconangle.com. We get a lot of inbound, so don't be insulted if we don't respond. DM me at dvellante or comment on my LinkedIn posts, and please do check out etr.ai, the best survey data in the enterprise tech business. They're continually adding more to their taxonomy and doing more drill-downs. This is Dave Vellante for theCUBE Insights, powered by ETR. Thanks for watching, and we'll see you next time on Breaking Analysis.