Welcome back everyone, live CUBE coverage here in San Francisco for Google Next. I'm John Furrier, host of theCUBE, here with Rob Strechay, heading up theCUBE Collective, our open analyst organization. We've got Dustin Kirkland, contributor here. Lisa Martin's here, we've got the reporters here. We're getting all the data for you and sharing it here, live on the show floor. Our next guest is Will Grannis, vice president and CTO of Google Cloud. Will, great to see you again. Thanks for coming back on theCUBE. Oh, it's great to be here. So it's great to have this last day to look back. Everyone's, I won't say burnt out, but, man, what a lot of energy. So much stuff here. Even headline news is getting buried a little bit. We talked about storage and other things. AlloyDB, Rob, you mentioned that yesterday, and I thought it was a great point. The headlines that were really grabbing everybody had to do with AI, AI infused in a lot of the products. It's an AI party here, it feels like. And people are pumped. And now we're relaxing and looking back. What are you seeing? What's your view of the show so far as we hit the last day here? What are the highlights you see? Because you had a lot of prep coming in and you know everything. What are customers seeing? Well, so first and foremost, you look around the floor: the energy, the partners, the ecosystem, the customers. The growth has just been incredible. And one of the things that really sticks out to me is that it's all forms of customers. So you have your digital natives, your AI natives, right? And they're coming for the infrastructure and the TPUs that you mentioned. But then you also have more traditional companies that are coming for a platform that allows them to make use of AI more quickly. And then you have everyone in between. I mean, like Wendy's, you know, it's right around here somewhere. Yeah, right over here.
And you know, that's an application of embedded speech-to-text in Chrome and Android, plus LLMs, plus Dialogflow. And so all the components of this cloud platform are coming together to create something of value. And you get free fries as a result. I mean, you can't lose. Well, I think if you got it right. I liked it. I went and actually did the demo, and I did a Baconator, because of course you've got to go with the bacon. And I said, give me a triple, no, no, a double cheese with tomato. And I had to get a Frosty, a chocolate Frosty, and a large fry. It came back. It did trip up on one thing: when I said triple, it actually added bacon. So it gave me more bacon. Which, I mean, that's not really a loss, is it? That's a feature, not a bug, right? That's a feature. It defaulted. That's what I learned. It defaults to more bacon. Which, that's a great default. Well, you know, having a team that's been involved in the prototyping, that may be intentional, maybe not. Who knows? Who knows? Yeah, just add bacon to everything. Exactly, it was awesome. So let me follow up with one. And this is one that I don't think I've seen many headlines about, but man, the demos right behind you, Will, are stunning. It's around Google Cloud at the edge. It's been a few years in the making. Tell us a little bit about that. Yeah, absolutely. So first, it's rooted in demand. Some of the earliest customers saying, hey, we'd really like to get Google Cloud and get that experience built in at the edge, were telco companies. And so, you know, we took the time to engineer it correctly. And as an early pioneer, you can appreciate what it takes to take the power that's inside a data center and bring as much of that as possible out to the edge. And some of the AI has to be driving that. I mean, that wasn't something we were really looking at. Absolutely.
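As an aside, the Wendy's ordering flow described above, embedded speech-to-text feeding an LLM and Dialogflow-style intent handling, can be sketched roughly as a pipeline from transcript to structured order. This is a toy stand-in, not the actual demo code: the menu, function names, and rule-based "intent" step are all invented for illustration, with the rules playing the role the LLM plays in the real system.

```python
# Hypothetical sketch of the voice-ordering pipeline: a speech-to-text
# transcript goes through an intent-extraction step (an LLM/Dialogflow in
# the real demo; crude keyword rules here) to produce a structured order.
from dataclasses import dataclass, field

@dataclass
class Order:
    items: list = field(default_factory=list)

# Invented mini-menu mapping spoken keywords to menu items.
MENU = {"baconator": "Baconator", "frosty": "Chocolate Frosty", "fry": "Large Fry"}

def extract_order(transcript: str) -> Order:
    """Stand-in for the intent step: map transcript words to menu items."""
    order = Order()
    for word in transcript.lower().split():
        item = MENU.get(word.strip(",."))
        if item:
            order.items.append(item)
    return order

order = extract_order("I'll do a Baconator, a chocolate Frosty, and a large fry")
print(order.items)
```

In the real architecture each stage would be a service call (Speech-to-Text for the transcript, a model for intent, a fulfillment backend for the order); the shape of the data flow is the point here, not the parsing.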
And to your point, not just raw compute and not just a little bit of storage, but also the control plane, observability, plus some first-party AI services that we've been able to package down for inference at the edge. I love the segmentation of the news too. You've got the technology platform with the TPUs and GPUs, you've got the solutions with the software, and then Duet at the top with Workspace. The thing I want to ask you, and I wanted to get your perspective on this when we were talking yesterday in our analyst segment: imagine this use case. We're in college, we're all sitting around the dorm room having a few beers and some Wendy's. And hey, let's build an app. We have an idea. What console do we pull up? Now think about the demographics that Google has. This is where we want to get your reaction, because we were saying that you guys can hit this new AI demographic. Let's build an AI app. If we're in college now, we were using Workspace in middle school. So guess what? We know what a comment is. We know how to use annotations. That is a use case where I think you guys have an advantage with the younger demographics, if that's the new CI/CD pipeline for coding. Absolutely. That changes the developer angle on AI cloud. What's your reaction to that? Can you share your thoughts? Sure. Well, as a CTO, part of my job is looking to the future a little bit. So I think what you've seen this week with Duet AI, especially in Workspace, is making that intelligent assistant an always-on collaborator that's there with people when they're in the experience. Now think about that, plus what we're also doing with Duet AI inside of GCP: for ops, for analysis, for security. Now imagine those two worlds blurring a little bit. That's what we're really excited about. So this year, it's about setting the foundation.
But buckle up, because to your point, the experience as an app developer, or as somebody who wants to get value out of the platform: they don't care whether it's GCP or Workspace. They know it's Google. They know we're a great place to build modern apps. And so if they're in Workspace, what's to preclude them from taking their idea, pulling it out of a spreadsheet, pulling it out of a document, and rapidly prototyping inside of a Colab notebook or the console? It's not much. Write the executive summary and just say, build. Yeah, the continuity there is absolutely remarkable. Yeah, the continuity from one end all the way to the other. Partners, the ecosystem, the SIs, everyone is here. And that seems to be changing. I'm guessing that multiplies the reach that Google Cloud has. Yeah, absolutely. And as you pointed out, I've been here for a while; I'm in my ninth year here in Google Cloud. And I look across this floor and it's 80%, 90% partners and customers: how they're using the technology, how they're actually ramping in, helping customers who maybe don't have all the digital skills within their own org approach and get closer to AI. A rough stat: it's probably around 100,000 to 110,000 new people trained in Google Cloud Platform and generative AI capabilities across our SI partners. So what that means is that any company that's looking for skills to help start a project, execute a project, complete a project, they're going to have plenty of options. Another thing I think about is how the lines are blurring a little bit between what is a partner and what is a customer. So Workday, SAP, a couple of world-class software companies where part of the relationship is migrating infrastructure from their own data centers to GCP. But now they're also using the relationship, and what they learn from it, to create entirely new capabilities, like in SAP's case some of their ESG offerings and automobile manufacturing offerings.
And in Workday's case, you saw the job description demo, conjuring a job description out of a knowledge base very easily. You're going to see more and more like that. We've been talking about ISVs that are evolving into what we call supercloud vendors, because they're building apps that look like a cloud but run on a cloud. So they're enabling people to build on a platform; it's an app that's got platform-looking features. Ecosystems develop, so you see customers that have ecosystems on top of yours. So they're an ecosystem partner and they have ecosystems. Absolutely. That's not an ISV. Right, it's kind of a new class of ecosystem. You know, it's interesting too, because if I rewind the clock back ten-ish years, nine-ish years, it feels very much like the origin days when GCP really started to catch scale, because we're seeing a plethora of enterprises starting to take those first use cases and really deploy them, but we're also seeing a very vibrant startup ecosystem. I was just over at the startup lounge yesterday speaking to some of these companies. Now, I can't use the company's name because they're private, but I met with the founder of one of these companies who said, listen, we decided to go all in on GCP roughly two years ago. We had basically zero dollars in revenue, right? They were living off of funding, and within two years, between late last year and this year, they're up to $10 million in revenue. They're going to be profitable this year, and next year they're projecting $80 to $100 million in growth. And it's that kind of growth that reminds me of the early days of GCP, where some of our foundational, digital-native startup customers exploded from kind of a niche-y app into world-class industry leaders. I'm seeing this all over again. I was going to say, that kind of leads into the stat that 70% of AI unicorns are on Google Cloud. Unpack that a little bit. Absolutely. Why are they here? There's two reasons.
One, usually those unicorns are all about AI-optimized infrastructure. So they want the fastest processors, and they want a platform they can get to easily. GKE Enterprise, by the way, I think is one of the announcements you mentioned that could benefit from a little more light. Because a lot of these digital natives are starting off with raw compute early in their journey, get the training done, do that at scale, but then maybe they look at Vertex for maintaining pipelines and more LLM ops. So the first reason is powerful, AI-optimized infrastructure. The second is our roadmap. They see us thinking about and building a future roadmap collaboratively with them. Imagine a world where ML infrastructure is fungible, where they just bring their training workload or their inference workload and we map it to the right type of processor and the right type of cluster for their use case. They see that kind of activity and they get really excited. I want to bring that up, because the reset that's happening in cloud is an opportunity. And we always talk about you guys as a great cloud, number three in the rankings, but. Number one in your hearts. No, but with the AI cloud positioning, you can hit the trifecta: you can win the new developer market, which is the startups here, the solution-oriented market, and the ecosystem. That's a trifecta that could propel you guys instantly and re-stack the rankings. And the question is this: back when the early cloud started, the alternative was the data center. So it was really obvious for Dropbox and Twitter and all the people of that generation that I knew were building to go to AWS. Now there's competition. What's the equivalent alternative that's not good? What are the issues? Is it data?
Because if you guys are going to win those developers, obviously demographics is a good alignment, but there has to be concrete value. The data center was time, cost, and hassle. Is that the same problem you guys take away, because the developer is going to go where the action is? Absolutely. We're in our dorm room going to build a product, and we have a brand that we trust, but it's got to be easy. Absolutely. It's got to stand up and ship the solution now. Yeah, it's a really great point. And it is a very subtle and nuanced point, in that one of the biggest pieces of feedback we've gotten this week is that programs like the Jumpstart program, which I think was shown during the developer keynote, really differentiate us. Now you can go from idea and concept to deployment, and you can get a map that has been used by others. And candidly, that's something we've also improved on over the years: not just a capability and documentation, but also here's a path through, and here's how you can succeed both in the early days and at scale. That's getting a lot of very positive feedback, and kudos to the team that's been building the Jumpstarts here at Google Cloud. Yeah, I want to jump back to those startups and that startup ecosystem, because it's super impressive to walk around here. I do see a little bit of a reset point, perhaps, in that, to John's point, many of these startups grew up on Google technology. If they're going to bet on someone to produce the infrastructure and tools that they need, especially AI-driven startups, it's all there. There's another piece, though, which is what you talked about: the blurring of customers and partners.
There are a bunch of potential startup competitors here, in fact, and I think it's really a compliment to the way Google enables some of these. But, you know, we talked to Elastic the other day, and there are a number of databases here that compete with many of Google's own: potential channel conflict. How do you think about that? How do you enable that and try to avoid channel conflict? Well, we start and end with the customer. So our primary goal is to make sure that our customers have the choice. They're the ones that choose; we don't choose for them. So that means we have to have fully featured databases, and it means we also have to partner with other database providers, so that if they want to take advantage of, say, scaled infrastructure under the hood using somebody else's database, we've got them covered. Go for it. But if they want vector search with AlloyDB, well, we need pgvector availability. So all of those little bits really just go to the point that customers are at the center of everything we do. You look at our roadmap now: it's being built collaboratively with our customers every single day, with those startups that bet on us early in GCP, and also, by the way, with large enterprises that we've been working with for many years. Deutsche Bank is a really great example. Because of their early work, some of it in synthesizing content for financial reports, we're now thinking about how we enable them to combine that in a way that is safe, secure, and performant, so that maybe in the future all financial services companies can take advantage of it. Well, in the last few minutes we have left, I know you've got a hard stop at the top of the hour. What does the roadmap look like? You mentioned your collaboration with the startups. What's the white space that you can share for people out there building startups? Right now the startup question is obviously the AI cloud.
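For context on the vector search mentioned here: pgvector's nearest-neighbor operator on a Postgres-compatible database like AlloyDB boils down to ranking stored embeddings by distance to a query embedding. A minimal pure-Python stand-in, with toy three-dimensional embeddings and invented document IDs (a real deployment would store the vectors in a database column and let the engine do the ranking):

```python
import math

def l2(a, b):
    """Euclidean distance, which is what pgvector's <-> operator computes."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Toy document embeddings; in AlloyDB these would live in a vector column.
docs = {
    "doc_a": [0.1, 0.9, 0.0],
    "doc_b": [0.8, 0.1, 0.1],
    "doc_c": [0.0, 1.0, 0.1],
}

def nearest(query, k=2):
    """Roughly: SELECT id FROM docs ORDER BY embedding <-> query LIMIT k."""
    return sorted(docs, key=lambda d: l2(docs[d], query))[:k]

print(nearest([0.0, 1.0, 0.0]))  # ranks doc_c and doc_a closest
```

The design point the interview is making is that the same query shape works whether the vectors sit in Google's own database or a partner's, which is why the pgvector extension matters for customer choice.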
You guys are kind of being perceived as the AI cloud. We called you the data cloud in the past, but that's a good differentiation and that's awesome. But as a startup, I want to pick a nice area inside the cloud and get a position. When the entrepreneurial wave comes, which it will, and you're already seeing it right now on the AI side, there's a lot of opportunity given your position and view into the roadmap. Where's the white space? What should startups think about if you had to offer some advice? Well, first off, all crystal balls are a little blurry. So let me just put that out there right up front. Second, there are so many opportunities related to AI, and generative AI in particular, and there are some patterns emerging that I think are really important for startups to consider. One is that it's highly unlikely, in my view, that generative AI will stand alone as a single model or a single synthesized piece of content. It needs grounding. It needs factuality. It needs trust. And so companies that are able to combine the best of generative AI plus authoritative data: you're starting to see this already, with some companies building their own versions of generative AI using authoritative data sources. The more these companies focus on the market they know really well and the key piece of intellectual property they have, and then bring that into a place where it's tied to generative AI and tied to some type of authoritative, trust-building scenario, I think that's going to have a lot of legs, a lot of demand. And on the app side, you see the DevOps lines blurring. Is that an opportunity area too? Yeah, absolutely, because our job is to build a multi-tenant platform.
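The grounding pattern described here, pairing a generative model with an authoritative data source, is essentially retrieval-augmented generation. A hedged sketch with a stubbed retriever and no real LLM call; the knowledge base, topics, and function names are all invented for illustration:

```python
# Retrieval-augmented generation in miniature: fetch an authoritative
# passage first, then build a prompt that ties the model's answer to
# that trusted source instead of the model's parameters alone.

KNOWLEDGE_BASE = {
    "refunds": "Refunds are processed within 5 business days.",
    "shipping": "Standard shipping takes 3 to 7 business days.",
}

def retrieve(question: str) -> str:
    """Toy retriever: match the question to a topic's authoritative passage."""
    q = question.lower()
    for topic, passage in KNOWLEDGE_BASE.items():
        if topic.rstrip("s") in q:  # crude stemming so "refund" matches "refunds"
            return passage
    return ""

def grounded_prompt(question: str) -> str:
    """Build the prompt a (hypothetical) LLM would receive, source attached."""
    context = retrieve(question)
    return f"Answer using only this source: {context}\nQuestion: {question}"

print(grounded_prompt("How long do refunds take?"))
```

In practice the retriever would be a vector search over a company's own documents (the "key piece of intellectual property" mentioned above), and the prompt would go to a hosted model; the structure, retrieve then generate against the retrieved source, is what provides the factuality and trust being described.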
So these extensions, these use cases that go out and reach into a very particular scenario in any given industry or any given use case, we're really counting on our customers and our partners to build that, and we're counting on these startups to build it. So a verticalized version of an app, in healthcare, for example, that would really unlock the ability for a doctor to spend more time with patients by using generative AI, live captioning or live transcription, and then getting that right into an EHR. Those are the types of things that are really well suited to startups; we're just going to try to be the platform and the capabilities they build with. I was joking with Rob that I think it might actually be his doctor's information that screwed up his Wendy's order, maybe proactively. That added the bacon restrictions. Yeah, firm restrictions. So, I remember the original Google Cloud, when Compute Engine came out. Yeah. All the original Googlers were still there; they hadn't moved on yet. Craig McLuckie, I think, was around then. He was the product manager on the product. So many amazing people have laid the foundation for today, so many. It's really fun to watch, and I was kind of waiting for the AI moment, because I've been saying that Google has so much AI. I live in Palo Alto, so I know a lot of ex-Googlers, I know the founders, and I know for a fact there's a lot of AI in Google. So it was just a matter of pulling it together. It seems like now it's retail time: get the store open, here's the AI, it's on the front lines. You guys did a good job there; love the clarity on the ecosystem and the solutions. How would you stack rank the top three most important stories of this event?
Well, number one, no surprise I'm going to say this: number one is the depth and breadth of customers that are succeeding with generative AI on Google and Google Cloud. Look around, this place is full. So that's absolutely number one. Number two is the depth and breadth of the platform that allows those customers and partners and ecosystems to succeed on whatever path. We talked about it already: if you're a generative AI native, you get to come in through high-performance infrastructure; if you're a company looking to combine some of your data with a state-of-the-art foundation model, and combine those securely, maybe you come in through Vertex. So a fully featured platform is here. I think the third, and it hasn't been explicitly stated, but it's more implicit, and you alluded to it, is all of Google mobilized for AI. So there are capabilities you see here that have come from various parts of Research, DeepMind, Labs, and Google Cloud, all at the point of the customer and in service of the customer. So we're on the move. Will, thank you for taking time out of your valuable schedule to do a wrap-up segment with us here at Google Next. It's an honor. We've got a few more interviews, great to see you. And we're going to have to unpack this. So much to go through post-event, after everyone chills out a little bit. We'll come down to the Googleplex and unpack these great topics. You got it. Thanks, Will, appreciate it. All right, thanks guys. That's the wrap here from theCUBE, live inside San Francisco at Google Next. We'll be right back with our next guest.