Good afternoon, Cloud Community, and welcome back to fabulous Las Vegas, Nevada. We're here midway through day one of Google Cloud Next. We've got three days of coverage, power-packed with over 35 different segments. My name's Savannah Peterson, joined by analyst Rob Strechay for this one. Rob, this is a fun one. Is this your second Google Cloud Next? It's my third. Your third? Yes, yes, but this is my second one with theCUBE, and it seems like we were just here yesterday, but the amount of announcements since the last one, last fall, is just massive. It's unbelievable. It absolutely is massive. 30,000 people here. Our next guest is one of the folks behind the infrastructure at Google. Sachin, welcome to the show. It must be a thrilling day for you. It absolutely is, and I'm super excited to have this time with you guys. Yes, we're grateful you could make the time. Big series of announcements coming out from your division today and over the course of the week. Can you give us some of the highlights? Yes, I want to echo a little bit more some of the things that Thomas mentioned in the keynote. The best place to run AI is clearly in our public cloud regions, but there are reasons why sometimes customers can't use those regions. That could be a regulatory need, a compliance need, or a survivability or latency-driven need that forces the deployment to remain on-prem or at the edge. And that's why we introduced Google Distributed Cloud. We're seeing tremendous momentum with customers both in completely air-gapped environments, where the data must stay on-prem and the operations are also air-gapped, with no connectivity to the internet ever, as well as in connected environments like retail stores that may have hundreds or thousands of locations but still need continuity of critical services on-premise.
So, super excited about the momentum there, and also super excited about what we announced at the last Next, actually, which is cross-cloud networking: helping customers simply and securely get the best out of AI models and out of data, regardless of where that data resides today. So, you know, very, very excited about those. How has cross-cloud networking evolved since that announcement was made eight months ago-ish? Yeah, so cross-cloud networking is really about how we help you build distributed applications and how you secure your workforce: how can you connect every single one of your locations into our backbone, leverage the power of our backbone, but bring your security stack of choice? That's where we partner with Palo Alto and many of the other players, so that customers don't have to compromise on security at all. There are maybe three things I'd like to highlight in cross-cloud network that we're announcing here. One is that cross-cloud networking is now service-centric. What I mean by that is we had a technology called Private Service Connect that allowed you to have, think of it as a proxy, in front of your Google Cloud services and applications. Now we extend that to other clouds and to applications and services running on-premise. So customers can have one consistent way for NetOps and SecOps to set this up, and DevOps can move much more quickly. That's number one. The other thing we're doing: you may have your data in one hyperscaler, or on-prem, but you may want to be leveraging Gemini as the model. So how do you connect that cost-effectively with the right SLAs? Cross-cloud network helps customers with that. And the third one is inferencing, serving these models for all those wonderful enterprise use cases that we talked about. You want to make sure that you're using that ML infrastructure very effectively, GPUs and TPUs. And that's where we help you with our load balancer.
It's now AI-aware and can help you optimize the experience for customers as well as lower costs. So lots of great innovations in CCN. Yeah, let's unpack that a little bit more, because the service-centric piece just went GA, right? When you start to look at how organizations are building out their applications, they either want to bring the AI to the app or to the data, or the data to the AI. It would seem that you're helping them do both: choose where data lives, what kind of lakes it lives on, and bring the AI to it in certain circumstances, and then with the Distributed Cloud portion as well. But you're also making those connections across there, because most people are multi-cloud, or have multiple different places, or have SaaS or something of that nature. So maybe let's unpack that using a customer example. We've been working very closely with Scotiabank. Scotiabank is trying to connect on-prem and other cloud environments, using Interconnect, or something we call Cross-Cloud Interconnect, into Google Cloud. Now they're trying to represent every one of the services they have using this whole service-centric approach. And it's not just our services; they have wallet services and credit card services that they must integrate with. When you think about AI applications and bringing the data together, those AI apps have to integrate with all of their existing services. The new enhancements and capabilities in cross-cloud network are about how you provide that connectivity fabric, how you provide that security, and how you provide that proxy and help create a mesh for your services and your application tiers that may sit anywhere. So Scotiabank is able to get consistency, better security, and lower cost, all at the same time. And this seems like a good place where you're bringing together not just the power of Google, but the power of your partners as well.
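The AI-aware load balancing described above can be illustrated with a toy sketch: instead of round-robin, route each request to whichever model replica currently reports the least queued work, a rough proxy for GPU/TPU utilization. Everything below (the `Replica` and `InferenceAwareBalancer` names, the pending-token heuristic) is a hypothetical illustration of the general idea, not Google's actual load balancer.

```python
# Hypothetical sketch of inference-aware load balancing: route each
# request to the model replica with the fewest pending tokens, a rough
# proxy for accelerator queue depth (names and heuristic are illustrative).
from dataclasses import dataclass


@dataclass
class Replica:
    name: str
    pending_tokens: int = 0  # tokens queued but not yet generated


class InferenceAwareBalancer:
    def __init__(self, replicas):
        self.replicas = list(replicas)

    def route(self, prompt_tokens: int) -> Replica:
        # Pick the least-loaded replica, then account for the new work.
        target = min(self.replicas, key=lambda r: r.pending_tokens)
        target.pending_tokens += prompt_tokens
        return target

    def complete(self, replica: Replica, prompt_tokens: int) -> None:
        # Called when a request finishes; release its queued work.
        replica.pending_tokens = max(0, replica.pending_tokens - prompt_tokens)


if __name__ == "__main__":
    lb = InferenceAwareBalancer([Replica("gpu-a"), Replica("gpu-b")])
    first = lb.route(500)   # lands on one replica
    second = lb.route(100)  # lands on the other, now-lighter replica
    print(first.name, second.name)
```

The point of the sketch is only that the balancer uses a model-serving signal (queue depth) rather than connection counts, which is what makes it "AI-aware."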
Like you mentioned, Palo Alto, who we had on earlier in the day. How does that really play into this to help customers get a full end-to-end solution? You know, very often customers are forced to make a hard decision where they have to compromise on their security stack of choice, because they may go to a cloud provider who says, use my firewall. And that firewall may not be the best, and their security needs may be something different. So first of all, we partnered with Palo Alto Networks to create our Cloud Next-Generation Firewall, and it has 20 times higher efficacy than firewalls in other cloud providers. But at the same time, if you don't want to use our firewall, we made it very easy to bring your own firewall and integrate it as part of our backbone. And so Palo Alto, Broadcom, there are just so many different partners that we've onboarded so that customers do not have to compromise. This goes back to our strategy of remaining completely open. We want to make sure that the experience is great and customers can bring their security stack of choice. It's so important, and it is really exciting to see the different big players and hyperscalers all coming together, to your point, to play nice and to create these solutions. 20x higher efficacy on security is really compelling. It's no wonder that you just got authorization from one of the most secure entities, in theory, on the planet: the US government. Tell us a little bit more about that. Yeah, let me talk a little bit about that. We're super excited. We announced this today, and this is now shifting gears to our Google Distributed Cloud product, which has now received authorization to operate in Top Secret and Secret use cases with the US government. And that doesn't happen just by chance. It's our fundamentals in zero trust, building the platform from the ground up with a zero-trust framework in mind, and then going through that accreditation process working with the US government.
So super excited about that. We're seeing great traction with Google Distributed Cloud in many countries and for many, many different use cases. Congratulations. That is a huge deal. I'm just curious, how long was that process? More than a year, how about that? Yeah, I'll leave it there. Is the duration of this project top secret? Something like that. But we've been working on it for a long time, and on the product side, we've been working for several years for the product to be able to serve these kinds of needs. Yeah, and I think within Distributed Cloud you're bringing partners to bear there as well, ISVs in particular, not just packaged ISVs like SAP and Citrix, but things like Starburst with Trino, and service mesh or data mesh capabilities, where you also have your own. So it seems like you're bringing, again, that choice to the edge, to where it needs to be actually used. Yeah, so let me provide one more example on that. We're introducing a generative AI search capability in Google Distributed Cloud. It's running fully air-gapped, no connectivity to the internet, no connectivity to Google. You can feed your most sensitive data into this thing and search it, with multimodal input. The way it works is we take that input, we apply the Vertex AI APIs to do optical character recognition, speech-to-text conversion, and translation. We then use an open-source model to create embeddings, we store those embeddings in a vector database, which is AlloyDB, and then we create a chatbot interface and use a large language model, which is Gemma, the Google open-source model, to interact with this. So imagine you've got all this private data you could not gain insights from because it was way too complicated, and we now give you a complete solution where you don't have to worry about your data ever leaving your premises.
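The pipeline just described (extract text, create embeddings, store them in a vector database, then ground an LLM's answers in retrieved context) can be sketched end to end. This is a toy stand-in, not the product: in the interview, extraction is done by the Vertex AI APIs, the vector store is AlloyDB, and the LLM is Gemma. Here a hashed bag-of-words embedding, an in-memory store, and a stubbed generation step show only the data flow; every name below is illustrative.

```python
# Minimal sketch of the air-gapped retrieval pipeline described above:
# embed documents, store vectors, retrieve grounding context for an LLM.
# All components are toy stand-ins for the real ones (Vertex AI APIs for
# extraction, AlloyDB as the vector store, Gemma as the model).
import hashlib
import math


def embed(text: str, dims: int = 64) -> list[float]:
    # Toy hashed bag-of-words embedding (stand-in for a real model).
    vec = [0.0] * dims
    for word in text.lower().split():
        h = int(hashlib.md5(word.encode()).hexdigest(), 16)
        vec[h % dims] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


class VectorStore:
    # In-memory stand-in for the vector database.
    def __init__(self):
        self.rows = []  # list of (document, embedding) pairs

    def add(self, doc: str) -> None:
        self.rows.append((doc, embed(doc)))

    def search(self, query: str, k: int = 1) -> list[str]:
        # Rank stored documents by cosine similarity to the query.
        q = embed(query)
        scored = sorted(
            self.rows,
            key=lambda row: -sum(a * b for a, b in zip(q, row[1])),
        )
        return [doc for doc, _ in scored[:k]]


def answer(store: VectorStore, question: str) -> str:
    # Retrieve grounding context, then hand it to the (stubbed) LLM.
    context = store.search(question, k=1)[0]
    return f"[LLM would answer '{question}' grounded in: {context}]"


if __name__ == "__main__":
    store = VectorStore()
    store.add("Turbine 7 requires maintenance every 400 hours.")
    store.add("The cafeteria opens at 8am.")
    print(answer(store, "When does turbine 7 need maintenance?"))
```

Because every stage is a separate step, each one can be swapped out, which is exactly the openness point made next in the interview.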
Now, I just talked about a whole bunch of Google components that pull this together, but if you want to use somebody else's translation API, a different embeddings model, a different vector database like Elastic, for example, or if you want to use Llama instead of Gemma as the LLM, every component there you can swap out for whatever you want as a customer. So we give you this entire solution, very attractive, very useful, but completely open, where every part of it can be swapped out for your choice. It's like a little Lego kit. It's just like a solution kit, where we give you the entire solution framework, we show you how it's all integrated, you can take the entire repository and deploy it, and if you're like, no, I actually want to use something different in this box, that's very easy to do. It makes it wonderful; it shortens the adoption curve, I would assume, for a lot of these customers. It absolutely does. I mean, it's a Kubernetes-based service, very easy to load, very easy to run. You know, in the GenAI world, you can't be talking about years. You have to be talking about days and weeks in terms of how quickly innovation is moving, and so we want to make sure that if you need to update any one of those components, you can do it easily. Yeah, absolutely, I think it's awesome. You mentioned customers; I know you're in the spotlight this afternoon sharing some customer examples. A company I did not expect to be talking about as much as we already have today is McDonald's. Tell us a little bit about that use case. So McDonald's, you know, we've had a great partnership with them. I've been working with Brian and Steve from there for a long time now. And it was great to understand what they're trying to do as they reimagine their store experience. For them it really starts with the customer and crew experience.
And so when they thought about modernizing the infrastructure in those restaurants, it was a great conversation on, hey, what should be in the cloud? What should remain in the restaurant? What kind of reliability do they expect? You know, the restaurants must continue running. The fryers must run, the menu systems must run. It was great to design with them what this cloud future looks like in those restaurants. And then with Google Distributed Cloud, with our connected offering, using 1RU servers, three of them in a cluster, in every single restaurant globally, it's just so powerful. There are two things I wanted to highlight there. One is operating that environment. I mean, these environments are still quite legacy in many, many places, but we can bring a complete DevOps model to this. What's your blueprint? What's your policy? How do you segment your markets? How do you roll things out? A lot of our customers love that about Google Distributed Cloud. With tens of thousands of locations, it has to be structured very, very simply for how you roll out and how you monitor. And then secondly, it's about the AI applications we can enable. So now, how do you do automated order taking? How can we recognize, on your tray, did you actually place all the right food items? Because you want to prevent loss, but you also want to make sure that the customer is satisfied because they got everything they ordered correctly. And so there's a ton of applications we can provide. These are restaurant examples, but it extends to many, many verticals. And some of the other things Thomas was talking about this morning are new form factors coming to this as well. And, as with everybody, if you talk about AI and don't talk about NVIDIA, something seems wrong, but there are some form factors coming with NVIDIA as well. So we announced support for multiple things: L4 GPUs and H100 GPUs from NVIDIA.
At the same time, we have 1RU servers. We have multi-rack support. And for the air-gapped environment, we also announced a tactical appliance. It's a small, I think about 100-pound, appliance that can go on a vehicle, for example, and it's a full cloud inside this appliance. We also announced something called a Conex design, which is racks that go in a shipping container, so you can actually deploy the container and bring up an entire cloud to service an area. You know, we're really working with our customers on what they need, and we're going to continue delivering the right form factors for the different locations they need them at. Yeah, different environmental factors there even. It may make sense in certain environments to have a shipping container as your server rack. And the sustainability piece: I was walking the show floor earlier and I saw there was a big presentation on AI for sustainability, from the food chain to everything that we do that's so wasteful. I can imagine in the McDonald's instance, you're reducing a lot of food waste as well. Yes, yes. I would imagine that's always a challenge for them. Streamlining operations, increasing efficiency, reducing loss, reducing cost, it's super important. Another example I'll give you: a lot of the manufacturing companies are trying to use video vision detection to identify errors as products are going through the line. Yes. And so, you know, if you can catch that, not ship it to an end customer, rework the part, and do it automatically, it's extremely powerful. It improves customer satisfaction and reduces loss as well. Tons of use cases, I mean, fraud detection for financial customers. Speaking of sustainability, a lot of the energy companies are prone to cybersecurity attacks. Mm-hmm. You know, everybody is sort of after that; it's critical national infrastructure. And so they tend not to share their data and to have fully air-gapped systems.
But now we can give them data capabilities and AI capabilities that are fully air-gapped, so they can optimize how they're delivering energy and, you know, meet their own sustainability goals as an energy provider. So very excited about helping them in that journey. And I would assume this also has a supportability aspect that is really helpful to these organizations, because even though, you know, we have historically low unemployment here in the States and everything like that, there's still a difference between what it takes to actually do AI on-premise versus doing it in the cloud. And this must help with that as well. I think, Rob, you bring up a great point. So many of my customer conversations are about: I want to get out of the business of managing infrastructure, you know, managing databases, managing solution integration. What I want to focus on is: what is the data that I care about? How do I securely gain insights from it that can help me innovate faster, improve my customer experience, lower my cost? And I tell them, you know what my business is? It's to deliver them that complete architecture and solution, infrastructure, PaaS services, database services, AI services, and make it a fully managed solution. So they don't need to worry about that, right? That's not the business they want to be in. They want to focus on their own core businesses, and we're super happy to be able to help them there. And I would say that probably hits both sides of the fence you're talking about here, on the networking side as well as the distributed cloud. Yes, yes, absolutely. You know, going back to the networking conversation, so often that is the hardest part when you're thinking about an on-prem migration or connecting different data silos securely. It sometimes can be the hardest, and it can take the longest.
Removing that, reducing that pain significantly, showing that we're open to working with other hyperscalers, showing leadership, for example, on free data transfer upon cloud exit. We're the first hyperscaler to make that free. So showing that we're going to lead here has been super, super helpful to our customers. Yeah, it just seems like you continue to build on all of these things, and it's paying off with the customers. I was going to say that Orange was talked about by Thomas as well this morning, and that was a really interesting use case. Yeah, I mean, Orange: 26 countries, each with their own jurisdiction. Nobody wants the data, like call data, et cetera, to leave their country, but they need to apply AI to that data. They need to enhance customer experience. Orange is in the business of infrastructure, but telecom infrastructure, mobile infrastructure; they don't want to worry about cloud infrastructure. And so we're giving them Google Distributed Cloud in each of those 26 countries to provide those services so easily. Now they can focus on analytics and AI instead of worrying about the rest. That user experience is so much more enjoyable for them. I think that's such a good thing. Sachin, since you see quite a swath of the market, across verticals, across nations, across the entire industry, what do you think is the biggest risk to the hype and excitement that we're experiencing right now in AI? I think that's a great question. You're probably going to get a lot of different answers to that. I think customers are trying to go from experimentation to real enterprise deployments, production use cases. And we talked about so many customers in the keynote who are already deploying applications this way. Thinking about security: is my data still secure? Am I making sure that I'm providing results that are completely grounded in reality?
Like, if you're using this to engage with your end customers, that engagement has to be high quality, and it needs to be correct. You cannot have hallucinations. Making sure we're taking care of data privacy, right? Copyright, I think, is super important. So Google makes this a huge priority, and we want to make sure that as you go from experimentation to production in enterprise applications, all the things you care about, compliance, security, privacy, performance, cost, we can help you with. And I think the risk is that people may try to skip some steps there, maybe working with other players as well. And, you know, I'm really hoping that they follow a methodical approach; our partners can help, and we can help with our own experiences here. We're all also here to learn about the challenges and to actually make that vision a true reality with AI. I think that's very well stated. And I think you're right: skipping a step or taking a shortcut now has really massive impacts in the long term, especially with the size of these data sets and the infrastructure that we're dealing with. And I think what we're saying is, we're taking so many of our AI innovations and bringing them to our own applications at massive scale, like Workspace. And we can bring the knowledge of how to do that securely, safely, with privacy and copyright in mind, with performance at scale, with the AI supercomputer that we talked about to get the best price performance, and with sustainability, how you do this in a carbon-free or sustainable way. We can bring that expertise and really partner with our customers to help them with their objectives. I love that. Last big question for you, and then I've got one little lightning round. What do you hope you can say next time you're seated here as a CUBE alumni with us that you can't say yet with the current ecosystem? That one's easy.
For me, it's always about taking the innovation and technology we're bringing and demonstrating how we've helped customers. And so for the customers we haven't delivered these solutions to or enabled yet, I want to be able to come back here and talk about more customers where we really helped them solve their own use cases, their own problems, and innovate faster with the innovation that Google Cloud is bringing. So it starts with customers, and I'm hoping to talk about many more success stories when I come back. Well, we absolutely love that. We'll have a chair for them next time. You can absolutely even do a hot seat; we'll get them running through. We would absolutely love to have that. Final question, just because I'm curious, and you've been working with them for a little while: what's your McDonald's order? What's my McDonald's order? So, I'm vegetarian. Okay. So I'm a little boring on this, but I really like their fries. No, I was going to ask how many fries were consumed during the iteration of the partnership. Yeah, I mean, we actually celebrated some of this with McDonald's for everybody, so super, super excited about that. And one of the interesting things is, if you go to their Chicago store, sort of their primary store, every week it does a rotating menu of different countries in the world. Oh, cool. It is so cool. Because I've also traveled to India, and at the McDonald's there you get local, spicy, great options too. It's wonderful. That's great. Rob, what's your favorite McD's order? Chicken McNuggets for me. You're a nugget man. I'm a nugget man. I'm a chicken sand-o girl with mustard and extra pickles, in case anyone's wondering, in case anyone wants to bring us this snack. Sachin, thank you so much for being here. Your insights are absolutely fantastic. I learned a lot. Always a pleasure to have you on my left.
And thank all of you for tuning in from home for our three days of live coverage here at Google Cloud Next in beautiful Las Vegas, Nevada. My name's Savannah Peterson. You're watching theCUBE, the leading source for enterprise tech news.