Welcome back everyone to theCUBE's live coverage here in San Francisco, California, for Google Cloud Next, the cloud show for Google, with all the action around AI. I'm John Furrier, your host on theCUBE, with Rob Strechay, Dustin Kirkland, our new CUBE analyst, and Lisa Martin. We're all here for two and a half days of getting all the data and sharing it with you here on theCUBE. Of course, there's a lot of action in the cloud. Google is introducing some killer new things, but more importantly, it's a roadmap for where they are taking the future. We're here with Lisa O'Malley, Senior Director of Product Management for AI for Google Cloud. Lisa, thanks for coming on theCUBE.

Thanks for having me.

So your baby is search, the AI search, but you've got the AI conversation on Vertex, Vertex AI, the hot product.

Yep.

It's the model garden, right? The garden, what do they call it?

Model Garden. Vertex Model Garden, yes.

I call it the open garden. There's more open source in there. Warren and I will talk about that later.

Yeah.

You've got really amazing innovations. I love the embeddings that are going on with vector databases.

Yep.

You've got connections, you've got connectors, you've got conversations. Vertex is quite the set piece of the show.

Yes.

I mean, it's pretty impressive. What is Vertex AI? What's happening? What's going on with the news here? A quick minute to explain.

Yeah, so let me dive into some aspects of Vertex. We want to be able to meet developers where they are. They can be machine learning experts or they can be novices. And we have a suite of tools within Vertex that enables all of them. At the bottom layer, you have the foundation models, and you'll talk to Warren about Vertex AI and Model Garden, where developers can really get their hands dirty and use the models directly. One layer of abstraction up from that is our Vertex AI Search and Vertex AI Conversation.
You may remember we referred to them as Gen App Builder in the past, but we've brought them in under the Vertex umbrella. And what they do is bring the power of Google Search and the ability to create chatbots and voice bots into the hands of developers.

What's different about the chatbots and the search now? I mean, conversational AI has been around for a while. We've seen some startups flame out, some succeed, get sold. But this isn't yesterday's chatbot. A lot has changed. What's the most important difference people should know about with this new game-changing search and chatbot? It's not just a hello-how-can-I-help customer support bot; it's deeper.

It's much deeper than that. So first you start with search, and search enables you to search across your websites, your internal data, structured data, unstructured data, your applications. And so it enables you to help both employees and customers find what they need. Built on top of that, our chatbots and voice bots enable you to interact with that data in a really natural way, across multiple languages. You may have seen in our consumer search product this generative search experience, where now you have the ability to have a multi-turn conversation and to summarize the findings of the search. And that can happen either in a search bar or in a chat interface. So it's really a much more natural and intuitive experience.

What's changed on the technology? Is it more reasoning? Is it leveraging some of the models at Google? What are the underpinnings of the updates to Vertex AI Search?

Yeah, so on the search side, we've been using large language models for a long time. Semantic search and semantic understanding are large language models. In Google Search, the technology we use for retrieval, ranking, snippeting, presenting the search results back: all of that has been using large language models for a while. Now we're making it available to enterprises on their data.
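The retrieval-and-ranking loop described here can be illustrated with a toy sketch. The bag-of-words "embedding" below is a stand-in for the learned embedding models a real Vertex AI Search deployment would use; the sample documents and query are invented for illustration.

```python
import math

def embed(text: str, vocab: list[str]) -> list[float]:
    # Bag-of-words counts stand in for a learned embedding model; a real
    # deployment would call a managed text-embedding endpoint instead.
    words = text.lower().split()
    vec = [float(words.count(w)) for w in vocab]
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

def semantic_search(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank documents by cosine similarity to the query vector.
    vocab = sorted({w for text in docs + [query] for w in text.lower().split()})
    q = embed(query, vocab)
    return sorted(docs, key=lambda d: cosine(q, embed(d, vocab)), reverse=True)[:k]

docs = [
    "How to reset your corporate laptop password",
    "Vacation policy: employees accrue 20 days per year",
    "Expense reports are filed through the finance portal",
]
print(semantic_search("how many vacation days do I get", docs, k=1))
```

The same loop works whether the corpus is a public website or internal enterprise data; only the embedding model and index scale change.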
So whether that's an externally facing website for their customers, an external chatbot, or tools internally to enable their knowledge workers to use all of the data they have at their fingertips much more easily now.

So one of the things, and we kind of riffed on this a little bit before coming on set here, but one of the things that's always top of mind for enterprises is security and data privacy. How are you addressing that with those customers?

In a couple of ways. First of all, within GCP, your data is your data. We never use your data to train a model. We never use your customer data anywhere else in Google or in GCP. So that's the baseline foundation. Also, most customers want to be able to use the world knowledge that the model was trained on, but also their own enterprise data. And so we enable what we call grounding. Grounding enables you to make sure that the answers you're providing to your customers or your internal employees are grounded in the fact base of your enterprise corpus of data. We're also actually experimenting with grounding on real-time search, with the technology that our search organization uses as well. And then lastly, I'll touch on our commitment to privacy, security, data governance, and what we call our AI responsibility principles. We really take these very seriously.

So the grounding is where you guys manage the risk around data leaking into the public models, or how to interface with public models with proprietary data. Is that kind of the-

So actually, it's slightly different from that. There are two parts in your question. We don't allow an enterprise's data, or an enterprise's customers' data, to touch the models. You use a replica of the model, and you can train that based on your data, and that always stays in your tenant within cloud. Separately, grounding enables you to check that the answer you're giving is based on the data within your enterprise corpus.
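The grounding check just described can be sketched with a toy verifier. The real Vertex AI grounding service does semantic matching far beyond this word-overlap heuristic; the corpus, threshold, and example answers below are all invented for illustration.

```python
def is_grounded(answer: str, corpus: list[str], threshold: float = 0.5) -> bool:
    # Toy illustration only: the real grounding service uses rich semantic
    # matching. Here a sentence counts as grounded when at least `threshold`
    # of its words appear in some single corpus passage.
    sentences = [s.strip() for s in answer.split(".") if s.strip()]
    for sentence in sentences:
        words = set(sentence.lower().split())
        supported = any(
            len(words & set(passage.lower().split())) / len(words) >= threshold
            for passage in corpus
        )
        if not supported:
            return False
    return True

corpus = ["Employees accrue 20 vacation days per year."]
print(is_grounded("Employees accrue 20 vacation days per year", corpus))  # True
print(is_grounded("Vacation is unlimited at this company", corpus))       # False
```

The point of the check is the same as in the product: an answer that cannot be traced back to the enterprise corpus gets flagged rather than returned as fact.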
So it enables the model to use your data to answer the question. That's a risk mitigation factor for the enterprise.

It is a risk mitigation factor for the enterprise, and you can actually play with that: allow no creativity at all and make sure the answer always stays within your data, or enable some creativity for generative tasks like drafting a new email, building a new slide deck, making new marketing materials.

I got a dirty look last night when I said, oh, you can lock it down. They're like, no, don't say that word. Nothing's locked down. It's open. Again, the data is open. That's the beautiful thing. We've got to watch ourselves.

But it also helps, and we talk about this with the AI that we have, that every company really has its own corpus of jargon. And so this helps with that grounding, right? So if I'm using it for financial AI, or for legal, my company has certain jargon that it uses. It becomes part of that grounding.

It does. And you can train the model in different ways. You can train it with, you know, prompting. You can fine-tune it. You can also just use search on your own corpus of data so that it will pick up your style. And you probably noticed in the keynote today, now even with Imagen, you can use style tuning for images as well, based on your own corporate style guide.

You know, one of the things we were talking about before you came on, and I wanted to ask you about, is because you see successes like LangChain out there, other companies. Training is really hard to do all the time on data, to get the AI ready for people to use. So people are using these extensions. You guys have extensions. It's a great way to keep current and allow people to work with data without compromising policy and compliance. This is a huge deal. I mean, what I just said sounds very nuanced, but that's a huge deal. Explain why this is so important.
So extensions are really game-changing, and the reason is that now, not only can you answer questions or have a conversation about the data, you can actually take action within your corporate environment. So for example, as an employee I might search and say, you know, how many vacation days do I have left? I see that I have 16, great. Let me book a day off through a Workday extension right there in my chat experience. And so it allows you to interact with many systems, as long as people are building to the framework.

And the alternative, the old way, was that the data for HR was restricted to certain employees. You had to get someone to administer access, create a data set, deploy it. I mean, how slow is that? Or worse, the data had to be replicated and you had to build the models on the replica.

And worse again, when you were building that chatbot, you needed to define every step of that pathway. And now, with simple language, you can describe the flow you want to create, and our chatbot can create it for you and allow lots of different paths.

I've always said this on theCUBE, and I've always been waiting for someone to do this, because we live in the media business all the time. We're real time, we like to get the data out there as fast as possible, surface the truth, get the right experts. All the personalization and recommendation engines in the past have always been old school, like web-based stuff. And so the question for you: this seems like a game changer in the sense that you can bring personalization. I mean, you can literally roll your own anything with search and chatbot, and build that in, and not have it be yesterday's recommendation engine. Talk about the personalization. Did I get that right?

Yeah, so search and recommendations can be personalized now more than ever. You can have real-time data flowing in, user data. We can use that data to anticipate what your user might want next or might want to do next.
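Stepping back to the extensions example (checking a vacation balance and booking a day off from chat), the take-action pattern can be sketched as a tool registry. This is a hypothetical sketch: real Vertex AI extensions are declared with an OpenAPI-style schema and the model decides when to invoke them, while here an explicit intent string does the routing and the Workday-like calls are stubbed out.

```python
from typing import Callable

# Hypothetical extension registry; names and payloads are invented.
EXTENSIONS: dict[str, Callable[..., str]] = {}

def extension(name: str):
    def register(fn: Callable[..., str]) -> Callable[..., str]:
        EXTENSIONS[name] = fn
        return fn
    return register

@extension("vacation_balance")
def vacation_balance(employee_id: str) -> str:
    balances = {"e123": 16}          # stand-in for an HR system lookup
    return f"{balances.get(employee_id, 0)} days remaining"

@extension("book_day_off")
def book_day_off(employee_id: str, date: str) -> str:
    return f"Booked {date} off for {employee_id}"   # stand-in for a Workday call

def handle(intent: str, **kwargs: str) -> str:
    # In production the model selects and parameterizes the extension;
    # here the caller passes the intent explicitly.
    if intent not in EXTENSIONS:
        return "No extension can handle that request."
    return EXTENSIONS[intent](**kwargs)

print(handle("vacation_balance", employee_id="e123"))
```

The contrast with the old chatbot world is the point: instead of hand-defining every step of a pathway, each system exposes one callable surface and the conversation layer composes them.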
That can be for internal users and for external users. So it really is, you're right, quite game-changing.

Okay, so I want this really bad. How do I get involved? What do I do? Do I download something? Do I sign up? Is it in beta? What's available? Give us the keys to the kingdom.

Yes, so we announced this week that both Vertex AI Search and Vertex AI Conversation are fully GA, so available to anyone. As long as you have a GCP account, you can sign up for access to Vertex and get access to them. Now, we do have a lot more exciting things coming in preview and experimental, and those have specific lists associated with them. That's really more for customer design partners that come in and agree to work with us. We never build products in isolation. We always build them with design partners, and that helps us to build better products.

That makes total sense. Just a quick question on your industry perspective, as someone who's been in the industry for a while: how would you describe the pace of change right now, in terms of the speed and the velocity?

It's pretty massive. It's pretty mind-blowing. I have to say, I don't think I've drawn breath since last October or November. And things are changing all the time. Every time we talk to a customer, we learn something new, and that's multiple times, multiple days a week. And even just the ability to synthesize all of the incoming information, what's coming from our strong technology and research teams, and put it into a format that makes sense to customers has been an amazing opportunity, but very fast-paced.

Has that changed how you do product management internally? You don't have to give away any secrets, but I'm sure you've probably got some AI assistance inside the Googleplex over there.

So yes, we use a lot of AI-based tools for generative purposes.

Because you've shortened the cycles. You've got to get stuff out.

We have shortened cycles dramatically.

Humans can only go so fast before collapsing.
So what do you change? What's changed for you?

So I think what's changed is a very significant level of internal collaboration across all of GCP. There are very clear goals, and we're getting to the point our customers need us to get to quickly, with a lot of collaboration.

Yeah, I think what I love, just circling back to the extensions, was the announcement with the ecosystem. This is a place where Google sometimes has not done a ton, but what's really exciting is Atlassian with Confluence and Jira, and then Salesforce, letting the data stay there, being able to use it in the models, and being able to build on top of that. Being a product person myself, with a product background using Jira and Confluence, where you're documenting things and you have such a corpus of data in those for building out programs, for building the next gen: is that where customers are pushing you with this?

Yeah, so customers are really looking for productivity, and productivity is slow when people have to go search for information. And it's in the Jiras, the Confluences, the Salesforces, the SharePoints of the world. So they want to be able to bring all of that into one place and search really quickly and easily across it, and then have an interactive experience, multi-turn search, multi-modal search, across that data in all of its various formats. And so our connectors enable them to bring that data in, and then to take action on that data.

I think the integrations between apps with Duet, seeing some of the demos, I can imagine there are going to be a lot of under-the-covers details around dealing with data integration.

Yep, data integration. And that's the strength of Google Cloud when you look at tools like BigQuery, and Vertex connectivity with BigQuery is going to be really, really important going forward.
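The connector pattern described here, bringing Jira, Confluence, and Salesforce content into one searchable place, boils down to normalizing heterogeneous records into a single document shape before indexing. A minimal sketch, with invented field names rather than the real product APIs or payloads:

```python
from dataclasses import dataclass

@dataclass
class Document:
    source: str
    doc_id: str
    text: str

# Each connector maps a source-specific record into the one shape the
# index understands. Field names are illustrative, not the actual Jira
# or Confluence API responses.
def from_jira(issue: dict) -> Document:
    return Document("jira", issue["key"], f'{issue["summary"]} {issue["description"]}')

def from_confluence(page: dict) -> Document:
    return Document("confluence", str(page["id"]), page["body"])

def ingest(records: list[tuple[str, dict]]) -> list[Document]:
    dispatch = {"jira": from_jira, "confluence": from_confluence}
    return [dispatch[kind](record) for kind, record in records]

docs = ingest([
    ("jira", {"key": "ENG-42", "summary": "Fix login bug", "description": "Token expiry"}),
    ("confluence", {"id": 7, "body": "Release checklist for Q3"}),
])
print([d.source for d in docs])
```

Once everything is in the common shape, one search and one chat interface can sit on top regardless of where the data actually lives.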
And you must view multi-cloud, or the supercloud idea, as just searching across environments: just another API, connective tissue, no issue there, all the same game.

So as you said, there's a lot going on under the covers, and it's not that it's easy to do, but it needs to be done.

Lisa, great to have you on. Thanks for coming on theCUBE. Final question for me, Rob might have another one, but what's next? Because you run hard. I mean, obviously chatbots, I think it gets oversimplified, but there's a lot going on with that personalization, the augmentation of humans and data coming together. What's next for Vertex? Can you throw a little directional roadmap out there? What should people know about Vertex AI's direction?

You know, I think there are certain areas you'll see us continue to develop. Having first-party, third-party, and OSS models available, and providing all of the tooling for different levels of developer capability, will be really, really important. Making sure that really expert data scientists have tools they can use, and then more novice developers or enterprise developers can use things like Vertex AI Conversation and Vertex AI Search to build applications really, really easily.

That's great. What's the bottom line for the developer watching? He or she is saying, what's in it for me? How would you bumper-sticker this?

The bottom line is that, you know, with our GA today, we've really enabled enterprise developers to build applications quickly and easily. They don't need to chunk the data. They don't need to worry about indices. They don't need to worry about getting the data in. It's all there already, and the tools are ready to go.

Data as code. It's kind of like infrastructure as code, Rob.

Absolutely.
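The "they don't need to chunk the data or worry about indices" point describes work the managed pipeline does behind the scenes. Roughly, that hidden work looks like the sketch below: a word-window chunker plus an inverted index. Both are simplistic illustrations of the general technique, not Google's actual implementation.

```python
def chunk(text: str, max_words: int = 5, overlap: int = 1) -> list[str]:
    # Fixed-size word windows with a small overlap: one common chunking
    # strategy that a managed pipeline takes off the developer's plate.
    words = text.split()
    step = max_words - overlap
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), step)]

def build_index(docs: dict[str, str]) -> dict[str, list[tuple[str, str]]]:
    # Inverted index mapping each word to (doc_id, chunk) postings.
    index: dict[str, list[tuple[str, str]]] = {}
    for doc_id, text in docs.items():
        for piece in chunk(text):
            for word in set(piece.lower().split()):
                index.setdefault(word, []).append((doc_id, piece))
    return index

index = build_index({"policy": "employees accrue twenty vacation days per year"})
print(index["vacation"])
```

With the platform handling chunking, embedding, and indexing, the developer's job reduces to pointing it at the data and calling the search or conversation API.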
I think that will really accelerate a lot, not only the devs themselves. We talk about data developers, how they go through that and how they think about it, and I think this is the step forward they need as well.

I agree. I mean, data developers are a legit new persona. We've been saying on theCUBE that data as code is like infrastructure as code for DevOps. There's going to be an SRE for data out there. I'm using your language, but we're going to call it platform engineering now.

I agree with you. I don't know what it's going to be called, but that function certainly exists and is super critical.

They all have the same words: guardrails, policy, automation. So a lot of the same things that happened in security are going on with data. So, you know, we see a data IT market emerging. In the cloud, operators and admins will be automated away, we believe, but the data has to be smarter, has to be intelligent, addressable, horizontally scalable.

And all in one place. And available to be domain-specific at a moment's notice.

Yep.

And so you will see, you know, we announced a couple of domain-specific models as well, with Sec-PaLM and Med-PaLM. And we intend to enable partners to build those and to build applications upon those.

Well, I won't use the word lock-in anymore, that's for sure.

As long as you guys don't use the word walled garden, because walled garden is bad; open garden is better.

I like the garden. Model Garden is phenomenal. I like how you guys put that all in one place, curated, with your own foundation models, first party and third party. And I think open source is going to be interesting to watch, the innovation coming out of the developers.

Exactly. I mean, small models integrating with other models. With the speed at which everything is moving, no one company can build it all. And so enabling others, and the innovation there, is critical.
Lisa, I feel grounded after this interview. Thank you for coming on and sharing the data. We'll take that and share it with the audience. Thanks for coming on theCUBE.

Thank you for having me.

This is theCUBE. I'm John Furrier, with Rob Strechay. We'll be right back after this short break.