Hello and welcome, my name is Shannon Kemp and I'm the Chief Digital Officer of DATAVERSITY. We would like to thank you for joining this DATAVERSITY webinar, Make Data Work for You, sponsored by Google Cloud. Just a couple of points to get us started. Due to the large number of people attending these sessions, you will be muted during the webinar. For questions, we'll be collecting them via the Q&A panel. And if you'd like to chat with us or with each other, we certainly encourage you to do so. Just note that Zoom defaults the chat to send to just the panelists, but you may absolutely change it to network with everyone. You can find the Q&A and chat panels by clicking on the icons at the bottom of your screen. And as always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and any additional information requested throughout the webinar. Now let me introduce our speakers for today, Monisha Deshpande and Joyita Banerjee. Sorry, Joyita. My tongue is getting tired there. Monisha has over two decades of experience working at the intersection of business and technology. She currently runs the Google Cloud Solution Value Advisory team, which helps customers understand and realize the value of cloud solutions such as data, AI, security, and collaboration. Before that, Monisha spent seven years in Google Ads, where she led pre-sales for Google Marketing Platform. She led the overall deployment of a global value-selling approach, delivering 200-plus customer business cases and customer adoption of a digital marketing maturity framework. Joyita is currently the global head of the data analytics and security solution value practices at Google Cloud, where she works with customers globally to demonstrate the financial business case for data analytics and security solutions, and how those solutions can have a 10x impact on helping customers realize their business objectives. 
She has more than 20 years of business experience working across industries globally and has spent a large part of her career as a management consultant. And with that, I'll give the floor to Monisha and Joyita to get today's webinar started. Hello and welcome. Thank you so much, Shannon. It's wonderful to be here. Hi, everybody. I'm Monisha Deshpande. I'm going to kick this session off if we can progress to the next slide, please. Wonderful. So we're going to start with our take on what it means to really close the data value gap and why that's critical for success. Joyita will walk us through how to accelerate business outcomes. We're going to make this very tangible for everybody today and provide some tips on getting started, and we'll end the session with some questions and answers. OK, with that, I think this e-book is probably the reason why a lot of you are here with us today. And what we do in the e-book is go through eight specific ways to reduce expenses and boost revenue. Great tips, some really interesting insights and content. However, what we want to do today is take a step back and really start to think about building a very holistic case for driving value with data, to give us a foundational element in terms of what that business case looks like, which will ultimately lead to the opportunities that you see here and many more. So with that, if we go on to the next slide: closing the data value gap ensures competitive advantage. That's why we're here today. That's what we want to discuss. So if we progress to the next slide, please. Perfect. OK, so I am a visual person, and I was curious as to what 181 zettabytes worth of data would actually look like, which is what the world is expected to generate by 2025. And so I Googled it, as I do many things, and I got close. 
I don't know if maybe others have a better visual than this, but for 173 zettabytes worth of data, if we were to put that amount of data on Blu-ray discs, we'd have a stack of discs to get us to the moon and back 23 times. So that's how much data we're talking about. It's incredible how much data is being generated and will continue to be generated even in the near term. And the largest irony about this abundance of data is that 68% of organizations are unable to realize tangible and measurable value from their data. And I want to highlight the importance of the value being both tangible and measurable, because that's how you start to build the case. I think data for data's sake is no longer enough, right? So how do we really build the case for a tangible business outcome? Joyita will provide some specific examples of this later on. But right now it seems like the challenges are insurmountable. The challenges are numerous, right? We're seeing growing data silos across the organization. We're seeing more complexity around how and in what format the data is captured and how it's governed. We're seeing the increase in customer and consumer PII data, which results in more risk. And these are just some challenges, to name a few. We'd be very curious to hear about the specific challenges that you all are seeing in your organizations as well. Now, if we continue on to the next slide. Perfect. So when data and analytics is in lockstep with business strategy, it increases the production of business value by a multiple of 2.6x. There's some research out there that substantiates that. And what we do know is history sets a precedent. So let's explore some well-known examples. One of our favorite examples is Netflix, which, as everybody knows, started as a mail-in subscription service. And in 2011, it earnestly focused on its streaming business. And that's when its market value grew by 8x. There's a company in China called Ant Financial. 
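That Blu-ray visual holds up on a back-of-the-envelope basis. The sketch below is a rough sanity check, assuming 25 GB single-layer discs, 1.2 mm disc thickness, and an average one-way Earth-to-Moon distance of 384,400 km; none of these assumptions come from the webinar itself.

```python
# Back-of-the-envelope check of the Blu-ray stack visual.
# Assumptions (not from the webinar): 25 GB single-layer discs,
# 1.2 mm per disc, average one-way lunar distance of 384,400 km.

ZETTABYTE = 10**21            # bytes
DISC_CAPACITY = 25 * 10**9    # 25 GB per single-layer Blu-ray, in bytes
DISC_THICKNESS_KM = 1.2e-6    # 1.2 mm expressed in kilometres
MOON_DISTANCE_KM = 384_400    # average one-way Earth-to-Moon distance

data_bytes = 173 * ZETTABYTE
discs = data_bytes / DISC_CAPACITY
stack_height_km = discs * DISC_THICKNESS_KM
lunar_trips = stack_height_km / MOON_DISTANCE_KM

print(f"{discs:.2e} discs, stack {stack_height_km:,.0f} km high, "
      f"~{lunar_trips:.0f} one-way trips to the Moon")
```

Under these assumptions the stack covers roughly 22 one-way lunar trips, so the exact multiple quoted depends on disc capacity and whether trips are counted one-way or round-trip, but the claim is in the right ballpark.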
It's actually a digital bank. It's a new digital bank. And it employs fewer than 10,000 people to serve more than 700 million customers with a broad scope of services. By contrast, as we know, traditional banks employ hundreds of thousands of people to serve a much smaller base of customers, and, let's be honest, probably offer a more limited array of services. So the juxtaposition of what digitally-native companies can do because they've built their foundation on data is quite stark. Nike, a beloved brand, nearly doubled its market share when it announced a sharper focus on its own direct-to-consumer business. And this was more recently. And in fact, direct-to-consumer or digitally-native companies like Peloton, Allbirds, Away, Vuori, Dollar Shave Club, Warby Parker, the list goes on, do extraordinarily well because they have unprecedented access to customer data and have built a foundation to aggregate, analyze, and activate that data from these companies' inception. It's in their DNA. And so this begs the question of how traditional enterprises that are not digitally native can also make their data work for them. In a traditional enterprise, we don't have the luxury of pre-built systems and processes, but we can still work diligently towards deriving value. In the traditional enterprise space, we need to be more deliberate about our data strategy. And once we do that, companies that derive value from data are 8% more profitable. 8% more profitable. That's quite a compelling carrot. So if we go to the next slide, please. Perfect. OK, what we're observing in our conversations with data leaders, such as yourselves, is that their focus is shifting. Or rather, perhaps I'd say it's expanding. And here are the questions they are asking now that I think demonstrate the expanse of thinking. Number one is: have I made the right investments in my data foundation? 
Meaning, with new analytical and infrastructure capabilities, have I shored up my organization for the explosive growth in data we are experiencing? Have I done the right things? Have I done right by my organization? Have I done right by my data? And am I using the right technology in order to really shore this up? All right, the second question that we're asked a lot is: how can we better leverage our data to drive business value outcomes? Data for data's sake is no longer sufficient. That's what we've been discussing. But understanding that data is an organization's number one asset (OK, maybe number two behind people; I think we all believe people should be our number one asset), it is nevertheless an incredibly important asset, and it's key to then implement an approach to deliver business value from that data. So how can data be a catalyst and a driver of business value? Number three: how do I position myself to leverage AI/ML use cases to drive change with data? And to that point, what are the organizational use cases that allow me to drive change and value? I think the e-book illustrated several. The question leading up to this is then: do I have access to the data, the skills, and the capabilities required to really take advantage of new developments in AI and ML? The fourth question we receive is: how can we make our data more accessible and usable for different teams and functions? So in other words, how is the data delivered, in what format, to whom, and is the data actionable? And lastly: how can we effectively govern and secure access to our data across the organization? As we see this explosive growth of data, it's an opportunity to get closer to and build sustainable relationships with our customers. But this opportunity comes with lots of responsibility. And added security and governance measures are mandatory at this time. So this gives you a sense of the types of questions I think everybody's grappling with. 
And so when we think about building a business case, on the next slide we're going to demonstrate our levers and our thinking. It's these drivers that help outline what the business case is. They can also be performance gaps if they're not addressed. So the first is technology. Only 27% of executives say their companies' big data initiatives are even profitable. So if we explore why this is the case, what we learn is that organizations may be dealing with legacy data systems. And even if they're not, their data still sits all over the place in a highly decentralized fashion. And that makes things like data governance almost impossible, where we start to see redundancies and duplication of efforts and systems. This leads to the profitability challenge. The second is addressing the talent gap. Data scientists spend 45% of their time on data preparation tasks versus really understanding the output of the data and turning that into action. Unfortunately, data analytics, the use of AI/ML and other things, is not a one-size-fits-all model. And we see that. So the first step is for organizations to determine their own readiness to take on more advanced projects. And part of this readiness assessment can be conducting a skills assessment. And based on that, really determine what specific analytical tools and capabilities you require. So for example, does your organization have an abundance of data scientists? Because then it could be that you are ready to really start digging into more advanced and complex use cases. Or if that's not the case, but you're seeing an opportunity to drive some quick wins in tried-and-tested applications of data, like marketing spend optimization, or perhaps within your call center, there are prepackaged analytical capabilities that you can bring into your organization to help you get there. 
But your talent at the end of the day can be leveraged in different ways once we really assess where your talent is and how mature it is. And the third area is process and culture. So 95% of executives from Fortune 1,000 companies cited cultural factors as the number one reason inhibiting big data adoption. So another important consideration is data literacy across the organization. Are the business owners who are responsible for activating these concepts or making these decisions equipped to even take in the insights? Are there proper processes in place to both collect the data and then disseminate the information? So these are the three considerations we think about that, if not addressed, lead to performance gaps, but if incorporated into business case planning, can drive really strong outcomes. And with that, I'm going to now hand it off to Joyita to walk us through how to frame the business value from data. Thanks, Monisha. Good afternoon, good evening, everyone. Good morning to some of you joining us early. Really good to be here to discuss some of our approaches in working with customers to make data work for you. So as Monisha alluded to, in all our conversations with CDOs and CAOs, we really look to start with the business objectives. And understanding business objectives really helps define a data strategy that starts creating the foundation for a data-driven enterprise. And I'll walk you through the framework. The framework itself is really simple. It starts with: what are your business goals in the next 10 to 12 months? Now, why do I specify 10 to 12 months? I think all of us on this call can agree that we are living through an extremely uncertain economic climate. We are seeing shifts in enterprise strategy. And that requires our data leaders to really balance long-term thinking with short-term execution. 
And I'll walk you through various examples of working with customers where we really look to solve for that business challenge. And how can your data strategy fuel revenue growth? Because that's the ultimate holy grail. We explore three critical pillars that are really foundational to answering that question. That's technology optimization: am I spending the right amount on my data strategy? Operational effectiveness: am I really delivering 10x impact to the business? And people productivity: as Monisha alluded to, data scientists, data engineers, business users, are they really getting the right insights from the data? And so we've seen customers struggle with all of these in creating that foundation of a data-driven enterprise, which is at the bottom of your screen. And as you think about that data platform, it really starts with data producers on the left, how data gets captured, whether it's from customers, colleagues, systems, or enterprise data, coming in in different formats, through to how it gets used at the very end by data consumers, and how they can use it to make the right decisions for the business. And everything in between: collecting, publishing, storing, and processing the data, which really becomes a core challenge in a lot of businesses, and analyzing and activating it to really use that data in a powerful manner. And then ultimately, all of us are hearing about AI-powered outcomes. How do you create that data foundation to now leverage the power of AI? And it also focuses on data governance and security. So many customers we talk to will actually not move out of legacy data systems, because security, PII data, and regulatory data are top of mind. And so we are finding that security strategy and data strategy are really very closely linked together. And how do you operate in a multi-cloud environment? 
So underpinning all of this data platform is really the data culture, a data culture that democratizes data usage and insights to drive simple, well-informed decisions. So this is really the flow that we take, right from the very top: business objectives, to data strategy, to the three pillars of focus, tech, operations, and people, to building a technical architecture that is then able to deliver on business outcomes. So moving forward, I might be a little biased, but we believe Google Data Cloud really accelerates business outcomes. We've been working with hundreds of customers, and today I want to share with you just some examples of what we've done with various customers in solving their business challenges across the data platform. So let's start with the one that is probably the most commonly heard: have I made the right investments in my data foundation? How can I think about optimizing my technology, my data platform? So a lot of customers we work with are dealing with legacy systems, have moved some data onto multiple cloud environments, but are still dealing with a lot of tech debt. They're looking to eliminate redundant tech licenses. How do they look to reduce cost? How do we break the data silos that are causing duplication of data, resulting in increasing storage and compute costs? How can we scale for peak loads while still optimizing our spend? And so I really want to touch upon Google's data lakehouse architecture, which was designed specifically for this: to unify data warehouses and data lakes. Our lakehouse is called BigLake, and it really gives teams the power to analyze data without worrying about the underlying storage format or system, and eliminates the need to duplicate or move data, thereby reducing cost and a lot of inefficiencies. So with BigLake, users gain fine-grained access controls along with performance acceleration across BigQuery and multi-cloud data lakes on either Azure or AWS. 
And Google's Data Cloud thus begins to offer a unified, flexible, and cost-effective lakehouse architecture. I also want to stress the fact that in addition to the BigLake architecture, Google is able to provide technology optimization and cost optimization across our database environment as well. So let's take ShareChat, for example. ShareChat as a company has a simple goal: they want to simplify content and people discovery by using a very personalized content feed on its mobile app homepage. Now, they were able to achieve a 30% reduction in cost using Cloud Spanner databases. So what they did is they migrated their NoSQL database to Cloud Spanner, a relational database, and reduced costs by 30%. So when traffic grew by 500%, they managed to scale horizontally with zero lines of code change. And I'm only sharing one of these examples, but across the board, as customers look to migrate their on-prem data warehouses, move from data lakes into a more simplified Dataproc kind of architecture, or explore our BigLake, we've worked with hundreds of customers where we've delivered very tangible cost savings that help them really justify the spend on their data platform. Let's move to the next pillar, and that's enhanced operational effectiveness. So CDOs are constantly asking the question: how do we best utilize new technologies such as AI/ML to extract insights? Now, all of us on this call have heard the news and possibly experimented with the latest large language model applications. It's probably hard to miss the news these days. And every customer is really looking to answer the same problem: how do I apply it to my business? Now, as we think about these line-of-business challenges, we've worked with customers on several very critical use cases as you look across the end-to-end value chain of a customer. Some customers look to reduce supply chain disruptions and drive transparency, increasing visibility across their end-to-end supply chain. 
They want to ensure predictability of their inventory, reduce stockouts, and have the right product at the right place with the right price point, the classic 3P strategy. The CMOs that we speak with are looking to improve their marketing effectiveness through personalization, and ultimately reduce their cost per impression or improve their return on advertising spend. And in this macroeconomic environment, I think our LinkedIn feeds today are plastered with so much around reducing staff. It becomes a low-hanging fruit for a lot of CFOs. And so when we speak with customers, we do hear about how they're looking to make call centers more efficient. And we have to continue talking to them to say, it's not just about cutting contact center costs, it's about making them more efficient. Now, I'll be the first to say it, and this is not new news: Google is the world's largest AI company. For two decades, we have constantly leveraged AI to solve some of the toughest challenges in the market. And we have made our AI algorithms available through Google Cloud. From enhancing the performance of our search algorithm with ML to sharpening the content recommendations that are served up to you on YouTube with unsupervised learning, we've really learned how to make data-to-AI workflows as cohesive as possible. And so with Google, you can really shorten that time to value for AI initiatives by eliminating the barriers that we talk about between data and AI. So let me walk you through a real-world example of where we did this: Marks & Spencer. We were able to reduce the volume of calls that the stores were receiving by 50%. Now, that's a huge number when you think about a Marks & Spencer. And the solution was relatively simple. It was implementing a Dialogflow solution that helped deliver a very lifelike, natural customer experience with virtual agents, which starts resolving issues by routing 92% of the calls to the right destination. 
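The routing result above rests on intent detection: classify what the caller wants, then send them to the right queue. As a toy illustration only (this is not the Dialogflow API, which uses trained NLU models rather than keyword matching; the route names below are invented), the basic idea looks roughly like this:

```python
# Toy sketch of intent-based call routing. Dialogflow performs this with
# trained language models; this deliberately simple keyword matcher only
# illustrates the map-utterance-to-destination idea. Route names invented.

ROUTES = {
    "refund": "billing_team",
    "order": "order_status_team",
    "delivery": "logistics_team",
    "password": "account_support_team",
}

def route_call(utterance: str, fallback: str = "human_agent") -> str:
    """Map a caller's utterance to a destination queue."""
    text = utterance.lower()
    for keyword, destination in ROUTES.items():
        if keyword in text:
            return destination
    return fallback  # unresolved intents still reach a person

print(route_call("Where is my order?"))       # order_status_team
print(route_call("I'd like a refund please")) # billing_team
print(route_call("Something else entirely"))  # human_agent
```

The fallback route matters: the 92% figure implies roughly 8% of calls still need a human, and a production system degrades to an agent rather than guessing.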
Think about the number of times you've been routed to the wrong person for the wrong problem. At that point, you're ready to just give up on them, an extremely frustrating customer experience. But Marks & Spencer was really able to leverage our AI-powered conversational speech recognition to increase their customer engagement and satisfaction. And this is really one of various use cases we've worked on with CMOs or supply chain officers, really helping them use new technologies, extract insights from data, and apply that 10x to a business problem. People-powered productivity. Now, this is also a question we keep getting asked: how can we make our data more accessible and usable for different teams? Many of you on this call might be practitioners sitting in data teams where you are leveraging the data all the time. So keep me honest when I say this: there are many personas within your organization that are really looking to leverage or work with data. And what this does is cause fragmentation of experience across the different personas, with different versions of the same data. A data engineer is looking to just format it consistently, while a data scientist is building and training ML models, and data analysts are using that data to inform decisions. And to accommodate that need to use the data in different ways, what's happening is cross-functional teams are setting up technology stacks that are leading to further silos and data architecture complexity. So in a way, it's only exacerbating this problem of data silos. And these stacks really don't work very well together. You're introducing new tools into the environment. You're working with proprietary data and tools that really risk a data breach. And they aren't as easy to translate either. So we see customers constantly making trade-offs on these challenges. So let me walk you through a couple of these scenarios. 
So you have a challenge for an analytics user, where analytics tools can't access the right data and the related artifacts. What's the trade-off? Do you now give your analytics teams free rein, or do you really limit the data access in return for analytical agility? Let's take another challenge: business users. Getting access to data and analytics that they can trust and use is extremely difficult. What's the trade-off they are making? Do you limit the data you give the business so you can ensure the highest quality, or do you open up access to all the data they need, even if that means sacrificing quality? And these are the trade-offs and challenges that really create massive tension and overhead for IT as they manually build and manage the glue across all these systems, trying to minimize the impact of these trade-offs. Google Data Cloud, we believe, supports the needs of all data users and democratizes AI/ML innovations: from reducing the operational costs of managing legacy data platforms, to enabling self-service analytics, to driving process efficiencies with AI solutions such as Vertex AI, or speeding up document-based workflows with access to off-the-shelf APIs and pre-trained models that developers can easily call upon to quickly solve real-world problems. For example, thanks to AI/ML capabilities built directly within our Data Cloud, data scientists can minimize context switching when developing and training ML models. So as a result, a lot of data scientists who have worked with our ML models can fast-track model deployment and experimentation by 5x, with 80% fewer lines of code. Now, that's music to any data scientist's ears: fast-track model deployment by 5x, and do it with 80% fewer lines of code. So to give you an example of this, the Belgian FinTech company Unifiedpost primarily provides procure-to-pay services to small and medium-sized businesses. And they operate in almost 15 European countries. So a pretty large scale of operation. 
It requires this company to handle nearly 315 million invoices, receipts, and other documents in a year. And because the business was growing rapidly and it served customers in so many different languages, Unifiedpost needed automation to help its customers transition from very paper-heavy processes to digital ones. And so what it did is it turned to Google Cloud's Document AI for a very cost-effective way to extract data from documents in more than 200 languages. And they were able to do that while reducing the total cost of ownership of the procure-to-pay process by up to 60%, and, here's really the key of this, boosting data accuracy by almost 250%. And so really, these are very simple, quick-win solutions by which we are able to help customers make data more accessible, usable, and accurate across the various user groups that are ultimately looking to use the data. And finally, the fourth pillar, and I always call this the holy grail because in all our customer conversations, Monisha and I hear this multiple times over: how can we better understand and leverage our data to drive business value? What does it do to my bottom line? And the use cases vary. Some customers are looking to acquire net new customers. They want to drive cross-sell and upsell between existing customers or products. Several are looking to reduce customer churn, while various others are looking to deploy net new revenue streams via data monetization. So a really interesting example where we've worked with a customer is Carrefour. Now, Carrefour, as many of you might know, headquartered in France, is among the world's leading and largest retailers. They operate supermarkets, e-commerce platforms, and various other store formats in almost 30 countries. And to retain leadership in its markets, the company really wanted to strengthen its omnichannel experience. So Carrefour transitioned to Google's Data Cloud over time, and they developed a new platform called Darwin. 
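At its core, the Unifiedpost story is about pulling structured fields out of unstructured documents at scale. Purely as a toy sketch (this is not the Document AI API, which applies pre-trained models to scanned documents in 200-plus languages; the field names and patterns here are invented for illustration), the extract-then-validate idea looks like this:

```python
# Toy sketch of structured-field extraction from invoice text. Document AI
# does this with ML models over scanned documents in 200+ languages; this
# regex version only illustrates the concept. Field names are invented.
import re

INVOICE = """Invoice No: INV-2023-0042
Date: 2023-03-14
Total Due: EUR 1,250.00"""

def extract_fields(text: str) -> dict:
    """Pull named fields out of raw invoice text; None if a field is missing."""
    patterns = {
        "invoice_number": r"Invoice No:\s*(\S+)",
        "date": r"Date:\s*(\d{4}-\d{2}-\d{2})",
        "total": r"Total Due:\s*([A-Z]{3} [\d,.]+)",
    }
    fields = {}
    for name, pattern in patterns.items():
        match = re.search(pattern, text)
        fields[name] = match.group(1) if match else None
    return fields

print(extract_fields(INVOICE))
```

The reason hand-rolled rules like these don't survive at 315 million documents a year is exactly the multi-language, multi-layout variety the transcript describes, which is where a trained model earns its keep.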
And that enabled their data scientists to securely access enormous amounts of data across the various channels in which customers were coming to them to buy their products, in a very structured manner, within a matter of minutes. And what this does is really help them build smarter models to predict customer behavior better and underpin the personalization recommendation engine for their e-commerce services. Ultimately, what does all of that mean to the bottom line, right? So the business outcome: the company partly attributes an increase of more than 60% in e-commerce revenue during the pandemic to this personalization. And I will repeat that statistic: more than 60% growth in e-commerce revenue. So it's really back to that framework. It's about: what's your business goal? How does your data strategy fuel revenue growth? Look at the three pillars of tech optimization, people, and process, and build a data platform that is technically going to allow you to not just save costs, empower your people, and drive 10x impact on your business processes, but ultimately deliver revenue growth. And that's really, you know, the critical need of the hour in this current macroeconomic environment. So really, how do you get started? Ultimately, we have a very simple philosophy at Google: we will meet you where you are. And what do I really mean by that? You know, so many customers come to us and they say, we want to experiment with Google's AI platform. We want to build for the AI frontier. And I always suggest, let's take a step back. Let's start with your business priority. And there can be multiple ways to get to that North Star of building for the AI frontier. It can be building the data foundation, where you optimize your current spend, you mitigate risk, and you create data unification across either on-prem or a multi-cloud environment. You can implement cost controls and reporting to improve cost predictability. 
And this is a huge, huge area of focus for our customers, because so many customers say, why would I move to the cloud if I can't predict my cost in an extremely accurate manner? And so we have a thorough FinOps practice; we work with various customers to help them implement best practices, and, with a lot of our tools built into Google Cloud Platform, implement cost controls and improve reporting. Secure the data foundation, and create strong data governance. It really just takes one security breach for you to be in the Wall Street Journal headlines and to have millions in market cap washed away. And so that's why this is all about building the data foundation before you go to the AI frontier. The second avenue in which we can meet you: if you want to deliver business value across lines of business, we have entire data models and ML models. We have use cases built in supply chain, marketing analytics, and contact center AI, and various customers come to us looking for help with their finance operations, with their accounting, with HR processes. And ultimately, enable data sharing across organizations with partners, launching new products and data monetization products. And empowering end users: if all you want to focus on is, I want my data science team to be 33% more productive, I want to shift them to higher-value work, they're spending too much time just working with data that they aren't even able to extract insights from, we have a pathway to help there. And it's really not a very linear process, right? We will be able to help you in a very non-linear manner as well. But really, the end state is we will meet you where you are on your data journey. And so finally, as Monisha alluded to, we are the Cloud Value Advisory team. One of the critical banners under which we come into our conversations with customers who have decided to move to Google Data Cloud, or are evaluating their options, is that we help build a financial business case. Now, what does that really mean? 
It's a financial or economic value-based model that CFOs, COOs, and CTOs look to build in very close collaboration with us, to make the case for why they need to move from their current architecture to Google Data Cloud. And we think about it in terms of four value levers, if you will. One of which is revenue acceleration: what are the use cases across which you are able to really deliver value with your GCP data platform? So that's your first pillar. The second, very obvious one is: how much am I going to save? If I move from my current data warehouse into Google Cloud, or my data lake to Dataproc, am I going to save 30% or 40%? What really makes me move? And that's very important. I almost always say this is the brass tacks of the conversation. The TCO has to be compelling. But then we think about all the other elements that really need to start getting quantified. There can be a lot of revenue loss from planned and unplanned outages. Just in a recent conversation with a customer, they were pretty happy with their spend on the platform. They were very happy with all the business use cases they were delivering across the platform, but they were dealing with six hours of unplanned outage every single month. Now, that is a direct dollar conversion: six hours of unplanned outages. They were also dealing with security breaches, and it was a patchwork of problems and a patchwork of tools looking to help them with their data security. And so there is a way, a mathematical approach we've developed at Google, to quantify that risk. And finally, staff productivity. And I know this one is probably the most debatable value lever, but we believe ultimately productivity is not about how I can do more with fewer heads, but how I repurpose people to really redeploy them to higher-value work. How can my data engineers spend less time on their activities? 
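The four value levers, revenue acceleration, TCO savings, risk reduction, and staff productivity, can be sketched as a simple annual model. Every input figure below is a hypothetical placeholder chosen for illustration (only the six-hours-of-outage-a-month detail echoes the customer story); this is a sketch of the idea, not Google's actual business case methodology:

```python
# Illustrative sketch of the four value levers in a data-platform business
# case. All input figures are hypothetical placeholders, not benchmarks.

def annual_business_case(
    revenue_acceleration: float,      # value from new use cases, per year
    tco_savings: float,               # current platform cost minus future cost
    outage_hours_per_month: float,    # unplanned downtime
    revenue_per_outage_hour: float,   # direct dollar conversion of downtime
    productivity_hours_freed: float,  # engineer/scientist hours redeployed
    loaded_hourly_rate: float,
) -> dict:
    """Roll the four levers up into one annual value figure."""
    risk_reduction = outage_hours_per_month * 12 * revenue_per_outage_hour
    productivity = productivity_hours_freed * loaded_hourly_rate
    levers = {
        "revenue_acceleration": revenue_acceleration,
        "cost_savings": tco_savings,
        "risk_reduction": risk_reduction,
        "staff_productivity": productivity,
    }
    levers["total_annual_value"] = sum(levers.values())
    return levers

# Hypothetical example: six hours of unplanned outage a month,
# as in the customer story, with invented dollar figures.
case = annual_business_case(
    revenue_acceleration=2_000_000,
    tco_savings=1_500_000,
    outage_hours_per_month=6,
    revenue_per_outage_hour=50_000,
    productivity_hours_freed=10_000,
    loaded_hourly_rate=90,
)
print(case)
```

Notice how the outage lever alone can rival the TCO savings: at these placeholder rates, six hours a month converts to $3.6M a year, which is the point of quantifying risk rather than stopping at the TCO line.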
How can my data scientists spend less time just curating and governing the data? And how am I able to capture that better value? And so we, again, have an approach where we quantify that. So I urge all of you to really think about the financial business case as you, as data engineers, architects, and leaders, want to make the case for change. There is nothing better than numbers. Numbers speak to the facts, and that's always a very compelling reason for anyone to take action and move from their current-state environment to the future-state environment. So with that, I'll turn it back to Monisha to close us out.

Wonderful. So Joyita outlined really well how we work with our customers. These are five steps we tend to use with them to get the process started. The first is: identify a strategic starting point. Joyita talked a lot about this in her section. Select a business area where improvement would have the most impact, such as online transactions, sales and marketing, logistics, customer service, whatever it means in your organization. And then within that, identify use cases that you want to solve, such as automating document processing or making delivery routes within your logistics environment more efficient. Then define your measurable goals. We recommend not boiling the ocean but actually starting small, so that we can measure.

The second step is: bring your data together in one place and organize it. Now that we've identified the use case we want to rally around, we can create a central repository of the relevant data, so we can store the historical, current, and future data that assists in that use case, from all applications, in any format, both static and streaming, structured and unstructured. And by the way, using a cloud-based data platform helps ensure flexibility and extensibility, as well as interoperability of the data.
The third step is: implement AI by creating models that address your priorities. So we've identified the use cases, right? Then we can start to think through using ready-to-deploy machine learning models or AI solutions that provide actionable insights around your top needs, because we've created that rallying cry around what we're trying to do from a business perspective. The AI models and solutions are leveraged in ways that can actually drive actionable insights in your organization. So, for example, your priority may be predicting outcomes or automating business processes. Start with proofs of concept or prototypes, and ensure you have a plan to then scale into production. Again, we recommend starting small, standardizing, and then scaling from there.

The fourth step is: build operational frameworks for ongoing work with data. This is an important one, right? We need to treat our models and analytical insights as business assets. We need to develop model stores and lifecycle management systems, including version control. We need to be able to automate these deployments. We need to create a mechanism for secure sharing: all the things that we can think about from an asset perspective. So, for example, if you're monetizing data, we need to think through what the pricing models and the billing mechanisms look like. We need to be able to track revenue and consumption to measure ROI. Our operational frameworks, in essence, become key products that we need to be able to commercialize fully, end to end.

And then lastly, and this is such an important one that I need to emphasize it: we need to partner with the CFO and our finance teams. We need to set up a value measurement team that can truly lay out what we think the business case is, create a tracking and measurement system, and then actually measure the value that's derived.
And we need to adopt a financial resiliency model, which is the framework that Joyita walked you through, to create an environment of continuous optimization. So these are our five steps to ensure that we start to drive real value from data. And with that, we would love to open this up to Q&A.

Monisha and Joyita, thank you so much for this great presentation. If you have questions for them, feel free to put them in the Q&A portion of your screen. And just to start by answering the most commonly asked question: a reminder that I will send a follow-up email by end of day Thursday for this webinar, with links to the slides and links to the recording. Let me give everyone a moment here to type some of their questions into the Q&A panel. So let me just ask: you talked about some really cool stuff and some really nice case studies that you've got going on. What was that big surprise, that big aha moment, that your customers have once they start implementing Google Cloud?

So I'm sure, Jo, you probably have lots of good examples. I think from my perspective, it is that it's not just technology, right? It's not just, "Okay, I'm going to leverage BigQuery or some other application from Google Cloud to take care of my marketing needs." It's really much more of a holistic approach, right? Do I have the right teams on board? Have they bought into this? What is the data used for? How do I capture it in meaningful ways? How do I grow this from an organizational perspective to create lasting impact? Culturally, am I ready to take this in? So I think it goes well beyond "What's this tool and how is it going to be implemented?" to a much more holistic kind of business consideration. Yeah, plus one, Monisha.
I think in all our conversations, when we limit the conversation to feature functionality, to what we are able to do based on a particular product feature versus what we are able to bring, that's very powerful, but I think that only starts to answer a little bit of the real problem here. And that's why, when we go in, we try to talk about more than just pricing, more than the total cost of ownership, and really look to say: what are the use cases you are going to start looking at to deliver value? And then, post-implementation, to go back and say: were you able to do this? Were you able to do it efficiently, at speed, in a very tangible manner? And that's why we are very proud to call a lot of our solutions quick-time-to-value data and AI solutions. That becomes extremely powerful when customers have realized that value in a very quick manner. That is really where we see a lot of our customers adopting our solutions over time.

I love it. Everyone's pretty quiet today in the attendees; I don't see any questions coming in yet. Anything else you want to add for our team?

Yeah, I think something we've been seeing: we talk to so many CDOs and CAOs, and looking at the way they've built their overall platforms, something that sticks out in all these conversations for me is building for the future. Building for the future while keeping certain key tenets of governance and security in mind, while optimizing your cost spend. As you move to these cloud platforms, you need to have some of the governance and security very integrally built into the data culture. And so there is a huge role that CDOs can play in integrating all of these conversations. So it's not just about the processing, the storage, the computing. It is about how you really drive and build a data-driven culture across the enterprise.
And I think that is kind of the building for the future, where you're able to then retain talent that wants to grow with your data platform, right? We started the call by talking about the zettabytes of data that are being generated. These are really unlocking extremely interesting use cases to solve for. So the more you plan for the future while handling this incredible amount of data growth, and integrate that data culture in your organization, the more you're able to get top talent to stay, experience that growth, and ultimately deliver business outcomes. I think for me, that's been a massive learning through these conversations with customers.

I like it. I love it. And there's such valuable information here too. There's a request: are there any additional resources you can recommend, like webinars or books, that can help enhance the data journey?

I think that's a good question. We can probably get back to you on that. I think the other thing that's very much in lockstep with this is our perspective on financial resilience. We've been very pleasantly surprised by the fact that organizations are not shying away from this topic at the moment, right? In past recessions we've perhaps seen a contraction, right? A contraction of really important projects. And we're not seeing that currently. The market is dictating that organizations, to Jo's point, should actually be thinking about laying out a really great groundwork for what the future entails. And so, for the concept of really building financial resilience within your organization, I think the foundation of that is data. And so, Shannon, to answer your question, we have published some pieces on financial resilience, more generally speaking, focused on data and then on other parts of the organization, that I think would make for some really good reading as well. Well, I love it.
If you have links to those, send them over and I'll include them in the follow-up email for everyone. Will do. That'd be awesome. And I love this question; I was hoping somebody would ask it: what do you think is the main cause of failed AI projects?

So for failed AI projects, I think it is a few things. One is, I'm not sure organizationally we've aligned on what AI means to us, right? Are we conducting AI projects in a vacuum because we have a really smart data science or analytical team that is running some really cool predictions, but the business is not involved, right? Or is the business trying to spearhead something that's really important to them, but without really aligning to what our analytical capabilities are as an organization? So I think we see a lot of that misalignment. I also think that, let's be honest, AI has been in the news a lot recently. It's kind of the next sexy thing. And I think a lot of organizations want to leverage AI to solve really basic problems that can actually be solved with some just really smart basic analytics. And so right-sizing the use case and the business opportunity, and then developing a plan for putting these AI models into effect, would be a much better approach than what we're seeing, which is kind of a haphazard deployment of AI projects that don't really have the funding or the visibility to be seen through to fruition.

And just to build on that (I think this is David's question; thank you, David): in one very recent conversation I was in, a customer was looking to implement certain AI projects to see if there was really incremental change across their supply chain. I think it was a supply chain use case, and really it came down to the fundamental problem of their source data not being accurate.
And so, when we talk about building that strong data foundation, where you have the right governance, the right data quality, the right data that you're extracting, I think so much depends on that. What are you really taking in? To use a very trite phrase: garbage in, garbage out. And so it's really about how you build that governance into your data, with the right data management, the right rules and protocols, and really create high-quality data that is available for you to then build an AI model. And we really talk through three steps with our customers: step one, create a unified data and AI platform; step two, create an MLOps practice where you're really efficiently and responsibly managing and governing your AI; and step three, build it in an open, scalable, flexible way where you can securely deploy it, and then start building for future AI projects. I think that's what we've been walking our customers through, and those are the fundamental problems we see when they look to deploy these AI projects. I hope that helps.

Absolutely. We have seen an explosion in requests for data modelers and data architects, because companies needed to prep the data first for these AI projects. They tried it without that and failed. All right, so you had mentioned how customers are able to innovate their processes and outcomes through data. What's your take on data democratization and making data more generally available?

I'll take a stab at this. If I'm following your question correctly, it's really about where customers have been able to take advantage of data being generally available to all their users, and how they can then use that to innovate in their processes and outcomes.
Just in speaking with customers and seeing what we've done with various others: when you democratize data and make it generally available (and again, there need to be a lot of guardrails in place), we do find a lot of innovation that customers are doing with that data across their lines of business, or even with data scientists who are able to create multiple views that become a source of real power for the customer. So we have a product we acquired that is now seamlessly integrated into our platform: Looker. Looker is a powerful visualization tool that really looks to democratize the data. We have found that users leveraging Looker with BigQuery are able to find avenues to 10x use cases with their data, be it new monetization streams or simply more insights from generally available data. And so we're building business cases around it, and we find that once you're able to visualize data with BigQuery and Looker together, you're able to do more with data monetization and with data being more generally available.

Yeah, and I'll just add on. Being a data-driven company requires a shift in mindset, right? A cultural shift, and it needs to be top-down. So if we really want to leverage data to be more innovative and to improve our processes, that needs to be a top-down cultural shift, broadly speaking. And we do see customers make that pivot. I think the examples we discussed up front, around the direct-to-consumer businesses: these are organizations whose ethos was founded on the fact that they wanted to be data-driven. So if we're in a traditional enterprise environment, we need to be very deliberate about what that means, right? Are we leveraging data to make all sorts of decisions? And then, to Jo's point, if that's the case, how do we make that data generally available?
But we have to be deliberate, we have to be intentional: what are the decisions that we're trying to make, or what are the use cases that we're trying to be better at, and how do we leverage data in more meaningful ways?

That's perfect. Well, that was such a great webinar. Monisha and Joyita, thank you both so much. That is all the questions that we have for today. Thanks to our attendees for being so engaged in everything we do. Again, just a reminder: I will send a follow-up email by end of day Thursday with links to the slides, the recording, and the additional resources for this webinar. Monisha and Joyita, thank you so much. Thank you so much, Shannon. It was wonderful to be here. Thank you for having us. Likewise. Thank you for having us, Shannon. Thanks, and thanks, y'all. Have a great day.