Hello, and welcome to this episode of our series, The Road to Intelligent Data Apps, and what we like to talk about as the sixth data platform. I'm Shelly Kramer, managing director and principal analyst at theCUBE Research, and I'm here today with my co-host and fellow analyst, George Gilbert, and Rahul Aradkar from Salesforce. Before we get into our conversation and more introductions, I just want to set the stage a little bit. There's a big transition underway, and part of the challenge organizations face as they navigate it is that they need to be able to do a few things differently than we've done before. Some of those things are being able to embed and integrate all of the data that runs throughout your organization so that you have an enterprise-wide view of the status of the enterprise. Once you've got that integrated view, then you want to up-level that view from strings into things, which George likes to talk about a lot, and I'm sure we'll mention that again several times in this episode. And that's important because right now, we're all doing kind of low-level SQL stuff. And the way you up-level this is you capture all of the intelligence that's traditionally been siloed across the organization in operational apps. And if you're nodding along right now, trust me, you're not alone if you're thinking about trying to get your arms around this challenge. It's a big challenge, it's a big transition, and lots of companies are trying to navigate it. So intelligent data apps, they're a big deal, because they can call on all of the things, and each other, and help us better engage with customers and with people with insights and information throughout the organization. So while all this sounds very cool, it doesn't come without challenges, and putting the intelligence into things requires metadata, which we think about and talk about around here as sort of the new holy grail.
But that metadata needs to be standardized and shared throughout the organization. How do we make that happen? So George and I spent a lot of time thinking about and talking about this topic, and we wanted to explore Salesforce Data Cloud. What we're gonna be diving into today is taking a look at how Salesforce is reframing the modern data stack with Data Cloud, and to help us navigate that conversation, we've got, as I mentioned, Rahul Aradkar from Salesforce. Rahul, welcome. It's so great to have you. Thank you. Thanks, Shelly, delighted to be here, and thanks for having me as your guest. Absolutely. Well, we are so excited to be sharing some gray matter with you. Before we dive into our conversation, though, indulge me, please, and just tell us a little bit about your career backstory. Some of the things you've done, how'd you get from there to here, any surprises? I've been at Salesforce for a couple of years, driving essentially our data and AI-driven transformation for business applications, more specifically for CRM, where Salesforce has been the number one CRM company in the world for a long time. We call ourselves the number one AI CRM right now. I did a few startups before this, related to applications, AI and data. I have a systems background, in that I worked at Microsoft for a long time on Microsoft Office 365 and Azure middleware. I started in the virtualization and data stack, and I've done different layers, if you may, of the stack within Microsoft. I have an interesting background before that, in that I spent a long time in the automotive industry doing high performance computing simulation work. Think of it as computational simulation of crashes, how vehicles really swerve when they go around, how transmissions work and how engines work. And I grew up in India; I'm an immigrant from India. I spent my first formative years after engineering working for the Indian space research program on satellites and rockets.
I joke around saying that I could be called a rocket scientist, in that sense, in my career there. But it's been a pleasure to go through the entire journey, if you may, from where I started to where I am right now. We are excited about where we are taking the industry, as a group and also as a company, towards this AI and data-driven transformation. And you refer to that as the road to the intelligent app. So in some sense, in a nutshell, that's my background. Absolutely, well, thank you so much. And thank you for indulging me here. I will tell you, every answer that I get to this question is different. And I love being able to see how somebody's gone from the automotive industry to being a rocket scientist. I think that we all have such varied and interesting career journeys, and sharing that I always find super interesting. So now we're gonna talk about Salesforce Data Cloud. From the outside looking in, it looks as though the value it delivers is pretty simple. It helps you bring all the metadata that defines application objects from the core operational applications to a data cloud that's architected to be a real-time historical system of truth for analytics on first and third party data. Okay, that whole system of truth, incredibly important. This matters because with shared metadata, setup then becomes a no-code configuration, not ELT coding, which makes it so much easier for users, right? And then customers are able to operationalize or activate their analytics back into Salesforce with the same no-code configuration. And then what I see here is simplicity. And argue with me if I'm wrong, but to me it looks like it's designed to be simple, as simple as something kind of complex can be. I think that's so important, because that's what all customers want today: simplicity, ease of use, I don't have to go to IT for everything that I wanna do.
So anyway, share a little bit, if you would, the vision of Salesforce behind what you're doing. And if you would share with us a little bit about the impetus and the genesis of Data Cloud and the setup for the metadata-driven approach, that would be awesome. Sure, that's a great setup, Shelly. The call-out on the metadata, and also the ease of use, and how, if you may, enterprises are going towards business users as well as a combination of developers, it's a great setup. I wanna take a step back and talk a little bit about the vision and the impetus behind Data Cloud. Okay. So first off, we envision at Salesforce a future in which Data Cloud delivers on the promise of the Customer 360. It is essentially the ability for our customers to become customer companies and deliver the promise of engagement automation through AI insights and BI insights across all channels and across all touch points. And in that sense, our vision, as we have talked about, is to provide a future of data plus AI plus CRM plus trust. We have been talking about that for the last six months, starting with Dreamforce. And in that regard, our mission has always been that we want to get our customers to become customer companies. If you take a step back and take a look at the days in the past, enterprise applications like CRM prioritized business logic and efficiency through automation, while data, analytics and AI were relegated to other systems. So in some sense, intelligence was relegated to other systems. Data was an exhaust from apps and was used for analytics and later for AI, and that was the realm of different departments within enterprises. Now there has been an inversion. The inversion that we're seeing is where data is defining processes and AI is making those processes dramatically more effective and more efficient. That's the journey we are on.
We've already started really transforming our customers and delivering value to our customers. Essentially, new applications, we believe, will prioritize engagement and experiences driven by data, AI and automation, which will now drive business processes instead of the other way around. To that end, our customers have been asking us for many years about this digital transformation that's underway, and there were a few shifts that happened over the last few years. One of them was, as we all know, engagement data and all the digital signals that we're getting within enterprises have dramatically increased, exponentially gone up. And that became even more the case during the pandemic. Another shift that was underway was the secular shift away from third-party data, the cookie-less world, if you may, where privacy was driving a lot of those shifts. Now what that means is that first-party data and engagement data, all of that becomes all the more valuable for it to be used to drive data and AI-driven engagement. In the past, we were doing only transactional data, and now we're looking at all the other engagement data that needs to be brought to life. That was the impetus behind Data Cloud. Our customers were trying to build all of this through a variety of different fragmented systems, and they are in the business of delivering value to their downstream customers. They're not in the business of stitching together all these different fragmented systems. So that led us to believe that we need to have a data cloud-driven approach so we can drive this AI and BI-driven engagement and automation for our customers. That's the genesis behind it. We've been at it for the last several years, and Data Cloud is the fastest growing organic innovation in the history of our company. I am in no way surprised by that. So tell us a little bit, if you would, about the metadata-driven approach and why that's such a key part of this conversation.
Sure. So Data Cloud is a full-fledged intelligent business data platform, if you think about it that way. It allows our customers to bring in federated data, transform data, harmonize data, unify, and create AI and BI insights and actions on these. And it's an open and extensible data stack that's deeply built into the Salesforce metadata platform, which essentially means it enables consistent business semantics, declarative programming through the metadata, and business-friendly interfaces. The idea that you would have shared everything, shared data across multiple applications, is harder to achieve than shared metadata, if you may. That's the world we are in. The reason Salesforce has historically had the number one CRM in the world is because we had the metadata platform that we built decades ago, and customers have really built on top of it. They have modified our metadata, our objects, if you may, in the metadata, as in they use our out-of-box objects, they modify them, they have custom objects, and we provide in-place upgrades, and our metadata platform abstracts all that away. Now we are leveraging all of that, along with all of the engagement data that we get with Data Cloud, which is deeply embedded into the platform. What does that mean? All objects, engagement objects, in Data Cloud are standard objects, or what we refer to as sObjects, if you may, in the core platform. And all core platform objects then can be enriched using the data from Data Cloud. So we are leveraging the value of all the metadata that we have had for several years, and we're building on top of it. That essentially implies that the non-transactional data, data that is structured and unstructured (we'll talk a little bit about unstructured, a recent announcement that we had), will now be brought to life through the metadata layer that we have innovated on for the last couple of decades. Makes perfect sense.
George Gilbert, my colleague, my friend, my fellow tech nerd, this is the longest you've been quiet in any conversation we've ever had, and now I'm gonna unleash you. I know that you have questions for Rahul about Data Cloud, so take it away. So Rahul, let me unpack a little bit what you were talking about, especially with the metadata. For many years we had the rise of Hadoop, where people were trying to create a sort of centralized analytic data platform, and then we moved that into the cloud with Snowflake and Databricks. Position the Salesforce Data Cloud technically: you've still got a need for this system of truth that centralizes all your operational data and brings in third party data, but the value you're adding is that you're giving meaning to all the Salesforce data objects. And that whole data model is now transplanted in metadata on the analytic side, and you're keeping them both in sync, both the metadata and the data, if you need to activate one side to the other. Can you elaborate a little more? In other words, explain to us how you're creating this new layer of value, that you're not providing another, you know, analytics SQL DBMS, you're providing a layer on top of that. Yeah, I think it might be worthwhile, first off, to start by saying that you bring up a really great point about another analytics platform, right? Let me address that first before we get into the metadata details. If you think about Data Cloud, it's essentially not static. We like to call it not a static body of water, if you may; it's not a static lake. What does that mean? It's an active business data platform that delivers data and AI actions, data-driven actions and automation. So you brought up this notion about having other platforms in enterprises. Although Data Cloud allows for data federation through disparate data sources, it doesn't essentially stop at orchestration.
The purpose is not to pass through or move data around from different places for movement's sake. Rather, the purpose is to unify that data, which is what Shelly mentioned, for example, a minute back, ingest through what we refer to as zero ETL, or federation and bring-your-own-lake, and drive insights and AI to service the scenarios that we've talked about before, which is engagement, automation, et cetera. So that's one thing that I wanted to make sure of, that it's not like any other regular static body of water. The other thing to account for, when it comes to metadata, you brought this up as well, George, is that, as I mentioned, the Salesforce Data Cloud is built into the Salesforce platform. What does that mean? I mentioned this before: all objects in Data Cloud are standard objects, what we refer to as sObjects, enabling seamless use across Flow automation, or Apex coding, or Lightning, or other Salesforce packaging that allows our ISVs to build on top of it and package and ship. What it essentially implies is that earlier, the objects that were metadata, if you may, in our platform were the ones that were transactional objects. But now all the engagement data, whether it comes in structured or unstructured, relational or otherwise (we'll talk about unstructured), can now be expressed as objects, as sObjects, as well. And the objects in our existing platform can also be enriched through all the data that we're capturing inside of Data Cloud, which then can be used whether in our first party transactional apps, or our customers could use it in their own engagement surfaces that they're running alongside Salesforce. Okay, this starts to unpack what's going on under the covers. It sounds like you needed new data objects.
We've been talking in the past about the evolution of the data platform needing a semantic layer, so that this is up-leveling strings to things. From what I'm gathering, there are the semantics of the data objects that were in the Salesforce operational applications, and there are new ones that add to that data model that are in the Data Cloud. And maybe what you can elaborate on now is some use cases where you're taking this enriched analytic data and operationalizing or activating it, either back into Salesforce or into other applications, and add some color for us. So before you dive into that, what I would like for you to do, George, is, I mentioned strings to things in the beginning part of this conversation, and I don't want to assume that our audience knows what we're talking about. So will you just take a quick second and explain why that is important, the transitioning of strings into things? Okay, so this would be, and Rahul will probably provide this when he talks about an example, but rather than a table that has a bunch of columns, you might have a customer object, and there might be engagement objects, like what they've done with various touch points. And so you're talking about a customer journey rather than a bunch of strings about transactions and interactions. Makes sense, okay, thank you. Rahul, take it away. Yeah, so just to respond in a minute here to what was asked before, in terms of use cases, right? It'll be interesting to have the use cases; we can attach them to what Data Cloud really brings. So let me start by addressing a few use cases, and then from there we can take a step back and start looking at how it works, how those use cases come to life. So I'll take a traditional use case, which is a CRM-centric use case, and a non-traditional use case that will be more common in the future, a customer engagement use case that is non-CRM-like.
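To make the strings-to-things idea concrete, here is a minimal, hypothetical sketch (plain Python, not Salesforce's actual object model): instead of rows of loosely typed strings, engagement is modeled as typed objects attached to a customer, so downstream logic can reason about a journey rather than scanning raw columns. All field and class names here are illustrative assumptions.

```python
from dataclasses import dataclass, field

# "Strings": a raw row as it might arrive from a clickstream log or export.
raw_row = {"cust": "C-1001", "evt": "click", "url": "/mortgages"}

# "Things": typed objects that carry meaning the platform can act on.
@dataclass
class Engagement:
    kind: str       # e.g. "click", "email_open", "purchase"
    channel: str    # e.g. "web", "mobile", "email"
    detail: str

@dataclass
class Customer:
    customer_id: str
    engagements: list = field(default_factory=list)

    def journey(self):
        """The ordered sequence of touch points: a journey, not a table scan."""
        return [e.kind for e in self.engagements]

c = Customer("C-1001")
c.engagements.append(Engagement(kind=raw_row["evt"], channel="web", detail=raw_row["url"]))
c.engagements.append(Engagement(kind="email_open", channel="email", detail="mortgage-offer"))
```

The point of the sketch is that `c.journey()` yields a meaningful sequence of touch points, which is what the metadata layer lets applications work with directly.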
So let's take the example of InterBanko. That's a bank in Brazil, a fintech bank. It has a variety of different downstream businesses, whether it's consumer banking or commercial banking, mortgages, et cetera. They also have a gaming business downstream. They have several of these downstream businesses. Within Data Cloud, in addition to bringing in all the first-party data sources from our own CRM applications, which we bring in with no ETL (it just shows up as objects inside Data Cloud), we have an array of different ways in which we can either federate or bring data into Data Cloud. Now, in InterBanko's case, we have a web and mobile SDK that allows bringing in and streaming web or mobile clickstream and other data. So what they did was they instrumented that, they got in all of that data, and they ran it through our transformation, harmonization and unification pipeline. And once it got there, they built insights on top of it. These are simple cubes; we have a whole bunch of out-of-box cubes that we build, and we also provide simple ways for business users to develop these cubes for their own use cases. Now, imagine a world in which somebody's clicking through, say, home-related pages because they're buying a home. They build cubes using those clicks, and using those cubes, they build segments. And using those segments, they drove, the word that was used earlier, journeys. They took their customers through journeys, and that's their activation. Now, with and without Data Cloud, the difference was pretty dramatic. They got 30X better engagement. 30X better engagement. Those are not our numbers; those are their numbers. That's a big number. That is a big number. That's impressive. Exactly, Shelly, you got it right. It is a pretty big number. And they claim that they got about 20X ROI. What does that mean?
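The clickstream-to-cube-to-segment flow described above can be sketched roughly like this. In Data Cloud this is declarative and no-code; the Python below is only meant to show the shape of the logic, and the event fields, page paths, and thresholds are all hypothetical.

```python
from collections import Counter

# Hypothetical clickstream events after ingestion and harmonization.
events = [
    {"customer": "ana",   "page": "/home-loans"},
    {"customer": "ana",   "page": "/home-loans/rates"},
    {"customer": "bruno", "page": "/gaming"},
    {"customer": "ana",   "page": "/home-loans/apply"},
]

def build_cube(events, topic):
    """'Cube': aggregate clicks per customer for pages matching a topic."""
    return dict(Counter(e["customer"] for e in events if topic in e["page"]))

def build_segment(cube, min_clicks=2):
    """'Segment': customers whose interest crosses a threshold."""
    return sorted(c for c, n in cube.items() if n >= min_clicks)

cube = build_cube(events, "home-loans")
segment = build_segment(cube)  # this segment would feed a journey downstream
```

A journey engine would then take `segment` and drive the next touch point for each member, which is the activation step Rahul describes.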
That meant that, because we have business user friendly, metadata-driven, declarative ways in which they were creating these segments and delivering the journeys downstream, they saved a significant number of hours and a significant amount of dollars, or local currency, to deliver those journeys. But more importantly, they delivered more value to their customers. So that's one example. The reason I'm calling that a traditional CRM use case is because the first party engagement app there was a CRM app, and it's a customer-driven journey. Another example of a use case would be an automotive company, and there's more than one automotive company doing this, using a whole bunch of IoT signals. And this applies even to a gaming company in Las Vegas, using IoT signals that are getting ingested into Data Cloud. What has IoT got to do with CRM, you might ask? Or what have gaming signals coming from a slot machine got to do with CRM? What they would like to do is take those signals and attach them to how they engage with customers. You might be walking around playing a few slot machines, and they measure the signals coming off them, and they have certain thresholds. Past certain thresholds, they find that, hey, the slot machine's not doing what it's supposed to do. They actually deliver some coupons to the app on your cell phone. To entice me. Exactly, to entice you to do more. So that is attaching you to the engagement system that they have, known as a CRM engagement system, right? It could be an existing app from Salesforce, or it could be their own engagement surface. So those are a traditional CRM use case and a non-traditional CRM use case. And by the way, along the way, the way they engage is with BI and AI insights.
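The slot-machine scenario is essentially a threshold-based trigger: an IoT signal crosses a limit, and an engagement action fires. A toy sketch, with entirely made-up signal names, thresholds, and offer types (the real system would route the action through the CRM engagement layer):

```python
def maybe_send_coupon(machine_signal, threshold=5):
    """Return an engagement action if play activity drops below the
    threshold, else None. Signal fields here are hypothetical."""
    if machine_signal["plays_last_10min"] < threshold:
        return {
            "action": "push_coupon",
            "customer": machine_signal["customer"],
            "offer": "free-spin",  # delivered to the app on the customer's phone
        }
    return None

# A quiet machine triggers an enticement; a busy one does not.
action = maybe_send_coupon({"machine": "S-42", "customer": "guest-7", "plays_last_10min": 2})
no_action = maybe_send_coupon({"machine": "S-9", "customer": "guest-8", "plays_last_10min": 9})
```

The interesting part is not the rule itself but where it runs: because the signal is attached to a customer object, the action lands in the same engagement system as every other touch point.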
The BI insights could be cubes that they created based on engagement they're seeing over the last few minutes, in the case of InterBanko's streaming insights, or it could be some batch-related insights they're creating, or it could be a prediction that was driven through an AI model. And now we're starting to introduce a generative AI part of it as well, where they could generate some responses that they want to provide in the moment, so that they can engage with their downstream customers. So those are two examples. Hope that helps, George, to connect the dots with your question. Well, I think that what I get out of this, when I'm listening to the core values of Data Cloud, is that Salesforce is not necessarily an inexpensive business solution. I think everybody knows that, and I'm sure I'm not surprising you by that, Rahul. But the reality of it is, for the person or the team who made the decision to start working with Data Cloud and who's able to get that kind of an ROI, that person is a superhero within the organization. And I mean, that's the kind of use case you're looking for, right? Isn't that what everybody's trying to do? Simplify, deliver better results, better customer experiences, better employee experiences, and also have an impact on the bottom line. You got it. So let me continue this process of unpacking with the use case that you were talking about, Rahul. Let's do a sort of before and after of a customer: if they were just using the modern data stack, how they would have to try and ingest and shape all that data, resolve it into individual entities, then do the insights and operationalize or activate it. Start to explain to us how this is so much easier with this extra metadata layer. Maybe you can describe how you collapse these code-heavy pipelines into something that's more configuration driven. That's a great question. So first off, that means digging into how Data Cloud works in some sense.
And then I'll make a compare and contrast as to what it would look like if it was not that, right? So first off, let's start with ingestion. We have built-in connectors across all Salesforce clouds: Sales, Service, Marketing, Commerce, others. We basically bring in data fully shaped and formed, in that there is no ETL necessary; it shows up as objects. In a traditional system, you'd have had to use an ETL system to shape and form it when you bring it into a warehouse or a lakehouse. That's the first distinction. We also have an array of connectors that bring in data from other hyperscalers, for example AWS, Azure, GCP, and other databases and applications as well. I referred to the fact that we have the web and mobile SDK that brings in clickstream data; all of that is then modeled as objects. We have MuleSoft as an asset, and all the application-centric connections that MuleSoft delivers are first class in Data Cloud, so we have access to that data as well, right? That data is typically very hard for traditional warehouses and lakehouses to reach. Now, we've also innovated heavily in the space of what we refer to as zero copy, or zero ETL. It's essentially federation and sharing. Federation essentially means we could mount other lakes and warehouses, and without having to copy data from any of these, whether it's Snowflake or Databricks or Redshift or Google BigQuery (and all of them are in different stages of the roadmap), we could essentially not move data around, but have data live where it is and also put it to use inside Data Cloud. So that's another innovation that we have driven, and quite frankly, if I may say so myself, it's industry-defining. We are playing the role of Switzerland, if you may, where we are really transcending formats.
For example, we have an Iceberg-driven sharing format with Snowflake, we are doing query-based formats with Databricks, and we are also elevating that to look at Delta, their format, as well. So we are working with all formats in that sense; we're not moving data around. That's a pretty big deal for enterprise CIOs and CDOs: leaving the data where it is, for it to be resident and governed and secured in the way they want it to be secured, without it having to move around, if you may, right? So that's from an ingestion standpoint. Then you get into the transform layer we've built in there, where transform is drag-and-drop type transform, or you could speak SQL semantics, or you could do dbt transforms; we allow that as well, from a transform standpoint. Next is the modeling layer, the harmonization layer. The harmonization layer is there to help with different schemas, right? It's one thing to connect the data sources, but the main problem that people have is the different schemas and formats that come from different systems, right? To help alleviate that problem, Data Cloud provides a very sophisticated mapping layer of what we refer to as data model objects, or DMOs. What it allows you to do is harmonize different schemas into a canonical model, and we are shipping quite a few of these models out of the box, and because of our extensibility and packageability, our customers can do the same thing, and our ISVs can do the same thing. What it does is allow you to bring data from many, many different sources and map it to the canonical format, allowing the applications to work off of this canonical model, which is entirely metadata driven. What we're essentially doing is isolating the schema challenges, and we are isolating the source system from the place where you want to activate the data, right?
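The harmonization idea, mapping divergent source schemas onto one canonical model, can be sketched as declarative field mappings. This is a toy illustration: the field names and the mapping format are invented for the example and are not Salesforce's actual DMO specification.

```python
# Two sources describing the same concept with different schemas.
crm_record = {"FirstName": "Rahul", "Email_Address": "rahul@example.com"}
web_record = {"fname": "Rahul", "mail": "rahul@example.com"}

# Declarative mappings: source field -> canonical field. These play the
# role of the metadata; no per-source transformation code is written.
CRM_MAPPING = {"FirstName": "first_name", "Email_Address": "email"}
WEB_MAPPING = {"fname": "first_name", "mail": "email"}

def harmonize(record, mapping):
    """Project a source record onto the canonical (DMO-like) schema."""
    return {canonical: record[src] for src, canonical in mapping.items() if src in record}

a = harmonize(crm_record, CRM_MAPPING)
b = harmonize(web_record, WEB_MAPPING)
# Downstream activation only ever sees the canonical shape, so adding a
# 31st source is just one more mapping; nothing changes downstream.
```

This is the isolation Rahul describes: the canonical model decouples the activation side from however many source schemas feed it.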
So for example, my own system that I demo in right now started with three sources; it's grown to about 30-some sources, but nothing has changed downstream, right? From an activation standpoint. So that's essentially the harmonization, and then you have the unification, in that now I have done the schema harmonization, but there are multiple Rahuls in the system. There could be Rahul at Gmail, there could be Rahul at Live, I could have an email, I could have a mobile signal, I could have multiple touch points, multiple channels through which I'm coming in. What is the source of truth that the business needs to use to engage with me, right? We drive this unification, and the unification is driven through deterministic ways, in which you have rules-based unification, or we have probabilistic, AI model-driven unification, and our own Salesforce Research has built models for us, built into Data Cloud, that allow for unification using probabilistic techniques. Once you're done with it, essentially what you're doing here is creating segments out of it, creating insights out of it. Using that, you could drive data-driven activations, you could drive table-driven activations, you could drive segment-driven activations, or you could just feed it to an analytics engine like Power BI or Tableau. Tableau is a first class citizen on Data Cloud, or it's Power BI, or you could bring your own model, in that I could take this and feed it to SageMaker or Vertex, where I could build AI models and bring the inferences back. So it opens up a plethora of opportunities from an activation standpoint.
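Rules-based (deterministic) unification can be sketched as transitive key matching: profiles that share an email or phone collapse into one unified individual. This is a toy version of identity resolution, not the probabilistic models Salesforce Research built, and the profile fields are hypothetical.

```python
# Hypothetical profiles arriving from different channels.
profiles = [
    {"id": 1, "email": "rahul@gmail.com", "phone": None},
    {"id": 2, "email": "rahul@live.com",  "phone": "555-0100"},
    {"id": 3, "email": "rahul@gmail.com", "phone": "555-0100"},
]

def unify(profiles):
    """Merge profiles sharing any match key into unified individuals.
    Merging all matching groups at once keeps the result transitive:
    profile 3 links the gmail identity and the phone identity together."""
    groups = []
    for p in profiles:
        keys = {k for k in (p["email"], p["phone"]) if k}
        hits = [g for g in groups if g["keys"] & keys]
        merged = {"ids": {p["id"]}, "keys": set(keys)}
        for g in hits:
            merged["ids"] |= g["ids"]
            merged["keys"] |= g["keys"]
            groups.remove(g)
        groups.append(merged)
    return groups

unified = unify(profiles)  # all three profiles resolve to one individual
```

The unified group is the "source of truth" the business engages with; segments and insights are then built over unified individuals rather than raw profiles.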
Now, if you think about that whole pipeline, starting with ingestion, federation, transformation, harmonization, unification, creation of BI and AI insights, and activation, whether it's data-driven activation or otherwise, that whole pipeline is metadata-driven, and it's intelligent. Now, imagine if you were to do that in a traditional data platform: you would have had to stitch a lot of things together for that to come to life. We are not competing with, and we are not, a database system. Those find value in a lot of different places; enterprises really use them effectively, and there are big companies built around them. We coexist with them, but those are not what we're competing against. We're really delivering business value for our customers through their CRM engagement applications. Yeah, that's awesome. Okay, just to be clear, this is really a layer above the modern data stack, in that the metadata has made this pipeline of ingesting and modeling, and the modeling step, I assume, is to harmonize data from all the different sources into Salesforce data objects, or a data graph. And then you're unifying, so you have all the contextual information for each individual entity. Now describe to us a little more what sort of analytics you might do on individuals, on segments, on populations, and then how you activate that data, either back into Salesforce or into third-party applications. And if it's third-party, maybe give us some examples of those too. Sure, so let's take some examples. If I want to do some profitability analysis, as an example, where do I find a single definition of the metrics? And in addition to that, I need to get an understanding of the lineage, where the data came from, and the consistency of the data where it's coming from. Another example would be, how do I get access to lifetime value metrics, for example, or forecasts and models and predictions and things like that?
Beyond the ingestion, federation, transformation, harmonization, et cetera, we have built this concept known as the calculated insights module. This is essentially a central repository of all kinds of metrics and analytics views, and also their definitions. This is business user friendly. This is where we track, if you may, the data model, the DMOs that I referred to, the data model objects that contribute to these calculated insights. And these data model objects are essentially composed of different lake objects, whether they are resident inside Data Cloud or we are doing federation. And their values are continuously changing, because something could have changed upstream in the systems, and as a consequence, these calculated insights are changing on an ongoing basis, whether they're AI or BI insights. Now we're using those to drive engagement downstream, right? Also, lately, we've released in pilot form an analytics semantic layer to support guided analytics across the Salesforce intelligent applications. We refer to intelligent applications as being like revenue intelligence or segment intelligence, for example, right? And we introduced this semantic data model, the SDM, as Data Cloud's adoption goes well beyond Data Cloud itself, into Tableau and analytics in other systems. It can be used in the analytics applications I referred to: service intelligence, marketing intelligence, segment intelligence, revenue intelligence. So in addition to driving the calculated insights that are needed for engagement, we also have this semantic layer that we have built inside of Data Cloud as well. Can you, Rahul, elaborate on that a little bit, like drill into what revenue intelligence looks like and this semantic layer that supports it? Right, revenue intelligence. Think about it as intelligent applications built on top of transactional applications.
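The "single definition of a metric" idea behind calculated insights can be sketched as a small registry: the metric is defined once over the canonical objects, and every consumer resolves it the same way. The registry format and metric names below are hypothetical, used only to show the pattern.

```python
# Canonical order records (the kind of data a DMO might expose).
orders = [
    {"customer": "ana",   "amount": 120.0},
    {"customer": "ana",   "amount": 80.0},
    {"customer": "bruno", "amount": 50.0},
]

# One shared definition per metric, instead of per-team SQL copies.
METRICS = {
    "lifetime_value": lambda rows, cust: sum(r["amount"] for r in rows if r["customer"] == cust),
    "order_count":    lambda rows, cust: sum(1 for r in rows if r["customer"] == cust),
}

def calculated_insight(metric, rows, cust):
    """Every dashboard, journey, and app resolves the metric identically."""
    return METRICS[metric](rows, cust)

ltv = calculated_insight("lifetime_value", orders, "ana")
```

Because upstream `orders` are continuously refreshed, re-evaluating the same definition yields the continuously changing insight values Rahul describes, while lineage stays traceable to the contributing objects.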
Like for example, let's take segment intelligence. You have marketing automation that you're driving through Salesforce marketing cloud. And when you're doing the marketing automation, you're creating segments and you're driving journeys through them. The journeys could be ongoing email journeys, or they could be Twitter, Facebook, or SMS journeys. And you're continuously measuring the efficacy as well as other aspects of those journeys you're taking your customers through, as an example. And we also have deep integration with ad tech systems like Google, Meta, Amazon. So you're driving paid media, privacy-safe, first-party-driven journeys through these ad systems. Now, how do you know whether what you're driving there is effective or not, right? So we bring all that data back as a feedback loop. That's what I meant by it being active, as in we bring it back as a feedback loop and we build, among other things, analytics and AI models. An AI model could be a lookalike segment, as in saying, hey, something was resonating really well over the last five days, over the last week, or the last hour as you're driving these campaigns. I want to do a lookalike segment. I could just build a model that allows me to build a lookalike segment. And all of those, the effectiveness of these journeys that you're taking your segments through, we refer to that as segment intelligence. Imagine an analytics app that's in the hands of marketers who are looking at this and saying, hey, I'm now actually measuring the outcomes of my marketing campaigns, my marketing automation. That's an example. And that's all built on top of data cloud. It's built on top of the data in data cloud, but more importantly, it's built on top of the semantic layer that we built inside of data cloud. And it's happening in real time. It is, absolutely. Thank you, Shelley, for bringing that up.
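(Editor's note: the lookalike-segment idea Rahul describes, finding customers whose engagement resembles a high-performing seed segment, can be sketched in a few lines. This is an illustrative sketch only, not Salesforce code; the feature names, the cosine-similarity measure, and the centroid approach are our assumptions about how such a model might work in its simplest form.)

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length feature vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def lookalike_segment(seed, candidates, k=2):
    """Rank candidate customers by similarity to the seed segment's
    average engagement profile and return the top k customer IDs."""
    dims = len(next(iter(seed.values())))
    centroid = [sum(vec[i] for vec in seed.values()) / len(seed)
                for i in range(dims)]
    scored = sorted(candidates.items(),
                    key=lambda kv: cosine(kv[1], centroid),
                    reverse=True)
    return [cust_id for cust_id, _ in scored[:k]]

# Hypothetical engagement features: [email_opens, clicks, purchases]
seed = {"c1": [9.0, 4.0, 2.0], "c2": [8.0, 5.0, 3.0]}
candidates = {"c3": [8.5, 4.5, 2.5],   # very similar to the seed
              "c4": [0.5, 0.0, 0.0],   # mostly inactive
              "c5": [7.0, 4.0, 2.0]}
print(lookalike_segment(seed, candidates, k=2))  # -> ['c3', 'c5']
```

In production such a model would run over far richer feature vectors and refresh continuously as the feedback loop brings new engagement data back, which is exactly the "last five days, last hour" recency Rahul emphasizes.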
Which is, I mean, I've been doing this long enough that I remember the days of developing a campaign and launching a campaign and waiting and waiting and waiting to be able to try to measure, so the real-time aspect of this is amazing. Absolutely, and the reason why the real-time aspect is really amazing is that imagine a change in a signal that you captured about the customer, about the campaign, about the engagement, et cetera. That change is now fed into not just the analytics, but also the downstream automation you might have driven. As an example, since you brought up real time and change data capture, or CDC, and change data feed, CDF, Salesforce itself uses data cloud for driving its entire digital business. So if, for example, the sales cloud team that has implemented Salesforce for Salesforce sees a change in a data feed, it's immediately detected, and if it has crossed a certain threshold, we send a Slack message. So it's seamlessly connected to Slack. The Slack message is essentially telling the sales agent, or in this case maybe the customer success agent, hey, there is a change that happened, a threshold has been crossed, where the customer's adoption has shot through the roof, or the customer's having failures, or the customer's unhappy. So you might want to contact that customer. That's almost like automation in real time, so to speak. Well, and so that delivers on the campaign measurement aspect, but it also delivers so effectively on the customer service part of it, because as we know, it only takes a minute, a second, for somebody to have a bad experience and for that to escalate. So I think that's a really big part of the value prop as well.
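(Editor's note: the change-data-feed-to-alert loop Rahul describes, detect a threshold crossing in a feed and push a Slack message, reduces to a small pattern. A minimal sketch follows; the feed shape, metric names, and 30% threshold are our assumptions, and the Slack call is a stub, since a real integration would post to a Slack webhook or the Slack API.)

```python
def detect_threshold_events(feed, threshold=0.30):
    """Scan a change data feed of (customer, metric, old, new) events
    and flag any relative change beyond the threshold, the kind of
    signal that could drive a downstream alert."""
    alerts = []
    for cust, metric, old, new in feed:
        if old == 0:
            continue  # no baseline to compare against
        change = (new - old) / old
        if abs(change) >= threshold:
            direction = "up" if change > 0 else "down"
            alerts.append(f"{cust}: {metric} {direction} {abs(change):.0%}")
    return alerts

def send_slack_message(text):
    # Stub: a real integration would post to a Slack webhook or API.
    print(f"[slack] {text}")

# Hypothetical feed events: (customer, metric, old value, new value)
feed = [
    ("acme", "weekly_logins", 100, 180),   # +80%: adoption shot up
    ("globex", "api_errors", 10, 11),      # +10%: below threshold
    ("initech", "weekly_logins", 50, 20),  # -60%: possible churn risk
]
for alert in detect_threshold_events(feed):
    send_slack_message(alert)
```

The point of the pattern is that the detection runs on every change event as it arrives, rather than on a nightly batch, which is what makes the "contact that customer now" automation possible.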
Yeah, just on that topic for a second, Shelley, we started data cloud with an intent to drive marketing automation, all along knowing that we would be driving what we refer to as C360, as in across all touch points, all engagement channels, whether it's outside the funnel, inside the funnel, customer service, sales, marketing. We knew we were going to let our customers touch their customers downstream using data cloud and AI. That was the vision, but we started with the product-market fit in marketing automation, which has really taken off, and over the last year or so we have seen significant use well beyond marketing automation, which is your point, in all aspects of how our customers are touching their customers. Well, and it also impacts employee experience, and the better the tools that we can provide our team to do their jobs more quickly, more efficiently, more effectively, the happier they are. That resonates with customers, they feel like they're better served, they're happy, they stay. All of these things work together. So it's a really important part of the equation. Absolutely. Yeah. So I have a quick question about customer advantages when it comes to having one vendor specify all of the metadata that we've been talking about here versus best of breed, where each component knows about and builds on the knowledge in all the other pieces. What's in it for customers to have one vendor sort of specify that metadata? It's an interesting question. Like I said previously, we coexist with existing data assets. Right. You might have, for example, as an enterprise CIO, Shelly, and this is a real example, BigQuery, and in BigQuery you have all the order information associated with some ERP system. We're not suggesting that you master that in data cloud.
We will just federate with your asset, which is a BigQuery asset on which you're driving your ERP and where you've really got your order information, and that would just be a virtual copy of your data that is basically referenced, if you may, inside of data cloud without it having moved from BigQuery. So we are not mastering that inside data cloud. We're just using that master that you already have. So the idea is, I think I mentioned this earlier, open and extensible is a key design principle for data cloud. We embraced open source standards, right from the Spark layer all the way up to the file layer, and we contribute back to open source. That has allowed us to federate and coexist easily with CIOs' and CDOs' data assets, or data estates, if you may. Yeah, I think that's incredibly important. I think that so many vendors today want you to do everything humanly possible within their platforms, and then you sometimes have a vendor lock-in situation. A federated approach is so much more appealing to customers, and it's better for them. So I really like that. Thank you. So let me just drill into that for a second, Rahul. I assume you've still got this data graph where you've modeled everything, and some of the data is not resident in data cloud, but those objects refer virtually to external data entities. So when someone is executing a query in data cloud and some of the data is external, let's say in BigQuery, what does that query look like? How does a developer navigate? And does the query engine call on BigQuery to execute part of that query, and under the covers are you executing part of it locally for whatever data is resident locally? Yeah, when we do federate there, George, the simple answer is, when we federate with a Snowflake table, for Snowflake-type data, we do a predicate push down.
Essentially the compute optimization is happening at their end. It's optimized for us to make the right query, but the predicate is essentially pushed down and those systems are really performing it, and any change that happens there is captured at our end as well. Okay, so then let's bounce up to how these analytic applications are built. You've talked about how, because it's metadata driven, you can do low code, no code, but if you need to drop down into pro code, there's an option. Maybe tell us what the different developer experiences look like. Sure, let's take an instance. Let's use AI as an instance. In the no-code scenario, we surface insights and predictions through Einstein in the line of work. On the marketing side or on the sales side, you could have some propensity scores that just show up in the line of work. That's essentially unbeknownst to the user; it just shows up in the line of work. As a matter of fact, we deliver over a trillion predictions per week from predictive AI. So that's a no-code type scenario. Then you have the low code, in that we have built this thing known as Einstein Studio inside of data cloud that allows our users to do configuration-driven models using data cloud assets, so they can do very simple configuration-driven models. Those are low-code scenarios. Then going to the full spectrum, considering the maturity and the evolution, if you may, of AI, our customers are telling us, and we have talked about this in the context of a few of the customers with pilots that we ran for bring your own model, our financial services customers for example. There is a medical diagnostics company in the Northeast that is doing it as well, where they drive predictive models using SageMaker as an example.
We want to meet our customers where they want to be met, in that their professional developers are doing what they're doing in SageMaker. We have a partnership with Amazon that allows for a seamless way in which our customers can federate with data cloud to get data cloud snapshots to drive and train those models, and bring those inferences back into Salesforce through data cloud across the entire surface area of Salesforce, whether you want to drive a flow-based automation, or you want to inject it into an Apex code segment or snippet and automate using that, or you want to inject it into sales cloud, or into a lightning component. All those inferences are easily available. And as Shelly mentioned earlier, it's not just about what happened, it's also about what is happening, in that a change in any feed changes the model and the inferences change. It's a full loop; it comes back. So we're allowing our professional developers also to have a full say in how they're delivering value through their business applications. So it's the whole spectrum. I'm using AI as an example, but that can be applied to other aspects too, such as transforms, or building calculated insights. They all have the no-code, low-code, pro-code spectrum of experiences. Yeah, that makes sense. So George, before you ask another question, I just want to set the stage for the rest of our conversation, because as I expected, the three of us could talk about this for hours and we don't have that much time. So I just want to talk about a couple other things if we can. I'd love to shift to ecosystems and the data cloud ecosystem boundaries. I know you have some questions on that front, and if we can talk through that, that would be great. And then I'd like to shift a little bit to talk quickly about adoption and scale, where you are there, and what's ahead in the future.
So are we good with that? All right, George, what are your questions here as it relates to ecosystems? I know you have some. Well, I guess we did touch on this somewhat when we were talking about the predicate push down to external systems and this core asset of the semantics that's captured in the metadata, which is like the data graph. Now, if you're a customer and you've still got your Databricks system or your Snowflake system, but you want to access data that's in data cloud, what do you see when you're going in that direction? We talked about it from within data cloud accessing the external systems. What do you see when you go in the opposite direction? Well, you're accessing data from... From Snowflake or Databricks, and you want to access data in data cloud because you've still got some applications in your modern data stack that you're working on, but you want to integrate data from data cloud. So you want to take it from data cloud and put it into Snowflake or Databricks? Potentially. You either move it or access it. What does that look like? So there's two ways. We talked a little bit about federation, in that, when I was responding to Shelly and she referred to metadata in other systems, I mentioned order information in BigQuery, right? In that case I'm federating and getting that order information for me to drive better engagement, or better transaction-based engagement, in CRM apps. Then you're talking about the other direction, the reverse direction, in that, for example, I want to push some data associated with whatever the customer wants to do inside of Snowflake from data cloud, from Salesforce data cloud. We have this concept known as sharing, data shares.
So we created this concept of data sharing, with fully modeled data, if you may, that is required at the other end, and we keep them as data shares that are then queryable, accessible, if you may, through query-based access or file-based access, depending on what our partners choose to implement. Snowflake has been doing it using file-based access; we're doing it with Databricks using query-based access. So they get access to these objects, and the objects are fully modeled. That's what we refer to as zero ETL. They're not doing any ETL there; they're just getting business-modeled and transformed objects that are being used in the context in which they really want to use them in their systems. In the past, these objects would have had to be modeled, they would have had to be transformed, they would have had to run through ETL. Now they're easily accessible from data cloud as virtual objects that are shared as part of our data shares. And just to build on that, those are individual objects; can they really see the full data graph? That's an interesting one, right? I mean, you've referred to data graphs a few times, so I just want to disambiguate that a little bit. We have three kinds of graphs, so to speak, right? We have this concept known as an enterprise graph, which I refer to as data model objects, if you may, that is the data across the entire enterprise modeled as individuals, as orders, as contacts, or as profiles. We refer to that as the enterprise graph; we have got data across the entire enterprise.
Then we have organizational graphs, this idea that you're separating graphs based on organization, a concept known as data spaces that allows you to separate across multiple organizations, whether it is for regulatory reasons or legal reasons. There's a company out of Europe that does 40 of these data spaces because they have 40 regions in which they operate that legally have to be separated. That's an organizational graph. Then we have this concept known as the customer data graph, or data graph, which is a denormalized view, and that's configurable by the customer, as in everything that the customer has done with the associated profile, whether the profile is a person, a product, or an account, and all the associated engagement data with it. It's a flat, denormalized view that is offered through JSON, and you can traverse the graph and get access to it in real time. That can be fed to different applications; it can be fed to systems outside of data cloud; it can be fed into systems that want the engagement through it. A mortgage company, for example, wanted to run quick real-time lookups when customers call in and fill out their mortgage applications, and we're using these data graphs to feed those engagement systems. Those engagement systems might be going through other underlying data platforms as well, but we're feeding this data graph into them. That's an example of how our harmonization, unification, and creation of this single source of truth allows us to bind all of the engagement data that I referred to into this graph surrounding a profile, whether it's a profile based on a product, a customer, or an account, and then deliver that as something you can use to engage. Okay, and I don't wanna spend too much time on this just because, Shelley, I know we're running short on time, but-
Just to be clear, so the data graph is really the information related to a single customer, or some single entity, that's the customer 360, as opposed to the enterprise graph, which is all the entities that the enterprise is tracking. Okay, I missed that. So now, with that connection-rich information, explain to us the trade-offs of what type of database captures and maintains all that structure, like how you've done it, and what other approaches might there be? Yeah, I mentioned earlier that it's ground-up. We built this innovation from the ground up; it's an organic innovation. We built on underlying open source Spark, so we use Spark-based underlying systems. We've got Parquet as the columnar file format. We use industry-standard Trino-based querying, for example, and we have our own query engine known as Hyper that allows us to accelerate the queries. And we have built the layers on top of it that allow us to feed either SQL queries or what we refer to as SOQL, the Salesforce Object Query Language, and then all the lightning components that allow our ISVs and our trailblazers to create applications on top of it. So that's the stack that we have built end to end. And the way we have represented graphs in there is essentially a hybrid approach. If you take a look at newer graph database systems, they have this notion of nodes and edges, and they're optimized for nodes and traversing along the edges. We are basically doing set-based operations for SQL querying, and we are storing the data such that we can also go into a node and traverse along the edges of the nodes. We are working on that too. So we have best of breed, in that we have the SQL-based, set-based query approach along with a storage layer underneath that allows for optimization between the nodes and the edges.
So those are the merits and demerits, if you may, of using what we've used versus the newer systems like graph databases. Yeah, that makes sense. Go ahead, George. Last one on the stack. You're putting together a cost-effective set of open source components. Now, we've had this theme of the modular sixth data platform. There are always trade-offs regarding complexity when you're trying to stitch together multi-vendor components. You, as one of the world's biggest ISVs, have the technical wherewithal to overcome that, which many corporates might not. What are some of those complexities of integration of this open source, multi-vendor approach that others would encounter if they tried to go down your path? Yeah, I think the important thing here is that we provide optionality. As an example, if you want it pluggable, as in you want to bring your own transforms, you've done the extract and transform and you just want to load, you could go around our transform layer and go straight to the unification layer. So we provide that optionality that allows our customers to plug and play, right? But we also provide the stitching together from an end-to-end standpoint. If it was going to be done otherwise, there's a lot of systems integration work. There are the small things and the big things that really need to come in, as in, what does the pipeline look like when you go from transform to harmonization to unification? That's not a simple thing to do. If our customers really want to build their own deterministic logic for unification, they could do it, but then they have to inject that into the harmonization layer, right? So there's a lot of complexity associated with it. That does not mean we are not working with the broader ecosystem. We are working very closely with the likes of open source dbt, so we allow dbt semantics to be used in transforms. We are also using regular SQL for building calculated insights.
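(Editor's note: the "regular SQL for building calculated insights" Rahul mentions amounts to defining a business metric once, in SQL, against the harmonized data model, so every downstream app sees the same definition. A minimal sketch follows, with in-memory SQLite standing in for data cloud's query layer; the table and column names are hypothetical, not a Salesforce data model.)

```python
import sqlite3

# Illustrative only: SQLite stands in for the platform's query layer,
# and this tiny schema is a stand-in for harmonized data model objects.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer_id TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [("c1", 120.0), ("c1", 80.0), ("c2", 40.0)])

# A calculated-insight-style metric: lifetime value per customer,
# defined once in SQL and reusable by any downstream application.
LTV_SQL = """
    SELECT customer_id, SUM(amount) AS lifetime_value
    FROM orders
    GROUP BY customer_id
    ORDER BY lifetime_value DESC
"""
for cust, ltv in con.execute(LTV_SQL):
    print(cust, ltv)  # c1 200.0, then c2 40.0
```

Because the metric lives as a single definition over continuously refreshed objects, its values change as upstream data changes, which is the "ongoing basis" behavior described earlier in the conversation.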
So our customers know how to do that, and they can plug their own cubes in there using our SQL. So those are things that allow us to have more flexibility. And if an enterprise or a smaller ISV were to do it, they would have to go through the journey of really stitching all of these things together. Okay. Tell us a little bit... I mean, really, I feel like this is like when I ask my kid, did you clean your room? The answer is always yes, and it may or may not be true. Not that I'm calling you a liar, but I know that companies don't always want to share information about adoption, number of customers, average customer data volume, all that sort of thing. But if you'll tell us, how is data cloud being received among Salesforce customers? Is this a no-brainer, we totally want to do this, we see the value, sign us up? Or is it a little bit slower adoption? What are you feeling? What are you seeing? Yeah. So I think, Shelley, I mentioned that data cloud is the fastest growing organic innovation in the industry. You did, yes. When I say that, I mean it in the context of customer growth, in terms of usage growth, and, while we don't talk about the revenue, in revenue growth, in all dimensions. Let me just give you a few metrics, right? Just last quarter, Q3, we ingested, I think, 6.2 or 6.3 trillion customer records. Now, what does that mean? In itself, a number in the trillions might sound very impressive, but what is even more impressive is that it grew over 150% year-over-year. And that's one end of the spectrum. The other end of the spectrum is activations, as in, you did a lot with the data, and I've gone through the pipeline in my discussions. What did you do eventually with the shaping of the data and the insights, right? We drove activations on behalf of our customers.
We processed hundreds of trillions of customer records to drive 1.5 trillion activations just last quarter, and that grew over 250% year-over-year. So that's the level of usage and adoption that we're seeing. And from a customer growth standpoint, just last quarter we introduced a new program that allowed our customers to sign up for data cloud from a premium standpoint, and within the first six weeks, we signed up over 1,000 customers in the premium. Just giving you a sense. Well, I understand the allure. I very much understand the allure. So if you wanted to leave our viewing and listening audience with one bit of advice or recommendation, if they're thinking about walking this path, what's your tidbit of advice? Yeah, one thing that I would say, from a future direction standpoint: if you think about it, for the last twenty-some years, the world of apps has been run by humans entering data into systems and applications, and data platforms and apps have been running like that. I think those days are gone, in that the inversion has happened, in that data and AI will drive those processes and those business apps. And one of the things that we have to also recognize is that eighty-some percent, close to 90% depending on which analyst you listen to, but a pretty staggering amount of the data that sits in enterprises is unstructured data. What I mean by that is text and large text, email, chat transcripts, video transcripts, audio messages, et cetera. With the advent of LLMs and embedding models and vector databases, which we announced recently at the World Tour in New York, we are bringing that to life. We've always had unstructured data. You and I store documents on our hard drives. We store PowerPoints and text. We're gonna store this video, if you may, on our hard drives.
It's been around for the longest time, but with the advent of, and our ability to tap into, the new gen AI embedding models, we are embedding and creating this vector DB, and we're bringing that to life through semantic retrieval. What does that mean? We can now reason over structured and unstructured data. So it's not just about the engagement data and transactional data that I referred to earlier, which was all relational and structured; we're bringing unstructured data to life. It's an entirely new world. We're transforming the way our customers really realize value through data and AI. That is the most exciting aspect of this going forward. We're just getting started. It really is an exciting aspect. And I think that, for anybody listening or viewing who occasionally or often feels overwhelmed, managing data and being able to utilize data that runs throughout an organization, whether you're talking about the enterprise or any other size organization, it's a lot of data. It can be overwhelming. And by the way, the amount of that data is going to double and triple and quadruple in no time. So I think that, to me, the solution here is to understand what it is you're trying to navigate and to figure out the best technology solutions, people solutions, and processes, and all three of those things work together to kind of create the magic, if you will. Great, great. Well, gentlemen, George Gilbert, always a pleasure to hang out with you. Rahul Aradkar from Salesforce, thank you so much for spending time with us. We have so appreciated learning more about Salesforce Data Cloud and all of the insights and information you shared. And I am sure this is the first of many conversations that we'll have. Absolutely. George, Shelly, thank you. I'm really honored to be a guest on your program, and I really enjoyed the discussion. I can't believe we spent over an hour doing this. The time just flew by.
That's what happens when you get three geeks together, right? You know, we could go on forever. Well, thank you everyone. Thanks to our viewing and listening audience. Don't forget to hit the subscribe button, and we'll see you next time.