Welcome back to SuperCloud 4, where we explore the evolution of generative AI and its impact on company strategies and industry transformations. You know, very often, tectonic shifts in enterprise tech start in consumer markets. The ascendancy of the PC, for example, gave Wintel, i.e. Microsoft software and the Intel x86 architecture, the scale it needed to crack into the enterprise market. The iPhone gave rise to mobile apps, flash storage and ARM processors, which have all found their way into enterprise technology. And our next guest is Sridhar Ramaswamy. He's a senior vice president for AI at Snowflake, a company that acquired Neeva in May of this year, a company that Sridhar co-founded. Neeva was a consumer search company that ran into the Google search monster and then pivoted with gen AI to enterprise search, and is now helping simplify the Snowflake customer experience across a variety of industries and use cases. Sridhar, welcome, good to see you again. It's great to see you, Dave, excited to chat. So in June at the Snowflake Summit, we talked about what was a most interesting use case to me, and that is, as a customer, I want to access some table, let's say, that has some important data that I want to get to. And I'm going to do that in natural language. I'm going to bring up a Streamlit app, as you described. Streamlit, of course, is another Snowflake acquisition, used to build data science apps. And through my natural language commands, it all happens behind the scenes. First of all, is that still the right vision? How has that evolved? And where are you in terms of making that a reality for your customers? That's right. Part of what is incredibly exciting about generative AI, not just the applications but the core technology itself, is that its interface is natural language. You can speak to it, you can write to it, and it can extract the structured information that sits underneath and surface it to you.
At Snowflake, we are building a set of technology capabilities deep into the platform. Part one, we are running language models within Snowflake so that you can access them, whether it's from SQL or a Streamlit app or your own application, without having to bring up GPUs, without having to do all of the hard work. So that is basically inferencing at scale that we host. We're also integrating the Neeva search technology, which is a combination of old-school information retrieval plus vector indexing, natively into Snowflake, so that you're able to, with a single command, index a table. And now where it all comes together is in, for example, a Streamlit app like the one you talked about, where you can talk to a table. We're also creating a copilot experience that builds on top of the inferencing platform and the vector indexing, so that you can talk to it in English, it writes the SQL queries, and then you can click a button and have that run. It is that fluid access to data that will further democratize access to all of the great structured information that's sitting in Snowflake. And we are absolutely on our way to getting this done. We have a series of announcements coming up in roughly a month at Snowday. It's a pretty exciting time and you're moving incredibly quickly. Yeah, so I mean, we just got together, it was just over 90 days ago, which isn't that long ago, but of course in this day and age, that's a long time. What has changed? Are there any learnings since we last sat down in June that you can share? Yeah, I would say the thing that's been most exciting over the past 90 days is of course the rise of open-source language models and the role that Meta is increasingly playing in the language model space. Llama 2 has sparked a lot of efforts. We have a great partnership with Meta, but of course we are also pre-training our own models.
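The copilot flow Sridhar describes, an English question in, model-drafted SQL out, run on a button click, can be sketched in a few lines. Everything here is a hypothetical stand-in: `model_write_sql` plays the role of the hosted language model, and an in-memory SQLite table stands in for a Snowflake table.

```python
import sqlite3

# Stand-in "warehouse": a toy table in SQLite (the real target would be Snowflake).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "east", 120.0), (2, "west", 80.0), (3, "east", 45.5)])

def model_write_sql(question: str) -> str:
    """Hypothetical stand-in for the hosted language model that turns an
    English question into SQL. A real copilot would prompt an LLM with the
    table schema; a hard-coded mapping keeps this sketch runnable."""
    q = question.lower()
    if "total" in q and "region" in q:
        return "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
    raise ValueError("question not understood")

def copilot(question: str):
    sql = model_write_sql(question)      # the model drafts the query
    print("Proposed SQL:", sql)          # the user reviews it...
    return conn.execute(sql).fetchall()  # ...then clicks the button to run it

print(copilot("What is the total amount by region?"))
```

The design point in the interview is the review step: the user sees the generated SQL before it runs, which is what keeps a fallible model usable in a governed environment.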
So I think it's just provided a lot of impetus to the ecosystem, which at the beginning of the year looked like it would pretty much be only a handful of players like OpenAI, like Anthropic, and the much anticipated Gemini model from Google. So I think this rise of open source and the democratization of AI models is an important development. I think the other, less relevant to enterprise but equally important development, is that it's very clear that ChatGPT, as well as Sydney, which is Bing's chatbot, have topped out in terms of how many people they have reached. And so what looked early in the year like a sweep-the-table move from OpenAI and Microsoft is now sputtering a little bit on the consumer side. So I think that'll influence a lot of how search evolves, what Google does, even what is going to happen with people's perception of how unshakable Google is. Lots of excitement in multiple places. But the core thing that is also becoming very, very clear is that, contrary to popular belief, consumer might not be the biggest AI hit; it's people like Snowflake and Salesforce that are integrating AI deep into their platforms and making it possible for their customers to do more. But I've got to tell you, this story is changing like by the month, and that's part of what is exciting about working in it. And I want to come back to that notion of enterprise and some of the specific use cases there, but first I want to talk about some of the challenges of adopting AI. You know, I love the All-In podcast, but when I listen to those guys it's like, oh yeah, you flick a switch and all of a sudden you can just deploy AI. And it's not that simple. It's one thing to create experiments; it's another to actually create value. So I wonder, Alex, if you could bring up the slide. This is data from an ETR survey this month.
ETR is our data partner, and no matter which industry you filter on, overwhelmingly the blocker to moving into production is "we're still in the evaluation phase." That's that big bar on the left. And right behind that are data privacy, security, legal and compliance concerns. You know, the lawyers are definitely in control. Nothing else really stands out. It doesn't seem that budget is generally a concern, because they're stealing from other areas. There's no perceived lack of value. The point is, despite the narrative that this stuff is easy to do, it's actually not so simple. It needs to be made easy, but today, maybe not so much. So Sridhar, does your data, the information you get when you talk to customers, align with this, and what can be done about it? What specifically are you doing about it? Yeah, I think this is where, you know, all technologists and the media have to be careful about not setting expectations so high that they cannot possibly be met. I'll give you a simple example. Everybody is like, hey, talk to ChatGPT, it is the most amazing thing since sliced bread. It's fine. I have a ChatGPT subscription. I pay 20 bucks a month. I'm happy to pay it, but I don't think really hard about what questions I want to ask it, because I know that if I ask it obscure questions, it is just going to hallucinate, meaning it's going to make up stuff, which means that instantly it cannot be part of a business application. People cannot use what comes out of that to make actual decisions. I'm very proud of the fact that, starting last year, we recognized that language models had magical capabilities; their ability to take a 1,500-word blog post and write an accurate four-sentence summary is nothing short of miraculous. On the other hand, that doesn't mean that you get to roll it out into an application and put it in the hands of users.
So we pioneered a technique called RAG, retrieval-augmented generation, or you can just think of it as smart search, that you need to use in conjunction with the language model in order to produce believable output. This is exactly what we are in the process of doing at Snowflake. It's taking us a few months, but we want to have that index that can set the context for a language model to always produce believable, authentic information. And we would much rather have our system say, hey, I can't answer that. Not all questions are answerable in the context of the data that a particular database has. It is the basics of: how do you create value? How do you make sure that it is believable? How do you make sure that you don't overpromise, which I think the early hype around AI severely misjudged? And so we are now going through the graph that you talked about, where people are like, oh, it's nice, three answers are great, but two are polite fiction; what do we do with that? Or, you know, your second bar: people are going, oh, you mean I have to take all my data out of my database, which I've spent years getting in order, and stick it into another index or into another model? How exactly is access control going to work? This is why, with the things that we are implementing, we make sure that governance and security are right in there from the very beginning, so that we actually have applications where people can feel like, okay, only the people that are supposed to look at some data are actually allowed to look at that data. On the legal issues, we are getting through. We actually had, you know, a big powwow with the legal department internally at Snowflake about what our policies are for what data we can use to train models, for fine-tuning, how we assure our customers that their data is never going to be used for any kind of common, you know, tuning of models across customers. There's also liability work. This comes up often; people go like, oh, I see.
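The RAG pattern described above, smart search that grounds the model and an explicit "I can't answer that" path instead of hallucination, can be sketched minimally. The keyword-overlap retriever and the `answer` wrapper here are toy assumptions standing in for a real vector index and a real language model.

```python
# A minimal sketch of retrieval-augmented generation (RAG) with an abstain
# path. The retriever scores documents by word overlap with the question;
# a real system would use a vector index and prompt an LLM with the hit.

DOCS = [
    "Snowflake hosts language models so you can call them from SQL.",
    "Vector indexing lets you index a table with a single command.",
]

def retrieve(question: str, docs, min_overlap: int = 2):
    """Old-school retrieval stand-in: pick the doc sharing the most words
    with the question, or None if nothing overlaps enough."""
    q = set(question.lower().replace("?", "").split())
    scored = [(len(q & set(d.lower().split())), d) for d in docs]
    best = max(scored)
    return best[1] if best[0] >= min_overlap else None

def answer(question: str) -> str:
    context = retrieve(question, DOCS)
    if context is None:
        # The grounded behavior Sridhar describes: abstain rather than make up stuff.
        return "I can't answer that from the data I have."
    # A real system would generate from the context; here we just cite it.
    return f"Based on our docs: {context}"

print(answer("How do I index a table?"))
print(answer("Who won the 1987 World Cup?"))
```

The key design choice is that retrieval gates generation: when nothing in the indexed data matches, the system declines instead of answering, which is what makes the output usable in a business application.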
So you're using this open-source model, but if you get sued, who's going to bear copyright liability for stuff like that? So this is just stuff that every company has to work through. As I said, with every new piece of technology, we just assume that all of these things are going to magically happen. To your point, some of these were led by consumer products, and consumers will pretty much agree to anything, but in the enterprise, people ask more questions. So I think the delays are predictable, but rock-solid technology from places like Snowflake is fast hitting the market, and you will see people get real value out of it. But even we are going to say, hey, this is not magic; start with the simple things. For example, a chatbot over a bunch of documentation: we know how to do this, and that can be made super reliable. It's applications like that that get you into the crawl phase, and then we can get into more complicated things like writing SQL and running it automatically. That's more in the walk and run phases. There's 100% value to be created; it's just going to take a few months. But on the other hand, this is still lightning fast by enterprise standards. And to your point there, Sridhar, the data shows, and when you talk to customers they'll confirm this: when you look at the actual use cases in production, they're essentially taking ChatGPT or some kind of similar tool to generate code, provide chat support, maybe write marketing copy or summarize text, say for legal firms. And that's all well and good, but your vision is broader. So what are you seeing in terms of the possible use cases that are being evaluated beyond these initial simple ones? Are there specific ones you can point to, or patterns that stand out that are exciting you? Look, a really important part of language models is literally what the two words say, which is their proficiency with language.
It is easy to discount all of the simple applications, like summarization, like translation, like the ability to create a chatbot over known data, but in terms of efficiency, they're pretty amazing. Part of what we are going to do as part of the Snowflake platform is not just host a set of models; we're going to make them easily accessible in SQL, which means that the thousands of data analysts that are working in every large company all of a sudden have access to language models in the SQL that they write, without needing to deal with GPUs, deal with APIs, deal with keys, deal with a whole new billing system, internal approvals, procurement, none of that stuff. That stuff just works. And to me, that is actually a significant value add to how companies use Snowflake and how products work. And I would not discount the simple use cases as not delivering value; they're going to deliver a lot of value. But there's more. I think it is when models get good enough to take an English question, write a piece of SQL, get you the answer, and summarize and point out interesting things in the answer. Or even better, let's say as an enterprise you have your own APIs. The example that I give people is, let's take a shipping company. They have all of the packages that have been shipped out in Snowflake, so that we can know, hey, what are the packages that Dave is still going to get. But there's a different system, let's say, that is currently storing where each package is actually located. What models can do is, now you can create a chatbot that you put in front of the customers of the shipping company, and they'd be able to ask questions like, oh, where is this package that I am expecting from a Wayfair, let's say, which is a Boston company. And the model will then decide: I'm going to use Snowflake to look up the packages, but I'm going to use this other API in order to figure out where that package actually is.
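The shipping-company example above is tool routing: the model picks which backend answers each question. In this sketch, both backends and the router are hypothetical stand-ins; a real system would let an LLM choose the tool, and the "warehouse" and "tracking API" would be Snowflake and an external service.

```python
# A sketch of the shipping-company chatbot: route each question either to
# the warehouse (what was shipped) or to a separate tracking system (where
# a package is now). All data and the router heuristic are toy stand-ins.

WAREHOUSE = {"dave": ["pkg-17", "pkg-23"]}  # shipped packages, per customer
TRACKING_API = {"pkg-17": "Boston hub", "pkg-23": "out for delivery"}

def route(question: str) -> str:
    """Stand-in for the model's tool choice: a keyword heuristic instead of
    an LLM deciding which backend can answer the question."""
    return "tracking" if "where" in question.lower() else "warehouse"

def chatbot(question: str, customer: str) -> str:
    if route(question) == "warehouse":
        pkgs = WAREHOUSE.get(customer, [])
        return f"You have {len(pkgs)} packages on the way: {', '.join(pkgs)}"
    # Tracking tool: look up each of the customer's packages in the other system.
    status = {p: TRACKING_API[p] for p in WAREHOUSE.get(customer, [])}
    return "; ".join(f"{p} is {s}" for p, s in status.items())

print(chatbot("What packages am I expecting?", "dave"))
print(chatbot("Where is my package from Wayfair?", "dave"))
```

The point of the pattern is the fluid blend Sridhar mentions next: one conversational front end, multiple systems of record behind it, with the model deciding per question where to go.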
It is that ability to fluidly blend different kinds of applications that I think is going to be powerful. And we have so far, in the context of enterprise, not even touched on the generative use cases. You talked about marketing copy, but I can think of other things, like personalization of image creatives: using data, say from a Salesforce connector coming into Snowflake, you use that and other characteristics of your customer to write custom newsletters, to create custom images. I think the generative applications will also come. I don't know if you have played around with things like video generation, like Runway ML. It's early; I find that stuff fascinating. So I think there's a lot to come. It will take time. Yeah, and what we're playing around with is actually creating short takes from our videos, which AI is doing. There used to be a human that actually did that; you know, the poor guy was like an air traffic controller. And your point about not discounting some of these early use cases, I think, is well taken. At the same time, it's quite amazing to me, the adoption in terms of experimentation. I mean, even when you think about the cloud, it took a much, much longer time for the cloud to reach this type of experimentation level. So it's huge. The "but" is: if you had to handicap it, when do you think we'll go from this broad experimentation to more meaningful deployments and deeper ROI, and what do you think will be the catalyst? Do you have any sort of handicapping vision there? Yeah, so I would say by the beginning of the calendar year next year, I think you're going to see, both from Snowflake and other people, access to lots and lots of models from simple interfaces that people already know, like SQL. And our take is that, combining SQL and Streamlit to write interesting applications, people are going to come up with some crazy stuff that will be incredibly valuable.
So I think the first wave is going to come early next year. There are people that have built chatbots by stitching together things like, oh, let me take some data, put it into a Pinecone or some other vector database, and then use something like OpenAI to create a chatbot. That's a great demo. But as I said, people immediately go, okay, so what is your data story? How do you make sure that you have good governance on that? I think those sorts of applications will begin to roll out, and you'll see a lot more of that in Q1, Q2 next year. But models are also making crazy progress. I think you're going to see generative applications, especially in the context of marketing, where you can experiment because you're putting these ads in some external platform, like a Facebook, like a Google. So I would say this is a matter of a small number of quarters, not years. To your point, things like enterprise cloud adoption took years upon years, and even today we meet lots of big-name customers that are like, yeah, all our stuff runs on-prem; we need to figure out how to be more in the cloud, and you guys at Snowflake are great. So compared to that, I think this is lightning fast. I wonder if I can get your thoughts. We took this concept of a power law and applied it to generative AI. If you think about the power law of the music industry, it's almost a right angle: a few big folks and then a huge long tail. There's a similar development, we think, in gen AI, although open source and some of these other tools are sort of pulling the torso up, if you will. And so it's more of a smooth curve, and then certainly a long tail. But I'm interested in some of your thoughts on the domain-specific or industry-specific models versus those consumer models that we were talking about from industry giants like Google and OpenAI and Microsoft, et cetera. How is that evolving? I mean, Snowflake has visibility into industries.
Your go-to-market is organized that way, in financial services and healthcare and telecommunications and others. Do you see that evolving? Are there any industry examples that excite you? I think it is definitely evolving pretty rapidly. Whenever there is a new platform like this, people are fascinated by the value. But to be honest with you, I think we are all excited by just the disruption that these things potentially end up creating. And as I said, what has happened since the beginning of this year is that things have gone from "only three companies can create meaningful large models" to these models being open source. There's a lot more creativity. So there are multiple schools of thought on this. I don't have a, you know... I think the jury is still out. I think incumbents are actually going to realize a lot of value from AI. Since the beginning of the year, it is pretty clear that neither Bard nor ChatGPT is going to replace Google search per se. And Bard, which is Google's chatbot, is getting better by the day. It's not rolled out to everybody, but it is actually very good. So there's one school of thought that says that, you know, this knowledge is spreading so rapidly, and companies are so ultra-aware of the disruptive power of AI, that they're going to be very quick to absorb and use that technology. At Snowflake, we certainly are at the cutting edge of integrating all of these technologies, many of them developed in-house, but also through partnerships, into our products so that our customers can use them. We're also looking into partnerships with companies like Landing AI, for example, Andrew Ng's company. Andrew, of course, is one of the foremost people in deep learning. They're interested in image models that can do various kinds of feature detection, and also video models. You can now integrate that deep within Snowflake using our extensibility platform called Container Services. So we do see these kinds of specialized applications.
People also ask us a lot about, hey, can we get a fine-tuned model on top of the SQL model that you have created, so that it becomes better at understanding my schema and is able to generate better SQL? I think one likely outcome is that the large platform companies, like Snowflake, and obviously AWS, Google, and Azure, are going to get really, really good at figuring out the industry-specific use cases, and that it doesn't lead to the kind of large-scale disruption that, say, the internet created, or even that mobile created when the world went from desktop computers to Android and iOS, the dominant mobile platforms. If I could ask you, you mentioned on-prem before; there's a sort of debate going on in the industry about where these things are going to take place. Obviously, inference at the edge is a big topic. I was at a Dell financial analyst meeting earlier this month, and that was a big discussion amongst the financial analysts: where is this going to occur? Of course, Dell would make the argument that, well, there's a lot of data on-prem and you're going to bring the AI to the data on-prem, and we have solutions there. And of course, Amazon would say something different. You're obviously all cloud, very much prominent in the AWS ecosystem. Do you have any thoughts on that? Are customers actually saying, hey, because of privacy, we actually want to do something on-prem, how can Snowflake help us? Or is it just a matter of the cloud making them comfortable with doing this in the cloud? I mean, look, yes, we are 100% cloud, but for the most demanding, the most unique customers, we are also perfectly happy to basically do virtual instances that are dedicated to them, so that there's nothing else in that deployment. So as far as customers are concerned, we feel like we have a really solid story when it comes to, can you be in the cloud?
But I think you're asking a larger architectural question, which is, what is the distribution of GPU capacity between server-side things and what can happen at the edge, on individual computers? You probably know this, but your iPhone actually has a fair amount of GPU capacity. This is how things like Face ID work; it's clever models that help them do this. So I think that is too early to call. I think people like Intel and AMD, but also Apple, because they design their own chips, are going to be looking to push more of the inferencing functionality to the edge. There's also a race to develop smaller models that can be very useful. For example, with speech recognition, there is little reason why that has to happen in the cloud. There is very good reason to think that that inferencing can be done right on the device. But I think there will also be value created by the big models. Let's face it, while Llama 2 is great, GPT-4's reasoning ability, by all accounts, including mine, is unmatched. So I think there will be a bit of a hybrid world where some functions will live at the edge and other functions will be in the cloud, but a lot of these complicated technical questions can only be settled with data and rollout, and we are in a world where stuff is just evolving very, very rapidly. So I don't think we can quite call it just yet. Yeah, well, for those watching, I very much do know that this thing includes a lot of real power; like you said, a GPU, there's an NPU in there, a lot of accelerators, and the combinatorial performance is amazing. Followers of my Breaking Analysis program know where I stand on this. I think AI inferencing is going to be enormous at the edge, and I think ARM is going to be a key platform for that, and I think it's going to ripple, in fact it already is rippling, into the enterprise.
In fact, you're taking advantage of things like Graviton. Whether or not that data comes back to the cloud, or maybe someday Snowflake figures out, okay, how do we actually participate in that, remains to be seen. But I think, as you point out, it's very unpredictable, and it's those use cases and deployments that are going to determine what really happens. I'll give you the last word, Sridhar; we're out of time, and I really appreciate your thoughts on this. What should we be looking for going forward, and maybe some of the key milestones that you want to hit or other things that we should be excited about? At Snowflake, as I said earlier, we get really excited about democratizing data access within the enterprise. That's the point of being a data cloud, but in a way that everybody is comfortable with, so that there's the right governance and so on. So we have a series of things coming out in the AI space that are going to further that mission, whether it is analysts just being able to do a lot more with their data, or tinkerers and app developers being able to spin up apps very quickly, all the way to the most complicated needs: I need a vector index, I need this middle layer, and I need fine-tuned language models for my own use. We are excited to be enabling things like that. You know, we think of ourselves as the iPhone platform for data, where stuff just works, where stuff is very well put together. We're very proud of the fact that there's one Snowflake product, not a set of disparate products. That's the lens that we are applying to AI. What I've got to tell you is, as a technologist, this is such a rapidly moving space. You're going to see us do a lot more things, whether it is taking advantage of mass fine-tuning capabilities so that more personalization is possible for more customers. And there's a whole pile of other things that are coming.
It is really exciting to be at the center of all of these discussions, to be talking to customers pretty much every single day about it. I'm like a kid in a candy shop. It's pretty fun times and lots of value to be created. Yeah, and it's our pleasure to be following this and reporting on it. Sridhar, thanks so much for participating in SuperCloud 4. Really appreciate it. Thank you, Dave. Looking forward to everything coming out. All right, keep it right there for more action live from our Palo Alto studio. This is Dave Vellante for John Furrier. You're watching SuperCloud 4 on theCUBE.