Hey everyone, welcome back to theCUBE's live coverage of Snowflake Summit '22, live from Caesars Forum in Las Vegas. Lisa Martin here with Dave Vellante. We've got a couple of guests here, and we're going to be talking about everyday AI. You want to know what that means? You're in the right spot. Kurt Muehmel joins us, the chief customer officer at Dataiku, and Ahmad Khan, the head of AI and ML strategy at Snowflake. Guys, great to have you on the program.

It's wonderful to be here. Thank you so much.

So we want to understand, Kurt, what everyday AI means. But before we do that, for the audience who might not be familiar with Dataiku, give them a little bit of an overview of what you guys do, your mission, and maybe a little bit about the partnership.

Yeah, great, very happy to do so, and thanks so much for this opportunity. Dataiku is a collaborative platform for enterprise AI. What that means is it's software that sits on top of incredible infrastructure, notably Snowflake, that allows people from different backgrounds, so data analysts, data scientists, data engineers, all to come together to build out machine learning models, and ultimately the AI that's going to be the future of their business. So we're very excited to be here, and very proud to be a very close partner of Snowflake.

So Ahmad, what is Snowflake's AI strategy? Is it to partner? Where do you pick up? Frank said today, we're not doing it all. The ecosystem is by design.

Yeah, yeah, absolutely. We believe in best of breed. Look, we think we're the best data platform, and for data science and machine learning, we want our customers to really use the best tool for their use cases, right? And Dataiku is our leading partner in that space. When you talk about machine learning and data science, people talk about training a model, but the real difficulties and challenges come before you train the model, in how you get access to the right data, and after you train the model, in how you run it and how you manage it. That's very, very important, and that's where our partnership with Dataiku comes into play. Snowflake provides the platform that can process data at scale for the pre-processing, and Dataiku comes in and really simplifies the process of deploying and managing the models.

Got it, thank you. Kurt, Dataiku talks about everyday AI. I want to break that down. What do you mean by that? And how is this partnership with Snowflake empowering you to deliver that to companies?

Yeah, absolutely. Everyday AI for us is a future state that we are building towards, where we believe AI will become so pervasive in all of the business processes and all of the decision-making that organizations go through that it's no longer this special thing we talk about; it's just the day-to-day life of our businesses. And we can't do that without partners like Snowflake, because they're bringing together all of that data and ensuring there's the computational horsepower behind it to drive that. We heard that this morning in some of the keynotes, talking about that broad democratization and, let's call it, the pressure that's going to put on the underlying infrastructure. Ultimately, everyday AI for us is where companies own that AI capability. They're building it themselves, with very broad participation in its development.
And all of that work is then being pushed down into best-of-breed infrastructure, notably, of course, Snowflake.

You said push down. There's a term in the industry, pushdown optimization. What does that mean? How is it evolving? Why is it so important? Ahmad, do you want to take a first pass at that?

Yeah, absolutely. When you're processing data before you train a model, you have to do it at scale. That data is coming from all different sources; it's human-generated and machine-generated data, and we're talking millions and billions of rows. You have to make sense of it. You have to transform that data into the right kind of features, the right kind of signals, that inform the machine learning model you're trying to train. That's where any kind of large-scale data processing is automatically pushed down by Dataiku into Snowflake's scalable infrastructure, so you don't run into memory issues, and you don't get into situations where your pipeline runs overnight and doesn't finish in time. You can really take advantage of the scalable nature of cloud computing, and a lot of that processing is actually getting pushed down from Dataiku into the scalable Snowflake compute engine.

How does this affect the life of a data scientist? You always hear that data scientists spend 80% of their time wrangling data, and I presume there's an infrastructure component to that. We heard this morning that you make infrastructure, in my words, self-serve. Does this directly address that problem? And what else are you doing to address that 80% problem?

It certainly does, right? The way you solve for data scientists needing on-demand access to computing resources, and of course to the underlying data, is by ensuring that that work doesn't have to run on their laptop or on some constrained physical machine in a data center somewhere. Instead, it gets pushed down into Snowflake, where it can be executed at scale with incredible parallelization. Now, what's really important is the ongoing development between the two products. Today, Snowflake announced the introduction of Python within Snowpark, which is really exciting because it opens this capability up to a much wider audience. Dataiku has provided pushdown both through a visual interface and, since last year, through Java UDFs. Those are the two extremes, right? You have people who don't code on one side, a no-code or low-code population, and a very high-code population on the other. The Python integration lets us reach the fat center of the data science population, for whom Python is the lingua franca they've been learning for decades now.
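To make that concrete, here is a minimal sketch of what this kind of pushdown looks like with the Snowpark Python API. The connection parameters, table, and column names are hypothetical; the point is that the filtering and aggregation below are compiled to SQL and executed inside Snowflake, not on the client.

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import avg, col, count

# Hypothetical connection parameters; in practice these would come
# from a config file or secrets manager.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# A lazy DataFrame over a (hypothetical) raw events table.
# No rows are pulled down to the client here.
events = session.table("RAW_EVENTS")

# Feature engineering expressed in Python but executed in Snowflake:
# the filter and aggregation are translated to SQL and pushed down,
# so millions or billions of rows never leave the warehouse.
features = (
    events
    .filter(col("EVENT_TYPE") == "purchase")
    .group_by("CUSTOMER_ID")
    .agg(
        count(col("EVENT_ID")).alias("PURCHASE_COUNT"),
        avg(col("AMOUNT")).alias("AVG_PURCHASE_AMOUNT"),
    )
)

# Materialize the result as a feature table for model training,
# again entirely server-side.
features.write.save_as_table("CUSTOMER_FEATURES", mode="overwrite")
```

This lazy, translate-to-SQL execution model is also what lets a tool like Dataiku generate and push down such work automatically on a user's behalf, whether that user is in a visual interface or writing code.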
So talking about data scientists, I want to elevate that a little bit, because you both serve enterprise customers, Dataiku and Snowflake. Kurt, as the chief customer officer, obviously you're with customers all the time. If we look at the macro environment and all of its challenges, companies have to be data companies these days; if you're not, you're not going to be successful. The question is how: how do we extract insights and value and take action on them? I'm just curious whether your customer conversations are elevating up to the C-suite or the board in terms of democratizing access to data to be competitive, to build new products and new services. We've seen tremendous momentum on the part of customers, growth on the Snowflake side. But what are you hearing from customers as they're dealing with some of these current macro pains?

Yeah, the conversation today at that C-level is not only how do we leverage new infrastructure. Most of them are now starting to have Snowflake in place; I think Frank said 50% of the Fortune 500, so we can say most have it. Now the question is how do we get access to that data and that computational horsepower into the hands of a broader group of people, so that it becomes a truly transformational initiative, not just an IT initiative or a technology initiative, but a core business initiative. That really has been a pivot. I've been with my company now for almost eight years, and we've seen that discussion change from much more niche conversations at the team or departmental level to a truly corporate strategic level: how do we build AI into our corporate strategy, and how do we do that in practice?

And we hear a lot about, hey, I want to inject data, AI, and machine intelligence into applications. We've talked about how those have been separate stacks: the data and analytics stack over here, the application development stack over there, with the databases off in the corner. And we see you guys bringing those worlds together. My question is, what does that stack look like? I took a snapshot of, I think, Frank's presentation today. He had infrastructure at the lowest level, so infrastructure is cloud. Then live data, that's multiple data sources coming in. Then workload execution, where you made some announcements to expand that, and application development, the tooling that is needed. And then marketplace, that's how you bring together this ecosystem, and monetization, how you turn data into data products and make money. Is that the stack? Is that the new stack that's emerging here? Are you guys defining that?

Absolutely, absolutely. You talked about the 80% of their time that data scientists spend, and part of that is actually discovering the right data: giving the right access to the right people so they can go and discover that data. And so you go from that angle all the way to processing and training a model, and then all the predictions and insights coming out of the model are consumed downstream by data applications. The two major announcements I'm super excited about today cover both ends. The first is the ability to run Python in Snowflake, which is Snowpark. As a Python developer, you can now bring the processing to where the data lives rather than move the data out to where the processing lives, right? So both SQL developers and Python developers are fully enabled. And then the predictions coming out of models trained with Dataiku are used downstream by data applications for most of our customers. That's where the second announcement, Streamlit, is super exciting. I can write a complete data application without writing a single line of JavaScript, CSS, or HTML; I can write it entirely in Python. That makes me super excited as a Python developer myself.
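As a flavor of what that Streamlit announcement enables, here is a minimal sketch of a small data app written entirely in Python. The file name and prediction columns are hypothetical stand-ins for the output of a trained model; you would launch it with `streamlit run app.py`.

```python
import pandas as pd
import streamlit as st

st.title("Churn risk explorer")

# Hypothetical model output; in a real deployment this would be read
# from a predictions table rather than a local CSV.
scores = pd.read_csv("churn_predictions.csv")  # columns: customer_id, churn_score

# An interactive control, with no JavaScript, CSS, or HTML in sight.
threshold = st.slider("Churn score threshold", 0.0, 1.0, 0.5)

at_risk = scores[scores["churn_score"] >= threshold]

st.metric("Customers above threshold", len(at_risk))
st.bar_chart(at_risk.set_index("customer_id")["churn_score"])
st.dataframe(at_risk)
```

Each widget call both renders a UI element and returns its current value, which is what lets a short, purely Python script behave like a full interactive application.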
And you guys have joint customers that are headed in this direction, doing this today. Can you talk about that?

Yeah, we do. There are a few that we're very proud of, well-known companies like REI or Emeritus. But one that was mentioned this morning by Frank again is Novartis, the pharmaceutical company. They have been extremely successful in accelerating their AI and ML development by expanding access to their data. That's a combination of the Dataiku layer, which allows that work to be developed in one workspace, and of course the underlying platform of Snowflake, without which they would not have realized those gains. They were talking about very, very significant increases in efficiency, everything from data access to the actual model development to deployment. Honestly, it's really inspiring to see.

It was great to see Novartis mentioned on the main stage; massive time to value there. We've actually got them on the program later this week, so that was great. Another joint customer, you mentioned REI. We'll let you go, because you're off to do a session with REI, is that right?

That's exactly right. We're going to be doing a fireside chat, talking about much of the same: all of the success they've had in accelerating their analytics workflows and the actual development of AI capabilities within, of course, that beloved brand.

Excellent. Guys, thank you so much for joining Dave and me to talk about everyday AI and what Dataiku and Snowflake are doing together to empower organizations to actually achieve it and live it. We appreciate your insights.

Thank you both.

Thank you guys. Thank you for having us.

For our guests and Dave Vellante, I'm Lisa Martin. You're watching theCUBE's live coverage of Snowflake Summit '22 from Las Vegas. Stick around; our next guest joins us momentarily.